Issue #138, August / September 2010

Intermediate


Estimating a system’s maximum power load and then specifying a generator to match or slightly exceed the load estimate is a common practice, but one rife with problems. The *apparent loads (volt-amperes)* might be larger than the *real loads (watts)*, environmental factors may have been overlooked, and the generator’s specifications and features may not live up to the manufacturer’s marketing. Common results include an overloaded generator (and circuit breakers), an unreliable system, a dissatisfied system owner, and tarnished reputations.

This article considers the basics of generator sizing: establishing the load requirements, understanding the difference between apparent power and real power, assessing a generator’s environment, and clarifying generator specifications and features. We’ll also look at some inverter-charger features that can help reduce part of the peak load on a generator, thus reducing generator size (and cost). Understanding these factors can help you correctly size a generator to reliably meet a system’s needs.

The first step is to sum up the power loads that might be operated simultaneously. For this example, let’s say the combination of a well pump, a microwave, a fridge, a washing machine, some compact fluorescent lights, and other loads adds up to 3,600 W.

Although we casually tend to express generator power and loads in watts, generator specifications typically state power in volt-amps (VA). This is an important distinction, as a load with a low power factor may not draw many watts, or “real power,” but its VA load, or “apparent power,” may be higher. For example, a washing machine with a power factor (PF) of 0.5 (the ratio between “real power” and “apparent power”) might consume 500 W, but it’ll draw about 1,000 VA (120 V × 8.3 A) from a power source.

Here’s the math: a 500 W load with a 0.5 PF draws 500 W ÷ 0.5 = 1,000 VA, and 1,000 VA ÷ 120 VAC = 8.33 A. If the PF were 1.0 (i.e., a purely resistive load), the current would be (500 W ÷ 1.0) ÷ 120 VAC = 4.17 A.
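The relationship above is easy to capture in a few lines of code. This is a minimal sketch (the function name `load_current` and the 120 V default are illustrative, not from the article):

```python
def load_current(real_watts, power_factor, voltage=120.0):
    """Return (apparent power in VA, line current in A) for a load.

    Apparent power = real power / power factor; current = VA / voltage.
    """
    apparent_va = real_watts / power_factor
    return apparent_va, apparent_va / voltage

# The washing machine example: 500 W at PF 0.5 on a 120 VAC circuit.
va, amps = load_current(500, 0.5)      # 1000.0 VA, ~8.33 A
va_r, amps_r = load_current(500, 1.0)  # 500.0 VA, ~4.17 A
```

Note that halving the power factor doubles the current the source must supply, even though the real power is unchanged.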

It’s the 8.3 A corresponding to the 1,000 VA that counts against the generator’s current limit, not the 4.17 A the 500 W figure would suggest. In other words, the generator must supply the current dictated by the apparent power demand, not the real power.

The power factor of common loads varies from quite low (about 0.5 for the washing machine) to high (1.0 for a resistive load). Applying an average power factor of 0.85 to a group of typical loads is a reasonable rule of thumb. In our example, the original 3,600 W peak “real load” estimate translates to an apparent load of 3,600 W ÷ 0.85 ≈ 4,235 VA, which we round up to 4,300 VA.
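The conversion from a real-power estimate to a generator-sizing figure can be sketched as follows (the 0.85 average PF is the article’s rule of thumb; the rounding granularity is an assumption):

```python
import math

real_watts = 3600   # summed peak "real load" estimate
avg_pf = 0.85       # rule-of-thumb average power factor for mixed loads

apparent_va = real_watts / avg_pf              # ~4235 VA
# Round up to the nearest 100 VA for a sizing figure.
sizing_va = math.ceil(apparent_va / 100) * 100  # 4300 VA
```

Rounding up (never down) keeps the estimate conservative, which is the safe direction when sizing a generator.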

A generator’s power rating is based on its operation at sea level. Generator engine power decreases as altitude increases (thinner air), and a generator’s maximum electrical output drops accordingly. A power loss of about 3.5% per 1,000 feet of elevation gain is typical for gasoline-, diesel-, or propane-fueled generators; natural gas-fueled generators may suffer a power loss of about 5% per 1,000 feet. Additionally, the generator’s carburetor may need to be modified for high-altitude operation, even to achieve the reduced power rating.

Ambient temperature is a related complication: typical power derating is about 1 to 2% for each 10°F above the temperature at which the generator’s output is rated. Combining high altitude and high temperature may require specifying a generator with a considerably higher continuous rating. For example, say you’re in Denver, Colorado (elev. 5,000 ft.), and need a propane generator to deliver 4,300 VA on summer days with temperatures at 90°F. Compensating for altitude alone means a 17.5% loss (5,000 ft. × 3.5% per 1,000 ft.). If the generator’s “full” power specification is based on an ambient temperature of 60°F, available output can be expected to drop by about another 3% at 90°F [(90°F − 60°F) × 1% ÷ 10°F]. So the actual rating needed would be about 5,400 VA.
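The combined derating calculation can be expressed as a short function. This sketch assumes, as the worked example does, that the altitude and temperature losses simply add before dividing (a conservative simplification; the function name and parameter defaults are illustrative):

```python
def required_rating(load_va, altitude_ft, ambient_f,
                    loss_per_1000ft=0.035,  # gasoline/diesel/propane; ~0.05 for natural gas
                    nominal_f=60.0,         # assumed temperature of the "full" rating
                    loss_per_10f=0.01):     # 1% per 10 degF above nominal (range is 1-2%)
    """Sea-level nameplate VA needed to deliver load_va under derating."""
    alt_loss = (altitude_ft / 1000.0) * loss_per_1000ft
    temp_loss = max(0.0, (ambient_f - nominal_f) / 10.0) * loss_per_10f
    return load_va / (1.0 - alt_loss - temp_loss)

# Denver example: 4,300 VA needed at 5,000 ft and 90 degF.
rating = required_rating(4300, 5000, 90)  # ~5409 VA, i.e., "about 5,400 VA"
```

Using the 2% figure for temperature, or the 5% altitude figure for natural gas, pushes the required rating higher still, so it pays to check which fuel and which derating curve apply to the specific generator.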
