Posted to rec.food.equipment
From: Randall Nortman
Subject: Microwave throwing breaker

On 2008-01-02, Peter A > wrote:
> In article >,
> says...
>>
>> "Randall Nortman" > wrote in message
>> >
>> > But none of
>> > that changes the fact that the microwave should not be drawing >16A
>> > for more than a fraction of a second, right?

>>
>> The formula is: Watts ÷ Volts = Amps
>> Assuming 1780 Watts, a perfect 120 volts = 14.83A That may be true in
>> perfect conditions as tested in the laboratory. The electric company is
>> allowed to vary the voltage and it may be a bit higher or lower at times,
>> especially during heavy use in the summer.
>> So, given the same 1780 Watts, we can have:
>> 1780 ÷ 110 = 16.18 Amps. This is within the normal and acceptable range. I
>> doubt you'd have trouble with a 20A breaker and that is why the code
>> requires a safety factor.
>
> If the voltage goes down from 120 to 110, the current drawn will be
> less, not more. The wattage will be less too.
>
> current = voltage / resistance
>
> Resistance is a constant property of the microwave.


This assumes the microwave is a passive, linear load, which is
probably not true. Many loads are nonlinear and are designed to draw
constant power: their effective resistance (impedance, strictly
speaking) decreases as the voltage drops, which means they must draw
more current to keep power constant. Switching power supplies in
computers are a classic example of this.
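To put numbers on the difference (a quick sketch, using the 1780 W
figure quoted above as the nominal rating; the function names are mine,
not anything from the thread):

```python
# Compare how a constant-resistance (linear) load and a constant-power
# load respond when the supply voltage sags from 120 V to 110 V.

NOMINAL_VOLTS = 120.0
NOMINAL_WATTS = 1780.0

# Constant-resistance load: R is fixed at its nominal value,
# so current scales directly with voltage (I = V / R).
RESISTANCE = NOMINAL_VOLTS ** 2 / NOMINAL_WATTS  # R = V^2 / P

def linear_current(volts):
    return volts / RESISTANCE

# Constant-power load: the supply regulates to hold P fixed,
# so current scales inversely with voltage (I = P / V).
def constant_power_current(volts):
    return NOMINAL_WATTS / volts

for v in (120.0, 110.0):
    print(f"{v:5.1f} V: linear {linear_current(v):5.2f} A, "
          f"constant-power {constant_power_current(v):5.2f} A")
```

At 120 V both draw the same 14.83 A, but at 110 V the linear load
falls to about 13.6 A while the constant-power load rises to about
16.2 A, which is the 16.18 A figure quoted above.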

Now, I'm not claiming to know whether a microwave oven is a linear
load, but I suspect it is not, and that it draws more current as the
voltage drops in order to maintain constant power.

--
Randall