Determining wattage usage of electronics
I'm familiar with amps × volts = watts. On many electronics, however, the label gives a range for the volts, maybe 100-240 volts, and then something like 1 amp for the current. Is this saying that the wattage in, say, Africa is higher than in the US? If it pulls 1 amp no matter what, then would it be 120 watts (120 volts × 1 amp) in the US and 240 watts in Africa (240 volts × 1 amp)?
It pulls whatever power it needs and tolerates a wide range of input voltages. The wide range implies that it is a switch-mode power supply, as opposed to a linear supply.
You could infer that the max power draw is 1 A × 120 V = 120 W, and at 240 V the current draw would probably drop to about half an amp, since the power stays the same.
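The constant-power reasoning above can be sketched as a quick calculation. This is only an illustration of the rule of thumb, assuming a hypothetical device rated "100-240 V, 1 A" whose rated current applies at the low end of the range:

```python
def current_at_voltage(rated_amps, rated_volts, actual_volts):
    """Estimate current draw at a different input voltage,
    assuming the supply pulls roughly constant power."""
    power = rated_amps * rated_volts   # worst-case power at the rated point
    return power / actual_volts        # same power, different voltage

max_power = 1.0 * 120.0                        # 1 A at 120 V -> 120 W ceiling
print(max_power)                               # 120.0
print(current_at_voltage(1.0, 120.0, 240.0))   # 0.5 (about half an amp at 240 V)
```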
Actual watts of any AC device is volts × amps × power factor.
Both linear and switching power supplies usually have a poor power factor, somewhere around 20-50%. Power factor usually goes lower as voltage increases. Some supplies are power-factor corrected, but usually only large ones (over 1 kW).
The only way to know for sure is to use a watt meter, but it's pretty safe to assume the Power Factor to be 50%.
In the above example, it'd be pretty safe to assume that a device marked 100-240 volts and 1 amp would consume less than 60 watts, more like 40.
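The real-versus-apparent-power estimate can be written out as a short sketch. The 0.5 power factor here is the rough rule of thumb from the post, not a measured value for any particular device:

```python
def real_power(volts, amps, power_factor):
    """Real power in watts = volts x amps x power factor.
    Volts x amps alone is only the apparent power, in VA."""
    return volts * amps * power_factor

apparent = 120.0 * 1.0               # 120 VA apparent power from the label
watts = real_power(120.0, 1.0, 0.5)  # assumed 50% power factor
print(apparent)                      # 120.0
print(watts)                         # 60.0 (upper estimate; ~40 W is more typical)
```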
A better way is to look at its DC output. Say it's 12 volts and 2 amps. That's 24 watts (power factor has no effect on DC circuits). Figure about 25% more for conversion losses, and you're pretty close to the input power. In this case, it'd be about 30 watts.
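The DC-output method boils down to one formula. The 25% overhead is the post's rule of thumb for conversion losses, not a universal figure:

```python
def input_power_from_dc(dc_volts, dc_amps, overhead=0.25):
    """Estimate AC input power from a supply's rated DC output,
    adding a rough fraction for conversion losses."""
    dc_watts = dc_volts * dc_amps
    return dc_watts * (1.0 + overhead)

print(input_power_from_dc(12.0, 2.0))  # 30.0 (24 W out -> about 30 W in)
```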
The input amp figures on power supplies are notoriously inaccurate.
Copyright © 2003-2014 Escalate Media LP. All Rights Reserved