DIY Chatroom Home Improvement Forum


greenlivn 01-02-2010 06:11 PM

determining wattage usage of electronics
I'm familiar with amps x volts = watts. On many electronics, however, the label gives a range for the volts, maybe 100-240 volts, and then something like 1 amp for the current. Is this saying that the wattage in, say, Africa is higher than in the US? If it pulls 1 amp no matter what, then would it be 120 watts (120 volts x 1 amp) in the US and 240 watts in Africa (240 volts x 1 amp)?


Yoyizit 01-02-2010 06:23 PM

It pulls whatever power it needs and tolerates a wide range of input voltages. The wide range implies that it is a switch-mode power supply, as opposed to a linear supply.

You could infer that the max power draw is about 1 A x 120 V = 120 W, and that at 240 V the current draw would probably be about half an amp, since the device pulls roughly the same power either way.
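A quick sketch of the constant-power idea above. The 120 W worst-case figure is an inference from the label in this thread (assuming the 1 A rating applies at the low end of the voltage range), not a measurement:

```python
def current_draw(power_watts, line_volts):
    """Current a constant-power device pulls at a given line voltage: I = P / V."""
    return power_watts / line_volts

# Assume the label's 1 A rating applies near the low end of the 100-240 V range,
# so worst-case power is roughly 120 V * 1 A = 120 W.
max_power = 120.0

print(current_draw(max_power, 120))  # amps at US line voltage -> 1.0
print(current_draw(max_power, 240))  # amps at 240 V -> 0.5, about half
```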

micromind 01-02-2010 08:05 PM

Actual watts of any AC device is Volts X Amps X Power Factor.

Both linear and switching power supplies usually have a horrible power factor, somewhere around 20-50%. Power factor usually goes lower as voltage increases. Some supplies are corrected, but usually only large ones (over 1 kW).

The only way to know for sure is to use a watt meter, but it's pretty safe to assume a power factor of about 50%.

In the above example, it'd be pretty safe to assume that a device marked 100-240 volts and 1 amp would consume less than 60 watts (120 V x 1 A x 0.5), more like 40.
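The volts-times-amps-times-power-factor arithmetic above, as a small sketch. The 50% power factor is the rule of thumb from this post, not a measured value:

```python
def real_power(volts, amps, power_factor):
    """Real (consumed) power of an AC device: P = V * I * PF."""
    return volts * amps * power_factor

apparent_va = 120 * 1.0               # volt-amps for a 1 A device at 120 V
watts = real_power(120, 1.0, 0.5)     # assumed 50% power factor

print(apparent_va)  # 120.0 VA apparent
print(watts)        # 60.0 W real power, the "less than 60 watts" ceiling
```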

A better way is to look at its DC output. Say it's 12 volts and 2 amps. That's 24 watts (Power Factor has no effect on DC circuits). Figure about 25% more, and you're pretty close to the input power. In this case, it'd be about 30 watts.
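The DC-output method from the paragraph above, sketched the same way. The 25% overhead for conversion losses is the poster's rule of thumb:

```python
def input_power_estimate(dc_volts, dc_amps, overhead=0.25):
    """Estimate a supply's AC input watts from its rated DC output plus conversion losses."""
    dc_watts = dc_volts * dc_amps
    return dc_watts * (1 + overhead)

print(input_power_estimate(12, 2))  # 12 V * 2 A = 24 W out -> about 30 W in
```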

The input amp figures on power supplies are notoriously inaccurate.

