Calculating Watts Using Voltage X Amps? How About Different Voltage - Electrical - DIY Chatroom Home Improvement Forum

03-18-2019, 02:14 PM   #1
Member

Join Date: Mar 2018
Posts: 97
Rewards Points: 192

## Calculating Watts using Voltage X Amps? How about different Voltage

Hi there,
I got confused by this question:
So I have a laptop, on the power adapter it says:
Input: 110 V-240 V, 3.5 A
Output: 12 V, 11.2 A

1. In order to calculate how many amps it's drawing on my lines, I guess I should look at the input amps, not the output amps, am I right?

2. I want to calculate the watts. Should I multiply the voltage, 110 V x 3.5 A? And what if I go to another country which uses 220 V? Do the watts then become 220 V x 3.5 A? I suppose the unit should use the same wattage no matter which country it's used in.

03-18-2019, 02:55 PM   #2
Member

Join Date: Aug 2011
Location: Washington, DC area
Posts: 719
Rewards Points: 1,341

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

What you have is basically a rectifier and transformer built into one. No matter what mains voltage you apply, the maximum output is 12 V x 11.2 A = 134.4 watts, the most power it can deliver to your laptop. It has an autosensing function to detect what mains voltage you're giving it.
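As a quick sanity check of that figure, here is the arithmetic in a couple of lines of Python (the numbers are straight off the adapter's output label):

```python
# Rated output from the adapter's label: 12 V at up to 11.2 A
volts_out = 12.0
amps_out = 11.2

watts_out = volts_out * amps_out  # P = V x I
print(watts_out)  # 134.4
```

That 134.4 W is the adapter's maximum rated output, not what it delivers at every moment.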

03-18-2019, 02:56 PM   #3
Super Moderator

Join Date: Mar 2005
Location: Welland, Ontario
Posts: 18,140
Rewards Points: 22,638
Blog Entries: 11

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

Watts out equals watts in minus any power supply inefficiencies from stuff like heat generated.

Those ratings are the maximums the power pack can handle, not the actual usage. If you want to know what the supply is truly drawing, you need a device like the "Kill A Watt" meter, available here from Home Depot:

https://www.homedepot.com/p/Kill-A-W...4400/202196386

03-18-2019, 10:03 PM   #4
Member

Join Date: Mar 2015
Location: Upper mid-west
Posts: 2,732
Rewards Points: 3,942

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

You're feeding a transformer with this device so the math doesn't work the way you think.

03-19-2019, 02:49 AM   #5
Member

Join Date: Jul 2009
Posts: 586
Rewards Points: 926

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

OP, I have found that the rated input can be a wildcard, seriously! They look at inrush current and a few other factors. In your case, it works out to a max of about 135 W output but 385 W input (assuming 1.0 PF), which is insane.

To answer your question sort of directly: the stated input is poor information. It should include amperage for both stated voltages, and even in the worst case the implied wattage looks really high, so it would be best to plug into a device such as Joed suggested, which will show the actual PF and the real wattage used.

But to answer further: the actual input watts should stay close to the same, so if the voltage doubles, the amperage drops to roughly half. I say roughly because the PF and efficiency can change a bit at different voltages.

If you have a laptop PSU really pulling the specs as listed at 1.0 PF, you need to get something else. That is ridiculous. My hotrod Dell Precision doesn't do that badly, and it has a 2 lb PSU that looks like a cinder block.

03-19-2019, 02:55 AM   #6
Member

Join Date: Mar 2016
Location: Melbourne, Australia
Posts: 1,194
Rewards Points: 2,336

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

Quote:
 Originally Posted by rockman413 I got confused by this question: So I have a laptop, on the power adapter it says: Input: 110v-240v, 3.5 A output: 12V 11.2 A
Is this Power Supply one that came with the Laptop or is it a replacement?

According to http://energyusecalculator.com/electricity_laptop.htm,
"The power consumption of a laptop depends on the screen size, typically you will find power consumption as low as 20 watts and up to 100 watts when running off the battery. When charging the laptop battery power consumption will increase 10 to 20 percent, we estimate that 60 watts is average power consumption for a 14-15 inch laptop when plugged in."

All the Amp values given are Maxima.
While the output of the Power Supply can supply up to 11.2 A at 12 V, it may never need to do so. (What is specified on the "Name Plate" on the Laptop?)

The device concerned is a Switch Mode Power Supply (SMPS) and, when under a significant load within their ratings, these devices are highly efficient. A well designed example, such as those supplied with Laptops made by reputable manufacturers, should have an efficiency under these conditions of well over 90%.

From this you should see that, even if fully loaded and supplying the "rated" 134.4 watts, the input current should not exceed 1.24 A when connected to 120 V, since
134.4 W / 120 V = 1.12 A
and, if the efficiency of the SMPS is "only" 90%, the input current will not exceed
1.12 A / 0.9 = 1.24 A

(On 230 V, the current would be only 0.65 A under these conditions.)
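Those two steps can be sketched in a few lines of Python. The 90% efficiency figure is an estimate of a typical well-designed SMPS, not a measured value for this adapter:

```python
# Steady-state input current at full rated load,
# assuming 90% efficiency (an estimate, not measured).
rated_output_w = 12.0 * 11.2  # 134.4 W, from the adapter's output label
efficiency = 0.9

for mains_v in (120.0, 230.0):
    ideal_a = rated_output_w / mains_v   # lossless (100% efficient) case
    actual_a = ideal_a / efficiency      # accounting for supply losses
    print(f"{mains_v:.0f} V mains: {actual_a:.2f} A input")
# 120 V mains: 1.24 A input
# 230 V mains: 0.65 A input
```

Both figures sit well below the 3.5 A on the label, which is the surge/worst-case rating discussed next.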

However, at "switch-on" there will be a "surge" current as the input capacitor in the SMPS charges via a full-wave rectifier from the "mains", and this current will be higher when the SMPS is first connected to a higher voltage.
Hence the 3.5 A input rating.

A total dissipation of 135 W is quite large, and even 60 W is not insignificant.
(Have you felt the heat from a 60 W incandescent lamp?)
It is for this reason that the underside of a laptop usually becomes warm, so ventilation is required in that region!

The SMPS may become slightly warm but, if its efficiency is above 90%, it should never become appreciably warm.

Last edited by FrodoOne; 03-19-2019 at 03:01 AM.

03-19-2019, 03:12 AM   #7
Member

Join Date: Jul 2009
Posts: 586
Rewards Points: 926

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

^^^ Just FYI, you can throw that power-consumption estimate out the window if you're working with a hard-running laptop. Top-level laptops run real electronics inside and require some power. My brick is rated to 400 W and has been tested to nearly that.

However, my output is rated at far more than 135 W; I think 360 W, IIRC.

03-19-2019, 06:55 AM   #8
Member

Join Date: Jan 2012
Location: IL
Posts: 1,396
Rewards Points: 1,234

## Calculating Watts using Voltage X Amps? How about different Voltage

Those ratings are ranges. Power out will be voltage x current as measured, not as labeled. It will vary based on load at any moment.

Input wattage will be the output wattage plus the conversion losses of the power supply electronics. These supplies are typically in the 95%-efficient range, so if the laptop draws 100 W you'll put roughly 105 W in.

If you double the voltage on the input, input current will drop to 1/2 so that wattage “In” stays in step with output (plus losses).
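A minimal sketch of that input/output relationship (the 95% efficiency is the typical figure mentioned above, an assumption rather than a measurement):

```python
def input_power_w(output_w, efficiency=0.95):
    """Mains power drawn for a given output power.
    Efficiency is an assumed typical value for a modern SMPS."""
    return output_w / efficiency

def input_current_a(output_w, mains_v, efficiency=0.95):
    """Mains current at a given voltage; doubling the voltage halves it."""
    return input_power_w(output_w, efficiency) / mains_v

print(round(input_power_w(100.0), 1))           # 105.3 W in for 100 W out
print(round(input_current_a(100.0, 120.0), 2))  # 0.88 A at 120 V
print(round(input_current_a(100.0, 240.0), 2))  # 0.44 A at 240 V
```

Note the last two lines: same wattage in and out, but the current halves when the voltage doubles.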


03-19-2019, 06:37 PM   #9
Member

Join Date: Oct 2010
Location: Brisbane, Australia.
Posts: 5,178
Rewards Points: 7,316

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

Quote:
 Originally Posted by rockman413 Hi there, I got confused by this question: So I have a laptop, on the power adapter it says: Input: 110v-240v, 3.5 A output: 12V 11.2 A 1. In order to calculate how many amps it's using on my lines, I guess I should look at the input amp, not the output amp, am I right? 2. I want to calculate the watts, should I multiply the Voltage 110V X 3.5A? How about if I go to another country which uses 220V? Then the watts become 220V X 3.5A? I suppose the unit should use the same wattage no matter which country it's used in.

1. Yes and no. 3.5 A is the highest possible figure, under the worst possible circumstances, such as 110 V input and taking into account the start-up surge. In reality it will be much less, probably half that figure.

2. Should I multiply the voltage, 110 V x 3.5 A? YES.

If I go to another country which uses 220 V, do the watts become 220 V x 3.5 A? NO. It becomes 385 W (110 x 3.5) at 220 V:
385 W / 220 V = 1.75 A.
Still 385 W, but at half the current.

03-19-2019, 11:05 PM   #10
Member

Join Date: Mar 2016
Location: Melbourne, Australia
Posts: 1,194
Rewards Points: 2,336

## Re: Calculating Watts using Voltage X Amps? How about different Voltage

Quote:
 Originally Posted by dmxtothemax Should I multiply the voltage, 110 V x 3.5 A? YES. If I go to another country which uses 220 V, do the watts become 220 V x 3.5 A? NO. It becomes 385 W (110 x 3.5) at 220 V: 385 W / 220 V = 1.75 A. Still 385 W, but at half the current.
Well, no!

As you wrote,
"3.5 A is the highest possible figure, under the worst possible circumstances, such as 110 V input and taking into account the start-up surge. In reality it will be much less, probably half that figure."

The 3.5 A figure is the highest possible under "surge" conditions - which will be when the highest (allowable) voltage (240 V) is applied at "switch-on".
(Almost certainly, the SMPS has some input current limiting built into it.)

This current will flow only very briefly - less than 1 half-cycle, no more than 10 ms.

If the SMPS is run at its maximum capacity (134.4 W) on 220 V and is only 90% efficient, the input current (after the initial surge) will be
134.4 W / 220 V / 0.9 = 0.68 A
