# Do you think a tool running at 240v works better than at 120v?

8610 Views 39 Replies 16 Participants Last post by  micromind
My brother and I had an interesting conversation. I purchased an old DeWalt radial arm saw wired for 240V, and on the way home we got into discussing the fact that it can be wired to run at 120V as well. The motor nameplate says it draws 6.5 amps wired for 240V, or 13 amps wired for 120V. The saw comes wired for 240V from the factory.

On the long ride home came the question: will the motor run better at 240V or 120V? Person A said 13 amps at 120V is the same as 6.5 amps at 240V, so the motor will run the same either way. Person B said that with 240V you have two opposite legs that push and pull, so the motor runs better; why else would it come wired for 240V from the factory?

What are your thoughts, or can you explain how differently the motor will run wired either way? Thanks.
1 - 20 of 40 Posts
Person A is correct. Having it wired for 240V allows you to use smaller gauge wires.
Voltage drop is also less when wired at 240V.
It won't make a noticeable difference. It might not make any difference at all. Power out = efficiency x power in.

Correction: I²R loss will be 4x greater at 120V, so efficiency is better at 240V. This is why cross-country transmission lines run at very high voltages.
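A quick sanity check on that 4x figure. The run resistance below is an assumed round number, not from the thread; the point is that with the same supply wire (same R), halving the current quarters the I²R loss:

```python
# I^2 * R loss in the supply wiring: same wire (same R), but the
# 240 V connection draws half the current of the 120 V connection.
# The 0.1 ohm round-trip resistance is an assumed figure for illustration.
R_run = 0.1  # ohms, assumed round-trip resistance of the branch circuit

loss_120 = 13.0 ** 2 * R_run   # watts lost in the wire at 120 V
loss_240 = 6.5 ** 2 * R_run    # watts lost in the wire at 240 V

print(loss_120, loss_240, loss_120 / loss_240)  # the ratio is 4x
```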
I don't know with technical certainty, but I don't believe it matters. I think it just gives you a second wiring option for machines that use up a lot of juice and won't fit on a 120v 15 or 20A circuit. So instead of having to wire a 120v 30A circuit, you can do a 240v 15A.
His post said:

> Requires 13 amps at 120 volts

So for him a 15 amp circuit (14 ga) would work; a 20 amp circuit (12 ga) is better. It's about wire size: it's cheaper to run smaller-gauge wire than heavier gauge. I will always use 240V if given the option. Some tools require it, usually 3 HP or greater, with a thermal safety switch that shuts down on voltage drop. BUT... 240V breakers cost more and take up more slots in the panel. So maybe it's a toss-up on the cost? bill
Motor will perform the same both ways.

13 amps is a good-sized draw and will require larger supply wiring than the 240V option to run properly.

If it is to be used in the field, the 120V option would make more sense, but you would need to use at least a 12 gauge cord and try to plug into 20 amp circuits to be safe.
> It won't make a noticeable difference. It might not make any difference at all. Power out = efficiency x power in.
>
> Correction: I²R loss will be 4x greater at 120V, so efficiency is better at 240V. This is why cross-country transmission lines run at very high voltages.

The reason high voltage is used for transmission lines is that the higher the voltage, the lower the amperage, and the smaller the wire needed.
> Voltage drop is also less when wired at 240V.

Only if the same size wire is used, and the run would have to be a significant length to make a noticeable difference.
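A rough sketch of the voltage-drop point. Both figures here are illustrative assumptions, not from the thread: #12 copper at roughly 1.6 ohms per 1000 ft, and a 100 ft one-way run:

```python
# Voltage drop on the same wire at each voltage option.
# Assumed: #12 copper at ~1.6 ohms per 1000 ft, 100 ft one-way run.
ohms_per_ft = 1.6 / 1000
R = ohms_per_ft * 100 * 2          # round-trip resistance, ohms

drop_120 = 13.0 * R                # volts dropped at 120 V / 13 A
drop_240 = 6.5 * R                 # volts dropped at 240 V / 6.5 A

pct_120 = 100 * drop_120 / 120     # as a percentage of supply voltage
pct_240 = 100 * drop_240 / 240
print(f"{drop_120:.2f} V ({pct_120:.1f}%) vs {drop_240:.2f} V ({pct_240:.1f}%)")
```

The absolute drop halves at 240V, and as a fraction of the supply voltage it is four times smaller.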
Being a guy who owns a table saw that can be wired for 120 or 240, I can say that it definitely works better on 240. It seems to get up to speed faster, and it runs cooler.

Also, I would think that since the current drawn at 240 V is less, there would be less self-inductance in the windings, which may be why it is able to start faster. But that's just speculation on my part.

And by the way, 120 V "pushes and pulls" with a hot and neutral just like 240 V does with two hots. Current flows to AND FROM a neutral just like any hot wire. The idea that the neutral is some kind of dead, limp stand-around is one of the most widespread misconceptions about AC.
> The reason high voltage is used for transmission lines is that the higher the voltage, the lower the amperage, and the smaller the wire needed.

Riiiiiigggght... because of the I²R losses.
Cost is the reason. I²R losses can be lessened with bigger wire (lowering R) or higher voltage (lowering I); they can also be lessened by lowering the frequency.
In theory it doesn't make any difference. In real life, any dual voltage motor will start quicker, have more power, and run cooler on the higher voltage.

Especially if it's any distance from the source, like more than 40'.

Rob
Regarding whether the tool is more efficient at 120V or 240V:

> The reason high voltage is used for transmission lines is that the higher the voltage, the lower the amperage, and the smaller the wire needed.
The voltage drop, and consequently the efficiency factor, only come into play with long-distance transmission lines, where the distance and size of the wires matter. In a hand-held, plug-in tool there is no discernible difference. Whether it runs on 240V at 6.5 amps or 120V at 13 amps, the power output is the same: P = E × I! Additional proof is the fact that there is no compensation (for losses) at 120V. Don't drink and drive!!!
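The arithmetic in that post does check out; a trivial sketch:

```python
# P = E * I: input power drawn is the same at either voltage option.
p_120 = 120 * 13.0   # watts at 120 V
p_240 = 240 * 6.5    # watts at 240 V
print(p_120, p_240)  # both come out to 1560.0 W
```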
> no discernible difference

Then I'll go with a theoretical difference.
Higher current = higher operating temperature = shorter service life.

13A into 2 ohms is 2x the heat, and 2x the temperature rise above ambient, of 6.5A into 4 ohms.
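That arithmetic checks out, given the winding resistances the post assumes:

```python
# I^2 * R heat with the resistances assumed in the post above
# (2 ohms at 120 V, 4 ohms at 240 V; these are the poster's figures).
heat_120 = 13.0 ** 2 * 2    # 338 W
heat_240 = 6.5 ** 2 * 4     # 169 W
print(heat_120 / heat_240)  # 2x the heat at 120 V
```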

Nah, it's pretty clear that, like many others, he thinks of the grounded neutral as simply a sink for current, instead of the source that it really is on half of the cycle. For some reason, people think that because you ground one wire and equalize its potential relative to grounded objects, it magically becomes a dead wire.

The fact is, a neutral is a hot wire, and current flows to and from it, depending on which half of the cycle it's on.
Large commercial buildings use 347 volts for the fluorescent lighting, for the simple reason that more fixtures can be run from a #14 conductor fed from a 15 amp breaker than at 120 volts. This means major savings in wire and conduit sizes.
Horsepower has a direct relationship with wattage. So if you can achieve the same wattage on smaller-gauge wire by using a higher voltage, less power is lost in the supply conductors and more is available to the load.
This would be of greater advantage on a long feeder run vs. a short run.
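A sketch of the long-run vs. short-run point, again using an assumed ~1.6 ohm per 1000 ft wire (illustrative figures, not from the thread):

```python
# Watts lost in the supply conductors for the 120 V / 13 A case,
# at a few run lengths. Assumes ~1.6 ohms per 1000 ft of wire.
def wire_loss_w(amps, run_ft, ohms_per_kft=1.6):
    r = ohms_per_kft / 1000 * run_ft * 2   # round-trip resistance
    return amps ** 2 * r

for run_ft in (25, 100, 200):
    print(run_ft, "ft:", round(wire_loss_w(13.0, run_ft), 1), "W lost in the wire")
```

The loss scales linearly with run length, which is why the voltage choice matters more on a long feeder.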