DIY Home Improvement Forum
1 - 9 of 9 Posts

·
Registered
Joined
·
46 Posts
Discussion Starter · #1 ·
Hi. I'm originally from Europe, now in Canada, and something is puzzling me!
In Europe, a TV or VCR must first be physically switched on via a button on the unit before it can be operated by remote. When put on standby, an LED (usually red) glows on the unit, indicating that it is in standby and still drawing about 35% (I think I read) of the "ON" power. Since moving here, I notice there is rarely a physical "click" on/off switch on the unit; it is powered up and down solely by IR remote. Can anyone tell me whether this is a better, less energy-consuming method than the European type mentioned above, or is it simply the same, but because we don't see a light telling us power is being consumed, we assume that when it's off, it's off?
I'd really like to hear your thoughts and knowledge on this one, as it's still bugging me! :notworthy:
 

·
DIYer
Joined
·
910 Posts
In a well designed product, the power usage from remote standby is very low, about a watt or two.

It is a real thing, but people often go a little overboard with this "power vampire" stuff.

Unless you are living off grid, it's probably not worth paying attention to.
 

·
Super Moderator
Joined
·
22,530 Posts
Turning off the VCR with a switch would mean the clock needs resetting and all your preprogrammed recordings would be lost. The power used when 'off' is far less than 35% of the on power.
 

·
Registered
Joined
·
1,680 Posts
Newer electronics that are labeled "Energy Star" must conform to standards for conservation. If your equipment is more than 10 years old, it may not be compliant. There's not much you can do about that unless you want to replace it.
Don't unplug, as this will (as stated previously) cause memory to be lost.

If I were you, I would focus more on conserving energy by changing incandescent light bulbs to CFLs. They consume about 1/4 as much power as incandescents but provide the same amount of light.
Don't leave your computer running when you're not going to be using it for an hour or more. Mine draws about 500 W, and that adds up over time.
There are many other ways to conserve. I wouldn't worry too much about small electronics like the TV, VCR, etc., except that you shouldn't leave them on when no one is using them.
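To put a rough number on the bulb-swap suggestion above, here is a back-of-the-envelope sketch. The wattages, hours per day, and electricity rate are my own illustrative assumptions, not figures from this thread; plug in your own.

```python
# Hypothetical figures: 60 W incandescent replaced by a 15 W CFL
# (roughly the "1/4 as much power" mentioned above),
# run 4 hours/day at $0.12 per kWh.
incandescent_w = 60
cfl_w = 15
hours_per_day = 4
rate_per_kwh = 0.12

# Energy saved over a year, converted from watt-hours to kWh
saved_kwh_per_year = (incandescent_w - cfl_w) * hours_per_day * 365 / 1000
saved_dollars_per_year = saved_kwh_per_year * rate_per_kwh

print(round(saved_dollars_per_year, 2))  # → 7.88 (dollars per bulb per year)
```

Not a fortune per bulb, but multiplied across a whole house it dwarfs a couple of watts of standby draw.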

FW
 

·
Registered
Joined
·
86 Posts
No, a TV on standby doesn't use 35% of the wattage it would use when on. (To the best of my understanding, some really old CRTs had pre-heaters that kept the tube warmed up and ready to go at a moment's notice, but in later CRTs, electronics were developed that minimized the time a cold tube took to deliver a picture. LCDs obviously don't have this issue and would only use a couple of watts in standby mode.)

A TV or stereo on standby should use no more than a few watts each, about as much as a night light bulb. That can add up though. Use a switched power strip for everything on the entertainment center except the VCR and the DVR/Tivo, and you can save a few watts. You can use one for the microwave, too, if you don't rely on its clock and it's not inconvenient to switch on and off. I don't think you'll save much money, but it still feels good saving a tiny bit of energy that otherwise does nothing useful.
 

·
Banned
Joined
·
5,990 Posts
You can check it yourself with your house's electric meter.

http://en.wikipedia.org/wiki/File:Electrical_meter.jpg

"The amount of energy represented by one revolution of the disc is denoted by the symbol Kh, which is given in units of watt-hours per revolution. The value 7.2 is commonly seen. Using the value of Kh, one can determine their power consumption at any given time by timing the disc with a stopwatch. If the time in seconds taken by the disc to complete one revolution is t, then the power in watts is P = 3600 × Kh / t. For example, if Kh = 7.2, as above, and one revolution took place in 14.4 seconds, the power is 1800 watts. This method can be used to determine the power consumption of household devices by switching them on one by one."
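The arithmetic in the quote above can be sketched as a small calculation (the function name is mine; the Kh = 7.2 and t = 14.4 s figures come from the quote):

```python
def power_watts(kh: float, seconds_per_rev: float) -> float:
    """Instantaneous power read off a spinning-disc meter.

    kh: watt-hours per disc revolution (stamped on the meter face).
    seconds_per_rev: stopwatch time for one full revolution.
    Derivation: Kh Wh/rev * (3600 s/h / t s/rev) = 3600 * Kh / t watts.
    """
    return 3600 * kh / seconds_per_rev

# Example from the quote: Kh = 7.2, one revolution in 14.4 seconds
print(power_watts(7.2, 14.4))  # → 1800.0
```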
 

·
Registered
Joined
·
86 Posts
Using that method to check whether the TV is using 2 watts or 5 would take a long time, and be a powerfully boring use of time, would it not? (I'd think that's more useful for things like the refrigerator or the central AC.)

What about a plug-in meter like a Kill-A-Watt or Watts-Up?

They're really easy to use. I have one of the above (I can't remember which at the moment). The only problem is that it doesn't seem reliable for loads under 10 watts or so, like the parasitic or "vampire" loads of various electronics. Measuring a 40, 60, or 100 watt light bulb, it's dead-on. Therefore, to check "vampire" loads, I use an extension cord to power a 40 watt lamp and plug the device in alongside it. (Subtract the light bulb wattage from the readout to get the actual watts used by the device.)
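That baseline-subtraction trick is just one line of arithmetic. The readings below are hypothetical numbers of my own for illustration, not measurements from this thread:

```python
# Hypothetical meter readings: 42.5 W with the 40 W bulb plus the
# standby device plugged in together; the bulb alone reads 40.0 W.
# Adding the known bulb lifts the total into the meter's reliable
# range; subtracting it back out isolates the small standby load.
total_w = 42.5
bulb_w = 40.0

device_w = total_w - bulb_w
print(device_w)  # → 2.5 watts of standby draw
```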
 