Curbing Energy Use In Appliances That Are Off

KarmaOverDogma writes "The New York Times has an interesting piece on the slow but steady movement to reduce the power drain of appliances that are never truly turned off when they are powered down. In the typical house that's enough to light a 100-watt light bulb 24/7, according to Lawrence Berkeley National Laboratory, a research arm of the Energy Department. In the United States alone, over $1 billion per year is spent powering devices such as TVs, VCRs, computers, and chargers while they are 'off.' Called 'vampires' and 'wall-warts' by energy experts, these devices have prompted growing support for industry-wide standards, which would require manufacturers to build appliances with significantly lower consumption when not in use."
This discussion has been archived. No new comments can be posted.


  • by jacksonai ( 604950 ) <taladon@gmail.com> on Friday November 18, 2005 @10:51PM (#14068398) Homepage
    Seriously, how low can they make the power consumption without significantly raising the price of the item? It seems to me that with Energy Star, eco-friendliness should already be built into the stuff we buy.
    • by l2718 ( 514756 ) on Friday November 18, 2005 @10:56PM (#14068416)
      Easy solution: why does every *****ing appliance need to tell me what time it is?
    • by pla ( 258480 ) on Friday November 18, 2005 @11:06PM (#14068457) Journal
      Seriously, how low can they make the power consumption without raising the price of the item significantly?

      How about "very nearly zero"? Ideally, an "off" device would draw zero watts, but I realize we expect our toys to respond at a moment's notice, and that takes some electricity.

      My TV, when off, draws 7 watts. That presumably lets it remember its settings and watch for activity from the remote control. Those two tasks, however, should draw in the low milliwatts, certainly not more than a full watt.

      Printers also tend to have a very high idle current draw (and by idle I mean cold and in standby, not just "not printing"). 20-25W seems common for that - more than the total I use for actively lighting my house under normal conditions (assuming three CF lightbulbs at once, from 5 to 9 watts each).


      Of course, I think we'd do a lot better to worry about the active draw of our appliances. For example, the humble 19" box-fan draws a whopping 150W on high. With only a tiny increase in cost, that can drop by a factor of three, yet no one cares because no one realizes what a massive power sucker they have sitting happily humming in the window.
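      A quick back-of-the-envelope on what those standby draws cost per year (a minimal sketch; the flat 10 cents/kWh rate is an assumption, not a figure from this thread):

```python
# Annual energy and cost of a constant standby draw.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost(watts, cents_per_kwh=10):
    """Return (kWh per year, dollars per year) for a constant draw."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh, kwh * cents_per_kwh / 100

for name, w in [("TV standby", 7), ("printer standby", 22)]:
    kwh, dollars = annual_cost(w)
    print(f"{name}: {w} W -> {kwh:.0f} kWh/yr, ${dollars:.2f}/yr")
```

      So the 7 W TV is roughly $6/year, and a 20-25 W printer closer to $20/year, which is why the printer is the one worth putting on a switched strip.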
      • actually, when almost all modern non-flatscreen TVs are turned off, they're basically half-on, so that they will start up faster. something about keeping the tube inside warmed up, I think.
        • I thought it was interesting that in England TVs have a true off setting as well. Basically a real switch on the front of the TV that turns it totally off. I got caught quite a few times by turning the TV off by the switch on the front - and then the remotes wouldn't turn it back on.

          Totally unlike how American TVs tend to work.
          • by jonwil ( 467024 )
            The same is true of australian TVs.

            All the TVs in this house have off buttons on the front that power them down so that all they do is remember settings, plus a standby mode on the remote control.

            The Foxtel (satellite TV) box is always on and sucking juice.
            There is an "off" mode, but all that really does is shut down the video output; it's still awake and listening to the satellite (so it can download firmware updates, receive encryption keys for the channels you are subscribed to, and so on, as well as notify the centra
      • I've been playing with X10. Wonder how much power an appliance module draws? For things like printers at least, turning it off completely and allowing it to still respond seems possible.

        For a TV though? Even with my little Packard Bell IR receiver in the living room to let the server know I'm trying to turn it on, it's just not possible (even allowing for reduced response time). TVs generally don't have a serial port through which they can be turned on, after all.

      • My Canon is good (Score:3, Informative)

        Laser printers take a lot of power when standing by, like copy machines. They keep the innards partially warmed up for fast response.

        But my Canon inkjet (Pixma 8500) is fantastic. On the Kill-A-Watt you can see that within a minute of printing, it drops to less than 1W consumption. Measuring it over time (the Kill-A-Watt doesn't measure less than 1W instantaneous) says it takes about 150mW in this standby mode. That's great. I suppose 'off' takes even less. I have mine set to turn off after 20 minutes. Honest
      • My TV, when off, draws 7 watts. That presumeably lets it remember its settings and watch for activity from the remote control. Those two tasks, however, should draw in the low milliwatts, certainly not more than a full watt.

        If it's not an LCD, then there's an 'electron gun' at the back of the tube. It needs to be hot enough so that the electrons jump off it, and they can be formed into a beam that can scan the picture out 50-60 times a second.

        If they didn't keep that hot, then it would take a minute or so to warm up and you'd have to wait.

        • If it's not an LCD, then there's an 'electron gun' at the back of the tube. It needs to be hot enough so that the electrons jump off it, and they can be formed into a beam that can scan the picture out 50-60 times a second.

          If they didn't keep that hot, then it would take a minute or so to warm up and you'd have to wait. It has to be quite hot, hundreds of degrees, but it's in a vacuum, so it doesn't take very much to keep it up to temp, just a few watts.


          How come my CRT monitors (which I turn on and off with
        • I don't believe modern TVs do this.

          It used to be common years ago, though. My family had a Panasonic TV that would INSTANTLY display a picture when powered on. No warmup time! If you looked through the vent slots when it was off, you could see the CRT cathode heaters glowing very softly. They glowed dimmer than they did when the TV was on, but they stayed warm enough for an instant image.

          The TV had a "vacation" switch on the back that acted as an on/off switch for this feature. When in "vacation" mode, the
  • Maybe... (Score:2, Funny)

    by Pao|o ( 92817 )
    People should unplug their appliances? Switch the main circuit breakers for a total stop of consumption...

    Heck maybe they should buy Macs with better performance per watt. ;)
  • I've read that many VCRs, DVD players, etc. use as much electricity when "off" as they do when in use, with the difference being as little as the electricity used by the motors that actually spin the DVD or move the tape.

    That is just lazy design and very wasteful.

    Some things like a Tivo of course need to remain "on" to record upcoming shows, but even then should be in a deep sleep until needed. However, that is not the case. They sit there, actively sucking down juice 24/7/365.
    • by Spoke ( 6112 ) on Friday November 18, 2005 @11:02PM (#14068442)
      I used to have a digital cable box which sucked down 30-45w all the time (or something, don't have it anymore, ditched it for normal cable). On/off, didn't make a difference. That thing was always hot.

      I've got plenty of wall-warts which suck power even when nothing is plugged into them, but it's a PITA to unplug them. If the power strips they were plugged into didn't have other electronics plugged in, it'd be easy enough to hit that switch, but who wants a power strip or switch on every single wall-wart they have?

      Replacing the power supplies in my PCs with high-efficiency units from Seasonic made a noticeable difference. Power draw was reduced 20-30% across the board, which is nice.

      The charger for my Samsung A670 cell phone is the best: it doesn't use any power when plugged in without the phone. It's so light and small, it doesn't have your typical AC/DC converter in there; I'm not sure how they convert wall power to DC to charge it.
      • I don't understand what a cable box needs 30-45W for, anyway. My entire computer -- display, disks, RAM, processor, etc. -- runs on 20W with the screen dimmed a bit, and it's a desktop-replacement laptop. I'm a physicist, not an engineer, so I may be missing something -- but what the hell does a cable box need that much power for?
    • I've read that many VCR's, DVD's, etc. use as much electricity when "off" as they do when in use, with the difference being as little as the amount of electricity used by the electric motors actually used to spin the DVD or move the tape.

      That's not USUALLY the case. The culprits tend to be the ultra-cheap Chinese brands like "Apex", which I've wrestled with many times.

      IMHO, the problem is much more than just the power draw... In an enclosed A/V cabinet, those devices will heat up an enclosed space even wh

  • by cyclocommuter ( 762131 ) on Friday November 18, 2005 @10:54PM (#14068411)
    I have been noticing that more of the latest gadgets like HDTVs, subwoofers, amplifiers, DVD players, etc., now just go into standby mode instead of turning off. I could actually hear the transformer of my subwoofer humming even when it is supposed to be off... The only way to turn it completely off is to unplug the power cord.
  • Meter (Score:2, Insightful)

    by 42Penguins ( 861511 )
    Do I smell the need for a review of power meters that sit between an appliance and the wall outlet? What are some good ones that you've seen/used?
    • Re:Meter (Score:3, Informative)

      Thinkgeek.com just happens to have such a device: http://www.thinkgeek.com/gadgets/electronic/7657/ [thinkgeek.com]
      • I went to click on that to check and make sure of what it was, saying, "Yeah, when they actually have them in stock." When I got there, I said, "Son of a bitch, they're in stock!"

        Now if I only had the money to actually buy one...
    • by saskboy ( 600063 ) on Saturday November 19, 2005 @12:35AM (#14068872) Homepage Journal
      I got a P3 for my Dad, and have since borrowed it to meter nearly everything in my house just for fun. [Yeah I said fun, this is Slashdot and if I consider plugging things in to test for Wattage use as fun, that's fine.] I got the meter from eBay, it was about $30.

      Here are some of my results:
      Air conditioner wall unit: over 2 hours 17 minutes, 3.12 kWh; 1300W when running.

      Fridge from the 1970s, about 126W when running.

      Microwave from 1980, 888W when running

      Clock Radio from 1986, with the radio on and volume low, 0W measured.

      Computer 1800+ AMD, 3 IDE HD, and Radeon AIW 8500DV /Speakers/Monitor/Modems/Sony VCR, 13" TV, UPS, all typically used, but the computer running 24/7:
      185W approximately
      214 hours 38.62kWh
      1083 hours 188kWh
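      Those cumulative readings can be cross-checked by converting kWh over the elapsed hours back into an average draw (a quick sketch using the numbers exactly as posted):

```python
# Average power implied by a cumulative energy reading.
def avg_watts(kwh, hours):
    return kwh * 1000 / hours

print(round(avg_watts(38.62, 214)))   # 214 h, 38.62 kWh -> ~180 W
print(round(avg_watts(188, 1083)))    # 1083 h, 188 kWh  -> ~174 W
```

      Both land close to the "185W approximately" estimate above.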
  • I've always had the following question, and this thread seems the perfect place to get a response: Does anyone know how large the difference in power consumption is for a typical, relatively modern, let's say 100 watt stereo when it is turned off (or according to the article, idling), vs when it is turned on under vid/aux mode, but with the volume completely down? (this is assuming no discs are spinning etc).
  • by bergeron76 ( 176351 ) on Friday November 18, 2005 @10:55PM (#14068415) Homepage
    How about a switch in each room that turns off all the crap inside of it?

    I've audited my home for vampires, and I've since been desoldering LEDs and using X10 modules to turn off VCR clocks (I have both a watch and a cellphone - but thanks for the value-add of a clock on my microwave, coffee maker, VCR, phone, scale, etc.)

     
    • The problem is that there are many devices which have very useful functions while "off". PVRs are one obvious example, but what about IR-controlled components? The whole point is to be able to turn them on remotely. Since turning them on requires a detection circuit, that would require a standby mode. Turning off your coffeemaker because you don't need the clock is fine, but it won't turn itself on in the morning to make coffee if the outlet is switched off. Also, a higher energy standby mode means a faster
    • Desoldering LEDs? WTF? I'm sure that'll save you bundles...
    • by Waffle Iron ( 339739 ) on Saturday November 19, 2005 @12:41AM (#14068905)
      I've since been desoldering LEDs, and using X10 modules to turn off VCR clocks (I have both a watch and a cellphone - but thanks for the value-add of a clock on my microwave, coffee maker, VCR, phone, scale, etc.)

      The problem isn't the clock in the device. The clock logic and LED display use up a tiny fraction of one watt. The problem is the power supply.

      Take the microwave for example: people expect to be able to walk up and start punching in a cooking time without first having to push a huge mechanical power switch. (The manufacturer doesn't want to design in a costly extra power switch either.)

      This means that the electronics need to be powered on at all times. That wouldn't be a problem, but most appliances use a simple transformer to drive their power supplies. Inexpensive transformers are leaky even when they are supplying no current to the secondary, so the microwave's transformer is probably wasting a couple of watts at all times. The solution to the problem is a better power supply, not omitting the clock or desoldering LEDs.

      Some recent wall warts and power bricks that I've got weigh almost nothing and don't seem to get hot. I presume that they've put in switching power circuits and eliminated the 60Hz transformer altogether. Putting that kind of power supply in every appliance would go a long way towards solving this problem.

  • $4 a person? (Score:5, Interesting)

    by readin ( 838620 ) on Friday November 18, 2005 @10:57PM (#14068417)
    In the United States alone, over $1 billion per year is spent...

    The US has about 300 million people. So that's less than $4 per person per year, or 16 bucks for a family of 4. Doesn't seem worth worrying about to me. A family of 4 spends more than that on a single tank of gas for their car.
    • Re:$4 a person? (Score:2, Insightful)

      It's not just the individual cost, it's the collective cost on the environment and the over-taxing of an already strained electrical system.

      For instance, if every household in America replaced one normal light bulb with a compact fluorescent, it would have the same environmental impact as taking 1 million cars off the road.

      There are plenty of simple actions that in and of themselves don't matter, but when multiplied by the number of people involved, they can spiral out of control.

      If one person goes to the beach a
    • Re:$4 a person? (Score:5, Interesting)

      by FFFish ( 7567 ) on Friday November 18, 2005 @11:25PM (#14068538) Homepage
      No shit. Over the past couple of years I've replaced a furnace, dropping my natural gas usage by over 40%; moved to CFLs as lightbulbs burn out; installed a smart thermostat; wrapped my hot water tank; and am making plans to renovate the kitchen, replacing an inefficient refrigerator, stove (goin' gas!), and dishwasher.

      I'm hardly going to feel bad because my television, stereo, and a few wall-wart power adapters are the equivalent of leaving a lightbulb on. Good god, let's worry about something that really matters, like why this model year's cars use almost as much gas as they did back in 1965. We've gained only one mile per gallon in efficiency every five years?! WTF?
    • Re:$4 a person? (Score:5, Informative)

      by rtaylor ( 70602 ) on Friday November 18, 2005 @11:39PM (#14068590) Homepage
      Ahem.. That is $4 per person per year for the TV and VCR only (two devices).

      Microwave, washer, dryer, printer, phone, monitors, lamps, battery chargers (cell phone, laptop, etc.), cradles, etc. also take energy when in standby mode -- or what most people call off.

      They list 1,000 kWh per year per household, so at 7 cents per kWh that works out to closer to $70 per year. If it adds between $0.50 and $1 to the manufacturing cost to reduce that by 50%, it would probably be a net win for most devices plugged in for more than 6 months.
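      A rough break-even sketch for that last claim, using only the figures in this comment (the flat electricity rate is an approximation):

```python
# Halving 1000 kWh/yr of standby waste at 7 cents/kWh saves $35/yr,
# so an extra $0.50-$1 of manufacturing cost recoups within days.
standby_kwh = 1000
rate = 0.07                          # dollars per kWh
savings = standby_kwh * rate * 0.5   # $35.00 per year

for added_cost in (0.50, 1.00):
    days = added_cost / savings * 365
    print(f"${added_cost:.2f} extra pays back in {days:.1f} days")
```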
    • There are sillier programs. There was one energy program that tried to make ceiling fans more efficient, which is silly when the A/C systems in use consume far more power. Even the example in the article compared a ceiling fan with incandescent bulb fixtures against an A/C, when a simple improvement would be to use bulb-shaped fluorescents, which has little to do with the cooling efficiency of a fan.

      I'd say that it is something to look into anyways, though it would be nice to be assured that concer
    • Optimization (Score:2, Interesting)

      by everphilski ( 877346 )
      You're absolutely right. There are other very simple, very cheap things - for example, insulating lining around your windows and doors, double-paned windows, etc. - that will save you so much more. $4 a person is a piss in the lake in comparison. My take? It's a marketing scheme to get us to replace our existing appliances.
       
      -everphilski-
  • From the article summary (opting out of the free registration, I didn't read the article), the average household wastes 100 watts continuously on devices that are off. That's 2.4 kWh per day, or 876 kilowatt-hours per year. Assuming electricity costs 10 cents per kilowatt-hour, that would be $87.60 per year. Assuming there are 100 million households in the US, that would amount to 8.76 billion dollars. Does "over $1 billion" in this context actually mean "about $9 billion"?
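    Spelling out that arithmetic (the 10 cents/kWh rate and 100 million households are the same assumptions as above):

```python
# National cost of 100 W of continuous standby waste per household.
watts = 100
kwh_per_year = watts * 24 * 365 / 1000       # 876.0 kWh
cost_per_household = kwh_per_year * 0.10     # ~$87.60 at 10 cents/kWh
national = cost_per_household * 100e6        # ~$8.76 billion for 100M homes
print(kwh_per_year, cost_per_household, national)
```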
  • Surge Protectors (Score:2, Informative)

    by Rinnt ( 917105 )
    What about using surge protectors to make sure your stuff is "off"? That's what I use for my whole network - okay, so it's only two computers. But still, everything runs to a master switch. When stuff is done for the day I hit the kill switch... I would say this cuts the power to the devices since my LAN link lights all go dead.
  • Wind power (Score:3, Interesting)

    by saskboy ( 600063 ) on Friday November 18, 2005 @10:57PM (#14068419) Homepage Journal
    I'd like someone to invent small wind generation units, that people can mount on their roof, and it would provide power to "vampire devices" so that your TV, VCR, and other remote controlled devices can have power, but not use anything from the power grid until they are turned on.

    Solar power would work too, but I suspect wind would be more powerful with a small generator, but anyone is free to correct me if they know better.
    • Solar power would work too, but I suspect wind would be more powerful with a small generator,

      It depends on where you live. In the middle of the Sahara, solar power would be the way to go. West coast of Ireland: can't go wrong with wind power. Here in Melbourne a combination of the two works quite well. Communities on French Island, east of here, all use solar/wind power systems for their homes.

      I have considered building a simple DC supply out of a solar battery charger and a car battery. It should be good en

    • I'd like someone to invent small wind generation units, that people can mount on their roof, and it would provide power to "vampire devices" so that your TV, VCR, and other remote controlled devices can have power, but not use anything from the power grid until they are turned on.

      Until the wind stops. Then your VCR starts blinking 12:00

    • I'd like someone to invent small wind generation units, that people can mount on their roof, and it would provide power to "vampire devices" so that your TV, VCR, and other remote controlled devices can have power, but not use anything from the power grid until they are turned on.

      You can climb the 40 foot ladder, while I'll stay safely on the ground.

  • Kill A Watt (Score:5, Interesting)

    by DigitalRaptor ( 815681 ) on Friday November 18, 2005 @10:58PM (#14068420)
    I've long wanted to get a Kill A Watt Meter [google.com] to check the power consumption of the equipment I have. At $35 it's a bargain.

    With electricity prices skyrocketing I'm noticing which lights are on the most and replacing them with full spectrum compact fluorescents [fullspectr...utions.com] that have a really nice, white light but use about 1/5 the juice.

    • I've got a Kill-A-Watt; it's pretty useful. You'll find yourself going around measuring how much various components draw when on and off.

      My biggest beef with compact fluorescents is that some of them take a while to warm up and produce usable light. It's most noticeable with the ones I've got which have a plastic cover around the light to make it look like a "normal" light bulb (important for the spouse when the bulb is exposed).

      I wish I could find some that lit to near full brightness in a few seconds instead of the 15-30 they take to warm up.
      • Re:Kill A Watt (Score:3, Informative)

        by pla ( 258480 )
        I wish I could find some that lit to near full brightness in a few seconds instead of the 15-30 they take to warm up.

        In this case, "you get what you pay for".

        I have all CF bulbs in my house, and have noticed that the $5 3-packs from WallyWorld or Home Depot tend to take a second to start and then a long time to warm up, while the $7-each ones come on at full brightness just as fast as an incandescent.


        Personally, I'll deal with the 30-second delay. ;-)
      • I highly recommend Longstar bulbs. They turn on instantly and are at full brightness in less than five seconds. I have them all over my house and in the fixtures outside. They stand up pretty well to Seattle temperature fluctuations.
        http://www.soslightbulbs.com/shop/customer/home.php?cat=1038 [soslightbulbs.com]
        I prefer the 30W super daylight spectrum (6400K color temp), myself:
        http://www.soslightbulbs.com/shop/customer/product.php?productid=131150&cat=1038&page=2 [soslightbulbs.com]

        Also, for your lower-wattage needs, technology has evo
      • Why? When I turn on a light it is for one of two reasons. Either I'm passing through the room, or I'm going to spend time in it. If I'm passing through, I just need a nightlight to keep me from tripping on the cat. If I'm going to spend time in the room, by the time I sit down and arrange my books there is full light, which is when I need it.

        It would be nice if the cheap lamps were instant-on - I agree. However, 30 seconds is not a big deal.

    • I just use a multimeter to measure the current draw. The easiest way is to plug the device into an extension cord offset, so only one prong is actually plugged in. Then set your meter to current measurement and bridge the connection with the two multimeter probes. Current x voltage = watts. Saves you $35 if you've got a multimeter lying around or can borrow one.
      • current x potential (voltage) / power factor = power (Watts)

        You cannot measure power factor with a multimeter. A Kill-A-Watt measures both VA (what you measured) and power (Watts) and since it knows both, power factor too.

        Additionally, with your system if a device has a large surge current it might blow your meter. Or you might hurt yourself. Better to use an inductive current clamp (around only one wire, you cannot pass the entire power cord of line, load and ground through it), since it cannot overload in
        • current x potential x power_factor = power
          Power factor varies from 0 (pure reactance) to 1 (pure resistance) and is equal to the cosine of the difference between the current phase and voltage phase. Other than that minor goof, a very nice write-up.

          The most compelling reason for using the "Kill-A-Watt" over a multimeter is safety. We had someone at work who wanted to brew up a power line wattmeter, and I persuaded him that it would be cheaper and much safer to buy a ready made wattmeter. The project did ge
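          A small sketch of the corrected relationship (the voltage, current, and phase values below are made up for illustration, not measurements from this thread):

```python
import math

# Real power = apparent power * power factor,
# where power factor = cos(phase angle between voltage and current).
def real_power(v_rms, i_rms, phase_deg):
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

apparent_va = 120 * 0.5             # what multimeter-style V*I gives: 60 VA
watts = real_power(120, 0.5, 60.0)  # with a 60-degree shift: ~30 W
print(apparent_va, watts)
```

          This is why a meter that reads both VA and W can report power factor, while a bare multimeter cannot.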

    • I got a Kill A Watt about a month ago. Quite a cool device.

      I recently built a MythTV box, and since it is hooked up to the TV, I have no use for a graphics card (except for the fact the system won't boot without one... and the occasional debugging). So I took the Kill A Watt, plugged the computer into it, and swapped every card in one at a time until I found the one that gave me the lowest consumption. Of all the spare graphics cards, the one with the highest consumption was a Banshee, at about 20 or 25 watts
    • I highly recommend them.

      Lets me find out things like: my new Athlon X2 4200+ system takes less power than my old P4 3.0GHz (especially at idle). And my Athlon XP 1700+ before that takes less than either of them, even at idle when the others are at full bore.

      Sadly, it also tells me my P4 3.0GHz took 5W when "off", and 5W when in suspend to RAM (S3 standby), but my new Athlon 64 X2 takes 7W when "off" and 12W when in suspend to RAM (S3 standby). That'll cost me $7/year just to have this computer.

      I know the power consum
  • thats it? (Score:3, Insightful)

    by Joffy ( 905928 ) on Friday November 18, 2005 @10:58PM (#14068423)
    One 100-watt light bulb's worth? Making everyone use more efficient lights would save a lot more than that. Filament-based lights have got to go! My gadgets' LEDs are more than enough to light my room!!
  • power strips (Score:4, Interesting)

    by Anonymous Coward on Friday November 18, 2005 @10:59PM (#14068425)
    All of the power strips I see in Japan have switches next to each socket to turn off the socket for each individual appliance. Looks like a good solution to me.
    • Re:power strips (Score:2, Interesting)

      by entirety ( 909951 )
      Mod Parent down... I lived in Japan for many years... Switches next to sockets are not common throughout Japan. Those who think the Japanese are ahead of the US at everything have never been outside of Tokyo. The people are just as hosed as the US folks are, if not more so. I can tell you that they use a lot of fluorescent lights, though. But with all the neon signs, perhaps it is a break-even.

      Don't hate me... Love me for breaking your paradigms... Now, give me one of dem nickels!
  • Wall Wart Pet Peeve (Score:5, Interesting)

    by Ritz_Just_Ritz ( 883997 ) on Friday November 18, 2005 @11:03PM (#14068443)
    My pet peeve is the almost unlimited combination of wall wart connectors, polarity, output voltage, output current, etc. Wouldn't it be so much easier if there was some sort of standard wall wart power supply with a standard connector? If you're a gadget geek, you wind up with a rather unwieldy pile of these things in your home and many of them invariably wind up staying plugged in all the time. You can tell they're using energy since they're always a bit warm to the touch, even when the actual device that's supposed to use it isn't even plugged in. Once they standardize the form factor, perhaps they could actually enhance them to the point where quiescent energy usage is much lower.
  • by Rufus88 ( 748752 ) on Friday November 18, 2005 @11:15PM (#14068491)
    I unplug all my clocks when I'm not using them.
  • by digitaldc ( 879047 ) * on Friday November 18, 2005 @11:23PM (#14068528)
    "Energy efficiency experts say the answer lies instead in industry-wide standards, which would require manufacturers to build appliances with low consumption when in standby."

    Wouldn't it be nice if the 'energy experts' spent more time promoting the most obvious source of free power in (and out of) the world: solar power?

    Installing just a few solar roof shingles would easily off-set the cost of vampire appliances.
    see: http://www.oksolar.com/roof/ [oksolar.com]

    Not only do they generate power for your whole household, they end up paying for themselves when you produce more power than you use. The surplus is sent back to the power line and the energy company pays you.
    • by bluGill ( 862 ) on Friday November 18, 2005 @11:52PM (#14068660)

      If you live in southern California this is a good idea, paybacks in as little as 4 years. (Including government subsidies) If you live in MN like I do, you are looking at a 30 year payback if all goes well - which is longer than many roofs last. If you shovel the roof you might do better, but that is both dangerous (Don't fall off the roof), and harmful to the panels (which tend to be easily damaged when walked on).

      If you live in areas with a lot of sun you are stupid not to investigate this. Many people live in climates where they do not pay off.

    • by Latent Heat ( 558884 ) on Saturday November 19, 2005 @12:09AM (#14068757)
      Solar and energy conservation are not exclusive options. However, solar is about 5 dollars per peak watt, and that isn't even talking about finding a roofing contractor you can trust to put the panel up on the roof without introducing roof leaks. If you average 6 hours peak sun a day, you are talking 30 dollars per average watt for solar panels.

      Suppose a transformer wall wart uses 4 watts and you can replace it with a solid-state ferrite switcher that uses 0.5 watts. It would take nearly 100 dollars of solar panel to do the same thing.
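      Checking that figure (the $30-per-average-watt number is the poster's; naive 24/6 scaling of $5 per peak watt gives $20, so the difference presumably covers losses and derating):

```python
# Solar panel cost to offset the standby waste of one wall wart.
dollars_per_avg_watt = 30    # poster's figure for average (not peak) watts
watts_saved = 4 - 0.5        # 4 W transformer wart -> 0.5 W switcher
panel_cost = dollars_per_avg_watt * watts_saved
print(panel_cost)            # 105.0 -> "nearly 100 dollars"
```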

      Oh, and about back-feeding the line: you could probably get away with a small amount of back feed and just not tell anyone about it. If you put up a serious solar panel setup and plan to back-feed enough that the power company will notice, they get real, real huffy about that. In fact, they are supposed by law to buy back your power, but they really hate that. I was at an alternative energy fair where the local utility was touting their windmills (you pay extra for the bragging rights of getting "green power"). When I asked the utility dude about home solar panels and back feeds, he told me about all kinds of restrictions (two-meter arrangements where you pay more for incoming and get back less on outgoing), and when I mentioned the laws regulating buyback, the fellow got in my face and I thought I would get punched. So much for commitment to green power.

  • Picking a lamp in your house with a 100 watt bulb and never turning it on again - that seems like it would be much simpler.

    /yes, i know...

  • by Phat_Tony ( 661117 ) on Friday November 18, 2005 @11:28PM (#14068554)
    I remember during the early 90's, when the appliances that wouldn't turn off started to take over. The first appliances I remember that wouldn't turn off were VCRs from the mid 80's - they offered the feature of being impossible to turn off without unplugging them, and always helpfully flashing "12:00" on the display when plugged in. As my parents slowly replaced old appliances with new ones, I remember tech support phone calls from them:

    "How do I turn it off"
    "Press the 'power' button"
    "I did that, but there's still a light on."
    "That's the 'standby' light."
    "The what?"
    "That's the light that comes on to tell you that the appliance is off."
    "!!???"
    "I don't know why."
    "You mean one light or another is going to be on the entire time we own this appliance, unless we unplug it?"
    "Yep. Get used to it. Everything's that way now."

    It used to be that the power button was just a switch that did the same thing as unplugging it, to save you the inconvenience. They've now thoughtfully removed that feature; if you really want it OFF, you have to go back to unplugging it again.

    All of this coincided with a preponderance of clocks. I can see two engineers somewhere having a conversation:
    "Have you noticed how cheap digital clocks have gotten?"
    "Yeah! Let's put them in everything!"

    I remember when my neighbor's old analogue kitchen wall clock died, so he said he'd better shop for a new one. I asked him if he really needed another, because there were already digital clocks on his coffee machine, oven, range top, microwave, radio, and even toaster oven. Pretty much everything that used electricity in the kitchen except the refrigerator and mixer had their own LED clock.

    They still replaced the wall clock. It's the only one they looked at. It came as news to them that they already had six clocks in their kitchen. They'd never noticed them.

    Feature-creep didn't originate with software.

  • by SysKoll ( 48967 ) on Friday November 18, 2005 @11:39PM (#14068592)
    Quoteth the NYTese: In the typical house that's enough to light a 100-watt light bulb 24/7

    Translated into human language: in the typical house, that's 100 W.

    By definition, watts are independent of time. Joules are a quantity of energy, and 1 watt = 1 joule per second.

    It's sad to see that the tech section of one of the US's largest newspapers feels the need to dumb down its writing, or maybe just hires incompetent writers. Drool-proof paper cannot be far behind.

    On the plus side, no units in the article were compared to a football field or the Library of Congress, for once. That's progress, I suppose.
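For the curious, the arithmetic TFA is dancing around is trivial; here is a sketch (the electricity price is an assumption for illustration, not a figure from the article):

```python
# Power (watts) is a rate; energy is power accumulated over time.
STANDBY_WATTS = 100        # the "100-watt bulb, 24/7" figure from TFA
HOURS_PER_YEAR = 24 * 365  # 8760
PRICE_PER_KWH = 0.10       # assumed retail price in $/kWh -- check your own bill

energy_kwh = STANDBY_WATTS * HOURS_PER_YEAR / 1000  # W * h / 1000 = kWh
annual_cost = energy_kwh * PRICE_PER_KWH

print(f"{energy_kwh:.0f} kWh per year, roughly ${annual_cost:.2f} at 10 cents/kWh")
```

That works out to 876 kWh a year per household, which is how a billion dollars nationwide becomes plausible.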

  • welcome to 2001 (Score:4, Interesting)

    by YesIAmAScript ( 886271 ) on Saturday November 19, 2005 @12:01AM (#14068715)
    http://www.extremetech.com/article2/0,1558,87100,00.asp [extremetech.com]

    George Bush campaigned for this stuff back in the early days. I may not like the guy much, but he was right about this. Companies consistently make their products less power-efficient just to make them cheaper, because very, very few people pay attention to the efficiency of appliances. They save a few pennies on day 1 and give it back, and then some, every year.

    Energy Star has been incredibly effective. The cheapest refrigerator you can buy is at least 80% as efficient as the most efficient models. This is definitely not true of many other classes of devices (like lights!).

    Bush also inadvertently coined a great spoonerism about power-stealing vampires when talking about this initiative.
    • http://www.extremetech.com/article2/0,1558,87100,00.asp [extremetech.com]

      Bush also inadvertently coined a great spoonerism about power-stealing vampires when talking about this initiative.

      I don't get it -- what's the spoonerism? It's not "wall wart" is it? It can't be "wall wart" because that article is from July, 2001, and the term "wall wart" was around many years before that. Go search Google Groups -- I found two uses of the term as far back as 1989.

      And for what it's worth, a transformer that is inline (a

  • light pollution (Score:4, Interesting)

    by bcrowell ( 177657 ) on Saturday November 19, 2005 @12:15AM (#14068781) Homepage
    There's also a problem with light pollution in cities. Too many businesses leave bright lights on all night, which lights up the sky and makes it impossible to see the stars. Amateur astronomers have to drive farther and farther to get to dark skies. I'd imagine this is a much bigger waste of energy than people's VCRs always keeping an LED on. A few towns have passed light pollution ordinances.
  • What I would like to see is mandatory labelling. I want Staples and Best Buy to adopt some sort of standard energy impact sticker, like the nutrition labels the FDA requires for food. Ever bought a window air conditioning unit from Sears? All models are displayed with a big yellow sticker from the EPA listing their efficiency. I bought the one with the highest efficiency and was comfortable all summer long. What if home electronics were all displayed with something in the same vein? Let's make this into a p
  • by bigbigbison ( 104532 ) on Saturday November 19, 2005 @01:50AM (#14069164) Homepage
    I typically plug most of my stuff into powerstrips and, with the exception of the cable box, which takes forever to restart, I turn the powerstrips off every night before I go to bed. Most of these components I've had for at least 5 years and none of them have any problems working as soon as I turn the powerstrip back on. Even my receiver remembers all of its settings, and I've left its powerstrip turned off for weeks while on vacation.

    Of course this raises the question, "if they work fine after having no power sent to them, then why are they made to draw power even when they are off???" Can anyone answer that?

    I got in the habit when I lived in the dorms in college and could hear the stuff humming while I was trying to sleep and just kept doing it ever since. I suppose it is like these electronic thermostats that seem so popular. My family always just turned it down before the last person went to sleep at night...

    Realistically, there are tons of other places that waste much more electricity than appliances. Basically all the buildings at all the universities I've either studied or worked at leave lots of lights on 24/7. During holiday breaks, I've even tried to turn off the lights in the hallway of our dept. office only to come back the next day to find that someone has turned them back on and left them on. That isn't even mentioning the fact that the heat in our building can't be adjusted, so during the winter it is so hot we open the windows in the hall and turn the AC on in our offices (I just turn the "Fan" part of the AC unit on, since it is winter and cold out, but many others do actually put the AC on high)...
    or the fact that we are told not to turn off our office computers, or the people who live four blocks away but still seem to need to drive to the office...

    While I haven't done any calculations on it, I would imagine that fixing the heating in our department building would save more energy than all of the department members unplugging their electronics while not in use...
  • by cdn-programmer ( 468978 ) <<ten.cigolarret> <ta> <rret>> on Saturday November 19, 2005 @01:57AM (#14069190)
    I think I read somewhere that 60% of homes are heated by methane (CH4) (natural gas). Last I checked, the price of NatGas was $11.41; I expect this is at the Henry Hub and the units are MM-btu's (i.e. 1 million btu). The conversion factor between MM-btu and GJ is 1.054615. For some reason the "units" program shows this conversion factor as 1.0550559. This is close enough for the girls I go with.

    Since there are 3600 seconds in an hour an energy consumption of 1 kWh is equivalent to an energy consumption of 3600 kilojoules. Eg - for the units impaired we do this:

    kWh => k(W)(h) => k(J/s)(h) => k(J/s)(3600 s) => k(J)(3600)(s/s) => 3600 kJ = 3.6*10^6 J = 3.6e6 J (the latter being scientific notation)

    We know the price of NatGas is 11.41 for 1 MM-btu (10^6 btu = 1e6 btu)

    divide by 1.054615 (since 1 MM-btu is 1.054615 GJ) and we get about $10.82 = 1 GJ = 10^9 J = 1e9 J

    divide by 1000 to get: $0.0108 = 10^6 J = 1e6 J

    but: kWh = 3.6e6 J = 3.6(1e6 J), so 3.6 * 0.0108 = about 3.9 cents.

    This is a wholesale price for natgas. Wholesale prices for electricity are about 5 cents per kWh. Delivered prices are about 2x in both cases. Check your energy bills.
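The whole chain above can be reproduced in a few lines; the $11.41 Henry Hub quote is from the parent, and the Btu-to-joule factor is standard:

```python
BTU_IN_JOULES = 1055.06   # 1 Btu ~= 1055.06 J, so 1 MM-btu ~= 1.0546 GJ
price_per_mmbtu = 11.41   # quoted Henry Hub natgas price, $/MM-btu

gj_per_mmbtu = 1e6 * BTU_IN_JOULES / 1e9       # ~1.0546 GJ per MM-btu
price_per_gj = price_per_mmbtu / gj_per_mmbtu  # $/GJ
price_per_kwh = price_per_gj / 1000 * 3.6      # 1 kWh = 3.6 MJ = 3.6e-3 GJ

print(f"${price_per_gj:.2f}/GJ, i.e. {price_per_kwh * 100:.1f} cents per kWh-equivalent")
```

That lands in the same ballpark as the ~5 cent wholesale electricity figure, which is the "more or less a wash" point.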

    What this shows is that at present prices, the cost of energy from a source such as delivered natural gas is about the same as the cost of energy from electricity. When you consider that electricity can drive a heat pump (in effect a whole-house fridge running in reverse) at an overall thermal efficiency upwards of 300% if earth- or lake-coupled, it is actually cheaper and more energy efficient to heat our homes with electricity than with natural gas. Ditto with oil.
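The heat-pump comparison can be sketched the same way; the COP of 3 (the "300% thermal efficiency" figure above) is from the parent, while the retail prices are rough assumptions (about 2x wholesale, as suggested above):

```python
ELEC_RETAIL = 0.10  # assumed delivered electricity price, $/kWh
GAS_RETAIL = 0.08   # assumed delivered natgas price, $/kWh-equivalent
COP = 3.0           # earth/lake-coupled heat pump: 3 kWh of heat per kWh of electricity

# Resistive heat delivers 1 kWh of heat per kWh of electricity;
# a heat pump delivers COP kWh of heat per kWh of electricity.
cost_resistive = ELEC_RETAIL / 1.0
cost_heat_pump = ELEC_RETAIL / COP
cost_gas = GAS_RETAIL  # a furnace is close to 1:1, ignoring flue losses

print(f"gas: {cost_gas*100:.1f}c, resistive: {cost_resistive*100:.1f}c, "
      f"heat pump: {cost_heat_pump*100:.1f}c per kWh of heat")
```

Under these assumed prices the heat pump comes out well ahead, which is the point: electric heat is only a wash with gas when it's resistive.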

    Now a standard incandescent heater (light bulb) is upwards of 90% efficient. I.e., when you run your incandescent heater you leak about 10% or so of the energy in the visible spectrum, while the vast majority of the energy is retained as usable heat. Much of the visible light falls on walls, floors, furniture, people and pets, and most of this energy is also eventually salvaged as heat. Only the small portion that leaks out of windows is actually lost.

    Hence we can say that the heating efficiency of an incandescent light bulb is pretty close to 100% overall, so it really is pretty close to on par with natural gas and other energy sources such as oil.

    What this means is that the energy loss from appliances offsets the energy consumption from the furnace and the prices are so close it is more or less a wash. If we check the futures prices on Natural Gas come March we may find the old 100 watt light bulbs are cheaper.

    -------------

    What these calculations demonstrate is that in the winter heating season the only path to energy conservation is through attention to the building envelope. Energy efficient appliances accomplish next to nothing (in colloquial French Canadian this is loosely translated to SFA).

    However in the cooling season in summer the story is a lot different. These appliances add to the cooling load of the building during summer, and this load is very considerable. Still, if we pay attention to the building envelope we can eliminate a huge percentage of the energy that must be pushed out of the building against the thermal gradient by the HVAC system. Note that in this case the Delta-T for an air-coupled system might be sitting at say 40F while the Delta-T for an earth- or water-coupled system might only be 10F.

    So energy efficient appliances and lighting start to make a great deal of sense once we get the building envelope insulation up where it should be, which in Northern States and Canada is probably north of R50 in the walls and R70 in the ceilings. Then we can use the electricity saved to run a small earth- or water-coupled HVAC/heat pump system and in so doing more or less eliminate the dependency on natural gas and heating oil.

    However with the typical homes we live in - especially in the winter time - its a wash. Pay for your energy as electricity or pay for it as Nat Gas.
