Electricity consumption of computers

Discussion in 'hardware' started by razorboy, May 11, 2016.

  1. razorboy

    razorboy Registered Member

    Joined:
    Dec 26, 2010
    Posts:
    155
    Location:
    North
    It's my understanding that computers use little juice. However, our electricity bills have been rising, and the person who runs the kitchen here has accused me of being the culprit. I have a custom-built i5 desktop set up as a DAW, with fairly good horsepower and 8 GB of RAM. Am I wrong in thinking that it would not consume a noticeable amount of juice when running, and almost none in sleep mode?

    Thanks for any wisdom.
     
  2. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    6,024
    It's probably on the order of 50-100W while running, toward the high end or more when the CPU is maxed. If you have HDDs, each one adds ~5W. In sleep mode, maybe 5-10W total, or even less.
     
  3. amarildojr

    amarildojr Registered Member

    Joined:
    Aug 8, 2013
    Posts:
    1,962
    Location:
    Brasil
    Buy a "kill-a-watt" and use it to "kill-a-myth" :) See who's using more power. It's the only real and scientific way of doing it.
     
    Last edited: May 11, 2016
  4. anon

    anon Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    4,085
    +1
     
  5. funkydude

    funkydude Registered Member

    Joined:
    Apr 5, 2004
    Posts:
    6,851
    It depends on a few things, such as the wattage rating of the PSU in your computer. Also, if you're using Windows, make sure the power plan is not set to "High Performance" but to "Balanced". The former disables the automatic clock scaling of your CPU and forces it to run at full speed at all times, wasting power.

    When you state the bills are rising, do you mean just the price, or has the actual usage been rising? You might want to double-check that.
     
  6. Nebulus

    Nebulus Registered Member

    Joined:
    Jan 20, 2007
    Posts:
    1,582
    Location:
    European Union
    For instance, my desktop computer consumes somewhere between 110W and 250W (depending on the load), and that translates to 2.6 to 6 kilowatt-hours a day. I wouldn't call that little.
     
  7. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    The person who is running that kitchen is an idiot if he thinks a computer is the big power hog in a kitchen. Unless they run off natural gas, the ovens, grills, and stove tops consume many times more power. So do the fryers, toasters, overhead lights, heat lamps, dishwashers, and more. And note the refrigerators and freezers are nothing but electricity guzzlers - especially if opened many times each day! And then there are the lights in the dining area, signage, and outdoor (including parking lot) lights. Oh, and that doesn't even mention air conditioning.

    "Noticeable" is a relative term. But compared to the rest of the restaurant, it would be barely noticeable, and yes, almost none when in sleep mode.

    I recommend a good UPS with AVR. Most of the better UPS have an LCD read-out that will show you how much power the UPS is pulling from the wall, in the same manner as a kill-a-watt meter does. But at the same time, the UPS protects your computer and other attached devices (like your monitor, printer if ink-jet, and your network equipment) from damaging power anomalies. Even many lesser UPS without LCD panels come with software to monitor the UPS activity and power use.

    I would! At least compared to appliances in a kitchen - and especially compared to a commercial kitchen. I note that even in an average home, the refrigerator is typically the largest (by a BIG margin) energy consumer.

    Let's assume your computer (and monitor and network gear) consumes an average of 200W (and that is a high estimate, btw, even with an i7, 16GB of RAM and a hungry graphics card). As seen by this kWh calculator, 200W @ 24 hours per day = 4.8 kWh. As seen here, Hawaii has the highest energy costs in the US, averaging $0.3334 per kWh. So 4.8 x 0.3334 = $1.60 per day. That is chump-change compared to the other energy hogs in a kitchen - and note that assumes the computer is pulling an average of 200W for 24 hours a day and never sleeping (a very unlikely scenario).
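    The arithmetic above can be sketched in a few lines. `daily_cost` is a hypothetical helper of mine, not anything the posters used, and the 200W / $0.3334 figures are just this post's example values:

    ```python
    # Hypothetical helper illustrating the cost arithmetic in this post.
    # avg_watts: assumed constant average draw; rate_per_kwh: price per kWh.
    def daily_cost(avg_watts, rate_per_kwh, hours=24.0):
        kwh = avg_watts / 1000.0 * hours  # watts -> kilowatt-hours over the period
        return kwh * rate_per_kwh

    # 200 W around the clock at Hawaii's average $0.3334/kWh:
    print(round(daily_cost(200, 0.3334), 2))  # -> 1.6 (dollars per day)
    ```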

    @ Nebulus, it is highly unlikely your computer is awake and running most of the day either. And according to Google, the average cost of electricity in Europe is a lot less than Hawaii's. In 2013, the average in Europe was just €0.208, or ~$0.24, per kWh. So even if your computer is running 24/7 and using 6 kWh per day, that's just $1.44 (or €1.25) per day - again, chump-change compared to a commercial kitchen.

    Just FTR, this computer I am using now has an i5 6600 pushed to 3.9GHz, 2 x 8GB DDR4 RAM, an R7 370 2GB OC graphics card, and 2 SSDs. Also connected to my UPS are my cable modem, my wireless router, a 4-port Ethernet switch, AND two 24" LCD monitors! And according to my APC UPS display panel and the PowerChute software, while typing this, with my email client and 8 browser windows open and Pandora streaming AC/DC, I am consuming just 108 watts. When gaming, it will jump up, but I am assuming that computer in the restaurant is not used for gaming - at least not for hours on end.
     
  8. pandlouk

    pandlouk Registered Member

    Joined:
    Jul 15, 2007
    Posts:
    2,549
    razorboy, unless you give the full specs of your system - CPU, GPU(s), HDDs, and peripheral devices (monitor & sound system) - no one can answer your question.
    e.g.
    I have a NUC5PPYH that consumes 1W when sleeping, 6W in normal use, and 15W under load (no peripherals or monitor attached to it).
    And I have another system that consumes 3W asleep and 70-90W in normal use, but under heavy load it can go over 350W (games, video editing, etc.), plus another 25-30W for the monitor.

    So depending on the specs of the system and the use (DAW= Digital audio workstation ? ) it can add to the electricity bill... but even if you have a beast of a system (700-800W on load) the added consumption would be small compared to the other kitchen appliances as Bill stated above.

    Panagiotis
     
  9. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    First, he did say it was an i5 based computer, so ballpark consumption can easily be determined. But also, because this is about a computer in a restaurant, where we can assume there are large refrigerators, freezers, dishwashers, lighting requirements, and other power-hungry devices, we can answer his question. His kitchen manager does not know what he is talking about.

    Even a little kitchen grade blender will consume more power than that computer - no matter how you calculate it.

    2 HP = 1491.4 watts
    11.5A x 120V = 1380 watts
    11.5 x 120 x 0.7 PF = 966 watts (where PF = power factor)
    11.5 x 120 x 0.6 PF = 828 watts
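    The figures in that list follow the standard AC real-power relation, P = V × I × PF. A minimal sketch (the function name is mine, for illustration only):

    ```python
    # Real (working) power of an AC load: volts * amps * power factor.
    # PF = 1.0 gives the apparent power (VA), the upper bound.
    def real_power_watts(volts, amps, power_factor=1.0):
        return volts * amps * power_factor

    print(real_power_watts(120, 11.5))              # 1380.0 VA at unity PF
    print(round(real_power_watts(120, 11.5, 0.7)))  # -> 966 W
    print(round(real_power_watts(120, 11.5, 0.6)))  # -> 828 W
    ```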
     
  10. pandlouk

    pandlouk Registered Member

    Joined:
    Jul 15, 2007
    Posts:
    2,549
    No, he didn't. A custom-built i5 could be a system with no discrete GPU, consuming (depending on the i5 generation and model) from a total of 70-75W up to 110W, or it could be a 3-way SLI GTX 980 Ti configuration that consumes over 850W.

    Panagiotis
     
  11. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    Yes, he did (give a ballpark description). Let's be a little realistic here, okay? Is it realistic for a computer used in a restaurant kitchen to have $1800 worth of graphics cards? No!

    I already showed that a small blender consumes as much as, or more than, your unrealistic (for this scenario) 850W. A simple 4-slice bread toaster will nearly double that. Microwave ovens will too, and many commercial-grade MW ovens will use more than that.

    Also note he stated the computer is used as a DAW (I am assuming digital audio workstation) computer. These require a nice audio board and a decent amount of RAM, but any old graphics solution will do.

    And even if he did have multiple power hungry cards in that system, with a kitchen manager complaining about power costs, it is unlikely somebody is playing demanding games on this computer for hours at a time to raise the monthly electricity costs significantly.

    I am not disagreeing with you that total system specs would be helpful. I just disagree with your claim that no one can answer without them - not when we are talking about a commercial kitchen where there are many much higher power energy consumers eating lots of power all day (and in some cases, night too) long.

    And just for the record, the typical power factor for a 2 HP blender motor at maximum load would be .82, not .6. That means 11.5 x 120 x .82 = 1131.6 watts for that blender as a more realistic value, way more than your 850W unrealistic kitchen computer scenario.
     
  12. Mrkvonic

    Mrkvonic Linux Systems Expert

    Joined:
    May 9, 2005
    Posts:
    8,694
    My experience shows roughly 100-200W per desktop under normal use.
    Mrk
     
  13. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    Agreed. I think most people would REALLY be surprised how little their computers actually draw. Even when hard core gaming, maxing out the graphics card's power demands is a rare event. And then when it does happen, it is rarely longer than a second or two at a time. An even rarer event is to max out the power demands of the CPU, RAM and graphics card simultaneously as typically the tasks are handed off from one to the other. In fact, I have NEVER seen the power demands of all the components in a computer maxed out simultaneously in "real-world" scenarios - only during "synthetic" scenarios running benchmark programs. And running benchmark programs while a commercial kitchen is open is not a likely scenario either.
     
  14. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    6,024
    OK, so there are six boxes running here. Two I'm working with. The others are not doing much: LAN router running pfSense, file server, whatever. Total is 436W.
     
  15. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    436W with 6 computers? That makes sense to me since computers at idle don't draw much - and of course, less when sleeping.

    Another common misconception many have is believing their computers draw what their power supplies are rated at, and/or that the power supplies constantly deliver what they are rated at. Not so, either way. If the computer (motherboard, CPU, RAM, graphics, drives) needs 150W, it draws 150W from the PSU regardless of whether the PSU is a 250W supply or a 750W supply. And those supplies draw that 150W-worth from the wall regardless of their size too, plus another 10-20% wasted as heat due to inefficiency. So if both supplies are 80 PLUS Gold, for example, and rated ~90% efficient, both will pull about 167W from the wall.
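    The wall-draw point reduces to dividing the DC load by the supply's efficiency. A minimal sketch; the 150W load and 90% figure are just this post's example numbers:

    ```python
    # Wall draw = DC load / efficiency, regardless of the PSU's rated capacity.
    def wall_draw_watts(dc_load_watts, efficiency):
        return dc_load_watts / efficiency

    # A 150 W load on a ~90% efficient (80 PLUS Gold-class) supply,
    # whether the unit is rated 250 W or 750 W:
    print(round(wall_draw_watts(150, 0.90)))  # -> 167
    ```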
     
  16. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    4,046
    Location:
    USA
    Agreed. Most people don't realize this.

    My PC draws 100 watts idle and up to 600 on a 3D game. If you have a UPS with a display it should tell you how much power you are drawing through it.
     
  17. Jarmo P

    Jarmo P Registered Member

    Joined:
    Aug 27, 2005
    Posts:
    1,183
    Desktop computers are the biggest consumers of electricity in a modern household if running 24/7 - that is, if there is no electric heating, A/C, etc. Btw, to mirimir: W = J/s and is a unit of power. I think the values you give apply well to a modern gaming machine. Multiply by time and you get the energy consumption, which usually appears on bills as kWh.

    Typically, refrigerators and freezers each consume maybe half of that or less, and laptops even less. My 44-inch TV is an LED one and its consumption depends on the backlight brightness setting. I measured it but can't remember exactly now; it was probably 80W.

    And typically these days lighting is not much of a problem with LED/energy-saving lamps. I, for instance, keep the toilet room lamp running 24/7 - it is 5.5W - and 2 other 3W bulbs as well.

    So yes, if you are not a gamer, buy a laptop (even with a larger external display attached to it) and you will save on your bills.
     
  18. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    Nah! Sorry, but that is wrong. Just a little bit of homework with Google will show that refrigerators are the biggest household consumers - after air conditioning.

    A good information source is Michael Bluejay's Saving Electricity. Note the article, How much electricity does my refrigerator use?
    And unless you are running a "folding" or another such program 24/7, even if you don't allow your PC to sleep, it draws very little when idle. And while some people have disabled sleep mode, most let their monitor(s) sleep.

    I think it is important to note that computers have gotten much more efficient in recent years. 80-PLUS certification ensures at least 80% efficiency - and more importantly, 80% across the full range of expected loads. Most CPUs and GPUs are much more efficient than their predecessors. DDR4 is considerably more efficient than DDR3 which is more efficient than DDR2 and so on. Drives are more efficient, and even more helpful is more and more computers are going with SSDs.
     
  19. Alec

    Alec Registered Member

    Joined:
    Jun 8, 2004
    Posts:
    355
    Location:
    Dallas, TX
    My electric provider can calculate a rough estimation of usage by load type each month. It's pretty cool, I think they must be able to categorize remotely from various electrical load properties... resistance, capacitance, inductance, reactance, etc. Anyway, for the last 12 months, my residential usage has been: Cooling 42%, Refrigeration 23% (I have two refrigerators), Appliances 13%, Electronics 10%, Other 7%, Lighting 5%. I live in Texas, so that's why A/C is such a huge percentage; but I suspect it's accurate averaged over the year. I believe appliances would include things like washer, dryer, dishwasher, vacuum, etc... all of which would clearly consume more than a PC while running... but even averaged out over time periods where they are inactive they consume more than my household electronics. I leave 2 PCs running essentially 24/7, although they fall into sleep mode after 10 mins of inactivity... and they are probably only about 1/3 to 1/2 of my "Electronics" load given other devices such as TVs, DVRs, WiFi routers, game consoles, audio receivers/amps, etc. So, I would guess only 1.5 to 2.5% of my overall annual electrical bill can be attributed to each of the PCs.
     
  20. pandlouk

    pandlouk Registered Member

    Joined:
    Jul 15, 2007
    Posts:
    2,549
    Where did he say that the system is in the kitchen?
    To me it is unrealistic to have a desktop system placed in the kitchen of a restaurant... but you assume that it is real. The OP said that the bills have been rising; he did not say where the system is placed, nor anything about how it is used.
    That depends on where someone lives.
    Do not assume that the climate everywhere is similar to yours. In northern countries the refrigerator (if needed) won't consume that much, and air conditioners are not used at all.
    http://www.academia.edu/6688344/Fac...onsumption_of_household_refrigerator-freezers

    Panagiotis
     
    Last edited: May 13, 2016
  21. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    :( Okay, my mistake. He said "the person who runs the kitchen" which clearly indicates this computer is used in a restaurant or cooking facility. So is that really a distinction you want to make your stand on? o_O This is about the whole facility's power consumption. It does not matter if the computer is in the kitchen, behind the bar, in the manager's office or in the dining room!

    I feel you are just being argumentative now. :(

    Refrigerators are not typically put outdoors in the sun, or in the freezing cold. Typically they are kept in kitchens - regardless of the country. Kitchens get pretty warm. Refrigerator doors (especially in commercial kitchens) are opened frequently. So if anything, your Malaysian paper just emphasizes the point - except for air conditioning, refrigerators are the largest power consumers in most households.

    But again, this is not a household. It is a commercial cooking/eating facility, most likely with several refrigerators and freezers - perhaps even big industrial "walk-in" coolers and freezers. And sorry, but you don't know what you are talking about in terms of air conditioners in "northern countries". Even in Montreal, Canada, summer temps above 90°F (32°C) are not uncommon, so air conditioning is common in restaurants and many homes there. But even with no air conditioning, refrigerators are typically kept in environments humans feel comfortable in, and that is pretty much always much warmer than the inside of a refrigerator - so food must be cooled, or it goes bad.

    @ Alec - thanks for that breakdown. It is interesting how they break down appliances, separating out "refrigeration" from AC and other appliances. I don't see how they can categorize anyone's usage specifically without doing specific surveys of their household. The meter just does not care and does not report that sort of information. I am sure there are more sophisticated meters, but those would be too expensive to put on everyone's home. So they average and estimate, as you noted. But it is their job to anticipate usage effectively, and such information - for individual devices as well as for the average household - is well known and widely published. So I am sure their information is pretty accurate.

    You say you have two refrigerators. That is not uncommon and, sadly, the 2nd one is often an older one. Refrigerators made even 5 years ago are MUCH less efficient than those made today. Although I have two as well (the smaller one downstairs is used for drinks), I am sure most households have just one. But many of the folks I know who live in houses have a refrigerator in the kitchen and a separate "deep" freezer somewhere else - often in the garage. They, of course, are not opened as often, but it takes a long time for them to recover when they are.
     
  22. virion

    virion Registered Member

    Joined:
    Jan 13, 2013
    Posts:
    6
    Location:
    USA
    Perceptions can be tricky. It never occurred to me that razorboy's computer was located in a restaurant. I pictured "the person who runs the kitchen" as razorboy's wife (or other domestic partner). :D
     
  23. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    6,024
    I wasn't sure, myself. Because he never actually said "restaurant". But then, he also hasn't said that it's not a restaurant ;)
     
  24. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    2,265
    Location:
    Nebraska, USA
    Yes, perceptions can be tricky. It never occurred to me "the person who runs the kitchen" could be the wife or domestic partner in the household. Expressing it that way made me assume it was a commercial or institutional (like a hospital, nursing home, or school cafeteria) kitchen, and I took "the person" as someone in authority over, or responsible for paying, the bills for the facility - regardless of where this computer is physically located (kitchen, office, dining room or wherever). Also, with razorboy stating the computer was used as a DAW system, that further implied to me this computer was being used in something more "industrial" than a home kitchen.

    My apologies if I assumed wrong.

    Nevertheless, whether a busy restaurant kitchen or a one-bedroom apartment kitchen, unless a computer is "crunching" for a "Folding" for a cure for cancer or "BOINCing" for ET - that is, the system is being pushed to 100% utilization 24/7, no way is a desktop computer the biggest consumer of electricity.
     
  25. pandlouk

    pandlouk Registered Member

    Joined:
    Jul 15, 2007
    Posts:
    2,549
    @Bill_Bright
    I posted that document only because it has a refrigerator chart of consumption versus ambient temperature: 21°C -> 750W, 25°C -> 850W, 31°C -> 1150W.
    In cold places lots of people have a refrigerator and an outdoor chest freezer for that reason.

    Initially, I assumed that he was talking about rental rooms (in Italy they even rent beds, not entire rooms, to students) with a common kitchen and bathroom. And I have witnessed firsthand, multiple times, arguments like that (e.g. "your TV or stereo system or computer consumes too much") between university students and their housemates (if they split the electric bill), or between students and landlords if the rent for the room is all-inclusive (fixed price, bills included).

    Panagiotis
     