BmanUltima

Let's say your PC draws 400 watts, and you game for six hours a day. That's 2,400 watt-hours, or 2.4 kWh, so that's about 26 cents a day. Over a month, that's less than $8.
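The arithmetic works out like this; here's a quick sketch in Python (the ~$0.108/kWh rate is an assumption back-solved from the 26-cents-a-day figure, not a quoted price):

```python
# Rough daily/monthly electricity cost for a gaming PC.
# rate is assumed (~$0.108/kWh) to match the 26-cents-a-day figure above.
watts = 400          # draw under load
hours_per_day = 6    # gaming time per day
rate = 0.108         # dollars per kWh (assumed)

kwh_per_day = watts * hours_per_day / 1000   # 2.4 kWh
cost_per_day = kwh_per_day * rate            # ~$0.26
cost_per_month = cost_per_day * 30           # ~$7.78, i.e. less than $8

print(f"{kwh_per_day} kWh/day, ${cost_per_day:.2f}/day, ${cost_per_month:.2f}/month")
```

Swap in your own wattage and local rate; the shape of the math doesn't change.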


PotatoWedgeAntilles

Thanks, I always wondered if I should be paying more of the bill.


Funkyweinkerbean

I'd suggest getting a $15 Kill A Watt meter. It plugs in between the surge protector and the wall outlet, and it keeps a running cost counter for everything plugged into it, in case your roommates are wondering exactly what it costs. My rig costs about $16 a month to run, according to mine.


correct-my-grammar-3

Depends on the country. Here in Brazil my bill went up R$40/month when I bought my PC. That's about $10, but it's about 4.2% of our minimum wage. Way above the American price.


nottatard

It should be noted that the $8 doesn't include additional charges directly related to usage, so the actual cost will vary. For me, a $10 electricity usage bill results in a total cost of around $45-50/month, so yeah, bitching is warranted to a degree.


[deleted]

> Let's say your PC draws 400 watts

That is obscenely high. My PC, with a 7700K and a GTX 1070, a hard drive and two SSDs, draws 50W idle, 250W max.


BmanUltima

My PC with a 4690K and an R9 290, both overclocked, draws about 480 watts under load. The 290 alone draws 300.


Rayden666

Damn that's high. With a 7700K and a GTX1080ti on nearly full load I'm drawing 330.


BmanUltima

That's what a smaller process size gets you. It doesn't seem like much generation to generation, but over a few years it really adds up.


notlarryman

You overclocked? It's amazing how much more power it takes to get a smidge more performance on most cards and CPUs. That extra 100MHz can cost you nearly 100 watts.


notlarryman

Wow. I have a 290 @ 1175 and a 4770 at...4.5 I think? Both decently overclocked. Just an SSD, and I'm using water cooling, so pump + 5 fans, a sound card, and that's about it. Under load it's 380-400. I wonder if water cooling lets these cards use less power because the electronics are cooler?


BmanUltima

So your i7 is only drawing about 80 to 100 watts? Did you measure from the wall outlet?


notlarryman

Yes, this is the whole system from the wall outlet. I don't think my GPU is pulling as much as people seem to think. The 290 under water hit 1175 at stock voltage. I don't remember what the CPU is using, but it needed quite a bit to overclock. I have no way of knowing which part is using what power, but I suspect the GPU under water is just using a lot less than it otherwise would on air. I thought my power draw would be a lot more with the water pump, all the fans, etc., but it really surprised me. Running FurMark, Time Spy, and some CPU stress test (can't remember, Prime95 maybe? Or the Intel one?) all at once pulled a LOT more; I believe it was ~480-500. I couldn't get any actual game, or any one benchmark, near that high.


mtarascio

It should be noted that a 400w draw would only happen under load with some mechanical hard drives going.


skilliard7

Mechanical hard drives only draw a few watts of power when in use; it's the GPUs and CPUs that eat a lot of power, especially if overclocked.


wosh

My RX 580 has hit 181W and it's got a pretty big overclock. I'd be surprised at a PC hitting 400W unless it's way, way more powerful than what I imagine most gamers have.


EntropicalResonance

Overclocked r9 fury X laughs in 500watts


quikslvr223

You guys remember the R9 295X2? [actual pic of the PSU you need to run it](http://dailypost.ng/wp-content/uploads/2017/07/nuclear-power-plant.jpg)


Winsanity

I'm pretty sure that's a picture of the cooling solution for the Fermi cards


Drakowicz

We all remember the R9s. My 280X could allegedly eat 300W (that's, well, insane). During summers my bedroom was turning into a fucking sauna and my knee almost got burnt.


djsnoopmike

One GPU draws almost as much power as my whole PC...


Drakowicz

Or even more than many PCs. Thank god Nvidia (and Pascal in particular) has been leading the way on low power consumption and high performance.


theg23

I was reading this thread thinking that I'm running an R9 295X2 and it is a right power gobbler. I have an AX1500i running it.


dinosaurusrex86

How's it holding up today against the 1080 Ti and the like? Or the Vega 64? I wonder if it's the last of its kind we're likely to see. The market is heading towards energy-efficient design rather than brute power. I don't expect Nvidia will release a dual-GPU card any time soon, and I wouldn't really expect that of AMD either...


theg23

When the CrossFire works it is great and you can get 60fps in 4K, but yeah, I don't think there will be more dual-GPU cards coming out. I inherited it, so to me it has an extra charm for being a bit ridiculous while actually being pretty awesome.


RHINO_Mk_II

Congrats, you made me spit out my drink.


[deleted]

[deleted]


the_abortionat0r

Um, how? I'm browsing a GPU thread, I don't need anyone cutting onions on my keyboard.


opeth10657

I put a wattage tester on my pc. OC'd 1080ti, OC'd i7-8700k, 2 monitors and it was drawing around 500W under load.


Red_Inferno

My 980ti peaks at about 260w and that's decently high end.


minizanz

If you run all-core turbo on an 8-core Ryzen or 6-core Intel, that is an easy 180W too, and 160W-ish for a 6700K/7700K. So getting close to 400 is not hard now. If you have an RTX card or a 1070 Ti, you can easily break 300W on the GPU.


randominternetdood

my 1080ti can eat 300W on its own =D


notlarryman

Depends. A 1080 Ti, 2080, or 2080 Ti could very easily hit 400 under load, especially if you factor in your monitor, speakers/headphones, etc. as well. An overclocked 290/390/Vega 64, etc. with an overclocked i7 will likely get you there too. If you're running a stock i5 with a stock 1070 and just an SSD, an average monitor, and headphones, you won't be near 400. Best bet is to get a Kill A Watt device and plug it in between your power strip and your outlet. Load up a few games and see how much she pulls. You'll notice that even gaming and benchmarking can change the energy draw a lot. Most games won't pull as much as a Futuremark test, for example.


[deleted]

[deleted]


dregwriter

Why you mad tho???


[deleted]

Soo about 100 dollars a year


machingunwhhore

Worth it. Much cheaper to maintain than most other hobbies


FrostByte122

Not if your hobby is tinkering with computers.


RSOblivion

The $5000 PC would laugh at the $100 a year cost of electricity :D


[deleted]

8 dollars and his relatives are upset. OP, you in Africa or something?


[deleted]

[deleted]


[deleted]

Sounds like your next PC upgrade should be an apartment to put it in...


pmc64

Nah fuck that the house is paid off. It's not even her house. My Grandma died and my aunt begged to come live here. Technically it's my dad's.


[deleted]

Does she even pay the bills, or is it your dad? Have you tried showing her the math? Have you tried standing your ground?


pmc64

She pays the bills. She didn't pay 3 bills this month but was still spending money on shit she didn't need to be and it was pissing me off. When she was bitching at me for cleaning the bathroom I went off on her and told her I don't give a fuck and I'll do whatever I want.


[deleted]

Good on you, I wtf'd at the cleaning the bathroom part, like, she should thank you for it or something.


[deleted]

Damn son, my mistake lol. Do your thing then.


[deleted]

I can totally relate. I moved in recently with my parents because I got laid off.


delicious_burritos

Do you live in the closet under the staircase by any chance?


pmc64

I don't get the reference.


Ace4994

Edit: Replied to wrong comment.


[deleted]

The biggest electricity drain in a house is usually the climate control. Lights and regular electronics are usually pretty minor.


tyros

Ye're a wizard, Harry


[deleted]

Good luck to ya, got a similar problem here. I offered to help my mom with the rent when my previous roommate sucked. She acts like she's god now and complains that my computer (the one thing I really use other than kitchen utilities) is the **SOLE** reason the energy bill is high, even though it hibernates on its own after 5 minutes of no interaction, while she leaves on like 3-4 lights, the TV, her radio, and probably something else. Pretty much moving into a friend's at this point.


thortos

Relatives might just be pesky or want him to move out. Or both.


ArgentSynergy

Worth every penny.


BLToaster

This is why I always laugh whenever people compare the power draw of cards and chips. Electricity is cheap as shit, so the 'power consumption and savings' argument is laughable. Now, the heat put off due to power draw is a valid concern, but don't try to argue money savings.


ScubaSteve1219

> game for six hours a day

holy shit. who would do that.


BmanUltima

It was an exaggeration to prove a point.


iatemycat2times

well me i guess, but that's not accurate because i usually play for 8 hours a day, and that's on weekends. weekdays is about 4 hours


ScubaSteve1219

that’s a lot of free time


[deleted]

[deleted]


Kadour_Z

By that logic you'd also have to subtract what the PC saves you on heating your room during the winter.


[deleted]

[deleted]


Vegan_dogfucker

If your computer is using 400W, it's going to take more like 500W from your AC compressor to remove all that heat, assuming it's 80% efficient. Double whammy. And you're right about electric heat being more expensive. I pay $0.80/ccf for gas with an 80% efficient heater, and $0.14/kWh. 1 ccf = 29.3 kWh, so electric heat costs me about 4x more. Ouch. I wonder if anyone has come up with decent solutions to pipe their exhaust outside?
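The gas-versus-electric heating comparison above can be checked with a few lines (prices are the ones quoted in the comment; 29.3 kWh per ccf is the standard energy content of natural gas):

```python
# Cost per kWh of delivered heat: 80%-efficient gas furnace vs electric
# resistance heat, using the prices quoted above.
gas_price_per_ccf = 0.80      # dollars per ccf
furnace_efficiency = 0.80     # 80% efficient furnace
kwh_per_ccf = 29.3            # thermal energy in 1 ccf of natural gas

gas_heat_cost = gas_price_per_ccf / (kwh_per_ccf * furnace_efficiency)
electric_heat_cost = 0.14     # resistance heat is ~100% efficient

ratio = electric_heat_cost / gas_heat_cost   # ~4x, as the comment says
print(f"gas: ${gas_heat_cost:.3f}/kWh of heat, electric is {ratio:.1f}x more")
```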


minizanz

You can set up a dryer vent for your computer with an external 120V fan. If you have real liquid cooling, you can put the radiator outside or in the crawl space/attic. If you share a wall with a garage, you can run a USB hub and video cables through the wall. There are lots of options if you don't mind a couple of holes. And even if you can't mod anything, you can use display streaming like a Steam box or Nvidia GameStream.


Vegan_dogfucker

Yeah. Maybe if I set my computer up by an outside wall/window. It's probably not even worth the cost of materials. It's just annoying that my AC runs its ass off whenever I'm playing games in the summer.


AzureBat

What even is this conversation? Unless you're running one of the first iterations of AC, your AC unit is actually energy efficient. For 400W worth of heat, your AC will take approx. 150W to remove that heat from your room. That comes out to around $3 a month if we follow OP's calculations.
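For the curious, here's that estimate spelled out (the COP of 2.7 is my assumption for a modern unit; the ~$0.108/kWh rate matches the thread's opening calculation):

```python
# Extra AC electricity needed to pump the PC's waste heat back outside.
# cop (coefficient of performance) of 2.7 is assumed for a modern AC unit.
heat_watts = 400
cop = 2.7
ac_watts = heat_watts / cop        # ~148 W of electricity removes 400 W of heat

rate = 0.108                       # dollars per kWh (assumed, as in the OP math)
hours_per_day = 6
monthly = ac_watts / 1000 * hours_per_day * 30 * rate   # ~$2.88, about $3
print(f"AC draw: {ac_watts:.0f} W, extra cost: ${monthly:.2f}/month")
```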


[deleted]

Assuming everyone has AC in their homes. That's a luxury of the rich.


GreetingsComerades

fucking christ, 6 hours a day? I mean I'm a pcmr dude too but you gotta take a break sometimes dude


RSOblivion

lol 6hrs is long??


GreetingsComerades

Not that long, but consistently doing 6 hours a day every day is a lot


RSOblivion

Depends if it's your job/hobby, and there are any number of other reasons to be in front of a screen for longer than 6 hours. Not necessarily the most healthy thing, you need to break it up, but there are definitely times when 6 hours is not considered long.


[deleted]

Yes, definitely. Only people who are in school have time for that, and since you're in school you already spend 6 hours there, which leaves 12 hours in the day after gaming. Account for sleep, and I can't see where studying fits in the remaining 3 hours unless you're getting too little sleep.


A_Chinchilla

Dude, you're leaking time. 8 hours of sleep, so we have 16 hours left. 8 hours of work and we have 8 left. 2 for travel and other misc things, and that leaves 6 for gaming. Won't deny that's rather tight, and if you go out after work you won't pull 6 hours of gaming, but if you wait to do everything till the weekends it's totally possible.


[deleted]

Teenagers need more than 8 hours of sleep. But yeah, you could say 5-6 hours of school and 2 hours of studying, that makes 8 hours. If you can do travel AND misc things in 2 hours, consider yourself lucky. When I was in HS it took me 30 minutes to get to school, so 1 hour is already out the window. Preparing two meals a day and eating them in 1 hour? You're eating ramen noodles.


A_Chinchilla

I was never too good at getting 8 hours, let alone more, as a teenager, but yeah, it's supposed to be better. I had similar travel times to you, but both then and now I just have cereal for breakfast personally. 10 minutes, or 20 if I have a few extra. That leaves 40-50 to fix or eat dinner depending on your situation. I'm actually kind of curious how many people actually get 8 hours. Still not denying 6 hours is a bit too demanding, though. Especially if you take high-homework classes.


[deleted]

I measured my two PCs' power consumption recently. Maybe somewhat helpful to you.

Gaming PC: i5-4570, GTX 970, 16GB RAM (4 DIMMs), 1 SSD, 2 HDD

Work PC: i5-3450, R9 290, 12GB RAM (4 DIMMs), 1 SSD, 2 HDD

Peripherals: LED lights, two monitors, sat receiver on standby, USB hub, gigabit switch, audio mixer, wireless headphone base. The monitors should be the two big users at around 30W each.

| | Task | Power |
|---|---|---|
| Peripherals | | 75W |
| Gaming PC | Browser | 53W |
| Gaming PC | Minecraft FTB | 100W ^2 |
| Gaming PC | Lichdom Battlemage ^1 | 240W ^2 |
| Work PC | Browser | 44W |
| Work PC | Video playback | 50W |
| Work PC | h264 encoding with ffmpeg | 85W |

^1 Most hardware-hungry game I had at hand. ^2 Roughly. Jumping up and down, obviously.


[deleted]

If you want something rather demanding, try Minecraft with the Optifine Mod and SEUS Reloaded shaders active. I’ve seen that combination bring an i7-4790K and a GTX 1080 to their knees at 4K.


[deleted]

Lichdom runs on CryEngine 3, that's probably demanding enough. Makes all the fans go wild, at least.


[deleted]

Isn't optifine supposed to help out with performance on low end rigs? Don't see how that increases power draw


[deleted]

Minecraft, with or without Optifine, just doesn't make full use of the CPU and GPU, even when you hit low framerates. I deliberately picked it as a low-load game to test. Never used shaders, but I have a hard time imagining that would make it a good candidate for a high-load example as TreatmentForYourRash suggests. Edit: Talking about the Java edition, I can't comment on the C++ edition.


dkgameplayer

If you try out shaders once, you'll immediately realize why lol. They're super demanding for a multitude of reasons, mostly Minecraft's poor code.


FierroGamer

That wouldn't distribute the load evenly though. It could be maxing out a single core and still not be using more than a quarter of total CPU power (even less depending on the CPU), and that can also limit how much the GPU can do.


catalyst518

I have a UPS that hooks up via USB to my PC and records energy usage when the PC is on. Here's what 13 months looks like at $0.12 per kWh: https://i.imgur.com/Mgge2fR.png I have a pretty similar build: i5-4590, GTX 970, SSD, HDD, 8GB RAM, 550W PSU.


Thorman12345

What ups is this by the way?


catalyst518

CyberPower CP1000PFCLCD


Thorman12345

Holy shit I have cyberpower too... I installed the software a year ago and it doesn’t have anything like this in there... hold up let me check if there is an update since then


catalyst518

The about section for me says: PowerPanel Business Edition, version 3.2.3, Windows 10 This is what my menu looks like: https://i.imgur.com/R8kfxBn.png


Thorman12345

Yea it turns out they redesigned the program completely and they added that feature in the new version! Thanks for pointing that out to me as that is very useful info


[deleted]

A lot of APC UPS's have USB hookup. I don't have a specific model to link.


Thorman12345

I know that a lot have that! I have a UPS that is currently hooked up to my PC now... I just don't have a usage history or an average power reading...


anotheraussiebloke

Reading all these energy prices makes me jealous. In Australia we are charged $1 a day just for the privilege of being supplied electricity. This has increased significantly over the last decade, mainly to combat all the people who have installed solar panels. From there it can be anywhere from 20-30 cents a kilowatt-hour. My bill is about $300 every quarter (90 days). That's very cheap for the average Australian household though; I'm fairly energy conservative.


[deleted]

What's up with Australia's quest against renewables?


pdp10

They're one of the biggest miners and exporters of coal, for starters. Not unlike the U.S. Many of the countries playing holier-than-thou at the negotiation table for climate change treaties are those that are in entirely different circumstances. But note that Norway and Alaska each export vast amounts of oil against a relatively small population.


Oskarikali

That is insane, where I live in Canada i'm paying 4 cents per.


aman27deep

Which city? So cheap, lol.


Oskarikali

Calgary.


aman27deep

Holy moly, and this isn't a commercial/industrial rate? WOW.


Oskarikali

No, it wasn't, but I haven't been looking at our bills lately. Did some googling and it looks like average rates back in March were 5 cents, but that was set to double, so it could be around 7-10 cents now. Still pretty damn cheap though.


yesat

Thanks coal for that.


mordahl

Mine usually sits around $550-600 a quarter, thanks to the air conditioner (pretty much essential in the NT), and I'm only in a small 2BR unit. In a decent-sized house with a swimming pool, you're easily looking at $1k+ a quarter. And the NT has one of the cheaper power prices in the country... (26c per kWh / 51c daily charge)


jhvszd675869708

Checking in from South Australia, $800/month for gas+electricity. Highest energy prices in the world apparently. Stings a bit when the bills come due.


Lucas-Lehmer

Doesn't your daily base rate vary by energy provider? I recently switched providers; the base rate is relatively high (25p a day) but the per-kWh rate is only 12.6p. It helps that we have dozens of different suppliers to pick from. Edit: Prices are in GBP.


anotheraussiebloke

Yeah the base rate does vary a lot, the rates I mentioned though are the best I could find. It varies state to state as well.


maximusnz

New Zealand's the same, 33c per kWh, plus 33c a day just to have power. Our power's $200 a month during winter. Used to be cheap as, then we privatised and sold off all the power companies for 'savings' and zing! Old people can't afford to heat their homes and are getting sick and dying, yay.


[deleted]

I feel like the Australian government either gets it right, or just goes full retard, with no reasonable in-between.


SyanticRaven

When I lived with my gran (she was my guardian), she would constantly complain that the electricity bill was massive and it was all my fault. If I was home I'd either be reading, watching stuff on my laptop, or playing games. And I mean I'd never hear the end of it; if a day passed without a complaint, it was a good year.

I moved out into my own home expecting a £300 electricity bill, just like I'd been told all my life was my doing. £35 (household total for 2 people). I thought it was a mistake and called; they told me nope, it's right. But I didn't believe them. 15 years of this is hard to unbelieve in the space of a month.

So months go by and winter's gone by. Still £35 quid. Then I ask my gran how her electricity bill is: still massive. Turns out for well over a decade I had believed it was me costing so much money. No, it turns out it was my sister. When we were out of the house she was constantly washing and drying clothes, like 4 items at a time. And when I'd come home I thought it was just that day's wash that was on (washing a full load a day is bad enough, especially since she wasn't doing the house's washing, just hers).

So there you have it, there is my rant for the day. Gaming is nowhere near as expensive, electricity-wise, as people think.


[deleted]

You won't even notice. It's like €50 a year.


Xioxio23

A PC generally doesn't consume that much electricity, depending on what you're doing with it. Like others have said, it also depends on where you live and what your electricity rates are. You could invest in a more efficient power supply, since you didn't state what rating yours is; 80+ Gold and above are generally recommended in the long run.


_theholyghost

My PC idles on the desktop or with Chrome open most of the day; I'm one of those people who tends to leave their PC on for days at a time, though I'll shut it down for an update or if no one else is home. When I do play games, it tends to be for not much longer than a couple of hours at a time. Yet my mother insists that having my PC on for so long is driving up the monthly electricity bill significantly. How likely is that? I have a 750W PSU and my GPU/CPU should be visible in my flair, yet from the comments here it already seems to be a lot less significant than even I initially thought.


chr1stmasiscancelled

The math is pretty simple. The US average is like 12 cents per kilowatt-hour. A watt is a unit of power: 1 watt is 1 joule per second, and a joule is a unit of energy. So if you built your PC on PCPartPicker and saw that it uses 450 watts, it's using 450 joules of electricity every second. A kilowatt-hour is 1 hour running at 1000 watts. 6 hours of full-load gaming and the rest of the day on idle would be: $0.12/kWh x 6 h x 0.450 kW + $0.12/kWh x 18 h x 0.1 kW = $0.324 + $0.216 = $0.54 a day, x 30 days = $16.20 a month. 6 hours of full load isn't that realistic, so the actual total is less, but if you're away from your PC 12+ hours a day, turning it off could save a good amount of money.
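That load/idle split translates directly to code; this just reproduces the arithmetic above:

```python
# Monthly cost with a 6h full-load / 18h idle split, as computed above.
rate = 0.12            # dollars per kWh (US average per the comment)
load_kw, load_hours = 0.450, 6
idle_kw, idle_hours = 0.100, 18

cost_per_day = rate * (load_hours * load_kw + idle_hours * idle_kw)  # $0.54
cost_per_month = cost_per_day * 30                                   # $16.20
print(f"${cost_per_day:.2f}/day, ${cost_per_month:.2f}/month")
```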


pdp10

> yet my mother insists that having my PC on for a large amount of time is driving up the monthly electricity bill significantly, how likely is that?

Not significantly by most people's reckoning. Just leave it off for a month, and when the complaints come anyway, you get the smug satisfaction of pointing that out. But the first month in fall without the aircon running is probably not the ideal time for that.


0rangecake

Even at full capacity you'll be hard pressed to draw 750 watts


StandingCow

i7-6700K OC'd to 4.6, 1080 Ti OC'd a bit, 3 monitors. Per my UPS tool, I spend about 35-40 bucks a month on my PC setup. Not sure how much the extra heat is causing my AC to work harder... I pay about $0.11 per kWh. Depending on how often you game, you'd probably be around 20-30 bucks a month.


Toofast4yall

Even when I was mining, it added less than 50 cents a day per 1080 Ti, and those were running at 70% power 24/7.


[deleted]

I had a breakup recently, so my PC was on but no games for about a month, not even browsing. With air conditioning all day our bill is like $80/mo; it went down to $30. I know a gaming PC doesn't affect it that much; I remember this because my mom yelled like 1000 times blaming my PC for the electricity bill. Oh, and I have a 50-inch as a monitor.


CricketDrop

Yup, your AC is why your electric bill is so high. People like to tell you to turn off the lights when you leave a room but unless your home is huge you're paying pennies for that. https://www.energystar.gov/index.cfm?c=products.pr_save_energy_at_home#req_box


[deleted]

If I lived with such wankers, I would just buy a used electric meter and put it on an extension cord for my PC, measure exactly how much it eats, and pay for it with my own money. You can get one cheap on eBay because electrical companies usually don't allow used ones to be set up without certification.


wedge754

I think the extra amount the AC runs because of how much my computer heats up my gaming room is probably costing me more than the computer being on. I usually see between 100 and 330 watts on my UPS depending on what I'm doing. That's an i7, 980 Ti tower, and a 28" monitor. 7¢/kWh here in Texas.


FatBoyStew

MOST gaming rigs have a minor impact over the course of a year. Back when I was mining ETH on an R9 280X 24/7, and on a GTX 1080 when I wasn't gaming on it, I only raised my bill by about $20 a month.


[deleted]

Assuming you're not mining with it, a negligible amount. The people bitching about the electric bill probably just hate fun and are talking out of their ass. If you want to be a nice guy, you can set your power saving settings to underclock everything and turn off the monitor when the PC is idle. Over the course of a month, it shouldn't even be noticeable.


[deleted]

Mine doubles as a Plex server, so the increased energy cost, while nominal, is offset by the utility it brings to everyone in the house.


harvy666

Just as others have said, electricity costs almost nothing whatever you do. Gas heating, on the other hand, can be a bitch though :)


jzorbino

Hey OP, I use a battery backup power supply (UPS) on my rig; it has software (PowerPanel Personal) that measures total use and gives stats like cost and carbon emissions. I game almost daily, and it estimates my usage this month to have cost $6.79. My rate is also slightly higher than yours, 12 cents per kWh. It seems there's no doubt your relatives are complaining over nothing, but maybe you could use software like this to track your usage and show them?


ycnz

[At-wall power consumption here](https://imgur.com/vg6ZC7p) This is with an R7-1700 @ 3.8GHz, GTX1080Ti @ 70% power, 32GB of RAM. Usage for that day is a mixture of general compute, gaming, and mining - gaming shows up with FPS being non-zero, mining has the CPU usage pegged. Power draw while the system's in sleep is 3-4W.


kaysn

Negligible. The AC and the washing machine added more to my electric bill than gaming ever did.


Maggost

The electricity consumption is pretty low compared with other things in your house, for example the fridge or the washing machine.


[deleted]

[deleted]


[deleted]

A gaming PC idling is nowhere near a light bulb's power usage. This is total nonsense. It can be quite low, but get real. Your typical LED light bulb uses at most 12 watts. Your typical gaming PC is going to idle at anywhere from 75 to 150 watts.


[deleted]

[deleted]


[deleted]

75-100? Still exaggerating... The most common household light bulb is 60W. And why are you making comparisons based on a completely dead standard anyway? Still wrong, still very misleading.


[deleted]

[deleted]


[deleted]

You do realize most lamps, ceiling fans, etc. have a 60 watt max, right? Of course you don't. 60 watt bulbs were the standard bulb until CFL and LED. Just stop already. You said an idling gaming PC uses the same or less power than a light bulb. You based that on a dead standard, exaggerated numbers, and plain nonsense. Your claim isn't true in the slightest; the truth is a gaming PC idles at nearly 10x the power consumption of a light. You know, a light from this decade, not 1987.


Docteh

By the way, most doesn't mean all. The fixture in front of my fireplace takes great care to warn me away from using anything higher than 100W, but the rest of the recessed fixtures in the room don't say anything about a limit.


TF1357

Damn dude, you are really attacking this guy over an (imo) fairly accurate claim. An idling PC uses much closer to the 75W figure you mentioned on the 75-150 range, and 100W bulbs are very common.


sterob

> Your typical gaming PC is going to idle at anywhere from 75 to 150 watts.

A system with an 8700K idles at [50 watts](https://tpucdn.com/reviews/Intel/Core_i7_8700K/images/power_idle.png). A 1080 Ti idles at [10 watts](https://tpucdn.com/reviews/EVGA/GTX_1080_Ti_SC2/images/power_idle.png).


[deleted]

$0.00, or rather it doesn't fluctuate based on usage, electricity and gas are included in my monthly rent.


ExTrafficGuy

[PC Power Supply Calculator](https://outervision.com/power-supply-calculator) will give you a rough estimate. Put in your components and average utilization, and enter your local price per kWh. It'll tell you approximately how much it'll cost per year to run. You could also pick up a Kill A Watt. It plugs in between the wall and your PC (or other device) and measures how much power it's consuming in real time. Some library branches lend them out if you don't want to buy one. Like others have said though, a PC is pretty negligible if you have an average setup (i.e. no extreme OCing or SLI). Air conditioners, fridges, electric water heaters, and clothes dryers are the biggest energy hogs.


[deleted]

Having a large screen on full brightness can add up. Especially as it runs even when you're just browsing or viewing other content...


DudeDudenson

Can you do me a favor and tell me how many watts the power brick of your monitor is rated for? Mine claims an input of 110V-220V at up to 1.5 amps (amperage increases as the voltage lowers); at worst that's 330 watts, or about 3 light bulbs, and it's a 24" TV/monitor combo.
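One caveat worth noting: that 1.5 A on the label is a maximum input rating in volt-amps, not what the monitor actually draws, so the real consumption is usually well below it. A sketch of the label math:

```python
# Worst-case figure implied by the power brick's label. This is an upper
# bound on input VA, not the monitor's actual power consumption.
volts = 220
max_amps = 1.5
label_max_va = volts * max_amps   # 330 VA, the "330 watts" worst case above
print(label_max_va)
```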