Hacker News

I used to calculate costs of lightbulbs as: 1 Watt running the whole year, at €0.28/kWh, costs 1 Euro per year. Until someone corrected me, and it turned out that every 1 Watt running 24/7 costs over 2 Euro per year (1 W × 8,760 h ≈ 8.76 kWh).

In the US electricity might be cheaper. And if the device runs only part of the time, you should adjust the calculation accordingly.

My desktop/server runs 24/7, so I prefer a CPU with a 65 W TDP over one with a 125 W TDP. That could make a difference of up to 120 Euro per year for me (if it were running at 100% CPU load).
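The arithmetic above can be sketched in a few lines. (The €0.28/kWh price and the 65 W vs. 125 W TDP figures are just the example numbers from this thread; plug in your own tariff.)

```python
# Annual electricity cost of a load running 24/7.
# Price per kWh is an assumption; adjust for your own tariff.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost_eur(watts: float, eur_per_kwh: float = 0.28) -> float:
    """Cost in euros of running a `watts` load around the clock for a year."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * eur_per_kwh

# 1 W around the clock at 0.28 EUR/kWh:
print(f"{annual_cost_eur(1):.2f} EUR/year for 1 W")
# Worst-case gap between a 65 W and a 125 W TDP CPU at full load:
print(f"{annual_cost_eur(125 - 65):.2f} EUR/year extra")
```

Note that the real difference will usually be smaller, since neither CPU sits at its TDP all the time.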



Real-world energy use is nothing like what you see on spec sheets, and not just because manufacturers differ in how they compute TDP. TDP is also not a good indicator of energy use at (or near) idle. With underclocking/undervolting in the BIOS you can get a beefier CPU to outperform smaller CPUs per watt. Because CPUs get really inefficient as they draw more power, undervolted or power-capped high-TDP chips can be much more power efficient in the real world than their low-TDP counterparts.



