Alex Rubenstein wrote:
I'm looking for information on energy consumption vs. percent utilization. In other words, if your datacenter consumes 720 MWh per month, yet on average your servers are 98% underutilized, you are wasting a lot of energy (a hot topic these days). Does anyone here have any real data on this?
I've never done a study on power used vs. CPU utilization, but my guess is that the heat generated by a PC remains fairly constant -- in the grand scheme of things -- no matter what your utilization is.
You should be able to pick up a simple current/wattage meter from a local hardware store for $20 or so. That will tell you that on a modern dual-CPU machine, power consumption at idle is about 60% of peak; the rest is consumed by drives, fans, RAM, etc. In wattage terms the difference is 100-120 W (50-60 W per CPU). All modern operating systems do a moderate job of saving CPU wattage when idle (the BSDs, Linux, Mac OS X, WinXP, etc.).

Pete
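To put rough numbers on Alex's original question, here is a back-of-the-envelope sketch in Python. It assumes Pete's figure of idle draw being ~60% of peak, a 250 W peak per server (consistent with a 100 W idle-to-peak gap being the remaining 40%), and a simple linear interpolation between idle and peak draw by utilization. None of these are measurements from this thread; they are illustrative assumptions only.

    # Back-of-the-envelope estimate of how much of a datacenter's energy
    # bill is the idle floor, given mostly-idle servers.
    # Assumptions (not measured): 250 W peak per server, idle draw = 60%
    # of peak (Pete's meter reading), draw scales linearly with utilization.

    PEAK_KW_PER_SERVER = 0.25   # assumed 250 W peak draw per dual-CPU box
    IDLE_FRACTION = 0.60        # idle draw as a fraction of peak (from the post)
    UTILIZATION = 0.02          # "98% underutilized" -> ~2% busy on average

    def monthly_kwh(avg_kw, hours=730):
        """Energy over an average month (~730 h) at a constant average draw."""
        return avg_kw * hours

    # Average draw, interpolated linearly between idle and peak by utilization.
    avg_kw = PEAK_KW_PER_SERVER * (IDLE_FRACTION + (1 - IDLE_FRACTION) * UTILIZATION)

    per_server = monthly_kwh(avg_kw)
    at_idle = monthly_kwh(PEAK_KW_PER_SERVER * IDLE_FRACTION)
    print(f"average draw:  {avg_kw * 1000:.0f} W")
    print(f"per server/mo: {per_server:.0f} kWh (vs {at_idle:.0f} kWh if fully idle)")

    # Scale to Alex's 720 MWh/month facility: what share of the bill is
    # just the idle floor, independent of any useful work?
    facility_mwh = 720
    idle_share = IDLE_FRACTION / (IDLE_FRACTION + (1 - IDLE_FRACTION) * UTILIZATION)
    print(f"idle floor share: {idle_share:.0%} "
          f"(~{facility_mwh * idle_share:.0f} MWh of {facility_mwh} MWh)")

Under these assumptions, a server averaging 2% utilization draws about 152 W against a 150 W idle floor, so roughly 99% of the 720 MWh/month (about 710 MWh) goes to keeping idle hardware powered -- which is exactly the waste Alex is asking about.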