Hello, I've done quite a bit of study of power usage and such in datacenters over the last year or so.
I'm looking for information on energy consumption vs. percent utilization. In other words, if your datacenter consumes 720 MWh per month, yet on average your servers are 98% underutilized, you are wasting a lot of energy (a hot topic these days). Does anyone here have any real data on this?
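To make those numbers concrete, here is a rough back-of-the-envelope sketch in Python. The idle-power fraction, the consolidation target, and the linear power model are all assumptions picked for illustration, not measured data.

# Back-of-the-envelope sketch only. The idle-power fraction and the
# consolidation target are assumptions for illustration, not measurements.

MONTHLY_ENERGY_MWH = 720.0    # total draw, from the question above
AVG_UTILIZATION = 0.02        # "98% underutilized" read as ~2% average load
IDLE_POWER_FRACTION = 0.60    # assumed: an idle server draws ~60% of full-load power
TARGET_UTILIZATION = 0.70     # assumed: what a consolidated fleet could run at

def relative_power(u, idle_frac=IDLE_POWER_FRACTION):
    """Simple linear power model, normalized so full-load power = 1.0."""
    return idle_frac + (1.0 - idle_frac) * u

# Energy today: the whole fleet idling along at AVG_UTILIZATION.
energy_now = MONTHLY_ENERGY_MWH

# Hypothetical consolidation: the same total work on fewer servers running hotter.
fleet_fraction_needed = AVG_UTILIZATION / TARGET_UTILIZATION
energy_consolidated = energy_now * fleet_fraction_needed * (
    relative_power(TARGET_UTILIZATION) / relative_power(AVG_UTILIZATION)
)

print(f"Consolidated estimate: {energy_consolidated:.0f} MWh/month")
print(f"Attributable to idle capacity: {energy_now - energy_consolidated:.0f} MWh/month")

Under those assumptions the consolidated fleet would draw roughly 30 MWh/month, so most of the 720 MWh goes into keeping idle capacity powered; the exact figure shifts with whatever idle-power fraction you assume.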
I've never done a study on power used vs. CPU utilization, but my guess is that the heat generated by a PC remains fairly constant, in the grand scheme of things, no matter what your utilization is. I say this because, whether a CPU is idle or 100% utilized, it is still grossly inefficient, on the order of less than 10% in all cases (i.e., 1 watt in returns at least 0.9 watts of heat, regardless of CPU loading).

-- Alex Rubenstein, AR97, K2AHR, alex@nac.net, latency, Al Reuben --
-- Net Access Corporation, 800-NET-ME-36, http://www.nac.net --
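If that guess holds and draw really is near-constant regardless of load, the arithmetic for the original question gets even simpler: the share of energy tied to useful work is just the utilization itself. A minimal illustration, assuming a completely flat power curve:

# Taking the constant-power guess at face value: if a box draws the same
# power busy or idle, the energy tied to useful work scales directly with
# utilization. Illustrative numbers only.

MONTHLY_ENERGY_MWH = 720.0   # from the original question
AVG_UTILIZATION = 0.02       # ~2% average load

useful = MONTHLY_ENERGY_MWH * AVG_UTILIZATION
print(f"Energy tied to actual work: {useful:.1f} MWh/month")
print(f"Energy spent holding idle capacity: {MONTHLY_ENERGY_MWH - useful:.1f} MWh/month")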