I used a very rough estimate (not very precise, but not too far off either): to remove 1 kW of heat from a building, you spend roughly another 1 kW. In any case, 450,000 servers draw a lot of power. You can use a river or a lake to cool them, but you still need about 45,000 kW just to run them, so saying 60,000-100,000 kW of total power consumption would not be a big mistake. Of course, if you build the data center inside a power plant dam, you get both enough cooling and enough power. :-)
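A rough version of that arithmetic as a quick Python sketch (the 100 W/server figure comes from the numbers quoted below; the 0.33-1.2 W of cooling overhead per watt of IT load is just my assumption to bracket the estimate):

    servers = 450_000
    watts_per_server = 100                            # assumed average draw per server
    it_load_kw = servers * watts_per_server / 1000    # 45,000 kW just to run the servers

    # cooling/overhead per watt of IT load: anywhere from ~1/3 to ~1.2 (assumption)
    low, high = 0.33, 1.2
    print(f"IT load: {it_load_kw:,.0f} kW")
    print(f"Total with cooling: {it_load_kw * (1 + low):,.0f} - {it_load_kw * (1 + high):,.0f} kW")

That prints 45,000 kW of IT load and roughly 60,000-99,000 kW total, which is where the range above comes from.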
----- Original Message -----
From: "Matthew Crocker" <matthew@crocker.com>
To: <nanog@merit.edu>
Sent: Friday, June 16, 2006 1:16 PM
Subject: Re: WSJ: Big tech firms seeking power

I wonder just how much power it takes to cool 450,000 servers.
450,000 servers * 100 watts/server = 45,000,000 watts * 3.412 BTU/hr per watt = ~153.5 million BTU/hr / 12,000 BTU/hr per ton = ~12,800 tons of cooling
A 30-ton Liebert system runs about 80 amps @ 480 volts, or roughly 38,400 watts; you'll need at least 427 of them to cool ~12,800 tons, which is about 16,400 kW * 24 hours * 7 days * 4.3 weeks = ~11.8 million kWh/month * $0.10/kWh = roughly $1.2 million/month in cooling.
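The same arithmetic as a quick Python sketch, using the figures above (100 W/server, 3.412 BTU/hr per watt, 12,000 BTU/hr per ton of cooling, a 30-ton unit drawing ~38.4 kW, $0.10/kWh):

    import math

    servers = 450_000
    it_watts = servers * 100               # 100 W/server -> 45,000,000 W of heat to remove

    btu_per_hr = it_watts * 3.412          # watts -> BTU/hr
    tons = btu_per_hr / 12_000             # 12,000 BTU/hr per ton of cooling

    unit_kw = 80 * 480 / 1000              # one 30-ton Liebert: 80 A @ 480 V ~= 38.4 kW
    units = math.ceil(tons / 30)           # whole 30-ton units needed

    kwh_per_month = units * unit_kw * 24 * 7 * 4.3
    cost_per_month = kwh_per_month * 0.10  # $0.10/kWh

    print(f"{tons:,.0f} tons, {units} units, ${cost_per_month:,.0f}/month in cooling")

Running it gives about 12,800 tons, 427 units, and a bit under $1.2 million/month.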
I think my math is right on this...
--
Matthew S. Crocker
Vice President
Crocker Communications, Inc.
Internet Division
PO BOX 710
Greenfield, MA 01302-0710
http://www.crocker.com