That's a good point, though if you are running your breakers that close
I think you have bigger problems, as a power outage, however unlikely,
could cause your equipment to not come back up at all. Software updates
that reboot several servers in quick succession could also cause a
breaker to trip under those circumstances. Unfortunately, there's no way
to tell how close a breaker is to tripping without actually tripping
it. Breakers may have amp meters and a rated size, but the real trip
threshold varies by ±20% on common models, meaning a 20A breaker may
trip as low as 16A.
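
As a rough back-of-the-envelope check (my own sketch; the numbers are
illustrative, and the 80% figure is the usual NEC continuous-load rule,
not a breaker spec):

# Sketch of breaker headroom math in Python. Assumes a +/-20% trip
# tolerance and the common NEC 80% continuous-load derate; both are
# illustrative assumptions, not manufacturer figures.
rated_amps = 20.0
trip_tolerance = 0.20        # worst case: breaker trips 20% below rating
continuous_derate = 0.80     # NEC rule: continuous loads at 80% of rating

worst_case_trip = rated_amps * (1 - trip_tolerance)   # 16.0 A
safe_continuous = rated_amps * continuous_derate      # 16.0 A

print(f"Plan for no more than {min(worst_case_trip, safe_continuous):.1f} A "
      f"sustained on a {rated_amps:.0f} A breaker")

In other words, the worst-case trip point and the standard derate land
in the same place: treat a 20A breaker as a 16A breaker.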
On 10/11/2017 12:58, Matt Harris wrote:
Another thing to remember - and I've actually seen breakers tripped on
PDUs because of this - is that if the ambient temp is higher, every
server will spin its fans harder to keep internal temps down. This
will increase your power draw, which means that if you're paying for
metered power by usage, you're going to pay more - those fans really
do add up in terms of power. In extreme cases, you can draw enough
extra power to trip a breaker on a PDU because every host in the rack,
especially those towards the top, is spinning its fans full tilt. It's
not a good condition, and one that you should force them to correct.
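
To put rough numbers on that (a sketch only: the fan affinity law says
fan power scales with the cube of fan speed, and every figure below is
an assumption, not a measurement):

# Back-of-the-envelope fan power estimate in Python using the fan
# affinity law (power ~ speed^3). All inputs are illustrative guesses.
baseline_fan_watts = 15.0    # assumed per-server fan power at normal ambient
speed_increase = 1.5         # assume fans spin 50% faster at high ambient
servers_per_rack = 40        # assumed rack density

extra_per_server = baseline_fan_watts * (speed_increase ** 3 - 1)
extra_per_rack_kw = extra_per_server * servers_per_rack / 1000.0
print(f"~{extra_per_server:.0f} W extra per server, "
      f"~{extra_per_rack_kw:.1f} kW extra per rack")

Even with conservative guesses that works out to over a kilowatt of
extra draw per rack, which is plenty to push a marginal PDU breaker
over the edge.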
On Wed, Oct 11, 2017 at 11:54 AM, Zachary Winnerman
<zacharyw09264@gmail.com> wrote:
I recall some evidence that 80+F temps can reduce hard drive lifetime,
though it might be outdated since it predates SSDs. I would imagine
that while it may not impact a server's ability to handle load, it may
reduce equipment lifetime. It could also be an indication that they
lack redundancy in the case of an AC failure, which could cause
equipment damage if the datacenter is unattended and temperatures are
allowed to rise.
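
If the facility won't keep temps down, one stopgap is to watch drive
temps yourself. A minimal sketch, assuming smartmontools is installed
and root access; the drive list, the 45C threshold, and the raw-value
column parsing are all my own assumptions and vary by drive model:

# Minimal drive-temperature check via smartctl (smartmontools).
# Assumes the standard 10-column SMART attribute table layout with
# Temperature_Celsius (ID 194); adjust parsing for your drives.
import subprocess

DRIVES = ["/dev/sda", "/dev/sdb"]   # hypothetical drive list
ALERT_C = 45                        # illustrative alert threshold

for drive in DRIVES:
    out = subprocess.run(["smartctl", "-A", drive],
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "Temperature_Celsius" in line:
            temp = int(line.split()[9])   # RAW_VALUE column
            if temp >= ALERT_C:
                print(f"ALERT: {drive} at {temp} C (threshold {ALERT_C} C)")

Run that from cron every few minutes and page on output, and at least
you'll know before drives cook in an unattended room.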
On 10/11/2017 11:45, Keith Stokes wrote:
> There are plenty of people who say 80+ is fine for equipment and
> data centers aren’t built for people.
>
> However other things have to be done correctly.
>
> Are you sure your equipment is properly oriented for airflow
> (hot/cold aisles if in use) and has no restrictions?
>
> On Oct 11, 2017, at 9:42 AM, Sam Kretchmer
> <sam@coeosolutions.com> wrote:
>
> with a former employer we had a suite at the L3 facility on Canal in
> Chicago. They had this exact issue for the entire time we had the suite.
> They kept blaming a failing HVAC unit on our floor, but it went on for
> years no matter who we complained to, or what we said.
>
> Good luck.
>
>
> On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
> <nanog-bounces@nanog.org on behalf of
> dhubbard@dino.hostasaurus.com> wrote:
>
> Curious if anyone on here colo's equipment at a Level 3 facility and has
> found the temperature unacceptably warm? I'm having that experience
> currently, where ambient temp is in the 80's, but they tell me that's
> perfectly fine because vented tiles have been placed in front of all
> equipment racks. My equipment is alarming for high temps, so obviously
> not fine. Trying to find my way up to whoever I can complain to that's
> in a position to do something about it, but it seems the support staff
> have been told to brush questions about temp off as much as possible.
> Was wondering if this is a country-wide thing for them or unique to the
> data center I have equipment in. I have equipment in several others from
> different companies and most are probably 15-20 degrees cooler.
>
> Thanks,
>
> David
>
>
>
> ---
>
> Keith Stokes
>
>
>
>
--
Matt Harris - Chief Security Officer
Main: +1 855.696.3834 ext 103
Mobile: +1 908.590.9472
Email: matt@netfire.net