http://royal.pingdom.com/2008/11/14/the-worlds-most-super-designed-data-cent... (No, I don't know if it's real or not.) --Steve Bellovin, http://www.cs.columbia.edu/~smb
--On Friday, 28 Nov 2008 08.34.33 -0500 "Steven M. Bellovin" <smb@cs.columbia.edu> wrote:
http://royal.pingdom.com/2008/11/14/the-worlds-most-super-designed-data-center-fit-for-a-james-bond-villain/ (No, I don't know if it's real or not.)
It is. The server space is outside the blastproof area. Go figure. -- Måns Nilsson M A C H I N A I'm into SOFTWARE!
Steven M. Bellovin wrote:
http://royal.pingdom.com/2008/11/14/the-worlds-most-super-designed-data-cent... (No, I don't know if it's real or not.)
more images: http://www.archdaily.com/9257/pionen-%E2%80%93-white-mountain-albert-france-...

cheers,
raoul
--
DI (FH) Raoul Bhatia M.Sc.          email. r.bhatia@ipax.at
Technischer Leiter
IPAX - Aloy Bhatia Hava OEG         web.   http://www.ipax.at
Barawitzkagasse 10/2/2/11           email. office@ipax.at
1190 Wien                           tel.   +43 1 3670030
FN 277995t HG Wien                  fax.   +43 1 3670030 15
-----Original Message-----
From: Steven M. Bellovin [mailto:smb@cs.columbia.edu]
Sent: Friday, November 28, 2008 5:35 AM
To: nanog@nanog.org
Subject: an over-the-top data center
http://royal.pingdom.com/2008/11/14/the-worlds-most-super-designed-data-center-fit-for-a-james-bond-villain/ (No, I don't know if it's real or not.)
One could consider purchasing the underground tunnels in downtown London that BT is selling to build a competing "over-the-top" data center. http://www.nytimes.com/2008/11/28/business/worldbusiness/28tunnel.html
On Sat, Nov 29, 2008 at 7:03 AM, Buhrmaster, Gary <gtb@slac.stanford.edu> wrote:
One could consider purchasing the underground tunnels in downtown London that BT is selling to build a competing "over-the-top" data center.
That's a "below the surface" datacenter, innit? srs (ok, I'll get my coat)
On Nov 28, 2008, at 6:34 AM, Steven M. Bellovin wrote:
http://royal.pingdom.com/2008/11/14/the-worlds-most-super-designed-data-cent... (No, I don't know if it's real or not.)
I recall visiting something of this sort a couple years back. On a related note, some have professed that adapting old ships into data centers would provide eco-friendly, secure data center solutions. I wonder if "pirates" were listed anywhere in their business plan... -danny
On 1-Dec-08, at 10:27 AM, Danny McPherson wrote:
On a related note, some have professed that adapting old ships into data centers would provide eco-friendly, secure data center solutions.
Your data connection to shore is going to be tenuous at best. One good blow strong enough to make you drag anchor and you can kiss your fibre trunk connection goodbye. Putting that back in service is a bit more than a four-hour splice job. An alternative would be to run a microwave link to shore, but I'm not sure I would want to bet the farm on the mechanics necessary to keep the dish aligned. And what do you do when it's time to haul out and paint the bottom?!?

Then there is the matter of power. It wouldn't be very hard to DoS the entire operation by taking out the fuel barges. I suppose you could permanently tie up to a pier, but at that point you're just a building with a leaky basement.

I don't see how anyone could claim this is more secure than a purpose-built data centre. (And even at anchor, how do you stop someone from taking you out with something as simple as a drill?)

--lyndon (mailing via WiMAX from S/V Bandido I, at the dock in Vancouver :-)
On Dec 1, 2008, at 2:19 PM, Lyndon Nerenberg wrote:
On 1-Dec-08, at 10:27 AM, Danny McPherson wrote:
On a related note, some have professed that adapting old ships into data centers would provide eco-friendly, secure data center solutions.
Your data connection to shore is going to be tenuous at best. One good blow strong enough to make you drag anchor and you can kiss your fibre trunk connection goodbye. Putting that back in service is a bit more than a four-hour splice job.
Not if the ship is literally encased in concrete at the shore. Which solves all your other problems as well. There are even examples of actual free-floating ships which have been stable for a decade or more. See the floating casinos in Louisiana, which have been hit by hurricanes, and are still attached to shore by electricity, bits, and physically. -- TTFN, patrick
Patrick W. Gilmore wrote:
On Dec 1, 2008, at 2:19 PM, Lyndon Nerenberg wrote:
On 1-Dec-08, at 10:27 AM, Danny McPherson wrote:
On a related note, some have professed that adapting old ships into data centers would provide eco-friendly, secure data center solutions.
Your data connection to shore is going to be tenuous at best. One good blow strong enough to make you drag anchor and you can kiss your fibre trunk connection goodbye. Putting that back in service is a bit more than a four-hour splice job.
Not if the ship is literally encased in concrete at the shore. Which solves all your other problems as well.
There are even examples of actual free-floating ships which have been stable for a decade or more. See the floating casinos in Louisiana, which have been hit by hurricanes, and are still attached to shore by electricity, bits, and physically.
The same ones that were moved inland and deposited on top of someone's house? Hardly a good example of stable. http://www.katrina.noaa.gov/helicopter/images/katrina-biloxi-miss-grand-casi... ~Seth
Not if the ship is literally encased in concrete at the shore. Which solves all your other problems as well.
But that's not a ship, it's a building.
There are even examples of actual free-floating ships which have been stable for a decade or more.
And many counter-examples. --lyndon
On 1 Dec 2008, at 19:19, Lyndon Nerenberg wrote:
An alternative would be to run a microwave link to shore, but I'm not sure I would want to bet the farm on the mechanics necessary to keep the dish aligned.
Actually this is pretty straightforward. Systems exist for getting rock-steady film from moving helicopters, and I'm sure that a system that can keep a camera aimed at a point can do the same for a microwave dish.

Ian
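For what it's worth, the pointing loop itself is the easy part; the hard part is the gimbal and rate-sensor hardware. A toy sketch of the proportional correction such a stabilizer applies, in Python, with a made-up gain and a stubbed-out compass read standing in for a real IMU:

import random

TARGET_BEARING = 90.0  # true bearing of the shore station (assumed value)
GAIN = 0.6             # proportional gain -- a made-up number for illustration

def read_platform_heading():
    # Stand-in for a real compass/IMU read: the moored platform yaws a bit.
    return random.uniform(-5.0, 5.0)

dish_offset = TARGET_BEARING  # dish angle relative to the hull, in degrees

for step in range(10):
    heading = read_platform_heading()
    pointing = heading + dish_offset    # where the dish actually aims
    error = TARGET_BEARING - pointing   # pointing error in degrees
    dish_offset += GAIN * error         # nudge the mount toward the target
    print("step %d: heading %+6.2f, error %+6.2f" % (step, heading, error))

A real stabilized mount closes this loop with gyro rate feedback at a much higher rate, but the principle -- measure the error, drive it toward zero -- is the same one the helicopter camera rigs use.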
On Monday 01 December 2008 13:27:30 Danny McPherson wrote:
On a related note, some have professed that adapting old ships into data centers would provide eco-friendly, secure data center solutions.
You mean something akin to Sealand's HavenCo? Yes, I know that's an old fort, and not a ship, but a similar concept at least.
On Mon, 1 Dec 2008 16:03:39 -0500 Lamar Owen <lowen@pari.edu> wrote:
On Monday 01 December 2008 13:27:30 Danny McPherson wrote:
On a related note, some have professed that adapting old ships into data centers would provide eco-friendly, secure data center solutions.
You mean something akin to Sealand's HavenCo? Yes, I know that's an old fort, and not a ship, but a similar concept at least.
HavenCo, which ran a datacenter on the "nation" of Sealand, is no longer operating there: http://www.theregister.co.uk/2008/11/25/havenco/ --Steve Bellovin, http://www.cs.columbia.edu/~smb
Steven M. Bellovin wrote:
HavenCo, which ran a datacenter on the "nation" of Sealand, is no longer operating there: http://www.theregister.co.uk/2008/11/25/havenco/
--Steve Bellovin, http://www.cs.columbia.edu/~smb
If you do a bit more research on that one, it never got to a serious point. They had a single 802.11b link onto the platform and never got very far with it. No fiber and no redundancy. However, the idea was a bit of a novelty, because the platform is claimed to be sovereign territory.

Kind regards,
Martin List-Petersen
--
Airwire - Ag Nascadh Pobal an Iarthar
http://www.airwire.ie
Phone: 091-865 968
On Mon, Dec 1, 2008 at 16:34, Steven M. Bellovin <smb@cs.columbia.edu> wrote:
HavenCo, which ran a datacenter on the "nation" of Sealand, is no longer operating there:
Which is the same story for most (if not all) of these hype-driven "bullet-proof" data centers. I recall a .com CEO extolling the capabilities of his datacenter-inside-an-old-bank-vault to prevent DoS attacks such as the one that had hit Yahoo! the week before. I must say that the provided dinner, drinks, and Hummer limo ride to the DC made the CEO's humor more enjoyable. Sadly, a lot of older pensioners were eating up his every word. At that time I worked for an equipment/services reseller, and I tried quietly, as best I could, to save some people's life savings. I felt like a diver witnessing a herring-infused shark fest. -Jim P.
On Monday 01 December 2008 16:34:26 Steven M. Bellovin wrote:
On Mon, 1 Dec 2008 16:03:39 -0500 Lamar Owen <lowen@pari.edu> wrote:
You mean something akin to Sealand's HavenCo? Yes, I know that's an old fort, and not a ship, but a similar concept at least.
HavenCo, which ran a datacenter on the "nation" of Sealand, is no longer operating there: http://www.theregister.co.uk/2008/11/25/havenco/
Which shows how well the concept works -- which is why I mentioned it....
Apologies to the list. I didn't know whether to fork this into a couple of replies, or just run with it. I chose the latter.

1) This datacenter is only 12,000 sq ft. (submessage: who cares?)

2) The generators are underground. A leak in their exhaust system kills everyone -- worse, a leak in their fuel tank or filler lines (when being filled from above) could do the same. Yes, you could address this with alarms (provided they work and are tested, etc).

3) No one cares if the server farm is blast-proof (it isn't) if the connectivity in/out of it gets blasted (submessage: silos were meant to deliver one thing; datacenters aren't in the same operational model once they need connectivity to the outside world)

4) With all of that fog and plant life, I wonder how they critically manage humidity. [Or if they even do].

----

To the question of carrier hotels and their supposed secrecy, etc.: if you need connectivity to multiple providers, those providers know where the buildings are, and presumably so do most of their employees. If 500,000 people (say the top 10 companies together) know where the building is, it's not a secret.**

Carrier hotels aren't meant to be more secure than the lines coming into them. Those lines are coming in on unsecured poles, manholes and the rest. Their most dramatic failure modes are pretty obvious, if not well-studied.

Internet "security" [as in resilience] is built on maintaining a point of view of connectivity through multiple failures and routing around them -- NOT on sacred nodes that cannot fail or universal end-to-end reachability. Internet "security" [as in integrity] is not something that's been proven on the Internet yet [general case; please, no banter about encryption/quantum oscillation, etc.].

Lots of people have already said this is dull -- it is. It is also a nice set of pictures.

** Submitted without proof. This covers all the buildings that make claims about not having their name on the door and have loading docks with no security on them. (You know who you are.)

Deepak
On Mon, 1 Dec 2008, Deepak Jain wrote:
3) No one cares if the server farm is blast-proof (it isn't) if the connectivity in/out of it gets blasted (submessage: silos were meant to deliver one thing; datacenters aren't in the same operational model once they need connectivity to the outside world)
It's much easier to restore fiber connectivity in a time of crisis than it is to source hardware manufactured at the other end of the world and have it set up properly. I do think there is value in keeping the hw safer than the connectivity to the outside.

I bet the military or emergency services can establish a 10km fiber stretch in a few hours. Replacing some telecom hw and setting it up from scratch would probably take weeks (I'm not talking about a single router here).

--
Mikael Abrahamsson
email: swmike@swm.pp.se
Mikael Abrahamsson wrote:
On Mon, 1 Dec 2008, Deepak Jain wrote:
3) No one cares if the server farm is blast-proof (it isn't) if the connectivity in/out of it gets blasted (submessage: silos were meant to deliver one thing; datacenters aren't in the same operational model once they need connectivity to the outside world)
It's much easier to restore fiber connectivity in a time of crisis than it is to source hardware manufactured at the other end of the world and have it set up properly. I do think there is value in keeping the hw safer than the connectivity to the outside.
I bet the military or emergency services can establish a 10km fiber stretch in a few hours. Replacing some telecom hw and setting it up from scratch would probably take weeks (I'm not talking about a single router here).
Hi Mikael,

The speed with which fibre can be pulled will very much depend on the available paths and other resources. The previous path of the damaged fibre may now be blocked or otherwise unavailable, such that construction work is required. As you say, it is likely to be more difficult to recover from a problem at a datacentre, due to the greater potential for damage and the diversity of resources required.

The point has already been made that not all customers may be able to avail of site resilience due to the associated cost, and so may be reliant on the one datacentre. In addition, one thing which I do not think has been mentioned is that damage to a building may make the site unsafe and possibly injure staff, making the planning, coordination and implementation of site recovery considerably more complicated than simply replacing equipment.

Most customers would not be willing to pay extra to get hardened datacentres, so despite the complexities of recovery, Deepak is largely right when he says that no one cares about blast-proof server farms, at least in the peaceful parts of the world.

Paul.
I bet the military or emergency services can establish a 10km fiber stretch in a few hours. Replacing some telecom hw and setting it up from scratch would probably take weeks (I'm not talking about a single router here).
But we aren't talking about the military here, are we? We are talking about an ISP on an ISP forum. Deepak
Deepak Jain wrote:
I bet the military or emergency services can establish a 10km fiber stretch in a few hours. Replacing some telecom hw and setting it up from scratch would probably take weeks (I'm not talking about a single router here).
But we aren't talking about the military here, are we? We are talking about an ISP on an ISP forum.
Yes.... but in a disaster scenario where critical communication links are down, the military would respond and re-establish the links, if for nothing else than to re-establish situational awareness for themselves.
But we aren't talking about the military here, are we? We are talking about an ISP on an ISP forum.
Yes.... but in a disaster scenario where critical communication links are down, the military would respond and re-establish the links, if for nothing else than to re-establish situational awareness for themselves.
This is getting off-topic in a big way, but I can pretty much assure you that the US military isn't going to be re-establishing ISP circuits for the military's situational awareness. I can't speak for the Swedish military. In most countries with a big-bad-military, the most the military will do is allow the commercial entities to expedite their own repairs and perhaps bypass certain permit requirements -- which is as it should be.

If this is the reason to build a bomb-proof datacenter, I encourage all my competitors to do so. Someone said it earlier: it's far cheaper, and far more reliable, to be massively redundant than super-hardened in one (or a few) locations. If you think you can't afford the former but can get the latter, you don't understand what you are solving for.

Deepak
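A back-of-the-envelope way to see the redundancy-versus-hardening trade, with assumed availability figures (hypothetical numbers, not anything measured):

def combined_availability(per_site, n):
    # Availability of n independent sites where any one surviving is enough.
    return 1 - (1 - per_site) ** n

hardened = 0.9999  # one expensive bunker, assumed "four nines"
cheap = 0.99       # one ordinary site, assumed "two nines"

for n in (1, 2, 3):
    print("%d cheap site(s): %.6f" % (n, combined_availability(cheap, n)))
print("1 hardened site:  %.6f" % hardened)

Three ordinary sites at an assumed 99% each come out around 99.9999%, beating the single four-nines bunker -- provided the failures really are independent, which is exactly what geographic diversity is meant to buy.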
--On Monday, 1 Dec 2008 18.19.14 -0500 Deepak Jain <deepak@ai.net> wrote:
1) This datacenter is only 12,000 sq ft. (submessage: who cares?)
For some things, it is OK. It is not the only one, only the best-marketed one.
2) The generators are underground. A leak in their exhaust system kills everyone -- worse, a leak in their fuel tank or filler lines (when being filled from above) could do the same. Yes, you could address this with alarms (provided they work and are tested, etc).
The original design and purpose required internal gensets. Keeping them inside is still important for a number of reasons. This is the Baltic, not San Diego. Rain, fog, snow, etc. Both intake and exhaust are normally coupled to the outside via boulder-blocked blasted tunnels, so the gas path is not connected to the inside.
3) No one cares if the server farm is blast-proof (it isn't) if the connectivity in/out of it gets blasted (submessage: silos were meant to deliver one thing; datacenters aren't in the same operational model once they need connectivity to the outside world)
See what Mikael wrote.
4) With all of that fog and plant life, I wonder how they critically manage humidity. [Or if they even do].
I have been told by people who have been working with the construction of this very site that it is an unusually dry cave. It is pretty high up by Stockholm standards, which helps. -- Måns Nilsson M A C H I N A if it GLISTENS, gobble it!!
On Tue, 2008-12-02 at 10:33 +0100, Måns Nilsson wrote:
4) With all of that fog and plant life, I wonder how they critically manage humidity. [Or if they even do].
I have been told by people who have been working with the construction of this very site that it is an unusually dry cave. It is pretty high up by Stockholm standards, which helps.
Seems like dry ice was used to make the "tropical fog" in the photos, not water poured over hot rocks like a sauna/bath house.

--
Jeremy Jackson
Coplanar Networks
(519)489-4903
http://www.coplanar.net
jerj@coplanar.net
On Tue, Dec 02, 2008 at 11:19:36AM -0500, Jeremy Jackson wrote:
Seems like dry ice was used to make the "tropical fog" in the photos, not water poured over hot rocks like a sauna/bath house.
I've tried to avoid stating the obvious reading through all this funny thread, but I can't help it now.

Am I the only one thinking that shady lights, tropical fog, creepy tunnels, blue/colored lights, and *waterfalls* are *bad* things in a datacenter?

I mean, it makes a good movie set, but seriously... I wouldn't want to be looking for that damn blue "locator" LED on that 10th switch with a blue neon light...

A.
--
In god we trust, others pay cash. - Richard Desjardins, Miami
The Anarcat wrote:
On Tue, Dec 02, 2008 at 11:19:36AM -0500, Jeremy Jackson wrote:
Seems like dry ice was used to make the "tropical fog" in the photos, not water poured over hot rocks like a sauna/bath house.
I've tried to avoid stating the obvious reading through all this funny thread, but I can't help it now.
Am I the only one thinking that shady lights, tropical fog, creepy tunnels, blue/colored lights, and *waterfalls* are *bad* things in a datacenter?
I mean, it makes a good movie set, but seriously... I wouldn't want to be looking for that damn blue "locator" LED on that 10th switch with a blue neon light...
Not to mention dry ice = carbon dioxide, which isn't particularly healthy for the humans in that enclosed space.

--
Jay Hennigan - CCIE #7880 - Network Engineering - jay@impulse.net
Impulse Internet Service - http://www.impulse.net/
Your local telephone and internet company - 805 884-6323 - WB6RDV
Maybe it isn't dry ice.... Maybe it is from liquid oxygen, in which case it had better be a smoke-free workplace.

----------------------
Brian Raaen
Network Engineer
braaen@zcorum.com

On Tuesday 02 December 2008, Jay Hennigan wrote:
The Anarcat wrote:
On Tue, Dec 02, 2008 at 11:19:36AM -0500, Jeremy Jackson wrote:
Seems like dry ice was used to make the "tropical fog" in the photos, not water poured over hot rocks like a sauna/bath house.
I've tried to avoid stating the obvious reading through all this funny thread, but I can't help it now.
Am I the only one thinking that shady lights, tropical fog, creepy tunnels, blue/colored lights, and *waterfalls* are *bad* things in a datacenter?
I mean, it makes a good movie set, but seriously... I wouldn't want to be looking for that damn blue "locator" LED on that 10th switch with a blue neon light...
Not to mention dry ice = carbon dioxide, which isn't particularly healthy for the humans in that enclosed space.
-- Jay Hennigan - CCIE #7880 - Network Engineering - jay@impulse.net Impulse Internet Service - http://www.impulse.net/ Your local telephone and internet company - 805 884-6323 - WB6RDV
On Dec 2, 2008, at 2:25 PM, Brian Raaen wrote:
Maybe it isn't dry ice.... Maybe it is from liquid oxygen, in which case it had better be a smoke-free workplace.
This is of course off-off-topic, but I would suspect the room-temperature ultrasonic misters, not dry ice or wood smoke.

Regards
Marshall
Marshall wrote:
This is of course off-off-topic, but I would suspect the room-temperature ultrasonic misters, not dry ice or wood smoke.
Regards Marshall
Concur. As anyone who works with air conditioning knows, ultrasonic misters are the low-maintenance option for your humidifier units anyway. A lot of your datacenters have those 8-) There are also doors between the plants and NOC and the server rooms ...

Having them external to the AC and pumping visible fog out into the room, instead of invisibly into the air feeds, is unusual, but if the resulting humidity (in the NOC, not the server rooms) is normal, it's no big deal. You can have the floor covered in an inch of water and the air be at a perfectly safe humidity for systems (just don't drop a live power cable in the water...).

I wouldn't do this personally, but if done right it should be safe.

-george william herbert
gherbert@retro.com
Marshall Eubanks wrote:
On Dec 2, 2008, at 2:25 PM, Brian Raaen wrote:
Maybe it isn't dry ice.... Maybe it is from liquid oxygen, in which case it had better be a smoke-free workplace.
This is of course off-off-topic, but I would suspect the room-temperature ultrasonic misters, not dry ice or wood smoke.
I'd be more worried about the artificial waterfalls... the sound of flowing water has an established physiological effect..... Um... where's the bathroom? -- Jeff Shultz
I would agree with the psychological effects. That would be a downside to working in a place that, aside from that, is so unbelievably kickass.
Speaking as a Datacenter Manager who (believe it or not) at one time was an Art Director, I have to say that the "ambience" in those photographs, in the form of fog, odd/colored lighting, etc., was certainly created at the time of the photo shoot by an Art Director ... with delusions (illusions) of grandeur in mind. I imagine that were any of us to visit the site in question on a normal working day, we'd find no such special effects, and it would look, other than the granite walls, not too different from any other datacenter or NOC.

_________________________________________
chuck goolsbee - fully RFC 1925 compliant
chuck goolsbee wrote:
would look, other than the granite walls
On the subject of suitability problems: unless there is good air circulation in these bunkers from the outside, radon seepage from the surrounding granite has the potential to cause a lot of health problems for any unlucky punter who happens to work in there, although it's unlikely to have any effect on the equipment housed in the facility. Having said that, radon seems to be a well-known problem in Stockholm, and I've no doubt that they took measures to deal with it.

Nick
On Tue, 2008-12-02 at 21:49 +0000, Nick Hilliard wrote:
chuck goolsbee wrote:
would look, other than the granite walls
On the subject of suitability problems: unless there is good air circulation in these bunkers from the outside, radon seepage from the surrounding granite has the potential to cause a lot of health problems for any unlucky punter who happens to work in there, although it's unlikely to have any effect on the equipment housed in the facility.
So control systems in nuclear power plants don't need any extra shielding to prevent "glitches"?
From: Marshall Eubanks [mailto:tme@multicasttech.com]
Sent: Tuesday, December 02, 2008 15:15
This is of course off-off-topic, but I would suspect the room-temperature ultrasonic misters, not dry ice or wood smoke.
Still off-topic, but I hope they used distilled water. If the water has a medium-to-high mineral content ("hard" water), the minuscule droplets produced by ultrasonic misters evaporate quickly into microscopic dust motes, small enough to evade most filtering systems. (This data center actually reminds me of the old Kon-Tiki movie theater in Dayton, OH.)

--
Jim Goltz <jgoltz@mail.nih.gov>
CIT/DCSS/HSB/ASIG 12/2216
DCSS Firewall group on-call: 240-338-2103
On Tue, 02 Dec 2008 13:26:51 EST, The Anarcat said:
Am I the only one thinking that shady lights, tropical fog, creepy tunnels, blue/colored lights, and *waterfalls* are *bad* things in a datacenter?
Well, across the hall we have:

Photo-op version: http://www.vtnews.vt.edu/story.php?relyear=2006&itemno=621
Production version: http://www.arc.vt.edu/images/upgrade/IMG_2434.jpg
On Nov 28, 2008, at 8:34 AM, Steven M. Bellovin wrote:
http://royal.pingdom.com/2008/11/14/the-worlds-most-super-designed-data-cent... (No, I don't know if it's real or not.)
--Steve Bellovin, http://www.cs.columbia.edu/~smb
It has become de rigueur in some parts of the colocation and wholesale datacenter space to have a media puff-piece done on your new datacenter. Typically, that puff-piece is full of hyperbole and contains lots of power and efficiency numbers that don't add up. The tech media is a willing participant and, while they don't know any better, they don't make the effort to pick up the phone and ask someone who might know a bit more than they do.

The classier datacenter providers generally don't do this stuff. For one thing, it's an absolute waste of time -- it generates a lot of worthless and time-wasting leads for your sales force.

Daniel Golding
participants (30)
- Blake Pfankuch
- Brian Raaen
- Buhrmaster, Gary
- Charles Wyble
- chuck goolsbee
- Daniel Golding
- Danny McPherson
- Deepak Jain
- George William Herbert
- Goltz, Jim (NIH/CIT) [E]
- Ian Mason
- Jay Hennigan
- Jeff Shultz
- Jeremy Jackson
- Jim Popovitch
- Lamar Owen
- Lyndon Nerenberg
- Marshall Eubanks
- Martin List-Petersen
- Mikael Abrahamsson
- Måns Nilsson
- Nick Hilliard
- Patrick W. Gilmore
- Paul Cosgrove
- Raoul Bhatia [IPAX]
- Seth Mattinen
- Steven M. Bellovin
- Suresh Ramasubramanian
- The Anarcat
- Valdis.Kletnieks@vt.edu