
How Data Center Operator IPR Survived Sandy

samzenpus posted about 2 years ago | from the keep-on-keeping-on dept.

Data Storage 50

Nerval's Lobster writes "At the end of October, Hurricane Sandy struck the eastern seaboard of the United States, leaving massive amounts of property damage in its wake. Data center operators in Sandy's path were forced to take extreme measures to keep their systems up and running. While flooding and winds knocked some of them out of commission, others managed to keep their infrastructure online until the crisis passed. In our previous interview, we spoke with CoreSite, a Manhattan-based data center that endured even as much of New York City went without power. For this installment, Slashdot Datacenter sat down with executives from IPR, which operates two data centers—in Wilmington, Delaware and Reading, Pennsylvania—close to Sandy's track as it made landfall over New Jersey and pushed northwest."


PPPPP (5, Insightful)

Anonymous Coward | about 2 years ago | (#42030277)

Proper Planning Prevents Poor Performance

Re:PPPPP (4, Informative)

jesseck (942036) | about 2 years ago | (#42031087)

I remember seeing this in the past... but we used the 7 P's: "Proper Prior Planning Prevents Piss-Poor Performance"

Re:PPPPP (0)

Anonymous Coward | about 2 years ago | (#42037273)

Brevity.

Duh, just outsource it to India (4, Funny)

crazyjj (2598719) | about 2 years ago | (#42030325)

My friend Rahul and Sameer will take care of your needs, and are to be speaking excellent English, most also.

Re:Duh, just outsource it to India (0)

Anonymous Coward | about 2 years ago | (#42031143)

Yes, please do the needful.

Re:Duh, just outsource it to India (0)

Anonymous Coward | about 2 years ago | (#42033699)

Argh, where did that phrase come from!

Bain of my existence

Re:Duh, just outsource it to India (0)

Anonymous Coward | about 2 years ago | (#42033765)

Bane.

dupe! (0, Troll)

larry bagina (561269) | about 2 years ago | (#42030381)

Already posted here [slashdot.org]. Do you guys even bother checking?

Re:dupe! (0, Troll)

Anonymous Coward | about 2 years ago | (#42030611)

If you pay enough to slashdot, you get to run your ad multiple times.

Ducked the most important question (3, Interesting)

joeflies (529536) | about 2 years ago | (#42030559)

In his response to the question "So you suffered no downtime at all?", the business development manager gave a non-answer to a yes/no question. The interviewer should have followed up to get a clear answer.

Re:Ducked the most important question (1)

Dishevel (1105119) | about 2 years ago | (#42031179)

Read the whole interview.

tl;dr version (3, Funny)

kiite (1700846) | about 2 years ago | (#42030613)

"We didn't do anything special; our power never went out."

Re:tl;dr version (2)

mcgrew (92797) | about 2 years ago | (#42030849)

Of course their power never went out: they had three separate electric companies wired in; if one went down, a second kicked in, and if that went down, one of their two generators kicked in.
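
In pseudocode, that priority chain is just first-available selection. A minimal sketch, with hypothetical feed names (a real automatic transfer switch does this in hardware, not in Python):

    # Failover order described above: three utility feeds, then the
    # two on-site generators, tried in priority order. The names and
    # availability flags are hypothetical.
    def select_power_source(sources):
        for name, available in sources:
            if available:
                return name
        return None  # total outage: you're down to UPS batteries

    sources = [("utility-feed-1", False), ("utility-feed-2", True),
               ("utility-feed-3", True), ("generator-1", True),
               ("generator-2", True)]
    print(select_power_source(sources))  # -> utility-feed-2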

Re:tl;dr version (4, Informative)

kiite (1700846) | about 2 years ago | (#42030931)

Right. And, according to TFA, none of their supplies ever went out. I live in NYC. A lot of the city lost power, sure. The transit system was knocked out, sure. There was a lot of flooding in fringe areas, where most data centers weren't. This guy is talking about NYC like it got demolished by the storm. Slashdot already did a piece on how a NYC data center mitigated power loss; reading TFS, I was hoping for the point of view of a more heavily battered facility. Instead, I got, "We had backups, and we think they work because we test them regularly, but we didn't actually have to do anything."

Re:tl;dr version (1)

Synerg1y (2169962) | about 2 years ago | (#42030973)

Still... whole sections of the grid went down in the NYC area, and electrical companies own portions of that. It sounds like where they were, they still had access to power; most didn't, so it was generator or shut down, and for most it was the latter. I'm not trying to bash them by any means, though; 2N is hella impressive in scope and investment. The question now is whether other datacenters will follow suit, despite a hurricane up there being a half-century type event.

Re:tl;dr version (2)

hawguy (1600213) | about 2 years ago | (#42031141)

Of course their power never went out: they had three separate electric companies wired in; if one went down, a second kicked in, and if that went down, one of their two generators kicked in.

I wonder if that's true - I can believe that they bring in power from several different substations, but if there were a widespread grid outage, it seems like that would have taken out all of their substations.

I don't understand this comment:

We potentially have two power grids. We actually have three. The third we would never go to, I don’t think.

Why would they connect to a power source that they'd never go to?

Re:tl;dr version (4, Informative)

Rich0 (548339) | about 2 years ago | (#42031241)

Yeah, Reading PA. Go look it up on a map. They weren't going to be having multiple substation outages that far inland. My own workplace didn't lose power and is about 20 miles further east. It doesn't hurt that they're about 200 yards from a substation and that both the substation and the plant site are fed by transmission lines on steel towers that stand WAY above the height of nearby trees.

Most of the outages for Sandy were due to flooding or downed trees. The former was only a problem along the coast or near rivers, and really a big problem for NYC where they have transmission equipment underground. Trees are horrible for the last mile of power delivery, but aren't an issue for the major substations, since if you drive by one of them you'll note that the transmission lines are WAY up in the air, and the trees are trimmed back a huge distance on either side of them anyway. The towers themselves are steel and on concrete foundations - they're not going to fall unless they're hit by something like a tornado.

The reason so many lost power wasn't because of transmission being cut, but by a bazillion downed trees taking out every other telephone pole in the region. If you want an IT analogy imagine if all your big network feeds and datacenter are intact, but some vandal walks around your building and sticks a firecracker next to every single network port.

For an inland location like Reading PA, this was just a matter of having either good power connectivity, or generators. Wilmington is next to the Delaware Bay and would be at more risk, but as long as you're at reasonable elevation and above-ground you'd be fine.

Re:tl;dr version (1)

hawguy (1600213) | about 2 years ago | (#42031449)

Yeah, Reading PA. Go look it up on a map. They weren't going to be having multiple substation outages that far inland. My own workplace didn't lose power and is about 20 miles further east. It doesn't hurt that they're about 200 yards from a substation and that both the substation and the plant site are fed by transmission lines on steel towers that stand WAY above the height of nearby trees.

Sure, not for this disaster, but how about the next one? It might be a tornado that goes through Reading, or an earthquake, or an east coast ice storm that downs power lines (including steel-towered transmission lines) throughout the region.

Locating a datacenter well outside of a disaster zone just shows that they were lucky for this particular disaster.

Re:tl;dr version (2)

Vancorps (746090) | about 2 years ago | (#42033169)

While you are right, the big selling point for a lot of data centers is physical location. IO Data here in Scottsdale, for instance, prides itself on the fact that there really is no severe weather in the area. Historically the area is geologically stable, not prone to flooding, and nowhere near any forest fires. So their location is their first defense against disaster, with N+3 redundancy as additional defenses.

Disaster planning is hard, some things you take for granted during normal times simply aren't available during a disaster. Think diesel fuel delivery using trucks. That's why a lot of data centers rely on pipelines for fuel delivery with trucks as a standby.

Re:tl;dr version (1)

Rich0 (548339) | about 2 years ago | (#42035447)

I think the most likely disaster scenario for Reading PA would be a meteor impact. That area of the country just doesn't get much in the way of natural catastrophes unless you happen to be right next to a river or creek that can flood. I think a tornado makes the news about once every three years and is generally confirmed by the lawn furniture being dispersed in a non-linear pattern.

Sure, it can happen, but it is about as uneventful an area as you'll find.

Oh, ice storm is another failure mode for sure - you can certainly get those around there.

Wilmington DE power didn't go out. (1)

billstewart (78916) | about 2 years ago | (#42035779)

In fact the standard commercial power in much of the area didn't go out. There were presumably the usual power lines hit by trees or other local outages, but the power grid stayed up. It's too far from the ocean for tide and storm surge flooding, and much of the storm energy either didn't head their direction or got expended on New Jersey.

generators (1)

Anonymous Coward | about 2 years ago | (#42030681)

generators + diesel, it's not rocket science.

Re:generators (0)

Anonymous Coward | about 2 years ago | (#42030875)

In Reading and Wilmington, they didn't even need generators. It just rained a little there.

If IPR had any down time in those locations, I would dump them immediately.

Re:generators (0)

slimjim8094 (941042) | about 2 years ago | (#42031075)

Oops, the basement is completely underwater and the fuel tanks are flooded. What do you do now?

Oh, you put the fuel tanks up high? No you didn't, that's against the fire code (bad idea having flammable liquid above people's heads in a fire).

It's OK, your tanks didn't leak, and you were clever enough to put your generators up high. But the fuel pumps shorted out.

Alright, you got lucky and the pumps were fine. But now you're out of fuel, as is everyone else, and travel is difficult since tunnels are flooded so getting trucks in is a nontrivial task. How do you keep the generators running?

What does Mr. Anonymous Coward, Site Reliability Engineer Extraordinaire, do now? More importantly, did you think of it before this hundred-year storm?

Re:generators (1)

hawguy (1600213) | about 2 years ago | (#42031329)

What does Mr. Anonymous Coward, Site Reliability Engineer Extraordinaire, do now? More importantly, did you think of it before this hundred-year storm?

I think he'd do exactly what this ISP did -- locate their main facility outside the city so they aren't constrained by urban high-rise fire codes and expensive real estate.

Note that their Wilmington facility is in a high-rise building and only has around 10,000 gallons of fuel. At 3MW, that gives them around 48 hours before they need to refuel, so if they had experienced flooding and power loss at that site, they would have had the same problem as the NYC datacenters. Their suburban Reading facility has over 30,000 gallons of fuel - and is located above the 500-year flood plain.
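
That 48-hour figure is consistent with a rule-of-thumb burn rate of roughly 70 gallons per hour per megawatt at full load. A quick back-of-the-envelope sketch (the burn rate is an assumption, not a number from the interview):

    # Runtime = tank size / (load * burn rate). The 70 gal/hr/MW
    # figure is an assumed rule of thumb for diesel gensets at full
    # load, not something IPR stated.
    GAL_PER_HR_PER_MW = 70

    def runtime_hours(tank_gallons, load_mw):
        return tank_gallons / (load_mw * GAL_PER_HR_PER_MW)

    print(runtime_hours(10_000, 3))  # Wilmington: ~47.6 hours
    print(runtime_hours(30_000, 3))  # Reading: ~142.9 hours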

Re:generators (1)

slimjim8094 (941042) | about 2 years ago | (#42032047)

Sure, but then most of the transit is in cities, and it's either more expensive or slower to build outside. It's a tradeoff, to be sure.

Nobody's suggesting that it's impossible for a DC in NYC to weather the scenario that ended up happening, but it's well into diminishing returns, so it's a lot more expensive. Even if the claim could be made that the DC made mistakes, they weren't trivially stupid mistakes - which is what the GP was implying.

Re:generators (1)

mlts (1038732) | about 2 years ago | (#42031527)

This is why business-critical stuff needs to work across more than one data center. There is only so much that can be done at one location.

Yes, the generator may fire up, but even when the diesel tank is full, assuming no trucks are available to refill it, how long will it last, especially if power is out for weeks? There is always the option of a natural gas generator, but at DC scale it would take some large pipes to handle the incoming gas, and this assumes the lines stay pressurized.

Having multiple data centers that are geographically separate as well as some replication system [1] is a must for an enterprise.

If I owned a data center, I'd put the expense and effort into whatever tier I was targeting (including exercising the generator and actually testing the ATS mechanisms), then I'd tell customers that if they wanted more reliability, to look for an additional data center and WAN clustering. Then, have the lawyers write up the SLA with the usual "hurricane/terrorist/acts of war/acts of Thor/etc." disclaimers, and call it done.

[1]: Databases can replicate, Netbackup has AIR, most SANs like EMC's VNX have replication for both LAN/WAN, and if one wants to be really ghetto, the Dropbox software can replicate document changes and stash them offsite. Of course, there is active HA/failover as well, such as PowerHA or vMotion.

Re:generators (0)

Anonymous Coward | about 2 years ago | (#42031675)

My data center was impacted by the storm, unlike TFA's... grid power was down for 3 days, and multiple employees got stuck in the office for a couple of days (road closures kept people in the office and prevented replacement equipment from getting in).

I'll repeat: generators, diesel, frozen food, and offers of couches to crash on from any employee living within walking distance.

It also takes some incredible foresight, like not building your data center at ground level next to a river/flood plain (while the roads may have been flooded out, the facility itself is set well back from the flood zones).

Re:generators (0)

Anonymous Coward | about 2 years ago | (#42031987)

Actually, there is a well-tested technology that allows the motor to be on the roof and the pump in the basement, working over thousands of feet: the pump jack seen on oil wells, with a sucker rod down to the pump. Yes, you have to have a pipe chase down to the tank, but you can ensure that nothing gets flooded. Today you can find smaller fully submersible pumps, perhaps hydraulic, that would go in the tank and be powered by lines run down from up high. So the oil issue is economics; the technology exists in many forms.

Re:generators (1)

Hobadee (787558) | about 2 years ago | (#42032003)

A well planned data center will have a fuel-delivery contract that says something along the lines of: "After X days, you must be able to deliver Y fuel every Z days. If you don't or can't deliver Y fuel every Z days, you pay us for the downtime we incur."

As long as they have enough fuel onsite to last X days, they are fine; the fuel delivery company is on the hook if they go down. (Assuming everything is regularly tested and in good working order.)
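
Whether terms like that actually protect you reduces to two inequalities: the on-site tank has to bridge the gap until deliveries start, and each delivery has to cover the consumption between deliveries. A sketch with entirely hypothetical numbers:

    # On-site fuel must cover the first X days; each delivery of Y
    # gallons must cover Z days of burn. All values hypothetical.
    def contract_covers_load(onsite_gal, burn_gal_per_day,
                             start_after_days, delivery_gal, every_days):
        bridges_gap = onsite_gal >= burn_gal_per_day * start_after_days
        keeps_pace = delivery_gal >= burn_gal_per_day * every_days
        return bridges_gap and keeps_pace

    # 30,000 gal on-site, 6,000 gal/day burn, 15,000 gal delivered
    # every 2 days starting after day 3:
    print(contract_covers_load(30_000, 6_000, 3, 15_000, 2))  # True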

Re:generators (1)

petermgreen (876956) | about 2 years ago | (#42036289)

I can't believe any sane fuel delivery company would sign a contract as simplistic as that. I'd at the very least expect a real contract to have "Force majeure" clauses and a set price for each day of downtime.

Re:generators (1)

hawguy (1600213) | about 2 years ago | (#42031171)

generators + diesel, it's not rocket science.

I'd say that locating outside of the hurricane's path was the better choice - having backup power does no good if the carriers that serve you can't power their equipment (like the earlier anecdote from the other datacenter about one of their carriers having their generator confiscated by the NYPD)

Re:generators (2)

ATestR (1060586) | about 2 years ago | (#42031283)

I'd say that locating outside of the hurricane's path was the better choice

If it isn't a hurricane, it's an earthquake. If not that, then a nasty blizzard. Or a tornado. You can't avoid them all, and it's best to prepare as best you can for the events possible at the location that you choose. No one can prevent all disasters, but you can mitigate the risk.

Re:generators (1)

hawguy (1600213) | about 2 years ago | (#42031359)

I'd say that locating outside of the hurricane's path was the better choice

If it isn't a hurricane, it's an earthquake. If not that, then a nasty blizzard. Or a tornado. You can't avoid them all, and it's best to prepare as best you can for the events possible at the location that you choose. No one can prevent all disasters, but you can mitigate the risk.

And the best way to mitigate risk is to have your DR site in a completely different geographical area. Relying on a single datacenter to keep your company running during a large scale disaster is foolish.

Re:generators (3, Interesting)

LoRdTAW (99712) | about 2 years ago | (#42032077)

I work next to a Verizon data center out here in Farmingdale, Long Island (supposedly it carries all the Verizon cell phone traffic for Long Island). They recently built extensions to the building and had two large diesel generators installed, along with a 15,000 gallon fuel tank and two large cooling systems. Turns out they needed it.

During the aftermath they didn't run out of diesel because they brought in an additional on-site 15,000 gallon fuel tank (in a 40 foot container). Plus they had semi trucks with sleeper tractors from out of state, with trailers full of diesel, ready to refill the tanks on site 24/7. Armed guards manned the premises 24/7 and lived out of a mobile home. They also brought in generator-powered flood lights to keep the surrounding property lit up like daylight. Those generators sounded like a pair of locomotives running, probably because they use engines of similar size. My manager found out they burn 4,000 gallons of diesel a day keeping the building going. They ran the generators until the 7th or 8th, so they burned something like 40,000+ gallons of fuel in that time frame.

If you have the money and the right infrastructure, you can keep the power going as long as you need.
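
The arithmetic checks out: Sandy made landfall on October 29th, so running through November 7th or 8th is nine to ten days at 4,000 gallons a day:

    # 4,000 gal/day from the Oct 29 landfall through Nov 7-8:
    for days in (9, 10):
        print(days, "days ->", days * 4_000, "gallons")  # 36,000-40,000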

My Datacenter Had Perfect Uptime (5, Funny)

Anonymous Coward | about 2 years ago | (#42030929)

I had perfect 100% uptime during Sandy. No packet loss, no adverse effects, no fuss, no muss.

Please write a Slashdot article about me. Also, please conveniently ignore the fact that my datacenter is in Kansas City, MO.

Re:My Datacenter Had Perfect Uptime (1)

Bigby (659157) | about 2 years ago | (#42031205)

I had no packet loss either. It's kind of hard to lose something that was never sent.

I live just south of Hoboken.

Re:My Datacenter Had Perfect Uptime (0)

Anonymous Coward | about 2 years ago | (#42032089)

The datacenter I run likewise had no packet or power loss issues... It was like hell the day Hurricane Sandy rolled through here in sunny Tucson, AZ. Oh, wait... that was "summer" here.

Innovative technique (1)

Rytr23 (704409) | about 2 years ago | (#42030959)

Not being in the path of the worst of the storm. When everything around you stays up and running, your DC probably will too. These guys are genius. Who interviewed these tools and why?

uhuhhh... (1)

WGFCrafty (1062506) | about 2 years ago | (#42030961)

They're built for redundancy; if any integral systems had gone down in both, THAT would be news. They could have lost one complete physical location and their clients would be upset, but not out cash....

This isn’t a sales pitch, but if you do it yourself, you only have yourself to yell at, to complain to.

Then why does it read exactly like that, with no real substance and mediocre answers?

This Brings Up A Great Question (1)

Revotron (1115029) | about 2 years ago | (#42030991)

If a generator's sitting idle in a data center, and the power never goes out, is it working?

You know, I heard data centers in Bangalore also had perfect uptime during Hurricane Sandy. Well, at least the ones that weren't suffering from brownouts.

Re:This Brings Up A Great Question (1)

mcgrew (92797) | about 2 years ago | (#42031341)

If a generator's sitting idle in a data center, and the power never goes out, is it working?

Yes, they test them regularly, especially if they're expecting a big storm.

Re:This Brings Up A Great Question (0)

Anonymous Coward | about 2 years ago | (#42031429)

*whoosh*

Datacenter catastrophe checklist (2)

sl4shd0rk (755837) | about 2 years ago | (#42030995)

1) On-site diesel to power ops for 48 hours (see the sizing sketch below)
2) Tanker of diesel pumped to the doorstep within 12 hours
3) Generators
4) Backup generators
5) 48 hours' worth of food for staff + repair guys
6) Nearby lodging reservations for staff + repair guys
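
A quick sizing sketch for item 1, assuming a hypothetical 2 MW load and a rule-of-thumb burn rate of about 70 gal/hr per MW:

    # Tank size for item 1: 48 hours at full load. The burn rate is
    # an assumed rule of thumb, and the 2 MW load is hypothetical.
    def tank_size_gallons(load_mw, hours=48, gal_per_hr_per_mw=70):
        return load_mw * hours * gal_per_hr_per_mw

    print(tank_size_gallons(2))  # 2 MW site -> 6,720 gallons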

Re:Datacenter catastrophe checklist (1)

Keruo (771880) | about 2 years ago | (#42031861)

Or you could

1) place physically similar datacenters around the world
2) make your datacenter virtual, so you can keep the applications running at any place, and verify that hot-migrate works
3) ignore localized storms, since you have capacity and uptime on a global scale

Sure, you notice that the datacenter goes down, but you don't have to waste diesel on generators, since the services have already been handed over to the next datacenter.
Your crew can stay at home sleeping in their own beds rather than in some cleaning closet or meeting room at the datacenter, and fix things once the storm has passed.
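
A minimal sketch of that hand-over idea, with hypothetical site and VM names; migrate() is a stub for whatever hot-migration mechanism your virtualization layer actually provides:

    # Pre-emptively evacuate a site when a storm warning lands.
    def migrate(vm, src, dst):
        # Stub: in reality this would drive vMotion, live migration, etc.
        print(f"migrating {vm}: {src} -> {dst}")

    def evacuate(site, fallback, vms):
        for vm in vms:
            migrate(vm, site, fallback)

    evacuate("nj-datacenter", "chicago-datacenter", ["web-01", "db-01"])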

Re:Datacenter catastrophe checklist (1)

funkboy (71672) | about 2 years ago | (#42032613)

Weeelll, the problem with #2 in NYC was that the city wouldn't let fuel trucks for the datacenters in lower Manhattan into the area until the debris there was cleared (which makes sense, as they didn't want to have to deal with stuck fuel trucks too). Most of the NYC DCs that ran out of fuel ran out for this reason.

Which brings up a few common rules for ultra-high availability datacenters:

  - don't build them in the middle of a city (riots, strikes, traffic, WTC being kamikazied, etc)
  - build them in a low disaster risk zone (e.g. flood, tsunami, earthquake, forest fires, plane crash, etc).
  - available power sources should include at least *two* technologies that don't require a truck roll to refuel (e.g. utility electric, utility natural gas, wind/solar/hydro)
  - or if that's really not available or "too expensive", then store at least a week's worth of fuel on-site.

For bonus points you can run your DC off of whichever power technology is most economical at any given time during the day. This includes cranking up your diesel during the most expensive peak hours of the day when your fuel costs you less than utility electric. For extra bonus points, sell your surplus power back to the utility during these peaks. It's also a nice way to prove to your clients that your generator works :-).
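
The breakeven for that trick is simple division. A sketch with assumed numbers (diesel price, genset efficiency, and the peak tariff are all hypothetical):

    # Self-generation cost vs. utility peak rate. ~14 usable kWh per
    # gallon of diesel is an assumed genset efficiency; prices are
    # made up for illustration.
    KWH_PER_GALLON = 14
    diesel_price = 3.50   # $/gallon
    peak_rate = 0.30      # $/kWh

    self_gen = diesel_price / KWH_PER_GALLON  # ~$0.25/kWh
    print(f"self-generation: ${self_gen:.3f}/kWh")
    print("run the gensets at peak" if self_gen < peak_rate
          else "stay on utility power")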

Summary reminded me of this (0)

Anonymous Coward | about 2 years ago | (#42031081)

http://xkcd.com/705/

We stayed up!! (0)

Anonymous Coward | about 2 years ago | (#42031367)

We have data centers in Tinton Falls and Piscataway - both in the path of the storm - and we stayed up throughout. The hardest part was getting fuel... but we had 2000+ servers and dozens of data/telco circuits running the whole time :)

Me too (2)

sjames (1099) | about 2 years ago | (#42031913)

I did just fine during Sandy as well. I have a laptop with a good battery and I can always run it from a cigarette lighter adapter in the car, but I never lost grid power. Of course, I live in Ga. but that's beside the point.

this is nothing new (1)

neitzert (184856) | about 2 years ago | (#42038041)

This is nothing new. I built this 10 years ago: http://patentscope.wipo.int/search/en/WO2003090106
