Raised Flooring Obsolete or Not?

mstansberry writes "In part three of a series on the price of power in the data center, experts debate the merits of raised flooring. It's been around for years, but the original raised floors weren't designed to handle the airflow people are trying to get out of them today. Some say it isn't practical to expect air to make several ninety-degree turns and still get where it's supposed to go. Is cooling with raised floors the most efficient option?"
  • by Anonymous Coward
    ...but in lowered walling.
    • No (Score:4, Informative)

      by temojen ( 678985 ) on Thursday November 03, 2005 @04:51PM (#13944585) Journal
      It's in lower-power chips, more efficient PSUs, and possibly liquid cooling where the radiator is outside the building (or a heat exchanger to a heat-pump loop in hot climates).
      • Re:No (Score:3, Insightful)

        by Nutria ( 679911 )
        liquid cooling

        Being, literally, a grey-beard who remembers working on intelligent (3270-series) terminals and water-cooled mainframes, and who remembers Unix and DOS punks crowing about how "the mainframe is dead"... things like Citrix, LTSP, liquid-cooled racks, and IBM setting new records in "number of mainframe MIPS sold every year" really amuse me.
      • Re:No (Score:4, Interesting)

        by Keruo ( 771880 ) on Thursday November 03, 2005 @07:56PM (#13946317)
        Well, leave out raised floors and install servers on floor level then.
        But remember, this is what happens when shit hits the fan [novell.com] and servers are on floor level.
  • Where else? (Score:5, Funny)

    by Anonymous Coward on Thursday November 03, 2005 @04:44PM (#13944528)
    Where else am I going to store my beer so it can stay cold and the boss not find it?
    • by Anonymous Coward
      Where else am I going to store my beer so it can stay cold and the boss not find it?

      The problem is your boss. At a previous company my boss was the one who insisted we have a "beer fridge" hidden in the back of our server room, out of sight of the rest of the company.
    • by hamburger lady ( 218108 ) on Thursday November 03, 2005 @05:10PM (#13944783)
      man, i'd much rather have you for an employee than the guy asking where to hide the bodies..
    • Re:Where else? (Score:4, Informative)

      by craigburton ( 910855 ) on Thursday November 03, 2005 @05:36PM (#13945074)
      Can I suggest fitting one of these in your data centre racks?

      http://www.canford.co.uk/commerce/resources/catdetails/2457.pdf [canford.co.uk]

      or maybe even one of these...

      http://www.canford.co.uk/commerce/resources/catdetails/2458.pdf [canford.co.uk]
    • Easy: you upgrade to a whiskey habit, which requires no refrigeration and makes no telltale psssst when you open it. Also, indefinite shelf life, sealed or open. When I worked for a dot-com in 2000-2002, we were allowed beer after 5:00pm, and my boss, the CTO, said it was OK for me to keep a bottle of hard stuff in my desk so long as I waited till past 5 before taking a shot.
  • sub-floor (Score:5, Funny)

    by backdoorman ( 798833 ) on Thursday November 03, 2005 @04:44PM (#13944529)
    But then where will we keep the bodies?
    • Lowered ceilings. To justify the cost, just say you need it for the recessed lighting.
      • by b1t r0t ( 216468 ) on Thursday November 03, 2005 @04:56PM (#13944643)
        Lowered ceilings. To justify the cost, just say you need it for the recessed lighting.

        Downside: needs more reinforcement, especially if you need to hide an overweight PHB. Upside: if the odors go upwards, the bodies will remain undetected longer.

        Or you could just use old enclosed racks as sarcophagi, hiding them in the back of the storage room behind stacks of obsolete boxen.

    • Limit it to marketing people and lawyers. !human == !murder, ergo no legal trouble.

      You might get in trouble with PETA, but the last time I checked they only concerned themselves with cute animals and didn't care much about invertebrates.
    • Re:sub-floor (Score:3, Interesting)

      by Clemensa ( 800698 )
      Yes... the bodies of mice. In all seriousness, every so often we get the most awful smell in our server room. That's when we call Rentokil, and they inevitably find the bodies of dead mice in the raised flooring of our server room. Bear in mind it's a couple of floors up. When people said to me "you are never more than 10 foot away from a rat when you are in London" I took it to mean horizontal distance, not *actual* distance (I didn't imagine that many rats lived on every floor of buildings...)
    • I'm more concerned about keeping my booze cool than hiding bodies. The bodies can be dissolved in caustic soda and flushed down the toilet.
  • Turns? (Score:5, Insightful)

    by mboverload ( 657893 ) on Thursday November 03, 2005 @04:45PM (#13944534) Journal
    As long as the space under the floor is kept at negative or positive pressure, I can't see how some turns have anything to do with the air flow.
    • Re:Turns? (Score:5, Funny)

      by geoffspear ( 692508 ) on Thursday November 03, 2005 @04:49PM (#13944559) Homepage
      Thanks for reassuring me about this.

      After reading this very insightful article summary, I was planning to completely replace all of the ductwork in my house on the assumption that air can't go around corners. You just saved me several thousand dollars.

    • Re:Turns? (Score:5, Interesting)

      by AKAImBatman ( 238306 ) * <akaimbatman AT gmail DOT com> on Thursday November 03, 2005 @04:52PM (#13944598) Homepage Journal
      Indeed. It's been years since I've seen a raised floor. As far as I know, most new datacenters use racks and overhead wire guides instead. The reason for this is obviously not the air flow. The raised floor made sense when you had only a few big machines that ran an ungodly number of cables to various points in the building. (At a whopping 19.2K, I'll have you know!) Using a raised floor allowed you to simply walk *over* the cabling while still allowing you to yank some tiles for easy troubleshooting.

      (Great way to keep your boss at bay, too. "Don't come in here! We've got tiles up and you may fall in a hole! thenthegruewilleastyouandnoonewillnoticebwhahaha")

      With computers being designed as they are now, the raised floor no longer makes sense. For one, all your plugs tend to go to the same place, i.e. your power cords go to the power mains in one direction, your network cables go to the switch (and ultimately the patch panel) in another, and your KVM console is built into the rack itself. With the number of computers being managed, you'd spend all day pulling up floor tiles and crawling around in tight spaces trying to find the right cable! With guided cables, you simply unhook the cable and drag it out. (Or for new cables, you simply loop them through the guides.)

      So in short, times change and so do the datacenters. :-)
      • Re:Turns? (Score:5, Interesting)

        by LaCosaNostradamus ( 630659 ) <`moc.liam' `ta' `sumadartsoNasoCaL'> on Thursday November 03, 2005 @05:14PM (#13944838) Journal
        Obviously you realize that as the equipment contents of datacenters change, it doesn't make sense to change the room structure all that much? Hence many older datacenters have retained their raised floors. Of course, their air conditioners were also designed for raised floors.

        I don't know where you've worked, but every datacenter I've seen has had a raised floor, and all of them still had at least one mainframe structure still in use ... hence, they still routed cables under the floor for them, by design.
      • Re:Turns? (Score:4, Interesting)

        by nettdata ( 88196 ) on Thursday November 03, 2005 @05:17PM (#13944872) Homepage
        Actually, with the way computers are being designed now, raised flooring and proper cooling are even MORE of an issue than they used to be.

        With the advent of blades, the heat generated per rack space is now typically MUCH higher than it was back in the day. If anything, the raised flooring should be redesigned, as it can't cope with the airflow needed for higher-density server rooms.

        You'll find that a number of racks are being redesigned with built-in plenums for cooling... a cold feed on the bottom, and a hot return at the top, with individual ducts for various levels of the rack.

        There are even liquid-cooled racks available for the BIG jobs.

        I think it's not so much that we're going to get rid of raised floors as that we'll redesign their materials and layout to better meet the needs of today.

      • Re:Turns? (Score:4, Insightful)

        by swb ( 14022 ) on Thursday November 03, 2005 @05:55PM (#13945263)
        I still don't see why running cabling under the floor is worse than running it in overhead trays. Either the trays are too high to get at without a ladder (thus making them at least as inconvenient as floor tiles), or they're too low and you bash things into them.

        Overhead tray systems also suffer from a fairly rigid room layout, and I have yet to see a data center being used the way it was originally laid out after a few years. Raised flooring allows a lot of flexibility for power runs, cabling runs and so on without having to install an overhead tray grid.

        Raised flooring also offers some slight protection against water leaks. We had our last raised floor system installed with all the power and data runs enclosed in liquidtight conduit due to the tenant's unfortunate run-ins with the building's drain system and plumbing in the past.

        I guess overhead tray makes sense if all you want to do is fill 10,000 sq ft with rack cabinets, but it's not really that flexible or even attractive, IMHO.

    • Re:Turns? (Score:5, Interesting)

      by convolvatron ( 176505 ) on Thursday November 03, 2005 @04:52PM (#13944600)
      You're right in some sense: the pressure underneath the plenum will force air through no matter what. There are, however, two problems. The first is that turbulence underneath the floor can turn the directed kinetic energy of the air into heat... this can be a real drag. In circumstances where you need to move a lot of air, the channel may not even be sufficiently wide.

      More importantly, the air ends up coming out where the resistance is less, leading to uneven distribution of air. If you're grossly over-budget and just relying on the ambient temperature of the machine room, this isn't a problem. But when you get close to the edge it can totally push you over.
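      To put rough numbers on that second point, here is a minimal sketch in Python (the plenum pressure and per-tile flow coefficients are invented for illustration, not measured): every tile sees roughly the same plenum pressure, so flow follows the orifice law Q = C * sqrt(dP), and the low-resistance tiles grab a disproportionate share of the supply.

        from math import sqrt

        plenum_dp = 0.05  # assumed underfloor static pressure, inches w.g.
        # Flow coefficients (CFM per sqrt(in. w.g.)) -- invented for illustration:
        tiles = {
            "near the AC, clear": 900,
            "mid-room": 700,
            "far corner, cable-choked": 300,
        }

        # Same plenum pressure everywhere, so each tile's share depends only on C:
        flows = {name: c * sqrt(plenum_dp) for name, c in tiles.items()}
        total = sum(flows.values())
        for name, q in flows.items():
            print(f"{name:25s} {q:6.1f} CFM ({100 * q / total:4.1f}% of supply)")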
    • ahem [wikipedia.org].
    • It's more a matter of airflow. If you have high airflow, it can matter. For example, whether you drive your car with or against the wind, either way you get where you're going; it just takes more energy to fight the wind.

      Granted, this is 70mph-wind stuff we're talking about, so it likely wouldn't apply in a datacenter environment. Although it'd be fun to imagine certain co-workers getting sucked into the hurricane-force winds. Tune in tonight at 7 for "When Datacenters Attack!"
    • Yes, turns (Score:3, Informative)

      The longer the ductwork, the more turns, and the more severe those turns, the harder your fans have to work to achieve the same pressure and airflow. This is because of the increased friction in the duct.

      Now admittedly, friction isn't as important to gases as it is to other states of matter, but it can have an effect, especially in high-flow cooling.
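      A back-of-envelope version of the turns argument, using the HVAC "equivalent length" trick: each hard 90-degree elbow gets charged as extra straight duct. The figures below are typical textbook rules of thumb, not measurements.

        straight_ft = 40            # actual straight run
        elbows = 4                  # hard 90-degree turns in the run
        eq_len_per_elbow_ft = 10    # rule-of-thumb penalty per elbow
        loss_per_100ft = 0.08       # in. w.g. per 100 ft at design airflow

        total_eq_ft = straight_ft + elbows * eq_len_per_elbow_ft
        loss = total_eq_ft / 100 * loss_per_100ft
        print(f"effective length {total_eq_ft} ft -> {loss:.3f} in. w.g. of loss")
        # Four hard elbows double the effective length of this 40 ft run,
        # so the fan must develop twice the pressure for the same CFM.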

    • Intuitively (IANAME), I would expect there to be:

      1) Resistance. Turns, right-angled plenums, or obstructions from cables/power cords would impede airflow, right?
      2) While the pressure differential is key, the magnitude of the differential would indicate how much resistance/inefficiency there is.
      3) Even a perfectly working system is only capable of delivering a certain amount of cool airflow. With these hotter and hotter computers, at some point the equipment exceeds your airflow budget.

    • Ever try to pull a string around a corner, or ten corners? Your pull may be the same, but the result is not.

      When air is forced to turn a corner it creates more friction than if it is pushed/pulled in a straight line. This serves both to heat the air and to make the motors creating the negative/positive pressure do that much more work.

      I do wonder how much difference either effect really has. Doesn't seem like there should be much. Raised floors are optimal for taking advantage of convection c
      • Well, for starters, wasting a few thousand square feet of usable space for ventilation is silly. Also, you may not want to bring in fresh air: if it's 100 out and 70 in the room, why bring in 100-degree air? And moving air by convection is not a quick process.
    • by freeweed ( 309734 ) on Thursday November 03, 2005 @05:26PM (#13944958)
      So long as you have positive air pressure under your floor, you'll get *some* effect from your perf tiles. However, as I'm sure some fluid dynamics folks will jump in with, air flow is a HARD problem. Yeah, so you're getting cold air coming up through your perfs. Well, most of them. Some of them are actually pulling air DOWN. Why?

      If you're bored, check out TileFlow [inres.com]. It's an underfloor airflow simulator. You put in your AC units, perf tiles, floor height, baffles, you name it. It will (roughly) work out how many CFM of cold air you're going to see on a given tile. It's near-realtime (takes a second to recalculate when you make changes), so you can quickly add/remove things and see the effect. I spent some time messing with this a couple of years ago, and it's very easy to set up a situation where you have areas in your underfloor with *negative* pressure.

      The article basically summed it up for me:

      McFarlane said raised floors should be at least 18 inches high, and preferably 24 to 30 inches, to hold the necessary cable bundles without impeding the high volumes of air flow. But he also said those levels aren't realistic for buildings that weren't designed with that extra height.

      I'd go with 24 inches MINIMUM, myself. Also, proper cable placement (i.e., not just willy-nilly) goes a long way towards easing airflow issues. Like they said, though, you don't always have the space.

      Of course, with the introduction of a blade chassis or 4, you suddenly need one HELL of a lot more AC :)
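      One plausible mechanism for those tiles pulling air DOWN, sketched with made-up numbers: right at the CRAC discharge the underfloor air is moving fast, and Bernoulli says static pressure falls as velocity rises, so a tile there can sit below room pressure.

        rho = 1.2  # air density, kg/m^3

        def static_gauge_pressure(p_total, velocity_ms):
            """Static pressure left after subtracting the dynamic component."""
            return p_total - 0.5 * rho * velocity_ms ** 2

        p_total = 25.0  # Pa above room pressure; assumed fan output
        for spot, v in [("at CRAC discharge", 7.0), ("mid-floor", 3.0), ("far end", 1.0)]:
            p = static_gauge_pressure(p_total, v)
            label = "blows up" if p > 0 else "SUCKS DOWN"
            print(f"{spot:18s} v={v:3.1f} m/s  static={p:+6.1f} Pa  tile {label}")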
      • I'd go with 24 inches MINIMUM, myself.

        Not bad; about 1" of accumulated cabling per year is typical, so it might last a career.

        A layer each for:

        • Serial cables (RS232)
        • Mainframe cables (more layers here than I can count)
        • Thick Ethernet
        • Arcnet
        • Token Ring
        • Thin Ethernet
        • 10BaseT
        • SCSI this that and the next
        • FDDI
        • SSA
        • FC-AL
        • 100BaseT
        • 1000BaseT
        • 10000Base fibre

        Oh, and don't forget power, 2-phase and 3-phase, 240V and 120V. And those silly transceiver boxes and modems.

        Floors end up being garbage pits...

  • Short Article. (Score:4, Interesting)

    by darkmeridian ( 119044 ) <william.chuang@g[ ]l.com ['mai' in gap]> on Thursday November 03, 2005 @04:46PM (#13944538) Homepage
    Says that raised floors may be inefficient if they get blocked. Then says alternatives are expensive. Direct AC where you need it, the article says.

    Why would raised floors be bad if you used them properly?
    • I've wanted raised floors, but not for cooling. I just want to hide the power and data cables. I figure just a 3" rise in the floor would be sufficient.

      If it wasn't my basement I'd just put outlets in the floor, and if I didn't want it also to serve as my theater room I'd consider outlets in the ceiling.
    • Direct AC where you need it, the article says.
      Maybe we should move towards water cooling. It seems inefficient to keep a big room at 65 degrees just to cool a few square centimeters of silicon.
  • by Infinityis ( 807294 ) on Thursday November 03, 2005 @04:47PM (#13944545) Homepage
    I thought the raised flooring was just to make the people working there look taller and more impressive, kinda like how they do with pharmacists.
  • Turns? (Score:4, Funny)

    by archeopterix ( 594938 ) * on Thursday November 03, 2005 @04:48PM (#13944554) Journal
    Some say it isn't practical to expect air to make several ninety-degree turns and actually get to where it's supposed to go.
    Tell that to the methane in my bowels.
  • by wcrowe ( 94389 ) on Thursday November 03, 2005 @04:48PM (#13944555)
    Another big reason for raised floors is to handle wiring. I know companies where it was installed only for this reason. Cooling wasn't even on their minds.
    • True, but IMO not the best way to handle wiring; overhead runs are much easier and cleaner. Every raised-floor environment I have worked in was a mess under the floor and a nightmare to run new cables through.

      If cooling is not a concern, a concrete slab with overhead runs is the best way. If cooling is an issue, use the raised floor for cooling only and overhead runs for cables.
    • by cvd6262 ( 180823 ) on Thursday November 03, 2005 @06:13PM (#13945439)
      Another big reason for raised floors is to handle wiring.

      ...or plumbing. I'm serious. (And a bit OT.)

      When I was at IBM's Cottle Rd. facility, now (mostly) part of Hitachi, they had just finished rebuilding their main magnetoresistive-head cleanroom (Taurus). They took the idea from the server techs, dug out eight feet from under the existing cleanroom (without tearing down the building), and put in a false floor.

      All of the chemicals were stored in tanks under the floor. Pipes ran vertically, and most spills (unless it was something noxious) wouldn't shut down much of the line. It was a big risk, but if what I hear is correct, people still say it's the best idea they had in a while.
  • You say the "experts" debate it, then ask us? Who you calling expert anyway?

    Hey! You! get offa my cloud!
  • Comment removed based on user account deletion
    • But complaining that the air has to "turn 90 degrees" seems a little silly to me. Is there something I'm missing that an expert can clarify here?

      Laminar flow moves air with far less pressure loss than turbulent flow, and sharp turns trip the flow into turbulence.

    • by Iphtashu Fitz ( 263795 ) on Thursday November 03, 2005 @04:55PM (#13944638)
      If something is airtight, putting air in one end will move air out the other end.

      The problem lies with larger datacenter environments. Imagine a room the size of a football field. Along the walls are rows of air conditioners that blow cold air underneath the raised floor. Put a cabinet in the middle of the room and replace the tiles around it with perforated ones and you get a lot of cooling for that cabinet. Now start adding more rows & rows of cabinets along with perforated tiles in front of each of them. Eventually you get to a point where very little cold air makes it to those servers in the middle of the room because it's flowing up through other vents before it can get there. What's the solution? Removing servers in the middle of hotspots and adding more AC? Adding ducting under the floor to direct more air to those hotspots? Neither is a cheap or effective approach...

      • by bsd4me ( 759597 )

        Put a cabinet in the middle of the room and replace the tiles around it with perforated ones and you get a lot of cooling for that cabinet.

        Maybe this is the problem. Every industrial datacenter I have been in places racks over either empty spaces, or tiles with a large vent in them. The rack has fans in it to force air through vertically (bottom to top). A few perforated tiles get scattered about for the humans, but I have been in some datacenters without them to maximize airflow to the racks. But t

        • Every industrial datacenter I have been in places racks over either empty spaces, or tiles with a large vent in them.

          That works to an extent, but what if the cabinet is pretty much fully loaded? We loaded up 8-foot cabinets with 30+ 1U dual-CPU servers. The amount of air coming up through the holes underneath the cabinets was never enough to cool all that hardware down. Besides, my original example was just that - an example.
    • Not an expert, but I had some HVAC work done recently in my home.

      The blower moving the air only has a certain amount of power. Hook it up to a duct ten feet long, and output basically equals input. Hook it up to a duct ten *miles* long -- even a perfectly airtight one -- and the power you put into one end will be lost by the other end, because the air molecules lose momentum (and gain heat) as they bounce off each other and the walls of the duct.

      Every time a duct turns a right angle, the molecules lose a lot
    • I used to work in a large building which had air ducts for heating/cooling. Unfortunately, the air pressure wasn't well balanced to compensate for the location of the Sun and the office walls (which were added after the office block was built). So people ended up either with freezing cold blasts of air (the North/West sides) or being cooked by the heat of the Sun (South/East sides). Those in the centre got no natural daylight at all, and in those offices at the end of the air duct the air would become stale if
    • by circusboy ( 580130 ) on Thursday November 03, 2005 @05:08PM (#13944765)
      It can turn on a dime, but it also stays on that dime: poor circulation results. Trumpets have nice (if tight) curves, and even building ducts can have redirects inside the otherwise rectangular runs to minimize trapped airflow in corners. For the most part even those corners are curved to help the stream of air.

      Most server rooms aren't part of the duct. The one here, for example, is large and rectangular, with enormous vents at either end - not very well designed.

      Airflow is a very complicated problem. My old employer had at least three AC engineers on full-time staff to work out how to keep the tents cold (I worked for a circus, hence the nick). The ducting we had to do in many cases was ridiculous.

      Why do you think Apple engineering used to use a Cray to work out the air passage through the old Macs? Just dropping air conditioning into a hot room isn't going to do jack if the airflow isn't properly designed and tuned. Air, like many things, doesn't like to turn 90 degrees; it needs to be steered.
  • Just make the floor grates: strong enough to stand on, with lots of small holes. Sheesh.
    • That won't work, for the same reason that leaving the cover off many old Unix workstations would cause them to overheat: the air doesn't go where you need it. Take a look inside a SPARC IPX or something, and it will give you an idea of what directed airflow is all about. Now multiply that by a factor of a gojillion.
  • If we get rid of the raised floors, how am I supposed to impress people with my knowledge of zinc whiskers? [accessfloors.com.au]
  • by Seltsam ( 530662 ) on Thursday November 03, 2005 @04:50PM (#13944576)
    I interned at ARL at Aberdeen Proving Ground this past summer, and when touring the supercomputer room (more like a cluster room these days), the guide said they used one of the computers in the room to simulate the airflow in that room so they could align the systems for better cooling. How geeky is that!
    • the guide said they used one of the computers in the room to simulate the airflow in that room so they could align the systems for better cooling.

      I bet that computer simulated the best cooling for itself.
    • How geeky is that!

      Some call that planning and engineering.

      An engineering firm was hired to do some upgrades to our 2-room computer facility, which included a fan to circulate air between the two rooms. We asked what the CFM of the fans was and how often the air would be exchanged between the rooms. Their answer: dunno, never thought of that. Good thing we did.
  • by Work Account ( 900793 ) on Thursday November 03, 2005 @04:53PM (#13944611) Journal
    To paraphrase a popular saying: "It's the COMPUTERS, stupid!"

    Inefficient architectures must be discarded to make way for more modern, smaller, COOLER processors.

    Let's address the real problem here -- not the SYMPTOM of hot air.

    We need to address the COMPUTERS.
    • by n0dalus ( 807994 ) on Thursday November 03, 2005 @05:03PM (#13944723) Journal
      Perhaps more importantly, better software solutions can make large hardware systems unnecessary. Instead of running and cooling 10 servers for a certain purpose, write better software that lets you do the same thing on just one or two servers. If you cut down the number of servers in the room by enough, you don't even need dedicated cooling.
  • by WesG ( 589258 ) on Thursday November 03, 2005 @04:53PM (#13944612)
    I am waiting for the day when someone invents a computer that doesn't need to be cooled and doesn't generate excess heat.

    Think about the lightbulb... A standard 60-watt incandescent bulb generates lots of heat. A better design is something like the LED bulbs that generate the same number of lumens with much less power and, more importantly, little to no heat.

    Good design can allow these devices to not generate excess heat, hence eliminating the need for the raised floor.
    • This is essentially impossible, unless you consider so-called "reversible computing". But reversible computing must be adiabatic, and thus very slow. Basically, as you slow a computation down you begin to approach ideal efficiency.

      See http://en.wikipedia.org/wiki/Reversible_computing [wikipedia.org]

      Fast computing is made possible by destroying information (that's all computers do really, they destroy information). That destruction process entails an entropy cost that must be paid in heat.
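      The entropy cost mentioned above has a concrete floor, Landauer's principle: erasing one bit must dissipate at least k_B * T * ln 2 of heat. A quick calculation (room temperature assumed):

        from math import log

        k_B = 1.380649e-23   # Boltzmann constant, J/K
        T = 300.0            # assumed room temperature, K

        e_bit = k_B * T * log(2)  # Landauer limit per erased bit
        print(f"minimum heat per erased bit: {e_bit:.2e} J")  # ~2.9e-21 J

        # Even erasing 1e15 bits/s would dissipate only ~3 microwatts at
        # this floor; real chips run many orders of magnitude above it.
        print(f"at 1e15 erasures/s: {e_bit * 1e15:.2e} W")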
    • Efficiency [otherpower.com]

      LEDs are certainly better than flashlight bulbs.

      But when a white LED delivers 15-19 lumens per watt, it's about the same as a 100W incandescent and five times worse than a fluorescent. LEDs appear bright because they put out a fairly focused beam - not because they put out lots of light.
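      Those efficacy figures are easy to sanity-check (the numbers below are typical 2005-era values, assumed rather than sourced):

        lamps_lm_per_w = {
            "white LED": 17.0,                 # midpoint of the 15-19 quoted
            "60 W incandescent": 850 / 60,     # ~14 lm/W
            "100 W incandescent": 1700 / 100,  # ~17 lm/W
            "T8 fluorescent": 90.0,
        }
        for name, lm_per_w in lamps_lm_per_w.items():
            print(f"{name:18s} {lm_per_w:5.1f} lm/W")
        # The LED roughly matches the 100 W bulb and trails the
        # fluorescent by ~5x, as the parent says.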
  • by WormholeFiend ( 674934 ) on Thursday November 03, 2005 @04:53PM (#13944613)
    Just have the whole data center submerged in an inert fluid like the one made by 3M (Fluorinert?), and have the workers wear scuba equipment.

    Most. Efficient. Cooling. Evar!
  • I thought hot air rises, cold air falls.

    The article points out that overhead cooling requires additional fans, etc.

    Racks need to be built more like refrigerators: foam-core/fiberglass insulated, with some nice weatherstripping to create a chamber of sorts. Since the system would be nearly sealed, convection currents from the warm air exhaust rising off the servers in the rack would pull cold air down. Cold air goes in through the bottom of the rack, heats up, gets pushed back through the top. This could pro
    • Note that I'm not calling the parent poster stoopid, but rather the design of forcing cold air through the *floor*. As the parent here notes, cold air falls. This is presumably why most home fridges have the freezer on top.

      I was most surprised to read this article. I've never worked in a data center, but I have worked in semiconductor production cleanrooms, and given the photos I've seen of data centers with the grated flooring, I guess I always assumed the ventilation was handled the same way as in a

  • by G4from128k ( 686170 ) on Thursday November 03, 2005 @04:54PM (#13944629)
    Someone needs to create an air interconnect standard that lets server room designers snap cold-air supplies onto a standard "air port" on the box or blade. The port standard would include several sizes to accommodate different airflow needs, and distribution from large supply ports down to a rack of small ports on servers. A Lego-like portfolio of snap-together port connections, tees, joints, ducts, plenums, etc. would let an IT HVAC guy quickly distribute cold air from a floor, wall, or ceiling air supply to a rack of servers.
    • I would think that if one had multiple racks, the ventilation could be done in between them, for example sucking the return air out of the middle of a pair of racks, and feeding fresh air in the sides. This could be extended as needed.

      My thinking is a good rack system should have the airflow under control.
  • No Raised Floors? (Score:3, Interesting)

    by thebdj ( 768618 ) on Thursday November 03, 2005 @04:55PM (#13944633) Journal
    We had an issue where I once worked: we had so many servers that the general server room used by many different groups was no longer adequate for our needs, since we were outgrowing our allotted space. Instead of building us a new server room with the appropriate cooling (which presumably would have included raised flooring), we got a closet in a new building. This was obviously not much fun for the poor people who worked outside the closet, because the servers made a good deal of noise and even with the door closed were quite distracting.

    Now, we had to get building systems to maximize the air flow from the AC vent in the room to ensure maximum cooling, and the temperature on the thermostat was set to the minimum (about 65 F, I believe). One day, while trying to do some routine upgrades to the server, I noticed things not going so well. So I logged off the remote connection and made my way to the server room.

    What do I find when I get there? The room temperature is approximately 95 F (the outside room was a normal 72) and the servers are burning up. I check the system logs and guess what: it had been like this for nearly 12 hrs (since sometime in the middle of the night). To make this worse, our system administrator was at home on vacation around X-Mas, so of course all sorts of hell was breaking loose.

    We wound up getting the room temperature down after the people from building systems managed to get us more AC cooling in the room; however, the point is it was never really enough. Even on a good day it was anywhere from 75 F to 80 F in the room, and with nearly a full rack and another one to be moved in, it was never going to be enough. This is what happens, though, when administrations are apathetic about IT and the needs of the computer systems, particularly servers. Maybe we should bolt servers down and stick them in giant wind tunnels or something...
    • Okay, screw this post [slashdot.org] about putting the servers in a giant tank filled with a coolant. Put the servers in a vertical wind tunnel so you can practice your sky diving while swapping a hard disk!
  • by zjeah ( 623944 )
    We have been using raised flooring in our data center for decades and never had any cooling issues. Granted, we have 4 large air handlers for the room, but when running a raised floor one must have the proper system in place. Some hardware is designed to get its air right from the floor and some is not. Our large server racks don't have floor openings, so we have vent tiles in the floor on the front side and the servers in turn suck the cool air through. Raised floor is a great place to route cables/power/pho
  • It's unlikely a computer room is going to get "too small" unless your company is growing at an astounding rate. Moore's law has been making computers smaller, faster, and more power-efficient by several dB per year.

    More likely the powers that be have overbought capacity, in order to expand the apparent size and importance of their empire. I've seen several computer rooms that could have been replaced with three laptops and a pocket fan.

    • Or, alternatively, the powers that be don't want to buy all-new hardware every 18 months because Moore's so-called law told them to. Maybe it's often more cost effective to add another server in parallel to the existing ones than to buy new servers, move everything off the old ones onto the new ones, then throw the old servers out.
  • Thermal Dynamics... (Score:2, Informative)

    by BoraSport ( 702369 )
    The raised floor has more to do with how heat moves in an environment than with how you move air through a duct. Most raised floors don't have major ducting under them. In our data centers the raised floor provides a controlled space that we can use to modify temps.

    Heat rises; our original designs back in 2002 for our data center called for overhead cooling using a new gel-based radiator system. It would have been a great solution and caused us to go with a lower raised floor, just for cables and bracin

  • Not obsolete. (Score:2, Interesting)

    by blastard ( 816262 )
    Where I've worked it was primarily for running wires, not cooling. I've also worked in places that have the overhead baskets, and quite frankly, although they are convenient, they are 'tugly. They are great for temporary installations and where stuff gets moved a lot, but I'd rather have my critical wires away from places where they can get fiddled with by bored individuals.

    So, no, I don't think they will be obsolete any time soon. But hey, I'm an old punchcard guy.
  • by mslinux ( 570958 ) on Thursday November 03, 2005 @05:04PM (#13944735)
    "Some say it isn't practical to expect air to make several ninety-degree turns and actually get to where it's supposed to go."

    I wonder how all those ducts throughout America (with tons of 90-degree turns) manage to carry air that heats and cools houses and office buildings every day?
  • by Ed Almos ( 584864 ) on Thursday November 03, 2005 @05:07PM (#13944754)
    I'm in a data center right now with two rack-mounted clusters and three IBM Z series machines, plus a load of other kit. Without the raised flooring AND the ventilation systems things would get pretty toasty here, but it has to be done right. The clusters are mounted in back-to-back Compaq network racks which draw air in the front and push it out the back. We therefore have 'cold' aisles where the air is fed in through the raised floor and 'hot' aisles where the hot air is taken away to help heat the rest of the building.

    The only other option would be water cooling but that's viewed by my bosses as supercomputer territory.

    Ed Almos
  • Obsolete or not... (Score:3, Informative)

    by GillBates0 ( 664202 ) on Thursday November 03, 2005 @05:07PM (#13944758) Homepage Journal
    ...make sure you avoid zinc-plated floor tiles. Few things are as damaging to a computer room as zinc whiskers [wikipedia.org] or other assorted airborne metal particles.

    Very difficult to track down random machine failures to bad interior decoration choices!

  • by Anonymous Coward on Thursday November 03, 2005 @05:09PM (#13944773)
    We worked very closely with Liebert ( http://www.liebert.com/ [liebert.com] ) when we recently renovated our data center for a major project. The traditional approach of CRAC (Computer Room AC) units supplying air through a raised floor is no longer viable for the modern data center. CRAC units are now used as supplemental cooling, and primarily for humidity control. When you have 1024 1U dual-processor servers producing 320 kW of heat in 1000 sq ft of space, an 18-inch raised floor (with all kinds of crap under it) is not adequate to supply the volume of air needed to cool that much heat in so small a space.

    We had intended to use the raised floor to supply air, but Liebert's design analysis gave us a clear indication of why that wasn't going to work: we would have needed to generate air velocities in excess of 35 MPH under the floor. There were hotspots in the room where negative pressure was created and the air was actually being sucked into the floor rather than being blown out of it. So we happened to get lucky, as Liebert was literally just rolling its Extreme Density cooling system off the production line. The system uses rack-mounted heat exchangers (air to refrigerant), each of which can dissipate 8 - 10 kW of heat, and can be tied to a building's chilled water system or to a compressor mounted outside the building.

    This system is extremely efficient as it puts the cooling at the rack, where it is needed most. It's far more efficient than the floor-based system, although we still use the floor units to manage the humidity levels in the room. The Liebert system has been a workhorse. Our racks are producing between 8 - 9 kW under load and we consistently have temperatures between 80 - 95 F in the hot aisle and a nice 68 - 70 F in the cold aisles. No major failures in two years (two software-related things early on; one bad valve in a rack-mounted unit).
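    A rough sanity check of those numbers with the standard sensible-heat rule of thumb, BTU/hr = 1.08 * CFM * dT(F). The 20 F temperature rise and the plenum geometry below are assumptions, not from the post:

      heat_kw = 320.0
      btu_per_hr = heat_kw * 3412  # 1 kW ~= 3412 BTU/hr
      delta_t_f = 20.0             # assumed supply/return temperature split
      cfm = btu_per_hr / (1.08 * delta_t_f)
      print(f"required airflow: {cfm:,.0f} CFM")  # ~50,600 CFM

      # Velocity if all of it squeezes through an 18-inch plenum along an
      # assumed 32 ft room edge, with nothing under the floor in the way:
      area_ft2 = 1.5 * 32
      fpm = cfm / area_ft2
      print(f"plenum velocity: {fpm:,.0f} ft/min (~{fpm / 88:.0f} MPH)")
      # Already ~12 MPH through a clean plenum; with "all kinds of crap"
      # cutting the free area, the 35 MPH figure becomes believable.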
  • This seems to be more about bad rack design than raised floors. It's a basic principle of ducting design that, as the airflow spreads out from the source through different paths, the total cross section of the paths should stay roughly constant. (Yes, I am simplifying, and I am sure someone can explain this better and in more detail. Yes, duct length and pressure drop are important. But the basic concept is true. If I want consistent airflow in my system, and the inlet is one square metre, the total of all th
    • It's a basic principle of ducting design that, as the airflow spreads out from the source through different paths, the total cross section of the paths should stay roughly constant.

      I used to do commercial HVAC work, and everybody in the business does the opposite of what you describe. The ducts are largest near the air handler and smallest at the end of the line. Typically, the main trunk of the duct gets smaller in diameter after each branch comes off of it and goes to a diffuser.

      One issue w
  • by ka9dgx ( 72702 ) on Thursday November 03, 2005 @05:26PM (#13944956) Homepage Journal
    The problem is that power density has gone through the roof. It used to be that a rack of computers drew between 2 kW and 5 kW. Modern blade servers easily push that up to 25 kW per rack. You'd have to have 10 feet or more of space below the floor to accomplish cooling with an external source; thus the move to in-rack cooling systems and the new hot aisle / cold aisle designs.

    Wiring is now usually ABOVE the equipment, and with 10 Gigabit copper you can't just put all of the cables in a bundle any more; you have to be very careful.

    It's a brave new datacenter world. You need some serious engineering these days; guessing just isn't going to do it. Hire the pros, and save your career.

    --Mike--
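    What 25 kW in one rack means in airflow terms, assuming the common 20 F temperature rise across the servers (both figures are rules of thumb, not from the post):

      rack_kw = 25
      delta_t_f = 20
      cfm = rack_kw * 3412 / (1.08 * delta_t_f)
      print(f"{cfm:,.0f} CFM for one rack")  # ~3,950 CFM
      # A good perforated tile passes maybe 500-900 CFM, so one blade rack
      # wants the output of 4-8 tiles -- hence in-rack cooling and
      # hot aisle / cold aisle containment rather than ever-deeper floors.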

  • HVAC concerns (Score:3, Interesting)

    by Elfich47 ( 703900 ) on Thursday November 03, 2005 @05:37PM (#13945083)
    Heating Ventilation and Air Conditioning (HVAC) design is based upon how air moves through a given pipe or duct.

    When you are designing for a space (such as a room) you design for the shortest amount of ductwork for the greatest amount of distribution. Look up in the ceiling of an office complex sometime and count the number of supply and return diffusers that work to keep your air in reasonable shape. All of the ducts that supply this air are smooth, straight and designed for a minimal amount of losses.

    All air flow is predicated on two important points within a given pipe (splits and branching within the ductwork are not covered here): pressure loss within the pipe, and how much power you have to move the air. The higher the pressure losses, the more power you need to move the same amount of air. Every corner, turn, rough section, and extra length of pipe contributes to the amount of power needed to push the air through at the rate you need.

    Where am I going with all of this? Well, underfloor/raised-floor systems do not have a lot of space under them, and it is assumed that the entire space under the floor is flexible and can be used (i.e. no impediments or blockages). Ductwork is immobile and does not appreciate being banged around. Most big servers need immense amounts of cooling. A 10"x10" duct is good for roughly 200 CFM of air. That much air is good for 2-3 people (this is rough, since I do not have my HVAC cookbook in front of me... yes, that is what it is called). Servers need large volumes of air, and if that ductwork is put under the floor, pray you don't need any cables in that area of the room. Before you ask "well, why don't we just pump the air into the space under the floor and it will get there?": air is like water, it leaves by the easiest path possible. Place a glass on the table and pour water on the table and see if any of the water ends up in the glass. Good chance it ends up spread out on the floor where it was easiest to leak out. Unless air is specifically ducted to exactly where you want it, it will go anywhere it can (always to the easiest exit).

    Ductwork is a very space-consuming item. Main trunks for two- and three-story buildings can be on the order of four to five feet wide and three to four feet high. A server room by itself can require the same amount of cooling as the rest of the floor it is on (ignoring wet bulb/dry bulb issues, humidity generation, and filtering; we are just talking about the number of BTUs generated). A good-sized server room could easily require a separate trunk line and return to prevent the spreading of heated air throughout the building (some places do actually duct the warm air into the rest of the building during the winter). Allowing this air to return into the common plenum return will place an additional load on the rest of the building's AC system. Place the server room on a separate HVAC system to prevent overloading the rest of the building's AC system (which is designed on a per-square-foot basis, assuming a given number of people/computers/lights per square foot, if the floor plan does not include a desk layout).
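    The 200 CFM figure for a 10"x10" duct above checks out at typical low-velocity branch speeds (the velocity below is an assumed rule-of-thumb value): CFM = velocity (ft/min) * cross-section (ft^2).

      area_ft2 = (10 / 12) * (10 / 12)  # 10" x 10" duct ~= 0.69 sq ft
      velocity_fpm = 300                # assumed quiet branch-duct speed
      cfm = velocity_fpm * area_ft2
      print(f"{cfm:.0f} CFM")           # ~208 CFM, matching the comment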

  • by slasher999 ( 513533 ) on Thursday November 03, 2005 @05:41PM (#13945123)
    Raised flooring is useful for several reasons; moving cool air through a data center is only one of them. While requiring air to make severe turns to get out of the floor isn't optimal, most cabinets and the equipment in those cabinets are engineered with this in mind. Air is generally drawn in through the front of the cabinet and device, and warm air blows out the back. Fans in the equipment pull the air in - the air doesn't have to "turn" on its own again (not that it really did in the first place). Warm air then rises after leaving the device, where it is normally drawn back into the top of the AC unit.

    Raised flooring also provides significant storage for those large electrical "whips" where 30A circuits (in most US DCs, anyhow) are terminated, as well as a place to hide miles of copper and fiber cable (preferably not too close to the electrical whips). Where else would you put this stuff? With high-density switches and servers, we certainly aren't seeing less cable needed in the data centers. Cabinets that used to hold five or six servers now hold 40 or more. Each of these needs power (typically redundant) and network connectivity (again, typically redundant), so we actually have more cables to hide than ever before.

    Cabinets are built with raised flooring in mind. Manufacturers expect your cabling will probably feed up through the floor into the bottom of the cabinet. Sure, there is some space in the top of the cabinets, but nothing like the wide-open bottom!

    Anyhow, there you have the ideas of someone who is quickly becoming a dinosaur (again) in the industry.
  • Hell no (Score:5, Interesting)

    by Spazmania ( 174582 ) on Thursday November 03, 2005 @06:33PM (#13945610) Homepage
    Raised floor cooling was designed back when the computer room held mainframe and telephone switch equipment with vertical boards in 5-7 foot tall cabinets. The tile was holed or removed directly under each cabinet, so cool air flowed up, past the boards and out through the top of the cabinet. It then wandered its way across the ceiling to the air conditioners' intakes and the cycle repeated.

    Telecom switching equipment still uses vertically mounted boards for the most part and still expects to intake air from the bottom and exhaust it out the top. Have any AT&T/Lucent/Avaya equipment in your computer room? Go look.

    Now look at your rack mount computer case. Doesn't matter which one. Does it suck air in at the bottom and exhaust it out at the top? No. No, it doesn't. Most suck air in the front and exhaust it out the back. Some suck it in one side and exhaust it out the other. The bottom is a solid slab of metal which obstructs 100% of any airflow directed at it.

    Gee, how's that going to work?

    Well, the answer is: with some hacks. Now the holed tiles are in front of the cabinet instead of under it. But wait - that basically defeats the purpose of using the raised floor to move air in the first place. Worse, that mild draft of cold air competes with the rampaging hot air blown out of the next row of cabinets. So, for the most part, your machines get to suck someone else's hot air!

    So what's the solution? A hot aisle / cold aisle approach. Duct cold air overhead to the even-numbered aisles. Have the front of the machines face that cold aisle in the cabinets to either side. Duct the hot air back from the odd-numbered aisles to the air conditioners. Doesn't matter that the hot aisles are 10-15 degrees hotter than the cold aisles because air from the hot aisles doesn't enter the machines.

    • Well, you're close. You are correct that the answer lies in a "hot aisle/cold aisle" configuration. The difference is, it works better when the cold air is coming up from below the raised floor tiles.

      Why? You must keep in mind, you're not trying to pump "cold" air in; you're trying to take heat out, and as Mother Nature knows, heat rises. So why not harness the natural convection of heat, allow it to flow up to the ceiling, and have some "perf" ceiling tiles and use the space over the ceiling t
  • by pvera ( 250260 ) <pedro.vera@gmail.com> on Thursday November 03, 2005 @07:59PM (#13946345) Homepage Journal
    I spent the first 8 years of my professional life working in NOCs with standard raised flooring; cooling was just one of the many things it was needed for.

    Examples:

    Wiring: Not everyone likes to use overhead ladders to carry cables around. In the Army we had less than 50% of our wiring overhead; the rest was routed through channels underneath the raised flooring.

    HVAC spill protection: Many of our NOCs had huge AC units above the tile level, and these things could leak at any moment. With raised flooring the water will pool at the bottom instead of running over the tiles and causing an accident. We had water sensors installed, so we knew we had a problem as soon as the first drop hit the floor.

    If the natural airflow patterns are not enough for a specific piece of equipment, it does not take a lot to build ducting to guarantee cold-air delivery underneath a specific rack unit.

    The one thing I did not like about the raised floors was when some dumbass moron (who did NOT work within a NOC) decided to replace our nice, white, easy-to-buff tiles with carpeted tiles. 10 years later and I still can't figure out why the hell he would approve that switch, since our NOC with its white tiles looked fricking gorgeous just by running a buffer and a clean mop through it. The tiles with carpeting were gray, so they darkened our pristine NOC.

    I bet many of the people against raised flooring are landlords that don't want to get stuck with the cost of rebuilding flooring if the new tenant does not need a NOC area. I have been to a NOC in a conventional office suite; they basically crammed all of their racks into what seemed to be a former cubicle island. The air conditioning units were obviously a last-minute addition, and it looked like the smallest spill would immediately short out the loose power strips on the first row of racks in front of them. Shoddy as hell.
