Inside Dell's containerized data centers
Dell was mum about its containerized data center designs. But once the company announced it would supply the servers behind Microsoft's Azure compute cloud and Microsoft said it was embracing containers in a big way, it was only a matter of time (and a little schmoozing) before Dell opened up a bit and explained its container plans - intended for "a select number of customers."
The thing about having an elite bunch of engineers and marketing people trying to sell custom server, storage, and data center designs to super-secret hyperscale data center customers is that you don't want to talk about what you are doing. But sometimes, you just can't help but brag a little.
Dell's data center shipping container package - code-named "Humidor," if you trust the graphics files that give us an outside and an inside view of the containers - is not a traditional commercial product, but rather a custom product that Dell's Data Center Solutions unit has cooked up for the several dozen customers in the world who have tens or hundreds of thousands of servers and who are trying to get out of building hundreds of thousands of square feet of data center as they roll out infrastructure. The DCS unit, as we previously reported, is kicking out tens of thousands of servers itself and is one of the bright spots in enterprise sales at Dell.
Andy Rhodes, director of marketing for Dell's DCS unit, gave me a virtual tour of the Humidor containers and explained why Dell's approach to the idea of using shipping containers to house IT gear is a bit different from that being espoused by Sun Microsystems, Rackable Systems, and Hewlett-Packard.
First and foremost, as you can see from the following photos, Dell's containers are stacked two high. Here's the outside view:
And here's the inside cutaway view:
According to Rhodes, there are good technical and political reasons for stacking containers on top of each other. Dell agrees that hyperscale customers are interested in compute and storage density and that they don't want to spend a lot on data center facilities. The company did a lot of research with prospective customers and found, like others doing containerized data centers, that power and cooling efficiency was also important. But what Dell found out is that the companies most likely to buy its custom servers and the containers to wrap around them don't want to plunk them into parking lots, and they are not interested in the mobility that comes from deploying containers. Moreover, the facilities people and the IT people do not always get along, and other container products mix power, cooling, and IT gear inside a single container. Each group needs different - and separate - access to its respective gear.
And thus, some Humidor designs are based on containers stacked atop each other. Dell expects customers to put a cheap shell of some sort around the containers as they are lined up and stacked, not to leave them in the parking lot. Having millions of dollars in IT gear sitting exposed, without physical security and inside portable containers, is not the brightest thing anyone has ever suggested. But using containers to cut data center facility costs is pretty smart: Dell reckons a containerized data center can yield savings of anywhere from 20 to 30 per cent over a conventional brick-and-mortar data center, depending on a lot of variables, of course.
The bottom container in a Humidor setup has 24 full-depth, 19-inch racks - standard racks that support anyone's IT gear. This is important because a container that uses custom racks is not as useful or flexible as one that uses standard racks, since a container is expected to have a ten-year economic life while a rack of servers is lucky to have a three-year life. Clearly, the servers are going to move in and out of the racks a few times inside the container. And using standard racks gives customers flexibility. Dell says that the bottom portion of the Humidor can support over 2,400 of its XS23 two-socket servers. That's a lot of iron in a 40-foot container.
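Dell's stated figures imply a striking per-rack density. A minimal back-of-the-envelope sketch (the 2,400-server and 24-rack numbers are Dell's; the per-rack and per-socket breakdown is simple arithmetic, not anything Dell quoted):

```python
# Rough density arithmetic for the bottom Humidor container,
# using Dell's stated figures: 2,400+ XS23 nodes in 24 racks.
servers = 2400           # two-socket XS23 nodes (Dell's figure)
racks = 24               # standard full-depth 19-inch racks (Dell's figure)
sockets_per_server = 2   # the XS23 is a two-socket box

per_rack = servers // racks               # nodes packed into each rack
total_sockets = servers * sockets_per_server  # CPU sockets in the container

print(per_rack)       # 100
print(total_sockets)  # 4800
```

One hundred nodes per rack is only achievable with the kind of dense, multi-node sleds the XS23 uses, which is part of why the DCS unit builds custom boxes for these customers in the first place.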
The top part of the Humidor double-container has power transformation, metering, and distribution gear, uninterruptible power supplies, and air handlers, according to Rhodes. And the facilities people apparently don't mind having to take the stairs to their own gear so long as they don't have to share with the IT nerds.
Here's another view inside:
"Our approach to containers is different," says Rhodes. "There is no SKU. This is not a standard product. And while we are going to work very closely with those customers who can benefit from containers, we are not going to mainstream this. Our competitors have built, and they are hoping customers will come." ®
Google owns the patent?
Every time I see one of those surplus ocean-going cargo containers in a parking lot, I will have to wonder what is really going on inside of it.
Is the supermarket down the street really a front for another cluster of government data processing nodes examining our Internet traffic?
Wow, I know how much a Dell server sucks (at the front) and blows (at the back) - would be curious to see what CFM this wall of server fans pushes... and with the access doors shut, what sort of pressure differential exists between the cold and hot sides... loose server blowout problem? Also, what sort of pressure 'altitude' are we talking - do you need breathing apparatus to work on the cold side? Do the hot-side workers need to go through staged decompression?