Original URL: https://www.theregister.com/2013/11/26/facebook_dc_tour/

We flew our man Jack Clark into Facebook's desert data tomb. This is what he saw

Your personal snaps held in suspended animation

By Jack Clark in San Francisco

Posted in On-Prem, 26th November 2013 11:44 GMT

Pictures In the high desert of Oregon, Facebook will store the photos of you, me, and everyone else on its content farm in a cold, low-power digital morgue.

This tomb is not a typical one, but a "cold storage" addition to the company's sprawling data center complex in Prineville. We first reported on the new facility in February. This autumn, Facebook got in touch to see if we wanted to take a trip to that rural part of Oregon for a "special" visit.

After checking the back of Vulture West's sofa for the requisite funds, we said yes.

Now that's remote access ... Mt Hood, Oregon, glimpsed from our Portland-Redmond micro-plane

To get to the data center you need to catch a plane to Redmond, Oregon, then drive into the high desert for about 30 minutes. It's cold, dry, and empty. The climate brings an inch of rain a year on average, and temperatures lie between the low 20s and mid-80s Fahrenheit (roughly -6°C to 29°C). This is ideal weather for free-cooled data centers like Facebook's.

Though Facebook is famed for decorating its offices with graffiti and the pre-school candy-colors that Google is also fond of, there's no getting around the fact that data centers are ugly buildings whose form is intertwined with their function.

Brutalist Soviet re-education complex, or Facebook data center?

The data center complex has been in continuous construction since 2010, and as of January 2013 almost 3,000 workers had had a hand in the erection of the bit barns.

The buildings bring to mind aluminum smelters, or silvery chocolate bars made for a gigantic, utilitarian god. If you stood any of the structures upright, it would be 81 stories tall. The top section of each center is devoted to a giant free air cooling system that takes in the dry, cool Oregon air, filters it, and uses it to cool the humming servers within.

Byte barn ... If you stood Building One on its side, it would be 81 stories tall

Besides operating an advanced free cooling system in tandem with the lean designs espoused by its Open Compute Project, Facebook has also sought to trim its power use elsewhere.

One major approach pioneered by the company is cutting the number of voltage transformations power goes through on its way from the grid to the servers.

Facebook has re-jigged the way it distributes electricity to save on power cost

Rather than bringing power in from a transformer at 480/277VAC and feeding it through a string of UPS systems, then a power distribution unit, and then a server power supply, Facebook runs it straight down into the facility to these blue power supply boxes, which plug into the "Open Rack" power backplanes.

This, Facebook says, leads to a 7.5 per cent loss during transmission, versus a 21 to 27 per cent loss in the traditional approach.
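To see what that saving means at the rack, here's a back-of-the-envelope sketch in Python. The 10kW rack draw is our illustrative assumption, not a Facebook figure; the loss percentages are the ones quoted above:

```python
# Back-of-envelope comparison of the two power-delivery chains.
# The 10 kW rack draw is an illustrative assumption; the loss
# fractions are the percentages Facebook quotes.
RACK_DRAW_KW = 10.0

TRADITIONAL_LOSS = 0.24   # midpoint of the quoted 21 to 27 per cent
FACEBOOK_LOSS = 0.075     # the quoted 7.5 per cent

def grid_power_needed(it_load_kw: float, loss_fraction: float) -> float:
    """Power drawn from the grid to deliver it_load_kw to the servers."""
    return it_load_kw / (1.0 - loss_fraction)

traditional = grid_power_needed(RACK_DRAW_KW, TRADITIONAL_LOSS)
facebook = grid_power_needed(RACK_DRAW_KW, FACEBOOK_LOSS)

print(f"Traditional chain: {traditional:.2f} kW from the grid")  # ~13.16 kW
print(f"Facebook chain:    {facebook:.2f} kW from the grid")     # ~10.81 kW
print(f"Saving per rack:   {traditional - facebook:.2f} kW")     # ~2.35 kW
```

Over tens of thousands of servers, a couple of kilowatts saved per rack adds up to a serious dent in the power bill.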

Out googlin' Google with tech specs

This approach, we're told, has made Facebook's data centers among the most efficient in the world, with an enviable Power Usage Effectiveness (PUE) figure of around 1.07, meaning that for every watt consumed by the servers themselves, just 0.07 watts are expended on cooling and other supporting systems. This compares with between 1.08 and 1.18 for Google.
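For reference, PUE is simply the total power entering the facility divided by the power that actually reaches the IT kit. A quick sketch, using made-up absolute wattages chosen only to reproduce the quoted ratios:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical loads picked purely to hit the quoted figures.
print(pue(1070.0, 1000.0))  # 1.07 -> 70 kW of overhead per MW of IT load
print(pue(1180.0, 1000.0))  # 1.18 -> the top of Google's quoted range
```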

2013 ... a content odyssey in one hall of four in Building One

Though Facebook doesn't disclose the precise size of its infrastructure outlay, we figured that one of the four halls of Building One (pictured above) contained about 360 racks, each holding 15 to 25 in-production Open Compute servers, plus or minus ten per cent.
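Running our rack-spotting through the maths gives a rough range for the hall's server count. To be clear, this is our estimate, not an official figure:

```python
# Back-of-envelope server count for one hall, from our own rack-spotting.
racks = 360
servers_low, servers_high = 15, 25   # observed servers per rack
fudge = 0.10                         # our plus-or-minus ten per cent

low = racks * servers_low * (1 - fudge)    # 4,860
high = racks * servers_high * (1 + fudge)  # 9,900
print(f"Roughly {low:,.0f} to {high:,.0f} servers in the hall")
```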

Facebook's diminutive 'cold storage' facility

Moving on, Facebook's aforementioned data tomb, the cold-storage vault pictured above, sits alongside Building One and contains three IT bays, each 16,000 sq ft in area. It's a relative pipsqueak compared to the 300,000 sq ft Building One and Building Two bit halls.

However, the cold-storage facility is the site of an innovative design approach by Facebook: store as much of its users' content as possible while precisely controlling power consumption to keep costs down.

This third building uses a new Facebook server architecture that lets the company keep old photos on cheap hard drives, cramming 30 drives into each 2U chassis.

The systems Facebook uses here are called Open Vault storage arrays. Their design specification [PDF] is available on the Open Compute Project's website for other bit-fiddlers to download and play with.

At the heart of the 2U box's electronics is a set of LSISAS2x28 chips [PDF] from LSI: each is a 28-port, 6Gbps SAS/SATA expander with an integrated 150MHz ARM926 processor to manage the gear, along with flash and EEPROM for firmware, serial ports for debugging, temperature and voltage sensors, and cooling fans.

Each server uses Shingled Magnetic Recording (SMR) high-capacity, low-cost drives: these disks are good for reading and sequential writing, but terrible at updating data, which perfectly fits the needs of a digital archive of rarely accessed but useful, static information.
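The write pattern SMR rewards is append-only: add new data at the end and never rewrite anything in place. Here's a minimal sketch of that archival style in Python; it illustrates the access pattern, and is in no way Facebook's actual storage software:

```python
class AppendOnlyArchive:
    """Toy blob store that only ever appends - the pattern SMR disks like.

    Records are appended to a single log file; an in-memory index maps
    a photo ID to its byte offset and length, so reads are easy but
    nothing on disk is ever modified in place.
    """

    def __init__(self, path: str):
        self.path = path
        self.index = {}  # photo_id -> (offset, length)

    def put(self, photo_id: str, blob: bytes) -> None:
        with open(self.path, "ab") as f:   # append mode: sequential writes only
            offset = f.tell()
            f.write(blob)
        self.index[photo_id] = (offset, len(blob))

    def get(self, photo_id: str) -> bytes:
        offset, length = self.index[photo_id]
        with open(self.path, "rb") as f:
            f.seek(offset)
            return f.read(length)

archive = AppendOnlyArchive("photos.log")
archive.put("cat-in-hat", b"...jpeg bytes...")
print(archive.get("cat-in-hat"))
```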

These disks are extremely sensitive to vibration, so only one drive in each group of 15 (half of the 2U box's 30-drive complement) can spin at a time.
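One way to honor that constraint in software is a per-group controller that spins a drive up only after parking whichever one is currently active. A sketch of the idea, with an entirely hypothetical interface rather than Facebook's firmware:

```python
class DriveGroupGate:
    """Allow at most one of a group's 15 drives to spin at a time."""

    def __init__(self, drive_ids):
        self.drive_ids = list(drive_ids)   # the 15 drives in one group
        self.active = None                 # currently spinning drive, if any

    def spin_up(self, drive_id):
        if drive_id not in self.drive_ids:
            raise ValueError(f"{drive_id} is not in this group")
        if self.active == drive_id:
            return                         # already spinning
        if self.active is not None:
            self._spin_down(self.active)   # park the current drive first
        self._start(drive_id)
        self.active = drive_id

    def _spin_down(self, drive_id):
        print(f"spinning down drive {drive_id}")  # stand-in for a real SAS command

    def _start(self, drive_id):
        print(f"spinning up drive {drive_id}")    # stand-in for a real SAS command

gate = DriveGroupGate(range(15))
gate.spin_up(3)   # spins up drive 3
gate.spin_up(7)   # parks drive 3, then spins up drive 7
```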

That's a lot of drunken-night-out photos ... Each Open Vault rack can cram in a whopping petabyte

Though the performance of the system is poor, it's perfect for Facebook's purposes of slurping in as much content as possible and storing it as cheaply as possible. Each rack is capable of storing 1PB of data, according to Facebook. With even just a dozen racks, that's a hefty amount of binary: enough space for hundreds of millions of personal snaps.
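Sanity-checking the "hundreds of millions of snaps" claim is easy enough, assuming an average photo weighs in at around 3MB; the photo size is our guess, not a Facebook figure:

```python
# Sanity check on the "hundreds of millions of snaps" claim.
# The 3 MB average photo size is our assumption.
PB = 10**15
rack_capacity_bytes = 1 * PB
avg_photo_bytes = 3 * 10**6

photos_per_rack = rack_capacity_bytes // avg_photo_bytes
print(f"{photos_per_rack:,} photos per rack")            # ~333 million
print(f"{12 * photos_per_rack:,} across a dozen racks")  # ~4 billion
```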

Hot information can be stored elsewhere, of course. Apparently, about 80 per cent of Facebook's web traffic goes towards just eight per cent of its photos, so that's a lot of pics not getting many eyeballs.
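That skew is what makes the hot/cold split pay off: keep the busy few per cent on fast kit and demote the long tail. A toy version of such a tiering rule, with a threshold that is entirely our own invention (Facebook hasn't published its policy):

```python
from datetime import datetime, timedelta

# Illustrative policy: photos untouched for 90 days go to cold storage.
# The threshold is our assumption, not Facebook's.
COLD_AFTER = timedelta(days=90)

def tier_for(last_access: datetime, now: datetime) -> str:
    return "cold-storage" if now - last_access > COLD_AFTER else "hot"

now = datetime(2013, 11, 26)
print(tier_for(datetime(2013, 11, 20), now))  # hot
print(tier_for(datetime(2013, 1, 1), now))    # cold-storage
```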

Like any modern facility, Facebook indulges in hot aisle and cold aisle separation

With the system, Facebook has built a low-cost, high-volume byte vault that is ideal for storing Crappy And Barely Accessed Data (CABAD), such as photos of your cats in hats, or the grimaces of your friends as you snap a misanthropic pic after midnight.

Though this use case may seem trivial, the tech can be used by many organizations that need to keep data for a long time, but don't particularly care about having good access rates, and don't want to have an eye-watering electricity bill. Don't forget: most of the drives are spun down at any given moment.

Facebook doesn't expect to fill up the available storage capacity in the facility until 2017

The Open Vault arrays are being loaded into the freshly built and mostly empty cold-storage building on Facebook's dense Prineville site; the first hall is being gradually occupied, and the other two halls are vacant for now.

The social network says it doesn't expect to fill up the available storage capacity until 2017. It's estimated each of the tomb's three halls could eventually hold 1 exabyte.
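At a petabyte per rack, the arithmetic on that exabyte estimate is simple:

```python
EB = 10**18
PB = 10**15

racks_per_hall = (1 * EB) // (1 * PB)
print(racks_per_hall)      # 1,000 petabyte racks to fill one hall
print(3 * racks_per_hall)  # 3,000 across the whole cold-storage tomb
```

A thousand petabyte racks per hall, then, before the tomb is full. ®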