And here's how a datacentre network works...

It's all about layers

The job of a datacentre network is to connect the equipment inside to the outside world, and to connect the internal systems to each other. It needs to be secure, offer high performance and operate with an eye on energy consumption, with a guiding principle of minimising device numbers and costs, so you end up with a system that does what's needed while remaining as simple as possible.

Every facility is different, so there's no off-the-shelf answer as to what exactly goes into a datacentre network. Component selection will vary according to budget, business requirements, site location and capacity, available power and cooling, and a host of other criteria depending on circumstances. That said, you're likely to find that most datacentre networks arrive at common solutions to common problems and so look fairly similar.

You can conceive of a datacentre network as a series of layers, with the stored data at the bottom. On the first layer is the connection to the outside world - the internet - and, if it's an enterprise's own datacentre, to the rest of the company. If the datacentre is owned by a service provider and services a number of external clients, the internet connection and any other links connecting clients directly also sit on this outermost layer.
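
For those who think better in code than in diagrams, here's a minimal sketch, in Python, of the layered model described over the next few paragraphs. The layer names and boundaries are just a convenient summary of the prose, not a formal taxonomy.

# Illustrative summary of the layers described in this article, outermost first.
DATACENTRE_LAYERS = [
    ("external connectivity", "internet links, plus WAN links to the business or to clients"),
    ("edge/access", "firewalls, packet inspection appliances, switches and the DMZ web servers"),
    ("core", "large chassis-based switches that all traffic passes through"),
    ("distribution", "per-rack or per-row switches feeding the servers"),
    ("storage", "Fibre Channel arrays sitting behind the servers"),
]

if __name__ == "__main__":
    for depth, (name, role) in enumerate(DATACENTRE_LAYERS, start=1):
        print(f"Layer {depth}: {name} - {role}")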

The second layer, commonly referred to as the edge or access layer, consists of IP-based Ethernet devices, such as firewalls, packet inspection appliances and switches, that route traffic between the core of the datacentre and the outside world. Here too sit many web servers in a so-called demilitarised zone or DMZ: hemmed in by firewalls, external visitors are allowed this far into the datacentre network but no further.
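
As a rough illustration of that "this far but no further" rule, the hypothetical function below captures the policy in a few lines. The zone names and the rule itself are assumptions made up for illustration, not a real firewall configuration.

# Hypothetical zones and policy, purely to illustrate the DMZ idea:
# external traffic may reach the DMZ, but only internal systems go deeper.
ZONES = {"external", "dmz", "core", "storage"}

def edge_permits(source_zone: str, destination_zone: str) -> bool:
    """Return True if the edge layer should forward this traffic (illustrative policy)."""
    if source_zone not in ZONES or destination_zone not in ZONES:
        raise ValueError("unknown zone")
    if source_zone == "external":
        # Visitors from the internet are hemmed in by firewalls: DMZ only.
        return destination_zone == "dmz"
    # Traffic that originates inside the datacentre may go deeper.
    return True

if __name__ == "__main__":
    print(edge_permits("external", "dmz"))   # True  - a web request reaches a DMZ web server
    print(edge_permits("external", "core"))  # False - outsiders can't reach the core directly
    print(edge_permits("dmz", "core"))       # True  - a DMZ server may call deeper systems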

Below this is the core: large, high-performance switches built from blades plugged into chassis, with each blade providing dozens of ports. The chassis is likely to be managed by a dedicated management blade, while further blades can provide other features such as security and traffic shaping. All data passes through these devices.
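
To put some rough numbers on "blades plugged into chassis", here's a back-of-the-envelope sketch. The blade and port counts are invented for illustration and don't describe any particular product.

# Hypothetical chassis: figures chosen only to show the arithmetic.
BLADES_PER_CHASSIS = 8   # line-card blades, alongside a separate management blade
PORTS_PER_BLADE = 48     # "dozens of ports" per blade

total_ports = BLADES_PER_CHASSIS * PORTS_PER_BLADE
print(f"One chassis with {BLADES_PER_CHASSIS} line-card blades of "
      f"{PORTS_PER_BLADE} ports offers {total_ports} ports")  # 384 ports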

Closer to the servers will be a further layer, consisting of a series of switches, maybe one per rack or row of racks, depending on density, tasked with distributing data to and between servers in order to minimise load on the core.
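
The point of those per-rack switches is to keep traffic between neighbouring servers off the core. A minimal sketch of that decision, with made-up rack assignments, might look like this:

# Hypothetical mapping of servers to racks, to illustrate why per-rack
# switches reduce load on the core.
SERVER_RACK = {"web-01": "rack-1", "web-02": "rack-1", "db-01": "rack-2"}

def path(src: str, dst: str) -> str:
    """Describe, roughly, which switches a packet between two servers would touch."""
    if SERVER_RACK[src] == SERVER_RACK[dst]:
        return "stays on the rack switch"                     # same rack: core untouched
    return "rack switch -> core switch -> rack switch"        # crosses racks via the core

if __name__ == "__main__":
    print(path("web-01", "web-02"))  # same rack: the core never sees it
    print(path("web-01", "db-01"))   # different racks: traverses the core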

Behind the servers, conceptually, is the main storage. This final layer consists of a series of high-performance storage arrays connected via a Fibre Channel network that's entirely separate from the main network. This means only the servers can connect directly to the storage, although there's likely also to be a link from the storage to the IP network for management purposes.
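
That separation can be summed up in a few lines: only the servers sit on the Fibre Channel fabric, with a separate IP link used purely for managing the arrays. The device classes below are assumptions for illustration.

# Illustrative connectivity: which device classes sit on which network.
FIBRE_CHANNEL_MEMBERS = {"server", "storage_array"}               # data path
IP_NETWORK_MEMBERS = {"server", "switch", "firewall",
                      "storage_array_mgmt_port"}                  # storage appears here for management only

def can_reach_storage_data(device: str) -> bool:
    """Only devices on the Fibre Channel fabric can touch the storage data path."""
    return device in FIBRE_CHANNEL_MEMBERS

print(can_reach_storage_data("server"))    # True
print(can_reach_storage_data("firewall"))  # False - storage is invisible on the IP data path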

The Fibre Channel network needs its own switches and management systems to configure it, adding to IT staff's workload, so this situation is slowly changing. In ten years' time, industry analysts expect most storage systems to be connected over the IP-based Ethernet network, probably running at either 40Gbps or 100Gbps.
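
For a feel of what 40Gbps or 100Gbps Ethernet means for storage traffic, here's a quick back-of-the-envelope calculation. The 10TB dataset is an arbitrary example, and the sums ignore protocol overhead, congestion and disk speed.

# Naive transfer-time estimate: link speed is the only factor considered.
DATASET_BITS = 10 * 8 * 10**12   # 10 terabytes expressed in bits

for gbps in (40, 100):
    seconds = DATASET_BITS / (gbps * 10**9)
    print(f"{gbps}Gbps: roughly {seconds / 60:.0f} minutes to move 10TB")
    # prints roughly 33 minutes at 40Gbps and 13 minutes at 100Gbps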

Let's look at an example of the network's job. You click on a link in your browser, which generates a request for data that arrives at our datacentre via the internet connection. The incoming request is scanned for malware, and is re-assembled and decrypted if WAN optimisation and encryption are in use. It's then sent on to a switch in the access layer. This switch routes the request to a web server in the DMZ, which might be physical or virtual, and which might be fronted by a load balancer to allow a cluster of servers to handle high traffic levels.

The web server receives and processes the request. A response needs information from a database, so the web server calls for data from a database server at the core of the network.

The data demand is passed to a core switch, which routes it to a database server. The database server's query traverses the storage network, the data is pulled off the disks and arrives back from main storage, and it's packaged up and sent back to the web server. There it's assembled into a web page and pushed back out over the internet connection.
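
Pulling the whole journey together, here's a toy walkthrough of that request's path. The step descriptions simply restate the prose above; the function and URL are invented and nothing here reflects any particular vendor's kit.

# Toy narration of the request path described above; purely illustrative.
def handle_click(url: str) -> None:
    steps = [
        f"request for {url} arrives over the internet connection",
        "edge layer scans it for malware, re-assembles and decrypts it if WAN optimisation/encryption are in use",
        "an access-layer switch routes it, perhaps via a load balancer, to a web server in the DMZ",
        "the web server queries a database server sitting behind a core switch",
        "the database server pulls the data over the Fibre Channel storage network",
        "the data returns to the web server, which assembles the page",
        "the response goes back out through the edge and the internet connection",
    ]
    for number, step in enumerate(steps, start=1):
        print(f"{number}. {step}")

if __name__ == "__main__":
    handle_click("https://example.com/some-page")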

While this is a broad-brush look at network design, it's the template with which a datacentre network designer will approach the problem of building a new network from scratch. ®
