Voltaire pairs InfiniBand and Ethernet
Why can't we all just get along?
Network convergence may be all the rage, but the reality is that many shops have a mix of Ethernet and InfiniBand networks. Voltaire wants to sell customers - particularly those running financial trading systems or parallel database clusters - one box that will span both kinds of networks, and hence the Grid Director 4036E.
The Grid Director 4036E is a 34-port quad data rate (40Gb/sec) InfiniBand switch that also sports a two-port Ethernet gateway. The InfiniBand ports (2.72Tb/sec of aggregate bandwidth) have port-to-port latency of under 100 nanoseconds, which is the kind of low latency that makes banks, brokerages, and supercomputing labs coo.
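The aggregate figure follows from the port count, assuming both directions of each full-duplex link are counted - a quick sanity check:

```python
# Sanity check on the quoted aggregate bandwidth: 34 QDR ports at
# 40 Gb/sec each, counted in both directions (full duplex).
ports = 34
qdr_gbps = 40  # quad data rate InfiniBand, per port, per direction
aggregate_tbps = ports * qdr_gbps * 2 / 1000
print(aggregate_tbps)  # → 2.72
```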
The Ethernet ports, which are based on Voltaire's own fifth-generation gateway chip, can run at either Gigabit Ethernet or 10 Gigabit Ethernet speeds. They also have what Asaf Somekh, vice president of marketing at Voltaire, says is "extremely low latency," with a hop between Ethernet and InfiniBand taking under 2 microseconds.
Voltaire has put Ethernet gateways in its high-end InfiniBand switches before - the 6U Grid Director 2004 (96 ports) and 15U Grid Director 2012 (288 ports) both have this capability, added through gateway line cards. But those Grid Director boxes are big and expensive, and Voltaire's HPC customers wanted something smaller that burned less juice, with an Ethernet gateway built in as standard. The Grid Director 4036E is a compact 1U box with 34 ports, and multiple boxes can be linked together to aggregate ports across machines.
Without fast Ethernet gateways, InfiniBand might not be as popular as it is in certain industries. Financial services companies that run trading systems often use InfiniBand for some of their infrastructure, but they need Ethernet connectivity because the market feeds that pump data into those systems are usually based on Ethernet, not InfiniBand. The fashion these days is to co-locate trading systems and the systems of the customers who use them (usually hedge funds) in the same data centre - and those customers don't want to share switching with their competitors.
With space, power, and cooling at a premium, Voltaire needed to come up with a smaller InfiniBand switch with an Ethernet gateway. The Grid Director 4036E has software called Voltaire Message Acceleration (VMA) built into its firmware, which takes the multicast data feeds from market data providers and maps them onto the InfiniBand silicon inside the switch - speeding up feed handling and reducing latency, according to Somekh.
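Acceleration in the VMA mould sits beneath standard sockets, so the application code it speeds up is ordinary multicast subscription. Here is a minimal sketch of such a feed subscriber - the group address and port are hypothetical placeholders, and the actual latency saving happens below this API, not in it:

```python
import socket
import struct

# Hypothetical market-data feed endpoint; real feeds publish their
# own group addresses and ports.
GROUP, PORT = "239.1.1.1", 15000

def make_feed_socket(group: str, port: int) -> socket.socket:
    """Open a UDP socket subscribed to a multicast market-data group.

    VMA-style libraries intercept exactly this kind of socket traffic
    and service it from the InfiniBand hardware instead of the kernel
    network stack, which is where the latency saving comes from.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # accept datagrams addressed to the port
    # Join the multicast group on all interfaces.
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    try:
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    except OSError:
        # Some hosts lack a multicast-capable interface; the socket
        # still receives unicast datagrams sent to the port.
        pass
    return sock
```

A subscriber would then loop on `sock.recvfrom(...)` to pull ticks off the wire; the same unmodified code runs whether the kernel or the switch-side acceleration is doing the work.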
In commercial HPC applications as well as in parallel database implementations, customers sometimes want to use InfiniBand as the backbone of the server cluster, but their storage (from NetApp, Panasas, BlueArc, and others) is based on Ethernet. By using the gateway in the switch, HPC shops can put everything on the InfiniBand port on the server, eliminating a performance bottleneck as well as the cost of an Ethernet adapter in each server and a separate Ethernet switch for the storage network.
The Grid Director 4036E will be available at the end of this quarter and will cost somewhere in the neighbourhood of $1,000 per port.
So how is QDR InfiniBand doing? "We have seen good pickup in 2009," says Somekh, "but most of it came from the government labs in the United States and China, plus some education and research centers. These are the early adopters, and it usually takes nine to twelve months for commercial institutions to pick up a new technology, and that is why the 4036E was tied to this point in the cycle."
To get a sense of the converged networking plans of potential customers, Voltaire commissioned a poll of more than 120 members of the Global CIO and Executive IT Group. This is an old boy IT network (the human kind) affiliated with the Sloan School at MIT.
Some 45 per cent of those participating in the poll said they planned to implement a mix of InfiniBand and Ethernet networks, with 54 per cent saying they would do Ethernet networks alone. In terms of what is important to them when it comes to networks, 31 per cent cited high bandwidth, 22 per cent said low latency, and 17 per cent said network scalability. ®