Akamai on dragging 'em kicking and streaming to the edge: They might be public cloud giants, but we're, er, vids in

CEO Tom Leighton pitches CDNs for enterprise

Akamai Edge World The future of enterprise IT is not in centralised clouds, but in a complex interplay between massive core data centres and small edge locations.

So said Akamai cofounder and CEO Tom Leighton as he kicked off his company's annual shindig in Las Vegas.

"They say that the best place to put your infrastructure is in data centres at the core of the internet. Now, if you think about it, that doesn't make a lot of sense," he said.

"The core data centres are further from the end users, and so the latency is worse. Core data centres are focal points for traffic, which can lead to congestion. They are rich targets for attackers. And as you can often read in the headlines, data centres have a tendency to go down, which can disrupt your business."

Solving the content distribution problem takes servers and applied mathematics – Leighton was a mathematics professor at MIT. "The edge is where all the users are. It's where all the devices are. And it's where all the bandwidth is – that's why more and more functionality is moving to the edge," he told us.

Playing in this space takes a ridiculous amount of distributed infrastructure – Akamai has placed 250,000 servers in 4,000 locations across 140 countries, in what it reckons is the largest distributed computing platform in the world.

CDNs are still mostly about video streaming

There's no denying the company has built a lead in the sector, and over the six years from 2012 to 2018, it has bumped up its revenues from $1.4bn to $2.7bn.

However, despite loud proclamations about the edge as a universal platform for enterprise IT, the main CDN use case today is still the delivery of media – namely high-definition video.

"I was recently told by one of the world's largest broadcasters that they are planning to stop using satellites altogether within 10 to 15 years," Leighton said.

Back in the old days, broadcasters just paid for a satellite and were done – today, they have to think about traffic, bandwidth and latency, something that comes to them as naturally as flying does to a fish.

"2008 was the first time when the traffic on the Akamai platform went over a terabit per second. This seemed like a really big deal to us back then. To put a terabit of traffic in context, it's the amount of traffic needed for a million people to watch a stream that's encoded at one megabit per second. Now, a megabit per second is less than half as good as what you would need for standard definition TV – that's about two to three megs.

"A lot has changed since 2008 – we now deliver sporting events and gaming downloads that consume 20 to 30 terabits per second, all on their own.

"The peak last year was 72Tbps – that's what you would need for 18 million viewers watching at 4Mbps. That's a little bit better than standard definition TV, and at the bottom end of the range for DVD – which is already a legacy technology."
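For anyone who wants to check the arithmetic, the relationship Leighton is leaning on is simply aggregate throughput divided by per-stream bitrate. A quick sketch in Python – the function below merely restates the figures quoted above, it is not Akamai's code:

    def concurrent_viewers(aggregate_tbps: float, stream_mbps: float) -> float:
        """Simultaneous streams supported: aggregate throughput / per-stream bitrate."""
        return aggregate_tbps * 1_000_000 / stream_mbps  # 1 Tbps = 1,000,000 Mbps

    print(concurrent_viewers(1, 1))   # 2008 peak: 1 Tbps at 1 Mbps  -> 1,000,000 viewers
    print(concurrent_viewers(72, 4))  # 2018 peak: 72 Tbps at 4 Mbps -> 18,000,000 viewers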

What caused this deluge of traffic, which set a new internet record? Turns out it was a cricket match in India.

Leighton, who holds 50 patents related to moving information around, then attempted to estimate how much traffic internet video might consume in the next few years. He said there are around 2.5 billion people who have streamed a video at least once in their lives. If all of those people watched simultaneously at 10Mbps – which makes for pretty decent quality, but nowhere near 4K – they would generate 25,000Tbps of traffic.
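
Spelled out, that back-of-the-envelope calculation – nothing more than the figures Leighton quoted – looks like this:

    viewers = 2.5e9          # people who have streamed a video at least once
    bitrate_mbps = 10        # decent HD, nowhere near 4K
    total_tbps = viewers * bitrate_mbps / 1_000_000   # Mbps -> Tbps
    print(total_tbps)        # 25000.0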

"That's why online broadcasters are so concerned about the capacity of the internet, and the cost of streaming over the top. They are wondering: can the internet scale to handle all that load, and can they afford to pay the cost of so much traffic?

"One common misconception is that the last mile is the bottleneck. If you do the math, you've got tens of thousands of terabits per second of capacity in the last mile, going to the end users – and that's just wired connections. If you look at the cellular connectivity – and again, no precise figures here – there's roughly 10,000Tbps of capacity, or more, in wireless, and that number is going to get a lot bigger with 5G."

"So however you look at it, there's roughly 50,000Tbps of capacity in the last mile, it's a big number. And that's why the edge is so important – there's a ton of capacity there." ®
