Intel CEO Brian Krzanich is a man living on the edge

And what's on Krzanich's mind? Data. Your data. Petabytes of data


Comment Intel declined to comment on industry whisperings that Qualcomm is mulling ending its Arm-powered server processor efforts.

Perhaps it's no wonder: Intel has a monopoly on the compute workhorses used in data centers by Google and Amazon, among others, to provide their increasingly ubiquitous cloud computing services, as well as all the businesses and enterprises relying on its silicon. Some analysts say Intel has nothing less than 99 per cent of the world's server CPU market locked up.

And it has proven to be extremely lucrative: a third of the US giant's annual revenues now come from the data-center sector – $19.1bn versus $34bn from personal computer chips, out of $62.8bn total sales in 2017. That $19.1bn figure was up 11 per cent on the previous year.

To hear Chipzilla's execs tell it at this year's Intel Capital conference, held in Palm Desert, California, on Tuesday, this remarkable market grip and the resulting sales and profits are the product of classic strategic thinking by its CEO Brian Krzanich. "Brian saw that the world was increasingly going to be using and relying on data," noted Intel Capital boss Wendell Brooks just before introducing Krzanich on stage to give a keynote.

Krzanich himself picked up on that approach, noting that not long ago the issue was that everyone had too much data. "We would have discussions about which data we could delete," he told attendees. (And thanks to Europe's incoming privacy rules, GDPR, we still will, we note.)

The chief exec pushed on: the massive reduction in storage costs, he said, combined with exploding processing power has meant that people want more data, not less. Data is the future. Krzanich has often proclaimed this at events, and after the Cambridge Analytica scandal earlier this year, you'd think he'd dial it back a tad, but no. Full steam ahead. Data is the future.

Regardless of whether Intel got it right by design, by accident, or some combination of the two, it's indisputable that the biz spends a lot of time thinking about data. And what to do with what Krzanich predicted will be a "coming flood of data" formed the main part of his keynote.

Intel makes a lot of data-center chips, but it desperately wants to be seen as a data-centric chipmaker. There's more to the company than PCs and servers, it insists, all while it stumbles here and there outside of its PC-server comfort zone. Don't mention the wearables, nor the mobile chips, nor the broadband modems, and so on.

By 2021, Krzanich predicted once again, as he has done in the past, the average internet user will produce 1.9GB of data every day. And that is going to be dwarfed by what is emitted by all the machines that we will connect up and network: an autonomous vehicle will produce 4TB per day; a connected airplane 5TB; and a "smart factory" a ridiculous 1PB of data every day; and so on.
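Taking those keynote figures at face value, the gap between human-generated and machine-generated data is easy to quantify. A quick back-of-the-envelope sketch – the per-day volumes are Krzanich's, the comparison against an average user is our own arithmetic:

```python
# Daily data volumes quoted in the keynote, in bytes (decimal units)
GB = 10**9
TB = 10**12
PB = 10**15

daily_output = {
    "internet user": 1.9 * GB,
    "autonomous vehicle": 4 * TB,
    "connected airplane": 5 * TB,
    "smart factory": 1 * PB,
}

# Express each source as a multiple of one average user's daily output
user = daily_output["internet user"]
for source, volume in daily_output.items():
    print(f"{source}: {volume / user:,.0f}x an average user's daily output")
```

On those numbers, a single "smart factory" churns out roughly as much data per day as half a million people.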


To which the obvious questions are: what do you do with it all, do you even need all of it, and how do you make any sense of it? The answers, Intel's CEO said, come from understanding what that data is, and what use you need to make of it. And then, critically, where you do the analysis on it. You might do some processing on an Intel-powered backend server. Or on an Intel CPU nearer the device. Just make sure there's an Intel chip involved somewhere.

"How much processing do you need to do on the device? How much processing do you need to do on the edge? And how much processing do you need to do in the cloud?" Krzanich summarized.

He gave an autonomous car as an example. The next-generation robo-ride will produce a constant stream of data about itself and the world around it. Some of it is useful only in the moment, and needs to be processed in the car itself – the prime example being spotting something in the road.

In that case, sending the data to the cloud to process and waiting for a decision – "What's that there.... Don't quite know... Let's ask headquarters.... Oh! There's something in the way! Brake!" – is going to take too long, and will likely result in a crash. Instead, it needs to be processed in the car and acted on immediately. Using, cough, cough, an Intel processor.

In a jam

The next level of data in this self-driving vehicle scenario is information with value that may last minutes to hours, such as an accident on the road causing traffic jams. This kind of data should be processed at the edge – and then shared with other cars so that they are aware of the prang and can route around it.

And then, the next level of data is in traffic patterns. That information can be sent up to the cloud where it is processed on a broader basis with large numbers of different data input sources, ie: cars. This system can then assist in better route planning – so at 9am, you take a different route to one you would take at 2pm.
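The three tiers Krzanich sketched amount to a routing decision based on how long the data stays useful: act-now data is handled in the vehicle, minutes-to-hours data at the edge, and long-lived pattern data in the cloud. A minimal sketch of that decision – the function name, tier labels, and time thresholds are our own illustrative assumptions, not anything Intel specified:

```python
from enum import Enum

class Tier(Enum):
    DEVICE = "process in the vehicle"       # act immediately, e.g. obstacle in the road
    EDGE = "process at the network edge"    # useful for minutes to hours, e.g. a prang ahead
    CLOUD = "process in the cloud"          # long-lived patterns, e.g. 9am vs 2pm routes

def route(useful_for_seconds: float) -> Tier:
    """Pick a processing tier from how long the data stays relevant.

    Thresholds here are illustrative, not from the keynote.
    """
    if useful_for_seconds < 1:              # sub-second: braking decisions stay on-device
        return Tier.DEVICE
    if useful_for_seconds < 6 * 3600:       # minutes to hours: share via the edge
        return Tier.EDGE
    return Tier.CLOUD                       # days and up: aggregate in the cloud

print(route(0.05))        # obstacle detection -> Tier.DEVICE
print(route(30 * 60))     # traffic jam report -> Tier.EDGE
print(route(30 * 86400))  # monthly traffic patterns -> Tier.CLOUD
```

The design point is simply that latency tolerance, not data size, picks the tier – which is the logic Krzanich applied to the emergency-brake example above.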

In fact, Krzanich argued, in the case of an autonomous car, you could end up with a situation where more data comes into the car – in the form of music or video or phone calls – than leaves it.

Smart flow

You still have to make sense of these enormous flows of data, however, and that is where artificial intelligence comes in.

Krzanich noted that artificial intelligence is nothing new – "these algorithms have been around for years" – but vastly cheaper storage and readily available computing power have finally made it practical. Now machine-learning software is needed to "unlock the power of data."

Again, though, it all depends on the workload and the application – both of which will vary massively, and so will the type of AI applied to it. "The diversity of AI solutions will be at the edge," Krzanich predicted – using the same logic as the autonomous car: not everything needs to, or should be, sent to the cloud.

He gave another example: cameras at a packing factory reading bar codes and identifying products. By shifting the processing in that case from the edge of the network to the camera itself, Intel found that image recognition became three times faster – from 30 seconds to just 10.

And to make that kind of on-device processing possible, he sees specialized silicon and specialized algorithms for specific tasks – "systems that can learn by themselves" – as being key. Earlier, Brooks had spoken about a facial recognition chip that was optimized to work 50 per cent faster.

Done right, the efficiencies can be enormous: Krzanich said the packing factory was able to increase production volume and package sorting tenfold just by optimizing the operation.

"By 2025, the world will have dramatically shifted," he noted. And Intel's chips will be driving it all, at every level, he assumes. ®

