AMD takes on Intel in 'the internet of things'
From refrigerators to one-armed bandits
AMD took direct aim at Intel's low-power Atom embedded processor and platform on Wednesday with the release of its G-Series Fusion APUs (accelerated processing units), the follow-ons to AMD's recently released C-Series and E-Series APUs for the notebook, netbook, and tablet markets.
"AMD's commitment is to ensure the game-changing technologies we develop for consumers and the enterprise are also available for the vast and growing embedded market," said Patrick Patla, general manager of AMD's server and embedded division, when announcing the G-Series debut.
Vast and growing is no exaggeration. The buzzphrase du jour, "the internet of things", refers to how everything from TVs to toasters is rapidly finding its way online – and embedded processors are providing the smarts for that transition.
All of the APUs offer DirectX 11 support – lacking in Intel's latest Sandy Bridge processor line – plus support for OpenGL 4.0 and OpenCL, and hardware decode support for H.264, VC-1, MPEG-2, WMV, DivX, and Adobe Flash.
You can see a full list of the G-Series capabilities by taking a gander at its product brief, but here is a summary of the five parts that make up the series:
According to AMD, the G-Series is targeted at "embedded applications such as Digital Signage, x86 Set-Top-Box (xSTB), IP-TV, Thin Client, Information Kiosk, Point-of-Sale, and Casino Gaming" – all of which can be upstanding citizens in the internet of things.
AMD also trumpeted a host of design wins in its announcement, including companies such as Advansus, Compulab, Congatec, Fujitsu, Haier, iEi, Kontron, Mitec, Quixant, Sintrones, Starnet, WebDT, Wyse, "and many others."
Some of those companies may not be household names, but there's a good chance that one of them supplies the innards of something you use on a regular basis. A Sintrones V-BOX, for example, might have been aboard the bus or train that took you to work this morning, or a Kontron or Quixant controller may have powered the one-armed bandits that entertained attendees at the recent Las Vegas Consumer Electronics Show during their off hours. ®
Meh, with a capital M.
It sounds like the Intel CE line from two years ago, which was interesting for a few minutes but has been fraught with trouble. As has been pointed out, x86 has horrible power efficiency; I have seen a lovely dual-core ARM-based chip decoding HD video or doing 3D graphics in under 2W maximum.
The kicker as far as CE is concerned for these devices is: do they have proven crypto cores with secure on-board NVM that is isolated from the application processor? If not, they won't get certification from any of the content security companies, and so they won't hit the home entertainment market. At least one company I know has dropped an Intel CE part for a major project because it couldn't pass security certification after two attempts.
Unless AMD has fully understood security, the most these products are going to do is power glorified netbooks or expensive alternatives to the existing Chinese TV media players. The power consumption also prevents them from being used in many commercial set-top box designs, because they won't meet the European requirements for efficient power design in complex set-top boxes. If a device consumes more than 4W when running full video decode and graphics, it isn't classed as meeting the requirements!
Followed by taking on Eddie the Eagle in ski jumping....?
Why on earth would AMD want to take on Intel in the internet of things? The internet of things runs on ARM. Taking on Intel in the internet of things is like taking on Eddie the Eagle in ski jumping.
"internet of things"
Take a look around any random house or any random commercial premises.
Pick a few boxes with computers inside, other than PC-derived things. At home, a set top box, in the office, a router, I'm sure you can find plenty.
Now, how many of them will be x86-based? I expect the answer will be basically zero, and I'd stake money on it.
Intel CPUs (even with Atom) are basically irrelevant unless the hardware needs to run Windows.
The "internet of things" doesn't run Windows. It may well run Linux, or QNX, or VxWorks, or some other OS most readers have never heard of.
"The internet of things" doesn't run on x86 though. And won't.
Point malware at the device? WTF?
Sorry, but you can't exploit Flash (let alone just the Flash/H.264 decode hardware on these processors) simply by chucking a malformed Flash file at the device's IP address. Flash is not a server on the box (some might argue this): it has no open ports listening for connections, let alone listening for Flash files to then try to display.
/Fail for complete lack of understanding of network communication.
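The point about open ports is the crux: remote input can only reach code that is actually accepting connections. A minimal sketch in Python of how you would check whether anything on a box is listening on a given TCP port (the host and port below are arbitrary illustrations, not taken from the article):

```python
import socket

def has_listener(host, port, timeout=0.5):
    """Return True if something is accepting TCP connections on host:port."""
    try:
        # create_connection raises OSError (refused/timed out) if nothing listens
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# A decoder library that never opens a server socket shows up as nothing here:
print(has_listener("127.0.0.1", 54321))  # False unless something happens to listen there
```

A closed port refuses (or silently drops) the connection before any payload is read, which is why a malformed file can't be "pointed at" a device that exposes no service.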
14x USB 2.0? and the massive PCB power draw
"14x USB 2.0?"
Really, have they got no imagination? 14x USB 2.0? Who needs 14 of anything in a consumer device?
I'd be hard pushed to justify even 14 fast SATA ports, and those are actually useful – never mind wasting die space and the massive PCB power draw on 14 slow USB 2.0 ports.
Perhaps people don't actually use their Google fu to look at the many ARM and low-power PPC SoCs, and see the many cheap chips giving you Gigabit Ethernet and useful stuff without going OTT (over the top).