The Register® — Biting the hand that feeds IT


AMD claims 'world's fastest GPU' title

Radeon HD 7970 sports all-new 'Graphics Core Next' chip

AMD has unveiled its first graphics card based on its Graphics Core Next architecture, which The Reg told you about in excruciating detail this summer. According to AMD, the card – the Radeon HD 7970 – is also the only GPU to be built using a 28-nanometer process.

"This graphics card represents a revolution in the graphics industry," crowed AMD GPU honcho Matt Skynner in AMD's announcement. "To put it bluntly, at 28nm the AMD Radeon HD 7970 changes everything!"

AMD's Graphics Core Next (GCN) architecture is a ground-up rethinking of GPU design, abandoning the time-honored very long instruction word (VLIW) architecture and replacing it with "compute units" (CUs) that are essentially vector cores containing multiple single-instruction-stream, multiple-data-stream (SIMD) structures, programmed on a per-lane basis.
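For readers who'd like that per-lane SIMD idea made concrete, here's a minimal Python sketch – our illustration, not AMD's actual hardware model – of one instruction executing across all lanes at once, with an execution mask deciding which lanes the result lands in:

```python
# Illustrative sketch of a SIMD execute-with-mask model (names are ours,
# not AMD's): a single ADD is applied across every lane simultaneously,
# and a per-lane mask controls which lanes actually commit the result.

def simd_add(lanes_a, lanes_b, exec_mask):
    """One ADD instruction across all lanes; masked-off lanes keep their old value."""
    return [a + b if active else a
            for a, b, active in zip(lanes_a, lanes_b, exec_mask)]

lanes_a = [1, 2, 3, 4]
lanes_b = [10, 10, 10, 10]
exec_mask = [True, False, True, True]   # per-lane predication

result = simd_add(lanes_a, lanes_b, exec_mask)
```

The mask is how a SIMD machine handles divergent branches: both sides of an `if` execute across the hardware, but only the lanes whose predicate holds keep their results.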

What that stack of acronyms adds up to – in theory, at least, since we haven't yet got our hands on a GCN-based GPU – is a chip that not only provides a hefty dose of graphics power, but also lends itself to the CPU-cum-GPU cooperative processing known as "heterogeneous computing", in which chunks of tasks are assigned to compute or graphics cores depending upon which can handle them most efficiently.
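To sketch what "assigning chunks of tasks to compute or graphics cores" might look like in practice, here's a toy Python scheduler – entirely our own illustration, with made-up names and thresholds, not AMD's runtime – that routes wide data-parallel work to the GPU and small or branchy work to the CPU:

```python
# Hypothetical heterogeneous-dispatch sketch (our names and threshold,
# not AMD's): wide data-parallel tasks map well onto a GPU's SIMD lanes,
# while small or branch-heavy tasks stay on the CPU.

def pick_device(task):
    """Choose a device based on how many independent elements a task touches."""
    return "gpu" if task["parallel_width"] >= 64 else "cpu"

tasks = [
    {"name": "pixel_shade",  "parallel_width": 4096},  # embarrassingly parallel
    {"name": "parse_config", "parallel_width": 1},     # serial, branchy
    {"name": "vector_add",   "parallel_width": 1024},  # data-parallel
]

assignment = {t["name"]: pick_device(t) for t in tasks}
```

A real heterogeneous runtime would weigh transfer costs and occupancy too, but the core idea is the same: each chunk of work lands on whichever core type handles it most efficiently.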

AMD Radeon HD 7970 graphics card

The Radeon HD 7970, which Skynner not-so-modestly dubs "the world's fastest GPU", includes support for PCIe 3.0 and AMD CrossFire multiple-GPU tech, plus AMD App Acceleration, which the company claims "enables exquisite high-definition video images and exceptional performance improvements for everyday applications."

With the introduction of the HD 7970, AMD also rolled out a new metric – at least, new to your humble Reg reporter – saying that the 28nm part provides "an improvement of over 150 per cent in performance/sq mm over the prior generation."

The HD 7970 also supports an assortment of AMD-branded enhancements, including:

  • AMD HD3D Technology for – you guessed it – stereo 3D display,
  • AMD Eyefinity Technology for attaching up to six displays, supporting stereo 3D and total resolutions of up to 16K-by-16K pixels,
  • AMD PowerTune Technology for dynamic boosting of clock speeds when power requirements allow, and
  • AMD ZeroCore Power Technology for low idle-power levels and quieter operation.

As you've come to expect from AMD's graphics offerings, the HD 7970 supports DirectX 11, DirectCompute, and OpenCL. Also supported are DisplayPort 1.2 and HDMI 1.4a, and Discrete Digital Multi-Point (DDM) audio to run multiple independent audio streams over those connections.

On paper – and from what we learned about AMD's Graphics Core Next at this summer's Fusion Developer Summit – the Radeon HD 7970 looks like a threat to Nvidia's claim that their GeForce GTX 580 is "the world's fastest DirectX 11 GPU."

When the HD 7970 becomes available on January 9 "from retailers worldwide", it'll run you a cool $549, list price.

And, yes, it'll run Crysis. ®

Intel AMD

The only reason AMD didn't romp off into the distance for a couple of years while Intel pursued NetBurst is that Intel illegally forced Dell et al to use its chips and not AMD's.

The CPU market would be in much better shape now had AMD actually built up a decent cash pile to invest in R&D to compete with Intel's Core architecture.

Anonymous Coward

Yup....

...even with a bloody great joke alert icon, some people still miss the whole point of the post.

Anonymous Coward

@ AC

None of the published reviews agree with your assertion.

The 7970 is approx 20% faster than the 580 on all tests. Only some of the super-OC'd 580s can match it, and they cost MORE than the 7970.

The 590 and the 6990 are both dual GPU cards, and not a meaningful comparison at all. You can expect the 7990 to blow them both away as well.

Your post appears to be green team FUD.


"I get the impression that games developers write primarily for the consoles and then port across an equivalent-ish version to the PC. And consoles are less powerful than a high-end PC + Graphics Card, so are games really making use of the power available in these cards? Correct me if I'm wrong."

Yep, you're wrong. It seems to be a popular myth that PC gaming is dying, but there are still plenty of games developed specifically for PC, along with an awful lot more that are developed for all platforms at the same time rather than just being ported later. Plus, even the worst ports usually have much, much better graphics on the PC version, even though their interface often ends up sucking donkey balls.

It's also worth bearing in mind that the current generation of consoles is obsolete and will likely be replaced within a couple of years (sooner for the Wii, but it's not really worth talking about hardware for that). Not only will that mean PCs need to keep working to stay ahead, but this sort of new technology is exactly the sort of thing that future consoles will be built out of. Remember, pretty much all improvements in computing have been made incrementally, and without the constant push for more powerful PCs, consoles would never be able to improve either.

Slightly more on topic - from the Nvidia GTX580 website linked:

"Swift. Stealthy. Deadly."

I think my money will go to AMD, since at least their hardware doesn't seem to be threatening to kill me.

Anonymous Coward

You're missing the point...

....AMD (again) have realised they can't win on how many processors you can cram into a space, or how fast you can clock something, so they're doing something they're good at: looking at the whole PC and asking how it can be reworked without breaking everything. (Intel have such a huge advantage over AMD that they usually try to force their technologies through instead – USB 3 vs Thunderbolt, anyone?)

So as with the Athlon, x64 and now Fusion, they've said: OK, let's see how we can speed up the whole machine. I know – let's get the CPU and GPU working together nicely, each doing what it's best at.

