| |GTX 260|GTX 280|
|Power connectors|6-pin + 6-pin|6-pin + 8-pin|
GeForce GTX 200 GPUs support DirectX 10 and OpenGL 2.1. The cards support two flavours of SLI: the standard connection of two GPU boards, and a new 3-way SLI if you really feel like overdoing things.
The chip architecture consists of a number of texture processing clusters (TPCs), each of which is made up of several streaming multiprocessors (SMs). Each SM contains eight processor cores plus the texture filtering processors used in graphics processing.
The GTX 280 improves on the G80 and G92 designs by increasing the number of SMs per TPC from two to three. It also increases the maximum number of TPCs per chip from eight to ten. So where the GeForce 8 and 9 series had a total of 128 processing cores, the GTX 200 series has a maximum of 240.
To address the more complex shaders in modern games, Nvidia has shifted the GPU balance towards a higher shader-to-texture ratio. By adding one more SM to each TPC and keeping the texturing hardware constant, the shader-to-texture ratio is increased by 50 per cent.
Another new addition is double-precision, 64-bit floating point computation support. That's good news for high-end scientific, engineering, and financial computing applications, a market that Nvidia is focusing more and more attention on with each generation of chip. Each SM incorporates a double-precision 64-bit floating-point math unit, for a total of 30 double-precision 64-bit processing cores.
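Those counts are easy to sanity-check. Here's a minimal sketch in Python (the function name is ours; the TPC, SM, and core figures come from the article above):

```python
def total_cores(tpcs, sms_per_tpc, cores_per_sm=8):
    """Processing cores = TPCs x SMs per TPC x cores per SM."""
    return tpcs * sms_per_tpc * cores_per_sm

# GeForce 8/9 series (G80/G92): up to 8 TPCs, 2 SMs each
print(total_cores(8, 2))   # 128
# GTX 200 series: up to 10 TPCs, 3 SMs each
print(total_cores(10, 3))  # 240

# Texturing hardware per TPC stays constant, so the shader-to-texture
# ratio scales with SMs per TPC: 3/2, i.e. a 50 per cent increase.
ratio_increase = (3 / 2 - 1) * 100  # 50.0

# One double-precision unit per SM gives 10 x 3 = 30 DP cores.
dp_units = 10 * 3
```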
Nvidia has also implemented more flexible power management than in previous GPU incarnations. With a HybridPower-capable nForce motherboard, the GTX 200 GPU can be powered off entirely when not performing graphics-intensive operations. Nvidia estimates Blu-ray playback takes about 35W, and idle mode without HybridPower about 25W.
Meanwhile, AMD is readying its Radeon HD 4800 series, starting with the 4850 at $299. It will later roll out the 4870 at $349. This summer will certainly prove to be an interesting time for the graphics card market. ®
Nvidia launches GTX 200 series GPUs
Big screens benefit the most
Hmmm, from some Inquiring sites, the early numbers seem to suggest that it's the big screens that get the biggest benefit. So if you are still on your 19" monitor with a maximum resolution of 1280x1024, the upgrade would hardly be worth it.
Then again, the people that can afford the 30" screens that benefit the most are the ones most likely to be able to afford one of these cards anyway!
I wonder if ATI will do the same as it did at the last launch and concentrate on volume as opposed to Top Dog. The card that is the Top Dog typically costs Top Dollar, and as I said, it's the people with the big houses, with the large rooms for large monitors, who buy the Top Dog card. So in terms of earnings, I think the lesser cards are what keep the company coffers less empty.
However, the cynic in me still thinks that if ATI could have shot for Top Dog they would have....
PS: I'm a PC gaming fan with a 24" monitor and a self-specced Crysis killer machine (heh, my machine lost to the beast that is Crysis!) but the whole Console and HD thang is most definitely going to be my next purchasing focus.
Nice hardware, could we have some drivers please?
Great, more Nvidia hardware.
Perhaps they could spend a couple of dollars more on the stability of their drivers? The latest 175 series drivers are woeful, leading to all manner of amusing BSODs. Even the latest Nvidia Linux drivers (well, the 64-bit ones) won't recognise my nearly-new 8800GTS!
If you know the wattage of the card, then some basic maths will give you the amps:
Watts = Volts x Amps
Amps = Watts / Volts
19.67A = 236W / 12V
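The sums above can be sketched in a couple of lines of Python (the 236W and 12V figures are just the ones from the comment; the function name is made up for illustration):

```python
def amps(watts, volts=12.0):
    """Current draw on a rail: I = P / V."""
    return watts / volts

# A hypothetical 236W card drawing from the 12V rail:
print(round(amps(236), 2))  # 19.67
```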
Mine's the one with the pocket protector and slide rule.
which is why whenever i buy from abroad, i make sure i enter my full first name.
"Gift for" being a very popular first name round my neck of the woods.
Funny that. All the decent products I've ever looked at seem to have a spec sheet on the manufacturer's website showing total amps available and the max draw on each rail. That even goes down as far as the not-a-well-known-brand unit currently doing its stuff in my rig.
For the cheap ones, who gives a toss? If you buy an el cheapo PSU you should be damned grateful if the volts are anywhere near spec, never mind the current. Anyone not quoting a full spec probably has a damned good reason for keeping quiet on the subject.