NVIDIA's 1.4 Billion Transistor GPU: GT200 Arrives as the GeForce GTX 280 & 260
by Anand Lal Shimpi & Derek Wilson on June 16, 2008 9:00 AM EST - Posted in GPUs
GT200 vs. G80: A Clock for Clock Comparison
The GT200 architecture isn't tremendously different from G80 or G92; it just has a lot more processing power. The comparison below highlights the clock for clock difference between GT200 and its true predecessor, NVIDIA's G80. We clocked both GPUs at 575MHz core, 900MHz memory and 1350MHz shader, so this is a look at the hardware's architectural enhancements combined with the pipeline and bus width increases. The graph below shows the performance advantage of GT200 over G80 at the same clock speeds:
Clock for clock, the width increases alone should put GT200 at the very worst 25% ahead; that is the case where we are entirely texture bound. It is unlikely that an entire game will be blend rate bound to the point where we see greater than 2x speedups, and while synthetic test cases could show this, real world apps just aren't blend bound. More realistically, the 87.5% increase in SPs sets the upper limit on performance improvement at the same clock rate. We see our tests behave within these predicted ranges.
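As a rough sanity check on those bounds, the unit counts alone define the window. The following is an illustrative sketch (not an NVIDIA performance model), assuming the unit counts quoted elsewhere in this review: 128 SPs and 64 texture filtering units on G80 versus 240 SPs and 80 texture filtering units on GT200.

```python
# Illustrative clock-for-clock scaling bounds derived purely from unit counts.
# Assumed counts: G80 = 128 SPs / 64 texture filtering units,
#                 GT200 = 240 SPs / 80 texture filtering units.
g80 = {"sps": 128, "tex_filter": 64}
gt200 = {"sps": 240, "tex_filter": 80}

texture_bound_floor = gt200["tex_filter"] / g80["tex_filter"]  # 1.25x -> the ~25% worst case
compute_bound_ceiling = gt200["sps"] / g80["sps"]              # 1.875x -> the ~87.5% upper limit

print(f"Texture-bound floor:   {texture_bound_floor:.3f}x")
print(f"Compute-bound ceiling: {compute_bound_ceiling:.3f}x")
```

Any clock-for-clock result between roughly 1.25x and 1.875x is consistent with the workload sitting somewhere between purely texture bound and purely compute bound.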
Based on this, it appears that Bioshock is quite compute bound and doesn't run into many other bottlenecks when that burden is eased. Crysis, on the other hand, seems to be limited by more than just compute, as it didn't benefit quite as much.
The way compute has been rebalanced does affect the conditions under which performance will benefit from the additional units. More performance will be available in the case where a game didn't just need more compute, but needed more compute per texture. The converse is true when a game could benefit from more compute, but only if there were more texture hardware to feed it. A small sketch of that ratio shift follows.
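To make the rebalancing concrete, here is a minimal sketch of the ALU-to-texture ratio shift, again assuming the unit counts used above rather than any figure from NVIDIA:

```python
# ALU:TEX ratio shift from G80 to GT200 (illustrative arithmetic only).
g80_ratio = 128 / 64    # 2.0 shader ops available per texture fetch, per clock
gt200_ratio = 240 / 80  # 3.0 shader ops available per texture fetch, per clock

print(f"G80   ALU:TEX ratio: {g80_ratio:.1f}:1")
print(f"GT200 ALU:TEX ratio: {gt200_ratio:.1f}:1")
# A shader that does more math per texture fetch gains the most from GT200;
# one already starved for texture throughput sees much smaller gains.
```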
108 Comments
Chaser - Monday, June 16, 2008 - link
Maybe I'm behind the loop here. The only competition this article refers to is some upcoming new Intel product, in contrast to an announced hard release of the next AMD GPU series a week from now?

BPB - Monday, June 16, 2008 - link
Well nVidia is starting with the high-end, high-priced items. Now we wait to see what ATI has and decide. I'm very much looking forward to the ATI release this week.

FITCamaro - Monday, June 16, 2008 - link
Yeah but for the performance of these cards, the price isn't quite right. I mean you can get two 8800GTs for under $400 and they typically outperform both the 260 and the 280. Yes, if you want a single card, these aren't too bad a deal. But even the 9800GX2 outperforms the 280 normally. So really I have to question the pricing on them. High end for a single-GPU card, yes. Better price/performance than last generation's cards, no. I just bought two G92 8800GTSs and now I don't feel dumb about it, because my two cards that I paid $170 each for will still outperform the latest and greatest, which costs more.
Rev1 - Monday, June 16, 2008 - link
Maybe lack of any real competition from ATI?

hadifa - Monday, June 16, 2008 - link
No, the reason is the high cost to produce: over a billion transistors, low yields, a 512-bit bus ...

Unfortunately the high cost and the advanced tech don't translate to equally impressive performance at this stage. For example, if the card had much lower power usage under load, it would still have been considered a good move forward: comparable performance to a dual-GPU solution, but with much cooler-running and less demanding hardware.
As the review mentions, this card begs for a die shrink. That would let it use less power, cost less, run cooler, and even clock higher.
Warren21 - Monday, June 16, 2008 - link
That competition won't come for another two weeks, but when it does -- rumour has it NV plans to lower their prices. Most preliminary info has the HD 4870 at $299-329 with pretty much GTX 260 performance, or at least biting at its heels.

smn198 - Tuesday, June 17, 2008 - link
You haven't seen anything yet. Check out this picture of the GTX2 290!! http://tinypic.com/view.php?pic=350t4rt&s=3

Mr Roboto - Wednesday, June 18, 2008 - link
Soon it will be that way if Nvidia has their way.