NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST - Posted in GPUs
AA Disabled Performance
Up to this point, most of our benchmarks have been run with 4xAA, as we feel most people considering something like the new 8800 GTX are going to be interested in image quality as well as performance. If you don't care about antialiasing, the need for such fast graphics cards trails off quickly, as you'll see here.
The 8800 GTX SLI still has issues with Battlefield 2, but more importantly you see the clustering of all of the high-end graphics configurations once antialiasing is disabled. Discounting the single ATI X1950 XTX and GeForce 7900 GTX cards, the spread among all the cards is about 20%-25%. Battlefield 2 is also clearly beginning to run into CPU limitations, with many of the cards showing very little in the way of performance drops when going from 1600x1200 to 1920x1440. When 8800 GTX SLI is fixed, we expect to see a more or less flat line throughout resolution scaling. Battlefield 2142 would again be a nice title to test, as frame rates are a bit lower there, but overall the Battlefield series has always been pretty demanding when it comes to CPU power (not to mention having enough memory).
With 4xAA, Episode One showed a bit more separation, and our particular demo seemed to be CPU limited to around 230 FPS. Disabling antialiasing shows that 230 FPS is indeed where our CPU tops out. The other cards move closer to this mark, but without dropping to a lower resolution none of them are yet able to reach it. With the minimum score coming in at 56 FPS, and even then only at 2560x1600, Half-Life 2: Episode One does not appear to really need anything faster in the GPU department just yet.
Disabling antialiasing in Prey improved performance in most of the tested configurations by about 20%, and the 8800 GTX SLI setup becomes a bit more CPU limited. The relative positions of the cards don't really change much, although the GeForce 7 series cards appear to do slightly better without antialiasing compared to the ATI cards.
111 Comments
JarredWalton - Wednesday, November 8, 2006 - link
Page 17:"The dual SLI connectors are for future applications, such as daisy chaining three G80 based GPUs, much like ATI's latest CrossFire offerings."
Using a third GPU for physics processing is another possibility, once NVIDIA begins accelerating physics on their GPUs (something that has apparently been in the works for a year or so now).
Missing Ghost - Wednesday, November 8, 2006 - link
So it seems like by subtracting the highest 8800 GTX SLI power usage result from the one for the 8800 GTX single card, we can conclude that the card can use as much as 205W. Does anybody know if this number could increase when the card is used in DX10 mode?
JarredWalton - Wednesday, November 8, 2006 - link
Without DX10 games and an OS, we can't test it yet. Sorry.
JarredWalton - Wednesday, November 8, 2006 - link
Incidentally, I would expect the added power draw in SLI comes from more than just the GPU. The CPU, RAM, and other components are likely pushed to a higher demand with SLI/CF than when running a single card. Look at FEAR as an example, and here's the power differences for the various cards. (Oblivion doesn't have X1950 CF numbers, unfortunately.)
X1950 XTX: 91.3W
7900 GTX: 102.7W
7950 GX2: 121.0W
8800 GTX: 164.8W
Notice how in this case the X1950 XTX appears to use less power than the other cards, but that's clearly not the case in single GPU configurations, where it draws more than everything besides the 8800 GTX. Here's the Prey results as well:
X1950 XTX: 111.4W
7900 GTX: 115.6W
7950 GX2: 70.9W
8800 GTX: 192.4W
So there, GX2 looks like it is more power efficient, mostly because QSLI isn't doing any good. Anyway, simple subtraction relative to dual GPUs isn't enough to determine the actual power draw of any card. That's why we presented the power data without a lot of commentary - we need to do further research before we come to any final conclusions.
IntelUser2000 - Wednesday, November 8, 2006 - link
It looks like adding a second card for SLI uses about 170W more power. You can see how significant the video card is in terms of power consumption. It blows the Pentium D away by a couple of times.
JoKeRr - Wednesday, November 8, 2006 - link
Well, keep in mind the inefficiency of the PSU, generally around 80%, so as overall power draw increases, the marginal loss of power increases a lot as well. If you actually multiply by 0.8, it gives about 136W. I suppose the power draw is measured from the wall.
DerekWilson - Thursday, November 9, 2006 - link
Max TDP of G80 is at most 185W -- NVIDIA revised this to something in the 170W range, but we know it won't get over 185W in any case. But games generally don't enable a card to draw max power ... 3DMark, on the other hand ...
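The estimation approach debated in this thread (subtract the single-card wall reading from the SLI reading, then scale by PSU efficiency to get DC-side power) can be sketched as follows. This is a rough illustration of the commenters' arithmetic, not AnandTech's methodology; the wall-draw figures and the flat 80% efficiency are assumptions for the example.

```python
# Naive card-power estimate from wall-socket measurements, as discussed
# in the comments above. Assumes a flat PSU efficiency; real efficiency
# varies with load, and (as Jarred notes) some of the SLI delta comes
# from extra CPU/RAM load, so this tends to overestimate the card itself.

PSU_EFFICIENCY = 0.80  # assumed ~80%, per the comment thread

def card_power_estimate(sli_wall_watts, single_wall_watts,
                        psu_efficiency=PSU_EFFICIENCY):
    """Attribute the whole SLI-vs-single wall delta to the second card,
    then convert wall (AC) power to DC power at the PSU's efficiency."""
    delta_wall = sli_wall_watts - single_wall_watts
    return delta_wall * psu_efficiency

# A ~170W wall delta (illustrative readings) works out to roughly 136W DC:
print(card_power_estimate(sli_wall_watts=470, single_wall_watts=300))  # 136.0
```

Note that this still can't isolate the GPU alone, which is why simple subtraction isn't enough to pin down a card's true draw.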
photoguy99 - Wednesday, November 8, 2006 - link
Isn't 1920x1440 a resolution that almost no one uses in real life? Wouldn't 1920x1200 apply to many more people?
It seems almost all 23", 24", and many high end laptops have 1920x1200.
Yes we could interpolate benchmarks, but why when no one uses 1440 vertical?
Frallan - Saturday, November 11, 2006 - link
Well, I have one more suggestion for a resolution. Full HD is 1920*1080 - that is sure to be found in a lot of homes in the future (after X-mas any1 ;0) ) on large LCDs - I believe it would be a good idea to throw that in there as well. Especially right now, since loads of people will have to decide how to spend their money. The 37" Full HD is a given, but on what system will I be gaming: PS3/Xbox/PC... Pls advise.
JarredWalton - Wednesday, November 8, 2006 - link
This should be the last time we use that resolution. We're moving to LCD resolutions, but Derek still did a lot of testing (all the lower resolutions) on his trusty old CRT. LOL