NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST
AA Disabled Performance
Up to this point, most of our benchmarks have been run with 4xAA, as we feel most people considering something like the new 8800 GTX are going to be interested in image quality as well as performance. If you don't care about antialiasing, the need for such fast graphics cards trails off quickly, as you'll see here.
The 8800 GTX SLI still has issues with Battlefield 2, but more importantly, you can see the clustering of all of the high-end graphics configurations once antialiasing is disabled. Discounting the single ATI X1950 XTX and GeForce 7900 GTX cards, the spread among all the cards is about 20%-25%. Battlefield 2 is also clearly beginning to run into CPU limitations, with many of the cards showing very little in the way of performance drops when going from 1600x1200 to 1920x1440. When the 8800 GTX SLI issue is fixed, we expect to see a more or less flat line throughout resolution scaling. Battlefield 2142 would again be nice to test, as frame rates are a bit lower with that title, but overall the Battlefield series has always been pretty demanding when it comes to CPU power (not to mention requiring plenty of memory).
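The way we spot a CPU-limited result is simple: if a card's frame rate barely moves when the resolution goes up, the GPU isn't the bottleneck, and a truly CPU-bound configuration produces the "more or less flat line" described above. Here is a minimal Python sketch of that check; the card names and FPS numbers are made up for illustration and are not the results measured in this review.

```python
# Illustrative only: the FPS numbers and card names below are made up,
# not the results measured in this review.

RESOLUTIONS = ["1600x1200", "1920x1440", "2560x1600"]

# fps[card][i] = average frame rate at RESOLUTIONS[i]
fps = {
    "Card A": [98.0, 96.5, 80.2],
    "Card B": [97.1, 95.8, 72.4],
    "Card C": [88.0, 79.3, 58.1],
}

def spread_percent(results, res_index):
    """Gap between the fastest and slowest card at one resolution,
    expressed as a percentage of the slowest card's score."""
    scores = [r[res_index] for r in results.values()]
    return (max(scores) - min(scores)) / min(scores) * 100.0

def looks_cpu_limited(scores, tolerance=0.05):
    """Call a configuration CPU-limited if its frame rate drops by less
    than `tolerance` (5% here) at each step up in resolution."""
    return all(
        (scores[i] - scores[i + 1]) / scores[i] < tolerance
        for i in range(len(scores) - 1)
    )

for i, res in enumerate(RESOLUTIONS):
    print(f"{res}: fastest vs. slowest spread = {spread_percent(fps, i):.1f}%")

for card, scores in fps.items():
    # Only compare 1600x1200 against 1920x1440, as in the text above.
    if looks_cpu_limited(scores[:2]):
        print(f"{card} is nearly flat from 1600x1200 to 1920x1440 -> likely CPU-bound")
```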
With 4xAA, Episode One showed a bit more separation, and our particular demo seemed to be CPU limited to around 230 FPS. Disabling antialiasing shows that 230 FPS is indeed where our CPU tops out. The other cards move closer to this mark, but without dropping to a lower resolution none of them are yet able to reach it. With the minimum score coming in at 56 FPS, and even then only at 2560x1600, Half-Life 2: Episode One does not appear to really need anything faster in the GPU department just yet.
Disabling antialiasing in Prey improved performance in most of the tested configurations by about 20%, and the 8800 GTX SLI setup becomes a bit more CPU limited. The relative positions of the cards don't really change much, although the GeForce 7 series cards appear to do slightly better without antialiasing compared to the ATI cards.
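For clarity, the "about 20%" figure is simply the relative change between the two runs. A quick sketch with hypothetical frame rates (not the actual Prey scores) shows how it works out:

```python
# Hypothetical frame rates, used only to show how the percentage is derived;
# these are not the measured Prey results.
fps_with_4xaa = 75.0   # average FPS with 4xAA enabled
fps_no_aa     = 90.0   # average FPS with antialiasing disabled

improvement = (fps_no_aa - fps_with_4xaa) / fps_with_4xaa * 100.0
print(f"Disabling 4xAA is worth about {improvement:.0f}% here")  # ~20%
```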
111 Comments
Sunrise089 - Thursday, November 9, 2006 - link
Then I suppose he's in the market to part with an ugly old high-end CRT. I'd love to buy it from him. Seriously.
JarredWalton - Thursday, November 9, 2006 - link
You want an older 21" Cornerstone CRT? It's a beast, but you can have it for the cost of shipping (which unfortunately would probably be ~$50). I'd also sell my Samsung 997DF 19" CRT for about $50, and maybe an NEC FE991-SB for $50 (which unfortunately has a scratch from my daughter in the anti-glare coating). If anyone lives in the Olympia, WA area, you know how to contact me (I hope). I'd rather someone come by to pick up any of these CRTs rather than shipping, as I don't think I have the original boxes.
DerekWilson - Thursday, November 9, 2006 - link
lol next thing you know links to ebay auctions are gonna start showing up in our articles :-)
yyrkoon - Thursday, November 9, 2006 - link
lol, I've got a 21" techtronics I'll sell for $200 USD, plus shipping ;) Hasn't been used since I purchased my Viewsonic VA1912wb (well, been used very little).
imaheadcase - Wednesday, November 8, 2006 - link
can't stand AA benchmarks myself :)
Question: Do you have any info on what kinda card nvidia is releasing this feb? Is it something in between these 2 cards or something even lower?
I'm looking for a $300ish G80! :D
flexy - Wednesday, November 8, 2006 - link
if ANYTHING counts, it's how those high-end cards perform WITH their various AA settings.... the power of those cards (and the money spent on them :) translated RIGHT into ---> IMAGE QUALITY/PERFORMANCE. Please don't tell me you would get a G80 but do NOT care about AA, that does NOT make any sense...sorry...
I am especially impressed reading that transparency AA has such a SMALL performance impact. What game engine did you test this on?
On the older ATI cards (and am I right that T.A. is the same as "adaptive antialiasing"?)...this feature (depending on game engine) is the FPS killer....eg. with games like oblivion (WHERE ARE THE GOTHIC 3 BENCHIES BTW? :)...game engines with lots of vegetation etc.
Enable transparency AA and see all those trees, grass, etc. without jaggies.
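The effect being described is what transparency/adaptive AA does to alpha-tested foliage. As a toy illustration of one common approach (alpha-to-coverage-style transparency multisampling, not any vendor's actual implementation, with a made-up sample count), the difference from a plain alpha test looks roughly like this:

```python
# Toy sketch only: with a plain alpha test, a foliage texel is either fully
# kept or fully discarded, so leaf and grass edges stay hard; turning alpha
# into a per-pixel sample coverage count gives those edges intermediate
# values, much like ordinary polygon edges get with MSAA.

SAMPLES = 4  # pretend 4x multisampling

def alpha_test_coverage(alpha, threshold=0.5):
    """Classic alpha test: all samples kept or none."""
    return SAMPLES if alpha >= threshold else 0

def alpha_to_coverage(alpha):
    """Transparency-AA style: alpha picks how many samples are covered."""
    return round(alpha * SAMPLES)

for alpha in (0.10, 0.35, 0.60, 0.90):
    print(f"alpha={alpha:.2f}: alpha test -> {alpha_test_coverage(alpha)}/{SAMPLES} samples, "
          f"alpha-to-coverage -> {alpha_to_coverage(alpha)}/{SAMPLES} samples")
```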
imaheadcase - Thursday, November 9, 2006 - link
Well, lots of people don't care for AA. Even if I had this card I would not use it. I visually see NO difference with it on or off. It's personal taste. I don't even see "jaggies" on my older 9700 PRO card.
flexy - Thursday, November 9, 2006 - link
you sure are talking about ANTIALIASING??? What resolutions do you run? Not that my CRT can even handle more than 1600x1200..but even w/ 1600 I get VERY prominent jaggies if I don't run AA.
I made it a habit to run at least 4xAA in ANY game, and some engines (HL2's Source engine, etc.) run extremely well with 4xAA; even 6xAA is very playable, at least with HL2.
The very recent games, namely NWN2 and G3, now don't support AA; playing at 1280x1024 it looks utterly horrible! If you say you don't see jags at ANY resolution under 1600..very hard to believe
imaheadcase - Thursday, November 9, 2006 - link
Yes, I'm talking about antialiasing. I normally play BF2 and oblivion at 1024x768 (9700 pro, remember). Fact is, most people won't see them unless someone points them out. The brain is still better at rendering stuff the way you want to see it vs hardware :)
flexy - Thursday, November 9, 2006 - link
ok..but then it's also a performance problem. If it doesn't bother you, well ok. I also have to settle w/ the fact that many RECENT games are even unable to do AA..however I wish they would.
But once I get an 8800 I will do &&&& to get the most out of IQ, AA, AF, transparency/texture AF, you name it. If only for the reason that I would need a super-high-end monitor first to even run resolutions like 2000xsomething...and as long as I have a lame 19" CRT and CANNOT even go over 1600 (99.99% of games even running everything at 1280x or 1360x) I will use all the power to get the best possible IQ at those low resolutions.
Also..looking at the benchmarks..it's NOT that you lose any real gaming experience, since THOSE monster cards are made for exactly this...eg. running oblivion with all those settings at MAX AND AA on and HDR...and you are still in VERY reasonable FPS ranges.