NVIDIA's GeForce 7800 GTX Hits The Ground Running
by Derek Wilson on June 22, 2005 9:00 AM EST- Posted in
- GPUs
Half Life 2 Performance
Half-Life 2 is arguably one of the best looking games currently available. We mentioned earlier that it stresses pixel processing power more than memory bandwidth on the graphics card, and we see that here. While enabling AA/AF does cause a performance loss at these high resolutions, it isn't truly severe until we reach 2048x1536. At the more common resolution of 1600x1200 - not everyone has a monitor that supports higher resolutions - the single 7800GTX is actually faster than 6800U SLI. In fact, the only time the 6800 SLI setup wins out is at 4xAA/8xAF at 2048x1536, and then only by 3%. Also worth noting is that while SLI helps the 6800 series quite a bit (provided you have a fast CPU and are running a high resolution), the 7800GTX is clearly running into CPU limitations. Our FX-55 can't push any of the cards past the 142 FPS mark regardless of resolution.

ATI has done well in HL2 since its release, and that trend continues. The SLI configurations (other than the 6600GT) all surpass the performance of the X850XTPE, but it does come out ahead of the 6800U in a single card matchup - as much as 42% faster when we look at the 1600x1200 AA/AF scores. The 7800GTX, of course, manages to beat it quite easily. The 540 MHz core clock of the XTPE is quite impressive in the pixel shader heavy HL2, but with its additional pipelines and improved MADD functionality, the 7800GTX chews through HL2 and spits out Combine gravel.
One thing that isn't immediately clear is why the 6800U cards have difficulty with the 2048x1536 resolution in some games. Performance drops by almost half when switching from 1600x1200 to 2048x1536, so either there's a driver problem or the 6800U simply doesn't handle the demands of HL2 well at that resolution. There are 63% more pixels at 2048x1536 than at 1600x1200, which in a purely pixel-bound scenario would cut frame rates by only about 40%; a drop of nearly half suggests something more is going on. We would venture to guess that it's a matter of priorities: the number of people who actually game at 2048x1536 is very small compared to the total number of gamers, and with most cards only providing a 60Hz refresh rate at that resolution, many don't worry too much about gaming performance there.
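As a quick sanity check on the numbers above, here is a small sketch (the function name is ours, not from the article) that computes the pixel-count increase between the two resolutions and the frame-rate drop you would expect if rendering scaled purely with pixel count:

```python
def pixel_increase_pct(w1, h1, w2, h2):
    """Percentage increase in pixel count going from w1 x h1 to w2 x h2."""
    return ((w2 * h2) / (w1 * h1) - 1.0) * 100.0

# 1600x1200 -> 2048x1536, the two resolutions compared in the article
increase = pixel_increase_pct(1600, 1200, 2048, 1536)

# If frame rate scaled inversely with pixel count, the expected drop is:
expected_drop = (1.0 - 1.0 / (1.0 + increase / 100.0)) * 100.0

print(f"{increase:.1f}% more pixels")        # -> 63.8% more pixels
print(f"{expected_drop:.1f}% expected drop")  # -> 39.0% expected drop
```

A roughly 39% expected drop versus the observed drop of almost half is what makes the 6800U result look anomalous.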
127 Comments
View All Comments
CrystalBay - Wednesday, June 22, 2005 - link
Does this card play Riddick smoothly @ shader 2++?

fishbits - Wednesday, June 22, 2005 - link
"In aboot 5 years i figure we'll be paying 1000 bucks for a video card. These prices are getting out of control, every generation is more expensive then the last. Dont make me switch to consoles damnit."Funny, I can't afford the very best TVs the minute they come out. Same for stereo components. But I don't cry about it and threaten "Don't make me switch to learning the ukelele and putting on my own puppet shows to entertain myself!" Every time a better component comes out, it means I get a price reduction and feature upgrade on the items that are affordable/justifiable for my budget.
Seriously, where does the sense of entitlement come from? Do these people think they should be able to download top-of-the-line graphics cards through BitTorrent? Do they walk around Best Buy cursing out staff, manufacturers and customers for being so cruel as to buy and sell big-ass plasma TVs?
On second thought, get your console and give up PC gaming. That way you can stop being miserable, and we can stop being miserable hearing about your misery.
tazdevl - Wednesday, June 22, 2005 - link
Funny how the single card deltas here are higher than at any other site.

Underwhelmed given the amount of money and the lack of performance increase.
Have to commend nVIDIA for ensuring retail availability at launch.
archcommus - Wednesday, June 22, 2005 - link
Impressive, but I'm still happy with my X800 XL purchase for only $179. From the looks of it, with a 1280x1024 display, I won't need the kind of power this card delivers for a very long time. And less than $200 compared to $600, with still excellent performance for now and the foreseeable future? Hmm, I'll take the former.

Lonyo - Wednesday, June 22, 2005 - link
I would have liked to see some 1280x1024 benchmarks with 8xAA on the nVidia cards and 6xAA on the ATI cards, to see if it's worth getting something like a 7800GTX with a 17/19" LCD to run some super high quality settings in terms of AA/AF.

segagenesis - Wednesday, June 22, 2005 - link
I'm not disappointed. For one thing, the price of current cards will likely drop now, and there will also be mid-range parts to choose from soon. I think the transparency AA is a good idea for, say... World of Warcraft. The game is loaded with alpha textures, and too often you can see the blockiness of trees/grass/whatever.

#44 - Actually, are you new to the market? :) I remember when early "accelerated" VGA cards were nearly $1000. Or more.
Everybody lambasted NVIDIA last year for the lack of product (6800GT/Ultra) to the market, so them actually making a presence this year instead of a paper launch should also be commended. Of course, now what is ATI gonna pull out of its hat?
KeDaHa - Wednesday, June 22, 2005 - link
"The screenshot shows very clearly that SSAA provides quite a quality improvement over no AA"

The difference is bloody minuscule. Perhaps if you used an image SLIGHTLY larger than 640x480 to highlight the difference?
L3p3rM355i4h - Wednesday, June 22, 2005 - link
Wowzers. Time to get rid of the 9800...

shabby - Wednesday, June 22, 2005 - link
In aboot 5 years i figure we'll be paying 1000 bucks for a video card. These prices are getting out of control, every generation is more expensive than the last. Don't make me switch to consoles damnit.
Xenoterranos - Wednesday, June 22, 2005 - link
Hell, for the same price as an SLI setup I can go out and get a 23 inch Cinema Display... And since these cards can't handle the 30" model's native resolution anyway, it's a win-win. And yeah, what's up with the quality control on these benchmarks? I mean really, I almost decided to wait for ATI's next-gen part when I saw this. (GeForce man since the GeForce2 GTS!)