Fall '06 NVIDIA GPU Refresh - Part I: GeForce 7900 GS
by Derek Wilson on September 6, 2006 9:00 AM EST- Posted in
- GPUs
The Test
Coming up with an ideal testbed for graphics card comparisons has become tricky of late. In the past, we used AMD Athlon 64/X2 configurations, as these were the highest performing platforms, with the added benefit of being able to run SLI and/or CrossFire on the best chipset options for the respective GPUs. Intel's Core 2 Duo launch has muddied the waters somewhat: we are now stuck testing CrossFire on a non-ATI chipset, and SLI testing with Core 2 Duo requires a somewhat outdated nForce4 SLI X16-based motherboard. nForce 590 SLI for Intel will become available in the near future, and although the primary difference will be in features, performance may also improve.
In the end, decisions have to be made on how to test our GPUs, and compromises may be necessary. For now, we have restricted testing to single PCIe x16 solutions. When we publish the second part of this GPU launch coverage, looking at the 7950 GT, we will also examine CrossFire and SLI performance from the various offerings. Here's the test configuration we used:
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: Intel D975XBX (LGA-775)
Chipset: Intel 975X
Chipset Drivers: Intel 7.2.2.1007
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.8 / NVIDIA ForceWare 91.47
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
The games we have chosen to test represent a wide variety of engines and styles. Due to the time constraints of this article, we are testing seven games today. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing. Those two games really shine when HDR is enabled, so we won't bother disabling it.
To improve readability, we will report results as a snapshot of one resolution using our standard graphing engine, alongside a resolution scaling line graph.
29 Comments
munky - Wednesday, September 6, 2006 - link
FEAR is a DX9 game, not OpenGL...
DerekWilson - Wednesday, September 6, 2006 - link
I'm looking into this at the moment but having trouble finding documentation on it. I suppose, as I was recently testing quad SLI and saw huge performance increases, I assumed the game must be using the 4-frame AFR mode only possible in OpenGL (DX is limited to rendering 3 frames ahead). I'll keep looking for confirmation on this ...
MemberSince97 - Wednesday, September 6, 2006 - link
Jupiter EX is a DX9 rendering engine...
DerekWilson - Wednesday, September 6, 2006 - link
corrected, thanks ... now I have to figure out why FEAR likes quad sli so much ...
MemberSince97 - Wednesday, September 6, 2006 - link
Nice writeup DW, I really like the mouseover performance % graphs...
PrinceGaz - Thursday, September 7, 2006 - link
So do I, but there is one error. That should be 14% and 25% advantages.
The 7900GS has 20 PS while the 7900GT has 24 PS. That makes the 7900GS 20% slower than the 7900GT, but it makes the 7900GT 25% faster than the 7900GS. It's important to remember which one you're comparing it against when quoting percentages.
Hopefully the percentage performance difference in the graph itself was calculated correctly, or at least consistently.
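The baseline asymmetry PrinceGaz raises is easy to check directly. A minimal sketch using the pipeline counts from the thread (20 pixel shaders on the 7900 GS, 24 on the 7900 GT), treating pipe count as a stand-in for throughput purely for illustration:

```python
# Illustrates how the same absolute gap reads as a different percentage
# depending on which card is the baseline. Pipe counts are from the
# comment thread; real performance won't scale exactly with pipes.
gs_pipes, gt_pipes = 20, 24

slower = (gt_pipes - gs_pipes) / gt_pipes * 100  # GS deficit vs. GT baseline
faster = (gt_pipes - gs_pipes) / gs_pipes * 100  # GT advantage vs. GS baseline

print(f"7900 GS is {slower:.1f}% slower than the 7900 GT")  # 16.7%
print(f"7900 GT is {faster:.1f}% faster than the 7900 GS")  # 20.0%
```

With a 16-pipe vs. 20-pipe comparison (the numbers PrinceGaz admits he had in mind below), the same formulas yield 20% slower and 25% faster, which is where his figures came from.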
PrinceGaz - Thursday, September 7, 2006 - link
Ooops sorry, please ignore my post. For some reason I thought for a moment the 7900GS had 16 PS and the 7900GT had 20 PS (despite writing the correct values in my comment). The article is correct, I was just getting confused.
PS. an edit function would be nice.
Frackal - Wednesday, September 6, 2006 - link
There is no way an X1900XT gets 75fps at 1600x1200 4xAA; at that same resolution and AA setting I get well over 120-130fps average with an X1900XTX. Most sites show it hitting at least 100+.
DerekWilson - Wednesday, September 6, 2006 - link
if you use the built-in demo features to run a timedemo with DICE's own calculations you will get a very wrong (skewed upward) number. DICE themselves say that results over 100 fps aren't reliable. the problem is that they benchmark the load screen, and generally one card or the other will get better load screen performance -- for instance, the x1900 gt may get 300+ fps while the 7900 gt may only get 200 fps. (I just picked those numbers, but framerates for the load screen are well over 100 fps in most cases and drastically different between manufacturers.)
not only does no one care about this difference on a load screen, but it significantly interferes with benchmark numbers.
the timedemo feature can be used to output a file with frametimes and instantaneous frames per second. we have a script that opens this file, removes the frame data for the load screen, and calculates a more accurate framerate average using only frame data for scenes rendered during the benchmark run.
this will decrease overall scores.
we also benchmark in Operation Clean Sweep, which has a lot of fog and water. we use a benchmark with lots of smoke and explosions and we test for some amount of time in or near most vehicles.
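The load-screen filtering Derek describes can be sketched roughly as follows. This is not AnandTech's actual script, and the input format is a simplifying assumption (one frame time in milliseconds per entry, with the load screen treated as everything before a known cutoff time):

```python
# Hypothetical sketch of Derek's timedemo filtering: drop frames rendered
# during the load screen, then average FPS over only the gameplay frames.
# The per-frame-milliseconds input and fixed cutoff are assumptions, not
# BF2's real timedemo output format.

def filtered_average_fps(frame_times_ms, load_screen_ms=5000.0):
    """Average FPS over frames rendered after the assumed load-screen cutoff."""
    elapsed = 0.0
    gameplay = []
    for ft in frame_times_ms:
        elapsed += ft
        if elapsed > load_screen_ms:  # skip frames shown while loading
            gameplay.append(ft)
    if not gameplay:
        return 0.0
    # average FPS = gameplay frames / gameplay seconds
    return len(gameplay) / (sum(gameplay) / 1000.0)

# Example: 300 load-screen frames at ~3 ms each (~333 fps) followed by
# 600 gameplay frames at 12.5 ms each (80 fps). A naive average over all
# 900 frames reports ~107 fps; filtering recovers the true gameplay rate.
times = [3.0] * 300 + [12.5] * 600
print(round(filtered_average_fps(times, load_screen_ms=900.0), 1))  # 80.0
```

This matches the behavior Derek reports: since the load screen runs well over 100 fps on any card, excluding it necessarily lowers the reported average while making it comparable across GPUs.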
splines - Wednesday, September 6, 2006 - link
Ownage approved.