Quake 4 Performance
There has always been a lot of debate in the community surrounding pure timedemo benchmarking. We have opted to stick with the timedemo test rather than the nettimedemo option for benchmarking Quake 4. To be clear, this means our test results focus mostly on the capability of each graphics card to render frames generated by Quake 4. The frame rates we see here don't directly translate into what one would experience during game play.
Additionally, Quake 4 limits frame rate to 60 fps during gameplay whether or not VSync is enabled, so the performance characteristics of a timedemo do not reflect actual gameplay. Why run them, then? Because the question we are trying to answer concerns only the graphics subsystem: which graphics card is better at rendering Quake 4 frames. A card that renders Quake 4 frames faster has more headroom in the game, even if the 60 fps cap hides that difference today. While that doesn't mean the end user will see higher performance right now, the potential for more performance is there -- for instance, if the user upgrades the CPU before the next graphics card upgrade.
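The way a frame-rate cap compresses real differences between cards can be shown with a quick back-of-the-envelope calculation. This is purely illustrative -- the per-frame render times below are made up, not measured from Quake 4:

```python
# Hypothetical per-frame render times (in ms) for two imaginary cards.
# A timedemo average reflects raw rendering speed; an in-game average is
# compressed by Quake 4's 60 fps cap (a 1000/60 ms minimum frame time).
CAP_MS = 1000 / 60  # Quake 4's gameplay frame-rate limit

def avg_fps(frame_times_ms, capped=False):
    if capped:
        # Simulate the in-game limiter: no frame finishes early.
        frame_times_ms = [max(t, CAP_MS) for t in frame_times_ms]
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

fast_card = [8, 10, 12, 30]   # mostly well above 60 fps, one slow frame
slow_card = [14, 15, 16, 30]  # just under the cap, same slow frame

# Uncapped (timedemo-style), the fast card shows a clear lead;
# capped (gameplay-style), the two cards score identically.
print(avg_fps(fast_card), avg_fps(slow_card))
print(avg_fps(fast_card, capped=True), avg_fps(slow_card, capped=True))
```

With these numbers the uncapped averages differ by about 13 fps, while the capped averages are identical -- which is exactly why a timedemo can separate cards that feel the same in-game.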
Timedemos do walk a fine line between synthetic benchmarks and real world benchmarks. While we tend to favor real world data here at AnandTech, this type of benchmark is very capable of using a real world data set to test the maximum capabilities of the graphics cards under its particular workload without bottlenecking at other points in the system. To be sure, even timedemos can see memory and CPU bottlenecks, as data must be transferred to the graphics card somehow. But this impact is much lower than the impact of running AI, physics, script management, I/O, and other game code at the same time.
What this means to the end user is that in-game performance will almost always be lower than timedemo performance. It also means that graphics cards that do slightly better than other graphics cards will not always show a tangible performance increase on an end user's system. As long as we keep these things in mind, we can make informed conclusions based on the data we collect.
Our benchmark consists of the first few minutes of the first level. This includes both indoor and outdoor sections, with the initial few fire fights. We tested the game with High Quality settings (not Ultra), and we enabled all the advanced graphics options except for VSync and antialiasing. id does a pretty good job of keeping frame rates very consistent, so in-game frame rates of 25 are acceptable. While we don't have the ability to make a direct mapping to what that means in the timedemo test, our experience indicates that a timedemo fps of about 35 translates into an enjoyable experience on our system. This will certainly vary on other systems, so take it with a grain of salt. But the important thing to remember is that this is more of a test of relative performance of graphics cards when it comes to rendering Quake 4 frames -- it doesn't directly translate to the Quake 4 experience.
The 7600 GT is able to run Quake 4 at this resolution with no problem, while the 6600 GT and the X1600 XT aren't up to the task. Running at 1600x1200 is a good solid resolution for Quake 4, as the low contrast edges and the pixel size (on a 21" monitor like the one we test on) are enough to make aliasing less of an issue than in a game like Battlefield 2. Very interestingly, the X1900 GT bests the 7900 GT in an OpenGL game, where the opposite was true in more than a couple of DirectX games in this series of tests.
The X1900 GT is a great value for Quake 4, offering performance beyond that of the 7900 GT while costing much less. Owners of the X1800 GTO (or similar class cards) aren't doing too poorly here, but owners of the 6600 GT would do well by running at 1024x768 for the best experience. Other cards that couldn't hold their own at 1600x1200 will do fine at 1280x1024.
Interestingly, it seems that a benchmark traditionally ruled by NVIDIA hardware has lost ground to ATI. As this really is more of a pure OpenGL rendering benchmark, we are glad to see ATI doing so well where they haven't had strong performance in the past.
74 Comments
Sharky974 - Friday, August 11, 2006 - link
I tried comparing numbers for SCCT, FEAR, and X3. The problem is that Anand didn't bench any of these with AA in this mid-range test, while other sites all use 4xAA as the default. So in other words, no direct numbers comparison on those three games, at least against those two Xbit/FS articles, is possible. Although the settings are different, both FS and Anand showed FEAR as a tossup, though.
It does appear other sites are confirming Anand's results more than I thought though.
And the X1900GT for $230 is a kickass card.
JarredWalton - Friday, August 11, 2006 - link
The real problem is that virtually every level of a game can offer higher/lower performance relative to the average, and you also get levels that use effects that work better on ATI or NV hardware. Some people like to make a point about providing "real world" gaming benchmarks, but the simple fact of the matter is that any benchmark is inherently different from actually sitting down and playing a game - unless you happen to be playing the exact segment benchmarked, or perhaps the extremely rare game where performance is nearly identical throughout the entire game. (I'm not even sure what an example of that would be - Pacman?)
Stock clockspeed 7900GT cards are fairly uncommon these days, since the cards are so easy to overclock. Standard clocks are actually supposed to be 450/1360 IIRC, and most cards are at least slightly overclocked in one or both areas. Throw in all the variables, plus things like whether or not antialiasing is enabled, and it becomes difficult to compare articles between any two sources. I tend to think of it as providing various snapshots of performance, as no one site can provide everything. So if we determine X1900 GT is a bit faster overall than 7900 GT and another site determines the reverse, the truth is that the cards are very similar, with some games doing better on one architecture and other games on the other arch.
My last thought is that it's important to look at where each GPU manages to excel. If for example (and I'm just pulling numbers out of the hat rather than referring to any particular benchmarks) the 7900 GT is 20% faster in Half-Life 2 but the X1900 GT still manages frame rates of over 100 FPS, but then the X1900 GT is faster in Oblivion by 20% and frame rates are closer to 40 FPS, I would definitely wait to Oblivion figures as being more important. Especially if you run on LCDs, super high frame rates become virtually meaningless. If you can average well over 60 frames per second, I would strongly recommend enabling VSYNC on any LCD. Of course, down the road we are guaranteed to encounter games that require more GPU power, but predicting what game engine is most representative of the future requires a far better crystal ball than what we have available.
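The weighting idea in the comment above can be sketched numerically. This is a hypothetical example (the frame rates and the 60 Hz target are assumptions, not figures from the review): once a card is comfortably past an LCD's refresh rate, extra fps is never displayed, so it makes sense to clamp scores before comparing cards.

```python
# Assumed refresh rate of a typical LCD; frames beyond it are never shown.
TARGET_FPS = 60

def effective_fps(fps, target=TARGET_FPS):
    # Clamp the score: 120 fps and 60 fps look identical on a 60 Hz panel.
    return min(fps, target)

# Made-up results echoing the scenario described in the comment above.
results = {
    "7900 GT":  {"Half-Life 2": 120, "Oblivion": 33},
    "X1900 GT": {"Half-Life 2": 100, "Oblivion": 40},
}

for card, games in results.items():
    score = sum(effective_fps(f) for f in games.values())
    print(card, score)
```

Even though the 7900 GT wins Half-Life 2 by 20% here, the clamped totals favor the X1900 GT, because only the sub-60 fps Oblivion result actually changes what the player sees.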
For what it's worth, I would still personally purchase an overclocked 7900 GT over an X1900 GT for a few reasons, provided the price difference isn't more than ~$20. First, SLI is a real possibility, whereas CrossFire with an X1900 GT is not (as far as I know). Second, I simply prefer NVIDIA's drivers -- the old-style, not the new "Vista compatible" design. Third, I find that NVIDIA always seems to do a bit better on brand new games, while ATI seems to need a patch or a new driver release to address performance issues -- not always, but at least that's my general impression; I'm sure there are exceptions to this statement. ATI cards are still good, and at the current price points it's definitely hard to pick a clear winner. Plus you have stuff like the reduced prices on X1800 cards, and in another month or so we will likely have new hardware in all of the price points. It's a never ending rat race, and as always people should upgrade only when they find that the current level of performance they had is unacceptable from their perspective.
arturnowp - Friday, August 11, 2006 - link
I think another advantage of the 7900GT over the X1900GT is power consumption. I haven't checked the numbers on this matter, so I am not 100% sure.
coldpower27 - Saturday, August 12, 2006 - link
Yes, this is completely true, going by Xbitlab's numbers.
Stock 7900 GT: 48W
eVGA SC 7900 GT: 54W
Stock X1900 GT: 75W
JarredWalton - Friday, August 11, 2006 - link
Speech-recognition + lack of proofing = lots of typos:
"... out of a hat..."
"I would definitely weight..."
"... level of performance they have is..."
Okay, so there were only three typos that I saw, but I was feeling anal retentive.
Sharky974 - Friday, August 11, 2006 - link
Not to beat this to death, but at FS the X1900GT vs 7900GT benchmarks:
X1900GT:
Wins-BF2, Call of Duty 2 (barely)
Loses-Quake 4, Lock On Modern Air Combat, FEAR (barely),
Toss ups- Oblivion (FS runs two benches, foliage/mountains, the cards split them) Far Cry w/HDR (X1900 takes two lower res benches, 7900 GT takes two higher res benches)
From Xbit's X1900 GT vs 7900 GT conclusion:
"The Radeon X1900 GT generally provides a high enough performance in today’s games. However, it is only in 4 tests out of 19 that it enjoyed a confident victory over its market opponent and in 4 tests more equals the performance of the GeForce 7900 GT. These 8 tests are Battlefield 2, Far Cry (except in the HDR mode), Half-Life 2, TES IV: Oblivion, Splinter Cell: Chaos Theory, X3: Reunion and both 3DMarks. As you see, Half-Life 2 is the only game in the list that doesn’t use mathematics-heavy shaders. In other cases the new solution from ATI was hamstringed by its having too few texture-mapping units as we’ve repeatedly said throughout this review."
Xbit review: http://www.xbitlabs.com/articles/video/display/pow...
Geraldo8022 - Thursday, August 10, 2006 - link
I wish you would do a similar article concerning the video cards for HDTV and HDCP. It is very confusing. Even though certain cards might state they are HDCP, it is not enabled.
tjpark1111 - Thursday, August 10, 2006 - link
The X1800XT is only $200 shipped, why not include that card? If the X1900GT outperforms it, then ignore my comment (been out of the game for a while).
LumbergTech - Thursday, August 10, 2006 - link
So you want to test the cheaper GPUs for those who don't want to spend quite as much... ok... well why are you using the CPU you chose then? That isn't exactly in the affordable segment for the average PC user at this point.
PrinceGaz - Thursday, August 10, 2006 - link
Did you even bother reading the article, or did you just skim through it and look at the graphs and conclusion? May I suggest you read page 3 of the review, or in case that is too much trouble, read the relevant excerpt-