Half Life 2 GPU Roundup Part 1 - DirectX 9 Shootout
by Anand Lal Shimpi on November 17, 2004 11:22 AM EST - Posted in GPUs
Battle in the Canal
Our first benchmark is packed with just about all of the stressful elements you will encounter throughout Half Life 2. The demo starts aboard a boat driving through a tunnel before making a splash into a wide open body of water. The boat is piloted over to the shore, where the player dismounts and heads inside for some action.
Inside, the flashlight is used to illuminate dark areas and the player gets into a few firefights before heading upstairs and back outside. While outside (and being pursued by a helicopter), the player encounters a few more enemies on his way into a warehouse. The demo concludes inside the warehouse.
We created this demo because it incorporates just about everything: water, the flashlight, a vehicle, engaging enemies indoors as well as outdoors, and sunlight. Since all of the cards here are very capable, let's first look at performance at 1280 x 1024. Remember that we used the highest detail settings with the exception of anisotropic filtering and antialiasing, which were both disabled for this test (we will look at their impact on image quality and performance later in this review).
It’s no surprise that we find ATI’s Radeon X800 XT at the top of the charts here, but interestingly enough, NVIDIA’s GeForce 6800 Ultra is not far behind. In fact, the X800 XT only outperforms the 6800 Ultra by around 5%.
At the $400 price point, the GeForce 6800GT is able to outperform the Radeon X800 Pro by just under 10%, so while ATI takes the #1 spot, NVIDIA takes numbers two and three here.
As we drop down in price, we see that the Radeon X700 XT, GeForce 6800 and GeForce 6600GT all provide virtually identical performance. With the GeForce 6800 being the most expensive of the three, the winners in the $200 - $250 range end up being the X700 XT and the 6600GT. If you want an AGP card, your only option is the 6600GT.
The Radeon 9800 Pro doesn't actually do too badly at 1280 x 1024; however, in actual gameplay the GPU can stutter a bit, interrupting an otherwise smooth experience. Radeon 9800 and 9700 owners will find a much better balance of performance and image quality at 1024 x 768.
The biggest takeaway from our resolution scaling graphs is a sense of which cards are best suited to 1024 x 768 and what it takes to get butter-smooth performance at 1280 x 1024.
The Radeon 9700 Pro and 9800 Pro are both best suited to 1024 x 768, though they can handle 1280 x 1024 if you are willing to put up with some choppiness.
While the 6800 and the 6600GT perform relatively well at 1600 x 1200, their sweet spot is much closer to 1280 x 1024. Even though all of the cards here seem to scale relatively similarly to one another, only the highest end $400+ cards manage to truly perform well at 1600 x 1200.
79 Comments
Anand Lal Shimpi - Wednesday, November 17, 2004 - link
Thanks for all of the comments guys. Just so you know, I started on Part 2 the minute the first article was done. I'm hoping to be done with testing by sometime tomorrow and then I've just got to write the article. Here's a list of the new cards being tested: 9600XT, 9550, 9700, X300, GF 6200, GF 5900XT, GF4 Ti 4600, GF4 MX440
I'm doing both DX9 and DX8 comparisons, including image quality.
After Part 2 I think I'll go ahead and do the CPU comparison, although I've been thinking about doing a more investigative type of article into Half Life 2 performance in trying to figure out where its performance limitations exist, so things may get shuffled around a bit.
We used the PCI Express 6600GT for our tests, but the AGP version should perform quite similarly.
The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison.
Any other requests?
Take care,
Anand
Cybercat - Wednesday, November 17, 2004 - link
You guys made my day comparing the X700 XT, 6800, and 6600GT together. One question though (and I apologize if this was mentioned in the article and I missed it): did you guys use the PCIe or AGP version of the 6600GT?
Houdani - Wednesday, November 17, 2004 - link
18: Many users rely on hardware review sites to get a feel for what technology is worth upgrading and when. Most of us have financial constraints which preclude us from upgrading to the best hardware, so we are more interested in knowing how mainstream hardware performs.
You are correct that it would not be an efficient use of resources to have AT repeat the tests on hardware that is two or three generations old ... but sampling the previous generation seems appropriate. Fortunately, that's where part 2 will come in handy.
I expect that part 2 will be sufficient in showing whether or not the previous generation's hardware will be a bottleneck. The results will be invaluable for helping me establish my minimum level of satisfaction for today's applications.
stelleg151 - Wednesday, November 17, 2004 - link
forget what i said in 34.....
pio!pio! - Wednesday, November 17, 2004 - link
So how do you softmod a 6800NU to a 6800GT??? Or unlock the extra stuff....
stelleg151 - Wednesday, November 17, 2004 - link
What drivers were being used here, 4.12 + 67.02??
Akira1224 - Wednesday, November 17, 2004 - link
Jedi
lol I should have seen that one coming!
nastyemu25 - Wednesday, November 17, 2004 - link
i bought a 9600XT because it came boxed with a free coupon for HL2. and now i can't even see how it matches up :(
coldpower27 - Wednesday, November 17, 2004 - link
These benchmarks are more in line with what I was predicting: the X800 Pro should be roughly equal to the 6800 GT due to their similar pixel shader fillrates, while the X800 XT should have an advantage at higher resolutions thanks to its higher clock speed and therefore higher fillrate. Unlike DriverATIheaven :P
This is great. I am happy knowing that Nvidia's current generation of hardware is very competitive in performance across the board when fillrates are equal.
Da3dalus - Wednesday, November 17, 2004 - link
In the 67.02 Forceware driver there's a new option called "Negative LOD bias"; if I understand what I've read correctly, it's supposed to reduce shimmering. What was that option set to in the tests? And how did it affect performance, image quality and shimmering?