Half Life 2 GPU Roundup Part 1 - DirectX 9 Shootout
by Anand Lal Shimpi on November 17, 2004 11:22 AM EST- Posted in
- GPUs
Benchmarking Half Life 2
Unlike Doom 3, Half Life 2 has no built-in benchmark demo, but it does have full benchmark functionality. To run a Half Life 2 timedemo, you must first modify your Half Life 2 shortcut to include the -console switch, then launch the game.
Once Half Life 2 loads, simply type timedemo followed by the name of the demo file you would like to run. All Half Life 2 demos must reside in the C:\Program Files\Valve\Steam\SteamApps\username\half-life 2\hl2\ directory.
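Concretely, the procedure looks like this (the demo filename below is a placeholder - substitute the name of whichever demo from our suite you want to run):

```
rem Appended to the end of the Half Life 2 shortcut's Target field:
... -console

rem Then, at the in-game console:
timedemo demo_name
```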
Immediately upon its launch, we spent several hours playing through the various levels of Half Life 2, studying them for performance limitations as well as for how representative they were of the rest of the game. After our first pass, we narrowed the game down to 11 levels that we felt would make good, representative benchmarks of gameplay throughout Half Life 2. We then trimmed the list further to just five levels: d1_canals_08, d2_coast_05, d2_coast_12, d2_prison_05 and d3_c17_12, and put together a suite of five demos based on them that we believe are together representative of Half Life 2 gameplay. You can download a zip of our demos here. As we mentioned earlier, ATI is distributing some demos of their own, but we elected not to use them in order to remain as fair as possible.
When benchmarking Half Life 2 we discovered a few interesting things:
Half Life 2's performance is generally shader (GPU) limited outdoors and CPU limited indoors. This rule of thumb breaks down if you run at unreasonably high resolutions (resolutions too high for your GPU) or if you have a particularly slow CPU or GPU, but for the most part, take any of the present-day GPUs that we are comparing here today and you'll find that the statement holds true.
Using the flashlight can result in a decent performance hit if you are already running close to the maximum load of your GPU. The reason is that the flashlight adds another set of per-pixel lighting calculations to anything you point the light at, increasing the length of any shaders running at the time.
The flashlight at work
Levels with water or any other types of reflective surfaces generally end up being quite GPU intensive as you would guess, so we made it a point to include some water/reflective shaders in our Half Life 2 benchmarks.
But the most important thing to keep in mind with Half Life 2 performance is that, interestingly enough, we didn't test a single card today that we felt was slow. Some cards were able to run at higher resolutions, but at a minimum, 1024 x 768 was extremely playable on every single card we compared here today - which is good news for those of you who just upgraded your GPUs or who have made extremely wise purchases in the past.
For our benchmarks, we used the same settings on all GPUs.
Our test platforms were MSI's K8N Neo2 (nForce3) for AGP cards and ASUS' nForce4 motherboard for PCI Express graphics cards. The two platforms are comparable in performance so you can compare AGP numbers to PCI Express numbers, which was our goal. We used an Athlon 64 4000+ for all of our tests, as well as 1GB of OCZ DDR400 memory running at 2-2-2-10.
79 Comments
ballero - Wednesday, November 17, 2004 - link
it'd be nice a comparison between cpu

Jalf - Wednesday, November 17, 2004 - link
To those wanting benchmarks on older hardware, remember that this is a hardware site, not a games review site. Their focus is on the hardware, and honestly, few hardware enthusiasts can get excited about an 800 MHz CPU or a GeForce 3. ;)
For AT, HL2 is a tool to compare new *interesting* hardware. It's not the other way around.
CU - Wednesday, November 17, 2004 - link
I would also like to see slower CPUs and 512MB systems tested. It seems all recent cards can run it fine, so it would be nice to see how other things affect HL2.

CU - Wednesday, November 17, 2004 - link
Based on the 6800NU vs. 6600GT results, I would say that HL2 is being limited by fillrate and not bandwidth. I say this since they both have about the same fillrate, but the 6800NU has around 40% more bandwidth than the 6600GT. So, unlocking extra pipes and overclocking the GPU should give the biggest increase in fps. Anyone want to test this?

Jeff7181 - Wednesday, November 17, 2004 - link
... in addition... this is a case where minimum frame rates would be very useful to know.

Jeff7181 - Wednesday, November 17, 2004 - link
Those numbers are about what I expected. I'm sort of thinking that triple buffering isn't working with the 66.93 drivers and HL2, because with vsync enabled, it seems like the frame rate is either 85 or 42. I also suspected that anisotropic filtering wasn't particularly necessary... I'll have to try it without and see how it looks... although with 4XAA and 8XAF I'm still getting acceptable frame rates.
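A simplified model of what Jeff7181 is describing: with vsync on and only double buffering, a finished frame must wait for a refresh, so the displayed rate snaps to an integer divisor of the refresh rate (85, 42.5, 28.3... on an 85 Hz monitor), while working triple buffering lets intermediate rates through. This is a sketch of the arithmetic, not the actual swap-chain logic in the driver:

```python
import math

def vsync_fps(raw_fps, refresh_hz=85, triple_buffered=False):
    """Frame rate actually displayed with vsync enabled.

    With double buffering, a frame that misses a refresh waits for the
    next one, so the rate snaps to refresh/1, refresh/2, refresh/3, ...
    With triple buffering, rendering continues into a third buffer, so
    the raw rate passes through (capped at the refresh rate).
    """
    if triple_buffered:
        return min(raw_fps, refresh_hz)
    divisor = math.ceil(refresh_hz / raw_fps)  # refreshes per displayed frame
    return refresh_hz / divisor

# A card rendering 60 fps on an 85 Hz monitor:
print(vsync_fps(60))                         # 42.5
print(vsync_fps(60, triple_buffered=True))   # 60
```

Under this model, any raw rate between roughly 43 and 84 fps displays as 42.5 when triple buffering is broken, which matches the 85-or-42 behavior described above.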
nserra - Wednesday, November 17, 2004 - link
#8 I never heard of 6800 extra pipes being unlocked, where did you see that? Aren't you confusing it with the ATI 9500 cards?

MAME - Wednesday, November 17, 2004 - link
Make some budget video card benchmarks (Ti4200 plus or minus) and possibly a slower cpu or less ram so that people will know if they have to upgradeAkira1224 - Wednesday, November 17, 2004 - link
#8 That's not a fair comparison. Yes, at the moment it would seem the 6800NU is a better buy. However, if you go to Gameve you will find the XFX 6600GT (clocked at PCIe speeds) for $218. That's a much better deal than your example using Newegg. You talk about a $5 difference... if you are a smart shopper you can get upwards of a $50 difference. THAT makes the 6600GT the better buy, especially when you consider that the market this card is aimed at is not the same market that will softmod their cards to unlock pipes. Either way you go, you will get great performance.
I digress off topic.... sorry.
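For readers following the 6800NU vs. 6600GT discussion in #8, the fillrate/bandwidth claim checks out on paper. The sketch below assumes NVIDIA's reference clocks from late 2004; factory-overclocked boards (such as the BFG OC models mentioned below) will differ:

```python
def fillrate_mpix(pipes, core_mhz):
    """Theoretical pixel fillrate in Mpixels/s (pipes x core clock)."""
    return pipes * core_mhz

def bandwidth_gbs(bus_width_bits, effective_mem_mhz):
    """Theoretical memory bandwidth in GB/s (bus width x effective memory clock)."""
    return bus_width_bits / 8 * effective_mem_mhz / 1000

# GeForce 6800 (NU): 12 pipes @ 325 MHz core, 256-bit bus @ 700 MHz effective
nu_fill, nu_bw = fillrate_mpix(12, 325), bandwidth_gbs(256, 700)

# GeForce 6600 GT (PCIe): 8 pipes @ 500 MHz core, 128-bit bus @ 1000 MHz effective
gt_fill, gt_bw = fillrate_mpix(8, 500), bandwidth_gbs(128, 1000)

print(f"fillrate:  6800NU {nu_fill} vs 6600GT {gt_fill} Mpix/s")    # 3900 vs 4000
print(f"bandwidth: 6800NU {nu_bw:.1f} vs 6600GT {gt_bw:.1f} GB/s")  # 22.4 vs 16.0
print(f"6800NU bandwidth advantage: {nu_bw / gt_bw - 1:.0%}")       # 40%
```

Near-identical theoretical fillrates alongside a ~40% bandwidth gap are consistent with #8's conclusion that HL2 is fillrate-limited on these two cards.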
nserra - Wednesday, November 17, 2004 - link
You didn't use overclocked NVIDIA cards like HardOCP did. That Kyle has the shame to say he used stock clocks; those BFG OC cards are overclocked from the factory. Just 25MHz, but it's something.

Very good review!!! Better than the NVIDIA GeForce 6600GT AGP review, where something was missing.