Half-Life 2: Episode One Performance
Episode One of the new Half-Life 2 series makes use of recent Source engine updates to include Valve's HDR technology. While others have shipped HDR implementations that rule out antialiasing (even on ATI cards), Valve put a high value on building an HDR implementation that everyone can use, with whatever settings they want. Consistency of experience usually isn't a priority for developers pushing the bleeding edge of technology, so we are very happy to see Valve go down this path.
We use the built-in timedemo feature to benchmark the game. Our timedemo consists of a protracted rocket launcher fight and features plenty of debris and pyrotechnics. The Source engine's timedemo feature is more like the nettimedemo of id's Doom 3 engine, in that it plays back more than just the graphics. In fact, Valve includes some fairly intensive diagnostic tools that will reveal almost everything about every object in a scene. We haven't found a good use for this in the context of reviewing computer hardware, but we're keeping our options open.
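For those who want to benchmark their own systems the same way, everything is driven from the developer console: record a demo with `record <name>`, stop it with `stop`, and play it back with `timedemo <name>`, which prints a frame count and average framerate when it finishes. The Python sketch below pulls those averages out of a saved console log; the log path and the exact format of the summary line are our assumptions about this build of Source, so adjust the regular expression to match what your console actually prints.

```python
import re
import statistics

# Minimal sketch: pull the average fps out of a console log produced by
# one or more "timedemo <name>" runs. The summary line format
# ("<frames> frames <seconds> seconds <fps> fps") and the log location
# are assumptions, not documented Source behavior.
SUMMARY = re.compile(r"(\d+)\s+frames\s+([\d.]+)\s+seconds\s+([\d.]+)\s+fps")

def timedemo_results(log_path: str) -> list[float]:
    """Return the fps figure from every timedemo summary in the log."""
    results = []
    with open(log_path) as log:
        for line in log:
            match = SUMMARY.search(line)
            if match:
                results.append(float(match.group(3)))
    return results

if __name__ == "__main__":
    runs = timedemo_results("console.log")  # hypothetical log path
    if runs:
        print(f"{len(runs)} runs, mean {statistics.mean(runs):.1f} fps")
```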
The highest visual quality settings possible were used, including the "reflect all" setting, which is not enabled by default. Antialiasing was left disabled for this test, and anisotropic filtering was set at 8x. While the Source engine is notorious for delivering great framerates on almost any hardware setup, we find the game isn't as enjoyable unless it's running at a minimum of 30fps. This is attainable even at the highest resolution we tested on most cards, and thus our target framerate is a little higher in this game than in others.
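For reference, something close to our test settings can be applied from an autoexec.cfg. The cvar names and values below are our best guesses at the console equivalents of the in-game options for 2006-era Source; treat them as assumptions and verify them in the console (the `find` command will list matching cvars) before relying on them.

```
// Sketch of an autoexec.cfg approximating our test settings.
// Cvar names/values are assumed, not confirmed by Valve; verify in-game.
mat_hdr_level 2               // full HDR where the hardware supports it
mat_antialias 1               // antialiasing off (single sample)
mat_forceaniso 8              // 8x anisotropic filtering
r_waterforceexpensive 1       // expensive water rendering
r_waterforcereflectentities 1 // "reflect all": entities in water reflections
```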
All current-generation midrange cards except the X1600 XT are capable of running Half-Life 2: Episode One with all the eye candy enabled at 1920x1440. This is good news for those at the upper end of the large, affordable wide-format display market who like to play Half-Life 2 based games. Those with older generation cards (or displays that max out at a lower resolution) may need to run at a lower resolution in order to enjoy their HL2: Ep1 experience.
Looking at lower resolutions, the X1900 XT runs into a CPU-bound situation at 800x600 with the Core 2 Extreme, but at 250fps we can hardly complain. Every card we tested is essentially playable at 1600x1200 in this game, but those with cards at the lower end of the spectrum may prefer to drop the resolution a little further and use the "reflect world" setting rather than "reflect all" to avoid hiccups. The X1900 GT is the best bang for the buck once again, though we'll reserve judgment until we take a look at the overclocked 7900 GT. Not even Valve's traditionally ATI-friendly engine can make the X1600 XT look like a good buy, though.
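"Bang for the buck" here is nothing more exotic than a price/performance ratio. A minimal sketch of that arithmetic, with placeholder numbers rather than our actual benchmark results or current street prices:

```python
# Placeholder numbers purely to illustrate the comparison; substitute
# real benchmark results and street prices before drawing conclusions.
def fps_per_dollar(avg_fps: float, street_price: float) -> float:
    """The price/performance ratio behind a 'bang for the buck' call."""
    return avg_fps / street_price

cards = {"Card A": (90.0, 230.0), "Card B": (110.0, 320.0)}
for name, (fps, price) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.3f} fps per dollar")
```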
74 Comments
DerekWilson - Thursday, August 10, 2006
look again :-) It should be fixed.

pervisanathema - Thursday, August 10, 2006
You post hard-to-read line graphs of the benchmarks that show the X1900XT crushing the 7900GT with AA/AF enabled. Then you post easy-to-read bar charts of an O/Ced 7900GT barely eking out a victory over the X1900XT in some benchmarks, and you forget to turn on AA/AF.
I am not accusing you guys of bias, but you make it very easy to draw that conclusion.
yyrkoon - Sunday, August 13, 2006
Well, I cannot speak for the rest of the benchmarks, but owning a 7600GT AND Oblivion, I find the Oblivion benchmarks inaccurate. My system:
Asrock AM2NF4G-SATA2
AMD AM2 3800+
2GB Corsair DDR2 6400 (4-4-4-12)
eVGA 7600GT KO
The rest is pretty much irrelevant. With this system, I play @ 1440x900 with high settings, similar to the benchmark settings, and the lowest I get is 29 FPS under heavy combat (lots of NPCs on screen, and attacking me). Average FPS in town: 44 FPS; wilderness: 44 FPS; dungeon: 110 FPS. I'd also like to note that compared to my AMD 3200+ XP / 6600GT system, the game is much more fluid/playable.
Anyhow, keep up the good work, guys; I just find your benchmarks wrong from my perspective.
Warder45 - Thursday, August 10, 2006
The type of chart used just depends on whether they tested multiple resolutions or a single resolution. Similar to your complaint, I could say they are biased towards ATI by showing how the X1900XT had better marks across all resolutions tested, yet only testing the 7900GT OC at one resolution, not giving it the chance to prove itself.