Half-Life 2: Episode One Performance
Episode One of the new Half-Life 2 series makes use of recent Source engine updates to include Valve's HDR technology. While other developers have shipped HDR implementations that are incompatible with antialiasing (even on ATI hardware), Valve put a high value on building an HDR implementation that everyone can use with whatever settings they want. Consistency of experience usually isn't a priority for developers intent on pushing the bleeding edge of technology, so we are very happy to see Valve going down this path.
We use the built-in timedemo feature to benchmark the game. Our timedemo consists of a protracted rocket launcher fight featuring plenty of debris and pyrotechnics. The Source engine's timedemo feature is more like the nettimedemo of id's Doom 3 engine, in that it plays back more than just the graphics. In fact, Valve includes some fairly intensive diagnostic tools that will reveal almost everything about every object in a scene. We haven't yet found a good use for this in the context of reviewing computer hardware, but we're keeping our options open.
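For readers who want to script similar runs, here is a minimal sketch of how a timedemo pass can be automated. The paths and demo name are placeholders, and launch options such as -condebug and timedemoquit should be verified against your particular Source build before trusting the numbers.

```python
import re
import subprocess
from pathlib import Path

# Hypothetical paths and demo name -- adjust for your own install.
HL2_EXE = Path(r"C:\Program Files\Steam\steamapps\common\Half-Life 2\hl2.exe")
GAME_DIR = HL2_EXE.parent / "episodic"
DEMO_NAME = "rocketfight"  # recorded earlier in-game with: record rocketfight

def run_timedemo(width: int, height: int) -> float:
    """Play the timedemo at the given resolution and return the average fps."""
    log = GAME_DIR / "console.log"
    if log.exists():
        log.unlink()  # start each run with a clean console log

    # -condebug mirrors console output to console.log; timedemoquit plays the
    # demo, prints a performance summary, and exits the game.
    subprocess.run([
        str(HL2_EXE),
        "-game", "episodic", "-novid", "-condebug",
        "-width", str(width), "-height", str(height),
        "+timedemoquit", DEMO_NAME,
    ], check=True)

    # The summary line looks roughly like: "4124 frames 16.50 seconds 249.94 fps ..."
    match = re.search(r"([\d.]+)\s+fps", log.read_text(errors="ignore"))
    if match is None:
        raise RuntimeError("no timedemo summary found in console.log")
    return float(match.group(1))

if __name__ == "__main__":
    print(f"average fps: {run_timedemo(1600, 1200):.1f}")
```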
The highest visual quality settings possible were used, including the "reflect all" setting, which is not enabled by default. Antialiasing was left disabled for this test, and anisotropic filtering was set at 8x. While the Source engine is known for delivering great framerates on almost any hardware setup, we find the game isn't as enjoyable if it isn't running at a minimum of 30fps. This is very attainable even at the highest resolution we tested on most cards, and thus our target framerate is a little higher in this game than in others.
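For reference, those quality settings can also be forced through Source console variables at launch. The cvar names below are our best recollection rather than gospel (treat them as assumptions and double-check them in the developer console); they can be appended to the launch command in the sketch above.

```python
# Console variables matching the settings above, passed as launch arguments.
# Cvar names are assumptions for this engine build -- verify them in-game.
QUALITY_ARGS = [
    "+mat_antialias", "0",                # antialiasing disabled
    "+mat_forceaniso", "8",               # 8x anisotropic filtering
    "+r_waterforceexpensive", "1",        # high-quality water
    "+r_waterforcereflectentities", "1",  # "reflect all" instead of "reflect world"
]
```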
All current generation midrange cards except the X1600 XT are capable of running Half-Life 2: Episode One with all the eye candy enabled at 1920x1440. This is good news for owners of large, affordable wide-format display panels who like to play Half-Life 2 based games. Those with older generation cards (or displays that max out lower) may need to drop the resolution in order to enjoy their HL2:Ep1 experience.
Looking at lower resolutions, the X1900 XT runs into a CPU-bound situation at 800x600 with the Core 2 Extreme, but at 250fps we don't really feel the loss. Every card we tested is essentially playable at 1600x1200 in this game, but those with cards at the lower end of the spectrum may prefer to drop the resolution a little further and use the "reflect world" rather than the "reflect all" setting to avoid hiccups. The X1900 GT is the best bang for the buck once again, though we'll have to reserve some judgement for the overclocked 7900 GT until we take a look at it. Not even Valve's traditionally ATI-friendly engine can make the X1600 XT look like a good buy, though.
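As an aside, a quick way to spot that kind of CPU-bound plateau in a resolution sweep is to flag adjacent resolutions where the framerate barely moves. A rough sketch, using illustrative numbers rather than our exact measurements:

```python
# Illustrative fps numbers (not our exact measurements) for an X1900 XT-class
# card across the tested resolutions.
results = {
    (800, 600): 250.0, (1024, 768): 243.0, (1280, 1024): 201.0,
    (1600, 1200): 152.0, (1920, 1440): 118.0,
}

TARGET_FPS = 30.0     # our playability floor for HL2: Episode One
PLATEAU_RATIO = 0.95  # fps within 5% of the next-lower resolution => CPU bound

# Walk resolutions from fewest to most pixels, comparing neighboring results.
ordered = sorted(results.items(), key=lambda kv: kv[0][0] * kv[0][1])
for (res, fps), (next_res, next_fps) in zip(ordered, ordered[1:]):
    if next_fps >= fps * PLATEAU_RATIO:
        print(f"{res[0]}x{res[1]} -> {next_res[0]}x{next_res[1]}: "
              f"nearly flat ({fps:.0f} vs {next_fps:.0f} fps), likely CPU bound")

playable = sorted(res for res, fps in results.items() if fps >= TARGET_FPS)
print("resolutions above the 30 fps floor:", playable)
```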
74 Comments
gmallen - Friday, August 11, 2006
Most of the PC enthusiast population interested in mid-range cards is still running AGP motherboards (this is based on sales of PCIe motherboards vs. AGP motherboards). Where are these cards?

Josh7289 - Friday, August 11, 2006
They don't exist.
arturnowp - Friday, August 11, 2006
Hi. It's written that all cards in Oblivion were tested with HDR lighting, which the X800 GTO doesn't support. I think your results are misleading. The same with SC: Chaos Theory...
BTW: Who plays Oblivion with Actor Fade at 20%, Item Fade at 10%, and Object Fade at 25%? You get better graphics and performance by setting those options to 50-60% and turning off grass, which consumes a lot of power and doesn't look good. In foliage it's better to see your enemies from a greater distance, the same with a horse ;-)
arturnowp - Friday, August 11, 2006
OK, there's a note about SC: Chaos Theory, but all in all the conclusions are misleading: "Owners of the X800 GTO may have a little more life left in their card depending on how overclocked the card is, but even at stock clocks, it might be wise to hang on for another product cycle if possible", where the GeForce 6600 GT performs on par with the X800 GTO. It would be better to exclude the X800 GTO from the charts or mark it as an SM 2.0 card. Better yet, the GeForce 6600 GT should be tested in SM 2.0 mode...

nv40 - Friday, August 11, 2006
Don't know why: http://www.xbitlabs.com/articles/video/display/pow...
Some of the differences between the tests are so large that they almost shocked me.
For instance:
The 7900 GT with the 84.21 driver and an FX-60 runs 54 fps average at 1600x1200 with 4xAA/16xAF in X-bit labs' test.
The 7900 GT with the 91.33 driver and an X6800 manages just 35 fps average at 1600x1200 with only 4xAA at AnandTech.
A problem with 91.33? The Intel 975X? The X6800? NVIDIA?
That's more than a 40% performance difference, despite the X6800 being far superior to the FX-60.
coldpower27 - Friday, August 11, 2006
They probably aren't running the same timedemo sequences.

nv40 - Friday, August 11, 2006
Maybe... but there's only a 9% difference for the X1900 GT (41 vs. 38), and the 7900 GT definitely performed much worse in AnandTech's tests than at X-bit labs in general.
There's no telling which is correct, but if both are right, the conclusions can probably be drawn as below:
1. Driver problem: 91.33 is much slower than 84.21 (an NVIDIA cheat, or a 91.33 problem)
2. CPU problem: the X6800 is much inferior to the FX-60 in games (ridiculous, and far from true in every test)
3. Platform problem: NVIDIA cards perform much worse on the Intel chipset (975X)
Sharky974 - Friday, August 11, 2006
I agree. I clearly remember X-bit declaring the 7900 GT the winner of the vast majority of benches vs. the X1900 GT. In fact, overall the X1900 GT wasn't warmly received. I really feel this deserves some looking into.
For example, I'll have to go look, but I think Firing Squad also showed the X1900 GT as inferior to the 7900 GT.
As it stands now, it's as if Anand's platforms are somehow ATI-biased; on the other hand, I believe X-bit's platform is NVIDIA-biased. X-bit reviews nearly always show NVIDIA winning.
Sharky974 - Friday, August 11, 2006
http://www.firingsquad.com/hardware/sapphire_radeo...

I started on the first page of benches.
As one glaring example:
Firing Squad: Quake 4, 1280x1024, 4xAA/8xAF: 7900 GT - 87.2, X1900 GT - 60.6
http://www.firingsquad.com/hardware/sapphire_radeo...
Anand: Quake 4, 1280x1024, 4xAA: 7900 GT - 45.1, X1900 GT - 49.8
http://images.anandtech.com/reviews/video/roundups...
With similar settings, FS has the 7900 GT getting nearly double the frames Anand does. The X1900 GT also gets significantly more in the FS review, from 49 to 60 fps, but nowhere near the change the 7900 GT sees; the net effect is that the X1900 GT ekes out a win at Anand but loses by nearly 27 fps at FS.
The X1900 GT is definitely a better card than I had remembered, though, even in the FS benches.
Also, FS was using an FX-57 and Anand a much more powerful CPU, making the results all the more puzzling.
In addition to some of the other suggestions, I'd question drivers. FS was using older drivers on both cards, since it is an older review. Perhaps NVIDIA's drivers have seen a large performance decrease, or ATI's a similar increase? This seems fairly unlikely, though, as I don't think you normally get huge differences from driver to driver.
Unless NVIDIA really was cheating on 16-bit filtering as the Inq claimed a while back, and they fixed it, causing a massive performance decrease? :) Again though, that suggestion is made half-jokingly.
This definitely needs a lot of looking into, I feel. Anand's results are quite different from others around the web at first blush.
JarredWalton - Friday, August 11, 2006
Levels can make a huge difference in performance. For example, Far Cry has segments that get about 80 FPS max on any current CPU (maybe higher with an overclocked Core 2 Extreme...), but other areas of the game run at 150+ FPS on even a moderate CPU like a 3500+. I don't have a problem providing our demo files, but some of them are quite large (Q4 is about 130 MB if I recall). SCCT, FEAR, and X3 provide a reference that anyone can compare to, if they want. The only other thing is that ATI driver improvements are certainly not unlikely, especially in Quake 4.