Half-Life 2 Performance Benchmark Preview
by Anand Lal Shimpi on September 12, 2003 12:34 AM EST, posted in GPUs
More on Mixed-Mode for NV3x
We briefly mentioned the Mixed Mode of operation for NV3x GPUs that Valve implemented in Half-Life 2, but there is much more to it than just a special NV3x code path. In fact, the mixed mode NV3x code path was really only intended for the GeForce FX 5900 Ultra (NV35). The mainstream FX chips (5200/5600) require a slightly different code path.
Here you can see the 40% performance boost NVIDIA gets from the special NV3x code path.
The GeForce FX 5600 (NV31) uses a code path that is internally referred to as dx82; this path is a combination of DX9 (pixel shader 2.0) and DX8.1 (pixel shader 1.4) code, and thus doesn't look as good as what you'll see on the 5900 Ultra.
Although the 5900 Ultra performs reasonably well with the special NV3x mixed mode path, the 5600 and 5200 cards do not perform well at all. Valve's recommendation to owners of 5600/5200 cards is to run the DX8 (pixel shader 1.4) code path in order to receive playable performance under Half-Life 2. The performance improvement gained by dropping to the DX8 code path is most pronounced on the GeForce FX 5200, although there is a slight improvement on the 5600 as well, as you can see in the chart below.
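To make the hierarchy of paths concrete, here is a minimal, hypothetical sketch of that per-card selection logic, written in Python for readability. The card names and path labels come from the text above, but the function and data structure are illustrative assumptions, not Valve's actual code:

```python
# Hypothetical illustration of Half-Life 2's per-GPU code path selection.
# The card-to-path mapping follows the article; nothing here is Valve's code.

# Rendering paths, from highest to lowest image quality.
DX9_PATH = "dx9"           # full pixel shader 2.0, FP32 precision
MIXED_PATH = "nv3x_mixed"  # special NV3x path, FP16/FP32 mixed precision
DX82_PATH = "dx82"         # blend of PS 2.0 and PS 1.4 shaders
DX8_PATH = "dx8"           # DX8-class shaders only

# Paths Valve recommends for the NV3x parts discussed above.
RECOMMENDED_PATH = {
    "GeForce FX 5900 Ultra": MIXED_PATH,
    "GeForce FX 5600": DX8_PATH,  # dx82 runs, but DX8 is the playable choice
    "GeForce FX 5200": DX8_PATH,
}

def select_code_path(gpu_name: str) -> str:
    """Return the recommended rendering path; full DX9 for everything else."""
    return RECOMMENDED_PATH.get(gpu_name, DX9_PATH)

if __name__ == "__main__":
    for gpu in ("GeForce FX 5900 Ultra", "GeForce FX 5600", "Radeon 9800 Pro"):
        print(f"{gpu}: {select_code_path(gpu)}")
```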
The sacrifices that you encounter by running either the mixed mode path or the DX8 path are obviously visual. The 5900 Ultra, running in mixed mode, will exhibit some banding effects as a result of a loss in precision (FP16 vs. FP32), but still looks good - just not as good as the full DX9 code path. There is a noticeable difference between this mixed mode and the dx82 mode, as well as the straight DX8 path. For example, you'll notice that shader effects on the water aren't as impressive as they are in the native DX9 path.
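The banding itself is a direct consequence of quantization: FP16 has a 10-bit mantissa, so a smooth FP32 gradient collapses onto far fewer representable values. Here is a quick, hedged demonstration, with NumPy standing in for the GPU; the numbers are illustrative, not measurements of the FX hardware:

```python
# Illustrates why an FP32 -> FP16 precision drop produces visible banding:
# count how many distinct values survive when a smooth gradient is
# quantized to half precision. NumPy stands in for the GPU here.
import numpy as np

# A smooth 0.0-1.0 gradient at full (FP32) precision.
gradient = np.linspace(0.0, 1.0, 100_000, dtype=np.float32)

fp32_levels = len(np.unique(gradient))                     # ~100,000 distinct steps
fp16_levels = len(np.unique(gradient.astype(np.float16)))  # only a few thousand

print(f"FP32 distinct levels: {fp32_levels}")
print(f"FP16 distinct levels: {fp16_levels}  <- fewer levels = visible banding")
```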
Are the visual tradeoffs perceptible? Yes. The native DX9 path clearly looks better than anything else, especially the DX8.0/8.1 modes.
111 Comments
atlr - Friday, September 12, 2003 - link
Quote from http://www.anandtech.com/video/showdoc.html?i=1863...
"The Radeon 9600 Pro manages to come within 4% of NVIDIA's flagship, not bad for a ~$100 card."
Anyone know where a ~$100 9600 Pro is sold? I thought this was a ~$200 card.
Anonymous User - Friday, September 12, 2003 - link
Time to load up on ATI stock :)
Anonymous User - Friday, September 12, 2003 - link
Quoted from Nvidia.com: "Microsoft® DirectX® 9.0 Optimizations and Support
Ensures the best performance and application compatibility for all DirectX 9 applications."
Oops, not this time around...
Anonymous User - Friday, September 12, 2003 - link
#74 - No, D3 isn't a DX9 game; it's OGL. What it shows is that the FX series isn't bad - they just don't do so well under DX9. If you stick primarily to OpenGL games and run your DX games under the 8.1 spec, the FX should perform fine. It's the DX9 code that the FXes seem to really struggle with.
Anonymous User - Friday, September 12, 2003 - link
#74: I have commonly heard this blamed on a bug in an older release of the CATALYST drivers that were used in the Doom3 benchmark. It is my understanding that if the benchmark were repeated with the 3.7 (RELEASED) drivers, the ATI cards would perform much better.
#75: I believe this goes back to prior instances where Nvidia has claimed that some new driver would increase performance dramatically to get it into a benchmark, and then never released the driver for public use. If this happened, the benchmark would be unreliable, as it could not be repeated by an end-user with similar results.
Also, the Det50 drivers from Nvidia do not have a working fog system. It has been hinted that this could be intentional, to improve performance. Either way, I saw a benchmark today (forgot where) that compared the Det45s to the beta Det50s. The 50s did improve performance in 3DMark03, but nowhere near the 73% gap in performance seen in HL2.
Anonymous User - Friday, September 12, 2003 - link
Because Gabe controls how representative the HL2 beta is of the final HL2 product, but he cannot control how representative the Nvidia Det50 beta is of the final Det50s. And besides that, there have been rumours of "optimisations" in the new Det50s.
Anonymous User - Friday, September 12, 2003 - link
How is it that Gabe can recommend not running benchmarks on a publicly unavailable driver or hardware, yet the game itself is unavailable? Seems a little hypocritical...
Anonymous User - Friday, September 12, 2003 - link
I didn't have time to look into this, but can someone enlighten me as to why the 5900 Ultra outperformed the 9800 PRO in the Doom 3 benchmarks we saw awhile back... is that not using DX9 as well? If I am way off the mark here or am even wrong on which outperformed which, go easy on the flames! Thanks
Anonymous User - Friday, September 12, 2003 - link
"Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though."My bad, should have looked at ATI first. I guess I'm thinking about the 8500. Either way, I would still go 9600 Pro, especially given that it is cheaper than a 9500 non-pro.
Anonymous User - Friday, September 12, 2003 - link
"The 9600 fully supports DX9 whereas the 9500 does not."Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though.