Half-Life 2 Performance Benchmark Preview
by Anand Lal Shimpi on September 12, 2003 12:34 AM EST - Posted in GPUs
Improving Performance on NVIDIA
If the hypotheses mentioned on the previous page hold true, then there may be some ways around these performance issues. The most obvious is through updated drivers. NVIDIA does have a new driver release on the horizon, the Detonator 50 series, but Valve instructed us not to use these drivers as they do not render fog in Half-Life 2. In fact, Valve was quite insistent that we use only publicly available drivers on publicly available hardware, which is one reason you won't see Half-Life 2 benchmarks in our upcoming Athlon 64 review.
Future drivers may be the key to unlocking higher performance on NVIDIA hardware, but Gabe issued the following warning:
"I guess I am encouraging skepticism about future driver performance."
Only time will tell if updated drivers can close the performance gap, but as you are about to see, it is a decent-sized gap.
It is also worth noting that the shader-specific workarounds Valve implemented for NVIDIA will not immediately translate to other games based on Half-Life 2's Source engine. Remember that these restructured shaders are specific to the shaders used in Half-Life 2, which won't necessarily be the shaders used in a different game built on the same engine.
Gabe also cautioned that reverting to 16-bit floating point values will only become more of an issue going forward, as "newer DX9 functionality will be able to use fewer and fewer partial precision functions." The theory, however, is that by the time this happens, NV4x will be upon us and will hopefully have fixed the problems that we're seeing today.
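To make the precision issue concrete, the short sketch below approximates what happens when a 32-bit shader value is squeezed into a 16-bit "half" register. It is only an illustration of the rounding involved (it keeps the 11 significant bits an FP16 mantissa provides and ignores range limits and denormals); the function name and approach are ours, not anything from Valve's shaders or NVIDIA's drivers.

```cpp
#include <cmath>
#include <cstdio>

// Approximate the value a pixel shader would see if a 32-bit float were stored
// in a 16-bit half-precision register: keep 11 significant bits (1 implicit +
// 10 stored mantissa bits), rounded to nearest. Overflow, underflow and
// denormals are ignored -- this only visualizes the precision loss.
float to_half_precision(float v) {
    int exp;
    float mant = std::frexp(v, &exp);             // v = mant * 2^exp, |mant| in [0.5, 1)
    mant = std::round(mant * 2048.0f) / 2048.0f;  // round to 11 significant bits
    return std::ldexp(mant, exp);
}

int main() {
    const float samples[] = { 0.1f, 1000.1f, 4097.0f };
    for (float v : samples) {
        // Small color-style values survive nearly unchanged; larger values
        // snap to visibly coarser steps (4097 becomes 4096, for example).
        std::printf("fp32 %11.6f  ->  ~fp16 %11.6f\n", v, to_half_precision(v));
    }
    return 0;
}
```

For color math, an error in the fifth or sixth decimal place is invisible, which is why many of Valve's partial-precision substitutions cost nothing visually; it is in calculations that span a wider numeric range, such as texture coordinate math, that the coarser FP16 steps can surface as banding or shimmering.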
NVIDIA's Official Response
Of course, NVIDIA has their official PR response to these issues, which we've published below:
During the entire development of Half-Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.
We're confused as to why Valve chose to use Release 45 (Rel. 45), because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.
Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.
Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have. The drop of Half-Life 2 that we currently have is more than 2 weeks old. It is not a cheat or an over optimization. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.
The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half-Life 2.
We are committed to working with Gabe to fully understand.
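NVIDIA's point about code paths echoes what Valve described on the previous pages: the Source engine has a generic DX9 path plus a hand-tuned mixed mode for GeForce FX hardware. As a purely hypothetical sketch of that idea (the type names and capability flags below are invented for illustration and are not Valve's actual code), path selection boils down to a capability check:

```cpp
#include <iostream>
#include <string>

// Hypothetical illustration of per-hardware shader path selection, loosely
// modelled on what the article describes: a generic DX9 path, a mixed
// partial-precision path for GeForce FX, and a DX8 fallback. Invented for
// illustration; not Valve's actual Source engine code.
enum class ShaderPath { DX8, DX9_MixedPrecision, DX9_Full };

struct GpuCaps {
    std::string name;
    bool supports_ps20;      // can run PS 2.0 (DirectX 9) pixel shaders
    bool fast_fp32_shaders;  // runs full-precision PS 2.0 at acceptable speed
};

ShaderPath choose_shader_path(const GpuCaps& gpu) {
    if (!gpu.supports_ps20)
        return ShaderPath::DX8;                // DX8-class hardware
    if (!gpu.fast_fp32_shaders)
        return ShaderPath::DX9_MixedPrecision; // hand-tuned FP16/PS 1.4 substitutions
    return ShaderPath::DX9_Full;               // the generic, unmodified DX9 path
}

int main() {
    for (const GpuCaps& gpu : { GpuCaps{"Radeon 9800 Pro", true, true},
                                GpuCaps{"GeForce FX 5900 Ultra", true, false},
                                GpuCaps{"GeForce4 Ti 4600", false, false} }) {
        std::cout << gpu.name << " -> path "
                  << static_cast<int>(choose_shader_path(gpu)) << "\n";
    }
}
```

The catch, as Gabe noted, is that the mixed-mode substitutions live in Half-Life 2's own shaders, so a different Source engine game would have to repeat that tuning work for its own shaders.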
111 Comments
Anonymous User - Friday, September 12, 2003 - link
==="full 32-bit would be required" not 24-bit. So that leaves all ATI cards out in the cold.===By the time full 32-bit becomes standard (probably with DX10 in 2-3 years) there will be NEW cards that make current cards look like sh!t. ATi will have DX10 cards for under $100, same as nVidia and their 5200. People have been upgrading their PC's for new games for YEARS! Only an [nv]IDIOT would attempt to use an old card for new games and software (TNT2 for Doom3? NOT!).
Anonymous User - Friday, September 12, 2003 - link
Funny that you guys think nVidia will be still "plugging along" with the GFFX if the DX spec changes to 32bit... you _do_ know what happens to the GFFX when it's forced to run 32bit precision don't you? You'd get faster framerates by drawing each frame by hand on your monitor with a sharpie.
Pete - Friday, September 12, 2003 - link
#23, the second quote in the first post here may be of interest: http://www.beyond3d.com/forum/viewtopic.php?t=7839... Note the last sentence, which I surrounded by ***'s.
"nVidia has released the response as seen in the link. Particularly interesting, however, is this part of the e-mail sent to certain nVidia employees (this was not posted at the given link):
'We have been working very closely with Valve on the development of Half Life 2 and tuning for NVIDIA GPU's. And until a week ago had been in close contact with their technical team. It appears that, in preparation for ATI's Shader Days conference, they have misinterpreted bugs associated with a beta version of our release 50 driver.
You also may have heard that Valve has closed a multi-million dollar marketing deal with ATI. Valve invited us to bid on an exclusive marketing arrangement but we felt the price tag was far too high. We elected not to participate. ***We have no evidence or reason to believe that Valve's presentation yesterday was influenced by their marketing relationship with ATI.***'"
If this document is indeed real, nV themselves told their own employees Gabe's presentation wasn't skewed by Valve's marketing relationship with ATi.
Anonymous User - Friday, September 12, 2003 - link
Link please #38
Anonymous User - Friday, September 12, 2003 - link
LOL! 19, I saw that too. Looks like I'll be replacing my nVidia 'the way it's meant to be played in DX8 because our DX9 runs like ass, and we still sell it for $500+ to uninformed customers' card with an ATi Radeon. Thanks for the review Anand; it will be interesting to see the AA/AF benchmarks, but I have a pretty good idea of who will win those as well.
Anonymous User - Friday, September 12, 2003 - link
>>>>>>>ANYONE ELSE CATCH THE FOLLOWING IN THE ARTICLE<<<<<<<<<<<<<<<
""One thing that is also worth noting is that the shader-specific workarounds for NVIDIA that were implemented by Valve, will not immediately translate to all other games that are based off of Half-Life 2's Source engine. Remember that these restructured shaders are specific to the shaders used in Half-Life 2, which won't necessarily be the shaders used in a different game based off of the same engine.""
So I guess the nvidia fan boys won't be able to run their $500 POS cards with Counterstrike 2 since it will probably be based on the HL2 engine.
buhahahaha
>>>>>>>>>>>>>>>>>>>>>>>>><<<<<<<<<<<<<<<<<<<
Anonymous User - Friday, September 12, 2003 - link
Valve specifically said "full 32-bit would be required" not 24-bit. So that leaves all ATI cards out in the cold.
Pete - Friday, September 12, 2003 - link
#23, I believe you're inferring far too much from ATi's HL2 bundling. Check TechReport's article on Gabe's presentation, in which Gabe is noted as saying Valve chose ATi (in the bidding war to bundle HL2) because their cards quite obviously performed so much better (and look better doing it--keep in mind, as Anand said, all those nVidia mixed modes look worse than pure DX9).
In short, Valve doesn't need to do much to please others, as they're the one being chased for the potentially huge-selling Half-Life 2. Everyone will be sucking up to them, not the other way around. And it wouldn't do for Valve to offer nV the bundle exclusive, have consumers expect brilliant performance from the bundled FX cards, and get 12fps in DX9 on their DX9 FX card or 30fps on their $400+ 5900U. That would result in a lot of angry customers for Valve, which is a decidedly bad business move.
People will buy HL2 regardless. Valve's bundling of HL2 with new cards is just an extra source of income for them, and not vital to the success of HL2 in any way. Bundling HL2 will be a big coup for an IHV like ATi, which requires boundary-pushing games like HL2 to drive hardware sales. Think of the relationship in this way: it's not that ATi won the bidding war to bundle HL2, but that Valve *allowed* ATi to win. Valve was going to get beaucoup bucks for marketing tie-ins with HL2 either way, so it's in their best interests to find sponsorships that present HL2 in the best light (thus apparently HL2 will be bundled with ATi DX9 cards, not their DX8 ones).
You should read page 3 of Anand's article more closely, IMO. Valve coded not to a specific hardware standard, but to the DX9 standard. ATi cards run standard DX9 code much better than nV. Valve had to work extra hard to try to find custom paths to allow for the FX's weaknesses, but even that doesn't bring nV even with ATi in terms of performance. So ATi's current DX9 line-up is the poster-child for HL2 almost by default.
We'll see what the Det50's do for nV's scores and IQ soon enough, and that should indicate whether Gabe was being mean or just frank.
Anonymous User - Friday, September 12, 2003 - link
#33 To be pedantic, the spec for DX9 is 24bit minimum; it has never been said by Microsoft that it was 24bit and nothing else, 24bit is just a minimum. Just as 640x480 is a minimum. That doesn't make 1024x768 non-standard.
But considering you are right, and 24 bit is a rock solid standard, doesn't that mean that Valve in the future will violate the DX9 spec in your eyes? Does that not mean that ATI cards will be left high and dry in the future? After all, there will be no optimizations allowed/able?
32bit is the future, according to Valve after all.
Nvidia may suck at doing it, but at least they can do it.
XPgeek - Friday, September 12, 2003 - link
edit, post #32 - should read, "my ATi is so faster than YOUR nVidia"