Half-Life 2 Performance Benchmark Preview
by Anand Lal Shimpi on September 12, 2003 12:34 AM EST - Posted in GPUs
ATI & Valve - Defining the Relationship
The first thing that comes to mind when you see results like this is a cry of foul play: that Valve has unfairly optimized their game for ATI's hardware, and thus it does not perform well on NVIDIA's hardware. Although it is the simplest accusation, it is actually one of the less frequent ones we've seen thrown around.
During Gabe Newell's presentation, he insisted that Valve has not optimized or doctored the engine to produce these results. It also doesn't make much sense for Valve to develop an ATI-specific game: the majority of the market owns NVIDIA-based graphics cards, so it is in Valve's best interest to make the game run as well as possible on NVIDIA GPUs.
Gabe mentioned that the developers spent five times as much time optimizing the special NV3x code path (mixed mode) as they did optimizing the generic DX9 path (the one ATI's DX9 cards use). Clearly, a serious attempt was made to get the game to run as well as possible on NVIDIA hardware.
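To give a sense of what that optimization work involves: NVIDIA's mixed mode substitutes faster 16-bit math for full-precision math wherever the image won't visibly suffer, and every substitution has to be validated. The sketch below (our illustration, not Valve's shader code; the color weights and step count are invented) shows how the same arithmetic that holds steady at 32-bit precision drifts at 16-bit:

# A minimal sketch (not Valve's code) of why "mixed mode" is delicate:
# the same shader-style math drifts at 16-bit precision but holds up at 32-bit.
import numpy as np

def accumulate(dtype, steps=200):
    """Repeatedly apply a scale-and-normalize step, as a pixel shader might."""
    v = np.array([0.30, 0.59, 0.11], dtype=dtype)  # hypothetical color weights
    for _ in range(steps):
        # each iteration rounds to the working precision, so error accumulates
        v = (v * dtype(1.001)) / dtype(np.sqrt(np.sum(v * v)))
    return v.astype(np.float64)

reference = accumulate(np.float64)
for dtype in (np.float16, np.float32):
    err = np.max(np.abs(accumulate(dtype) - reference))
    print(f"{np.dtype(dtype).name}: max error {err:.2e}")

The 16-bit result lands several orders of magnitude further from the 64-bit reference than the 32-bit one does, which is why deciding where 16-bit is safe, instruction by instruction, eats so much developer time.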
To those who fault Valve for spending so much time and effort optimizing for the NV3x family: remember that Valve is in the business of selling games, and with the market the way it is, purposefully crippling one graphics manufacturer's hardware in favor of another's would not make much business sense.
Truthfully, we believe that Valve made an honest attempt to get the game running as well as possible on NV3x hardware but simply ran into other unavoidable issues (which we will get to shortly). One could attack the competence of Valve's developers, but we are not qualified to do so; anyone who has developed something comparable in complexity to Half-Life 2's Source engine may feel free.
According to Gabe, these performance results were the reason that Valve aligned themselves more closely with ATI. As you probably know, Valve has a fairly large OEM deal with ATI that will bring Half-Life 2 as a bundled item with ATI graphics cards in the future. We'll be able to tell you more about the cards with which it will be bundled soon enough (has it been 6 months already?).
With these sorts of deals there's always money (e.g., marketing dollars) involved, and we're not debating its existence in this case; but as far as Valve's official line is concerned, the deal came after the performance discovery.
Once again, we're not questioning Valve in this sense and honestly don't see much reason to, as it wouldn't make any business sense for them to cripple Half-Life 2 on NVIDIA cards. As always, we encourage you to draw your own conclusions based on the data we've provided.
Moving on…
111 Comments
dvinnen - Friday, September 12, 2003 - link
#31: I know what I said. DX9 doesn't require 32-bit. It's not in the spec, so you couldn't write a shader that uses more than 24-bit precision.

XPgeek - Friday, September 12, 2003 - link
Well #26, if the next gen of games does need 32-bit precision, then the tides will once again be turned, and all the "my ATI is so much faster than your NVIDIA" types will have to just suck it up and buy another new card, whereas the GeForce FX owners will still be plugging along. By then, who knows, maybe DX10 will support 32-bit precision on the NVIDIA cards better... BTW, I'm still running my GF3 Ti 500, so regardless, I'll have crappy performance. But I also buy cards from the company I like, that being Gainward/Cardex NVIDIA-based boards. No ATI for me, and no Intel either. Why? Because it's my choice. So it may be slower, whoopty-doo!
For all I know, HL2 could run like crap on AMD CPUs as well, so I'll be in good shape then with my XP 2400+ and GF3.
Sorry, I know my opinions don't matter, but I put them here anyhow.
Buy what you like; don't just follow the herd... unless you like having your face in everyone's ass.
Anonymous User - Friday, September 12, 2003 - link
#28: Not 24-bit, 32-bit.

Anonymous User - Friday, September 12, 2003 - link
Yeah, like mentioned above, what about whether or not AA and AF were turned on in these tests? Do you talk about it somewhere in the article? I can't believe it's not mentioned, since this site is the one that made a detailed (and excellent) presentation of the differences between ATI's and NVIDIA's AA and AF back in the day.
Strange that your benchmarks appear to be silent on the matter. I assume they were both turned off.
Anonymous User - Friday, September 12, 2003 - link
>>thus need full 32-bit precision<<

Huh? Wha?
This is an interesting can of worms. If, in a few months' time, ATI sticks to 24-bit or cannot develop 32-bit precision, the tables will have turned on the current situation, but even more so, because there would be no workaround (or optimization).
Will ATI users in the future accuse Valve of sleeping with NVIDIA because their cards cannot shade at 32-bit precision?
Will NVIDIA users claim that ATI cards are "non-compliant with DirectX 9"? Will ATI users respond that 24-bit precision is the only acceptable DirectX 9 standard, and that Valve is a traitor?
Will Microsoft actually force manufacturers to bloody well wait and follow the standard?
And finally, who did shoot Colonel Mustard in the Dining Room?
Questions, Questions.
dvinnen - Friday, September 12, 2003 - link
#26: It means it can't cheat and use 16-bit registers to do it; it needs a full 24 bits, so it would waste the rest of the register.

Anonymous User - Friday, September 12, 2003 - link
#26: That was in reference to the FX cards. They can do 16-bit or 32-bit precision. ATI cards do 24-bit precision, which is the DX9 standard. 24-bit is the DX9 standard because it's "good enough": it's much faster than 32-bit and much better looking than 16-bit, so 16-bit will wear out sooner. Of course, someday 24-bit won't be enough either, but there's no way of knowing when that'll be.
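(To put rough numbers on "good enough": here's a back-of-the-envelope sketch, ours rather than the commenter's, converting each format's mantissa width into decimal digits of precision. The mantissa sizes are the commonly cited ones for these shader formats.)

# Decimal digits of precision for each shader float format,
# estimated from mantissa bits * log10(2).
import math

formats = {  # stored mantissa bits (the implicit leading bit adds one)
    "FP16 (NVIDIA partial precision)": 10,
    "FP24 (ATI, the DX9 minimum)": 16,
    "FP32 (NVIDIA full precision)": 23,
}
for name, bits in formats.items():
    digits = (bits + 1) * math.log10(2)
    print(f"{name}: ~{digits:.1f} decimal digits")

That works out to roughly 3 digits for FP16, 5 for FP24, and 7 for FP32, which is the gap the comment is describing.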
Anonymous User - Friday, September 12, 2003 - link
Valve says no benchmarks on Athlon 64! :-/ Booo!
Quote:
http://www.tomshardware.com/business/20030911/inde...
"Valve was able to heavily increase the performance of the NVIDIA cards with the optimized path but Valve warns that such optimizations won't be possible in future titles, because future shaders will be more complex and will thus need full 32-bit precision."
The new ATI cards only have 24-bit shader precision!
So would that leave ALL current ATI cards with no way to run future Valve titles?
Perhaps I do not understand the technology fully, can someone elaborate on this?
Anonymous User - Friday, September 12, 2003 - link
I agree with #23: in terms of money-making power, the ATI/Valve combo is astounding. ATI's design is superior, as we can see, but the point is that ATI is going to get truckloads of money and recognition for this. It's a good day to have stock in ATI; let's all thank them for buying ArtX!

Anonymous User - Friday, September 12, 2003 - link
I emailed Gabe about my 9600 Pro, but he didn't have to do all this just for me :D
I love it.