Looking Back: ATI's Catalyst Drivers Exposed
by Ryan Smith on December 11, 2005 3:22 PM EST - Posted in GPUs
It’s no secret in the hardware world that good software is often just as important as good hardware. The best processor, the best TV tuner, and even the best sound card can only be as good as the software and drivers backing it up. Even a small change in one critical piece of code can make a significant difference in both the performance and the sales of a piece of hardware.
Above all, however, this concept is embodied in the realm of video cards, where over the years we have been spoiled by promises of “A performance improvement between 17 and 24% is noticed in Jedi Knight II” and “up to 25% performance improvement in popular consumer and professional applications”. These days, it’s not just common to see GPU makers find ways to squeeze more performance out of their parts - it’s expected. Finishing the design of a GPU and launching it are just the first steps of a much longer process of maximizing the performance of a part, a process that can quite literally take years.
The flexible nature of software, however, has caused a significant shift in the marketing strategies of GPU makers: the war is not over at launch time, but continues throughout the entire product cycle and into the next one as new optimizations and bug fixes are worked into drivers, keeping the performance landscape in constant motion. Just because a side did not win the battle at launch doesn’t mean that it can’t still take the lead later, and just because a side is winning now doesn’t mean that it will keep its win.
We have seen on more than one occasion that our benchmarks have been turned upside down and inside out, with cases such as ATI’s Catalyst 5.11 drivers suddenly giving ATI a decisive win in OpenGL games when it had been soundly defeated just a driver version before. However, we have also seen this pressure to win drive all sides to various levels of dishonesty, hoping to capture the lead with driver optimizations that make a product look faster on a benchmark table but literally look worse on a monitor. The Quake3 and 3DMark 2003 incidents, among others, have shown that there is a fine line between optimizing and cheating, and that as a cost of the flexibility of software, we may sometimes see that line crossed.
That said, when the optimizations, the tweaks, the bug fixes, and the cheats are all said and done, just how much faster has all of this work made a product? Are these driver improvements really as substantial as they are made out to be, or is much of the excitement over only minor gains? Do we have any way of predicting what future drivers for new products will do?
Today, we set out to answer these questions by taking a look back at a piece of hardware whose time has come and is nearly gone: ATI’s R300 GPU and the Radeon 9700 Pro.
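To make the question concrete, here is a minimal sketch of the arithmetic behind claims like “up to 25% performance improvement”: given frame rates for the same game under successive driver releases, compute each release’s gain over the launch-era driver. The Catalyst version names below are real, but the FPS values are invented purely for illustration, not measured results.

```python
# Hypothetical FPS results for a single game across driver releases.
# Driver names are real Catalyst versions; the numbers are invented
# to illustrate the calculation only.
fps_by_driver = {
    "Catalyst 3.0": 61.2,
    "Catalyst 4.3": 64.8,
    "Catalyst 5.11": 70.1,
}

# Use the earliest (launch-era) driver as the baseline.
baseline_name, baseline_fps = next(iter(fps_by_driver.items()))

for name, fps in fps_by_driver.items():
    gain = (fps / baseline_fps - 1) * 100  # percent change vs. baseline
    print(f"{name}: {fps:.1f} fps ({gain:+.1f}% vs. {baseline_name})")
```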
Comments
Ryan Smith - Sunday, December 11, 2005 - link
You should see the cooler attached, it sure sounds like a 757. Anyhow, good catch, thanks.
ss284 - Sunday, December 11, 2005 - link
I think this article might have been a bit more meaningful if some newer generation games were tested, like Half-Life 2 and Far Cry.
ElJefe - Sunday, December 11, 2005 - link
lol yes I thought the same. I was like eh? bf2 and half-life2 and doom3. Or quake 4 maybe. (even though most gamers are not on that bandwagon yet, bf2 for first person is kinda king still)
Cygni - Tuesday, December 13, 2005 - link
Older drivers are going to have issues with newer games. That's what's talked about in the article. If you are running Cat 1.0's with FEAR, it's going to go ape shit... FEAR wasn't even around when those drivers came out. By using older games, they can limit this factor and make it a pure performance comparison.
ksherman - Sunday, December 11, 2005 - link
:(
vshah - Sunday, December 11, 2005 - link
Mouseover makes the first image disappear for me in Firefox and IE. Will there be an NVIDIA version of this?
kerynitian - Monday, December 12, 2005 - link
I would definitely be interested in seeing how NVIDIA and their driver improvements in the NV40 line compare to the marks put up by ATI in this article...
coldpower27 - Sunday, December 11, 2005 - link
Yes, it might be interesting to do one with a 6800 GT/Ultra, to see if there have been improvements in extracting performance out of NV40 technology over its now 18 months of life. I think we were in the early 61.xx drivers when NV40 came out?
nts - Monday, December 12, 2005 - link
With this article testing the R300, they would probably test NVIDIA's NV30 (FX) cards.
coldpower27 - Sunday, December 11, 2005 - link
Actually, I believe that is ~20 months instead of 18.