Fall 2003 Video Card Roundup Part I - ATI's Radeon 9800 XT
by Anand Lal Shimpi & Derek Wilson on October 1, 2003 3:02 AM EST - Posted in GPUs
The Newcomers
As we briefly mentioned, there are three new products to talk about today – the Radeon 9800 XT, the Radeon 9600 XT and then NVIDIA’s NV38.
The XT line of Radeon 9x00 cards is specifically targeted at the very high end of the gaming market. With AMD's Athlon 64 FX and Intel's Pentium 4 Extreme Edition, it's not too surprising to see even more companies going in this direction. With an ultra-premium part like the Radeon 9800 XT, the profit margins are high and, more importantly, the PR opportunities are huge – claiming the title of world's fastest desktop GPU never hurts.
The effort required to produce a part like the Radeon 9800 XT is much lower than that of a serious redesign. When making any kind of chip (CPU, GPU, chipset, etc.), the design team is usually given a cutoff point after which no more changes can be made, and that is the design that goes into production. However, it is very rare that manufacturers get things right on the first try. Process improvements and the optimization of critical paths within a microprocessor are both time-intensive tasks that require a good deal of experience.
Once ATI's engineers had spent more time with the R350 core, they began to see where the limits on the GPU's clock speed lay; remember that a processor can only run as fast as its slowest path, so it makes a great deal of sense to change the layout, resize transistors and so on in order to speed up the slow paths within the GPU. This (oversimplified) process is what ATI and their foundry engineers have been working on, and the results are encompassed in the R360 – the core of the Radeon 9800 XT.
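To make the "slowest path" point concrete, here is a minimal sketch with made-up path delays (the numbers are ours for illustration, not ATI's) showing how the attainable clock is set by the single slowest path, and how trimming only that path raises the ceiling for the whole chip:

```python
# Illustrative only: the path delays below are invented, not ATI data.
# A chip's maximum clock is set by its slowest (critical) path, so shaving
# delay off that one path raises the ceiling for the entire GPU.

path_delays_ns = {
    "pixel_shader_alu": 2.63,   # hypothetical critical path
    "texture_address":  2.41,
    "triangle_setup":   2.20,
}

critical_delay_ns = max(path_delays_ns.values())
max_clock_mhz = 1000.0 / critical_delay_ns      # period in ns -> frequency in MHz
print(f"Critical path {critical_delay_ns} ns -> max clock ~{max_clock_mhz:.0f} MHz")

# Re-layout / transistor-sizing work on only the critical path...
path_delays_ns["pixel_shader_alu"] = 2.43
new_max_clock_mhz = 1000.0 / max(path_delays_ns.values())
print(f"After optimizing that path  -> max clock ~{new_max_clock_mhz:.0f} MHz")
```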
The Radeon 9800 XT is able to run at a slightly higher core frequency of 412MHz, quite impressive for ATI's 0.15-micron chip (yes, this is the same process the original R300 was built on). Keep in mind that the Radeon 9800 Pro ran at 380MHz, and you'll see that this 8% increase in clock speed is beginning to reach the limits of what ATI can do at 0.15-micron.
The Radeon 9800 XT receives a boost in memory speed as well, now boasting a 365MHz DDR memory clock (730MHz effective) – an increase of 7% over the original Radeon 9800 Pro and 4% over the 256MB 9800 Pro. ATI was far more proud of the core clock improvements, as we will begin to crave faster GPU speeds once more shader-intensive games arrive.
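For reference, here is the quick arithmetic behind those percentages. The core clocks come straight from the article; the 340MHz/350MHz memory clocks for the two Radeon 9800 Pro variants are the commonly cited figures implied by the quoted 7%/4% increases:

```python
def pct_increase(new_mhz: float, old_mhz: float) -> float:
    """Percentage increase of new_mhz over old_mhz."""
    return (new_mhz / old_mhz - 1) * 100

# Core clocks from the article; 9800 Pro memory clocks (340/350MHz) are the
# commonly cited figures implied by the 7%/4% numbers above.
print(f"Core:   380 -> 412 MHz = +{pct_increase(412, 380):.1f}%")
print(f"Memory: 340 -> 365 MHz = +{pct_increase(365, 340):.1f}% (vs. 128MB 9800 Pro)")
print(f"Memory: 350 -> 365 MHz = +{pct_increase(365, 350):.1f}% (vs. 256MB 9800 Pro)")
print(f"Effective DDR data rate: {365 * 2} MHz")
```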
The Radeon 9800 XT does have a thermal diode (mounted on-package, not on-die) with a driver interface that allows the card to automatically increase its core speed when thermal conditions are suitable. The GPU will never drop below its advertised 412MHz clock speed, but it can reach speeds of up to 440MHz as far as we know. The important thing to note here is that ATI fully warrants this overclocking support, an interesting move indeed. Obviously, they only guarantee the overclock when it is performed automatically by the drivers, as they do not rate the chips for running at the overclocked speed under all conditions.
The OverDrive feature, as ATI likes to call it, will be enabled through the Catalyst 3.8 drivers and we’ll be sure to look into its functionality once the final drivers are made available.
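ATI has not published the thresholds OverDrive uses, but conceptually it behaves like a simple thermally gated clock governor. The sketch below is purely illustrative: the temperature cut-offs and the intermediate step are invented, and only the 412MHz floor and roughly 440MHz ceiling come from the article.

```python
# Conceptual sketch of a thermally gated clock governor in the spirit of OverDrive.
# The temperature thresholds and the intermediate 425MHz step are invented for
# illustration; only the 412MHz floor and ~440MHz ceiling come from the article.

BASE_CLOCK_MHZ = 412   # advertised clock: the card never drops below this
MAX_CLOCK_MHZ = 440    # upper bound OverDrive can reach, as far as we know

def overdrive_clock(diode_temp_c: float) -> int:
    """Pick a core clock from the on-package thermal diode reading."""
    if diode_temp_c < 60:        # plenty of thermal headroom (hypothetical cut-off)
        return MAX_CLOCK_MHZ
    elif diode_temp_c < 75:      # moderate headroom (hypothetical cut-off)
        return 425
    else:                        # running hot: hold the guaranteed base clock
        return BASE_CLOCK_MHZ

for temp_c in (52, 68, 81):
    print(f"{temp_c} C -> {overdrive_clock(temp_c)} MHz")
```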
The Radeon 9800 XT will be available in the next month or so and it will be sold in 256MB configurations at a price of $499 – most likely taking the place of the Radeon 9800 Pro 256MB.
263 Comments
Anonymous User - Wednesday, October 1, 2003 - link
Jesus #29, reread what you wrote: "And yes, they might have had more time as they did not benchmark NV38. However that they did not get NV38 makes this review even more suspicious."
Uh, duh. And I'm sorry for being rude, I just get somewhat annoyed when people make ridiculous accusations and comments in general.
Anonymous User - Wednesday, October 1, 2003 - link
If I admit mistyping something, then it looks like this: Oops, forgot an "s" on the word test and a "t" in the last sentence. But now let me continue whining...
Anonymous User - Wednesday, October 1, 2003 - link
A benchmark suggestion: BF1942 please.
Anonymous User - Wednesday, October 1, 2003 - link
#28 "Since you admit to mistyping your previous response" WELL I DID NOT MISTYPE IT, YOU MISREAD IT. AND: 24 hours are enough for me to take the NV38 and also run Tomb Raider: AOD, Shadermark 2.0 and Aquamark3/Halo with AA/AF. Just exclude less interesting tests for that.
BTW: You should consider being polite. It will be less hard to respect you then.
Anonymous User - Wednesday, October 1, 2003 - link
You missed the point #23, and it makes you look stupid. Since you admit to mistyping your previous response, as I said before, Anandtech JUST received the NV38 within the last 24 hours. To claim that this is somehow suspicious makes you look ignorant based on that fact alone, and the fact that there's nothing "else" to make this review look suspicious at all. I think Anand did a great job, and certainly better than every other web site except perhaps B3D. And if you can't wait for a review, you're just a freaking whiner who probably isn't even going to be purchasing a high-end video card and so really has nothing better to do with his time but bitch and moan. Guys like me, who are seriously considering a 9800 XT (well actually, I'm thinking a 9800 non-Pro now), are who Anand is writing for mostly. You, well, you're just a whiner apparently.
Also, #25, I disagree with you on the unreleased driver bit. The 52 beta drivers are supposed to resemble the final Det 50s very closely according to Anandtech, so it makes zero sense to imply that using 52-series beta drivers is somehow not right. Also, exactly what DX9 titles is Anandtech going to test with? They don't have HL2, just like every other site, and what other DX9 games are out currently, even for web sites? None. Tomb Raider is the ONLY real game that AT should have included, which is an odd omission I admit, but nothing more. DX7/DX8 is the most prevalent standard used in games today; it would be idiotic not to include mostly these types of games in a review.
Anonymous User - Wednesday, October 1, 2003 - link
Concerning Doom3... Indeed, Carmack has written specific lower-precision code paths for NVidia.
However, that does not mean it's impossible to compare ATI and NVidia in Doom3 directly. You can do that by forcing the NVidia cards to run the same generic code path that will be used by all non-NVidia cards.
Carmack has done that himself, and indicated that the FX cards then achieve about half the performance of the Radeons.
That's completely in line with the other new games and benchmarks.
Anonymous User - Wednesday, October 1, 2003 - link
#25 Perfectly right
Anonymous User - Wednesday, October 1, 2003 - link
I like the idea of taking more games for benchmarking. However, it has become very clear by now that GeForceFX cards do great on DX8 but are lousy at DX9 games. Therefore, you need to be very careful about which games you choose to benchmark. It seems that the games tested are mostly DX8-like. That simply gives a wrong picture to people who want to buy a new card now, at a moment when all new games have many DX9 features. They will be disappointed when they start to play these new games.
Furthermore, we simply know that NVidia has been cheating in almost all (benchmark) games with their drivers.
The last review on THG for Aquamark was very clear. Only drivers that were released before the benchmark came out rendered correctly. ALL drivers after that reduce image quality, or simply don't render stuff at all.
And Aquamark isn't the only one. This isn't an isolated incident; this happens in almost all games.
And image corruption in beta drivers certainly isn't limited to benchmarking games. Lots of others suffer too.
At this point in time, you simply CANNOT ignore these facts. It is totally unacceptable to use unreleased NVidia drivers without an extremely thorough investigation to check where the new cheats are.
Do any of you really think that a driver optimization can produce double performance? Dream on!
Just look at Aquanox. The only driver that produces correct images has half the performance of the Radeons. (Which is exactly what you'd expect from experience with all other DX9-like games.)
The 45.23 has a convenient 'bug', and just happens to get double performance. The 51.75 has another less obvious 'bug', and also just happens to get the same double performance.
Do I have ANY reason to believe that, somehow, the unreleased beta 52.14 drivers don't have 'bugs' that happen to double the performance?
Come on... nobody can be that naive....
If there's anything we've learned in the last half year, it's that:
1) NVidia is at this moment completely unreliable
2) Driver optimizations that give more than a few percent performance increase don't optimize, but simply cheat.
If you want to show benchmark results on FX cards, you should use drivers that you can be really confident don't cheat. It is totally unacceptable to use unreleased beta drivers. It's NOT enough to say you're going to look into image quality in the future!!!
Anonymous User - Wednesday, October 1, 2003 - link
On the additional benchmark Q: BF1942 please
Anonymous User - Wednesday, October 1, 2003 - link
#22 For you again: "I don't need to wait for complete reviews on other sites. And yes, they (THE OTHER SITES) might have had more time as they (THE OTHER SITES) did not benchmark NV38. However, that they (THE OTHER SITES) did not get NV38 makes this review (ANANDTECH'S) even more suspicious." And yes, Shadermark 2.0 is no game. But it might have shown that the NV38's Pixel Shader 2.0 is inferior to the Radeon XT's. And this WILL be important for performance under many DirectX 9.0 games like Half-Life 2.
And I am not Natoma. I do not even post regularly on hardware sites. And I am no idiot or fanboy either.