Fall 2003 Video Card Roundup Part I - ATI's Radeon 9800 XT
by Anand Lal Shimpi & Derek Wilson on October 1, 2003 3:02 AM EST
Posted in: GPUs
The Newcomers
As we briefly mentioned, there are three new products to talk about today – the Radeon 9800 XT, the Radeon 9600 XT and NVIDIA’s NV38.
The XT line of Radeon 9x00 cards is targeted specifically at the very high end of the gaming market. With AMD pushing the Athlon 64 FX and Intel the Pentium 4 Extreme Edition, it’s not too surprising to see even more companies going in this direction. With an ultra-premium part like the Radeon 9800 XT, the profit margins are high and, more importantly, the PR opportunities are huge – claiming the title of world’s fastest desktop GPU never hurts.
The effort required to produce a part like the Radeon 9800 XT is much lower than that of a serious redesign. When making any kind of chip (CPU, GPU, chipset, etc.), the design team is usually given a cutoff point after which no more changes can be made; that is the design that goes into production. However, it is very rare that manufacturers get things right on the first try. Process improvements and the optimization of critical paths within a microprocessor are both time-intensive tasks that require a good deal of experience.
As ATI’s engineers gained more experience and time with the R350 core, they began to see where the limitations on the GPU’s clock speed existed. Remember that a processor can only run as fast as its slowest path, so it makes a great deal of sense to change the layout, optimize the use of transistors and otherwise speed up the slow paths within the GPU. This (oversimplified) process is what ATI and their foundry engineers have been working on, and the results are embodied in the R360 – the core of the Radeon 9800 XT.
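To make the "slowest path" idea concrete, here is a toy sketch. The path names and delays are entirely invented – real timing closure involves millions of paths – and the numbers were simply chosen so the result echoes the 380MHz-to-412MHz jump discussed below:

```python
# Toy illustration of the critical-path idea: a chip can only be clocked
# as fast as its slowest path allows. Path names and delays are invented.

path_delays_ns = {
    "pixel_shader_alu": 2.40,
    "texture_address": 2.63,   # the slowest (critical) path in this example
    "raster_setup": 2.10,
}

critical = max(path_delays_ns, key=path_delays_ns.get)
print(f"critical path: {critical}")
print(f"max clock: {1000.0 / path_delays_ns[critical]:.0f} MHz")  # ~380 MHz

# Re-laying-out the slowest path to shave off 0.2 ns raises the clock
# ceiling for the entire chip, even though every other path is untouched:
path_delays_ns["texture_address"] -= 0.20
print(f"after optimization: {1000.0 / max(path_delays_ns.values()):.0f} MHz")  # ~412 MHz
```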
The Radeon 9800 XT runs at a slightly higher core frequency of 412MHz, quite impressive for ATI’s 0.15-micron chip (yes, this is the same process the original R300 was built on). Keep in mind that the Radeon 9800 Pro ran at 380MHz; this 8% increase in clock speed suggests that ATI is beginning to reach the limits of what it can do at 0.15-micron.
The Radeon 9800 XT receives a boost in memory speed as well, now boasting a 365MHz DDR memory clock (730MHz effective) – an increase of 7% over the original Radeon 9800 Pro and 4% over the 256MB 9800 Pro. ATI was much prouder of the core clock improvements, and rightly so: it is GPU speed we will begin to crave once more shader-intensive games come out.
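For reference, peak memory bandwidth follows directly from these clocks and the Radeon 9800 series’ 256-bit memory bus. The Pro clocks below (350MHz and 340MHz) are the figures implied by the article’s 4% and 7% deltas:

```python
# Peak memory bandwidth implied by the quoted clocks. DDR memory moves
# data twice per clock, and the Radeon 9800 series uses a 256-bit bus.

def peak_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 256) -> float:
    effective_mhz = mem_clock_mhz * 2        # DDR: two transfers per clock
    bytes_per_transfer = bus_width_bits / 8  # bus width in bytes
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9

print(f"9800 XT (365MHz):        {peak_bandwidth_gbs(365):.1f} GB/s")  # ~23.4
print(f"9800 Pro 256MB (350MHz): {peak_bandwidth_gbs(350):.1f} GB/s")  # ~22.4
print(f"9800 Pro 128MB (340MHz): {peak_bandwidth_gbs(340):.1f} GB/s")  # ~21.8
```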
The Radeon 9800 XT has a thermal diode (mounted on-package, but not on-die) with a driver interface that allows the card to automatically increase its core speed when thermal conditions are suitable. The GPU will never drop below its advertised 412MHz clock speed, but it can reach speeds of up to 440MHz as far as we know. The important thing to note here is that ATI fully warranties this overclocking support, an interesting move indeed. Obviously, they only guarantee the overclock when it is performed automatically by the drivers, as they do not rate the chips for running at the overclocked speed under all conditions.
The OverDrive feature, as ATI likes to call it, will be enabled through the Catalyst 3.8 drivers, and we’ll be sure to look into its functionality once the final drivers are made available.
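ATI hasn’t detailed how OverDrive decides when to step the clock up or down. Purely as an illustration, a temperature-driven governor might look something like the sketch below; the trip points, hysteresis band and step size are invented, and only the 412MHz floor and roughly 440MHz ceiling come from ATI:

```python
# Rough sketch of how a temperature-driven governor like OverDrive might
# behave. ATI has not published its algorithm: the thresholds, hysteresis
# band and step size here are invented. Only the guarantees from the text
# are preserved: the core never drops below 412MHz and tops out near 440MHz.

BASE_MHZ, MAX_MHZ, STEP_MHZ = 412, 440, 7
HOT_C, COOL_C = 75, 65   # invented trip points, with hysteresis between them

def next_clock(current_mhz: int, diode_temp_c: float) -> int:
    if diode_temp_c >= HOT_C:
        return max(BASE_MHZ, current_mhz - STEP_MHZ)  # too warm: back off
    if diode_temp_c <= COOL_C:
        return min(MAX_MHZ, current_mhz + STEP_MHZ)   # headroom: step up
    return current_mhz                                # in between: hold

clock = BASE_MHZ
for temp_c in (60, 62, 64, 70, 77, 72, 63):  # simulated diode readings
    clock = next_clock(clock, temp_c)
    print(f"{temp_c}C -> {clock} MHz")
```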
The Radeon 9800 XT will be available in the next month or so, and it will be sold in a 256MB configuration at a price of $499 – most likely taking the place of the Radeon 9800 Pro 256MB.
263 Comments
Jeff7181 - Thursday, October 2, 2003 - link
I think Anand is too worried about creating benchmarks that compare to benchmarks done by other review sites, which is why they had "trouble" benchmarking certain games. I agree, Morrowind would be a good game to benchmark with... I've used it recently to show the differences between AA and AF settings, along with FS2004.
I think what needs to be done in some games like Morrowind is to just play the game for 15 minutes... then tell us what the minimum frame rate was, the average, and the high (see the sketch after this comment). Who cares if it's not replicated EXACTLY each time... after 15 minutes, the average along with the lows and highs should paint a pretty accurate picture.
Also, in my opinion, FS2004 is THE BEST software for comparing AA and AF quality between video cards. All you have to do is disable weather and ATC and save a flight, then load that flight every time you want to take a screenshot. Also, pressing Shift+Z twice puts your frame rate on the screen, so there's no need to use FRAPS.
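The min/average/max reporting this commenter suggests is straightforward to derive from a FRAPS-style frame-time log; a minimal sketch, with invented sample data:

```python
# Minimal sketch of the min/avg/max reporting suggested above, computed
# from a FRAPS-style per-frame time log. The sample data is invented.

frame_times_ms = [16.7, 18.2, 15.9, 40.1, 17.3, 22.8, 16.1]  # one 40ms stutter

fps = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # frames / seconds

print(f"min: {min(fps):.1f} fps")   # dominated by the stutter
print(f"avg: {avg_fps:.1f} fps")
print(f"max: {max(fps):.1f} fps")
```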
Anonymous User - Thursday, October 2, 2003 - link
How about testing old games up to 2048x1536?
Anonymous User - Thursday, October 2, 2003 - link
I suggest adding Tiger Woods 2004 to the suite. Turning up the eye candy is more demanding than one may think, so it would be a good test. But my main motivation is that there appear to be serious driver-related image quality issues with ATI (!) cards (e.g. water reflections).
Anonymous User - Thursday, October 2, 2003 - link
What I would also like to see is test results from ATI and NVIDIA in DCC packages such as 3DStudioMax and Maya. I would like to know if these high-end gaming cards can also handle some animation rendering. Maybe they can't, but it's one man's dream...
Anonymous User - Thursday, October 2, 2003 - link
Good job. You should benchmark it with MORROWIND as well, or maybe GOTHIC 2.
Anonymous User - Thursday, October 2, 2003 - link
And I have a Voodoo2 and it sucks at DX9, what's your point?
Anonymous User - Thursday, October 2, 2003 - link
Sony's PS2 and Xbox never have graphics card issues (because they're purely game consoles, idiot – yeah, I know that), but the game programmers also write their games for that particular console. My question is: why do NVIDIA and ATI have to constantly adapt their drivers to PC games, instead of the games being made compatible with the graphics cards?
Yours sincerely,
Noise
Anonymous User - Thursday, October 2, 2003 - link
I have an ATI 9000 card and I can say that ATI sucks at OpenGL.
Anonymous User - Thursday, October 2, 2003 - link
#163, I believe that FarCry/64-bit/improved graphics is 100% marketing BS.
Anonymous User - Thursday, October 2, 2003 - link
It's a good suite for testing, but one thing I'd really like to see is Far Cry performance on an Athlon 64... From what I've read, the game will use the 64-bit architecture for something graphics-related, and it would be interesting to see how the graphics cards handle this.
If it can't be done now, it may be one to remember for the future...
Also, how well do the 64-bit drivers of both companies perform?