Fall 2003 Video Card Roundup Part I - ATI's Radeon 9800 XT
by Anand Lal Shimpi & Derek Wilson on October 1, 2003 3:02 AM EST - Posted in GPUs
The Newcomers
As we briefly mentioned, there are three new products to talk about today – the Radeon 9800 XT, the Radeon 9600 XT and then NVIDIA’s NV38.
The XT line of Radeon 9x00 cards is specifically targeted at the very high end of the gaming market. With AMD pushing the Athlon 64 FX and Intel the Pentium 4 Extreme Edition, it’s not too surprising to see even more companies moving in this direction. With an ultra-premium part like the Radeon 9800 XT the profit margins are high and, more importantly, the PR opportunities are huge – claiming the title of world’s fastest desktop GPU never hurts.
The effort required to produce a part like the Radeon 9800 XT is much lower than that of a serious redesign. When making any kind of chip (CPU, GPU, chipset, etc.) the design team is usually given a cutoff point after which no more changes can be made to the design, and that is the design that goes into production. However, it is very rare that manufacturers get things right on the first try. Process improvements and optimization of critical paths within a microprocessor are both time-intensive tasks that require a good deal of experience.
Once ATI’s engineers had spent more time with the R350 core, they began to see where the limitations on the GPU’s clock speed lay; remember that a processor can only run as fast as its slowest path, so it makes a great deal of sense to change the layout, optimize the use of transistors and so on to speed up the slow paths within the GPU. This (admittedly oversimplified) process is what ATI and their foundry engineers have been working on, and the results are embodied in the R360 – the core of the Radeon 9800 XT.
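The slowest-path rule can be illustrated with a toy calculation. Note that the path names and delay figures below are entirely invented for this sketch, chosen only so the numbers land near the 380MHz and 412MHz clocks discussed here:

```python
# Hypothetical illustration: a chip's maximum clock is set by its slowest
# (critical) path. Path names and delays are invented for this example.
path_delays_ns = {
    "alu": 2.1,
    "texture_fetch": 2.4,
    "shader_dispatch": 2.63,  # the critical path before optimization
}

def max_clock_mhz(delays_ns):
    # The cycle time must cover the slowest path, so fmax = 1 / max(delay).
    return 1000.0 / max(delays_ns.values())

print(f"before: {max_clock_mhz(path_delays_ns):.0f} MHz")

# Shaving only the critical path (relayout, transistor sizing) raises the
# clock of the entire chip, even though the other paths are untouched.
path_delays_ns["shader_dispatch"] = 2.43
print(f"after:  {max_clock_mhz(path_delays_ns):.0f} MHz")
```

The point of the sketch is that speeding up the one slowest path lifts the whole chip's clock, which is why this kind of respin is so much cheaper than a redesign.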
The Radeon 9800 XT is able to run at a slightly higher core frequency of 412MHz, quite impressive for ATI’s 0.15-micron chip (yes, this is the same process that the original R300 was based on). Considering that the Radeon 9800 Pro ran at 380MHz, this 8% increase in clock speed suggests that ATI is approaching the limits of what it can do at 0.15-micron.
The Radeon 9800 XT receives a boost in memory speed as well, now boasting a 365MHz DDR memory clock (730MHz effective) – an increase of 7% over the original Radeon 9800 Pro and 4% over the 256MB 9800 Pro. ATI was much prouder of the core clock improvement, since we will begin to crave faster GPU speeds once more shader-intensive games arrive.
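Worked out explicitly, the quoted gains check out, assuming the 9800 Pro’s published memory clocks of 340MHz for the 128MB card and 350MHz for the 256MB card:

```python
# Quick check of the percentage increases quoted above (all clocks in MHz).
core_9800pro, core_xt = 380, 412
mem_9800pro, mem_xt = 340, 365   # 128MB 9800 Pro memory clock vs. the XT's
mem_9800pro_256 = 350            # 256MB 9800 Pro memory clock

core_gain = (core_xt - core_9800pro) / core_9800pro * 100
print(f"core: +{core_gain:.0f}%")  # roughly the 8% quoted in the article

mem_gain = (mem_xt - mem_9800pro) / mem_9800pro * 100
mem_gain_256 = (mem_xt - mem_9800pro_256) / mem_9800pro_256 * 100
print(f"memory: +{mem_gain:.0f}% vs 9800 Pro, +{mem_gain_256:.0f}% vs 256MB 9800 Pro")
```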
The Radeon 9800 XT does have a thermal diode (mounted on-package, not on-die) with a driver interface that allows the card to automatically increase its core speed when thermal conditions permit. The GPU will never drop below its advertised 412MHz clock speed, but it can reach speeds of up to 440MHz as far as we know. The important thing to note here is that ATI fully covers this overclocking under warranty, an interesting move indeed. Obviously the warranty only applies when the overclock is performed automatically by the drivers, as ATI does not rate the chips for running at the overclocked speed under all conditions.
The OverDrive feature, as ATI likes to call it, will be enabled through the Catalyst 3.8 drivers and we’ll be sure to look into its functionality once the final drivers are made available.
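Based purely on the behavior described above, an OverDrive-style driver loop might look something like the following sketch. The temperature threshold and step size are invented, as ATI has not published how Catalyst 3.8 actually makes this decision; only the 412MHz floor and the roughly 440MHz ceiling come from the article:

```python
# Rough sketch of an OverDrive-style clock decision, based only on the
# behavior described above. Threshold and step values are invented;
# ATI's actual Catalyst 3.8 logic is not public.
BASE_MHZ = 412  # advertised clock: the GPU never drops below this
MAX_MHZ = 440   # highest automatic overclock we are aware of

def overdrive_clock(gpu_temp_c, current_mhz=BASE_MHZ, safe_temp_c=70, step=7):
    """Return the next core clock given the on-package diode reading."""
    if gpu_temp_c < safe_temp_c:
        # Thermal headroom available: step up, capped at the maximum.
        return min(current_mhz + step, MAX_MHZ)
    # Running warm: step back down, but never below the advertised clock.
    return max(current_mhz - step, BASE_MHZ)
```

The key properties the sketch preserves are the two the article guarantees: the clock is clamped to the advertised 412MHz on the low side, so the automatic overclock can only add performance, never take it away.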
The Radeon 9800 XT will be available in the next month or so and it will be sold in 256MB configurations at a price of $499 – most likely taking the place of the Radeon 9800 Pro 256MB.
263 Comments
WooDaddy - Wednesday, October 1, 2003 - link
Good eye #93. Evan actually mentioned it earlier too. Anand/Derek mentioned Nvidia being better at Doom 3. Y'all sneaky son-of-a-guns must be beta testing it in the background or sumthin. I know Carmack said it ran better but I betcha y'all got your hands on a copy. Go ahead. Admit it! Quit holding out on us. We wanna see the benchmark! I got a shiny nickel with your name on it if you put it out there...
Overall great review. I sorta agree that 1024x768 is kinda like the 640x480 of yesteryear now, but most of us can gather what 1280 will run at. For the fanboys/girls, "You should've included counterstrike and hexen 2. waah!" Honestly, I know how long it can take to set up and benchmark those tests in a _controlled_ environment. Do you guys use automated software testers?
Question though. Even though FFXI ran slow, is it still playable? I don't want to believe that it runs that slow all the time.
Anonymous User - Wednesday, October 1, 2003 - link
k, I'm going to ignore everyone who's bitching because they didn't read it and thus haven't already twigged that IQ comparisons will be in part 2.
Re the PCI slot thing, doesn't that apply equally to two-slot cards? If putting a PCI card next to the AGP slot on a one-slot card is bad, surely putting a PCI card in the first slot after a two-slot card isn't exactly smart either? You still lose an extra PCI slot over what you would have with a one-slot card.
Anonymous User - Wednesday, October 1, 2003 - link
What about Max Payne 2? I'd like to see it in the next benches.
Anonymous User - Wednesday, October 1, 2003 - link
Please use Battlefield 1942 in benchmarking in the future! It's an awesome game and has some very nice and demanding mods like Desert Combat. Please use Desert Combat in benchmarking too. Try flying around, blowing up stuff and checking if the framerate ever goes to unacceptable levels. Gamers rarely care about average or maximum fps; if the game is running 50 fps or 150 fps it doesn't matter, but if it ever runs as sluggishly as <10 fps in the heat of battle, it is very annoying. Just tell us in your own words which graphics card brings playable framerates!
Anonymous User - Wednesday, October 1, 2003 - link
I would like to see Battlefield 1942 added into the benchmarks. Especially since it is such a popular game and they have Battlefield Vietnam coming out before too long. Thanks.
Anonymous User - Wednesday, October 1, 2003 - link
Thanks for benching so many games. Since I only play a few games I look for performance in those games in particular. My games were covered and I really appreciate that.
Anonymous User - Wednesday, October 1, 2003 - link
I see the fanbois are out in full force. :/
Anonymous User - Wednesday, October 1, 2003 - link
Overall a sad review: using drivers that are not out for everyone to use, no IQ tests to see if the drivers are cheating at all, and then comments like "From these two graphs, it seems like NVIDIA is the clear winner, but in watching this demo run so many times, we noticed that the NVIDIA cards were running choppier than the ATI cards, and we again had some image quality questions we need to answer" – so that pretty much does it for me. I'll take this with a grain of salt until they rip apart the drivers and make sure Nvidia is not up to any "optimizations". I've lost all trust in Nvidia. I hope the NV40 can turn this around.
Anonymous User - Wednesday, October 1, 2003 - link
#69, just because you run games at 1280x1024 doesn't make you the majority representation of gamers. Most gamers run at 1024x768. Most computer users' resolution is 1024x768 – like 55% or something like that.
Anonymous User - Wednesday, October 1, 2003 - link
The "Prescott" string on Page 4 is white?! Just select the 3 last lines from "2.8 GHz ...". Has it been white all the time?