Fall 2003 Video Card Roundup Part I - ATI's Radeon 9800 XT
by Anand Lal Shimpi & Derek Wilson on October 1, 2003 3:02 AM EST - Posted in GPUs
The Newcomers
As we briefly mentioned, there are three new products to talk about today – the Radeon 9800 XT, the Radeon 9600 XT and NVIDIA’s NV38.
The XT line of Radeon 9x00 cards is targeted squarely at the very high end of the gaming market. With AMD pushing the Athlon 64 FX and Intel the Pentium 4 Extreme Edition, it’s not surprising to see more companies going in this direction. With an ultra-premium part like the Radeon 9800 XT, the profit margins are high and, more importantly, the PR opportunities are huge – claiming the title of world’s fastest desktop GPU never hurts.
The effort required to produce a part like the Radeon 9800 XT is much lower than that of a serious redesign. When making any kind of chip (CPU, GPU, chipset, etc.) the design team is usually given a cutoff point after which they cannot make any more changes, and that is the design that goes into production. However, it is very rare that manufacturers get things right on the first try. Process improvements and the optimization of critical paths within a microprocessor are both time-intensive tasks that require a good deal of experience.
Once ATI’s engineers had more time and experience with the R350 core, they began to see where the limitations on the GPU’s clock speed lay; remember that a processor can only run as fast as its slowest path, so it makes a great deal of sense to change the layout and optimize the use of transistors to speed up the slow paths within the GPU. This oversimplified process is what ATI and their foundry engineers have been working on, and the results are embodied in the R360 – the core of the Radeon 9800 XT.
The Radeon 9800 XT is able to run at a slightly higher core frequency of 412MHz – quite impressive for a 0.15-micron chip (yes, this is the same process the original R300 was built on). Keep in mind that the Radeon 9800 Pro ran at 380MHz, and this 8% increase in clock speed suggests ATI is approaching the limits of what it can do at 0.15-micron.
The Radeon 9800 XT receives a boost in memory speed as well, now boasting a 365MHz DDR memory clock (730MHz effective) – an increase of 7% over the original Radeon 9800 Pro and 4% over the 256MB 9800 Pro. ATI was much prouder of the core clock improvements, since we will begin to crave faster GPU speeds once more shader-intensive games come out.
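The percentage gains quoted above are easy to verify. A quick sketch (the XT clocks come from the article; the 9800 Pro memory clocks of 340MHz and 350MHz for the 128MB and 256MB boards are assumed from contemporary specs, as the article doesn't state them):

```python
def pct_gain(new_mhz, old_mhz):
    """Percent increase of new_mhz over old_mhz, rounded to a whole percent."""
    return round((new_mhz - old_mhz) / old_mhz * 100)

# Core clock: 9800 XT (412MHz) vs 9800 Pro (380MHz), both from the article
print(pct_gain(412, 380))  # 8

# Memory clock: 365MHz XT vs assumed 340MHz (128MB Pro) and 350MHz (256MB Pro)
print(pct_gain(365, 340))  # 7
print(pct_gain(365, 350))  # 4
```

The numbers line up with the article's 8%, 7% and 4% figures.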
The Radeon 9800 XT does have a thermal diode (mounted on-package but not on-die) with a driver interface that allows the card to automatically increase its core speed when thermal conditions are suitable. The GPU will never drop below its advertised 412MHz clock speed, but it can reach speeds of up to 440MHz as far as we know. The important thing to note here is that ATI fully warranties this overclocking support, an interesting move indeed. Obviously they only guarantee the overclock when it is performed automatically by the drivers, as they do not rate the chips for running at the overclocked speed under all conditions.
The OverDrive feature, as ATI likes to call it, will be enabled through the Catalyst 3.8 drivers and we’ll be sure to look into its functionality once the final drivers are made available.
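The behavior described above can be sketched as a toy model. Note that the threshold, step size and function names here are invented for illustration; ATI has not published its actual algorithm. Only the 412MHz floor and ~440MHz ceiling come from the article:

```python
BASE_MHZ, MAX_MHZ = 412, 440  # advertised floor and reported ceiling

def overdrive_clock(diode_temp_c, threshold_c=60, step_per_deg=2):
    """Hypothetical OverDrive-style clocking: scale the core clock with
    thermal headroom below a threshold, never dipping under the rated speed."""
    headroom = threshold_c - diode_temp_c
    if headroom <= 0:
        return BASE_MHZ  # too warm: stay at the advertised clock
    return min(MAX_MHZ, BASE_MHZ + headroom * step_per_deg)

print(overdrive_clock(70))  # 412 (hot: no overclock)
print(overdrive_clock(50))  # 432
print(overdrive_clock(30))  # 440 (capped at the ceiling)
```

The key property, matching ATI's description, is that the clock is monotonic in cooling headroom and clamped between the warranted floor and the driver-imposed ceiling.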
The Radeon 9800 XT will be available in the next month or so and it will be sold in 256MB configurations at a price of $499 – most likely taking the place of the Radeon 9800 Pro 256MB.
263 Comments
Anonymous User - Thursday, October 2, 2003 - link
I wouldn't mind seeing Medieval: Total War in the test group. Maybe use a huge battle and replay it using all the different cards. Thanks for including Command and Conquer; it helped me pick my card.
Anonymous User - Thursday, October 2, 2003 - link
Quote:The picture quality in UT2003 has been the reason for some of the criticism against nVidia recently. nVidia has chosen to lower the picture quality in this game where anisotropic filtering is concerned. nVidia's lower picture quality is most apparent in the 51.75 driver when AF is activated in the control panel.
Quote:
The major difference in picture quality in WC3 is the anti-aliasing. For some reason, nVidia's cards seem to have problems with horizontal edges in this game above all else. Check, for instance, the large log in the fire; it looks essentially free of AA.
Quote:
It may be getting rather boring to hear (read), but we cannot lie: ATi's high-quality FSAA wins over nVidia once more. As we can see, Detonator 51.75 is totally out of the game, as AF appears to be disabled.
Quote:
Once more ATi takes the lead; the 9800 Pro keeps up this time too. Detonator 51.75 once again lowers the quality of texture filtering.
Quote:
Radeon kicks some ass in BF1942, nVidia doesn't stand a chance. Once again, Detonator 51.75 lowers the texture quality so much that we don't find the results comparable.
Quote:
ATi rips nVidia to pieces in Tomb Raider. If this is an indication of upcoming DirectX 9 performance (and judging from the Half-Life 2 tests, it seems to be), nVidia won't have a lot to say for the coming 6 months. Detonator 51.75 increases performance, but also causes some strange bugs.
nVidia's present drivers don't allow floating point precision for render targets, etc., resulting in lower quality. Detonator 51.75 also makes the Depth of Field effect run amok: the main character is repeatedly erased, even though the effect is supposed to erase things the farther away they are. As expected, AA works better on ATi's card. Anisotropic filtering also looks better on ATi's card, since this kind of filtering causes "texture aliasing" (floating pixels) on nVidia's cards.
Subjective analysis: There's no question about it, nVidia's card isn't even close to being playable. Despite that, ironically, Tomb Raider is part of nVidia's so called "The Way It's Meant To Be Played" program. I noticed no differences between XT and Pro when playing the game.
Quote:
It's a bit strange to see that ATi's image quality and performance increase when we apply AF through the application instead of from the control panel. On the mountain in the very middle of the picture it's clear that the image quality is once again better on ATi's card.
Subjective analysis: I am very confused by the test results from Jedi Academy. The strange thing is that the game runs a lot better (and looks better) on the ATi card, but in the performance tests it seems as if nVidia beats the 9800 Pro, which doesn't reflect the impression we got when actually playing the game.
Quote:
If we take a look at the 3DMark03 performance with Detonator 51.75, we can see that it has received a good increase. However, we strongly suspect that these optimizations are more or less exclusively application-specific, of a kind that would not apply to a real game. We have therefore chosen to give 51.75 a gray bar, because we don't believe it gives an accurate result.
With the Detonator 44.03 that is "approved" by Futuremark, you can see that nVidia really doesn't have a chance against ATi's cards. nVidia says the benchmark is misleading, but all of the DX9 games so far show that Futuremark actually succeeded in creating a pretty good prophecy with 3DMark03.
If we take a closer look at the even more synthetic Pixel Shader 2.0 test, we see that the FX card's weak spot is DX9.
Quote:
We got very weird results in Aquamark 3. nVidia's Detonator 44.03 shows very low performance, but the driver generates an output that is completely acceptable. Detonator 51.75, on the other hand, has noticeable losses of picture quality in a couple of areas: mist/smoke has been heavily reduced in at least one scene, texture filtering is a lot worse, some things look like they are rendered with lower precision, and finally many lighting effects are a little suspect or missing entirely. Under these circumstances we can only disregard the Detonator 50 results, and in that case ATi has an unbelievable lead over nVidia.
http://www.nordichardware.com/reviews/graphiccard/...
Anonymous User - Thursday, October 2, 2003 - link
I would like to see Battlefield 1942 as a test game, it is one of the more popular games out there right now.
Insomniac - Thursday, October 2, 2003 - link
If it is a Prescott, it isn't any better than the current Northwood P4. I base this on the Athlon 64 article. Anand used a Radeon 9800 Pro 256 MB in it and the only benchmark I see in both is the UT 2003 Flyby. For both articles he used the Catalyst 3.7. So taking the numbers:
2.8 GHz Intel Processor (Prescott whited out): 212
Intel Pentium 4 3.2C: 232.8
Intel Pentium 4 3.0C: 226.3
The 3.0C is faster than the 2.8 by 6.7%
The 3.2C is faster than the 3.0C by 2.9%
I would expect a 2.8C to score closer to 220.
Of course, that is only one benchmark. I wonder what is going on. Throw us some information Anand! :)
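[Editor's note: the commenter's arithmetic does check out. A quick sketch using only the UT2003 Flyby scores quoted in the comment above:]

```python
# UT2003 Flyby scores as quoted by the commenter (frames per second)
scores = {"2.8GHz (Prescott?)": 212.0, "P4 3.0C": 226.3, "P4 3.2C": 232.8}

def speedup_pct(a, b):
    """Percent by which score a exceeds score b, to one decimal place."""
    return round((a - b) / b * 100, 1)

print(speedup_pct(scores["P4 3.0C"], scores["2.8GHz (Prescott?)"]))  # 6.7
print(speedup_pct(scores["P4 3.2C"], scores["P4 3.0C"]))             # 2.9
```

Both results match the 6.7% and 2.9% figures in the comment.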
Anonymous User - Thursday, October 2, 2003 - link
I did.
Anonymous User - Thursday, October 2, 2003 - link
The Inq noticed this. Did anybody else? Highlight the following words:
featured in our test suite (Half Life 2 & Gunmetal).
Our test bed was configured as follows:
2.8GHz Intel Processor
512MB DDR400
Intel 875P Motherboard
then it will show up as:
featured in our test suite (Half Life 2 & Gunmetal).
Our test bed was configured as follows:
2.8GHz Intel Processor Prescott
512MB DDR400
Intel 875P Motherboard
meh...
Anonymous User - Thursday, October 2, 2003 - link
lol, I don't think some of you morons have read the entire 9 pages of posts.
you slackers ! :)
Icewind - Thursday, October 2, 2003 - link
I'm excited to get my 9800XT. I got a 4200 in Sept 2002 cause UT2k3 and BF1942 ran like crap on my first-gen GeForce 3. It only cost me $178 and I modded it for better cooling and ran it at near 4400 speeds. Now I'm quite tired of the slowdowns in BF1942 and UT2k3 as I want higher res and my effects cranked up, and I also want to run Half-Life 2 in all its glory. I work my ass off at work and if I feel like getting a $500 card, that is my choice, period.
Anonymous User - Thursday, October 2, 2003 - link
<b>How about adding one more video card? a GF4ti4x00</b> SO MANY people have these cards and I for one would like to see a comparison of what I would have to gain from upgrading.