Fall 2003 Video Card Roundup - Part 3: ATI's Radeon 9600 XT
by Anand Lal Shimpi & Derek Wilson on October 15, 2003 10:26 AM EST
Posted in: GPUs
The definitive Fall Refresh
After NVIDIA released the TNT2 Ultra, we saw the first incarnation of the now-common 6-month product cycle. The strategy was the same one used to dethrone 3dfx, and it is based on a very simple principle: parallel design teams. If you have three design teams (one working on the current-generation product, one on the 6-month follow-up, and one on the next-generation solution), then, assuming all teams work efficiently, you should be able to maintain a stream of GPU releases at 6-month intervals.
To make the job a bit easier, you only work on inventing new architectures every 12 months, giving GPU design engineers a little break from their hectic lifestyle. But in order to remain competitive you have to have a product every 6 months, so in the time between architectures you simply refresh your current-generation architecture. A refresh is generally a higher-clocked GPU, made possible by more experience manufacturing that particular chip (yields improve over time), potentially paired with faster memory as it becomes available. Sometimes advancements in process technology allow for a further boost in clock speed as well.
When NVIDIA introduced the 6-month product cycle, the idea was that new architectures would debut in the Fall and refresh products would hit in the Spring. The delay of NV20 (GeForce3) changed things around a bit, and the GeForce2 Ultra became the first Fall refresh product. Since then, little attention has been paid to when various GPUs hit; as long as we get something new every 6 months, we're happy. Earlier this year we heard that both ATI and NVIDIA would be releasing their true next-generation hardware next Spring, leaving this Fall as the refresh cycle.
ATI's high-end refresh was the Radeon 9800 XT, and as you might guess, their midrange refresh is the new Radeon 9600 XT. Much like the Radeon 9800 XT, the 9600 XT adds only two things: a higher clock speed and support for OverDrive.
The Radeon 9600 XT GPU now runs at 500MHz, a 25% increase in clock speed over the 9600 Pro's 400MHz clock. The memory speed of the Radeon 9600 XT remains at 300MHz DDR (effectively 600MHz), so there is no increase in memory bandwidth over its predecessor.
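For reference, the arithmetic behind those two figures is straightforward:

```latex
% Core clock increase of the 9600 XT over the 9600 Pro
\frac{500\,\mathrm{MHz} - 400\,\mathrm{MHz}}{400\,\mathrm{MHz}} = 0.25 = 25\%
% DDR memory transfers data on both edges of the clock, hence the doubled effective rate
300\,\mathrm{MHz} \times 2\ \mathrm{transfers/clock} = 600\,\mathrm{MHz\ effective}
```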
The hefty increase in clock speed is due to improvements in process technology as well as the introduction of a low-k dielectric. As we briefly explained in our 9800 XT review, the benefits of a low-k dielectric are mainly related to shielding from crosstalk in high-transistor-density chips, which gives us the clock speed boost we see with the 9600 XT. Because we're just talking about an increase in core clock speed, the games that receive the biggest performance boost from the XT will be those that are GPU-limited, which unfortunately are few and far between these days. Games that are largely shader-bound, such as Half-Life 2, will definitely enjoy the 9600 XT's increase in clock speed, but for now we'll see most of the performance benefits go to waste.
We explained OverDrive technology in our Radeon 9800 XT review and tested it in our Catalyst 3.8 driver update. The Radeon 9600 XT includes an on-die thermal diode that measures the temperature of the core; when the core is cool enough, the driver will instruct it to overclock itself by a set margin. The Radeon 9600 XT will run at one of three speeds depending on its temperature: 500MHz, 513MHz or 527MHz. The combination of this driver and hardware support makes up ATI's OverDrive feature.
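To make that behavior concrete, here is a minimal sketch in C of the clock-selection logic described above. The three clock speeds are the ones ATI specifies for the 9600 XT; the temperature thresholds and function names are placeholder assumptions of ours, since ATI has not published the actual trip points:

```c
#include <stdio.h>

/* Illustrative sketch of OverDrive's clock selection. The three clock
 * speeds (500/513/527MHz) are ATI's published values for the 9600 XT;
 * the temperature thresholds below are placeholder values, since ATI
 * has not disclosed the actual trip points. */

#define CLOCK_STOCK_MHZ 500
#define CLOCK_MID_MHZ   513
#define CLOCK_MAX_MHZ   527

/* Assumed thresholds in degrees Celsius -- purely illustrative. */
#define TEMP_COOL_C     50
#define TEMP_WARM_C     60

static int overdrive_select_clock(int core_temp_c)
{
    if (core_temp_c < TEMP_COOL_C)
        return CLOCK_MAX_MHZ;    /* plenty of thermal headroom: full boost */
    if (core_temp_c < TEMP_WARM_C)
        return CLOCK_MID_MHZ;    /* some headroom: intermediate boost */
    return CLOCK_STOCK_MHZ;      /* core is warm: stay at the stock clock */
}

int main(void)
{
    int readings[] = { 45, 55, 70 };  /* sample thermal diode readings */
    for (int i = 0; i < 3; i++)
        printf("diode reads %dC -> core clock %dMHz\n",
               readings[i], overdrive_select_clock(readings[i]));
    return 0;
}
```

The important point is that the boost is opportunistic: the driver only raises the clock when the diode reports thermal headroom, and it falls back to the stock 500MHz as the core warms up.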
OverDrive is not currently enabled for the Radeon 9600 XT in the Catalyst 3.8 drivers; we will have to wait for the Catalyst 3.9s before we can test the 9600 XT with OverDrive. If you're curious about the performance implications of enabling OverDrive, have a look at our Catalyst 3.8 review; it's nothing to get too excited about.
70 Comments
BerSerK0 - Friday, January 30, 2004 - link
I have a 9600XT, and real-life FPS in these games is much better :)
xxZoDxx - Sunday, December 28, 2003 - link
My $.02... Constantly listening to the bickering of ATI vs nvidia, it's like Ford vs Chevy, & Pepsi vs Coke. My feelings on this? nvidia has superior hardware. Now before you get your ATI underoos in a bunch, ATI has the superior Drivers/architecture. Look at most openGL benches. Don't they follow the clock, RAM, and bandwidth speeds more closely? nvidia mostly holds all these cards. If they could only get their drivers to work as well with D3D, there wouldn't be a question. Congrats on the latest 50 series but they still have a way to go to get D3D up to snuff. Personally? I have a 5900 that o/c's like mad (well beyond 5950) and I only paid 2 bills for it. ATI... get the pricing down and you could OWN nvidia. Now I wait for the flamers..........
TurtleMan - Tuesday, December 23, 2003 - link
Hmm, FFXI is a main factor for me, and now I have an unopened 9600 XT sitting right here; I began to wonder if I should open it up or buy a 9800 SE..
Rustjive - Wednesday, October 22, 2003 - link
The FFXI benchmark is heavily CPU bound in addition to being GPU bound. Case in point: I ran it on my duallie PIII-733 with the TI4200, and I barely got over 1200 rendered (compared with the 3000+ of Anandtech's results.) Then contrast this to Aquamark 3, in which I got 13.03 FPS as opposed to the ~15FPS of Anand's. (Comparatively, the performance differences are quite drastic.) All I have to say is...blah to FFXI and the world of MMORPGs. Blah.
Anonymous User - Tuesday, October 21, 2003 - link
Scores are bullshit, why bench the top of the line ATI 9800 XT against the middle of the road 5600? That's just retarded.
Anonymous User - Tuesday, October 21, 2003 - link
When will we see a test of ATI's 9800 made by different manufacturers? Asus, Hercules, Gigabyte, ATI??
Thank u!
Anonymous User - Saturday, October 18, 2003 - link
How come they used different settings for every benchmark? Sometimes they used 1024x768 with no AA/AF enabled, while other times they used 1024x768 w/ 4xAA/8xAF. Where did the 6xAA settings go? Can't they stay consistent throughout the review so people won't have to worry about the settings for each game's benchmarks? Can anyone explain this?
Anonymous User - Friday, October 17, 2003 - link
Unfortunately this review has missed an important issue: noise levels. Simply put, many of the people reading this site will have serious hearing loss by their mid-40s because these systems are too loud for the long daily exposure times people experience with them. Old programmers who were around the old line printers frequently have hearing loss from the high-pitched buzzing of the printers. Ditto any other industrial noise exposures. Silent computing is a worthwhile goal! I was very disappointed to discover that the last nVidia-based graphics card I placed in my main system was so noisy. Now I need to find a quieter one that delivers similar performance. These reviews are not much help on that dimension. Sorry.
Anonymous User - Friday, October 17, 2003 - link
Can you PLEASE get rid of the flash! :( What's wrong with the classic Anandtech graphs that everybody loved? It doesn't even look better...
Anonymous User - Friday, October 17, 2003 - link
Hey #8, maybe it's because NVIDIA sucks. Even when they do match the performance of ATI, the image quality is lower anyway.