ATI's New Leader in Graphics Performance: The Radeon X1900 Series
by Derek Wilson & Josh Venning on January 24, 2006 12:00 PM EST - Posted in GPUs
Introduction
Take all the clichés used to describe a long overdue event or the unexpected fulfillment of a promise (hot places freezing, heavy animals soaring through the air, etc...) and you still couldn't say enough to fully proclaim the news that ATI has finally properly hard launched a product. That's right, looking around the internet this morning has provided us with the joyous realization that the Radeon X1900XT, XTX, and CrossFire parts are available for purchase. We've tried to keep an eye on the situation and it's been quite easy to see that ATI would be able to pull it off this time. Some sites started taking preorders earlier in the week saying their X1900 parts would ship in one to two days, putting the timeframe right on the mark. There were no missing dongles, no problems with customs, and ATI told us last week that thousands of parts had already been delivered to manufacturers.
And if that isn't enough to dance about, ATI has delivered a hugely powerful part with this launch. The Radeon X1900 series is no joke, and every card bearing the name is a behemoth. With triple the pixel shader units of the X1800 XT, and a general increase in supporting hardware throughout the pixel processing engine, ATI's highly clocked, 384 million transistor GPU is capable of crunching enormous volumes of data very quickly. Fill rate isn't increased much because the X1900 series still only draws 16 pixels to the screen per clock cycle, but power is delivered where it is needed most. With longer and more complex shader programs, pixels need to stay in the shader engine longer, which shifts the performance burden further away from theoretical maximum fill rate and toward raw shader throughput.
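To make that tradeoff concrete, here is a minimal back-of-envelope sketch, not ATI's actual scheduling model: it assumes the commonly cited figures of 16 ROPs for both parts, 48 pixel shader processors at 650MHz for the X1900 XTX versus 16 at 625MHz for the X1800 XT, and uses hypothetical ALU-cycles-per-pixel values as a stand-in for shader length. Effective throughput is simply the lesser of the fill-rate limit and the shader limit.

```python
# Back-of-envelope model of fill-rate-bound vs. shader-bound pixel throughput.
# Illustrative only: the clock/ROP/shader-unit figures match commonly cited
# specs; the ALU cycles/pixel values are hypothetical workloads, not measurements.

def pixel_throughput_mpix(clock_mhz, rops, shader_units, alu_cycles_per_pixel):
    """Effective throughput in megapixels/sec: the lesser of the raw
    fill-rate limit and the shader-engine limit, times the core clock."""
    fill_limit = rops                                   # pixels written per clock
    shader_limit = shader_units / alu_cycles_per_pixel  # pixels shaded per clock
    return clock_mhz * min(fill_limit, shader_limit)

# Short shader (1 ALU cycle/pixel): both parts hit the 16 pixels/clock ROP
# wall, so the X1900's extra ALUs barely matter.
print(pixel_throughput_mpix(650, rops=16, shader_units=48, alu_cycles_per_pixel=1))   # 10400.0
print(pixel_throughput_mpix(625, rops=16, shader_units=16, alu_cycles_per_pixel=1))   # 10000.0

# Long shader (24 ALU cycles/pixel): throughput is now set by ALU count,
# and the 3x shader advantage shows up almost directly.
print(pixel_throughput_mpix(650, rops=16, shader_units=48, alu_cycles_per_pixel=24))  # 1300.0
print(pixel_throughput_mpix(625, rops=16, shader_units=16, alu_cycles_per_pixel=24))  # ~416.7
```

Crude as this model is, it shows why tripling the ALUs without touching the ROPs is the right trade for shader-heavy games: the longer the shader, the closer the X1900's advantage gets to the full 3x.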
NVIDIA would like us to compare the X1900's increase in ALU (arithmetic logic unit) power to what they did with the FX 5900 after NV30 tanked. Certainly, increasing the math power (and increasing memory bandwidth) helped NVIDIA, but fortunately for ATI the X1900 is not derived from a fundamentally flawed GPU design. The X1800 series are certainly not bad parts, even if they are being completely replaced by the X1900 in ATI's lineup.
I'll spoil the results and make it clear that the X1900XT and XTX are hands down the best cards out there right now. But all positives aside, ATI needed this card to hard launch with good availability, perform better than anything else, and look good doing it. There have been too many speed bumps in ATI's way for there to be any room for a slip up on this launch, and it looks like they've pulled it off. The launch of the X1900 series not only puts ATI back on top, but (much more importantly) it puts them back in the game. Let's hope that both ATI and NVIDIA can keep up the good fight.
But let's not forget why we're here. The first thing we are going to do is talk about what makes the R580 GPU that powers the X1900 series so incredibly good at what it does.
Comments
Live - Tuesday, January 24, 2006 - link
Thanks for the explanation! Derek, I think this merits a mention in the review.

NullSubroutine - Tuesday, January 24, 2006 - link
Perhaps a flash system where you can pick the card within the benchmark and it will show it on the line graph. Just activate/deactivate the feature.

bldckstark - Tuesday, January 24, 2006 - link
I have to agree that a group color for the multi-GPU setups would be helpful on the bar graphs. The outline you used to denote negative gains would work well for this. Then ATI and NVIDIA bars would still have a different major color, but the multi-GPU setups could have a yellow outline. E.g., ATI = red, ATI CrossFire = red w/ yellow outline, NVIDIA = blue, NVIDIA SLI = blue w/ yellow outline.

Rock Hydra - Tuesday, January 24, 2006 - link
I don't know if you meant this or not, but on the page mentioning the new CrossFire board there is a URL. I don't know if it was intended to be active or plain text, but I thought I would bring that to your attention.

DerekWilson - Tuesday, January 24, 2006 - link
Thanks, fixed.

emilyek - Tuesday, January 24, 2006 - link
Good article. You have two typos in it.

In the system specs you have OZC Powerstreams instead of ...stream.

When you use the words 'eek out' as a verb that means 'squeeze out', it is spelled 'eke' -- 'eke out'.
DerekWilson - Tuesday, January 24, 2006 - link
I had no idea there was a correct spelling for eke... thanks
beggerking - Tuesday, January 24, 2006 - link
Did anyone notice? The breakdown graphs don't quite reflect the actual data... the breakdown shows the 1900 XTX being much faster than the 7800 512, but in the actual performance graphs the 1900 XTX is sometimes outpaced by the 7800 512.
DerekWilson - Tuesday, January 24, 2006 - link
We didn't aggregate performance of each card under each game. For the percent improvement breakdown we only looked at 2048x1536 with 4xAA, which clearly shows the X1900 XTX in the lead.

Our reasoning is that this is the most stressful stock test we throw at the cards -- it shows what the cards can handle under the highest stress.
beggerking - Tuesday, January 24, 2006 - link
Umm... what about 8xAA or higher? Or lower resolution? With/without AA? If you don't aggregate performance, then won't the graph be misleading?

Isn't max quality the most stressful test?