ATI's New Leader in Graphics Performance: The Radeon X1900 Series
by Derek Wilson & Josh Venning on January 24, 2006 12:00 PM EST - Posted in GPUs
Hardware Features and Test Setup
We're talking about features and tests today because we are going to be trying something a bit different this time around. In addition to our standard noAA/4xAA tests (both of which always have 8xAF enabled), we are including a performance test at maximal image quality on each architecture. This won't give us directly comparable numbers in terms of performance, but it will give us an idea of playability at maximum quality.
These days, we are running out of ways to push our performance tests. Plenty of games out there are CPU limited, and why buy a card as powerful as an X1900 XTX or 7800 GTX 512 except to push it to its limit and beyond? Certainly, a very interesting route would be for us to purchase a few Apple Cinema Displays and possibly an old IBM T221 and go insane with resolution. And maybe we will at some point. But for now, most people don't have 30" displays (though the increasing power of today's graphics cards is certainly a compelling argument for such an investment). For now, people can push their high end cards by enabling insane features and getting the absolute maximum eye candy possible out of all their games. Flight and space sim nuts now have angle-independent anisotropic filtering on ATI hardware, adaptive antialiasing for textured surfaces helps in games with lots of fences, wires, and tiny detail work, and 6xAA combined with 16xAF means you'll almost never have to look at a blurry texture with jagged edges again. It all comes at a price, of course, but is it worth it?
In our max quality tests, we will compare ATI parts running 16xAF, 6xAA, adaptive AA, high quality AF, and as little Catalyst AI as possible against NVIDIA parts running 16xAF, 4x or 8xS AA (depending on reasonable support in the application), transparency AA, and no optimizations (high quality). In all cases, ATI will have the image quality advantage with angle-independent AF and 6x MSAA. Some games with in-game AA settings didn't have an option for 8xAA and didn't play well when we forced it in the driver, so we opted to go with the highest in-game AA setting most of the time (which, again most of the time, is the highest MSAA level supported in hardware). We tend to like NVIDIA's transparency SSAA a little better than ATI's adaptive AA, but that may just come down to opinion, and it still doesn't make up for the quality advantages the X1900 holds over the 7800 GTX lineup.
Our standard tests should look pretty familiar, and here is all the test hardware we used. Multiple systems were required in order to test both CrossFire and SLI, but all single card tests were performed on the ATI reference RD480 board.
ATI Radeon Express 200 based system
NVIDIA nForce 4 based system
AMD Athlon 64 FX-57
2x 1GB DDR400 2:3:2:8
120 GB Seagate 7200.7 HD
600 W OCZ PowerStream PSU
First up is our apples to apples testing, with NVIDIA and ATI set up to produce comparable image quality using 8xAF and either no AA or 4xAA. The resolutions we will look at range from 1280x960 (or 1280x1024) through 2048x1536.
120 Comments
Midreian - Wednesday, January 25, 2006 - link
This test kind of seems biased to me. The cards were tried in CrossFire for the ATI cards, but when it came to the Nvidia 7800 GTXs (512MB and 256MB), neither was tested in SLI and compared to CrossFire. Anyone have a comparison of SLI vs. CrossFire for the same tests?
DerekWilson - Wednesday, January 25, 2006 - link
For all the games but Battlefield 2 we ran both CrossFire and SLI numbers. We only ran SLI for the GTX 512 because we only looked at the highest end multi-GPU solution for each series (7800, X1800, X1900).
We would have included SLI in the BF2 portion, but our benchmark doesn't correctly represent gameplay for SLI. We are working on this.
Thanks,
Derek Wilson
Zebo - Tuesday, January 24, 2006 - link
Seems weird not to have SLI GTs in there. I still think that's the best deal in high end - around the $550 price point. Should clean up on both the X1900 XT and X1900 XTX pretty handily for the same price or less. Is AT still in the business of recommending "bang for the buck", or moving away from that? Because only .05% of your readers are going to go up into the realm of $1000 video cards (GTXs and XTXs in dual config).
danidentity - Tuesday, January 24, 2006 - link
Are the figures in the "Load Power" chart the power consumption of just the video card, or the entire system? If those numbers are just the video card, that's flat out insane.
Josh Venning - Wednesday, January 25, 2006 - link
The numbers in the Load Power chart represent the power draw for the entire system under stress testing. Even so, the 7800 GTX 512 SLI and X1900 XTX Xfire setups are ridiculously power-hungry.flexy - Tuesday, January 24, 2006 - link
It's nice to see ATI come up with something GOOD after so many disappointments, paper-launches, etc. $500 is an "attractive" price (relatively speaking), looking at the card's specs... I still have an X850XT and (sadly?) don't really have an "urge" to get this card since I MAINLY play HL2 (full details, even 6xAA) and it's fast and great even on my old X850XT. Almost makes me wish I had more game engines which demand/justify upgrading to this card.
As said... very happy for ATI, and this card is all the way UP on my wishlist (since I am a graphics-card ho ;)... but then I also know the G71 will come, and that will be a killer card too (judging from the theoretical specs). If I had a very slow system and barely could play any games, I PROBABLY would get the R580 now... ;)
Fenixgoon - Tuesday, January 24, 2006 - link
Great job by ATI for bringing out some killer cards. Note that a CrossFire X1900 system is CHEAPER than a pair of 7800 512s. But hey, regardless of who's on top, we win :)
As far as the parts being expensive - of course they will be, they're top of the line and released today.
I bought a Radeon X800 Pro for $170 and run COD2 at 1280x1024 maxed out (no FSAA/AF) with very few framedrops (worst is scoping in on smoke from grenades). I also have stuttering issues with HDR. Minus HDR, I run HL2 @ 1280x1024 with 6xFSAA and 16xAF. This is coming from a budget system! Putting all my components together, my setup costs about $700.
Xenoterranos - Wednesday, January 25, 2006 - link
Wow, for that kind of money, you could have almost bought an Xbox 360 bundle... or half a PS3 (har har har).
lamestlamer - Tuesday, January 24, 2006 - link
Did anyone else notice how the X1800 XT trounced the 7800 GTX in almost all tests? A look at the 7800 GTX 512 release benchmarks shows the exact opposite. Perhaps the quality settings were on for the 7800 GTX while the X1800 XT had performance settings. Even the 7800 GTX 512, which cannot possibly have a larger than 40% lead over the 7800 GTX, has a 100% lead in some cases.
ocyl - Tuesday, January 24, 2006 - link
It's been mentioned above but I will say it again. While it's okay to say that R580 has 48 pixel shaders, it only really has 16 pixel pipelines.