ATI's New High End and Mid Range: Radeon X1950 XTX & X1900 XT 256MB
by Derek Wilson on August 23, 2006 9:52 AM EST - Posted in GPUs
The Test
For the most part, this is a high end article focusing on the three faster cards ATI announced today. We will include benchmarks of the X1900 XT 256MB both in our high end tests and in a comparison with the numbers we ran for our recent summer midrange roundup. Our high end tests consist of higher resolutions and use the same high end platform we employed for our midrange article. Along with the benefit of using the fastest CPU we can get our hands on, this is also the type of system we might recommend high end gamers run their cards in. Thus, people interested in these cards can use our numbers to get a glimpse of what actual performance might look like on their own systems.
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: Intel D975XBX (LGA-775) / ASUS P5N32SLI SE Deluxe
Chipset: Intel 975X / NVIDIA nForce4 Intel x16 SLI
Chipset Drivers: Intel 7.2.2.1007 / NVIDIA nForce 6.86
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.8 / NVIDIA ForceWare 91.33
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
The games we have chosen to test represent a wide variety of engines and styles. Due to the time constraints of this article, we are testing 7 games today. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing. Those two games really shine when HDR is enabled, so we won't bother disabling it. (ATI still offers the "Chuck Patch" to enable both HDR and antialiasing, which can be seen as an advantage for their hardware. However, this doesn't work with all HDR modes and is currently targeted mostly at Oblivion and Splinter Cell: Chaos Theory.)
For all of our tests, the only default driver setting we change is vsync, which we set to off. All other settings are left alone, as the default settings from each camp yield generally comparable image quality. There are a few exceptions to the rule, but none of the tests we ran show any shimmering or other problems noted in the past with NVIDIA's default quality settings.
In reporting our results, to improve readability, we will include a snapshot of one resolution in our standard bar graphs alongside a resolution scaling line graph.
74 Comments
TigerFlash - Wednesday, August 23, 2006 - link
I suppose I worded that the opposite way. Do you think Intel will stop supporting Crossfire cards?

michal1980 - Wednesday, August 23, 2006 - link
Can we not even get any numbers for cards below the 7900 GTX? I understand you're limited, but how about some numbers from cards below that, to see what an upgrade would do?
I know we can kind of take results from old reviews of those cards, but your test bed has changed since Core 2, so it's not a fair head-to-head comparison of old numbers to new.
It would be nice to see if there's a point (wise or not) to upgrading from a 7800 GT or that generation of cards, or something slower like a 7900 GT.
But it seems like every 'new gen' card test just drops off the 'older' cards.
michal1980 - Wednesday, August 23, 2006 - link
What I meant is that on the tables, or wherever all the new cards are, it would be nice to have some numbers for old cards.

Lifted - Wednesday, August 23, 2006 - link
Agreed. I'm still running a 6800 GT and haven't seen much of a reason to upgrade with the software I currently run. Perhaps if I saw that newer cards are 3x faster in newer games I might consider an upgrade, so how about it?