ATI HD 2900XT CrossFire: Intel 975X versus Intel P35
by Gary Key on May 16, 2007 12:00 PM EST - Posted in GPUs
Test Setup
Our test configurations today consist of the ASUS P5K-Deluxe, sporting the new P35 chipset, and the Intel D975XBX2KR, based on the venerable 975X chipset. Our retail P5K-Deluxe board was purchased recently even though an embargo on distribution of P35 product is supposedly in place until June 4th. Likewise, P35 boards from other suppliers such as MSI and Gigabyte can also be purchased at this time, making this one of the stranger product launches in recent memory.
The P5K-Deluxe carries over ASUS's C.G.I. technology from their P965 motherboards. ASUS C.G.I. stands for ASUS Cross Graphics Impeller (marketing still reigns), a feature that, when enabled, automatically optimizes system performance if a CrossFire configuration is detected. These optimizations occur within the Direct Media Interface between the P35 MCH and the ICH9R, the link that is used to enable CrossFire operation on this motherboard.
The 975X chipset utilizes Peer-to-Peer write capability within the MCH to enable a 2x8 PCI Express lane configuration for CrossFire. This feature is not available on the P965 or P35 without a special PCIe controller chip and BIOS support. On those chipsets, ATI/AMD enables CrossFire by utilizing the Direct Media Interface (DMI) to link the x16 GPU slot (16 PCI Express lanes) residing on the MCH with the x4 GPU slot (4 PCI Express lanes) residing on the ICH. Contrary to rumors and initial reports in certain forums, the P5K-Deluxe does not perform CrossFire operations with a 2x8 PCI Express lane configuration.
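As a side note for readers who want to verify how a given board actually wires its two GPU slots, the negotiated PCI Express link width can be read straight from each card's configuration space. Below is a minimal, hypothetical sketch of that check in Python; it assumes a Linux host exposing sysfs (our testing for this article was done under Windows Vista) and simply prints the current and maximum link width of every display controller, which would reveal the x16/x4 arrangement of a P35 board like the P5K-Deluxe versus the x8/x8 arrangement of the 975X.

```python
# Hypothetical verification sketch (not part of our test procedure): list the
# negotiated PCI Express link width of every display controller on a Linux
# host via sysfs. On a P35 board such as the P5K-Deluxe this would report
# x16 + x4, while the 975X runs x8 + x8 for CrossFire.
from pathlib import Path

PCI_DEVICES = Path("/sys/bus/pci/devices")

def gpu_link_widths():
    """Return (address, current_width, max_width) for each display controller."""
    results = []
    for dev in sorted(PCI_DEVICES.iterdir()):
        try:
            class_code = (dev / "class").read_text().strip()
        except OSError:
            continue
        if not class_code.startswith("0x03"):   # 0x03xxxx = display controller
            continue
        try:
            current = (dev / "current_link_width").read_text().strip()
            maximum = (dev / "max_link_width").read_text().strip()
        except OSError:
            continue                            # no PCIe link attributes exposed
        results.append((dev.name, current, maximum))
    return results

if __name__ == "__main__":
    for addr, cur, cap in gpu_link_widths():
        print(f"{addr}: running at x{cur} (slot capable of x{cap})")
```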
Standard Test Bed CrossFire Test Configuration

Processor | Intel Core 2 Extreme QX6700 (quad core, 2.66GHz, 2x4MB L2 Cache)
RAM | OCZ Reaper PC2-9200 (4x1GB) at 2.32V - 3-3-3-9 (975X), 4-4-3-6 (P35)
Hard Drive | Western Digital 150GB 10,000RPM SATA, 16MB Buffer
System Platform Drivers | Intel 8.3.0.1013
Video Cards | 2 x MSI HD2900XT
Video Drivers | ATI 8.37.4.3 (HD2900XT Release Drivers)
CPU Cooling | Tuniq 120
Power Supply | OCZ ProXStream 1000W
Optical Drives | Plextor PX-760A, Plextor PX-B900A
Case | Cooler Master CM Stacker 830
Motherboards | Intel D975XBX2KR (Intel 975X) - BIOS 2692; ASUS P5K Deluxe (Intel P35) - BIOS 0304
Operating System | Windows Vista Ultimate 64-bit
Test conditions were kept as consistent as possible across the platforms tested. Our game tests were run at 1280x1024 4xAA, 1600x1200 4xAA, and 1920x1200 4xAA, with 8xAF enabled in games that support it. These settings were used on both our single card and CrossFire setups. We feel these settings and resolutions will provide accurate benchmark results for the typical user running a CrossFire setup with a high-end CPU.
All results are reported in our charts and color-coded for easier identification. We use fresh drive images on each board in order to minimize any potential driver conflicts. Our 3DMark results are generated using the standard benchmark resolution for each program. We run each benchmark five times, throw out the two lowest and two highest scores, and report the remaining middle score. All results are run at stock speeds for this article, although we will provide overclocked results in the next article. For those wondering, our cards generally had no issues running at 853/1000 provided we had notified the electric company of a pending power surge.
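For clarity, here is a minimal sketch of that scoring rule in Python; run_benchmark is a hypothetical placeholder for whatever routine actually launches a test and returns its score.

```python
# Minimal sketch of the scoring rule described above: run a benchmark five
# times, discard the two lowest and two highest results, and report the
# remaining middle score. run_benchmark is a hypothetical stand-in for the
# routine that launches a test and returns a framerate or 3DMark score.
def middle_score(scores):
    if len(scores) != 5:
        raise ValueError("expected exactly five benchmark runs")
    return sorted(scores)[2]      # the median of five values

def benchmark_score(run_benchmark, runs=5):
    return middle_score([run_benchmark() for _ in range(runs)])

# Example with canned numbers; the first (cold) run is often the slowest.
print(middle_score([92.4, 101.7, 102.1, 101.9, 108.3]))   # -> 101.9
```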
This preview is not a graphics card review and as such we are not including results with products from the Big Green Machine yet. Those comparisons will come in our P35 chipset article. We are simply providing results on how each chipset handles CrossFire operations at this time. We will provide P965 and RD600 results in our follow-up to this article so you can have a clear picture of which Intel chipset performs the best with a CrossFire configuration. We might even throw an RD580 into the mix to see how well the R600 performs on it.
We also booked several sessions with a psychologist so we could understand the lapse in our thought process that led to choosing Windows Vista Ultimate 64-bit as our operating system. The R600 already has enough early driver issues to make one think twice about using it, but to throw a new operating system and chipset into the mix and then push matters further by going 64-bit was clearly not the act of a sane person. It sounded good at the time, it really did, but after several days of constant frustration, hair pulling, dog kicking (relax PETA, just a joke), fingernail chewing, and general panic attacks about missing the article deadline... well, as it turns out we would not have done it any differently.
Why? Whether we like it or not, Vista is the future of Windows for the time being and is required for DirectX 10. Honestly, it was time to see how far the various vendors had come since Vista's release in providing decent driver and game support. 64-bit OSes are also the future - after all, AMD released x86-64 on the world over three years ago. We collected enough information to generate a weekend short story on the subject, but as we feared, progress has been slow.
NVIDIA released their first decent set of Vista drivers this past week and we are busy redoing all of our 8800GTS/GTX numbers for the P35 launch article. In the meantime, we chewed through four different driver releases from AMD and decided to stick with the publicly released 8.37.4.3 drivers for this article. We generated some really impressive 3DMark numbers with the alpha 8.37.4.2 drivers, but let's just say that when it came time to use actual applications those drivers were not always stable or feature complete. We did receive a new set of beta 8.38 drivers a couple of days ago and those are in testing, but we do not have enough experience with them yet to publish meaningful numbers.
You might notice in our game testing that several of the more popular games are not benchmarked. We had screen corruption issues in Oblivion, S.T.A.L.K.E.R., Half-Life 2: Episode One, and even the Sims 2 when utilizing CrossFire. These same issues are not evident under Windows XP, so we attribute most of them to driver maturity, though several games we tried are having minor issues under XP as well. Also, our Battlefield 2142, Flight Simulator X, and Half-Life 2: Lost Coast benchmarks would not run consistently under Vista, so we are back to the drawing board on those and a couple of other games.
As for current DX10 benchmarks from the upcoming Lost Planet and Call of Juarez titles, we decided it was best to wait for the next driver release before providing results, as any scores generated now are basically useless. When running CrossFire with the R600, each demo has problems with rendering errors, tearing, jitter, and several other issues that will likely be fixed shortly. Needless to say, our first experiences with DX10 and the R600 were not pleasant.
29 Comments
vailr - Thursday, May 17, 2007 - link
"miss an outing of lifetime with friends" [outing of a lifetime]
"We are not here to single handily knock AMD" [single-handedly]
System Platform Drivers Intel - 8.3.0.1013
[Version 8.4.0.1010 Beta:
http://www.station-drivers.com/telechargement/inte...
Note: running the .exe installer may NOT update existing installed drivers. Must be manually updated for each device in Device Manager. See the readme.txt file:
"INF files are copied to the hard disk [Program Files/Intel/INFInst folder] after running the Intel(R) Chipset Device Software executable with an '-A'
flag (i.e., "INFINST_AUTOL.EXE -A"]
Paradox999 - Thursday, May 17, 2007 - link
Wow, who would have thought ATI might have *immature* drivers for the x2900 at this point? Duh. Moreover, why even try Crossfire when the cards in single configuration have been little more than a major league flop (don't bother spamming me, I'm an ATI fanboy). Given the poor performance (vs a much cheaper 8800GTS) and insane power requirements of a single card, you might be able to count on one hand the people eager to rush out to get a Crossfire setup. This kind of article is more in the category of 'curiosity' (like those guys that tried overclocking an x2900 with liquid nitro). Anand should be publishing more articles of a practical nature. If you want to try Crossfire and the x2900... at least wait for a few driver revisions AND then a head-to-head against the 8800gts. That *might* provide more useful information, albeit, for a very small segment of the enthusiast market.
I have to totally agree with some of the previous posters and say SLI and Crossfire is overkill and a waste of money. Buy the best card you can afford now. When it doesn't work for you any more, replace it with the best NEW generation card you can buy.
I'm still annoyed that the better motherboards (like my P5B Dlx Wi-Fi) come with 2 PCIE-16 slots. I use 1 x1900XTX and I'll replace it one day with one (1) much better card. The way I see it, ASUS robbed me of a PCI slot for my many expansion cards.
lopri - Thursday, May 17, 2007 - link
You must be joking, I assume? I think all PCI-E slots should be full length (x16) even though they are not electrically so. The only PCI card worth buying (for who needs one, that is) at this time would be X-Fi and that's just because of Creative's incompetency and monopoly in the market. I've ditched my X-Fi and refuse to buy Creative products until they get their act straight.
TA152H - Thursday, May 17, 2007 - link
I read stuff like this and I wonder what people are thinking. Why would Creative make such a mistake as you suggest?
Let's see, every motherboard comes with PCI slots, and there are tons of motherboards that people use that don't have PCI-E slots. They are selling upgrade parts, and PCI-E does NOTHING for these parts that they can't get from PCI. It's not like if they were using PCI-E they would get better performance or it would work better in some way. So, naturally, they are making PCI cards. Duh.
Maybe down the road when Intel stops supporting PCI, or when motherboards come out without PCI slots Creative will start making PCI-E, but until then, who needs them? They don't hurt in any way, not in performance, not in reliability. If they made them PCI-E so soon, they'd invest money in a product that currently makes no sense, and it would just jack up the costs.
lopri - Thursday, May 17, 2007 - link
I somehow doubt that those 'tons of' folks with motherboards without PCI-E would mind on-board sound. Heck, I have an SLI board and I'd rather make do with on-board sound than deal with Creative garbage. X-F.. what? I also doubt X-Fi's target market is folks using 5 year old motherboards. Don't get me wrong. Their SB Live! is still decent and perfectly suited for older motherboards.
And.. Mistake? Umm.. I wouldn't argue about PCI-E vs PCI here, but it's not exactly the case that Creative's PCI products and support (which is non-existent, btw) are spectacular. They didn't even have a full driver download link until very recently. (They had no choice but to upload the drivers thanks to Vista)
TA152H - Friday, May 18, 2007 - link
I'm not sure we're on the same page here. I thought you were implying that Creative needed to get their act together and get on the PCI-E bandwagon since that was what you were talking about. Apparently, you just don't like Creative and that was just kind of thrown in without respect to PCI-E.
If so, I agree, they blow. Their software is horrible, and their hardware is overpriced. I don't know I'd go as far as to say that they are a monopoly; there are a lot of choices in the low end, but at the high end you can get a card from any maker you want - as long as it's Creative. I am really, really particular with respect to sound too, I have no tolerance for bad speakers or noisy computers because I listen to music on my computer, so it's extremely quiet. Unfortunately, I have to buy Creative and I have a love/hate feeling towards them. They do make the best stuff, but it's expensive, difficult and buggy. So, I know where you're coming from. Maybe NVIDIA should move into that market too. I think they'd eat up a half-rate company like Creative. How about AMD? Hell, if they're going to get into Fusion, why not do it right and put the sound processor there too? It's probably a matter of time. Sound is very important to gaming, and of course to watching TV and listening to music. Makes you wonder why more attention hasn't been placed on it, and substandard companies like Creative are given free reign.
PrinceGaz - Thursday, May 17, 2007 - link
Although it was a nicely presented article on a product which is not exactly revolutionary, I must take issue with the game benchmarks which were included.
Out of the seven games tested, only two of them had any results where the average was below 60fps; one where the lowest was 54fps and the other (which was the only one with meaningful framerates) being Supreme Commander, where the P35 Crossfire configuration had driver issues.
I know you might say that results of 80 vs 100 vs 120fps do still provide useful information regarding likely performance in future games, but the fact is that they don't as the demands made on the CPU, mobo, and graphics card of a much more demanding game running at 40fps tends to be quite different to that of a current game running at 120fps. I appreciate you must have spent an awful lot of time running them all (five times each for every setting, no less) but at the end of the day they didn't really provide any meaningful information other than that there are driver issues which need to be resolved (which is what we would expect).
By the way, since you already went to the trouble of running every test five times, and discarded the two highest and lowest results to prevent them from unduly affecting an average; wouldn't it be a good idea to run the tests a sixth time so that the score you used is based on the average of two results rather than just the one in the middle? I imagine the 2nd-3rd-4th places were pretty close anyway (hopefully almost identical, with 1st place being very similar, and only 5th place somewhat slower because it was the first run), but for the sake of an extra 20% testing time a sixth run would involve, the statistical accuracy of using the mean of two results would be significantly improved.
I will reiterate though that overall the review was informative and well written; it was only the benchmarks themselves which were a bit pointless.
DigitalFreak - Thursday, May 17, 2007 - link
This is yet another perfect example of why ATI needs to open up Crossfire support to NVidia chipset motherboards. In the Intel space, the only supported chipsets that actually give them the bandwidth they need are the 975X and X38. I would think they would want to sell as many cards as possible.
OrSin - Thursday, May 17, 2007 - link
SLI and Crossfire, as far as I can see, are not needed for almost anything. I have a 6800 and 7900, and when I was shopping around I could not find a single reason to get another 6800 and go SLI instead of just getting a 7900. That's the same for Crossfire. SLI and Crossfire support in games is just not good enough. The 6800 would have been 30% less than the 7900, but the gains would have been 60% less on a good day and no gain at all for several games.
With all that rambling it just means that the P35 is a great board, so unless you need Crossfire (and most should not) get it. And don't wait for the next over-hyped product (X38). How's that? :)
PrinceGaz - Thursday, May 17, 2007 - link
SLI/Crossfire is never a good upgrade path if the next-gen product is already out. You're almost always much better off selling your old card and buying a new one from the current generation, as it works out no more expensive, but provides better performance, uses less power, makes less noise, and has none of the compatibility issues associated with twin graphics-card configurations.
However, that does not make SLI and Crossfire useless. They are needed for bragging rights by people whose weekly shopping list includes having a large tub of liquid nitrogen delivered, and by those who are worried about the size of their ePenis. The rest of us have no need to go the twin graphics-card route unless the money is burning a hole in our pocket anyway and we've nothing better to do with it.