Unreal Tournament 3 Beta Demo: Top to Bottom GPU Analysis
by Derek Wilson on October 18, 2007 4:00 AM EST
Posted in: GPUs
Introduction
We've already looked quite a bit at Unreal Tournament 3, but, as promised, here is our low end and mainstream GPU analysis of the beta version of the demo for Unreal Tournament 3. Certainly not a string of words that instills confidence in how well these numbers will represent final game play, but it's the best we've got right now for the best looking UE3 game to date.
Our first look at high end GPU performance showed that AMD's Radeon HD 2900 XT was able to best NVIDIA's flagship hardware in a number of cases and remained very competitive even at high resolutions. Will this trend hold for the rest of the lineup, or is the 2900 XT just well suited to UT3?
We'll find out when we put our hardware to the test. First we will look at the low end GPUs, then the mainstream parts. Finally, we will bring it all together and look at performance across the board. Before we get to the numbers, here is the hardware we used.
Test Setup
CPU: Intel Core 2 Extreme X6800
Motherboard: NVIDIA 680i SLI
Video Cards: AMD Radeon HD 2900 XT, AMD Radeon HD 2600 XT, AMD Radeon HD 2600 Pro, AMD Radeon HD 2400 XT, AMD Radeon X1950 XTX, AMD Radeon X1950 Pro, AMD Radeon X1650 XT, NVIDIA GeForce 8800 Ultra, NVIDIA GeForce 8800 GTX, NVIDIA GeForce 8800 GTS 320MB, NVIDIA GeForce 8600 GTS, NVIDIA GeForce 8600 GT, NVIDIA GeForce 8500 GT, NVIDIA GeForce 7900 GTX, NVIDIA GeForce 7950 GT, NVIDIA GeForce 7600 GT
Video Drivers: AMD Catalyst 7.10; NVIDIA 163.75
Hard Drive: Seagate 7200.9 300GB 8MB 7200RPM
RAM: 2x1GB Corsair XMS2 PC2-6400 4-4-4-12
Operating System: Windows Vista Ultimate 32-bit
Rather than run all three flybys as we did for the high end hardware, we decided to test only the most taxing of the maps, the Suspense CTF map, since scaling was fairly consistent across them. We will look at resolutions ranging from 800x600 up to 2560x1600. Sit back and enjoy the ride.
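For readers who want to script a similar sweep on their own systems, here is a minimal sketch of how a resolution-by-resolution flyby run could be automated. The executable path, command-line switches, log format, and the intermediate resolutions are placeholder assumptions for illustration only; only the Suspense CTF map and the 800x600 and 2560x1600 endpoints come from this article.

```python
# Minimal benchmark-sweep sketch. Paths, switches, and the log format are
# assumptions, not the actual options used for the numbers in this article.
import re
import subprocess

UT3_EXE = r"C:\UT3Demo\Binaries\UT3Demo.exe"   # assumed install path
FLYBY_MAP = "CTF-Suspense"                     # the map tested in this article
RESOLUTIONS = [(800, 600), (1024, 768), (1280, 1024),
               (1600, 1200), (1920, 1200), (2560, 1600)]  # only the endpoints are from the article

def run_flyby(width, height, seconds=90):
    """Run one timed flyby and return the average FPS parsed from its log."""
    log_name = f"flyby_{width}x{height}.log"
    # The switch names below are placeholders standing in for whatever the demo accepts.
    cmd = [UT3_EXE, FLYBY_MAP,
           f"-resx={width}", f"-resy={height}",
           f"-seconds={seconds}", f"-log={log_name}"]
    subprocess.run(cmd, check=True)
    with open(log_name) as log:
        match = re.search(r"Average FPS:\s*([\d.]+)", log.read())
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    for w, h in RESOLUTIONS:
        fps = run_flyby(w, h)
        print(f"{w}x{h}: {fps} fps")
```

In practice you would also want to average several passes per resolution and discard an initial warm-up run.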
34 Comments
Ecmaster76 - Thursday, October 18, 2007
I bet all those people who bought X1K cards are feeling pretty good right now. Once again, the Radeon has shown in the long haul its superior longevity compared to the GeForce (assuming that future UT3 versions and drivers don't change the results significantly).

legoman666 - Thursday, October 18, 2007
I'm still running with my X1800 XT. The problem is, I never see benchmarks for it for new games. I can't really compare it to the X1950 XT either, since they're different cores. Is there any way future reviews (or maybe this review could be updated?) could have the X1800 XT benchmarks included? That being said, I can run all of the Orange Box games at 1280x1024, 4xAA, 8xAF, all max details with vsync on and still get 38fps (75Hz/2). As long as UT3 isn't much more demanding than the Source engine, I will probably be fine. Now that I typed that, I remember that BioShock uses Unreal Engine 3, and BioShock also runs great on my machine with all the max details.
I guess a good thing about having a comparatively small monitor (1280x1024 instead of one of the larger wide screens) is that I still get decent frame rates, since newer games are designed with those huge screens in mind and I'm still using my "tiny" screen. Hopefully my monitor is the next thing that gets upgraded.
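The "38fps (75Hz/2)" figure above reflects how double-buffered v-sync quantizes frame rate: a frame that misses a refresh waits for the next one, so the effective rate snaps to refresh/2, refresh/3, and so on. A quick sketch of that arithmetic (ours, not from the article or the comment):

```python
# Effective frame rate under double-buffered v-sync: every frame must be
# presented on a refresh boundary, so frame time rounds up to a whole
# number of refresh intervals.
def vsync_fps(render_ms: float, refresh_hz: float = 75.0) -> float:
    refresh_interval_ms = 1000.0 / refresh_hz            # ~13.3 ms at 75 Hz
    intervals = -(-render_ms // refresh_interval_ms)     # ceiling division
    return refresh_hz / intervals

print(vsync_fps(12.0))   # 75.0 -- fast enough to hit every refresh
print(vsync_fps(15.0))   # 37.5 -- misses one refresh, lands on the next (the "38fps")
```

Triple buffering avoids this hard halving, which is why it is often recommended alongside v-sync.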
Spoelie - Friday, October 19, 2007
The X1800 XT is somewhat comparable to the X1950 Pro. It has less shading power but more pixel-pushing power, so in shader-heavy games like this its general performance will be slightly lower than the X1950 Pro's, but it will cope better with things like anti-aliasing, higher resolutions, and anisotropic filtering.
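To put rough numbers behind that comparison, here is a back-of-the-envelope sketch using commonly cited specs (core clock, pixel shader processor count, ROP count) for the two cards; these figures are not from the article and should be treated as approximate:

```python
# Rough throughput comparison; specs are commonly cited values, not from the article.
cards = {
    # name: (core clock MHz, pixel shader processors, ROPs)
    "X1800 XT":  (625, 16, 16),
    "X1950 Pro": (575, 36, 12),
}

for name, (mhz, shaders, rops) in cards.items():
    shader_throughput = mhz * shaders / 1000.0   # relative shader throughput
    fill_rate = mhz * rops / 1000.0              # pixel fill rate in Gpixels/s
    print(f"{name}: shader ~{shader_throughput:.1f}, fill ~{fill_rate:.1f} Gpix/s")

# X1800 XT:  shader ~10.0, fill ~10.0 Gpix/s
# X1950 Pro: shader ~20.7, fill ~6.9 Gpix/s
# The X1950 Pro has roughly twice the shader throughput, while the X1800 XT
# keeps the higher raw fill rate -- the trade-off described in the comment above.
```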
johnsonx - Thursday, October 18, 2007
I'm personally not too sure about my X1950 Pro AGP; I don't seem to be getting such great performance. My system specs out far better than my son's (me: X2 @ 2.5GHz, 1GB, X1950 Pro AGP, Vista; son: A64 3500+, 2GB, 7900 GS PCIe, XP Pro), yet he appears to get better performance in UT3. I haven't benchmarked it, but he has all detail levels turned up to max while I run mine with the details one tick above minimum, yet his seems smoother than mine.
Between my slightly faster dual core (vs. his single core) and my more powerful video card, I ought to be able to run max detail (we both run 1280x1024 LCDs, which should be a walk in the park for my rig).

His system has only one thing better than mine: he has 2GB of RAM while I have 1GB. But I haven't noted any swapping, and the game still loads pretty fast, so it doesn't seem memory constrained.

I know there are many variables here (Vista vs. XP, 1GB vs. 2GB, AGP vs. PCIe), but none of those, AFAIK, should make all that much difference today (obviously the Vista vs. XP thing was a big deal 6 months ago, but the drivers have largely reached performance parity, haven't they?). I guess I need to figure out how to run the benchmarks Derek did and see what's what.
Spoelie - Friday, October 19, 2007
It's not the fact that you have 1GB or the fact that you have Vista; it's the combination of the two that makes it a really sub-par gaming machine. You really ought to double the RAM if you want to game in Vista, and even then the same config would get a bit better performance running XP. Also, regarding "seems choppy": do you have an LCD screen? Turn v-sync on, then.
mcnabney - Thursday, October 18, 2007
You are running Vista; your son is running XP. Vista cripples gaming performance across the board.

ChronoReverse - Thursday, October 18, 2007
Yeah, I have an X1950 and I'm feeling pretty plucky indeed. =D

MrKaz - Thursday, October 18, 2007
Why do you put the 2600 XT in the same bag as the 8600 GTS? The price difference is huge.
I can buy a good 2600 XT for 100€ and a good 8600 GTS for 190€.
The more correct comparison is (I think):
X19x0 XT = 8600 GTS
2600 XT = 8600 GT
2600 Pro = 8500 GT
2400 Pro/XT = 8400 GS
dm0r - Thursday, October 18, 2007
I compared them on performance, not price... Anyway, it looks like the 2600 XT is getting mature with new drivers. I would like to see power consumption tests, please.
cmdrdredd - Thursday, October 18, 2007
Power consumption is covered elsewhere. Game performance reviews/previews/guides are for PERFORMANCE-based comparisons.