Unreal Tournament 3 CPU & High End GPU Analysis: Next-Gen Gaming Explored
by Anand Lal Shimpi & Derek Wilson on October 17, 2007 3:35 AM EST
Posted in: GPUs
It's been a long time coming, but we finally have Epic's first Unreal Engine 3 based game out on the PC. While the final version of Unreal Tournament 3 is still a bit further out, last week's beta release kept us occupied over the past several days as we benchmarked the engine behind Rainbow Six: Vegas, Gears of War and BioShock.
Epic's Unreal Engine 3 powers some very beautiful, truly next-generation titles, and it is significantly more demanding on both the CPU and GPU than Valve's Source engine. While far from the impossible-to-run title that Oblivion was upon its release, UE3 still stresses modern hardware more than most of what we've seen thus far.
The Demo Beta
Although Unreal Tournament 3 is due out before the end of the year, what Epic released is a beta of the UT3 Demo, and thus it's not as polished as a final demo would be. The demo beta can record demos but can't play them back, so conventional benchmarking is out. Thankfully, Epic left in three scripted flybys that take a camera along a set path around each level, devoid of all characters.
Real world UT3 performance will be more strenuous than what these flybys show, but they're the best we can muster for now. The final version of UT3 should have full demo playback functionality, which will let us provide better performance analysis. The demo beta also ships with only medium quality textures, so the final game can be even more stressful/beautiful if you so desire.
The flybys can run for an arbitrary period of time; we standardized on 90 seconds per flyby in order to get repeatable results while still keeping the tests manageable to run. Three flyby benchmarks come bundled with the demo beta: DM-ShangriLa, DM-HeatRay and vCTF-Suspense.
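Since each timed flyby reports a single average framerate, it's worth checking that repeated passes agree before trusting the numbers. The sketch below (the FPS values are purely illustrative, not measured results, and UT3 does not emit output in this form) shows one way to quantify run-to-run variation:

```python
from statistics import mean, stdev

def repeatability(fps_runs):
    """Summarize repeated flyby runs: average FPS and run-to-run spread.

    A coefficient of variation under roughly 2-3% is usually low enough
    to treat a scripted flyby as a repeatable benchmark.
    """
    avg = mean(fps_runs)
    cv = stdev(fps_runs) / avg * 100  # spread as a percentage of the mean
    return avg, cv

# Illustrative numbers only -- not results from this article.
runs = [92.4, 93.1, 91.8, 92.9]
avg, cv = repeatability(runs)
print(f"average: {avg:.1f} fps, variation: {cv:.2f}%")
```

If the variation comes back high, the usual fix is more runs per configuration or a longer timed window than 90 seconds.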
As their names imply, the ShangriLa and HeatRay flybys are of the Shangri La and Heat Ray deathmatch levels, while the vCTF-Suspense is a flyby of the sole vehicle CTF level that comes with the demo.
Our GPU tests were run at the highest quality settings and with the -compatscale=5 switch enabled, which puts all detail settings at their highest values.
Our CPU tests were run at the default settings without the compatscale switch as we're looking to measure CPU performance and not GPU performance.
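With two distinct configurations in play, scripting the launch lines helps keep them straight. In this sketch, only the -compatscale=5 switch comes from the text above; the map?causeevent=FlyThrough?quickstart syntax and the -seconds switch are assumptions based on common Unreal Engine 3 benchmarking practice, not confirmed UT3 options:

```python
def flyby_cmdline(map_name, seconds=90, gpu_test=False):
    """Build a hypothetical UT3 flyby launch line.

    Only -compatscale=5 is taken from the article; the causeevent/
    quickstart/-seconds syntax is assumed from typical UE3 usage.
    """
    cmd = f"UT3.exe {map_name}?causeevent=FlyThrough?quickstart=1 -seconds={seconds}"
    if gpu_test:
        cmd += " -compatscale=5"  # push all detail settings to maximum
    return cmd

print(flyby_cmdline("DM-ShangriLa", gpu_test=True))   # GPU test config
print(flyby_cmdline("vCTF-Suspense"))                 # CPU test: defaults
```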
The Test
| Test Setup | |
| --- | --- |
| CPU | Intel Core 2 Extreme QX6850 (3.33GHz, 4MB, 1333FSB) |
| Motherboard | Intel: Gigabyte GA-P35C-DS3R; AMD: ASUS M2N32-SLI Deluxe |
| Video Cards | AMD Radeon HD 2900 XT, AMD Radeon X1950 XTX, NVIDIA GeForce 8800 Ultra, NVIDIA GeForce 8800 GTX, NVIDIA GeForce 8800 GTS 320MB, NVIDIA GeForce 7900 GTX |
| Video Drivers | AMD: Catalyst 7.10; NVIDIA: 163.75 |
| Hard Drive | Seagate 7200.9 300GB 8MB 7200RPM |
| RAM | 2x1GB Corsair XMS2 PC2-6400 4-4-4-12 |
| Operating System | Windows Vista Ultimate 32-bit |
72 Comments
kmmatney - Wednesday, October 17, 2007 - link
"The benchmarks show that AMD cpu's are not performing as well as they should here. This will hopefully be fixed in the future."

You sound like someone who has an AMD processor and is bitter...
clairvoyant129 - Wednesday, October 17, 2007 - link
These are the same people who said there was a big difference between NetBurst CPUs and K8s. Right, if a NetBurst CPU coupled with a 7800 GTX got 60 FPS when a K8 got 90 FPS, it was a huge difference to them, but now it doesn't seem like it.

hubajube - Wednesday, October 17, 2007 - link
I'm definitely not bitter, just realistic. The difference between 90 and 180 fps is totally irrelevant. An Intel E2140 gets over 90fps. Hell, a Sempron with a decent video card could play this game extremely well.
Benchmarks are great in that you can use them to judge how your system will perform with a game, but they're not the be-all end-all of performance, nor is a CPU that does 100 fps a pile of shit because it doesn't do 105 fps. And how should they be performing in your opinion? 100 fps is not good enough for you? How about 500 fps? Is that better?
JarredWalton - Wednesday, October 17, 2007 - link
The point is that at 1920x1200 we're at a completely GPU-limited resolution (as shown by the fact that the difference between E6550 and X6850 is only 1%). AMD still runs 9% slower, so it seems that architecture, cache, etc. means that even at GPU limited resolutions AMD is still slower than we would expect. Is it unplayable? No, but we're looking at the top-end AMD CPU (6400+) and in CPU-limited scenarios it's still 10% slower than an E6550.

It seems to me that we're in a similar situation to what we saw at the end of the NetBurst era: higher clock speeds really aren't bringing much in the way of performance improvements. AMD needs a lot more than just CPU tweaks to close the gap, which is why we're all waiting to see how Phenom compares.
clairvoyant129 - Wednesday, October 17, 2007 - link
That 9% was using 1920x1200. The majority of PC users use a much lower resolution than that. At 1024x768, it's much, much higher. Think again, moron.
KAZANI - Wednesday, October 17, 2007 - link
And most people don't care about framerates higher than their monitor's refresh rate. Both processors were well above 100 frames at 1024x768.

hubajube - Wednesday, October 17, 2007 - link
No moron, 1024x768 on an 8800GTX is NOT what "most PC users" are going to be using. The video cards that "most PC users" will be using were not tested in this benchmark. YOU need to actually THINK next time.

clairvoyant129 - Wednesday, October 17, 2007 - link
Where did I say the majority of PC users with an 8800GTX use 1024x768? What's your idea of testing CPUs? Benchmark them at GPU-limited resolutions? What a joke. You people never complained when Anand compared NetBurst CPUs to K8s at 1024x768 or lower resolutions. Don't get your panties twisted, AMD fanny.
IKeelU - Wednesday, October 17, 2007 - link
Ummm... how do you launch the flybys used in this analysis?

customcoms - Wednesday, October 17, 2007 - link
You mention that you cranked the resolution to 1920x1200, but the charts still say 1024x768... the results look like those at 1920x1200 though, so I'm guessing it's a typo. GPU-bound CPU comparison charts here: http://anandtech.com/video/showdoc.aspx?i=3127&...