F.E.A.R. GPU Performance Tests: Setting a New Standard
by Josh Venning on October 20, 2005 9:00 AM EST - Posted in GPUs
The Game/Test setup
There was a lot of hype about FEAR before it was released, which is common for ground-breaking games of this sort. Too much hype can be a bad thing, as we've seen before with other games, and while it may have some similarities to the movies, we aren't quite willing to put FEAR on the same level as The Ring and The Matrix, in spite of the dramatic slow-motion deaths and the scary-looking little girl.
In this case, though, FEAR more or less lives up to the hype, and what we have here is possibly the most beautiful first-person shooter that we've ever seen. The dark and moody atmosphere and lighting are nice, reminiscent of Doom 3, except that you can actually see most of the time. During heated action sequences, the combination of lighting effects from things like muzzle flashes and sparks from bullet ricochets, along with the hazy clouds of dust kicked up by wall shots, creates a beautiful scene of chaos. This is further enhanced by the much-copied (but still fun) bullet-time/reflex mode, which slows everything down so that the chaos of sparks, dust and bodies flying through the air resembles some bizarre ballet that will occasionally make you pause to marvel at its beauty. Other graphical elements worth mentioning are the fire effects, which are impressive compared to those in most other games, as are the water effects (reflections, ripples and caustics).
To be fair, a few things could have looked better in the game. While the levels are pretty, they can be repetitive, as can the enemies, which are mostly hordes of a few different variations of clone soldiers, and the game's parallax-mapped environment damage isn't as convincing as it could be. These are just a few complaints, however, and graphically, the good more than outweighs the bad. Furthermore, because the enemy AI is so smart and the action so intense, you'll be so caught up in gameplay that the small graphical problems won't matter much.
Not only is this the best-looking game out right now, but it also happens to be the most graphically demanding, as we will see in our performance tests. It's so demanding, in fact, that it could be a good reason for people to upgrade their graphics cards. FEAR only supports resolutions of up to 1600x1200, but only the highest end cards can handle this resolution well, especially with soft shadows and/or AA enabled. In fact, this may be the first game that puts the 7800 GTX to full use: our tests showed that at 1600x1200 with soft shadows and AA enabled, FEAR was barely playable.
We wanted to get an idea about how FEAR performs across a wide range of graphics cards, so we tested a good sample of high end and mid-range cards. These are the kind of cards that we could see paired with our high end test system.
NVIDIA GeForce 7800 GTX
NVIDIA GeForce 7800 GT
NVIDIA GeForce 6800 GT
NVIDIA GeForce 6600 GT
ATI Radeon X1800 XT (not yet available)
ATI Radeon X1800 XL
ATI Radeon X1600 XT (not yet available)
ATI Radeon X1300 Pro
ATI Radeon X800 GT
We also tested all possible resolutions up to 1600x1200, the highest that FEAR will run, with and without 4xAA and 8xAF enabled. FEAR gives the option to turn on a feature called soft shadows, which we will talk about later, and because of some issues that we saw with it, we ran benchmarks both with and without it enabled. All other options were turned up to their maximum quality level. For those of you with older mid-range and lower end cards, maximum detail is not really an option at any resolution.
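To make the test matrix concrete, here is a small sketch of the benchmark runs described above. Note that this is purely illustrative: the resolution list and the exact setting combinations below are our assumptions for the example, not the precise options that FEAR exposes.

```python
# Illustrative sketch only: enumerating a benchmark matrix like the one
# described above. The resolution list and the three setting combinations
# are assumptions for illustration, not FEAR's exact option set.
from itertools import product

resolutions = ["800x600", "1024x768", "1280x960", "1600x1200"]  # assumed list
settings = [
    {"name": "no AA/AF", "aa_af": None, "soft_shadows": False},
    {"name": "4xAA/8xAF", "aa_af": "4xAA/8xAF", "soft_shadows": False},
    {"name": "soft shadows", "aa_af": None, "soft_shadows": True},
]

# One benchmark run per (resolution, setting) pair, repeated for each card.
runs = [{"resolution": res, **cfg} for res, cfg in product(resolutions, settings)]

print(len(runs))  # 4 resolutions x 3 settings = 12 runs per card
```

Multiply that by the nine cards in the list above and the scale of a review like this becomes apparent: over a hundred individual benchmark runs.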
This is our test system:
NVIDIA nForce 4 motherboard
AMD Athlon 64 FX-55 2.6 GHz Processor
1 GB OCZ 2:2:2:6 DDR400 RAM
Seagate 7200.7 120 GB Hard Drive
OCZ 600 W PowerStream Power Supply
We also made sure to test with sound disabled. This test isn't as strict a test of graphics performance as some of our other benchmarks. For one, we used the built-in test feature. While this gives us a consistent "run" through a scene, physics variability and slight differences in what the characters in the scene do are apparent from run to run. It is similar to what the Far Cry test would be if Crytek had added physics cues to the camera path of their benchmark.
While we would like to see more consistent action in order to compare cards better, the built-in tool is a much better option than using FRAPS while running through a level. As mentioned, we tested three different game settings. Driver settings were all default except for VSYNC, which was explicitly disabled.
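For reference, benchmark tools like FEAR's built-in test reduce a run to the kind of summary framerates reported in reviews roughly as follows. The sketch below is ours, with made-up frame times; it is not output from, or the implementation of, the game's actual tool.

```python
# Hypothetical sketch: reducing a benchmark run's per-frame render times
# to the average and minimum framerates a review reports. The frame times
# here are invented for illustration.

def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) for per-frame times in milliseconds."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s   # frames divided by total seconds
    min_fps = 1000.0 / max(frame_times_ms)    # slowest frame -> lowest framerate
    return avg_fps, min_fps

frames = [16.0, 20.0, 25.0, 40.0, 16.0]  # hypothetical frame times (ms)
avg, low = fps_stats(frames)
print(round(avg, 1), round(low, 1))  # prints: 42.7 25.0
```

This is also why run-to-run physics variability matters: a single slow physics-heavy frame drags the minimum framerate down even when the average looks healthy.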
Before we get to the numbers, let's take a deeper look at some of the graphics and performance issues that we noted previously.
117 Comments
Le Québécois - Thursday, October 20, 2005 - link
Like many people said, it would have been nice to see older generation HW... especially on ATI's side of things, since most of the cards tested here are nowhere to be found on the market. Seeing performance with the X800XL and the X850XT would have been nice.
I also hope you'll do some CPU testing in the future, since I doubt you'll see many people out there with an AMD FX-55... especially paired up with the likes of an X1300... :)
Kogan - Thursday, October 20, 2005 - link
Since the max upgrade for AGP users on the ATI side is an X800XT/X850XT, it would have been nice to have seen one of them included.
ballero - Thursday, October 20, 2005 - link
I'm looking forward to the SLI numbers.
Abecedaria - Thursday, October 20, 2005 - link
It is a significant error that SLI numbers were left out of the article, since it seems to be about how fast current video card technologies can play the game: "Those who want to play FEAR at the highest resolution and settings with AA enabled (without soft shadows) will basically have to use the 7800 GTX, as no other card available gets playable framerates at those settings, and the 7800 GTX does just barely (if uncomfortably)." ...unless you have an SLI setup, I assume. Does Anandtech feel that SLI is not a viable graphics technology, or am I missing something?
And then there's Crossfire... while it STILL isn't available yet, it would have been interesting to see some performance numbers along with SLI tests.
It would be nice if you could update the article with dual-card framerates.
abc
Abecedaria - Thursday, October 20, 2005 - link
Oh wait!!!! PC Perspective has already beaten Anandtech to the punch on this subject, and the results show that SLI has a SIGNIFICANT impact on playability, even without any driver optimizations....
http://www.pcper.com/article.php?aid=175&type=...
abc
Ender17 - Thursday, October 20, 2005 - link
I agree. Can we get some SLI benchmarks?
Kyanzes - Thursday, October 20, 2005 - link
...to see a card performing on top when it's not even available...
9nails - Saturday, October 22, 2005 - link
Exactly! I love this Land of Make Believe. It's a good thing that I have an AMD Athlon 64 FX-55 2.6 GHz processor in my Desktop, Laptop, and PDA. And I'm loving it, because after an unreal CPU like that, I would still have hundreds of dollars left to burn on make-believe GPUs. If I was only a regular Joe Anand-reader with a middle tier Pentium 4 and old school AGP graphics port, I would be quite upset that the author is targeting his reviews at the well-connected Beverly Hills posh. Just who is Josh writing his articles for, anyway?! I'm going back to surfing pr0n, because I have a far better chance at dating a porn* than owning a system like the one that he's showing scores on.
yacoub - Saturday, October 22, 2005 - link
Well, thanks for supporting the thread I started in the Video forum section last week addressing that very issue. All the idiots came out of the woodwork to do their best to misinterpret and misread the post, and very few actually bothered to support my suggestion that a test be done with a REAL WORLD system most of us own, not an FX-55 setup with a 7800GTX that few people own.
I'd LOVE to see how modern games perform on a system I'm actually thinking of buying, not an imaginary supersystem.
deathwalker - Thursday, October 20, 2005 - link
You know... it's simply come to the point where I don't know how the average gamer can keep up. If you are not willing to spend $300-$500 every 6-12 months or so, you just can not keep up with the demands that games are putting on computer hardware. This is stupid... I mean, who the hell is dragging this industry along? Do they develop new and more powerful hardware so more demanding software can be created, or do they develop more demanding software, making it a necessity to develop more powerful hardware? Is all this crap really needed to have a decent gaming experience? I guess I'm just gonna have to starve the cat for a couple of months so I can toss out my POS 6800GT and get some new whizbang graphics card the industry wants me to buy. This has become a never ending process that is wearing thin on me.