Sometimes it's easy to get lost in the high-performance market. With games like The Elder Scrolls IV: Oblivion capable of bringing even the highest-powered desktop systems to their knees, the desire to see just how beautifully we can render a game is quite strong. For professional gamers, it isn't about attention to detail but rendering speed. Everyone who is the least bit interested in realtime 3D graphics can easily get excited about what the latest and greatest hardware can do for their favorite games and simulations.
But the vast majority of us can't afford to drop over $1000 USD on graphics hardware. Instead, we must approach our love for graphics in one of two ways: either we determine the minimum level of graphical quality we are comfortable with, or we look for the fastest card we can afford within a certain price range. Whatever perspective one might have, the end result usually lands somewhere near the topic of this article: midrange graphics cards.
The current landscape of $200-$300 graphics cards is quite well suited to the enthusiast who wants good performance and quality for a reasonable amount of money, so we will be taking a look at this market segment as it stands. This really does seem to be the sweet spot in terms of bang for the buck right now. We won't be able to run Oblivion with all the options enabled, but all the games we test will look good and play well. We won't be surprised to see a few more entries into this market before the end of the year, but we are certainly overdue for a good hard look at anything but the high end.
Over the past year, the 6600 GT has fallen in relative performance from one of the greatest midrange cards we've seen to something more like a minimum requirement for passable graphics. Likewise, the modder's darling X800 GTO is starting to struggle to varying degrees (depending on how far any given card could be pushed). These two cards (among others) are included in our test as references.
This week we ran into a bit of a snag in our testing for this article: the price of the ATI X1900 series dropped quite a bit. Not only did the price drop for the X1900 GT add quite a bit of value to the card, but the X1900 XT dropped low enough in price to put it in competition with the 7900 GT at just over $300. This week has been spent testing more cards and a few extra scenarios in order to cover all the bases and truly find out which cards are the best to buy in the midrange market segment. With a variety of overclocked NVIDIA cards available and price cuts on many ATI parts, things just got a whole lot more complicated. As the high-end desktop graphics market used to top out at $300, we understand that even a midrange graphics card is still a significant investment for most people. It's important to be armed with the best and latest information when making purchasing decisions in such fast-paced, high-tech markets.
We hope to shake out the best options in the current lineup as well as help those looking to upgrade from a previous generation of midrange graphics see how their card stacks up. Let's take a look at the cards we have included and why.
74 Comments
gmallen - Friday, August 11, 2006 - link
Most of the PC enthusiast population interested in mid-range cards is still running AGP motherboards (this is based on sales of PCI motherboards vs. AGP motherboards). Where are these cards?
Josh7289 - Friday, August 11, 2006 - link
They don't exist.
arturnowp - Friday, August 11, 2006 - link
Hi. It's written that all cards in Oblivion were tested with HDR lighting, which the X800 GTO doesn't support. I think your results are misleading. The same goes for SC: Chaos Theory...
BTW: Who plays Oblivion with Actor Fade at 20%, Item Fade at 10%, and Object Fade at 25%? You get better graphics and performance by setting those options to 50-60% and turning off grass, which consumes a lot of power and doesn't look good. In foliage it's better to see your enemies from a greater distance, say, when on a horse ;-)
arturnowp - Friday, August 11, 2006 - link
OK, there is a note about SC: Chaos Theory, but all in all the conclusion is misleading: "Owners of the X800 GTO may have a little more life left in their card depending on how overclocked the card is, but even at stock clocks, it might be wise to hang on for another product cycle if possible", when in fact the GeForce 6600 GT performs on par with the X800 GTO. It would be better to exclude the X800 GTO from the charts or mark it as an SM 2.0 card. Better still, the GeForce 6600 GT should be tested in SM 2.0 mode...
nv40 - Friday, August 11, 2006 - link
Don't know why: http://www.xbitlabs.com/articles/video/display/pow...
Some of the differences between the tests are so large that they almost shocked me.
For instance:
The 7900 GT on the 84.21 driver with an FX-60 runs 54 FPS average at 1600x1200 with 4xAA/16xAF in the X-bit labs test.
The 7900 GT on the 91.33 driver with an X6800 manages just 35 FPS average at 1600x1200 with only 4xAA at AnandTech.
A problem with 91.33? The Intel 975X? The X6800? NVIDIA?
That's more than a 40% performance difference, despite the X6800 being far superior to the FX-60.
coldpower27 - Friday, August 11, 2006 - link
They probably aren't running the same time demo sequences.
nv40 - Friday, August 11, 2006 - link
Maybe... but there's only a 9% difference for the X1900 GT (41 vs. 38). And the 7900 GT definitely performed much worse in AnandTech's test than at X-bit labs in general.
There's no telling which is correct or not, but if both are right, then the conclusion is probably one of the following:
1. Driver problem: 91.33 is much slower than 84.21 (an NVIDIA cheat, or a 91.33 problem)
2. CPU problem: the X6800 is much inferior to the FX-60 in games (ridiculous, and far from true in every test)
3. Platform problem: NVIDIA cards perform much worse on the Intel chipset (975X)
Sharky974 - Friday, August 11, 2006 - link
I agree. I clearly remember Xbit declaring the 7900 GT the winner of the vast majority of benches vs. the X1900 GT. In fact, overall the X1900 GT wasn't warmly received. I really feel this deserves some looking into.
For example, I'll have to go look, but I think Firing Squad also showed the X1900 GT as inferior to the 7900 GT.
As it stands now, it's like Anand's platforms are somehow ATI-biased; on the other hand, I believe Xbit's platform is NVIDIA-biased. Xbit reviews nearly always show NVIDIA winning.
Sharky974 - Friday, August 11, 2006 - link
http://www.firingsquad.com/hardware/sapphire_radeo...
I started on the first page of benches.
As one glaring example:
Firing Squad: Quake 4, 1280x1024, 4xAA/8xAF: 7900 GT 87.2, X1900 GT 60.6
http://www.firingsquad.com/hardware/sapphire_radeo...
Anand: Quake 4, 1280x1024, 4xAA: 7900 GT 45.1, X1900 GT 49.8
http://images.anandtech.com/reviews/video/roundups...
With similar settings, FS has the 7900 GT getting nearly double the frames Anand does. The X1900 GT also gets significantly more in FS's review, from 49 to 60 FPS, but nowhere near the change the 7900 GT sees, with the net effect that the X1900 GT ekes out a win at Anand but loses by nearly 27 FPS at FS.
The X1900 GT is definitely a better card than I had remembered, even in the FS benches, though.
Also, FS was using an FX-57 while Anand used a much more powerful CPU, making the results all the more puzzling.
In addition to some of the other suggestions, I'd question drivers. FS was using older drivers on both since it is an older review. Perhaps NVIDIA's drivers have seen a large performance decrease, or ATI's a similar increase? This seems fairly unlikely, though, as I don't think you normally get huge differences from driver to driver.
Unless NVIDIA really was cheating on 16-bit filtering as the INQ claimed a while back, and they fixed it, causing a massive performance decrease? :) Again though, that suggestion is made half-jokingly.
This definitely needs a lot of looking into, I feel. Anand's results are quite different from others around the web at first blush.
JarredWalton - Friday, August 11, 2006 - link
Levels can make a huge difference in performance. For example, Far Cry has segments that get about 80 FPS max on any current CPU (maybe higher with a Core 2 Extreme overclocked...), but other areas of the game run at 150+ FPS on even a moderate CPU like a 3500+. I don't have a problem providing our demo files, but some of them are quite large (Q4 is about 130 MB if I recall). SCCT, FEAR, and X3 provide a reference that anyone can compare to, if they want. The only other thing is that ATI driver improvements are certainly not unlikely, especially in Quake 4.