NVIDIA's GeForce 6200 & 6600 non-GT: Affordable Gaming
by Anand Lal Shimpi on October 11, 2004 9:00 AM EST
Although they're seeing very slow adoption among end users, PCI Express platforms are getting out there, and the two graphics giants are wasting no time in shifting the competition for king of the hill over to the PCI Express realm.
ATI and NVIDIA have both traded shots in the mid-range with the release of the Radeon X700 and GeForce 6600. Today, the battle continues in the entry-level space with NVIDIA's latest launch - the GeForce 6200.
The GeForce 6 series is now composed of three GPUs: the high-end 6800, the mid-range 6600 and now the entry-level 6200. True to NVIDIA's promise of one common feature set, all three of the aforementioned GPUs boast full DirectX 9 compliance and thus can all run the same games, just at different speeds.
What has NVIDIA done to make the 6200 slower than the 6600 and 6800?
For starters, the 6200 features half the pixel pipes of the 6600 and a quarter those of the 6800. Next, the 6200 will be available in two versions: one with a 128-bit memory bus like the 6600 and one with a 64-bit memory bus, effectively cutting memory bandwidth in half. Finally, NVIDIA cut the core clock on the 6200 down to 300MHz as a final guarantee that it would not cannibalize sales of their more expensive cards.
The 6200 is an NV43 derivative, meaning that it is built on the same 0.11-micron (110nm) process as the 6600. In fact, the two chips are virtually identical, with the 6200 simply having only 4 active pixel pipelines on its die. There is one other architectural difference between the 6200 and the rest of the GeForce 6 family: the lack of any color or Z-compression support in the memory controller. Color and Z-compression are wonderful ways of reducing the memory bandwidth overhead of enabling technologies such as anti-aliasing, so without them, we can expect the 6200 to take a bigger hit when turning on AA and anisotropic filtering. The saving grace here is that the 6200 doesn't have the fill rate or the memory bandwidth to run most games at higher resolutions, so those who buy the 6200 won't be able to play at the resolutions where the lack of color and Z-compression would really matter with AA enabled. We'll investigate this a bit more in our performance tests.
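To put that compression point in rough numbers, here's a quick back-of-envelope sketch - ours, not NVIDIA's; the resolution, frame rate and per-sample byte counts are illustrative assumptions - of how multisample AA multiplies raw framebuffer traffic, which is exactly the overhead that color and Z-compression are designed to claw back:

```python
# Back-of-envelope estimate of framebuffer write traffic with multisample AA.
# All figures are illustrative assumptions, not measured numbers.

def framebuffer_traffic_mb(width, height, aa_samples, fps, bytes_per_sample=8):
    """Rough per-second color+Z write traffic in MB/s.

    bytes_per_sample assumes 32-bit color + 32-bit Z per sample;
    real GPUs also read these buffers, so actual traffic runs higher.
    """
    samples = width * height * aa_samples
    return samples * bytes_per_sample * fps / 1e6

# At 1024x768 and 60 fps, 4x AA roughly quadruples the raw sample traffic:
no_aa = framebuffer_traffic_mb(1024, 768, 1, 60)    # ~377 MB/s
with_aa = framebuffer_traffic_mb(1024, 768, 4, 60)  # ~1510 MB/s
print(f"no AA: {no_aa:.0f} MB/s, 4x AA: {with_aa:.0f} MB/s")
```

With compression, much of that extra AA traffic never hits the memory bus; without it, the 6200 pays full price.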
Here's a quick table summarizing what the 6200 is and how it compares to the rest of the GeForce 6 family:
GPU | Manufacturing Process | Vertex Engines | Pixel Pipelines | Memory Bus Width
GeForce 6200 | 0.11-micron | 3 | 4 | 64/128-bit
GeForce 6600 | 0.11-micron | 3 | 8 | 128-bit
GeForce 6800 | 0.13-micron | 6 | 16 | 256-bit
The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, cards with either configuration will carry the same name. While NVIDIA insists that they cannot force their vendor partners to distinguish between the two configurations, we're more inclined to believe that NVIDIA simply would like all 6200 based cards to be known as a GeForce 6200, regardless of whether they have half the memory bandwidth. NVIDIA makes a "suggestion" to their card partners that they should add the 64-bit or 128-bit designation somewhere on their boxes, model numbers or websites, but the suggestion goes no further than just being a suggestion.
The next source of variability is clock speed. NVIDIA has "put a stake in the ground" at 300MHz as the desired core clock for the 6200 regardless of configuration, and add-in board vendors have no real reason to clock their 6200s any differently, since they are all paying for a 300MHz part. The real variability comes with memory speeds. The 6200 only supports DDR1 memory and is spec'd to run at 275MHz (effectively 550MHz). However, as we've seen in the past, this is only a suggestion - it is up to the manufacturers whether they use cheaper, slower memory.
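For reference, the theoretical peaks implied by the specs above work out as follows - a minimal sketch assuming memory at the spec'd 275MHz DDR (550MHz effective) and one pixel per pipe per clock:

```python
# Theoretical peak numbers for the 6200 from the specs above.
# Simple datasheet-style calculations, not measurements.

def mem_bandwidth_gbs(bus_width_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

def fill_rate_mpixels(pixel_pipes, core_mhz):
    """Peak pixel fill rate in Mpixels/s: one pixel per pipe per clock."""
    return pixel_pipes * core_mhz

print(mem_bandwidth_gbs(128, 550))  # 128-bit @ 275MHz DDR -> 8.8 GB/s
print(mem_bandwidth_gbs(64, 550))   # 64-bit version       -> 4.4 GB/s
print(fill_rate_mpixels(4, 300))    # 4 pipes @ 300MHz     -> 1200 Mpixels/s
```

Cutting the bus to 64 bits drops peak bandwidth from 8.8GB/s to 4.4GB/s while the name on the box stays the same, which is why the labeling issue above matters.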
NVIDIA is also only releasing the 6200 as a PCI Express product - there will be no AGP variant at this point in time. The reason is that the 6200 is a much improved architecture compared to NVIDIA's current entry-level card on the market (the FX 5200), yet the 5200 is still selling quite well, as it is not really purchased as a hardcore gaming card. In order to avoid cannibalizing AGP FX 5200 sales, the 6200 is kept out of that competition by being a strictly PCI Express product. While there is a PCI Express version of the FX 5200, its hold on the market is not nearly as strong as the AGP version's, so losing some sales to the 6200 isn't as big of a deal.
In talking about AGP versions of recently released cards, NVIDIA has given us an update on the status of the AGP version of the highly anticipated GeForce 6600GT. We should have samples by the end of this month and NVIDIA is looking to have them available for purchase before the end of November. There are currently no plans for retail availability of the PCI Express GeForce 6800 Ultras - those are mostly going to tier 1 OEMs.
The 6200 will be shipping in November and what's interesting is that some of the very first 6200 cards to hit the street will most likely be bundled with PCI Express motherboards. It seems like ATI and NVIDIA are doing a better job of selling 925X motherboards than Intel these days.
The expected street price of the GeForce 6200 is between $129 and $149 for the 128-bit 128MB version. This price range is just under that of the vanilla ATI X700 and the regular GeForce 6600 (non-GT), both of which are included in our performance comparison - so in order for the 6200 to truly remain competitive, its street price will have to be closer to the $99 mark.
The direct competition for the 6200 from ATI comes in the form of the PCI Express X300 and X300SE (128-bit and 64-bit versions, respectively). ATI is at a bit of a disadvantage here because the X300 and X300SE are still based on the old Radeon 9600 architecture and are not derivatives of the X800 and X700. ATI is undoubtedly working on a 4-pipe version of the X800, but for this review, the advantage definitely goes to NVIDIA.
Comments
Saist - Monday, October 11, 2004 - link
xsilver: I think it's because ATi has generally cared more about optimizing for DirectX, and more recently just optimizing per API. OpenGL was never really big on ATi's list of supported APIs... However, add in Doom3 and the requirement of OGL on non-Windows-based systems, and OGL is at least as important to ATi now as DirectX. How long it will take to convert that priority into performance is unknown.

Also, keep this in mind: Nvidia specifically built the GeForce mark-itecture from the ground up to power John Carmack's 3D dream. Nvidia has specifically stated that they create their cards based on what Carmack says. Whether or not that is right or wrong I will leave up to you to decide, but it does very well explain the disparity between id games and other games, even under OGL.
xsilver - Monday, October 11, 2004 - link
Just a conspiracy theory -- do the NV cards only perform well on the most popular/publicized games, whereas the ATI cards excel due to better written drivers/better hardware? Or is the FRAPS testing favoring ATI for some reason?
Cygni - Monday, October 11, 2004 - link
What do you mean "who has the right games"? If you want to play Doom3, look at the Doom3 graphs. If you want to play FarCry, look at the FarCry graphs. If you want to play CoH, Madden, or Thunder 04, look at HardOCP's graphs. Every game is going to handle the cards differently. I really don't see anything wrong with AnandTech's current group of testing programs.

And Newegg now has 5 6600 non-GTs in stock, ranging in price from $148 to $175. But remember that it takes time to test and review these cards. When Anand went to get a 6600, it's very likely that that was the only card he could find. I know I couldn't find one at all a week ago.
T8000 - Monday, October 11, 2004 - link
Check this out, an XFX 6600 in stock for just $143: http://www.gameve.com/gve/Store/ProductDetails.asp...
Furthermore, the games you pick for a review make a large difference to the conclusion. Because of that, HardOCP has the 6200 outperforming the X600 by a small margin. So, I would like to know who has the right games.
And #2:
The X700/X800 is similar enough to the 9800 to compare them on pipelines and clock speeds. Based on that, the X700 should perform about the same.
Anand Lal Shimpi - Monday, October 11, 2004 - link
Thanks for the responses, here are some answers in no specific order:

1) The X300 was omitted from the Video Stress Test benchmark because CS: Source was released before we could finish testing the X300, no longer giving us access to the beta. We will run the cards on the final version of CS: Source in future reviews.
2) I apologize for the confusing conclusion, that statement was meant to follow the line before it about the X300. I've made the appropriate changes.
3) No prob in regards to the Video Processor, I've literally been asking every week since May about this thing. I will get the full story one way or another.
4) I am working on answering some of your questions about comparing other cards to what we've seen here. Don't worry, the comparisons are coming...
Take care,
Anand
friedrice - Monday, October 11, 2004 - link
Here's my question: what is better, a GeForce 6800 or a GeForce 6600 GT? I wish there was like a GeForce round-up somewhere. And I saw some benchmarks that showed SLI does indeed work, but these were just run on 3DMark - does anyone know if there are any actual tests out yet on SLI?

Also, to address another issue some of you have brought up: this new line of cards beats the 9800 Pro by a huge amount, but it's not worth the upgrade. Stick with what you have until it no longer works, and right now a 9800 Pro works just fine. Of course, if you do need a new graphics card, the 6600 GT seems the way to go - if you can find someone that sells them.

Oh, and to address the pricing: nVidia only offers suggested retail prices. Vendors can up the price on parts so that they can still sell the inventory they have on older cards. In the next couple of months, we should see these new graphics cards drop to MSRP.
ViRGE - Monday, October 11, 2004 - link
#10, because it's still an MP game at the core. The AI is as dumb as rocks, and is there for the console users. Most PC users will be playing this online, not alone in SP mode.

rbV5 - Monday, October 11, 2004 - link
Thanks for the tidbit on the 6800's PVP. I'd like to see AnandTech take on a video card round-up aimed at video processing and what these cards are actually capable of. It would fit in nicely with the media software/hardware Andrew's been looking at, and let users know what to actually expect from their hardware.

thebluesgnr - Monday, October 11, 2004 - link
Buyxtremegear has the GeForce 6600 from Leadtek for $135. Gameve has 3 different cards (Sparkle, XFX, Leadtek) all under $150 for the 128MB version.

#1, they're probably talking about the power consumption under full load.
Sunbird - Monday, October 11, 2004 - link
All I hope is that there's some easy way of distinguishing between the 128-bit and 64-bit versions.