Overclocking our GeForce 6600GTs
Once again, we're going to commend NVIDIA for including the coolbits registry tweak in their drivers, which allows core and memory clock speed adjustment (among other things). No matter how new and pretty ATI makes Overdrive look, it just doesn't give the end user the kind of control that these two slider bars do. Unless something pretty big changes by the end of the year, ATI users will still have to rely on third-party tools for any manual overclocking.

But when all is said and done, overclocking is about much more than just moving some sliders to the right. After spending quite a few hours (each) doing nothing but playing with the clock speeds of eleven different GeForce 6600GT cards, we started to wonder if there was any purpose in life but to increase the speed of small square bits of silicon to a point just before failure. Hopefully, we can pass on what we have learned.
There are multiple things to keep in mind; it's not just core and memory speed. Power and GPU quality are also concerns. If the card can't supply enough power to drive all of its components at the requested clock speeds, something has to give: it may be able to push the memory really high, or the core really high, but not both at the same time. GPU quality also physically limits the maximum speed at which a core can run and still yield correct results. One of the nice things about coolbits is that it won't let you set a clock speed that the card cannot handle. If the card can't provide enough power to its components, or the GPU simply fails to run correctly at that speed, the driver won't let you enable that clock speed. With utilities such as PowerStrip, the end user only has visual glitches and system lockups/reboots to indicate problems of this nature.
For anyone who doesn't know, to enable the clock controls on NVIDIA hardware, simply add a DWORD named CoolBits with hex value 3 under the registry key: HKLM\Software\NVIDIA Corporation\Global\NVTweak
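If you'd rather script the change than click through regedit, here is a minimal sketch using Python's standard winreg module. This is Windows-only and assumes the script is run with administrator rights, since it writes to HKLM; a restart of the driver control panel may be needed before the new sliders appear.

```python
import winreg

# NVIDIA's tweak key under HKEY_LOCAL_MACHINE, as given in the article.
KEY_PATH = r"Software\NVIDIA Corporation\Global\NVTweak"

# Open the key (creating it if absent) with write access, then add the
# CoolBits DWORD. Registry value names are case-insensitive, so
# "CoolBits" and "coolbits" are equivalent.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 0x3)
```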
The beginner's and the advanced methods for a stable overclock begin at the same place: basing your core and memory clock speeds on what NVIDIA's driver picks. After enabling coolbits, go into the NVIDIA control panel, open the clock control page, and select manual control. Make sure that you are always setting clocks for 3D performance and not 2D. Then let NVIDIA pick some clock speeds for you. At this point, we aren't sure exactly what process NVIDIA uses to determine these clock speeds, but at the very least, it verifies that both the GPU and the RAM will have enough power at those frequencies. We will try to look into the other conditions of this feature in future articles.
The conservative route is to look at the clock speeds NVIDIA picks, drop them both by 10MHz (core and memory), and set them there. Then grab Half-Life 2, 3DMark05, or Doom 3 and run a timedemo numerous times, watching closely for glitches and signs of overheating or other issues. Those are the three hottest-running titles that we have in our labs at the moment, but Half-Life 2 is, hands down, the leader in turning video cards into cookware.
If you want more performance, it's possible to go faster than what NVIDIA says you can do. The first thing to do is to find the fastest speed at which the driver will let you set the core; that gives you a rough range of what is possible. Of course, that maximum won't be stable. Try halfway between the NVIDIA recommendation and the max clock speed, but leave the memory at its factory setting. Pay close attention, and make sure that you're using a benchmark that you can bail out of quickly if you notice any problems. If there are glitches, cut the distance between where you are and the NVIDIA setting in half and try again. It's almost like a binary search for the sweet spot, but you can stop when you know that you're safe. When you find a core clock speed that you like, and if it's much higher than the NVIDIA driver-determined setting, you may wish to bring the memory clock up slowly to keep from throwing off the balance.
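That trial-and-error loop is easier to see written out. The sketch below is purely illustrative: the set_core_clock and timedemo_runs_clean callables are hypothetical stand-ins for dragging the slider and watching a timedemo for glitches, since coolbits itself exposes no scripting interface that we know of.

```python
def find_max_stable_core(nvidia_pick, driver_max, set_core_clock,
                         timedemo_runs_clean, tolerance=2):
    """Binary-search the core clock (in MHz) between NVIDIA's
    auto-detected speed and the driver-imposed ceiling, stopping once
    the search window shrinks below `tolerance` MHz."""
    low, high = nvidia_pick, driver_max   # known-good floor, hard ceiling
    while high - low > tolerance:
        candidate = (low + high) // 2     # halfway point, as described above
        set_core_clock(candidate)         # stand-in: move the slider
        if timedemo_runs_clean():         # stand-in: no glitches observed
            low = candidate               # stable: search higher
        else:
            high = candidate              # glitched: back off toward the floor
    return low                            # highest core clock that passed
```

Note that the memory clock stays at its factory setting for the whole search; only once the core settles do you bring the memory up, in small steps, repeating the same test at each step.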
So how do you know whether something is wrong once you've overclocked? In newer games like Half-Life 2, the shaders start to render slightly incorrectly. In HL2 especially, the anomalies tend to have high locality of reference (similar problems happen near each other) and form an almost grid-like pattern of disruption on surfaces. It used to be that disappearing geometry and hard locks were the number one telltale sign, but now vertex and pixel shader artifacts are a little more sensitive and subtle. On the memory side, if clocks are too high, you might see speckling or off-color pixels; edges can become disjoint, and texturing issues can occur.
84 Comments
geogecko - Wednesday, December 29, 2004
Well, two e-mails later to XFX, without an answer to my questions, and now I see this PNY PCI-E card with dual DVI and an HDTV-out pod... guess who's going to get my money?

Beatnik - Tuesday, December 28, 2004
Two links of interest:
http://www.xfxforce.com/pinetechnotes/Fan%20Update...
http://www.pny.com/products/verto/performance/6600...
Nice article folks!
Beatnik - Tuesday, December 28, 2004
Seems pretty clear that a lot of people are waiting on their next upgrade, hence the continued heavy AGP interest. With regard to the XFX, it looks like they have an online store, and now have a pretty cool-looking heatsink on the AGP card:
http://www.xfxforce.com/pinetechnotes/Fan%20Update...
The PNY 6600GT AGP product looks interesting also: http://www.pny.com/products/verto/performance/6600...
Might be the only DVI+DVI+component video out.
(Outstanding article guys!)
Rekonn - Monday, December 27, 2004
I too would really like to see a roundup like this one done for 6600GT AGP cards.

zoros - Sunday, December 26, 2004
Anyone know how well the PNY 6600GT is doing in these tests? I have tried to find information everywhere, but with no success. :-(

geogecko - Monday, December 20, 2004
I agree. PCs have started to move into the home theater more than ever now, and people (me included) are reading your articles to obtain knowledge when building home theater PCs. This information is not there, and thus still leaves me in the dark as to which video card to purchase for my HTPC.

No word from XFX on their HDTV output compatibility, so I must assume they don't support it, which stinks, considering they have the only card with dual DVI connectors and a decent HSF design.
I realize this was a quick review, but video cards are now expected to have HD compatibility since so many people are interested in HTPCs these days. No one wants a DVR that forces advertising on them when they fast-forward past commercials... so why not build a DVR that does more than TiVo instead...
How about an update with HDTV Output compatibility, along with who includes the cables?
nvdm24 - Sunday, December 19, 2004
How much longer will we readers allow these ridiculous reviews to go on? Many of the readers of these tech sites want to know the FULL capabilities of these cards, not just how they run Doom 3 and other 3D games. Sadly, reviewers at AnandTech and every other tech site ignore the VIDEO capabilities of VIDEO cards. Even this review of the new 6600 AGP ignores the video aspect of the 6600, despite the problems of the 6800 that weren't discovered by any reviewer, since none of them tested it for video. Not testing the video aspect does a HUGE disservice to readers. It's quite simple: just test a DVD movie, make sure the video aspect works, and let readers know. If you feel particularly energetic, you could also test how fast it renders home movies, etc. You may think this is the job of a VIDEO site, but you are a PC site, a tech site. You would be surprised at the people who read your reviews. Others are going to start doing the job better, and thus pull away readers, if you don't get it together.

ChineseDemocracyGNR - Friday, December 17, 2004
Hi Derek, any word from the manufacturers that had problems? Are they sending you new cards?
I was reading some user reviews for the MSI 6600GT _AGP_ at gameve.com and it also has heating problems, which is disappointing. Do you plan a similar article on the 6600GT AGP cards?
1q3er5 - Friday, December 17, 2004
ouch u got me good there :( im never posting again :o lol

DerekWilson - Friday, December 17, 2004
#54 - We scored cards more on construction, cooling, and noise than on overclockability; thus, the Albatron didn't get an award.

Also, the Leadtek card you linked to is the AGP version. We tested PCI Express parts only. The heatsink you mention is not cooling RAM, but the HSI (the PCIe-to-AGP bridge).