XFX
This card is the beast of our roundup: the XFX 6600 GT Extreme Gamer Edition. XFX is the only vendor we've seen take a stand and do something different. The first thing to notice is the pair of DVI connectors on the board. Dual DVI isn't normally something one would need on a mid-range solution, but a quick check of newegg.com shows the standard dual-DVI XFX card selling for less than some PCIe 6600 GT parts without dual DVI, so cost is not a convincing objection, and there's no good argument for leaving dual DVI off these cards.

There is one feature here for which a premium may be charged: 1.6ns GDDR3 running at 600MHz. We haven't seen pricing yet, but this part is obviously not going to be the "be all, end all" value in graphics cards. Added memory bandwidth is a good thing for the 6600 GT, given its 128-bit bus. The problem is that the performance benefit is maybe half the increase in memory bandwidth, if we are lucky. We might see better scaling with AA enabled, but on a mainstream part, that's pushing the limits.
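To put that 600MHz figure in perspective, here is a rough back-of-the-envelope sketch of the peak bandwidth math, assuming the reference 6600 GT's 500MHz (1GHz effective) memory clock as the baseline:

```python
# Rough peak memory bandwidth comparison on the 6600 GT's 128-bit bus.
# Assumption: a reference-clocked 6600 GT runs its GDDR3 at 500MHz (1GHz effective).
BUS_WIDTH_BITS = 128

def bandwidth_gbps(mem_clock_mhz: float, ddr_multiplier: int = 2) -> float:
    """Peak memory bandwidth in GB/s for a given memory clock."""
    effective_mhz = mem_clock_mhz * ddr_multiplier          # GDDR3 is double data rate
    bytes_per_transfer = BUS_WIDTH_BITS / 8                 # 128 bits = 16 bytes
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9   # convert to GB/s

stock = bandwidth_gbps(500)   # ~16.0 GB/s
xfx = bandwidth_gbps(600)     # ~19.2 GB/s
print(f"stock: {stock:.1f} GB/s, XFX: {xfx:.1f} GB/s "
      f"(+{100 * (xfx - stock) / stock:.0f}%)")
```

By that math, the XFX card carries roughly 20% more raw memory bandwidth than a reference-clocked board, so even the optimistic "half the increase" case works out to something on the order of a 10% real-world gain.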
As for the cooler, XFX modified the stock HSF by placing a copper plate between the die and the heatsink, which increases the tension on the spring pegs and keeps firmer pressure on the GPU. They also do the same thing we saw from Leadtek: a bit of material around the silicon acts as a spacer between the rest of the GPU package and the heatsink. This is necessary because the copper plate lifts the rubber nubs off the PCB, making them ineffective as stabilizers.
This card was loud, but it cooled well thanks to XFX's clever adaptation of the stock cooling solution. The 1.6ns GDDR3 at a default clock of 600MHz is also very attractive. But none of that will matter if the card is priced dramatically higher than the current round of 6600 GT products, especially since (whether by design or by chance) Sparkle's 6600 GT with 2ns RAM also overclocked to 610MHz.
84 Comments
Bonesdad - Wednesday, February 16, 2005 - link
Yes, I too would like to see an update here...have any of the makers attacked the HSF mounting problems?
1q3er5 - Tuesday, February 15, 2005 - link
can we please get an update on this article with more cards, and replacements of defective cards? I'm interested in the gigabyte card
Yush - Tuesday, February 8, 2005 - link
Those temperature results are pretty dodgy. Surely no regular computer user would have a caseless computer. Those results are only favourable and only shed light on how cool the card CAN be, not how hot it actually is in a regular scenario. The results would've been much more useful had the temperature been measured inside a case.
Andrewliu6294 - Saturday, January 29, 2005 - link
i like the albatron best. Exactly how loud is it? like how many decibels?
JClimbs - Thursday, January 27, 2005 - link
Anyone have any information on the Galaxy part? I don't find it in a pricewatch or pricegrabber search at all.
Abecedaria - Saturday, January 22, 2005 - link
Hey there. I noticed that Gigabyte seems to have modified their HSI cooling solution. Has anyone had any experience with this? It looks much better. Comments?
http://www.giga-byte.com/VGA/Products/Products_GV-...
abc
levicki - Sunday, January 9, 2005 - link
Derek, do you read your email at all? I got a Prolink 6600 GT card and I would like to hear a suggestion on improving the cooling solution. I can confirm that the retail card reaches 95 C at full load and idles at 48 C. That is a really bad image for nVidia. They should be informed about vendors doing a poor job on cooling design. I mean, you would expect it to be way better because those cards aren't cheap.
levicki - Sunday, January 9, 2005 - link
geogecko - Wednesday, January 5, 2005 - link
Derek, could you speculate on what thermal compound is used to interface between the HSF and the GPU on the XFX card? I e-mailed them, and they won't tell me what it is?! It would be great if it were paste or tape. I need to be able to remove it and then re-install it later. I might be able to overlook not having the component video pod on the XFX card, as long as I get an HDTV that supports DVI.
Beatnik - Friday, December 31, 2004 - link
I thought I would add, regarding the dual-DVI issue, that the new NVIDIA drivers show the second DVI can be used for HDTV output. It appears that even the overscan adjustments are there.
So not having the component "pod" on the XFX card appears to be less of a concern than I thought it might be. It would be nice to hear if someone tried running 1600x1200 + 1600x1200 on the XFX, just to know if the DVI is up to snuff for dual LCD use.