Galaxy
We haven't had a Galaxy card in our labs before, and we were happy to see a card with a custom round HSF attached very firmly after all the loosely attached rectangular solutions that we had the pleasure of handling. The springs on the hardware are much tighter than on the other solutions that we've seen, and the shape of the heatsink doesn't give us enough leverage to pop it off the GPU when we press on it. Though it is still possible to twist the HSF around the axis created by the two spring pin mounts, it is something that a consumer would have to try deliberately to do.

Since this card has no shroud on its fan, it will also sound different than some of the other solutions that we've seen. Shrouds can help direct airflow on larger heatsinks, but since this cooling solution doesn't try to cool the RAM as well as the core, a small circular design is fine.
We were happy with this card's ability to stay cool at idle and under load. The only downside that we saw was the board's overclocking headroom. Galaxy sent us this part pre-overclocked to 525/550, a 5% core and 10% memory bump over the reference 500/500 clocks. We originally thought this was a press sample card, as the packaging materials that came with the box indicated a 500/500 clock speed. As it turns out, Galaxy has informed us that all of their 6600 GT products are shipping at 525/550 clock speeds. In the end, this part was our worst overclocker, which is a surprising result for our coolest-running part. It really could have been the luck of the draw, with the worst-case GPU and worst-case RAM configuration of the bunch landing on one board, but that does seem like an awful bit of luck.
If you aren't planning to push the card beyond its shipping speeds, this is absolutely a wonderful 6600 GT option. For someone who needs good cooling and low noise in their case, Galaxy is top notch. Since this product comes with added performance at no extra hassle, it's definitely a top pick in our books.
Comments
Bonesdad - Wednesday, February 16, 2005
Yes, I too would like to see an update here...have any of the makers attacked the HSF mounting problems?

1q3er5 - Tuesday, February 15, 2005
Can we please get an update on this article with more cards, and replacements of defective cards? I'm interested in the Gigabyte card.
Yush - Tuesday, February 8, 2005
Those temperature results are pretty dodgy. Surely no regular computer user would have a caseless computer. Those results are only favourable and only shed light on how cool the card CAN be, not how hot it actually runs in a regular scenario. The results would've been much more useful had the temperature been measured inside a case.

Andrewliu6294 - Saturday, January 29, 2005
I like the Albatron best. Exactly how loud is it? Like, how many decibels?

JClimbs - Thursday, January 27, 2005
Anyone have any information on the Galaxy part? I don't find it in a Pricewatch or PriceGrabber search at all.

Abecedaria - Saturday, January 22, 2005
Hey there. I noticed that Gigabyte seems to have modified their HSI cooling solution. Has anyone had any experience with this? It looks much better. Comments?
http://www.giga-byte.com/VGA/Products/Products_GV-...
levicki - Sunday, January 9, 2005
Derek, do you read your email at all? I got a Prolink 6600 GT card and I would like to hear a suggestion on improving the cooling solution. I can confirm that the retail card reaches 95 C at full load and idles at 48 C. That is a really bad image for nVidia. They should be informed about vendors doing a poor job on cooling design. I mean, you would expect it to be way better, because those cards aren't cheap.
geogecko - Wednesday, January 5, 2005
Derek, could you speculate on what thermal compound is used to interface between the HSF and the GPU on the XFX card? I e-mailed them, and they won't tell me what it is?! It would be great if it were paste or tape. I need to be able to remove it and then re-install it later. I might be able to overlook not having the component video pod on the XFX card, as long as I get an HDTV that supports DVI.

Beatnik - Friday, December 31, 2004
I thought I would add, regarding the dual-DVI issue: the new NVIDIA drivers show that the second DVI port can be used for HDTV output. It appears that even the overscan adjustments are there.
So not having the component "pod" on the XFX card appears to be less of a concern than I thought it might be. It would be nice to hear if someone tried running 1600x1200 + 1600x1200 on the XFX, just to know if the DVI is up to snuff for dual LCD use.