Gigabyte Dual GPU: nForce4, Intel, and the 3D1 Single Card SLI Tested
by Derek Wilson on January 6, 2005 4:12 PM EST - Posted in
- GPUs
Doom 3 Performance
There's a 2.7% difference in frame rate between the 3D1 and the 2 x 6600 GT SLI solution at 1600x1200 under Doom 3 without AA. This performance improvement comes entirely from the 3D1's memory clock advantage over the stock 6600 GT speed. The extra 120MHz (effective) at which each GPU can run its memory helps make up for the limited bandwidth available to each chip. Right off the bat, we don't see any performance gains inherent in going with a single-card SLI solution.
With about a 5.3% performance bump, the 3D1's lead over the stock 2 x 6600 GT solution is again simply due to its 12% memory clock speed advantage. Of course, it is good to confirm that no negatives come from going with a single-card SLI solution here.
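The clock arithmetic above can be sanity-checked in a few lines. Note the 500 MHz stock memory clock below is an assumption inferred from the stated 12% bump and "extra 120MHz" of effective DDR rate; the article itself only quotes the deltas.

```python
# Back-of-the-envelope check of the clock figures quoted above.
# Assumption (not stated outright in the text): stock 6600 GT memory runs at
# 500 MHz (1000 MHz effective DDR), so a 12% bump lands at 560 MHz and an
# extra 120 MHz of effective data rate, matching the article's numbers.
stock_mhz = 500  # stock 6600 GT memory clock (assumed)
card_mhz = 560   # implied 3D1 memory clock

clock_gain = (card_mhz - stock_mhz) / stock_mhz
print(f"Memory clock increase: {clock_gain:.0%}")                    # 12%
print(f"Effective DDR increase: {2 * (card_mhz - stock_mhz)} MHz")   # 120 MHz

# Observed frame-rate gains over stock 2 x 6600 GT SLI, from the text:
for label, gain in {"1600x1200 no AA": 0.027, "5.3% case": 0.053}.items():
    # fraction of the 12% clock headroom that shows up as frame rate
    print(f"{label}: +{gain:.1%} ({gain / clock_gain:.0%} of clock headroom)")
```

The point of the last loop: even the best case recovers well under half of the 12% clock advantage as frame rate, consistent with the tests being only partially memory-bandwidth limited.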
Throughout this test, the Intel SLI solution performs very poorly, putting in numbers between one quarter and three quarters of the potential shown on the AMD platform. The fact that the Intel system is not as swift a performer under Doom 3 in general doesn't help here either, but we are working with GPU-limited tests that help to negate that factor.
43 Comments
reactor - Thursday, January 6, 2005 - link
so basically it performs the same as sli and for the same price as the sli setup, but only works with gb boards. wouldve liked to see some power/cooling comparisons and pics, although ive already seen it. in the end id rather get a 6800gt.
mkruer - Thursday, January 6, 2005 - link
Just wait, we will see Dual Core GPU's soon enough.
yelo333 - Thursday, January 6, 2005 - link
#5, #7, #9 - you've hit the nail on the head... Esp. for something like this, we need those pics!
For those who need to slake their thirst for pics, just run a google search for "gigabyte 3d1" - it turns up plenty of other review's w/ pics.
Paratus - Thursday, January 6, 2005 - link
Speedo - Thursday, January 6, 2005 - link
yea, not a single pic in the whole review...
semo - Thursday, January 6, 2005 - link
yeah, it's bad enough i can never own one
we want to see some pretty pictures!
miketheidiot - Thursday, January 6, 2005 - link
I agree with #5
wheres the pics?
pio!pio! - Thursday, January 6, 2005 - link
#4 dual core video cards in SLI on a dual core cpu dual cpu mobo w/ quad power supplies
pio!pio! - Thursday, January 6, 2005 - link
no pics of this card in the article??
Gigahertz19 - Thursday, January 6, 2005 - link
It's only a matter of time until we see dual video cards that each have dual cores in a system...
>>Homer Simpson>> ahhhhhgggggaaaahhhhhhhhh Quad GPU's :)