Gigabyte Dual GPU: nForce4, Intel, and the 3D1 Single Card SLI Tested
by Derek Wilson on January 6, 2005 4:12 PM EST - Posted in GPUs
Introduction
One of the first things we thought of when we heard that NVIDIA was going to try to bring back the multi-GPU craze was the single board solution. Even back in the 3dfx days, Obsidian was ready with a single board SLI solution. Gigabyte is hitting multi-GPU technology hard out of the gate with a single board 6600 GT solution dubbed the 3D1. We were able to get our hands on this and two new motherboards from Gigabyte last week for a round of holiday testing.

The two major focuses of this article will be to explore any advantages offered by the 3D1 over two-card SLI solutions, and to take a first look at the performance of the GA-8AENXP Dual Graphic Intel SLI offering from Gigabyte. This is the 925XE version of the earlier announced 915P based Dual Graphic board.
The reader should understand this before beginning the review: these solutions are somewhat limited in application until NVIDIA changes its philosophy on multi-GPU support in ForceWare drivers. In order to get any multi-GPU support at all, the driver must detect an SLI-capable motherboard. This means that we had to go back to the 66.81 driver in order to test Intel SLI. It also means that even if the 3D1 didn't require a special motherboard BIOS in order to boot video, it wouldn't be able to run in SLI mode unless it were in an SLI motherboard.
As it stands, the optimal single card solution can't be had until NVIDIA allows multi-GPU functionality to be enabled on motherboards without explicit SLI support. Combine this with a multi-GPU graphics card that doesn't require special BIOS hooks to POST, and we have a universal single card solution. Until then, bundling the GA-K8NXP-SLI motherboard and 3D1 is a very good solution for Gigabyte. Those who want to upgrade to PCI Express and a multi-GPU solution immediately have a viable option here. They get the motherboard needed to run an SLI system and two GPUs in one package with less hassle.
For now, we are very interested in taking a look at the first of many innovations that are sure to come out of the graphics card vendors' multi-GPU R&D departments.
43 Comments
johnsonx - Friday, January 7, 2005 - link
To #19, from page 1:
"....even if the 3D1 didn't require a special motherboard BIOS in order to boot video..."
In other words, the mainboard BIOS has to do something special to deal with a dual-GPU card, or at least the current implementation of the 3D1.
What NVidia should do is:
1. Update their drivers to allow SLI any time two GPUs are found, whether they are on two boards or one.
2. Standardize whatever BIOS support is required for the dual GPU cards to POST properly, and include the code in their reference BIOS for the NForce4.
At least then you could run a dual-GPU card on any NForce4 board. Maybe in turn Quad-GPU could be possible on an SLI board.
bob661 - Friday, January 7, 2005 - link
#19: I think the article mentioned that a special BIOS is needed to run this card. Right now, only Gigabyte has this BIOS.
pio!pio! - Friday, January 7, 2005 - link
#18 use a laptop

FinalFantasy - Friday, January 7, 2005 - link
Poor Intel :(

jcromano - Friday, January 7, 2005 - link
From the article, which I enjoyed very much: "The only motherboard that can run the 3D1 is the GA-K8NXP-SLI."
Why exactly can't the ASUS SLI board (for example) use the 3D1? Surely not just because Gigabyte says it can't, right?
Cheers,
Jim
phaxmohdem - Friday, January 7, 2005 - link
ATI Rage Fury MAXX. Nuff said... lol

#6 I think you're on to something though. Modern technology is becoming incredibly power hungry, and I think that more steps need to be taken to reduce power consumption and heat production; however, with the current pixel pushing slugfest we are witnessing, FPS has obviously displaced these two worries for our beloved video card manufacturers. At some point, when consumers refuse to buy the latest GeForce or Radeon card with a heatsink taking up 4 extra PCI slots, I think that they will get the hint. I personally consider a dual slot heatsink solution ludicrous.
Nvidia, ATI, Intel, AMD... STOP RAISING MY ELECTRICITY BILL AND ROOM TEMPERATURE!!!!
KingofCamelot - Friday, January 7, 2005 - link
#16 I'm tired of you people acting like SLI is only doable with an NVIDIA motherboard, which is obviously not the case. SLI only applies to the graphics cards. On motherboards, SLI is just a marketing term for NVIDIA. Any board with 2 16x PCI-E connectors can pull off SLI with NVIDIA graphics cards. NVIDIA's solution is unique because they were able to split a 16x line and give each connector 8x bandwidth. Other motherboard manufacturers are doing 16x and 4x.

sprockkets - Thursday, January 6, 2005 - link
I'm curious to see how all those lame Intel configs by Dell and others pull off SLI, long before this mb came out.

Regs - Thursday, January 6, 2005 - link
Once again - history repeats itself. Dual core SLI solutions are still a far reach from reality.

Lifted - Thursday, January 6, 2005 - link
Dual 6800GTs???? hahahahahehhehehehahahahah. Not laughing at you, but those things are so hot that you'd need a 50 pound copper heatsink on the beast with 4 x 20,000 RPM fans running full bore just to prevent a China Syndrome.
Somebody say dual core? Maybe with GeForce 2 MX series cores.