Gigabyte Dual GPU: nForce4, Intel, and the 3D1 Single Card SLI Tested
by Derek Wilson on January 6, 2005 4:12 PM EST
Posted in: GPUs
Introduction
One of the first things we thought of when we heard that NVIDIA was going to try to bring back the multi-GPU craze was the single-board solution. Even back in the 3dfx days, Obsidian was ready with a single-board SLI solution. Gigabyte is hitting multi-GPU technology hard out of the gate with a single-board 6600 GT solution dubbed the 3D1. We were able to get our hands on this card and two new motherboards from Gigabyte last week for a round of holiday testing.

The two major focuses of this article will be to explore any advantages offered by the 3D1 over two-card SLI solutions, and to take a first look at the performance of the GA-8AENXP Dual Graphic, Gigabyte's Intel SLI offering. This is the 925XE version of the previously announced 915P-based Dual Graphic board.
The reader should understand this before beginning the review: these solutions are somewhat limited in application until NVIDIA changes its philosophy on multi-GPU support in its ForceWare drivers. In order to enable any multi-GPU support at all, the driver must detect an SLI-capable motherboard. This means that we had to go back to the 66.81 driver in order to test SLI on the Intel platform. It also means that even if the 3D1 didn't require a special motherboard BIOS in order to boot video, it wouldn't be able to run in SLI mode unless it were installed in an SLI motherboard.
As it stands, the optimal single-card solution can't be had until NVIDIA allows multi-GPU functionality to be enabled on motherboards without explicit SLI support. Combine that with a multi-GPU graphics card that doesn't require special BIOS hooks to POST, and we would have a truly universal single-card solution. Until then, bundling the GA-K8NXP-SLI motherboard with the 3D1 is a smart move for Gigabyte: those who want to upgrade to PCI Express and multi-GPU graphics immediately have a viable option here, getting the motherboard needed to run an SLI system and two GPUs in one package with less hassle.
For now, we are very interested in taking a look at the first of many innovations that are sure to come out of the graphics card vendors' multi-GPU R&D departments.
Comments
Gigahertz19 - Thursday, January 6, 2005 - link
1st is the worst...2nd is the best....3rd is the one with the hairy chest :)

bbomb - Thursday, January 6, 2005 - link
It seems like Nvidia just wants to make sure that none of their partners can benefit from SLI technology, to ensure that Nvidia has some new technology to introduce in the future. I bet Nvidia already has a multi-GPU card that works on any board, and can probably work in SLI with another multi-GPU card, sitting in a cabinet somewhere until Nvidia sees fit to let us get our hands on the technology.
I hope ATI's solution stomps Nvidia's into the ground, but then again, Nvidia's software team can't seem to get it right and they still blow away ATI's driver program, which leads me to believe that ATI will have driver problems as well.
HardwareD00d - Thursday, January 6, 2005 - link
yippie first post!