NVIDIA's GeForce 6 SLI: Demolishing Performance Barriers
by Anand Lal Shimpi on November 23, 2004 10:23 AM EST
Posted in: GPUs
Enabling SLI
We’ve already described the SLI setup process in our Preview of NVIDIA SLI Performance, but we will revisit it here using the ASUS A8N-SLI Deluxe board, as there are some differences.
The first step in enabling SLI is to reconfigure the PCI Express x16 lanes from the nForce4 SLI chipset into two x8 lanes; this is done by inserting the SLI card in the appropriate orientation:
Next, you plug in both PCI Express graphics cards. They must use the same GPU, but the cards can come from different manufacturers if you'd like (although matching BIOS revisions and the like is recommended).
Third, connect the two PCI Express graphics cards using the ASUS-supplied bridge PCB.
Fourth, connect the appropriate power connectors to both PCI Express graphics cards.
Fifth, connect power to ASUS’ on-board 4-pin power connector.
Finally, connect your monitor to either one of the outputs on the first PCI Express card and power up your system.
Once in Windows with the 66.93 drivers installed, you simply enable SLI mode from NVIDIA’s control panel and reboot your system. Note that only your primary graphics card’s display outputs will be active in SLI mode.
Clicking the check box requires a restart to enable (or disable) SLI, but after you've rebooted, everything is good to go.
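With SLI active, the driver presents the paired cards to applications as a single logical device, which is also why only the primary card's display outputs remain usable. If you're curious what the driver actually exposes after the reboot, a minimal Direct3D 9 sketch along the lines of the one below (our own illustration, not anything NVIDIA ships) will enumerate and name the reported adapters:

```cpp
// Minimal Direct3D 9 adapter enumeration (Windows, DirectX 9 SDK required).
// With SLI enabled, the driver should report the paired cards as one adapter.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3D 9 is not available on this system.\n");
        return 1;
    }

    UINT count = d3d->GetAdapterCount();
    printf("Adapters reported by the driver: %u\n", count);

    for (UINT i = 0; i < count; ++i) {
        D3DADAPTER_IDENTIFIER9 id;
        if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id))) {
            printf("  Adapter %u: %s (driver %s)\n", i, id.Description, id.Driver);
        }
    }

    d3d->Release();
    return 0;
}
```

Build it against the DirectX 9 SDK with any Visual C++ toolchain; in SLI mode you should see a single GeForce adapter entry rather than one per card.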
74 Comments
bob661 - Tuesday, November 23, 2004 - link
#18 The hardcore gamers would just buy new video cards.
reboos - Tuesday, November 23, 2004 - link
"Nvidia bought the patents, pending patent applications, trademarks, brand names, and chip inventory related to the graphics business of 3dfx."http://slashdot.org/articles/00/12/15/2244256.shtm...
fuzzynavel - Tuesday, November 23, 2004 - link
I think 3DFX were bought by nvidia...or at least the rights to the technology....so it is technically the same company...I remember the days of 3DFX scan line interleave....fantastic!
bob661 - Tuesday, November 23, 2004 - link
#17 Two Opterons would be downright scary if they were limited, too. But a 4000 is no slouch. :-) It's still amazing. I happen to agree with #12, but the real test of that theory would be to test slower CPUs and see how the performance scales.
reboos - Tuesday, November 23, 2004 - link
Odd as it may sound, should we be thanking 3DFX for this? http://slashdot.org/articles/00/12/15/2244256.shtm...
Gnoad - Tuesday, November 23, 2004 - link
Although SLI is exciting, I found myself wanting more info on the Asus board...
haris - Tuesday, November 23, 2004 - link
I just had some more thoughts about why SLI/multi rendering might not be such a great move by Nvidia/ATI. When they launch their next generation cards, they are expecting to rake in some extra money from the extreme gamers, right? What happens to that same card when those gamers start purchasing relatively cheap last-gen cards instead? This might then lead to something like this: in order for them to get that additional money during the beginning of the next-gen card's life cycle, they might have to slow down the production cycle of cards to give them more time in the high-end position.
Jeff7181 - Tuesday, November 23, 2004 - link
#14... why? You have TWO GPU's here... and ONE CPU. Why is it so amazing that two GPU's can put the squeeze on one CPU? Now... stick a 6800U SLI setup with a couple Opteron 250's with an application that's multi-threaded and THEN I'd be amazed if it was still CPU limited.
Aquila76 - Tuesday, November 23, 2004 - link
Or was that 330 Watts the total system usage? (doubtful)
Aquila76 - Tuesday, November 23, 2004 - link
What power supply was used in your testbed? If the SLI setup requires ~330 Watts at load, I would think you'd need around a 550W unit for your setup.
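For what it's worth, the back-of-envelope math behind a ~550W recommendation looks something like the sketch below; every figure in it is an assumed placeholder for illustration, not a measurement from the testbed:

```cpp
// Back-of-envelope PSU sizing sketch. All wattages are assumptions for
// illustration only, not measured values from AnandTech's testbed.
#include <cstdio>

int main()
{
    const double sli_at_load    = 330.0; // assumed draw of the SLI setup under load (figure from the comment above)
    const double cpu_at_load    = 90.0;  // assumed Athlon 64-class CPU draw
    const double rest_of_system = 80.0;  // assumed motherboard, memory, drives, fans
    const double headroom       = 1.10;  // margin so the PSU isn't run at its limit

    const double total = sli_at_load + cpu_at_load + rest_of_system;
    printf("Estimated system load: %.0f W\n", total);            // ~500 W with these assumptions
    printf("Suggested PSU rating:  %.0f W\n", total * headroom); // ~550 W with these assumptions
    return 0;
}
```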