NVIDIA SLI Performance Preview with MSI's nForce4 SLI Motherboard
by Anand Lal Shimpi on October 29, 2004 5:06 AM EST - Posted in GPUs
Setting up SLI
NVIDIA's nForce4 SLI reference design calls for a slot to be placed on the motherboard that controls how many PCI Express lanes go to the second x16 slot. Remember that despite there being two x16 slots on the motherboard, only 16 lanes total are allocated to them at most - meaning that each slot is electrically still only a x8, but with a physical x16 connector. While a x8 connection means that the slots have less bandwidth than a full x16 implementation, the real-world performance impact is essentially nothing. In fact, gaming performance doesn't really change down to even a x4 configuration, and even the drop to a x1 configuration is negligible.
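To put those link widths in numbers, here is a minimal Python sketch (our own illustration; the article contains no code) using the theoretical 250 MB/s per-lane, per-direction rate of first-generation PCI Express:

```python
# Theoretical one-way bandwidth of a first-generation PCI Express link
# at the widths discussed above (250 MB/s per lane, per direction).
PCIE1_MB_PER_LANE = 250

for lanes in (16, 8, 4, 1):
    print(f"x{lanes}: {lanes * PCIE1_MB_PER_LANE} MB/s per direction")

# Prints 4000, 2000, 1000 and 250 MB/s for x16, x8, x4 and x1 respectively;
# even x8 leaves far more bus bandwidth than 2004-era games actually use.
```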
The SLI card slot looks much like a SO-DIMM connector:
The card itself can be inserted in two ways: installed in one direction, it configures the PCI Express lanes so that only one of the slots is a x16; in the other direction, the 16 PCI Express lanes are split evenly between the two x16 slots. You can run a single graphics card in either mode, but in order to run a pair of cards in SLI mode you need the latter configuration. There are ways around NVIDIA's card-based design to reconfigure the PCI Express lanes, but none of them to date is as elegant, since the alternatives require a long row of jumpers.
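As a rough mental model of what flipping the paddle card does, here is a hypothetical sketch; the orientation names, and the assumption that the second slot receives no lanes in single-card mode, are ours rather than anything from NVIDIA's documentation:

```python
# Hypothetical model of the SLI paddle card: one orientation routes all
# 16 lanes to the first slot, the other splits them 8/8 across both
# physical x16 slots. The second-slot value in "single" mode is an
# assumption for illustration only.
def lane_routing(orientation: str) -> dict:
    if orientation == "single":
        return {"slot1_lanes": 16, "slot2_lanes": 0}
    if orientation == "sli":
        return {"slot1_lanes": 8, "slot2_lanes": 8}  # required for SLI
    raise ValueError("unknown paddle orientation")

print(lane_routing("sli"))  # {'slot1_lanes': 8, 'slot2_lanes': 8}
```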
With two cards installed, a bridge PCB is used to connect the golden fingers atop both of the cards. Only GeForce 6600GT and higher cards will feature the SLI-enabling golden fingers, although we hypothesize that nothing has been done to disable SLI on the lower-end GPUs other than a non-accommodating PCB layout. With a little engineering effort, we believe that the video card manufacturers could come up with a board design that enables SLI on both 6200 and non-GT 6600 cards. We've talked to manufacturers about doing this, but we'll have to wait and see what comes of their experiments.
As far as board requirements go, the main thing to make sure of is that both of your GPUs are identical. Their clock speeds don't have to match, as NVIDIA's driver will set the clocks on both boards to the lowest common denominator. Combining different GPU types (e.g. a 6600GT and a 6800GT) is not recommended; doing so may still be allowed, but it can produce some rather strange results in certain cases.
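A quick sketch of the "lowest common denominator" clock behavior described above; the data structures, field names, and clock values here are hypothetical examples, not NVIDIA driver internals or official specs:

```python
# Illustrative only: the driver runs both boards at the lower of the two
# core and memory clocks when the cards are clocked differently.
def sli_effective_clocks(card_a: dict, card_b: dict) -> dict:
    assert card_a["gpu"] == card_b["gpu"], "SLI expects identical GPU types"
    return {
        "core_mhz": min(card_a["core_mhz"], card_b["core_mhz"]),
        "mem_mhz": min(card_a["mem_mhz"], card_b["mem_mhz"]),
    }

stock = {"gpu": "GeForce 6600GT", "core_mhz": 500, "mem_mhz": 1000}
factory_oc = {"gpu": "GeForce 6600GT", "core_mhz": 525, "mem_mhz": 1050}
print(sli_effective_clocks(stock, factory_oc))
# {'core_mhz': 500, 'mem_mhz': 1000}
```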
You only need to connect a monitor to the first PCI Express card; despite having two graphics cards, only the video outputs on the first card will work, so anyone wanting both a quad-display setup and SLI is somewhat out of luck. I say somewhat because if you toggle off SLI mode (a driver option), the two cards work independently and you could have a 4-head display configuration. With SLI mode enabled, however, the outputs on the second card go blank. While that's not too inconvenient, you currently need to reboot between SLI mode changes, which could get annoying for those who only want to enable SLI for games and use 4 display outputs while not gaming.
We used a beta version of NVIDIA's 66.75 drivers with SLI support enabled for our benchmarks. The 66.75 driver includes a configuration panel for Multi-GPU as you can see below:
Clicking the check box requires a restart to enable (or disable) SLI, but after you've rebooted everything is good to go.
We mentioned before that the driver is very important to SLI performance. The reason is that NVIDIA has implemented several SLI algorithms in the driver to determine how to split up rendering between the graphics cards depending on the application and load. For example, in some games it may make sense for one card to handle a certain percentage of the screen while the other card handles the remaining percentage, while in others it may make sense for each card to render a separate frame. The driver will alternate between these algorithms, and even disable SLI altogether, depending on the game. The other important thing to remember is that the driver is also responsible for the rendering split between the GPUs; each GPU rendering 50% of the scene doesn't always work out to an evenly split workload between the two, so the driver has to estimate what rendering ratio would put an equal load on both GPUs.
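To make the two approaches concrete, here is a simplified sketch of split-frame rendering with a load-balancing step, alongside alternate-frame rendering; the balancing heuristic and function names are our own assumptions for illustration, not NVIDIA's actual algorithm:

```python
# Simplified sketch of the two SLI rendering modes described above.
def split_frame(frame_height: int, split: float) -> tuple:
    """Split-frame rendering: GPU 0 takes the top portion, GPU 1 the rest."""
    top = int(frame_height * split)
    return (0, top), (top, frame_height)

def rebalance(split: float, gpu0_ms: float, gpu1_ms: float, step: float = 0.02) -> float:
    """Shift work toward whichever GPU finished its slice faster (assumed heuristic)."""
    if gpu0_ms > gpu1_ms:
        return max(0.1, split - step)  # GPU 0 is overloaded: shrink its slice
    return min(0.9, split + step)

def alternate_frame(frame_number: int) -> int:
    """Alternate-frame rendering: even frames to GPU 0, odd frames to GPU 1."""
    return frame_number % 2

split = 0.5
split = rebalance(split, gpu0_ms=9.0, gpu1_ms=7.0)  # top half was heavier last frame
print(split_frame(1024, split))  # ((0, 491), (491, 1024))
print(alternate_frame(7))        # 1: GPU 1 renders frame 7
```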
84 Comments
ImJacksAmygdala - Friday, October 29, 2004 - link
Thanks for the article. I think I will skip the nForce3 and nForce4 boards. I hear that there will even be HT problems with the nForce4 A03 silicon, and I don't feel like rolling the dice with any other problems.
I'm not sold on SLI anymore either. I have the cash for it, but I'm weighing the extra cost of 2 high-end cards against just getting the latest and greatest every 1.5 to 2 years. I'm concerned about the extra heat and noise as well.
I would have much rather had Sound Storm than SLI. I think I will just wait and see if a Dolby Live 5.1 encoding sound solution shows up before I upgrade to an AMD64 system. Intel has Dolby Live 5.1 encoding, so maybe Creative will soon too.
Lord Banshee - Friday, October 29, 2004 - link
Can you please test SPECviewperf 7.1.1 or above with the next SLI mobo you test? A lot of us 3D modelers want to know if SLI will benefit.
CrystalBay - Friday, October 29, 2004 - link
GJ Anand, you scooped everyone (other review sites) again... :)
bob661 - Friday, October 29, 2004 - link
mrdudesir: See #34.
mrdudesir - Friday, October 29, 2004 - link
I don't get why everyone is bitching about the added cost for people who don't want it. There is no added cost if you don't want SLI. Just buy a board based on the NF4 Ultra chipset. It's the exact same chipset, just with no SLI. In fact, if anything, SLI lowers the price because it leaves a new top-of-the-line chipset so that the NF4 Ultra doesn't have to be the absolute best, and hence it is cheaper.
nserra - Friday, October 29, 2004 - link
I already had a dual Voodoo2 SLI setup, and besides the extra speed (and not always), it gave me nothing more. This is not that brilliant:
1st - Need motherboard support and a special/specific one (voodoo2 didn’t)
2nd - Doesn't bring any new features besides extra speed (play at 1280x1024 instead of 1024x768?)
3rd - More heat and power requirement.
4th - The driver must support the game (I don’t know if voodoo2 also needed this)
5th - It will prolong your PC's life how? Does the SLI 6600GT have the same functionality/features as future products (NV50)? I don't think so.
6th – Price, price, price …..
7th – Voodoo2 also had a version of SLI on a single board, a much cleverer solution for the time, since every board would accept it.
8th - I bet there will be incompatible games (Voodoo2 had to disable SLI in some games in order to work/play)
….
Reflex - Friday, October 29, 2004 - link
#35: If you do not wish to use the second slot for graphics, it is still a fully functioning PCI Express slot you can use for *anything* else, so it is not wasted board space at all.
Reflex - Friday, October 29, 2004 - link
#9: There will be no add-in SoundStorm solution. The group that developed that technology at nVidia has been dissolved and moved on to other projects. Just as well, it was not a quality solution anyways.
bob661 - Friday, October 29, 2004 - link
The hardware does exist. You can buy 6600GT's right now on Newegg.
haris - Friday, October 29, 2004 - link
SLI is an option on the motherboard. Great. SLI might work because of the driver, but doesn't the hardware have to exist for the feature to be used in the driver? What if Nvidia/ATI have to use up valuable board space for a feature that will only be used by high-end users? That means everyone else is paying for a feature that they don't want or will never use. I don't like the idea that I might be paying extra for my card because one person out of ten thousand (or whatever the % of high end to average users is) wanted that feature.