Morphing nForce4 Ultra into nForce4 SLI
by Wesley Fink on January 18, 2005 7:30 AM EST
Performance: x16 vs. x16/x2 vs. x8/x8 (SLI)
The best way to verify the success of the mod was to run benchmarks. We had already done extensive testing of SLI performance in Anand's NVIDIA's GeForce 6 SLI: Demolishing Performance Barriers. To get right to the point, we tested the Ultra modded to SLI with Half Life 2, Doom 3, and Far Cry at both 1280x1024 and 1600x1200. We also benchmarked at both settings with and without the eye candy, since Anti-Aliasing and Anisotropic Filtering can exact a large hit on a single GPU. We were also interested in exactly what performance you could get with two video cards on the Ultra board before the mod to SLI, so we ran benchmarks of the x16/x2 Ultra dual-video-card mode as well.
All tests were run on a DFI LANParty UT nF4 Ultra-D and a DFI LANParty nF4 SLI-DR. We first confirmed that test results were the same on the LANParty UT modified to SLI and on the LANParty nF4 SLI-DR, which is a native SLI chipset board. There was no difference in performance after the SLI modification to the Ultra chipset, so results are reported simply as SLI and apply equally to a native SLI board or an Ultra modified to SLI.
Video cards were a single MSI 6800 Ultra PCIe or a matched pair of MSI 6800 Ultras in SLI and x16/x2 modes. Memory in all benchmarks was OCZ 3200 Platinum Rev. 2 (Samsung TCCD) at 2-2-2-10 timings. The CPU was an Athlon 64 4000+, and the power supply was an OCZ PowerStream 600.
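For readers who want to script a similar test matrix, below is a minimal sketch of how the runs could be automated in Python. The executable paths, demo names, and launch flags are placeholder assumptions, not the actual commands used in our testing; each engine has its own timedemo syntax.

```python
# A minimal sketch of scripting the benchmark matrix above. Paths, demo
# names, and launch flags are illustrative assumptions, not the actual
# commands used in testing; each engine has its own timedemo syntax.
import subprocess
from itertools import product

games = {
    "Half-Life 2": ["hl2.exe", "+timedemo", "at_demo"],     # hypothetical demo name
    "Doom 3":      ["doom3.exe", "+timedemo", "demo1"],
    "Far Cry":     ["farcry.exe", "-demo", "regulator"],    # hypothetical flag
}
resolutions = ["1280x1024", "1600x1200"]
quality = ["no AA/AF", "4xAA/8xAF"]  # "eye candy" off vs. on

for (name, cmd), res, q in product(games.items(), resolutions, quality):
    # Resolution and AA/AF level would be set in each game's config (or the
    # driver control panel) before launch; here we only run the timedemo.
    print(f"Running {name} at {res} with {q}")
    subprocess.run(cmd, check=True)
```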
In the course of testing, we found that we could actually run x16/x2 mode on either the SLI board or the Ultra board by leaving the jumpers in normal mode, using an SLI bridge across the two video cards, and enabling SLI in the nVidia driver. Results on the SLI board in x16/x2 mode were, as expected, the same as on the nF4 Ultra board as shipped or on the Ultra after SLI modification. The one huge advantage of the SLI mod was that once we had SLI-modded the Ultra chip, we could run x16/x2 mode with any nVidia Forceware driver through the 70.xx series; 70.90 was the highest driver to support x16/x2 mode, even with an SLI chip. x16/x2 would not run, however, with the most recent 71.xx drivers. The 71.xx drivers report the board to be SLI-capable, but they do not recognize the second card as an appropriate card for SLI. Clearly, nVidia must have turned off x16/x2 support in the most recent driver as well, only allowing their specified x8/x8 mode to work. We suspect that enthusiasts will find a way around this very quickly.
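To make the cutoff concrete, here is a toy sketch of the version gate the drivers appear to apply. The comparison logic and sample version strings are our illustration, not nVidia's actual code.

```python
# A toy illustration of the driver cutoff described above: ForceWare
# releases through 70.90 still allow x16/x2 mode, while the 71.xx
# series refuses it. Version format and sample numbers are assumptions.
LAST_X16_X2_DRIVER = (70, 90)

def supports_x16_x2(version: str) -> bool:
    """Return True if this ForceWare version still allows x16/x2 mode."""
    major, minor = (int(part) for part in version.split("."))
    return (major, minor) <= LAST_X16_X2_DRIVER

print(supports_x16_x2("66.93"))  # True  - older 6x.xx driver
print(supports_x16_x2("70.90"))  # True  - highest driver that works
print(supports_x16_x2("71.84"))  # False - 71.xx only allows x8/x8
```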
UPDATE: The Gigabyte 3D1 is a single video card with two 6600GT GPUs. It will only work in x8/x8 (nVidia) SLI mode on a Gigabyte SLI board. However, we did find that the 3D1 will operate in x16/x2 mode on both DFI boards with jumpers in "normal" position. We have added results to our charts for both a single 6600GT and the 3D1 in x16/x2 dual-video mode. The Gigabyte 3D1 raises the interesting possibility of a form of SLI performance on single x16-slot Ultra boards with the SLI mod.
Comments
archcommus87 - Tuesday, January 18, 2005 - link
So what can nVidia do to stop this? Can a driver change do it, or does something in the hardware need to change? I wouldn't ever do this myself, as I like having my warranties, but for all the modders, sounds awesome!
PseudoKnight - Tuesday, January 18, 2005 - link
I get the feeling Nvidia will be tweaking their drivers again. =\
bob661 - Tuesday, January 18, 2005 - link
#2 They are MUCH cheaper than that now. Zipzoomfly.com has them for $229. I'm getting mine this Wednesday.
knitecrow - Tuesday, January 18, 2005 - link
Nvidia.... that greedy money hogging .... trying to milk us customers for all we're worth. An SLI board costs like $300. If I could get SLI on a $100 nforce board, why stop it? It'll only sell more geforce cards.
I hope ATI isn't that bastardly with their cards and chipsets.
Rapsven - Monday, January 17, 2005 - link
Awesome. SLI for cheap. Though you'll still need an extremely good PSU for it.