The Card and The Test
Well, we don't know what NV45 will be called, we don't know when it will be available, and we don't know whether these are the final clocks. But that's what we get with a preview or first look, and these speeds could change before the final product ships. Our NV45 is running at a 435MHz core and 1.1GHz memory. Had our 6800 Ultra Extreme sample from NVIDIA not been DOA (despite a couple of hours of on-site BIOS-flashing help from NVIDIA's Jim Black), these are the speeds at which it should have run. We are hearing something closer to a 460MHz core clock (1.2GHz memory) for most vendors with 6800 Ultra Extreme parts coming out, but that remains to be seen. In any event, since there's no real performance impact, we can finally bring you numbers representative of what we should have seen from a working Ultra Extreme sample. Keep in mind that these tests ran on a 3.4GHz P4 EE with DDR2 RAM, so they aren't directly comparable to the numbers we ran on the AMD Athlon 64 3400+. We have some vendors' 6800 Ultra Extreme parts coming to the lab soon enough, and perhaps before then we'll switch around our graphics test platform a little.
Here's the test platform we used.
Performance Test Configuration

Processor(s): Intel Pentium 4 3.4GHz EE Socket 775; Intel Pentium 4 3.4GHz EE Socket 478
RAM: 2 x 512MB Micron DDR2 533; 2 x 512MB Corsair 3200XL (Samsung 2-2-2-5)
Hard Drive(s): Seagate Barracuda 7200.7
Chipset Drivers: Intel Chipset Driver 6.0.0.1014; Intel Application Accelerator 4.0.0.6211
Video Card(s): nVidia NV45; nVidia GeForce 6800 Ultra PCI Express; nVidia GeForce 6800 Ultra AGP 8X; ATI Radeon X800 XT PCI Express; ATI Radeon X800 XT AGP 8X
Video Drivers: nVidia 61.45 Graphics Drivers; ATI Catalyst 4.6 beta
Operating System(s): Windows XP Professional SP1
Power Supply: HiPro 470W (Intel); Vantec Stealth 470W Aluminum
Motherboards: Intel 925XCV (Intel 925X) Socket 775; Intel D875PBZ (Intel 875P) Socket 478
And here are the numbers that we've all been waiting for.
14 Comments
Pete - Thursday, July 1, 2004 - link
Thanks much for the prompt reply, Derek.

DerekWilson - Wednesday, June 30, 2004 - link
PPro cache was a separate hunk of silicon nudged right up next to the core ;-) Search Google Images for Pentium Pro and you'll see what it looked like under there.

Pete,
#1 Not sure when that's going to happen, but it is inevitable for both companies. How we get there is the question. ;-)
#2 Our demo is the same; only the drivers have changed from our previous tests. We are looking into IQ also...
hope that helps,
Derek Wilson
Pete - Tuesday, June 29, 2004 - link
Sorry, Derek, please ignore #3. For some reason I missed that you specified the 6800U (PCIe) was a custom-clocked NV45.

Pete - Tuesday, June 29, 2004 - link
Hi Derek, can you help clarify three things?
1. I remember reading that ATi would eventually move to all-PEG-native GPUs, and then use a bridge for AGP cards, similar to nV's future plans. Did you hear anything about this from ATi or their OEMs/partners?
2. Did you change your Far Cry demo, or are new drivers responsible for the impressive gains the 6800U has made to significantly overtake the X800XT(PE)?
3. The "NV45" in your charts is simply a higher-clocked "6800U (PCIe)," right? Did you down-clock the NV45 to attain the 6800U PCIe numbers used in Anand's earlier LGA article?
KF - Monday, June 28, 2004 - link
I'm not sure how garbled a recollection can be, Minotaar. That's not the way I remember it. The PPro had the cache on separate chips in a complex package that plugged into a socket. PIIs, slot-cartridge style, had separate cache chips at first, and the first Celeron had no cache chips, true. Later PIIs and PII-style Celerons had on-die, full-speed cache. On-die, reduced-size cache came with the notable 300 MHz (not 266) Celery that OCed 1.5x by setting the bus speed to 100 instead of 66.

Back to the subject: there are some very odd results for a solution that is supposed to be, and mostly is, equal between AGP and PCIe.
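The bus-speed overclock KF describes comes down to simple multiplier arithmetic. A minimal sketch, with illustrative figures (the function name and exact values are ours, not from the thread):

```python
# Core clock = front-side bus speed x multiplier (locked on these Celerons).
def core_clock(bus_mhz: float, multiplier: float) -> float:
    return bus_mhz * multiplier

stock = core_clock(66, 4.5)   # roughly the stock ~300 MHz part
oc = core_clock(100, 4.5)     # same multiplier on a 100 MHz bus: 450 MHz
print(stock, oc)              # the ~1.5x overclock mentioned above
```

Since the multiplier was locked, raising the bus speed was the only lever, which is exactly why the 66-to-100 jump was so attractive.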
GTMan32 - Monday, June 28, 2004 - link
There was a web site reporting that NVIDIA opened up one of ATI's PCIe chips and found it wasn't native but just had the bridging on-chip, like the NV45.

Then there was another comment that the ATI PCIe chips were clocked lower because of OEM fears that they were running too hot at the same speed as the AGP parts.
One could conclude that the tacked-on AGP-to-PCIe bridge was causing this, since the die would be the same size as the AGP chip plus the extra circuits for the bridge. If the ATI solution were really native, then it shouldn't have any heating problems, right?
But was all this just a rumor? I haven't heard anything on it since.
OCedHrt - Monday, June 28, 2004 - link
A possible explanation for the drop in performance on the PCIe cards could be specific optimizations that aren't enabled for the PCIe cards in the current drivers. Just a wild guess.

ZobarStyl - Monday, June 28, 2004 - link
This bridging seems to be the perfect solution for first-generation PCI-E chips, which have nothing to gain over AGP anyway... just so long as nV doesn't get lazy and has a native PCI-E card by next gen (which might actually use the bandwidth), they really haven't lost anything with this solution. Good article.

Filibuster - Monday, June 28, 2004 - link
Minotaar,

The Pentium Pro, by today's standards, *does* have on-package cache, because it was not part of the CPU core. It runs at full CPU speed, however.
http://members.iweb.net.au/~pstorr/pcbook/images/p...
The big deal about the Pentium 2 cache was that Intel had to make it that way so they could test the cache chips separately from the CPU and thus save money, because the PPro was so expensive.
Wonga - Monday, June 28, 2004 - link
-----
Pentium Pro also had the advantage of clock speed cache, whereas P2's cache was bus speed.
-----
Well, if we want to be correct about everything, the P2's cache was not run at bus speed, but instead a fraction of the core speed. Half the core speed, to be precise.
Anyway, cheers for the review. Looks like nVidia listened to their OEM partners here and did the sensible thing by bringing the HSI on-package.
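The cache-clock distinction the commenters are debating reduces to a fraction of the core speed. A quick sketch with illustrative clock values (the numbers and function name are ours, chosen only to show the relationships):

```python
# L2 cache clock as a fraction of the CPU core clock.
def cache_clock(core_mhz: float, fraction: float) -> float:
    return core_mhz * fraction

ppro_l2 = cache_clock(200, 1.0)  # Pentium Pro: on-package L2 at full core speed
p2_l2 = cache_clock(450, 0.5)    # Pentium II: cartridge L2 at half core speed
print(ppro_l2, p2_l2)
```

This is why Wonga's correction matters: a half-core-speed cache scales up with the CPU clock, whereas a bus-speed cache would have been stuck at 66 or 100 MHz regardless of the core.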