GeForce GTX 285: 55nm Enabling Higher Performance
by Derek Wilson on January 15, 2009 9:00 AM EST - Posted in GPUs
Smaller Die + More Performance = More Power
Power isn't going to be straightforward here, as this is both a die shrink and an overclock. If all other things were equal, the die shrink would have enabled some power savings, but increasing the clock speeds (and likely voltages) means that we have factors working against each other. As for which will win, let's take a look at the data and find out.
Since we didn't take a look at power in our GeForce GTX 295 article, we'll keep an eye on that card as well. Also, keep in mind that 55nm GTX 260s have slowly been phased in, but our GTX 260 parts are 65nm. The 55nm GTX 260s will show a power advantage over similarly clocked 65nm GTX 260s.
Idle power shows that NVIDIA is able to get some power savings when nothing is going on with the GPU. Power draw at idle decreased by about 10W with the move to 55nm, which shows that, in addition to NVIDIA's power saving features, the die shrink does help. This advantage carries over to SLI as well, with the GTX 285 SLI landing between the two single-card dual-GPU systems.
The GeForce GTX 295 slides in just above the single GPU 4870 1GB while AMD's 4870 X2 consumes about 10W more than NVIDIA's higher performing dual-GPU card.
We see a different story when we look at load power. In spite of the die shrink, the added overclock pushes the GeForce GTX 285 higher under load than any other single-GPU part. With SLI enabled, this becomes the most power hungry dual-card setup we tested.
As for the GeForce GTX 295, we once again see good performance with lower power draw than the Radeon HD 4870 X2 and, in fact, less power draw than all the other dual-GPU setups we tested.
While a half node die shrink isn't the holy grail of power savings, the major advantage for NVIDIA comes from the die size decrease. We don't have measurements on the GPU after the shrink (we don't want to tear apart our hardware until we've tested things like 3-way SLI), but with the massive size of GT200 and the heavy price cuts NVIDIA was forced to make shortly after launch, the cost savings is a very important factor in this move.
NVIDIA needs to keep its price competitive and that means it needs to keep its costs down. Building an overclocked GTX 280 helps raise the price while building the parts at 55nm helps lower the cost. NVIDIA wants this card to be successful.
76 Comments
Kroneborge - Thursday, January 15, 2009 - link
I'd like to second the request for info on sound levels. I do music production, AND play games on my computer. So it's important to be able to find a nice balance. I know some people don't care if their cards are loud, but there are many others that do. Thanks,
mczak - Thursday, January 15, 2009 - link
I'm not sure those power consumption figures are relevant. You're measuring an overclocked card. Even though you've downclocked it to standard speeds, it could well be set up to run at slightly higher voltages to guarantee stable operation at the overclocked frequencies without the manufacturer having to do much further qualification.
gungan3 - Thursday, January 15, 2009 - link
Oh, and the GTX 285 has only 2 x 6-pin PCIe connectors, while the GTX 280 had one 6-pin and one 8-pin connector.
gungan3 - Thursday, January 15, 2009 - link
Yes, I would have really liked to see some tests with the 4850 X2 as well. At a $299 price point for the 2x1GB version, it should offer higher performance than a GTX 280/285. You could throw in 9800 GTX+ SLI as well; it should probably smoke its own brother too. Also, why oh why are there no tests on fan noise and GPU temperatures? Those would be very useful to consumers. Another test could be case temperature, which would be a big help to buyers of the GTX 295, which dumps hot air inside the case itself. How about overclocking tests? No time for those as well?
And some more insight into the actual changes in hardware would also be appreciated. Pictures of the front of the PCB and the cooling system would be helpful. To quote from your review: "The hardware looks the same as the current GeForce GTX 280. There really isn't anything aside from the GPU that appears different (except the sticker on the card that is)"
Might I point out that, as is the case with the 55nm GTX 260 (as well as the GTX 295), all the memory chips are now on the front of the card, as opposed to the original PCBs which had memory on both sides and thus required more layers in the PCB (afaik 16 layers as opposed to 12). Possibly some changes in power/memory voltage circuitry as well. Was that too hard to notice?
Daeros - Thursday, January 15, 2009 - link
I was just noticing something about several gfx card reviews I have seen here lately: the lack of CF results to compare with SLI. Of course the top of the chart is full of Nvidia cards when you don't test any multi-card solutions from ATI. I know the new test platform supports this, so I really don't understand the reasoning. Also, there is an excellent competitor for the GTX285, the 4850x2. It comes with 2x1GB GDDR3, so it will be slightly stronger than two standard 4850's in CF, and Newegg has them for $299 w/ free shipping.
Goty - Thursday, January 15, 2009 - link
Why include Crossfire results when you have the 4870X2 in the mix? It's nearly identical to two 4870s in a Crossfire configuration, so there's no need to run another set of benchmarks if you're going to get the same numbers.
Daeros - Thursday, January 15, 2009 - link
My point is that the 4870x2 is designed to compete with the GTX280/285 cards from Nvidia. All I was saying was that it would be nice to have multi-card comparisons for both brands at similar price points (i.e. GTX285 SLI = $760, GTX280 SLI = $650, 4870x2 CF = $860, 4870x2 + 4870 (1GB) = $670, 4850x2 CF = $600). So why not test a couple more cards in similar brackets and give more useful, fully fleshed-out reviews?
elerick - Thursday, January 15, 2009 - link
I believe the price wars between AMD and Nvidia are going to be good for consumers. I can't wait to see the new pricing for the GTX 280 with these rolling out. Glad to see performance increases this early in the year.
Stonedofmoo - Thursday, January 15, 2009 - link
Really, I'm bored of reading about top-end parts eating hundreds of watts of power. I'd really like to see the GT200 technology migrated to midrange parts. In the UK we have a situation where Nvidia does not have one single competitive part for sale between £150-200. The GTX 260s are all above £200, and the GeForce 9 series parts are not worth considering when you see how much faster the ATI 48xx cards are in that price range.
Nvidia really needs to forget the race for top-performance cards that eat power for breakfast, and start taking note that not everyone wants the most powerful card; some of us are looking for the new 8800GT of this generation...
Goty - Thursday, January 15, 2009 - link
I think you hit the nail on the head when you said the only benefit to the end user from the GTX285 is that it will drop the price on the GTX280, Derek. You get more performance, but it still slots into exactly the same spot performance-wise: faster than the GTX260/HD4870, slower than the 4870X2. Add in the fact that there are no power savings and you've got a pointless product, aside from the fact that it saves NVIDIA a little money. As for the review itself, why only results using 4xAA? I'd like to see how performance falls off with 8xAA vs the HD4870 and see if the marginally increased clock speeds help at all in that department.