The RV870 Story: AMD Showing up to the Fight
by Anand Lal Shimpi on February 14, 2010 12:00 AM EST - Posted in GPUs
The Other Train - Building a Huge RV870
While the Radeon HD 5800 series just launched last September, discussions of what the GPUs would be started back in 2006.
Going into the fall of 2007, ATI had a rough outline of what the Evergreen family was going to look like. ATI was well aware of DirectX 11 and Microsoft’s schedule for Windows 7. They didn’t know the exact day it would arrive, but they knew roughly when they had to be ready. This was going to be another one of those market bulges that they had to align themselves with. Evergreen had to be ready by Q3 2009, but what would it look like?
Carrell wanted another RV770. He believed in the design he had proposed earlier; he wanted something svelte and affordable. The problem, as I mentioned earlier, was that RV770 had no credibility internally. This was 2007; RV770 wouldn’t hit for another year, and even on the day the first reviews went live there were skeptics within ATI.
Marketing didn’t like the idea of building another RV770. No one in the press liked R600 and ATI was coming under serious fire. It didn’t help that AMD had just acquired ATI and the CPU business was struggling as well. Someone had to start making money. Ultimately, marketing didn’t want to be on the hook two generations in a row for not being at the absolute top.
It’s difficult to put PR spin on why you’re not the fastest, especially in a market that traditionally rewards the kingpin. Marketing didn’t want another RV770, they wanted an NVIDIA killer. At the time, no one knew that the 770 would be an NVIDIA killer. They thought they just needed to build something huge.
AMD's new GPU strategy...but only for the RV770
From August through November 2007, Carrell Killebrew came very close to quitting. The argument to build a huge RV870 because NVIDIA was going to build a huge competitor infuriated him. It was the exact thinking he fought so hard against just a year earlier with the RV770. One sign of a great leader is someone who genuinely believes in himself. Carrell believed his RV770 strategy was right. And everyone else was trying to get him to admit he was wrong, before the RV770 ever saw the light of day.
Even Rick Bergman, a supporter of Carrell’s in the 770 design discussions, agreed that it might make sense to build something a bit more aggressive with 870. It might not be such a bad idea for ATI to pop their heads up every now and then. Surprise NVIDIA with RV670, 770 and then build a huge chip with 870.
While today we know that the small-die strategy worked, at the time ATI was actually doing the sensible thing by not betting on another RV770. If you’re already taking one huge risk, is there any sense in taking another? Or do you hedge your bets? The former is considered juvenile; the latter, levelheaded.
Carrell didn’t buy into it. But his options were limited. He could either quit, or shut up and let the chips fall where they may.
A comparison of die sizes - to scale.
What resulted was sort of a lame compromise. The final PRS was left without a die size spec. Carrell agreed to make the RV870 at least 2x the performance of what they were expecting to get out of the RV770. I call it a lame compromise because engineering took that as a green light to build a big chip. They were ready to build something at least 20mm on a side, probably 22mm after feature creep.
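To put those dimensions in perspective, here is a back-of-the-envelope calculation (my own arithmetic, not ATI's figures; it assumes square dies and uses the commonly cited ~256mm² area for RV770):

```latex
% Rough die-area arithmetic (assumes square dies; RV770's ~256 mm^2 is an approximate, commonly cited figure)
\[
  20\,\mathrm{mm} \times 20\,\mathrm{mm} = 400\,\mathrm{mm}^2, \qquad
  22\,\mathrm{mm} \times 22\,\mathrm{mm} = 484\,\mathrm{mm}^2
\]
\[
  \frac{400}{256} \approx 1.6, \qquad \frac{484}{256} \approx 1.9
\]
% In other words, "20-22mm on a side" meant a chip roughly 1.6-1.9x the area of RV770.
```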
132 Comments
Spoelie - Thursday, February 18, 2010 - link
phoronix.com for all things ATi + Linux
SeanHollister - Monday, February 15, 2010 - link
Fantastic work, Anand. It's so difficult to make pieces like this work without coming across as puffery, but everything here feels genuine and evenhanded. Here's hoping for similar articles featuring individuals at NVIDIA, Intel and beyond in the not-too-distant future.
boslink - Monday, February 15, 2010 - link
Just like many others I'm also reading/visiting anandtech for years, but this article made me register just to say damn good job. Also, it's been a long time since I read an article cover to cover. Usually I read the first page and maybe the second (enough to guess what's in the other pages) and then skip to the conclusions.
But this article reminds us that a graphics card/chip is not only silicon. The real people story is what makes it great.
Thanks Anand
AmdInside - Monday, February 15, 2010 - link
Great article as usual. Sunspot seems like the biggest non-factor in the 5x00 series. Except for hardware review sites which have lots of monitors lying around, I just don't see a need for it. It is like NVIDIA's 3D Vision: the concept sounds good, but in general practice it is not very realistic that a user will use it. Just another checkbox that a company can point to for an OEM and say we have it and they don't. NVIDIA has had an Eyefinity-like capability for a while (SLI Mosaic). It just is very expensive since it is targeted towards businesses and not consumers, and it offers some features Eyefinity doesn't offer. I think NVIDIA just didn't believe consumers really wanted it but added it afterwards just so that ATI doesn't have a checkbox they can brag about. But NVIDIA probably still believes this is mainly a business feature.
It is always interesting to learn how businesses make product decisions internally. I always hate reading interviews of PR people. I learn zero. Talk to engineers if you really want to learn something.
BelardA - Tuesday, February 16, 2010 - link
I think the point of Eyefinity is that it's more hardware based and natural... not requiring so much work from the game publisher. A way of having higher screen detail over a span of monitors. A few games will actually span 2 or 3 monitors, or some will use the 2nd display as a control panel. With Eyefinity, it tells the game "I have #### x #### pixels", auto-divides the signal onto 3 or 6 screens, and stays playable. That is quite cool.
But as you say, it's a bit of a non-factor. Most users will still only have one display to work with. Hmmm, there was a monitor that was almost seamless, 3 monitors built together - where is that?
Also, I think the TOP-SECRET aspect of Sun-Spots was a way of testing security. Eyefinity isn't a major thing... but the hiding of it was.
While employees do move about in the business, the sharing of trade-secrets could still get them in trouble - if caught. It does happen, but how much?
gomakeit - Monday, February 15, 2010 - link
I love these insightful articles! This is why Anandtech is one of my favorite tech sites ever!
Smell This - Monday, February 15, 2010 - link
Probably could have done without the snide reference to the CPU division at the end of the article - it added nothing and detracted from the overall piece. It also implies a symbiotic relationship between AMD's 40+ year battle with Chipzilla and the GPU wars with nV. Not really an accurate correlation. The CPU division has their own headaches.
It is appropriate to note, however, that both divisions must bring their 'A' Game to the table with the upcoming convergence on-die of the CPU-GPU.
mrwilton - Monday, February 15, 2010 - link
Thank you, Anand, for this great and fun-to-read article. It really has been some time since I last read an article cover to cover. Keep up the excellent work.
Best wishes, wt
Ananke - Monday, February 15, 2010 - link
I have a 5850; it is a great card. However, what people are saying about PC gaming is true - gaming on PC is slowly fading towards consoles. You cannot justify a several-thousand-dollar PC versus a 2-300 multimedia console. Such a powerful GPU is a supercomputer by itself. Please ATI, make a better Avivo transcoder and push open software development using Stream further. We need many applications, not just Photoshop and Cyberlink. We need hundreds, many of them free, to utilize this calculation power. Then it will make sense to use these cards.
erple2 - Tuesday, February 16, 2010 - link
Perhaps. However, this "PC gaming is being killed off by the 2-300 multimedia console" war has been going on since the PlayStation 1 came out. PC gaming is still doing very well. I think that there will always be some sort of market (even if only 10% - that's significant enough to make companies take notice) for PC gaming. While I still have to use the PC for something, I'll continue to use it for gaming as well.
Reading the article, I find it poignant that the focus is on //execution// rather than //ideas//. It reminds me of a blog post written by Jeff Atwood (http://www.codinghorror.com/blog/2010/01/cultivate... if you're interested) about the exact same thing. Focus on what you //do//. Execution (i.e. "what do we have an 80%+ chance of getting done on time") is more important than the idea (i.e. features you can claim on a spec sheet).
As a hardware developer (goes the same for any software developer), your job is to release the product. That means following a schedule. That means focusing on what you can do, not on what you want to do. It sounds to me like ATI has been following that paradigm, which is why they seem to be doing so well these days.
What's particularly encouraging about the story is that management had the foresight to actually listen to the technical side when coming up with the schedules and requirements. That, in and of itself, is something a significant number of companies just don't do well.
It's nice to hear from the internal wing of the company from time to time, and not just the glossy presentation of hardware releases.
I for one thoroughly enjoyed the read. I liked the perspective that the RV5-- err Evergreen gave on the process of developing hardware. What works, and what doesn't.
Great article. Goes down in my book with the SSD and RV770 articles as some of the best IT reads I've done.