The RV870 Story: AMD Showing up to the Fight
by Anand Lal Shimpi on February 14, 2010 12:00 AM EST
Adjusting Trajectory & Slipping Schedule
Carrell didn’t believe in building big chips anymore. It wasn’t that doing so was too difficult; it was that it took too long for a $600 GPU to turn into a $200 GPU. AMD believed that the most important market was the performance mainstream segment, which was larger in both unit volume and revenue.
Rather than making the $200 - $300 market wait for new technology, Carrell wanted to deliver it there first and then scale the design up and down to address the more expensive and cheaper markets later.
The risk in RV770 was architecture and memory technology. The risk in RV870 was architecture and manufacturing process, the latter of which was completely out of AMD’s control.
Early on, Carrell believed that TSMC’s 40nm process wasn’t mature enough, and that when it was ready, its cost was going to be much higher than expected. While he didn’t elaborate on this at the time, Carrell told me that there was a lot of information tuning that made TSMC’s 40nm look cheaper than it ended up being. I’ll touch on this more later in the article.
Carrell reluctantly went along with the desire to build a 400+ mm2 RV870 because he believed that when engineering woke up and realized that this wasn’t going to be cheap, they’d be having another discussion.
In early 2008, going into February, TSMC started dropping hints that ATI might not want to be so aggressive in its assumptions about what 40nm was going to cost. ATI’s cost estimates might have been, at the time, a little optimistic.
Engineering came back and said that RV870 was going to be pretty expensive and suggested looking at the configuration a second time.
Which is exactly what they did.
The team met and stuck with Rick Bergman’s compromise: the GPU had to be at least 2x RV770, but the die size had to come down. ATI changed the configuration for Cypress (the high-end, single-GPU RV870) in March of 2008.
And here’s where the new ATI really showed itself. Here was a company that had decided to 1) never let schedule slip, and 2) stop designing the biggest GPU possible. Yet in order to preserve the second principle, it had to sacrifice the first.
You have to understand: changing a chip’s configuration that late in the game, 1.5 years before launch, screws everything up. By the time RV770 came out, RV870 was set in stone; any changes even a year prior to that reset a lot of clocks. You have to go back and redo the floorplan and configuration, and a lot of adjusting follows. That takes at least a couple of weeks, sometimes a couple of months, and it impacted the schedule. ATI had to work extremely hard to minimize the damage where possible. The Radeon HD 5870 was around 30 - 45 days late because of this change.
Remember ATI’s nothing-messes-with-schedule policy? It took a lot of guts on the part of the engineering team and Rick Bergman to accept a month+ hit on the RV870 redesign. If you don’t show up to the fight, you lose by default, and that’s exactly what ATI was risking by agreeing to a redesign of Cypress.
This is also super important to understand, because it implies that at some point NVIDIA made a conscious decision to be late with Fermi. ATI wasn’t the only company that knew when DX11 and Windows 7 were coming. NVIDIA was well aware of both, and it prioritized features that delayed Fermi rather than aligning with this market bulge. GPUs don’t get delayed without forewarning. AMD risked being late in order to make a smaller chip; NVIDIA risked being late to make a bigger one. These two companies are diverging.
The actual RV870
Engineering was scrambling. RV870 had to be a lot smaller yet still deliver 2x the computational power of RV770. Features had to go.
Comments
tomoyo - Monday, February 15, 2010
Another awesome article about the real situation behind the hardware from you, Anand! I was on the USS Hornet and wish I had talked to you, but it was a great time nonetheless. It's interesting to see the change in their thought process between RV770 and RV870. I hope they keep the winning streak up for the next refresh cycle (and hopefully keep hitting the market bulges).
WT - Monday, February 15, 2010
*sigh* ^^^ There's always one in the crowd.
Take comfort in the fact that you are the only person who hasn't enjoyed this read.
MegaManX4 - Monday, February 15, 2010
Reminds me very much of those Anglo-Saxon "documentaries," where WHAT is actually discussed is always of tertiary relevance, but how the responsible person "feels" about what he is seeing is of utmost interest, rather than just stating the facts. There seems to be a huge crowd clamoring for that kind of journalism. Whatever pleases the canaille.
"Jedem das Seine" or "to each his own" then
MegaManX4 - Monday, February 15, 2010
This was actually the worst article I have ever read at AnandTech. I know that you Americans always strive for emotionally driven stories, but this one outright borders on silly exaggeration. "Heroes of our Industry," what schmaltz.
Also, if one took only the real information presented in the article, it wouldn't justify even a two-page article, let alone this 11-page behemoth.
They are engineers, they do their jobs. Nothing more, nothing less.
Greetings from Germany
blowfish - Monday, February 15, 2010
Hmm, with an attitude like that you'll never get past middle management! Like most here, I loved this article. Anand obviously has the friendship and respect of some very senior players, and we were treated to some great insights into how things work at AMD/ATI.
As the reader, you can choose to read or not read the article, simple as that. Maybe you should up your medication.
MegaManX4 - Monday, February 15, 2010
unreasonable polemic
pmonti80 - Monday, February 15, 2010
You are the one being unreasonable. This may not be a "scientifically written" article, but no one is claiming it to be. And that's the reason this article is so interesting.
saiga6360 - Thursday, February 18, 2010
Apparently German engineers are just soulless robots. His confusion is understandable.
BelardA - Monday, February 15, 2010
I enjoyed this article even more than the RV770 one. I do recommend that everyone read that one too. It's kind of shocking that NVIDIA didn't use the info from the RV770 article and learn NOT to make big GPUs like the GTX 2xx. Yeah, yeah, it takes 2-4 years to design a chip.
I thank ATI (and AMD) for not playing marketing games like NVIDIA does... I think they have a bigger marketing department than engineering nowadays. It started with the GF2 MX 400 & GF4 MX cards (which were re-labeled, updated GF2 cards that were not up to GF3 standards)... but the latest cluster-muck of NVIDIA products is nothing but a mess. The 8800 was re-badged as a 9800, then re-badged into the GTS 250. Code names went from NVxx to G80 to G92 to G100. The GT 1xx products are actually low-end 9xxx products, and the same goes for much of the G200 & G300 lineup. I won't be surprised when the GTX 285 gets renamed into the GTS 450 at $200! I've seen people who bought the GTS 250 post on the internet, "Why isn't my new GTS 250 much faster than my old 8800 GT?"... because you bought a faster version of your card and thought it was something new. Wow, three years and three names for the same product. That is marketing.
ATI has done well with the entire 4000 series being DX 10.1 products and the 5000 series being DX11. (Does anyone really use HD-5xxx?) It doesn't feel like ATI is pulling our chain with their products.
AMD should be learning from ATI. They are getting better with CPUs - two years late, but AMD CPUs are now faster than Core 2 and compete well against the lower-end Intel i-confused model CPUs. There is still room for improvement, which was recommended to them some time ago, but AMD is just going to come out with a new design next year. Had AMD tweaked their CPUs a bit for another 10~20% performance, they'd be up there with the i7s.
I hope some form of physics engine is added to the next ATI GPU to go up against NVIDIA's PhysX. Perhaps that'll be part of DX12... but Microsoft no longer supports Games for Windows.
Actually, with more and more games going ONLY to consoles, I don't think high-end gaming cards will be needed anymore in the next few years. If there are no games, who needs a $300 3D gaming card?
Zink - Monday, February 15, 2010
Would also like to say: great article. I can't wait for new distributed computing cores to come out optimized for ATI's architectures.