AMD & ATI: The Acquisition from all Points of View
by Anand Lal Shimpi on August 1, 2006 10:26 PM EST - Posted in: CPUs
AMD’s Position
Given that AMD is the one ponying up $5.4 billion for the acquisition of ATI, there had better be some incredibly good reasons motivating the investment, especially considering that AMD isn't sitting on a ton of cash at the moment. AMD is obviously extremely bullish on the move, but still vague on most details as to what it plans to do with ATI assuming the deal goes through. The majority of AMD's public statements have been aimed at reassuring the market that its intention isn't to become another Intel, that it will continue to value its partners (even those that compete with ATI) and still treat them better than Intel would.
Completing the Platform & Growing x86 Market Share
While AMD has always publicly stated that it prefers to work with its partners, rather than against them like Intel does, this move is all about becoming more like Intel. From the platform standpoint, AMD would essentially be expanding its staff to include more engineers, team leaders and product managers who could develop chipsets, with and without integrated graphics, for AMD processors. Each AMD CPU sold helps sell a great deal of non-AMD silicon (e.g. NVIDIA GPU, NVIDIA North Bridge, NVIDIA South Bridge), and by acquiring ATI, AMD would be able to offer a complete platform that keeps all of those sales in-house. It's also a lot easier to sell a customer a complete package than an individual component. Intel proved the strength of the platform with Centrino, and AMD is merely following in the giant's footsteps.
Going along with completing the platform, being able to provide a complete AMD solution of CPU, motherboard and chipset with integrated graphics could in theory increase AMD's desktop and mobile market share. According to AMD, each percentage point of x86 market share is worth about $300M in revenue. At current profit margins of around 60%, if the acquisition can increase AMD's market share by enough percentage points, the deal is a no-brainer. AMD is convinced that with a complete platform it could take even more market share away from Intel, particularly in the commercial desktop and consumer/commercial mobile markets.
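To put those numbers in perspective, here is a rough back-of-envelope payback sketch (our own illustration, not AMD's math), using the figures above: the $5.4 billion purchase price, roughly $300M of revenue per point of x86 share, and a margin of around 60%. The short Python snippet below simply asks how many years a given sustained share gain would take to earn back the purchase price.

# Back-of-envelope payback estimate for the ATI acquisition.
# Illustrative only: the figures come from the article; the model
# (constant revenue per share point, constant margin) is an assumption.
ACQUISITION_COST = 5.4e9      # $5.4 billion purchase price
REVENUE_PER_POINT = 300e6     # ~$300M revenue per point of x86 share (per AMD)
MARGIN = 0.60                 # ~60% margin cited above

def years_to_payback(extra_share_points: float) -> float:
    """Years of sustained extra share needed to recoup the purchase price."""
    annual_profit = extra_share_points * REVENUE_PER_POINT * MARGIN
    return ACQUISITION_COST / annual_profit

for points in (1, 3, 5, 10):
    print(f"{points:>2} extra point(s) of share: ~{years_to_payback(points):.0f} years to recoup $5.4B")

By this crude measure, a sustained one-point gain takes about 30 years to pay for the deal, while a five-point gain does it in roughly six - which is why AMD's optimism hinges on the platform winning meaningful share, not just a point or two.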
Step 2 in Becoming Intel: Find Something to do with Older Fabs
Slowly but surely, AMD has been following in Intel’s footsteps, aiming to improve wherever possible. We saw the first hints of this trend with the grand opening of Fab 36 in Dresden, and the more recent commitment to build a fab in New York. AMD wants to get its manufacturing business in shape, which is necessary in order to really go after Intel.
A secondary part of that strategy is having something to manufacture at older fabs before they are upgraded, which extends the value of the original investment. By acquiring ATI, chipsets and even some GPUs could be manufactured at older fabs before those fabs are transitioned to newer process technologies (e.g. making chipsets at Fab 30 on 90nm while CPUs are made at Fab 36 on 65nm).
Once the New York fab is operational, AMD could have two state-of-the-art fabs running its smallest manufacturing processes, with one lagging behind to handle chipset and GPU production. The lagging role would rotate among all three fabs, as each would be on a staggered upgrade timeline - much like how Intel manages to keep its fabs full. For example, Intel's Fab 11X in New Mexico is a 90nm 300mm fab that used to make Intel's flagship Pentium 4/D processors; now it is being transitioned to make chipsets alongside older 90nm CPUs, while newer 65nm CPUs are made at newly upgraded fabs.
Presently, AMD has no plans to change the way ATI GPUs and chipsets are manufactured. ATI's business model of using TSMC/UMC for manufacturing will not change for at least the next 1 - 2 years, after which AMD will simply do what makes sense.
What if GPUs and CPUs Become One?
If GPUs do eventually become one with CPUs, as some are predicting, then the ATI acquisition would be a great source of IP for AMD. For Intel, getting access to IP from companies like ATI isn't too difficult, because Intel has a fairly extensive IP portfolio that other companies need access to in order to survive (e.g. the Intel bus license). The two companies simply strike a cross-licensing agreement, and suddenly Intel gets what it wants while the partner gets to help Intel sell more CPUs.
AMD doesn't quite have Intel's strength in that department, but by acquiring ATI it would be fairly well prepared for merging CPUs and GPUs. The process doesn't have to be that extreme, however. Remember AMD's Torrenza announcement back at its June 2006 analyst day? Part of the strategy included putting various types of "accelerators" either in a HyperTransport slot or on-package with an AMD CPU, not necessarily on-die.
Conveniently, the "accelerator" blocks are all colored red in AMD's diagram, and you can see many areas where ATI's IP could be used. AMD could put ATI's Avivo engine in chipsets for HTPC or CE applications, or an ATI GPU could find its way into an HTX slot or onto the CPU package.
We're moving to quad-core CPUs next year, and there is definitely some debate about how useful that will truly be for the home computer user. Beyond quad cores, what do CPU manufacturers do to continue to sell product? Ramping up clock speeds is becoming more difficult, and while two cores definitely show some promise and four cores can be useful, it's really difficult to imagine a computing environment at this point where the typical user needs four or more CPU cores. Long-term, throwing more cores on the die could give way to putting a GPU into the core, and given the nearly infinitely parallel nature of graphics, it becomes much easier to make use of additional transistors. Remember that ATI and NVIDIA both have flagship products with over 300M transistors, while AMD is currently using about half that for its 2x512K X2 chips. The 4MB Core 2 is close to 300M transistors, but a large number of those are devoted to cache, and doubling cache quickly reaches the point of diminishing returns.
A Great Way of Penetrating the CE Market
Intel had a huge showing at this year's Consumer Electronics Show (CES) in Las Vegas, making very clear its intention to be a significant force in the CE market moving forward. AMD unfortunately has very little recognition or penetration in the CE market, but buying ATI would change all of that. Aside from the fact that ATI is in Microsoft's Xbox 360, a product Microsoft wants entrenched in the digital home, ATI silicon is also used in many digital televisions as well as cell phones. By acquiring ATI, AMD would gain entry into the extremely lucrative CE market.
If the world of convergence devices truly does take off, AMD's acquisition of ATI would pay off, as it would give AMD the initial exposure necessary to make further moves into the CE market.
61 Comments
johnsonx - Thursday, August 3, 2006 - link
Yep, you two are both old. Older than me. Heath H8? I didn't think selling candy bars would pay for college. You actually had to build candy bars from a kit back then? Wow. ;)
Mostly the 'kids' comment was directed at your esteemed CEO, and maybe Kubicki too (who I'm well aware is with DailyTech now), and was of course 99.9% joke. Anand may be young, but he's already accomplished a lot more than many of us ever will.
Gary Key - Thursday, August 3, 2006 - link
where is the edit button... led to
PrinceGaz - Wednesday, August 2, 2006 - link
Well, according to ATI's investor relations site and also Wikipedia, they were founded in 1985 and started by making integrated-graphics chips for the likes of IBM's PCs, and by 1987 had started making discrete graphics cards (the EGA Wonder and VGA Wonder).
Yes, they quite obviously do predate the 3D revolution by many years. VGA graphics date from 1987, and no doubt the VGA Wonder was one of the first cards supporting it. I imagine the EGA Wonder card they also made in 1987 would have had the 9-pin monitor connection you mention, as that is the EGA standard (I've never used it, but that's what the Wiki says).
All useless information today really, but a bit of history is worth knowing.
johnsonx - Wednesday, August 2, 2006 - link
Yep, I stuck quite a few EGA and VGA Wonder cards in 386's and 486's back then. They were great cards because they could work with any monitor. Another minor historical point: monochrome VGA was common in those days too - better graphics ability than old Hercules Mono, but hundreds of dollars less than an actual color monitor.
yacoub - Wednesday, August 2, 2006 - link
Your comment should get rated up b/c you correctly state that ATI has been around for some time. Let us also not forget that NVidia bought 3dfx; 3dfx did not simply disappear. And Matrox, while mostly focused on the graphic design / CAD market with their products, has also survived their forays into the gaming market with products like the G200 and G400. Perhaps something about basing your graphics card company in Canada is the trick? :)
johnsonx - Wednesday, August 2, 2006 - link
Well, 3dfx was dead. NVidia was just picking at the carcass. Matrox survives only because they make niche products for professional applications. Their 3D products (G200/G400/G450, Parhelia) were hotly anticipated at the time, but quickly fell flat (late to market, surpassed by the competition by the time they arrived, or very shortly after).
mattsaccount - Wednesday, August 2, 2006 - link
>>NVIDIA also understands that dining with Intel is much like dining with the devil: the food may be great but you never know what else is cooking in the kitchen.
The food in Intel's cafeteria is actually quite good :)
stevty2889 - Wednesday, August 2, 2006 - link
Not when you work nights... it really sucks then.
dev0lution - Thursday, August 3, 2006 - link
But the menu changes so often you don't get bored ;)
NMDante - Wednesday, August 2, 2006 - link
Night folks get shafted with cafe times. That's probably why there are so many 24 hr. fast food offerings around the RR site. LOL