My personal experience with AMD has been pretty miserable. As the proud owner of an ATI (acquired by AMD two years ago) video card, I've watched new drivers roll out, wondering when AMD would decide to support its own product. For a long time they even refused to release APIs so that the open-source community could aid in development. So I can see why these guys are dead in the water. Two questions:
Is there any chance of ATI not going under with them? Before they were acquired, they were good.
Second, Intel and nVidia might (probably will) simultaneously lose their most direct competitors. Is another company going to fill the niche, or might they begin vying for each other's market share?
I think ATI will be bought out by someone... maybe even Intel, given that Intel is now developing its own GPUs. On the x86 front, the only possible acquirer for AMD's CPU division seems to be IBM.
Sun also bought an x86 startup a few months back, but Sun has its own problems. Samsung was considered another contender, but Samsung won't actually get the x86 license (the license can only be transferred to US-based companies, I believe).
Or maybe AMD will survive. They need new management, and they probably need to collaborate deeply with IBM on fabs and process technology.
Despite Nvidia's sabre-rattling, it cannot compete with Intel on the CPU front, though Intel certainly has the resources to give Nvidia a fight.
AMD competed very well with Intel for years; only recently have they fallen behind, with the dominance of the Core 2 Duo chips. I wouldn't count them out just yet.
Also, why would AMD, which only just bought ATI in 2006, turn around and sell it to Intel? Makes no sense.
I'm a regular HN community member (under a different account). Coincidentally, I also work for AMD. Everything I say here is my own opinion; none of it is endorsed or backed by my employer.
From what I can tell, AMD's strategies are more long-term than some people would like. For instance, we're taking time to rewrite software when we could squeak by with just modifying it. We just reorganized my entire building. These things have lots of up-front costs: lost time, productivity, etc. But I would be surprised if they didn't pay off in the future.
Until last month, we were in a hiring/raise freeze. Most benefits were frozen (profit sharing, stock purchases, etc.). Now that stuff is being lifted. Management believes (as do most employees) that we have passed the worst of the storm.
HR and legal can be a real pain (more so than at many other companies I've worked at). They can regularly prevent real work from happening. Thankfully, they are physically separated from the hardware/software boys (literally on the other side of town). They have a dress code; we don't. They have lots of inane security policies (no camera phones, etc.); we don't.
The management style is very hands-off (at least for hardware/software boys), which is both good and bad. The "very good" developers can really shine. If you want to work on something, just do it. Rarely do you need manager approval, and if you do it's usually just one level up. I've had several ideas that I've gotten a chance to implement in the last few months that have really streamlined the manufacturing process and saved the company a lot of money.
On the other hand, average/mediocre software developers are totally lost. Upper-level management communication basically consists of elaborate ways of saying "Make more money" and "make chips fail less" and that sort of thing, things everybody should already know anyway. How to go about doing any of that is largely left up to the ingenuity of the small team or even individual developer. Great for smart people; bad for mediocre developers.
I certainly don't think AMD is in danger of collapsing or anything. It's just not as competitive at this precise moment as it was in 2001-2005. I can say that the company is treating its engineers/developers right: flexible hours, casual dress, nice facilities, all the sorts of things that lead to happy programmers/engineers and great products. Most of the employees in my building seem to own stock, so I would imagine they also think AMD has a good future.
I think an Apple + AMD merger/acquisition would make more sense than anything else in this situation. And if it's going to happen, there's no better time than now.
1. Apple's main breadwinners right now are the MacBook, MBP, and iMac. All of them are premium models and, significantly, all of them use mobile processors. Only the Mac Pro uses desktop processors. AMD's mobile line cannot currently compete with the Core 2 Duo and is only found in cheaper laptops.
I cannot see what Apple gains by acquiring AMD. And what exactly would they do afterwards? Stop buying Intel processors and instead compete with Penryn and (future) Nehalem? And once again lose out to generic Windows boxes in the performance game?
2. Apple has no chip expertise in the desktop/laptop area. They have a small in-house chip division, but it's dedicated to mobiles and handhelds. Their recent acquisition was also in this space and was fairly small (less than $50M, if I remember correctly).
3. Apple does not have the resources to compete with Intel. OTOH, they have a very good working relationship with Intel, and I don't see why they would want to disrupt that in order to focus on non-core areas.
Well, I think it would make much more sense for IBM to take over AMD. They have an entire history together, they've shared (and are still sharing) multiple technologies, and don't forget the mountain of rumors surrounding the possibility of an IBM/AMD merger: http://www.google.com/search?hl=en&q=IBM+AMD&btnG=Se...
As for my personal experience with AMD, I have had only pleasant encounters. I believe they put in quite a performance beating the bytes out of Intel in the 2001-2005 interval. That's an important achievement for AMD: staying extremely competitive for that period of time despite being a small fraction of Intel's size. But the big mistake AMD made was dedicating all their resources (money and labor) to retouching the K8 (which was basically an enhanced K7) architecture. They really should have invested in an alternative architecture (maybe an early big project with IBM?) instead of the strategy they embraced. Just my opinion.
I'm not so sure it makes sense for IBM to buy AMD either. If they are already sharing technology what incentive does IBM have to acquire AMD? Why buy the cow if you can have the milk for free?
I guess they could get into the desktop chip market, but I'm thinking if they were interested in that they would have put more effort into making Apple happy a few years ago. Would an acquisition buy them anything that would improve their position in the server market?
It seems like a struggling chip maker that's years away from having a top product again is more a liability than an asset. I don't think anyone is going to touch them with a ten-foot pole. Besides, the companies out there with the money and the know-how to take over a chip manufacturer will most likely be able to get what they want (technology deals, engineers looking to jump off a sinking ship, etc.) for a far smaller price than that of an acquisition.
IBM isn't "getting the milk for free" from AMD. IBM and AMD are sharing a lot of R&D costs, along with most of the major semiconductor manufacturers out there (Samsung, TSMC, UMC, Chartered Semi, etc.). IBM and AMD are also trading technologies: IBM provided AMD with low-k semiconductor technology, and AMD provided IBM with automation technology for IBM's fabs.
And don't forget that IBM WAS in the desktop chip market. They even had an x86 license that allowed IBM to design x86 processors, as well as manufacture them.
They tried to get back into that market with the G5, and both AMD and Intel pretty much steamrolled it, because IBM launched the G5 while AMD and Intel were hot on each other's heels (AMD led on performance at the time, Intel on price).
IBM didn't make Apple happy because IBM couldn't afford to. Back then, IBM's semiconductor division was losing (each quarter) approximately AMD's annual operating budget. IBM wasn't willing to invest in a fresh design for the G5, so they went with a cut-down implementation of POWER4, which doesn't work so well without the massive caches and buses that POWER systems get.
IBM is good at making cost-no-object systems like POWER (which only make money because the systems and services around IBM's servers are so lucrative), and custom jobs like Gekko, Cell, and Xenon. MS and Sony have second sources for Xenon and Cell (Chartered or UMC, I forget which, and Toshiba respectively), because IBM's not so good at making high-volume chips at reasonable cost.
Would it be worth it for IBM? It's hard to say, particularly since IBM is also one of Intel's biggest OEMs, and IBM no longer sells PCs.
Regardless of the readiness of IBM or any other monster in the field to take over AMD, we are talking about scenarios that make more sense. As a consumer, do you care about IBM and what they are doing? I don't.
Now, some basic facts: Apple needs (and presumably wants) to be independent; Apple's software will run nicely on AMD; AMD got 64-bit right even before Intel; having its own video cards would be a big bonus for Apple. And this is probably not all.
And the fact is that AMD is quite large (especially considering the acquisition of ATI), and even though its market share has dropped significantly, it's still a hard buy to digest. I personally don't think that Apple would make such a risky buy knowing what the ATI acquisition did to AMD itself.
It's also a fact that AMD did very well in the server market. A buy would be a win/win situation for IBM. What could possibly stand in IBM's way of re-entering the mainstream market? One lesson I've come to learn is that the bigger one's portfolio of good products, the better for one's income. Don't you agree?
Would Apple sell AMD chips to other manufacturers, or would you only be able to get them in Apple hardware? I don't think Apple is interested in selling chips. And I wouldn't want Intel to be basically the only chip manufacturer for non-Apple hardware. As a consumer I care about competition, and I think IBM would be more likely to be able to compete with Intel than Apple would.
Furthermore, it doesn't seem to make sense for Apple to buy AMD, since they (Apple; correct me if I'm wrong) seem to put more emphasis on mobile processors, and there Intel is ahead of AMD now and probably for the foreseeable future.
Why does Apple need to be independent? How would it be in Apple's best interests to take on the massive overhead of being a CPU manufacturer?
I'm not even convinced that it would be in Apple's best interests to be in the graphics hardware business either, but at least that doesn't entail owning a fab. Neither nVidia nor ATI fabricated their own chips; they contracted that work out to foundries like Chartered and UMC. Owning a fabless graphics hardware developer might work out for Apple, but I'm a bit skeptical about that. I'm also not convinced it was a wise purchase for AMD, though.
How does that make sense? Maybe I'm ignorant, but it seems Apple has a good thing going with Intel these days. Why would they want to spend a ton of money to acquire AMD only to have to compete against a current partner who has a superior product?