Monday, September 21st 2015
CPU Whiz Jim Keller Leaves AMD
Jim Keller, one of the lead architects of AMD's x86 CPU architectures, has left the company. He held the post of Chief Architect of Microprocessor Cores at AMD. During his tenure, AMD launched some of its most successful CPU architectures, such as the original K7 (Athlon, Athlon XP, Duron), and the very first 64-bit x86 architecture, K8 (Athlon 64). Keller then left AMD to join Apple, where he worked on the development of the A4 and A5 SoCs, before rejoining AMD in 2012 to begin work on the "Zen" architecture.
Keller's departure doesn't throw "Zen" into jeopardy. "Jim helped establish a strong leadership team that is well positioned for success as we enter the completion phase of the 'Zen' core and associated system IP and SoCs," said AMD in a statement. "Zen" remains on track for sampling in 2016, with its "first full year of revenue" in 2017, which would indicate a market launch some time in late 2016. AMD CTO Mark Papermaster will take on Keller's responsibilities as an additional charge.
Source:
Hexus.net
68 Comments on CPU Whiz Jim Keller Leaves AMD
He always did that, project complete => leave, so nothing alarming (yet).
I wonder how much of the 40% IPC improvement promise Zen will really deliver (a rough sketch of the arithmetic follows this post). AMD's APUs got them into both the Xbox One and PS4; who is catching up with Iris?
AMD's 64-bit instruction set vs. IA-64?
AMD's "fuck long pipelines", vs, cough, Prescott?
AMD's DirectX 12 support?
HBM?
How long did it take them to respond with FreeSync? Why does it cost next to nothing (it's built into display scalers) and put no restrictions on ports, while G-Sync means one DisplayPort and nothing else?
And, FFS, you are mocking a company that manages to stay afloat against competitors with vastly more resources. Please, at least get some clue.
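As a reference point, here is a minimal sketch of what a 40% IPC uplift would mean for single-thread throughput, assuming equal clocks. The baseline IPC and the 3.5 GHz clock below are placeholder assumptions, not AMD figures:

```python
# Minimal sketch of the +40% IPC claim, assuming equal clocks.
# The baseline IPC of 1.0 and the 3.5 GHz clock are placeholder
# assumptions, not figures from AMD.

def throughput(ipc: float, clock_ghz: float) -> float:
    """Instructions retired per second = IPC x clock frequency."""
    return ipc * clock_ghz * 1e9

baseline = throughput(1.0, 3.5)   # normalized pre-Zen core
zen      = throughput(1.4, 3.5)   # promised +40% IPC at the same clock

print(f"Zen vs. baseline at equal clocks: {zen / baseline:.2f}x")  # -> 1.40x
```

The point of the sketch: the promise only fixes one factor of the product, so any clock regression on the new process would eat into the advertised 40%.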
Some people are still on first-gen Core i CPUs because of the lack of major improvements.
Although that's no wonder, considering fabs are only now about to move to 16/14 nm.
My main and only point is that the company is sinking. I know they are better off alive, but something needs to change to accomplish that.
(1) What Fable Legends shows is that neither AMD's nor Nvidia's architecture offers a home run. Games will need to be taken on a case-by-case basis depending upon what is implemented. From this analysis by Anandtech, it looks like AMD's gains in async compute are offset by their lack of hardware-based conservative rasterization and voxelization.
(3) Sounds like a tall order. For that to happen, AMD would need to sell 24.1 million more cards than Nvidia over the next four quarters. To put that into perspective, that means AMD would have to go from 18% of the market (to Nvidia's 81%) to a consistent 66%, starting from the quarter we are in at the moment (the arithmetic is sketched after this list). Just to add a little more perspective: AMD launched a whole top-to-bottom DX11 series in Q3 2009-Q1 2010, when Nvidia could not field a single DX11-capable card and relied upon a trickle of GTX 285/275s and the GTS 250 (a 9800 GTX+ by another name), yet still conspired to lose market share (AMD sold 22.51 million cards to Nvidia's 36.78 million over the three quarters before the GTX 480/470 appeared in the retail channel).
(4) See point 1. Also, the vast majority of graphics cards are sold to people who understand nothing of GPU technology, and of the tiny fraction who might actually frequent a tech site, most will look at a bar graph and call it good.
(5) It has already been established that (a) async compute isn't the be-all and end-all, and (b) for all the dire prophecy, AMD's architecture isn't destroying all comers in benchmarks. A GTX 980 Ti still compares well with a Fury X. The only real difference is price (excepting the Fury line), but since AMD has had that edge forever and still hasn't made any inroads into gaining market share, how vital is that?
(6) Nvidia's stance is that they'll provide a driver by the end of the year. I guess we'll find out whether the first of your predictions comes true on 31st December. I'll keep updating this post to let you know how many cards AMD needs to sell to reach parity with the volumes conceded after Q2 2014.
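For what it's worth, here is a back-of-the-envelope sketch of the share arithmetic in point (3). Only the 24.1 million deficit comes from the post above; the sketch treats the AIB market as a two-vendor race, and the quarterly market volumes are hypothetical assumptions, not figures from this thread:

```python
# Back-of-the-envelope check of point (3): what share of a two-vendor AIB
# market AMD would need each quarter to out-ship Nvidia by 24.1M cards over
# a year. The 24.1M figure is from the post; the market sizes are hypothetical.

UNIT_DEFICIT = 24.1e6  # cards AMD would need to sell over and above Nvidia
QUARTERS = 4

def required_share(quarterly_market: float) -> float:
    # Solve (s - (1 - s)) * quarterly_market * QUARTERS = UNIT_DEFICIT for s:
    return 0.5 + UNIT_DEFICIT / (2 * quarterly_market * QUARTERS)

for market in (10e6, 15e6, 19e6):  # hypothetical total AIB cards per quarter
    print(f"{market / 1e6:>2.0f}M cards/quarter -> AMD needs "
          f"{required_share(market):.0%} share")

# A market of roughly 19M cards/quarter is where the 66% figure lands; if the
# market is nearer 10M/quarter (as the reply below argues), AMD would need
# around 80% share instead.
```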
1) What Fable Legends shows is that AMD's cards are back in the game. More people will consider them a future-proof option, and fewer will care if they are rebrands. Having a game like Fable Legends give you a full analysis and show that it doesn't favor any architecture specifically makes it a really objective benchmark, and probably representative of what we should expect. Many were considering AotS biased. They can't say that about FL, yet the 390X still beats the 980 and the 390 beats the 970. With DX11, things were much more in Nvidia's favor.
2) Money is money. There is no difference whether it is money from CPUs or GPUs. Even if we are talking about 20-30 million more every quarter, considering AMD's financial position, that difference can have a very positive impact on R&D spending over the next 12 months. Now, if you need to see them become equal to Intel before you'll say their future is secured, then you will have to wait.
3) If in the last quarter the number of AIBs sold was less than 10 million, how do you arrive at that 24.1 million card figure? And what exactly is your target here? By taking 66% of the market for 4 quarters, where is AMD going to land after those 4 quarters? 20% market share? 40%? 60%? 80%? I think you are the one with the tall order here, set just so you can make it look impossible.
4) We are in 2015. Almost everybody has access to the internet or knows someone who does. Many people ask others who DO know about GPUs, or just register an account on a tech site, create a "What can I buy with this amount of money?" thread, get the answers, buy the card they were told to buy, and then never visit that forum again. People do look at graphs, and many just buy gigabytes. If things were as simple as you describe, the market share would have been closer to 50-50 than 20-80. That 20-80 split can happen ONLY if most people ASK before buying.
5) You need a full GM200 with its 96 ROPs to have a victory over high-end AMD cards. Compare any GM204 with 64/56 ROPs against the 64-ROP AMD high-end cards, and the AMD cards win. I like how you say here that AMD always had the edge on price but that it didn't help them enough. So could this possibly be an indication that people DO ask about GPUs before buying?
6) The end of the year. The question is, which year. Don't answer that.
A financial quarter's market share taken in isolation means little, and history tells us that even with being first to DX11, having virtually no opposition (including none at the enthusiast level aside from the GTX 295 for most of a year), and Nvidia's Fermi delay, AMD never managed to claw back more than 10% market share before dropping back. The ONLY time ATI/AMD took serious market share from Nvidia was in 2004, and that had little to do with the cards and everything to do with ATI being able to field PCI-E interface cards when the Intel 915/925X chipsets launched.
How many times has the AMD revolution been imminent? And how many times has it come up short? You can argue the future all you like, but until it happens it is moot... just like the AMD revolution. One huge flaw in your argument is that one company possesses top-of-mind brand awareness among consumers, and one doesn't. Believe it or not, people had the internet in 2014, and 2013, and 2012... check if you don't believe me. AMD has had performance parity during almost the entire modern GPU era, including a period where they owned the fastest card (HD 5970), the fastest single-GPU card (HD 5870), the best bang-for-buck card since the 8800 GT (HD 5850), Eyefinity, and full DirectX 11 compliance when its competitor had none. YET STILL LOST MARKET SHARE. Yet, for some reason - which you put down to people having internet access in 2015 - AMD is going to double its market share because it has (at best) a slight edge in DX12 and a distinct disadvantage in DX11 (where a lot of titles will reside for some time). Yeah, OK. That's why I'm going to keep a cursory eye on your predictions - I love a good come-from-behind win. Ask a hundred people whether Intel is better than AMD. Do you really think the split would accurately reflect the relative merits of the processors and platforms?
You seem fundamentally unaware of brand recognition and top-of-mind awareness. AMD have plenty of adherents and advocates on tech forums, indulge in guerrilla/viral marketing, and have PR up to the eye teeth - yet they still suffer in image (and thus in sales), in part because they lack the profile, and in part because for every step forward they tend to shoot themselves in the foot taking the step. AMD produces great hardware, but for every Fury launch and DX12 benchmark, AMD have a big Roy Taylor moment - and when sites start calling them on it, they and their more ardent fans pitch a full-scale hissy fit. It really shouldn't be a surprise that company confidence leads to a stronger customer marketing perception (and customer perception equals reality in marketing - one look at Apple should suffice as an example)... and quite frankly, AMD possesses none. Intel seldom indulge in public displays of whining, and even Nvidia seems to have learned after Jen-Hsun's "open a can of whoop-ass" moment, yet AMD persist with the plucky-underdog shtick - which just tells the average consumer that the company is giving itself a licence to fail.
Anyhow, we'll see how well in tune with the technology market we both are in due course. It is rather pointless arguing over future history.
/OUT
30% will say "Intel".
28% will say "wtf did you just say?"
2% read forums.
Polled 452 people at school. :P
Obviously I don't expect them to go much higher than 25-30%. We will have to see whether that continues with the FinFET generation, or whether Nvidia manages to gain more market share. And it's not false economy; it is reality. You can keep asking AMD to become Intel or Nvidia overnight, only so you can come to easy conclusions. Double standards and easy conclusions about AMD are really common, and boring to read. And don't make me look like someone who is struggling to push his opinion. You started it, remember? You write the big posts here.
The rest of your post is just your own conclusions and imaginings, with the usual double-standard approach. On many things you do have a point, but it is really easy to hit someone who is down. We can talk all day about Roy's stupidity, but he wasn't the one who wanted us to believe that, in the same company, engineers don't use the internet and speak a different language than the marketing department. Of course Roy will go on the bookmarks list; Nvidia's excuses were swallowed instantly.
Now if you want to make this personal, start reading and writing Greek. Then I will be happy to lose some more time arguing with the wall.
Very few average people have ever heard of AMD, and THEY are the ones who make or break a company, not the few of us with tech knowledge and interest. Obviously, that is a major problem in their strategy.
Until they fix that visibility and image problem, it doesn't matter how good their products are.
I'll say it again, and it's your choice whether to understand it or not: for the same things people here accuse AMD of, they will find plenty of excuses for Nvidia and Intel. And for every example they can show where AMD fails, they will sweep under the carpet ten equal examples where Intel or Nvidia failed. It's really boring. But it is really good to see that drivers/performance/efficiency, as the reasons for AMD's failures, have been replaced with perception/name/brand recognition.
No one is making excuses for anyone. We're just talking about reality. Until AMD can overcome the lack of name recognition, they will not recover their glory.
Your business example is a lot more believable than john_ attributing childish shenanigans to major companies.
That doesn't change the fact, though, that brand recognition is important for a company overall - just maybe not to the degree I attributed to it.
I wasn't expecting any comments about Nvidia and Intel in the smartphone market either. You have two companies with money, brand-name recognition, and high technology that could give them the edge over the competition, and they failed miserably. In the end, one company is paying manufacturers to use its chips, and the other ends up suing the competition because it cannot compete. But let's forget about that. Let's talk about AMD. Shoot...
Sometimes I wonder why I bother posting anything on this site...so many people seem to be hoping to be offended that they jump at the chance to claim they were insulted.