Wednesday, September 12th 2018
AMD CEO Speaks with Jim Cramer About the "Secret Sauce" Behind its Giant-Killing Spree
Jim Cramer of CNBC's Mad Money interviewed AMD CEO Dr. Lisa Su on the floor of the NYSE, calling her company one of the year's biggest tech turnaround stories. The two spoke about a variety of topics, including how the company went from a single-digit stock and a loss-making entity to one of the hottest tech stocks, one that threatens both Intel and NVIDIA. Dr. Su placed emphasis on taking long-term strategic decisions that bear fruit years down the line.
"We decided to make the right investments. Technology is all about making the right choices, where we're going to invest, and where we're not going to invest...three or four years ago, it was mobile phones, tablets, and IoT that were the sexy things, and we were like 'hey we know that those are good markets, but those are not AMD.' We focused on what we thought the future would hold for us," said Dr. Su. "We are making decisions now that you won't see the outcome of for the next 3-5 years. We're making some good decisions," she added.AMD Can Stay Competitive Even If Intel Sorts Out Its Foundry Mess
AMD is armed with a deep CPU architecture roadmap extending all the way to "Zen 5," Dr. Su stated. She expressed pride in some of the investment decisions taken in designing AMD processors, such as the way AMD builds its EPYC chips: as a multi-chip module rather than a monolithic die, which would have eaten up far more resources to design and manufacture than a smaller die. Right now AMD only has to manage two dies: a CPU-only die that builds Ryzen and EPYC processors, and a CPU+GPU die for the Ryzen with Vega APUs and some of the company's mobile Ryzen SKUs.
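The point about a multi-chip module being cheaper to manufacture than one large die can be made concrete with a standard die-yield back-of-the-envelope. The sketch below uses a simple Poisson defect-density model with purely hypothetical die sizes and defect rates (none of these figures come from AMD or the interview); it only illustrates why a defect hitting a small chiplet wastes far less silicon than one hitting a big monolithic die.

```python
# Illustrative sketch only: hypothetical numbers, not AMD's actual figures.
import math

def die_yield(area_mm2, defects_per_mm2=0.001):
    """Expected fraction of defect-free dies for a given area (Poisson model)."""
    return math.exp(-defects_per_mm2 * area_mm2)

MONOLITHIC_AREA = 800      # hypothetical large server die, mm^2
CHIPLET_AREA = 200         # hypothetical small die reused across products, mm^2
CHIPLETS_PER_PACKAGE = 4   # four small dies replace one big one

y_mono = die_yield(MONOLITHIC_AREA)   # roughly 45% of big dies come out good
y_chiplet = die_yield(CHIPLET_AREA)   # roughly 82% of small dies come out good

# Average silicon area that must be fabricated per working product:
cost_mono = MONOLITHIC_AREA / y_mono
cost_mcm = CHIPLETS_PER_PACKAGE * CHIPLET_AREA / y_chiplet

print(f"Monolithic: {cost_mono:.0f} mm^2 of wafer per good product")
print(f"MCM:        {cost_mcm:.0f} mm^2 of wafer per good product")
# A defect scraps only a 200 mm^2 chiplet instead of a whole 800 mm^2 die,
# so much less silicon is thrown away per shipped processor.
```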
There Can Be Many Winners in the GPU Market
Cramer's interview focused on the secrets behind AMD's giant-killing feat against Intel, which is saddled not just with a dated CPU architecture peppered with security holes, but also with silicon fabrication foundry issues that are preventing an advance beyond 14 nanometer. Dr. Su mentioned that AMD does not count on competitors underperforming, and is mindful that the competition is "very strong." Towards the end of the interview, almost like a "one more thing" question, Cramer asked how AMD's rivalry with NVIDIA is going. Dr. Su's response was crafty.
In the first part of her response, she said that "competition is good for the marketplace and GPUs is a great market, but I've always said that there can be multiple winners in this market." With this, Dr. Su hinted that although AMD's share of the discrete gaming GPU market is on the decline, there are areas where the company is winning. AMD rode, although conservatively, the crypto-mining boom over the last year with highly marked-up graphics cards, and it dominates the game console semi-custom SoC market.
AMD is Helping Both Microsoft and Sony with Their Own "Secret Sauce"
Elaborating on AMD's partnerships with competing firms Microsoft and Sony (in the gaming console market), Dr. Su stated that her company provides semi-custom chips and is helping both firms develop their own "secret sauce" for their consoles. The partnership with Microsoft spans not just consoles, but also Windows and Azure. AMD could be working with Microsoft on future cloud-computing projects driven by its EPYC and Radeon Pro/Instinct products. "Our strength is that we can work with all customers and we can differentiate for each one of them," she said.
You can catch the full video in the source link below.
Source:
CNBC
"We decided to make the right investments. Technology is all about making the right choices, where we're going to invest, and where we're not going to invest...three or four years ago, it was mobile phones, tablets, and IoT that were the sexy things, and we were like 'hey we know that those are good markets, but those are not AMD.' We focused on what we thought the future would hold for us," said Dr. Su. "We are making decisions now that you won't see the outcome of for the next 3-5 years. We're making some good decisions," she added.AMD Can Stay Competitive Even If Intel Sorts Out Its Foundry Mess
AMD is armed with a deep CPU architecture roadmap going all the way down to "Zen 5," stated Dr. Su. She seems to express pride in some of the investment decisions taken in designing AMD processors, such as the way AMD is building its EPYC chips (a multi-chip module as opposed to a monolithic die that would have eaten up far more resources to design and manufacture alongside a smaller die). Right now AMD only has to manage two dies - a CPU-only die that builds Ryzen and EPYC processors; and a CPU+GPU die for Ryzen with Vega APUs and some of the company's mobile Ryzen SKUs.
There Can Be Many Winners in the GPU Market
Cramer's interview focused on the secrets behind AMD's giant-killing feat against Intel, which is saddled with not just a dated CPU architecture peppered with security holes, but also silicon fabrication foundry issues that are preventing an advance from 14 nanometer. Dr. Su mentioned that AMD does not count on competitors underperforming, and is mindful that the competition is "very strong." Towards the end of the interview, almost like a "one more thing," question, Cramer questioned how AMD's rivalry with NVIDIA is going. Dr. Su's response was crafty.
In the first part of her response to that question, she mentioned that "competition is good for the marketplace and GPUs is a great market, but I've always said that there can be multiple winners in this market." With this, AMD hinted that although its market-share in the discrete gaming GPU market is on the decline, there are areas where the company is winning. AMD rode, although conservatively, the crypto-mining boom over the last year with highly marked-up graphics cards; and is dominating the game console semi-custom SoC market.
AMD is Helping Both Microsoft and Sony with Their Own "Secret Sauce"
Elaborating on AMD's partnerships with competing firms Microsoft and Sony (in the gaming console market), Dr. Su stated that her company is providing semi-custom chips, and is helping both firms develop their own "secret sauce" for their consoles. The partnership with Microsoft spans not just consoles but also Windows and Azure. AMD could be working with Microsoft in future cloud-computing projects driven by its EPYC and Radeon Pro/Instinct products. "Our strength is that we can work with all customers and we can differentiate for each one of them."
You can catch the full video in the source link below.
99 Comments on AMD CEO Speaks with Jim Cramer About the "Secret Sauce" Behind its Giant-Killing Spree
Nvidia's strength is in the ASIC. They market their RT cores as being 6x faster than general cores. It's hard to guess how they got this result, but the simple fact is: you needed HPC hardware for RTRT just a few years ago. Suddenly RTRT is playable in AAA titles on a single GPU. Maybe not at the fps that some of us would like, but still - a huge jump.
Even if we assume AMD has an advantage in this kind of task, it's a few tens of percent, not a few hundred. The ASIC is generations ahead.
It's more likely that Navi (or whatever it's called) will have similar RT cores. AMD can easily develop them and add them via IF.
Mind you, we've already seen rumors that Navi will have a dedicated AI solution (rival to Nv's Tensor Cores). ;-)
Now I do expect Nvidia to be absurdly more efficient at it from the get-go, but AMD and brute force are definitely friends of old.
-AMD had very little money, and Zen had to succeed.
-Furthermore, they clearly understood after how Fermi and Kepler went that, for whatever reason, people will not buy high-end AMD cards en masse until their CPUs are perceived as the best as well. If you can spend 1/4th as much on GPUs and still hold 30-40% of the market, why not just do that?
-Polaris and Vega are every bit as efficient as Pascal when you keep them at their intended clock speeds: under 1200 MHz.
When AMD chooses to compete in high-end gaming again, they will. It's that simple lol.
I am sure they will have a brand-new GPU architecture out by 2020 as well, as stated on their roadmap, one that won't be based on GCN.
AMD has only been giving them what they "spec". If AMD had the funds to push beyond that, it would again be more of a "solution provider". Lately AMD has just been a design house that uses what's in the toy box to reach whatever Sony and Microsoft consider the next goal they need to bring to market. Once AMD pits the two against each other as rivals - rather than just "here's what you asked for" - it will stop being "here are your specs, all within your price constraints". AMD did get some margins at first, when it had the "solution" of being the single source for CPU/GPU. They just haven't innovated beyond that; if and when they do, they can see a more viable market, especially if entry-level gaming prices stay high and 4K keeps outpacing what people are prudently willing to spend.
hwbot.org/submission/3921710_mrgenius_specviewperf_12_catia_radeon_rx_vega_64_149.06_fps
;)
nVidia?? On which front? The most sexy office ladies, or the bowling championship?? Not true at all. I remember the HD 4870 and then HD 5870 times. I had both an HD 4870 X2 (it died after I tried replacing its cooler with a custom one) and then an HD 5870. They were the best price/performance cards at that time. Then the disaster happened and AMD lost the crown completely.
The only problem is that NV compute cards with proper double-precision capabilities are so much more expensive. But for deep learning uses which only require FP32 or lower precision, I haven't seen a single lab that uses AMD cards. In the enterprise segment, AMD's MI25 hasn't found a single customer yet.
Here's a test of AMD's own ProRender from just a few months ago. Vega 64 lost to the Titan Xp and was just 15% faster than the 1080 Ti.
techgage.com/article/performance-testing-amds-radeon-prorender-in-autodesk-3ds-max/
Clearly Vega shines in rendering compared to how it performs in games, but that isn't RTX-level for sure.
Remember that before RTX came along, RTRT wasn't really considered a thing in gaming. It was just way too slow. That's why many people on this forum had never heard of it.
However, it's not a new solution outside of gaming.
When Nvidia announced earlier this year that they were working on a new RTRT solution, AMD quickly answered, saying they would improve the RTRT implementation in their ProRender.
But Nvidia wasn't talking about a software approach; it was talking about the ASIC for RTX.
I assume AMD has known about RTX for a while. They don't have a hardware answer yet, so they went for this ProRender thing as a temporary marketing solution.
But I'm pretty sure they're working on a hardware RTX competitor as well. They'll need it to keep up.
Sure, we may have doubts about RTRT's popularity in gaming over the next few years (well, I sure hope it stays).
But if Nvidia makes an RT accelerator based on this tech, they'll quickly eat whatever AMD still has in the 3D business.
BTW: Nvidia has just announced a dedicated Tensor accelerator for machine learning.
This means that until AMD shows their product (there have been rumors that they're working on an Nv Tensor alternative), everything on this page is obsolete:
www.amd.com/en/graphics/servers-radeon-instinct-deep-learning
As I said: we can expect AMD to be slightly more efficient in RT on general cores (GCN vs CUDA), but this is nowhere near the jump an ASIC gives.
BTW: I've seen Vega doing RTRT - I doubt it would be enough for Quake II. :)
AMD certainly has a larger slice of the GPU market than its share of the CPU market. It doesn't matter if people are gaming, mining, or doing something kinky with the cards they bought.
So yeah, despite the fact that it lacks a GTX 1080 Ti competitor, AMD does threaten NVIDIA's bottom line as of now.
AMD could have very easily been called the little guy for years. Now the little guy is not so little anymore. They've got their big brother sweating like a pig to stay on top. That is neither an advertorial, nor bad journalism. That is simply what is.
I hope 7nm will change this. Have not had an AMD GPU since my 7970, which IMO was their last really good GPU. She didn't do jack. She is cringe. I remember her fucking up the Fury X launch by calling it an "overclocker's dream." Biggest lie ever.
You can thank Jim Keller instead. The brain behind Ryzen.
Now we just need GPU competition again.