Wednesday, September 12th 2018

AMD CEO Speaks with Jim Cramer About the "Secret Sauce" Behind its Giant-Killing Spree

Jim Cramer of CNBC's Mad Money interviewed AMD CEO Dr. Lisa Su on the floor of the NYSE, calling her company one of the year's biggest tech turnaround stories. The two spoke on a variety of topics, including how the company went from a single-digit stock and a loss-making entity to one of the hottest tech stocks, which threatens both Intel and NVIDIA. Dr. Su placed emphasis on making long-term strategic decisions that bear fruit years down the line.

"We decided to make the right investments. Technology is all about making the right choices, where we're going to invest, and where we're not going to invest...three or four years ago, it was mobile phones, tablets, and IoT that were the sexy things, and we were like 'hey we know that those are good markets, but those are not AMD.' We focused on what we thought the future would hold for us," said Dr. Su. "We are making decisions now that you won't see the outcome of for the next 3-5 years. We're making some good decisions," she added.
AMD Can Stay Competitive Even If Intel Sorts Out Its Foundry Mess
AMD is armed with a deep CPU architecture roadmap extending all the way to "Zen 5," stated Dr. Su. She expressed pride in some of the investment decisions taken in designing AMD processors, such as the way AMD builds its EPYC chips: a multi-chip module, as opposed to a monolithic die that would have eaten up far more resources to design and manufacture alongside a smaller die. Right now AMD only has to manage two dies: a CPU-only die that builds Ryzen and EPYC processors, and a CPU+GPU die for Ryzen with Vega APUs and some of the company's mobile Ryzen SKUs.
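To illustrate why the multi-chip approach conserves resources, consider a toy die-cost model (a rough sketch; the wafer cost, die sizes, and defect density below are illustrative assumptions, not AMD's figures):

import math

# Toy Poisson yield model: yield falls exponentially with die area,
# so one big die costs far more per good chip than several small ones.
# All numbers are illustrative assumptions, not AMD's figures.
DEFECTS_PER_MM2 = 0.002

def cost_per_good_die(area_mm2, wafer_cost=5000.0, wafer_area_mm2=70000.0):
    dies_per_wafer = wafer_area_mm2 / area_mm2
    yield_rate = math.exp(-DEFECTS_PER_MM2 * area_mm2)  # Poisson yield
    return wafer_cost / (dies_per_wafer * yield_rate)

# One hypothetical 800 mm² monolithic die vs. four 200 mm² dies on an MCM
print(f"monolithic: ${cost_per_good_die(800):.0f}")      # ~ $283
print(f"4-die MCM:  ${4 * cost_per_good_die(200):.0f}")  # ~ $85

Under these made-up numbers, four small dies stitched together cost roughly a third of one large die, which is the economic logic behind EPYC's design.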

There Can Be Many Winners in the GPU Market
Cramer's interview focused on the secrets behind AMD's giant-killing feat against Intel, which is saddled with not just a dated CPU architecture peppered with security holes, but also silicon fabrication foundry issues that are preventing an advance beyond 14 nanometers. Dr. Su mentioned that AMD does not count on competitors underperforming, and is mindful that the competition is "very strong." Towards the end of the interview, almost like a "one more thing" question, Cramer asked how AMD's rivalry with NVIDIA is going. Dr. Su's response was crafty.

In the first part of her response to that question, she mentioned that "competition is good for the marketplace and GPUs is a great market, but I've always said that there can be multiple winners in this market." With this, Dr. Su hinted that although AMD's share of the discrete gaming GPU market is on the decline, there are areas where the company is winning. AMD rode the crypto-mining boom over the last year, albeit conservatively, with highly marked-up graphics cards, and it dominates the game console semi-custom SoC market.

AMD is Helping Both Microsoft and Sony with Their Own "Secret Sauce"
Elaborating on AMD's partnerships with competing firms Microsoft and Sony (in the gaming console market), Dr. Su stated that her company provides semi-custom chips and is helping both firms develop their own "secret sauce" for their consoles. The partnership with Microsoft spans not just consoles but also Windows and Azure. AMD could be working with Microsoft on future cloud-computing projects driven by its EPYC and Radeon Pro/Instinct products. "Our strength is that we can work with all customers and we can differentiate for each one of them," she said.

You can catch the full video in the source link below.
Source: CNBC

99 Comments on AMD CEO Speaks with Jim Cramer About the "Secret Sauce" Behind its Giant-Killing Spree

#51
HD64G
btarunr: Extremely insignificant. AMD probably has 6-8 engineers developing and testing AGESA, who aren't even doing that full-time. On the other hand, it takes dozens of engineers, endless back-and-forth with game devs, and a Russia-based testing/optimization team with dozens more people to keep the "Game Ready" cycle working. Millions of dollars per driver update, and hundreds of millions spent over the GPU's lifecycle.

This is why I think AMD will only invest in a new big GPU if that GPU can crush NVIDIA's big GPU. Until then it will stick to lower segments (a bigger market) or semi-custom work. Investment in said big GPU will only happen when AMD has a ton of disposable income from selling CPUs.

If Lisa Su is as smart as I hope she is, she will spin off Radeon as a separate company once she's confident AMD has a sustainable CPU market leadership.
IMHO, Navi will give us a Polaris-successor mid-range GPU to combat the 2060 or 2070 for a bit less money, just to keep AMD relevant in the biggest segment of the gaming market. If that architecture proves good enough, they will produce an even bigger chip to compete in the high-end tier. They won't produce another expensive-to-make compute GPU and push it into the gaming market just to say "we have a fast GPU". Vega 56 should have been the only gaming GPU of the Vega line, as the 64 is clearly underutilised in games more often than not. Vulkan shows the power is there, but few games take advantage of it, probably due to some of the architecture's limits. Raja is an engineer, and although the Vega architecture is good in many respects, he made a mess of his market-related decisions as a leader. That's why he departed, after all.
Posted on Reply
#52
v12dock
Block Caption of Rainey Street
Tech aside, Lisa Su has done a spectacular job turning AMD around. CEO of the decade!
Posted on Reply
#53
notb
cdawall: I don't know if the width of the Vega products would be conducive, especially with GCN being general-purpose from the get-go. I am not saying we will see 2080 Ti performance from one, but it will be very, very interesting to see it against a 2070 or 1080 Ti just to see how it falls. Remember, AMD already has ray tracing on these cards and basically has since launch.
Every GPU (and CPU) can do RT and, hence, also RTRT if the modeled scene is simple enough. It's a fairly basic mathematical problem.
Nvidia's strength is in the ASIC. They market their RT cores as being 6x faster than general cores. It's hard to guess how they got this result, but the simple fact is: you needed HPC hardware for RTRT just a few years ago. Suddenly RTRT is playable in AAA titles on a single GPU. Maybe not at the fps some of us would like, but still, a huge jump.
Even if we assume AMD has an advantage in this kind of task, it's a few tens of percent, not a few hundred. The ASIC is generations ahead.

It's more likely that Navi (or whatever it's called) will have similar RT cores. AMD can easily develop them and add them via IF.
Mind you, we've already seen rumors that Navi will have a dedicated AI solution (a rival to Nv's Tensor Cores). ;-)
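Just to show how basic the math is, here's a toy ray-sphere intersection in Python (purely illustrative - the whole trick is doing this millions of times per frame, which is where the ASIC earns its keep):

import math

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic).
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2.0 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4.0*a*c
    if disc < 0:
        return None                        # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0*a)   # nearest of the two roots
    return t if t >= 0 else None

# One primary ray from the camera toward a unit sphere 5 units away
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0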
Posted on Reply
#54
Frick
Fishfaced Nincompoop
Arpeegee: Is it possible to ban "las"? He/she/they/it is an antagonizing a**hole and I'm tired of reading their stupid replies in every topic. Seriously, take a step back and evaluate your life.
Use the universal ignore function known as "don't bother" if you have something against someone.
Posted on Reply
#55
cdawall
where the hell are my stars
notb: Every GPU (and CPU) can do RT and, hence, also RTRT if the modeled scene is simple enough. It's a fairly basic mathematical problem.
Nvidia's strength is in the ASIC. They market their RT cores as being 6x faster than general cores. It's hard to guess how they got this result, but the simple fact is: you needed HPC hardware for RTRT just a few years ago. Suddenly RTRT is playable in AAA titles on a single GPU. Maybe not at the fps some of us would like, but still, a huge jump.
Even if we assume AMD has an advantage in this kind of task, it's a few tens of percent, not a few hundred. The ASIC is generations ahead.

It's more likely that Navi (or whatever it's called) will have similar RT cores. AMD can easily develop them and add them via IF.
Mind you, we've already seen rumors that Navi will have a dedicated AI solution (a rival to Nv's Tensor Cores). ;-)
Remember, there are 4096 stream processors on the AMD side that have proven to be strong at OpenCL (real-time ray tracing uses OpenCL). I really think a lot of this is going to come down to optimizations done at the game and driver level. There are already ray-tracing benchmarks where a simple Vega 64 beats GV100. Again, I don't think they will be beating a 2080 Ti; I just want to see the performance side by side with a 2070 lol

Now I do expect Nvidia to be absurdly more efficient at it from the get-go, but AMD and brute force are definitely old friends.
Posted on Reply
#56
Captain_Tom
RejZoR: I really hope AMD gets their shit together with graphics cards as well. Zen processors are doing really well. They might not be total Intel killers, but they are very competitive and that's what matters the most.
It really cannot be emphasized enough that both Polaris and Vega were choices.

- AMD had very little money, and Zen had to succeed.
- Furthermore, they clearly understood after how Fermi and Kepler went that, for whatever reason, people will not buy high-end AMD cards en masse until their CPUs are perceived as the best as well. If you can spend a quarter as much on GPUs and still hold 30-40% of the market, why not just do that?
- Polaris and Vega are every bit as efficient as Pascal when you keep them at their intended clock speeds: under 1200 MHz.

When AMD chooses to compete in high-end gaming again, they will. It's that simple lol.
Posted on Reply
#57
RejZoR
Polaris certainly was, as they talked about it in that state for ages. Vega was a bit of a letdown given how they talked it up... But it's still a capable GPU, no denying that.
Posted on Reply
#58
Nkd
las: Haha, wrong. What matters is what AMD gets out of it. And it's not much. But a few dollars is better than no dollars. For AMD every penny counts.
Once again, Sony (especially) and Microsoft are the winners when looking at the console market. Not AMD.

Why, it's a fact. Vega is a joke. 300W with 1070-1080 performance.
AMD sits at less than 15% GPU market share these days.
It doesn't matter. Do you really think AMD doesn't make anything on consoles? If it weren't worth it, I doubt AMD would invest in it. I'm sure it's decent money. Consoles are sold at cost or at a loss almost all the time, because the money is made on the software side. But AMD doesn't get anything from the software side. I mean, it's guaranteed money, and consoles sell in the millions. That's easy money; you don't just walk away from that. AMD is the only one that can make CPU+GPU combos for console use, and it's cheaper for Sony and Microsoft. Even though Sony and Microsoft may be selling the initial consoles at a loss, they still pay the suppliers in advance, so regardless, AMD gets paid. This is the only reason NVIDIA doesn't have major console share: Sony and Microsoft would have to source the CPU from AMD or Intel and spend more on a cooling solution as well, or go totally with NVIDIA's Tegra crap and be stuck with Nintendo Switch-like graphics. AMD has that market locked up, so you don't just abandon guaranteed revenue.

I am sure they will have a brand-new GPU architecture out by 2020 as well, as stated on their roadmap, one that won't be based on GCN.
Posted on Reply
#59
Casecutter
las: Sony and Microsoft pay peanuts for those low-end custom APUs. AMD is not the winner here. I'm obviously talking about PC GPU market share.
IDK, consider that those companies are always longing for more compact, smaller packages that streamline so many parts of their engineering, packaging, and transportation, and cost is the bottom line. If AMD could come up with an all-in-one chip (not on an interposer) that packs and interconnects the CPU/GPU/HBM (a super-APU) on one process/substrate, both companies would pay up... at least at first.

AMD has only been giving them what they "spec". If AMD had the funds to advance beyond that, they would again be more of a "solution provider". Lately AMD has just been a design house that uses what's in the toy box to achieve the next level Sony and Microsoft consider they "need" to bring to market. Once AMD pits the two as rivals... not "here's what you asked for"... it will stop being "here are your specs, all within your price constraints". AMD did get some margin$ at first, when they had the "solution" of being the single source for CPU/GPU. They just haven't innovated from that; if/when they do, they can see a more viable market, especially if entry-level gaming prices stay high and 4K keeps outpacing prudent outlays of funds.
Posted on Reply
#60
StrayKAT
Let's not get too excited about consoles, though. Look who used to rule that world (and many other platforms): Motorola.
Posted on Reply
#61
lexluthermiester
las: Meanwhile AMD completely lost the GPU market after GPU mining dried up
25-30% GPU market share is hardly "lost".
Posted on Reply
#62
MrGenius
Performance per WattTF-ever. Like I care. Like anybody really cares. That's such a BS line. Is that all you've got? "Yeah...it performs as it should...but it uses more power". B F D!!!
TheinsanegamerN: Vega 64 SHOULD be around 1080 performance. IME, the Vega 64 can vary wildly. In some games it's a bit faster and other times it barely reaches the 1070.
And...there are some non-gaming scenarios where it TOTALLY ANNIHILATES everything but a Titan V.

hwbot.org/submission/3921710_mrgenius_specviewperf_12_catia_radeon_rx_vega_64_149.06_fps

;)
Posted on Reply
#63
dorsetknob
"YOUR RMA REQUEST IS CON-REFUSED"
Anymal: Urgent meeting at AMD HQ: Raja and the CMO are there, Lisa is still absent.
:) One of the mods gives creative writing lessons, would you like me to link you to him? :)
Posted on Reply
#64
Totally
Article: We decided to make the right investments. Technology is all about making the right choices, where we're going to invest, and where we're not going to invest...three or four years ago, it was mobile phones...
Nearly spat up my soda there; so much bullshit. In some alternate reality where AMD never sold off Adreno, she'd be singing a very different tune. The fact is AMD is no longer in a position to invest in mobile; it's not a decision when you simply can't.
Posted on Reply
#65
StrayKAT
Totally: Nearly spat up my soda there; so much bullshit. In some alternate reality where AMD never sold off Adreno, she'd be singing a very different tune. The fact is AMD is no longer in a position to invest in mobile; it's not a decision when you simply can't.
They might not be singing a different tune. They might be dead.
Posted on Reply
#66
Totally
las: Not quite. Vega 64 is 10-15% better than Vega 56.

www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/29.html

1070 is 145W on average in gaming.
Vega 56 229W.
Vega 64 292W.

1070 Ti generally beats both Vega cards in gaming performance, while using 50-125 watts less.
A couple of things: he said 1070 Ti, so I don't understand why you're moving the goalposts by mentioning the 1070. As far as the 1070 Ti is concerned, it averages 177-190 W, which is 30-50 W less than the 56, and this is disregarding that Vega uses significantly less power (as much as 70 watts) when undervolted with little to no performance penalty.
You know, the console business is all about the software sales. Hardware is not profitable. It just needs to be cheap. AMD delivered cheap. Also, a generation lasts ~10 years (supported).
Another nonsensical point: that's Microsoft's/Sony's problem. AMD sells APUs/GPUs, not consoles, with the added bonus of MS/Sony picking up the overhead; regardless, they are selling them with enough profit to keep their interest. The console itself not being profitable is NOT their problem.
StrayKAT: They might not be singing a different tune. They might be dead.
Not ruling that possibility out, but I doubt it; if not selling would've killed them, I doubt the sale would've been regarded as one of the biggest financial missteps of the decade.
Posted on Reply
#67
Prima.Vera
btarunr: which threatens both Intel and NVIDIA
Wait...What??? :eek::eek::eek::laugh::laugh::laugh:
nVidia?? On which front? On most sexy office ladies, or in the bowling championship??
moproblems99: ...Even when they had the better product no one bought them. Everyone wanted that fancy jacket.
Not true at all. I remember the HD 4870 and then HD 5870 times. I had both an HD 4870 X2 (it died after I tried replacing its cooler with a custom one) and then an HD 5870. They were the best price/performance cards at that time. Then the disaster happened and AMD lost the crown completely.
Posted on Reply
#68
Jism
las: Meanwhile AMD completely lost the GPU market after GPU mining dried up
Not sure Nvidia agrees on that.
The area where they would probably gain is compute, as AMD cards favor compute-related tasks.
Posted on Reply
#69
B-Real
las: Why, it's a fact. Vega is a joke. 300W with 1070-1080 performance.
AMD sits at less than 15% GPU market share these days.
And Nvidia sits at around 17-18%. Intel rules the whole market with 70%. Comparing only NV and AMD, it's around 69-31%. AMD was never at 50% share. The closest was 44% in 2010, and their lowest was 18% in Q2 2015. If we compare with the latest result exactly 3 years later, 31% (nearly double the Q2 2015 figure) doesn't seem that bad.
Posted on Reply
#70
First Strike
Jism: The area where they would probably gain is compute, as AMD cards favor compute-related tasks.
Actually, nope. Throughput on paper is one thing, actual throughput in real-world workloads is another (cache misses, warp latency, etc.). Run some DGEMM tests on both cards and you will see. Not to mention CUDA and its toolchain are much easier to use.
The only problem is that NV compute cards with proper double-precision capability are so much more expensive. But for deep-learning uses, which only require FP32 or lower precision, I haven't seen a single lab that uses AMD cards. In the enterprise segment, AMD's MI25 hasn't found a single customer yet.
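If anyone wants to try that DGEMM comparison themselves, here's a rough sketch (PyTorch is my arbitrary pick, since its builds run on both CUDA and ROCm and expose the same "cuda" device name; the matrix size and iteration count are likewise arbitrary):

import time
import torch

# Times FP64 matrix multiplies on the GPU and reports rough throughput.
n, iters = 4096, 10
a = torch.randn(n, n, dtype=torch.float64, device="cuda")
b = torch.randn(n, n, dtype=torch.float64, device="cuda")

torch.matmul(a, b)             # warm-up run
torch.cuda.synchronize()
start = time.time()
for _ in range(iters):
    torch.matmul(a, b)
torch.cuda.synchronize()       # wait for all queued kernels to finish
elapsed = time.time() - start

flops = 2 * n**3 * iters       # ~2*n^3 floating-point ops per GEMM
print(f"{flops / elapsed / 1e9:.1f} GFLOP/s (FP64)")

Paper FP64 numbers won't match what this prints; the gap is exactly the cache-miss/latency effect I'm talking about.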
Posted on Reply
#71
notb
cdawall: There are already ray-tracing benchmarks where a simple Vega 64 beats GV100.
Show me. :)

Here's a test of AMD's own ProRender from just a few months ago. Vega 64 lost to the Titan Xp and was just 15% faster than the 1080 Ti.
techgage.com/article/performance-testing-amds-radeon-prorender-in-autodesk-3ds-max/
Clearly Vega shines in rendering compared to how it performs in games, but that isn't RTX-level for sure.

Remember that before RTX came along, RTRT wasn't really considered a thing in gaming. It's just way too slow. That's why many people on this forum had never heard of it.
However, it's not a new solution outside of gaming.

When Nvidia announced earlier this year that they were working on a new RTRT solution, AMD quickly answered, saying they'd improve the RTRT implementation in their ProRender.
But Nvidia wasn't talking about a software approach; they were talking about the ASIC for RTX.
I assume AMD has known about RTX for a while. They don't have a hardware answer yet, so they went for this ProRender thing as a temporary marketing solution.
But I'm pretty sure they're working on a hardware RTX competitor as well. They'll need it to keep up.
Sure, we may have doubts about RTRT's popularity in gaming over the next few years (well, I sure hope it stays).
But if Nvidia makes an RT accelerator based on this tech, they'll quickly eat whatever AMD still has in the 3D business.

BTW: Nvidia has just announced a dedicated Tensor accelerator for machine learning.
This means that until AMD shows their product (there have been rumors that they're working on an Nv Tensor alternative), everything on this page is obsolete:
www.amd.com/en/graphics/servers-radeon-instinct-deep-learning
cdawall: Now I do expect Nvidia to be absurdly more efficient at it from the get-go, but AMD and brute force are definitely old friends.
As I said: we can expect AMD to be slightly more efficient at RT on general cores (GCN vs CUDA), but this is nowhere near the jump an ASIC gives.
BTW: I've seen Vega doing RTRT - I doubt it would be enough for Quake II. :)
Posted on Reply
#72
lexluthermiester
Totally: Nearly spat up my soda there; so much bullshit. In some alternate reality where AMD never sold off Adreno, she'd be singing a very different tune. The fact is AMD is no longer in a position to invest in mobile; it's not a decision when you simply can't.
Incorrect. AMD is in a perfect position to offer mobile products. But would it be wise in an oversaturated market? They made a choice and it is working for them.
Posted on Reply
#73
btarunr
Editor & Senior Moderator
Prima.Vera: Wait...What??? :eek::eek::eek::laugh::laugh::laugh:
nVidia?? On which front? On most sexy office ladies, or in the bowling championship??
The following is just for desktop discrete graphics cards (it does not include APUs/IGPs, console SoCs, or even the tons of mobile GPUs AMD is shipping to Apple for its MacBooks and Mac Pros, sauce):
[discrete GPU market-share chart]
AMD certainly holds a larger share of the GPU market than it does of the CPU market. It doesn't matter if people are gaming, mining, or doing something kinky with the cards they bought.

So yeah, despite the fact that it lacks a GTX 1080 Ti competitor, AMD does threaten NVIDIA's bottom line as of now.
Posted on Reply
#74
hat
Enthusiast
notb: And yet you're providing us with this very weird text about how AMD is fighting the "giants". I'll keep calling it an advertorial. Or at least very bad journalism.
Also, you're going on about AMD threatening Nv's bottom line as if Nv were a huge company and AMD an ambitious newbie. :-D

As of 2017, Nvidia's revenue is less than twice AMD's (9.7 vs 5.3 billion USD). These are fairly similar companies, and they're just competing in a duopoly market - on equal terms.
And how many years has Intel been shitting on AMD? They've made an explosive comeback on all fronts of the CPU market. Their stock price has gone up, what, sixfold? They're hot on Intel's heels, and now Intel is shitting themselves rather than shitting on AMD. After years of what can accurately be called stagnation, they've rushed 6- and even 8-core chips out the door on the mainstream platform while pushing clock speeds up as well, even relying on solder again after years of toothpaste, and they're scrambling at red alert to get something bigger and badder out the door, even relying on TSMC to produce stuff for them, since their own foundries are a mess.

AMD could very easily have been called the little guy for years. Now the little guy is not so little anymore. They've got their big brother sweating like a pig to stay on top. That is neither an advertorial nor bad journalism. That is simply what is.
Posted on Reply
#75
las
B-Real: And Nvidia sits at around 17-18%. Intel rules the whole market with 70%. Comparing only NV and AMD, it's around 69-31%. AMD was never at 50% share. The closest was 44% in 2010, and their lowest was 18% in Q2 2015. If we compare with the latest result exactly 3 years later, 31% (nearly double the Q2 2015 figure) doesn't seem that bad.
Obviously talking about the gaming segment here. Nvidia sits close to 80% according to the Steam HW Survey.
lexluthermiester: 25-30% GPU market share is hardly "lost".
More like 15% when looking at the gaming segment. Steam HWS.

I hope 7 nm will change this. I haven't had an AMD GPU since my 7970, which IMO was their last really good GPU.
v12dock: Tech aside, Lisa Su has done a spectacular job turning AMD around. CEO of the decade!
She didn't do jack. She is cringe. I remember her fucking up the Fury X release by saying it was an "overclocker's dream". Biggest lie ever.

You can thank Jim Keller instead. The brain behind Ryzen.

Now we just need GPU competition again.
Posted on Reply