
AMD CEO Speaks with Jim Cramer About the "Secret Sauce" Behind its Giant-Killing Spree

Is it possible to ban "las"? He/she/they/it is an antagonizing a**hole and I'm tired of reading their stupid replies in every topic. Seriously, take a step back and evaluate your life.
Antagonising* Take this to the moderators, not to a public thread.
 
It has an x86 license; it does not have a license for the later instruction sets that matter (and probably never will).

Yup, unless they want to pay AMD for the AMD64 instruction set.

Is it possible to ban "las"? He/she/they/it is an antagonizing a**hole and I'm tired of reading their stupid replies in every topic. Seriously, take a step back and evaluate your life.

Same with notb...
 
I doubt it'll be competitive at all.

Good thing most games won't require it anyway; many won't even use it.

I don't know whether the width of the Vega products would be conducive to it, especially with GCN being general-purpose from the get-go. I am not saying we will see 2080 Ti performance from one, but it will be very interesting to see how it falls against a 2070 or 1080 Ti. Remember, AMD has had ray tracing on its cards basically since launch.

https://gpuopen.com/announcing-real-time-ray-tracing/

https://pro.radeon.com/en/software/radeon-rays/
 
Haha wrong. What matters is what AMD gets out of it. And it's not much. But a few dollars is better than no dollars. For AMD every penny counts.
Once again, Sony (especially) and Microsoft are the winners when looking at the console market. Not AMD.
AMD gets its money from the consoles themselves, while Sony sells the PS4 at a loss (about a $60 loss per unit at launch) and makes its money from PS Plus and game sales. I don't know about Xbox, but I assume it's the same.
 
Extremely insignificant. AMD probably has 6-8 engineers developing and testing AGESA, who aren't even doing that full-time. On the other hand, it takes dozens of engineers, endless back-and-forth with game devs, and a Russia-based testing/optimization team with dozens more people to keep the "Game Ready" cycle working. Millions of dollars per driver update, and hundreds of millions spent over the GPU's lifecycle.

This is why I think AMD will only invest in a new big GPU if that GPU can crush NVIDIA's big GPU. Until then it will stick to lower segments (bigger market), or semi-custom work. Investment into said big GPU will only happen when AMD has a ton of disposable income from selling CPUs.

If Lisa Su is as smart as I hope she is, she will spin off Radeon as a separate company once she's confident AMD has sustainable CPU market leadership.
Imho, Navi will give us a Polaris-successor mid-range GPU to combat the 2060 or 2070 for a bit less money, just to keep AMD relevant in the biggest segment of the gaming market. If that arch proves good enough, they will produce an even bigger one to compete in the high-end tier. They won't make another expensive-to-produce compute GPU and push it into the gaming market just to say "we have a fast GPU". Vega 56 should have been the only gaming GPU of the Vega line, as the 64 is clearly underutilised in games more often than not. Vulkan shows the power is there, but not many games take advantage of it, probably due to some of the arch's limits. Raja is an engineer, and although the Vega arch is good in many respects, as a leader he made a mess of his market-related decisions. That's why he departed, after all.
 
I don't know whether the width of the Vega products would be conducive to it, especially with GCN being general-purpose from the get-go. I am not saying we will see 2080 Ti performance from one, but it will be very interesting to see how it falls against a 2070 or 1080 Ti. Remember, AMD has had ray tracing on its cards basically since launch.
Every GPU (and CPU) can do RT and, hence, also RTRT if the modeled scene is simple enough. It's a fairly basic mathematical problem.
Nvidia's strength is in the ASIC. They market their RT cores as being 6x faster than general cores. It's hard to guess how they got this result, but the simple fact is: you needed HPC hardware for RTRT just a few years ago. Suddenly RTRT is playable in AAA titles on a single GPU. Maybe not at the fps that some of us would like, but still - a huge jump.
Even if we assume AMD has an advantage at this kind of task, it's a few tens of percent, not a few hundred. The ASIC is generations ahead.

It's more likely that Navi (or whatever it's called) will have similar RT cores. AMD can easily develop them and add them via IF.
Mind you, we've already seen rumors that Navi will have a dedicated AI solution (rival to Nv's Tensor Cores). ;-)
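To make the "fairly basic mathematical problem" point concrete, here is a minimal CPU-side sketch of the operation at the heart of any ray tracer: a ray-sphere intersection test, which boils down to solving a quadratic. The scene and names are purely illustrative, not any vendor's actual implementation:

```c
#include <math.h>
#include <stdio.h>

/* A ray hits a sphere where |origin + t*dir - center|^2 = radius^2,
   a quadratic in t. This single test is the building block that RT
   hardware repeats billions of times per second. */
typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec3 sub(vec3 a, vec3 b) { vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }

/* Returns the distance along the ray to the nearest hit, or -1 on a miss. */
double ray_sphere(vec3 origin, vec3 dir, vec3 center, double radius) {
    vec3 oc = sub(origin, center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;      /* discriminant decides hit/miss */
    if (disc < 0.0) return -1.0;            /* ray misses the sphere */
    return (-b - sqrt(disc)) / (2.0 * a);   /* nearest intersection distance */
}

int main(void) {
    vec3 origin = {0, 0, 0}, dir = {0, 0, 1};   /* ray pointing down +z */
    vec3 center = {0, 0, 5};                    /* unit sphere 5 units away */
    printf("hit at t = %f\n", ray_sphere(origin, dir, center, 1.0));  /* t = 4 */
    return 0;
}
```

The hard part was never one intersection; it is doing billions of them per second against complex scenes, which is exactly what dedicated RT hardware accelerates.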
 
Is it possible to ban "las"? He/she/they/it is an antagonizing a**hole and I'm tired of reading their stupid replies in every topic. Seriously, take a step back and evaluate your life.

Use the universal ignore function known as "don't bother" if you have something against someone.
 
Every GPU (and CPU) can do RT and, hence, also RTRT if the modeled scene is simple enough. It's a fairly basic mathematical problem.
Nvidia's strength is in the ASIC. They market their RT cores as being 6x faster than general cores. It's hard to guess how they got this result, but the simple fact is: you needed HPC hardware for RTRT just a few years ago. Suddenly RTRT is playable in AAA titles on a single GPU. Maybe not at the fps that some of us would like, but still - a huge jump.
Even if we assume AMD has an advantage at this kind of task, it's a few tens of percent, not a few hundred. The ASIC is generations ahead.

It's more likely that Navi (or whatever it's called) will have similar RT cores. AMD can easily develop them and add them via IF.
Mind you, we've already seen rumors that Navi will have a dedicated AI solution (rival to Nv's Tensor Cores). ;-)

Remember the 4096 stream processors on the AMD side, which have proven strong at OpenCL (Radeon's real-time ray tracing uses OpenCL). I really think a lot of this is going to come down to optimizations done at the game and driver level. There are already ray tracing benchmarks where a simple Vega 64 beats the GV100. Again, I don't think they will beat a 2080 Ti; I just want to see the performance side by side with a 2070 lol

Now, I do expect Nvidia to be absurdly more efficient at it from the get-go, but AMD and brute force are definitely old friends.
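As a rough illustration of why raw stream-processor counts matter for this workload: every primary ray is independent of every other, so the work is embarrassingly parallel, and an OpenCL implementation can simply assign one ray (or pixel) per work-item. A minimal serial sketch of that structure, with purely illustrative numbers (each loop iteration is independent, so on a GPU each could run on its own stream processor):

```c
#include <stdio.h>

#define WIDTH  640
#define HEIGHT 360

int main(void) {
    int hits = 0;
    /* One primary ray per pixel, none depending on any other: the
       data-parallel shape that maps onto thousands of stream processors.
       An OpenCL kernel would run one iteration of this loop per work-item. */
    for (int y = 0; y < HEIGHT; y++) {
        for (int x = 0; x < WIDTH; x++) {
            /* Ray from the origin through pixel (x, y); unit sphere at z = 5. */
            double dx = (x - WIDTH / 2) / (double)WIDTH;
            double dy = (y - HEIGHT / 2) / (double)HEIGHT;
            double a = dx * dx + dy * dy + 1.0;   /* dot(dir, dir), dz = 1 */
            double b = -10.0;                     /* 2 * dot(oc, dir), oc = (0,0,-5) */
            double c = 24.0;                      /* dot(oc, oc) - r^2 = 25 - 1 */
            hits += (b * b - 4.0 * a * c) >= 0.0; /* discriminant >= 0 means a hit */
        }
    }
    printf("%d of %d rays hit the sphere\n", hits, WIDTH * HEIGHT);
    return 0;
}
```

Whether that parallelism is fed by general-purpose OpenCL compute or by fixed-function RT cores is exactly the brute-force-versus-ASIC trade-off being argued here.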
 
I really hope AMD gets their shit together with graphics cards as well. Zen processors are doing really well. They might not be total Intel killers, but they are very competitive, and that's what matters most.

It really cannot be emphasized enough that both Polaris and Vega were choices.

-AMD had very little money, and Zen had to succeed.
-Furthermore, they clearly understood, after how Fermi and Kepler went, that for whatever reason people will not buy high-end AMD cards en masse until their CPUs are perceived as the best as well. If you can spend a quarter as much on GPUs and still hold 30-40% of the market, why not just do that?
-Polaris and Vega are every bit as efficient as Pascal when you keep them at their intended clock speeds: under 1200 MHz.

When AMD chooses to compete in high-end gaming again, they will. It's that simple lol.
 
Polaris certainly was, as they talked about it in that state for ages. Vega was a bit of a letdown given how they talked it up... but it's still a capable GPU, no denying that.
 
Haha wrong. What matters is what AMD gets out of it. And it's not much. But a few dollars is better than no dollars. For AMD every penny counts.
Once again, Sony (especially) and Microsoft are the winners when looking at the console market. Not AMD.



Why, it's a fact. Vega is a joke. 300W with 1070-1080 performance.
AMD sits at less than 15% GPU market share these days.

It doesn't matter. Do you really think AMD doesn't make anything on consoles? If it were not worth it, I doubt AMD would invest in it; I am sure it's decent money. Consoles are sold at cost or at a loss almost all the time, because the money is made on the software side, and AMD doesn't get anything from the software side. But it's guaranteed money, and consoles sell in the millions. That is easy money; you don't just walk away from that. AMD is the only one that can make CPU+GPU combos for console use, which is cheaper for Sony and Microsoft. Even though Sony and Microsoft may sell the initial consoles at a loss, they still pay their suppliers in advance, so AMD gets paid regardless. This is the main reason Nvidia doesn't have major console share: Sony and Microsoft would have to source the CPU from AMD or Intel and spend more on a cooling solution as well, or go all-in on Nvidia's Tegra crap and be stuck with Nintendo Switch-like graphics. AMD has that market locked up, and you don't just abandon guaranteed revenue.

I am sure they will also have a brand-new GPU architecture out by 2020, as stated on their roadmap, that won't be based on GCN.
 
Sony and Microsoft pay peanuts for those low-end custom APUs. AMD is not the winner here. I'm obviously talking about PC GPU market share.
IDK; consider that those companies always long for compact, smaller packages that streamline so many parts of their engineering, packaging, and transportation, and cost is the bottom line. If AMD could come up with an all-in-one chip (not on an interposer) that packs and interconnects the CPU/GPU/HBM (a super APU) on one process/substrate, both companies would pay up... at least at first.

AMD has only been giving them what they "spec". If AMD had the funds to advance beyond that, it would again be more of a "solution provider". Lately AMD has just been a design house that uses what's in the toy box to reach whatever next level Sony and Microsoft consider they "need" and should bring to market. Once AMD pits the two against each other more as rivals... not "here's what you asked for"... it will stop being "here are your specs, all within your price constraints". AMD did get some margin at first, when being the single source for CPU/GPU made it the "solution". They just haven't innovated beyond that; if and when they do, they will see a more viable market, especially if entry-level gaming prices stay high and 4K keeps outpacing prudent outlays of funds.
 
Let's not get too excited about consoles, though. Look who used to rule that world (and many other platforms): Motorola.
 
Performance per WattTF-ever. Like I care. Like anybody really cares. That's such a BS line. Is that all you've got? "Yeah...it performs as it should...but it uses more power". B F D!!!
Vega 64 SHOULD be around 1080 performance. IME, the Vega 64 can vary wildly: in some games it's a bit faster, and other times it barely reaches the 1070.
And...there are some non-gaming scenarios where it TOTALLY ANNIHILATES everything but a Titan V.

http://hwbot.org/submission/3921710_mrgenius_specviewperf_12_catia_radeon_rx_vega_64_149.06_fps

;)
 
Vega 64 is Radeon, and it is for gaming. It has the performance of a 1080 (GP104) and draws more watts than a 1080 Ti (GP102). That is a failure!

Somewhere in 2016: urgent meeting at AMD HQ. Raja and the CMO are there; Lisa is still absent.

CMO: Howdy, Raja, how are you doing these days? Lisa is late; she is probably on the phone right now with her North Korean leader, talking about countermeasures against the Green Hornet.
Raja: Hey, I am fine. Just stop being such a racist, you offend me too. BTW, she is from the South...
CMO: I am not sure about her. But you are most definitely from south... something.
Raja: Just stop.
CMO: No, I am too pissed off after reading the financial results from Q3 of Nvidia's fiscal year 2017. They are rolling in a deep sea of money and we are still waiting for you to finish Vega.
Raja: Wait, wait, we are still polishing...
CMO: No more waiting. Lisa hired you to give us a milking cow; I guess she thought that is what real Indians do for tech firms in the USA, but after the 290 series I think you are most definitely a Native American Indian, giving us a Hawaii-climate-hot chip with almost 1 horsepower of TDP, Chief NotSoPale Horse. You should try to market that shit. We were most creative with the PR slides, lying and stuff.
Raja: Hahaha, 4 GB for 4K gaming, I remember, you silly fascist. But we gave you an 8 GB 290X, didn't we?
CMO: Yeah, you did, and again we were forced to go street-magician style so we could sell the 390 series with 8 GB as a brand new GPU. We were even successful, till you did it again: 4 GB on the higher-end Fiji. At least my department was immediately inspired to name that series after Lisa's reaction to that GPU of yours.
Raja: Forget about it, BetterRedneck. Polaris 10 is here to revolutionize.
Lisa: Hi, you two fanboys, sorry I am late. Did I hear Polaris and revolution in the same sentence? Who the fuck gives out such high-hope names? We are AMDesperate, after all.
Raja: Hey, Lisa, what's up? Don't give up.
CMO: I am behind the Polaris name. We have to stray away from negativity like AMDesperate, Fury, Rage and that kind of shit Raja forged.
Raja: You prick. Why haven't you come up with names as successful as GeForce and Titan if you are so smart?
Lisa: Chill out, Chief. Put away your tomahawk.
CMO: He is not...
Lisa: Just kidding. Relax, I know. Koduri, where is my Vega? Look me in the eyes and say it: Vega is better than Pascal and is on track for a 2016 release.
Raja: Go ask Jong-Un.
Lisa: You mean Huang?
CMO: He did.
Raja: I am out of here, go fuck yourselves, you racist scum. Vega is deliberately late to the market, because that is just what I do.
CMO: I knew it. Lisa, see, this brown-cookie-tanned motherfucker sabotaged it, my precious.
Lisa: Language! Both of you, please! We are better than the green goblin. Speaking of Huang's marvel balls, Raja, did you discover where their gorgeous perf/W ratio comes from? You promised me that Polaris would be a notch better thanks to 14nm FinFET.
Raja: Oh, come on, those are only numbers. 16nm from TSMC is superior; GloFo FinFucked us with 14nm. Polaris only achieves Maxwell-like efficiency. Now we party like it's 2014. I am out of here, bitches.
CMO: Dammit, 14nm is so much easier to...
Lisa: OK, you can go now too. You have done more than enough; just think of something better than Fury for Vega. Valkyrie, maybe.
CMO: Nice, Lisa. Take care of Raja, though; he might go green after another fail. Bye.
Lisa: No worries, DX12 is his game. Ours is ZEN. Peace out.
 
Urgent meeting at AMD HQ. Raja and the CMO are there; Lisa is still absent.
:) One of the mods gives creative writing lessons; would you like me to link you to him? :)
 
Article said:
"We decided to make the right investments. Technology is all about making the right choices, where we're going to invest, and where we're not going to invest...three or four years ago, it was mobile phones...

Nearly spat up my soda there; so much bullshit. In some alternate reality where AMD never sold off Adreno, she'd be singing a very different tune. The fact is AMD is no longer in a position to invest in mobile; it's not a decision when you simply can't.
 
Nearly spat up my soda there; so much bullshit. In some alternate reality where AMD never sold off Adreno, she'd be singing a very different tune. The fact is AMD is no longer in a position to invest in mobile; it's not a decision when you simply can't.

They might not be singing a different tune. They might be dead.
 
Not quite. Vega 56 is 10-15% better than Vega 64 in performance per watt.

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/29.html

1070 is 145W on average in gaming.
Vega 56 229W.
Vega 64 292W.

The 1070 Ti generally beats both Vega cards in gaming perf, while using 50/125 watts less.

A couple of things: he said 1070 Ti, so I don't understand why you're moving the goalposts by mentioning the 1070. As far as the 1070 Ti is concerned, it averages 177-190 W, which is 30-50 W less than the 56, and this disregards that Vega uses significantly less power (as much as 70 watts less) when undervolted, with little to no performance penalty.


You know, the console business is all about software sales. The hardware is not profitable; it just needs to be cheap, and AMD delivered cheap. Also, a generation lasts ~10 years (supported).

Another nonsensical point: that's Microsoft's and Sony's problem. AMD sells APUs/GPUs, not consoles, with the added bonus of MS/Sony picking up the overhead; regardless, AMD sells them with enough profit to keep its interest. The console itself not being profitable is NOT AMD's problem.

They might not be singing a different tune. They might be dead.

Not ruling that possibility out, but I doubt it; if not selling would've killed them, I doubt the sale would be regarded as one of the biggest financial missteps of the decade.
 
which threatens both Intel and NVIDIA

Wait...What??? :eek::eek::eek::laugh::laugh::laugh:
nVidia?? On which front? Most sexy office ladies, or the bowling championship??

...Even when they had the better product, no one bought them. Everyone wanted that fancy Jacket.
Not true at all. I remember the HD 4870 and then the HD 5870 days. I had both an HD 4870 X2 (it died after I tried fitting it with a custom cooler) and then an HD 5870. They were the best price/performance cards at that time. Then the disaster happened and AMD lost the crown completely.
 
Meanwhile AMD completely lost the GPU market after GPU mining dried out



Not sure Nvidia agrees with that.

The area where they would gain is probably compute, as AMD cards favor compute-related tasks.
 
Why, it's a fact. Vega is a joke. 300W with 1070-1080 performance.
AMD sits at less than 15% GPU market share these days.
And Nvidia sits at around 17-18%. Intel rules the whole market with about 70%. Comparing only NV and AMD, it's around 31-69%. AMD has never been at 50% share: the closest was 44% in 2010, and its lowest was 18% in 2015 Q2. Compared with that low exactly three years later, 31% (up sharply from 18%) doesn't seem that bad.
 