# AMD's next-gen RDNA 2 rumor: 40-50% faster than GeForce RTX 2080 Ti



## P4-630 (Jul 14, 2020)

_A new rumor for Big Navi is teasing some truly huge performance, with YouTuber Moore's Law Is Dead saying that AMD's flagship RDNA 2-based graphics card will be 40-50% faster than NVIDIA's current flagship, the GeForce RTX 2080 Ti.

It's also not the first time we've heard that Big Navi would be twice as fast as the Navi 10-powered Radeon RX 5700 XT. In those rumors, we heard about 80 compute units, 5120 stream processors, and 17.5 TFLOPs of compute performance.

The leak says that AMD will be using 72 CUs, arranged as "two clusters of 36 CUs", with an unknown amount of GDDR6 on a 384-bit memory bus. The GPU has a game clock of 2.05GHz and a boost clock of 2.15GHz in its current form.

Big Navi being this fast means AMD has caught up to NVIDIA in a big, big freaking way. It also makes the previous leaks of NVIDIA's upcoming GeForce RTX 3060 being about as fast as the GeForce RTX 2080 Ti that much more plausible.

If Big Navi is indeed this fast, it'll be the biggest shake-up the GPU industry has seen in a very long time.

Lisa Su and the re-focused team at AMD and Radeon are ready to take this battle right to NVIDIA's door it seems, but don't think NVIDIA is going to take this lying down._







AMD's next-gen RDNA 2 rumor: 40-50% faster than GeForce RTX 2080 Ti (www.tweaktown.com)

Big Navi is waving something big alright: AMD's next-gen RDNA 2 leaks see it destroying NVIDIA's GeForce RTX 2080 Ti.


----------



## Chomiq (Jul 14, 2020)

5GHz Zen 2 called, it wants its BS rumors back.


----------



## Space Lynx (Jul 14, 2020)

tweaktown = giant grain of salt, lulz.  they hyped a bunch of stuff last couple years that turned out to be false, really they should not even be allowed to be considered news anymore imo


----------



## P4-630 (Jul 14, 2020)

lynx29 said:


> tweaktown = giant grain of salt



I was reading this on the Dutch site hardware.info first:

Rumor: AMD's 'Big Navi' will be 40 to 50% faster than RTX 2080 Ti (nl.hardware.info)

Take your pick: big navi 50% faster than 2080ti - Google Search

It's about the video from Moore's Law Is Dead.


----------



## kapone32 (Jul 14, 2020)

Interesting and it indeed could be true but we have to wait and see when they are actually available. I do like the fact that AMD is keeping this close to the vest in terms of true specs though.


----------



## Space Lynx (Jul 14, 2020)

kapone32 said:


> Interesting and it indeed could be true but we have to wait and see when they are actually available. I do like the fact that AMD is keeping this close to the vest in terms of true specs though.



even if it is true it doesn't mean it will be worth a buy, AMD has a lot of trust to be gained on the driver side of things for GPU's... even going to amazon or newegg now and filtering reviews by "most recent"  scary stuff. I play a lot of older games, so I need to know drivers are stable for even those games a long long time ago. not just what is in the spotlight.


----------



## kapone32 (Jul 14, 2020)

lynx29 said:


> even if it is true it doesn't mean it will be worth a buy, AMD has a lot of trust to be gained on the driver side of things for GPU's... even going to amazon or newegg now and filtering reviews by "most recent"  scary stuff. I play a lot of older games, so I need to know drivers are stable for even those games a long long time ago. not just what is in the spotlight.


Well, I am on Vega and I am now addicted to AMD's Radeon software. Even though there are negative reviews, that data is at best a reach to say it's purely a driver issue. I have built systems with the 5600 XT and 5700, and neither of those clients has had an issue.


----------



## Kanan (Jul 14, 2020)

lynx29 said:


> even if it is true it doesn't mean it will be worth a buy, AMD has a lot of trust to be gained on the driver side of things for GPU's... even going to amazon or newegg now and filtering reviews by "most recent"  scary stuff. I play a lot of older games, so I need to know drivers are stable for even those games a long long time ago. not just what is in the spotlight.


The driver problems were caused by the new architecture, but they are mostly sorted out now. The question is, will drivers be a problem again with another new architecture (RDNA 2), or will it be a smooth start this time? If so, I would consider a Radeon for myself.


----------



## xman2007 (Jul 14, 2020)

lynx29 said:


> even if it is true it doesn't mean it will be worth a buy, AMD has a lot of trust to be gained on the driver side of things for GPU's... even going to amazon or newegg now and filtering reviews by "most recent"  scary stuff. I play a lot of older games, so I need to know drivers are stable for even those games a long long time ago. not just what is in the spotlight.


Still beating that same old drum, change the record and stop regurgitating crap you read on the Internet and spreading it as fact, it just makes you look foolish and ignorant


----------



## Space Lynx (Jul 14, 2020)

Kanan said:


> The driver problems were caused by the new architecture, but they are mostly sorted out now. The question is, will drivers be a problem again with another new architecture (RDNA 2), or will it be a smooth start this time? If so, I would consider a Radeon for myself.



Well I think Lisa Su is well aware of the driver stuff and has taken big steps to fix it. If GamersNexus/other reviews gives Big Navi the A-OK on driver improvements, I might get one. He is really one of the last people I trust on youtube, and he loves ryzen but called out the rx 5700 drivers and really tried working with AMD behind the scenes to help them be more aware of the issues too. Which shows his objectivity in the matter.



xman2007 said:


> Still beating that same old drum, change the record and stop regurgitating crap you read on the Internet and spreading it as fact, it just makes you look foolish and ignorant



or before I spend $800 I will read reviews.... but sure... keep attacking me fanboys... i own intel and AMD systems at the moment, sorry for being cautious... I guess?

damn tpu really has become toxic... im out, peace lol. one thread after another im just getting hammered. i can spend my time better than this. enjoy your day dude


----------



## Kanan (Jul 14, 2020)

lynx29 said:


> Well I think Lisa Su is well aware of the driver stuff and has taken big steps to fix it. If GamersNexus/other reviews gives Big Navi the A-OK on driver improvements, I might get one. He is really one of the last people I trust on youtube, and he loves ryzen but called out the rx 5700 drivers and really tried working with AMD behind the scenes to help them be more aware of the issues too. Which shows his objectivity in the matter.


True. HWUB is good as well.


----------



## xman2007 (Jul 14, 2020)

lynx29 said:


> one thread after another im just getting hammered. i can spend my time better than this. enjoy your day dude


Maybe 'cause you're turning into "one of those guys", saying the same thing in anything AMD-related? We get it, AMD bad, Nvidia awesome; you don't have to keep saying the same thing over and over again. And yes, I'm a fanboy because I don't listen to people like you and choose to spend my money on what I want, not what you think I should buy. Then again, I'm not in every Intel/Nvidia thread trashing them and calling people fanboys either.


----------



## xkm1948 (Jul 14, 2020)

lynx29 said:


> Well I think Lisa Su is well aware of the driver stuff and has taken big steps to fix it. If GamersNexus/other reviews gives Big Navi the A-OK on driver improvements, I might get one. He is really one of the last people I trust on youtube, and he loves ryzen but called out the rx 5700 drivers and really tried working with AMD behind the scenes to help them be more aware of the issues too. Which shows his objectivity in the matter.
> 
> 
> 
> ...




I am surprised they're attacking YOU, who have been saying all these nice things about Radeon. Ouch.


Back to the topic: I wouldn't trust these too much. Most of the reliable rumors come from Taiwanese/Chinese hardware forums, as there are folks there who actually work in the production pipeline. The rest of the tech sites are either regurgitating or just outright making shit up to get clicks/views/traffic.

Anyway, let's give it another "Poor Ampere", shall we? And see if RTG actually delivers this round.


----------



## Vya Domus (Jul 14, 2020)

lynx29 said:


> even if it is true it doesn't mean it will be worth a buy, AMD has a lot of trust to be gained on the driver side of things for GPU's



And here, people, is why you are paying $500 for 106 chips from Nvidia.

God bless.


----------



## Assimilator (Jul 14, 2020)

Oh good, it's time to spin up the "2x faster than NVIDIA" rumour again. This one seems to come up every year and it never turns out to be true, yet people are happy to parrot it relentlessly regardless. Even better when it comes from an arbitrary YouTube personality with zero credibility!

2x perf over Ampere is not going to happen, especially considering this will be AMD's first try at hardware RT. It's always possible they've f**ked up RT so much they've just decided to cram more raster hardware on to compensate, but that seems unlikely considering the whole RT thing is still being sold as a feature of the next-gen consoles.

Yet sheeple will continue to believe these random YouTubers over facts and history, even when said YouTubers are proven wrong time and time again.


----------



## Calmmo (Jul 14, 2020)

Everyone would like to see Nvidia's high-end reign end, given how they've increased the pricing of enthusiast products two- to threefold in recent years, but this is just more fake-news wishful thinking to pander to fans. I guess it works, since it keeps happening.


----------



## Hyderz (Jul 14, 2020)

The way I see it, I want this to be true:

1. a big increase in perf is always welcome for us consumers
2. it will force Nvidia to lower prices if Big Navi is priced cheaper
3. we can choose between Nvidia and AMD in the high-end segment


----------



## cucker tarlson (Jul 14, 2020)

> new rumor for Big Navi is teasing some truly huge performance, with YouTuber Moore's Law Is Dead

lol
MLID. The same one who made up Ampere charts in MS Paint.

This would mean big RDNA 2 has the performance of two and a quarter 5700 XTs: that's 1.5x of 1.5x, on the same 7nm node, with a 50% perf/watt increase.
Good luck.
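For what it's worth, the compounding in that post can be checked with a couple of lines of arithmetic. Both 1.5x figures are the thread's assumptions (the rough 2080 Ti vs. 5700 XT gap and the rumored uplift), not measured data:

```python
# Multiplier math for the rumor, using the thread's own assumptions:
#   - an RTX 2080 Ti is taken as ~1.5x a Radeon RX 5700 XT
#   - the rumored Big Navi is ~1.5x an RTX 2080 Ti
gap_2080ti_over_5700xt = 1.5   # assumed baseline gap
rumored_uplift = 1.5           # the "+40-50%" claim, taken at the high end

big_navi_over_5700xt = gap_2080ti_over_5700xt * rumored_uplift
print(big_navi_over_5700xt)    # 2.25 -> "two 5700 XTs and a quarter"
```

So the "twice as fast as a 5700 XT" framing and the "+50% over a 2080 Ti" framing describe roughly the same card, give or take the assumed baseline gap.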


----------



## HD64G (Jul 14, 2020)

I said months ago that if Navi 21 has 80 CUs, it will end up being 30-40% faster than a stock 2080 Ti while consuming ~250W. If clocks are pushed higher, it will get ~10% more performance and will consume 300W. Maybe a limited water-cooled, more expensive model will be made to reach max clock speeds.


----------



## theonek (Jul 14, 2020)

if only all the rumors till now had turned out to be true in reality...


----------



## nguyen (Jul 14, 2020)

Would be interesting if this were true, that means Nvidia can't charge 1500usd for their RTX 3090 anymore


----------



## neatfeatguy (Jul 14, 2020)

Aside from trying a handful of drivers to get one that didn't cause my brother issues, the 5700XT works wonders for him. Now, before anyone says "drivers sux for AMD!" - remember that not all drivers will work with all hardware configurations. You'll always have someone that has some issue with drivers, be it from AMD or Nvidia; I've had issues with nvidia drivers on my cards where others with similar systems didn't, so it happens.

Hopefully AMD can come back and throw it in Nvidia's face that they're still relevant in the high end gaming market.

I'm just patiently waiting for pricing to balance out enough to warrant the purchase of a new GPU. Right now the $400 range gets me a card that's maybe a 25-30% improvement over my 980 Ti, the $600 range around 40%, and the $700-750 range about a 50% improvement... eh. Just not worth it in my opinion. Hopefully AMD can deliver and pricing won't be off the charts for both companies.

If a new line of cards can bring 2080Ti performance to $400 price range, that would be an upgrade I can live with. If not, then I keep using what I have, I'm in no hurry to dump a lot of cash for what I'd consider a bad investment.
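Taking the post's own ballpark figures (25-30% uplift for ~$400, ~40% for ~$600, ~50% for ~$700-750, all over a GTX 980 Ti, with midpoints assumed for the ranges), the tiers can be lined up as uplift per dollar:

```python
# price ($) -> claimed uplift (%) over a GTX 980 Ti,
# midpoints of the post's rough ranges
tiers = {400: 27.5, 600: 40.0, 725: 50.0}

for price, uplift in tiers.items():
    # express each tier as percentage points of uplift per $100 spent
    print(f"${price}: {uplift / price * 100:.1f} points of uplift per $100")
```

All three tiers land in the same narrow band, which lines up with the post's conclusion: paying more doesn't buy disproportionately more, so no tier is a compelling jump.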


----------



## CrAsHnBuRnXp (Jul 14, 2020)

Even though I will still buy nvidia because I like what they have to offer better (im constantly using Shadowplay and have been since it's introduction), we really need AMD to be competitive with nvidia again so they can drop their prices back down. I dont want to pay $1500 for a new GPU in a few months. I want the Ti to come back down around the $700-800 price range and be more affordable.


----------



## Fouquin (Jul 14, 2020)

Assimilator said:


> Oh good, it's time to spin up the "2x faster than NVIDIA" rumour again. This one seems to come up every year and it never turns out to be true, yet people are happy to parrot it relentlessly regardless. Even better when it comes from an arbitrary YouTube personality with zero credibility!
> 
> 2x perf over Ampere is not going to happen, especially considering this will be AMD's first try at hardware RT. It's always possible they've f**ked up RT so much they've just decided to cram more raster hardware on to compensate, but that seems unlikely considering the whole RT thing is still being sold as a feature of the next-gen consoles.
> 
> Yet sheeple will continue to believe these random YouTubers over facts and history, even when said YouTubers are proven wrong time and time again.



+50% is not 2x. That would be +100%.

There are promising things being circulated about RDNA2 and patent filings show AMD hasn't been sleeping. Their new BVH traversal logic block is a pretty unique and seemingly efficient design. Will it be a 1.5x uplift over TU102 though? That seems like a long shot but I don't doubt they have the ability to do it. The real question is what has nVidia targeted as their performance uplift over TU102? 1.6x? Maybe as high as 1.8x?
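Since percent-faster claims and multipliers keep getting conflated in this thread, here is a trivial conversion sketch (purely illustrative):

```python
def pct_to_mult(pct: float) -> float:
    """Convert a '+pct% faster' claim into a performance multiplier."""
    return 1 + pct / 100

def mult_to_pct(mult: float) -> float:
    """Convert a multiplier (e.g. 2x) back into a percent uplift."""
    return (mult - 1) * 100

print(pct_to_mult(50))   # 1.5 -> "+50%" means 1.5x, not 2x
print(mult_to_pct(2.0))  # 100.0 -> "2x" means +100%
```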


----------



## milewski1015 (Jul 14, 2020)

While it would be awesome to see AMD pull a card 40-50% faster than a 2080 Ti out of its hat, the grain of salt I'm taking with this rumor is bigger than my car.


----------



## 95Viper (Jul 14, 2020)

Stay on topic.
Quit insulting each other.
Be civil in your discussion.

Thank You and Have a Good Day


----------



## Vya Domus (Jul 14, 2020)

CrAsHnBuRnXp said:


> Even though I will still buy nvidia because I like what they have to offer better (im constantly using Shadowplay and have been since it's introduction), we really need AMD to be competitive with nvidia again



See, it doesn't work like that. The thing that decides if something was competitive or not is whether or not people bought it, "I want AMD to compete so that I can buy Nvidia" is a fallacy, it doesn't work. If you really need AMD to be competitive buy their products not Nvidia's.


----------



## cucker tarlson (Jul 14, 2020)

Vya Domus said:


> If you really need AMD to be competitive buy their products not Nvidia's.


hilarious.
so we've got 70% of consumers who don't need AMD to be competitive. they don't give two shits; they just buy nvidia.


----------



## TheoneandonlyMrK (Jul 14, 2020)

xkm1948 said:


> I am surprised they're attacking YOU, who have been saying all these nice things about Radeon. Ouch.
> 
> 
> Back to the topic. I wouldn't trust these ones too much. Most of the reliable rumors come from Taiwanese/Chinese hardware forums as there are folks who actually work in the production pipeline. The rest of the tech sites are either regurgitating, or just outright making shit up to get the clicks/views/traffic.
> ...


Some of those leaks on AMD hardware come from Nvidia; make of that what you will.

Sounds like a nice change this generation, regardless of the shade of glasses you wear.


----------



## CrAsHnBuRnXp (Jul 14, 2020)

Vya Domus said:


> See, it doesn't work like that. The thing that decides if something was competitive or not is whether or not people bought it, "I want AMD to compete so that I can buy Nvidia" is a fallacy, it doesn't work. If you really need AMD to be competitive buy their products not Nvidia's.


I wont buy, but others will. What makes them competitive is how close in performance the new AMD cards will run in comparison to Ampere. That will help drive down the price to make nvidia say "buy ours because the price to performance is better".


----------



## TheoneandonlyMrK (Jul 14, 2020)

CrAsHnBuRnXp said:


> I wont buy, but others will. What makes them competitive is how close in performance the new AMD cards will run in comparison to Ampere. That will help drive down the price to make nvidia say "buy ours because the price to performance is better".


So Vya domus was right then.

Reviews don't matter to your type of blind buyer either, never mind opinions. Well, Apple would also like your money.

And it's clearly a waste of your time and anyone else's arguing about AMD GPU rumours.

Personally I'll wait on reviews and watch the market, see what's going on, I'm definitely buying a ps5 and that could well do for a bit.

Or not if rumours are true.


----------



## Vya Domus (Jul 14, 2020)

CrAsHnBuRnXp said:


> I wont buy, but others will. What makes them competitive is how close in performance the new AMD cards will run in comparison to Ampere. That will help drive down the price to make nvidia say "buy ours because the price to performance is better".



You do not understand that sales precede "competitiveness". GPUs like the 290X/Fury X were very fast and relatively cheap, but they were perceived as uncompetitive for bizarre reasons such as the heat/noise meme, and Nvidia still outsold and outpriced everything AMD had. Performance matters nowhere near as much as you think it does for the overall success of a product. Look at CPUs: AMD is not the fastest in the only metric that apparently everyone cares about, which is gaming, yet in just a few years they gained a considerable amount of mind share and sales. The general consensus now is that AMD is ahead of Intel, and sure enough, CPUs that were previously in the thousands of dollars are now a few hundred bucks.


----------



## CrAsHnBuRnXp (Jul 14, 2020)

theoneandonlymrk said:


> So Vya domus was right then.
> 
> Reviews don't matter to your type of blind buyer either, never mind opinions ,well Apple would also like your money.
> 
> ...


Im not a blind buyer. I look at reviews. Nvidia just offers what I want/need better. Whether or not AMD gives nvidia a run for their money this time around has yet to be seen.


----------



## Vya Domus (Jul 14, 2020)

CrAsHnBuRnXp said:


> Nvidia just offers what I want/need better.



Then you don't need AMD to be competitive, because you already prefer their products. Can you not see the issue with your logic ?



CrAsHnBuRnXp said:


> Im not a blind buyer.



Yet you said it yourself that you'll only buy from a particular brand no matter what. What is that if not the definition of a blind buyer ? What's the point of checking reviews if you're already convinced you'll purchase from the same camp anyway.


----------



## TheoneandonlyMrK (Jul 14, 2020)

CrAsHnBuRnXp said:


> Im not a blind buyer. I look at reviews. Nvidia just offers what I want/need better. Whether or not AMD gives nvidia a run for their money this time around has yet to be seen.


Do you believe what you just said? Apple owners frequently say the same things, then always buy Apple. Does it really matter what reviews say at that point? No.


----------



## R0H1T (Jul 14, 2020)

Yes, a rumor like the billion others this site publishes. Also, lots of amateur mathematicians here; get your numbers right, folks, this is a tech forum, not *reddit*.


----------



## CrAsHnBuRnXp (Jul 14, 2020)

Vya Domus said:


> Then you don't need AMD to be competitive, because you already prefer their products. Can you not see the issue with your logic?
> 
> 
> Yet you said it yourself that you'll only buy from a particular brand no matter what. What is that if not the definition of a blind buyer ? What's the point of checking reviews if you're already convinced you'll purchase from the same camp anyway.



Of course I do. I dont want to pay inflated prices forever.

Blind buyer = someone that doesnt care what the benchmarks show and/or from what camp. I pay attention to reviews. I care about reviewer opinions. Blind buyers dont. Nvidia just has features I care about more, their shit just works, and they are better performance wise. You cant say that about AMD right now.

After you contradicted yourself in post #32, I stopped caring what you said.



theoneandonlymrk said:


> Do you believe what you just said? Apple owners frequently say the same things, then always buy Apple. Does it really matter what reviews say at that point? No.


Of course I do. You two are the ones trying to pick apart everything that I am saying to try and favor yourselves. I dont buy apple, I havent bought anything apple, and I never will.


----------



## Vya Domus (Jul 14, 2020)

CrAsHnBuRnXp said:


> After you contradicted yourself in post #32



Of course I didn't; you did, by claiming not to be a blind buyer yet admitting you'll only purchase from a particular brand no matter what. I don't even have to pick everything apart, the paradox in what you said couldn't be more apparent. Anyway, good luck having AMD be competitive so that you can buy Nvidia, I'm sure that will work out great. It's bulletproof logic:

I'd like Pepsi to make a better drink so I can keep buying Coca-Cola.


----------



## Vayra86 (Jul 14, 2020)

Vya Domus said:


> Then you don't need AMD to be competitive, because you already prefer their products. Can you not see the issue with your logic ?
> 
> 
> 
> Yet you said it yourself that you'll only buy from a particular brand no matter what. What is that if not the definition of a blind buyer ? What's the point of checking reviews if you're already convinced you'll purchase from the same camp anyway.



Being aware of the market reality is not being a blind buyer, it is being realistic about the differences between what's on offer between these two companies. Feature complete, or feature lacking.

The only one who can change that is AMD itself and so far they have not delivered.


----------



## Vya Domus (Jul 14, 2020)

Vayra86 said:


> Being aware of the market reality is not being a blind buyer, it is being realistic about the differences between what's on offer between these two companies.



Being aware of what? Why does it even matter? The point is that if you absolutely want something from a certain brand, you don't need anyone else to be competitive. Assuming the other guys were competitive, the inevitable outcome is that you'd end up buying something inferior, since you prioritized that one brand anyway, so it will work against you. There's something inherently dysfunctional about that mentality, but to each their own.

Basically what I want to say is that companies being competitive is only relevant to those who are looking at both camps.


----------



## CrAsHnBuRnXp (Jul 14, 2020)

Vya Domus said:


> Being aware of what ? Why does it even matter, the point is that if you absolutely want something from a certain brand, you don't need anyone else to be competitive. Assuming that the other guys were competitive, then the inevitable outcome is that you'd end up buying something inferior, since you prioritized that one brand anyway so that will work against you. There's something inherently dysfunctional with that mentality but each to their own.
> 
> Basically what I want to say is that that companies being competitive is only relevant to those who are looking at either camps.


It affects market price too, and if you think it doesn't, you're either ignorant or stupid.


----------



## TheoneandonlyMrK (Jul 14, 2020)

CrAsHnBuRnXp said:


> Of course I do. I dont want to pay inflated prices forever.
> 
> Blind buyer = someone that doesnt care what the benchmarks show and/or from what camp. I pay attention to reviews. I care about reviewer opinions. Blind buyers dont. Nvidia just has features I care about more, their shit just works, and they are better performance wise. You cant say that about AMD right now.
> 
> ...


There's no favour to be had in this debate here; I just disagree with you based on what you said, it wasn't logical to me.
Moving on, I will say both companies need to hurry the f up. Rumours (after this long a wait, mind) are boring; I prefer price/performance debates.


----------



## Vya Domus (Jul 14, 2020)

CrAsHnBuRnXp said:


> It affects market price



The prices are set according to the ceiling of what consumers are willing to pay; companies keep pushing until there is a negative response in sales numbers, and that's independent of how competitive they are with each other. For now, Nvidia keeps selling more and more GPUs despite the constant price hikes, which means that ceiling hasn't been reached. So expect yet another price hike irrespective of what AMD brings to the table; hell, it might be a price bump initiated by AMD themselves. The market you're describing, where prices would go down every time competition picks up, would mean every company would slowly tend towards making no money at all, which obviously can't be the case.

If you are still convinced prices would go up and down just because of competition, you're not ignorant or stupid, you just have an ultra-simplistic model of how this works. Which is common for your typical forum dweller who doesn't know how these industries operate at large.


----------



## kiriakost (Jul 14, 2020)

Vya Domus said:


> If you are still convinced prices would go up and down just because of competition, you're not ignorant or stupid, you just have an ultra-simplistic model of how this works. Which is common for your typical forum dweller who doesn't know how these industries operate at large.



Let's help the industry decide faster, then: TPU should start a voting thread with all VGA models listed, where each voter gets a single vote to say how much they'd be willing to pay for one.


----------



## moproblems99 (Jul 14, 2020)

lynx29 said:


> even if it is true it doesn't mean it will be worth a buy, AMD has a lot of trust to be gained on the driver side of things for GPU's... even going to amazon or newegg now and filtering reviews by "most recent"  scary stuff. I play a lot of older games, so I need to know drivers are stable for even those games a long long time ago. not just what is in the spotlight.



My GTX 980 was shit. CTD and black screen galore.  My V56 has been so-so.  There are an impossible number of software configurations to test.  Will it stop me from buying either brand?  Nope.


----------



## R0H1T (Jul 14, 2020)

Yup, not one time have I seen tax savings, when the govt decided to lower taxes, being passed on to consumers. Same goes in any and every industry out there; let's just say *capitalism, free markets & trickle down* are overrated. Companies just increase their margins whenever they can. Intel only reduced their prices because they were forced to (by declining sales), and unless people shift to buying AMD en masse, Nvidia will continue their trope and price Ampere to the heavens this fall.

I'm not pointing this just at Nvidia users, I'm one too; it's the reality. The DIY market probably flipped towards majority(?) AMD even before Zen 2 launched, if Mindfactory numbers and Amazon bestseller lists are anything to go by. So the excuse that you need top performance to be able to lead a particular sector is just BS. Having said that, AMD right now probably covers only two thirds of the market and price range that Nvidia is selling into: they don't have a competitive sub-$200 GPU, nor anything above $800. It would be interesting to see if non-Nvidia fans choose AMD, provided RDNA2-based GPUs are competitive across the board. You know, put your *$* where your mouth is?


----------



## moproblems99 (Jul 14, 2020)

neatfeatguy said:


> If not, then I keep using what I have, I'm in no hurry to dump a lot of cash for what I'd consider a bad investment.



That's the thing with PC parts: unless you use them for work, there are no good investments. You are spot on there.



R0H1T said:


> Intel only reduced their prices because they were forced



That is the essence of capitalism.  Ostensibly, it has nothing to do with government.  But that is not for here.



CrAsHnBuRnXp said:


> though I will still buy nvidia because I like what they have to offer better (im constantly using Shadowplay and have been since it's introduction)



While I haven't used it, I hear their recording stuff works quite well.



R0H1T said:


> AMD right now probably covers only two thirds of the market & price range that Nvidia's selling into. They don't have a competitive sub $200 GPU & nothing above $800



This is another one of their problems. However, even when they had better products, people bought NV. Just look at Fermi (480/470): AMD's product was better, but NV still sold just as well, if not better (if my memory serves; it may not). Fast forward to Hawaii: it was "terrible" because it was hot and had high power draw. Guess who sold more?


----------



## TheLostSwede (Jul 14, 2020)

Calmmo said:


> Everyone would like to see Nvidia's high-end reign end, given how they've increased the pricing of enthusiast products two- to threefold in recent years, but this is just more fake-news wishful thinking to pander to fans. I guess it works, since it keeps happening.


Can we please stop calling rumours fake news? The two aren't related and shouldn't be used interchangeably.



nguyen said:


> Would be interesting if this were true, that means Nvidia can't charge 1500usd for their RTX 3090 anymore


Well, if AMD charges $2,000 for their top of the range card (assuming the rumours are true), they could...


----------



## Vayra86 (Jul 14, 2020)

Vya Domus said:


> Being aware of what ? Why does it even matter, the point is that if you absolutely want something from a certain brand, you don't need anyone else to be competitive. Assuming that the other guys were competitive, then the inevitable outcome is that you'd end up buying something inferior, since you prioritized that one brand anyway so that will work against you. There's something inherently dysfunctional with that mentality but each to their own.
> 
> Basically what I want to say is that that companies being competitive is only relevant to those who are looking at either camps.



You're mixing up cause and effect, that is what I want to say; it makes your comment completely out of place. People are not blind to the other brand, that is a figment of your imagination. After all, Ryzen proves you wrong: people take a long look at it, and quite a few take the plunge despite a bad track record in the past near-decade.


----------



## Vya Domus (Jul 15, 2020)

Vayra86 said:


> People are not blind to the other brand



I just pointed that out to someone who openly admitted they were.



Vayra86 said:


> After all, Ryzen proves you wrong.



It proves me right, Ryzen is not the fastest in gaming, something which everyone yells and moans about, yet it still gained a lot of traction. Absolute performance does not indicate if something is competitive or whether or not you should expect prices to change.


----------



## Kanan (Jul 15, 2020)

AMD has a lot of ground to make up. If they can release a new GPU with great drivers from day 1, THAT would show people that things have changed. Or will it be faulty again: fine for experts and tweakers, but not for everyday people who just want to install and play? RDNA 2 has the potential to be great, but drivers are the main thing when it comes to GPUs, that's the funny thing about it. And then there is the hope that the reference coolers are well made, and not subpar again. Nvidia is reckless; if AMD can't be comparable to their quality level, they will stay at their low market share forever. The literal reason Nvidia has won, even in desperate times, is marketing and good drivers, but it all started with the reckless marketing. They also more or less always delivered and had great products, especially since the GTX 600 series: no loud-and-hot memes anymore, well-balanced products. AMD has to be like Nvidia or better in order to win. Ryzen is a great product, that's why it is winning. Can Radeon be like Ryzen? That remains to be seen.


----------



## CrAsHnBuRnXp (Jul 15, 2020)

Vya Domus said:


> I just pointed out to someone which openly admitted that they were


I never said that I was. I said I just favor one brand over the other. That's not being blind. It's like preferring Charmin toilet paper over Angel Soft: one isn't blind to the other, it just comes down to personal preference. Stop confusing the two.


----------



## cucker tarlson (Jul 15, 2020)

CrAsHnBuRnXp said:


> I never said that I was. I said that I just favored one brand over the other. That's not being blind. If that's the case that can be used in an example if you like Charmin toilet paper over Angel soft. One's not being blind to the other. It's just down to personal preference. Stop confusing the two.


this is very fair.

you know what is not: a clown who preaches how good the other brand is and how people who don't buy it aren't being objective, while he himself has a history of only ever buying the other one.


----------



## Frick (Jul 15, 2020)

CrAsHnBuRnXp said:


> I never said that I was. I said that I just favored one brand over the other. That's not being blind. If that's the case that can be used in an example if you like Charmin toilet paper over Angel soft. One's not being blind to the other. It's just down to personal preference. Stop confusing the two.



This is nitpicking, and you have a decent point, but toilet paper definitely is preference only. GPUs have clear performance/price ratios (X is %faster than Y but uses %more power than Y).


----------



## Calmmo (Jul 15, 2020)

TheLostSwede said:


> Can we please stop calling rumours fake news? The two aren't related and shouldn't be used interchangeably.



When it comes to Radeon performance, rumours have been proven wrong time and again. At least my memory doesn't go back far enough to recall a point in time when they were true.
They boil down to "radeon amazing otherworldly performance incoming", only to be proven false.

I'll believe it when i see it.
(it'd be great if it does to be clear)


----------



## Lindatje (Jul 15, 2020)

theoneandonlymrk said:


> Do you believe what you just said, apple owners frequently say the same things, then always buy apple, does it really matter what reviews say at that point, no.


The Apple comparison does not hold. Only Apple has macOS/iOS.
With Nvidia you can play games, and with AMD you can play the same games, so that's something different.

But the AMD drivers are still bad.


----------



## Kanan (Jul 15, 2020)

Lindatje said:


> But the AMD drivers are still bad.


People will never stop saying this, even if the drivers are fine. Endless meme achieved.


----------



## Vya Domus (Jul 15, 2020)

Kanan said:


> People will never stop to say this, even if drivers are fine. Endless meme achieved.



Don't forget the hot and loud meme.


----------



## Kanan (Jul 15, 2020)

Vya Domus said:


> Don't forget the hot and loud meme.


Well, that's AMD's own fault again, the reference cooler of the 5700 series wasn't exactly good. That's why I hope the new coolers will be better. I think they will have 2 normal fans, no blower anymore.


----------



## TheoneandonlyMrK (Jul 15, 2020)

Lindatje said:


> The Apple comparison does not hold. Only Apple has MacOS / IOS.
> With Nvidia you can play games and with AMD you can play the same games so that's something different.
> 
> But the AMD drivers are still bad.


The guy I was talking to highlighted Nvidia's features like Ansel, shadow play.

I was talking about brand loyalty, so it was the same.

Nice baiting though I chuckled.


----------



## kapone32 (Jul 15, 2020)

"They don't have a competitive sub-$200 GPU". Show me a competitive sub-$200 GPU from anybody. There is something most people are forgetting in this debate. After Tahiti, AMD had no money to create a new architecture, as the R9 380 shows. Polaris was a financial success for AMD and gave them (and the consoles) enough money to fully develop a new architecture in the form of Navi, as evidenced by the teething issues that anything new has (X299, anyone?). The fact that AMD trades at $48 to $50 a share is also relevant: a 900% increase in share price should allow for a lot more R&D.

1st-gen Ryzen had some memory compatibility issues; 2nd-gen Ryzen solved them. I see 2nd-gen Navi being rock solid because it will not just be in PCs but also consoles, so you know it's been tested thoroughly.

The elephant in the room is that Nvidia has increased the price of its entire stack. As they are the leader in the space, it means AMD too has to increase the price of their stack. As an example, I can buy a Ryzen 4 laptop with a 2060M for less than a 2080 Ti. Much less, in fact: I can buy prebuilts with a 2070 or 5700 XT for less money than a 2080 Ti. The 5700 XT is around $700 here in Canada. I expect Big Navi to start at $799 US, or $1,100 Canadian. If it is as fast or faster than a 2080 Ti at that price point, it will make sense from an objective standpoint.


----------



## Kanan (Jul 15, 2020)

theoneandonlymrk said:


> The guy I was talking to highlighted Nvidia's features like Ansel, shadow play.
> 
> I was talking about brand loyalty, so it was the same.
> 
> Nice baiting though I chuckled.


Ansel is meaningless and Shadowplay is nothing special. 780 Ti, 980 Ti, 1080 Ti owner here. I had 2 Radeons and both were great; the second was even CrossFire (HD 5970) and only 1 F2P game didn't work properly (Path of Exile). Everything else was great. I bet I could've bought the RX 5700 XT on day one and it would've been fine, I just didn't want to take the risk back then. But if you're an experienced user it will all work out anyway. What I want from AMD is rock-solid, idiot-proof drivers, and more people helping AMD break the fanboy-powered mindshare Nvidia has. It will benefit everyone if we have a more balanced GPU market.


----------



## CrAsHnBuRnXp (Jul 15, 2020)

Frick said:


> This is nitpicking, and you have a decent point, but toilet paper definitely is preference only. GPUs have clear performance/price ratios (X is %faster than Y but uses %more power than Y).


Ok bad example. But you get my point.


----------



## Lindatje (Jul 15, 2020)



theoneandonlymrk said:


> The guy I was talking to highlighted Nvidia's features like Ansel, shadow play.
> 
> I was talking about brand loyalty, so it was the same.
> 
> Nice baiting though I chuckled.


No it’s not the same. Where can you buy a smartphone/tablet with IOS that is not from Apple? Same with MacOS.



Kanan said:


> People will never stop to say this, even if drivers are fine. Endless meme achieved.


I have used AMD GPUs for many years; 2020 is the year I bought Nvidia, because I started getting tired of the driver problems at AMD.
If there is someone who has almost only used AMD GPUs, it is me.


----------



## wickerman (Jul 15, 2020)

The GPU market is littered with examples of giant leaps in performance over a company's previous-gen products, and it's also full of titanic failures where one competitor fell behind for generations at a time. So really nothing should come as a surprise to us anymore, regardless of which side of that line RDNA 2 lands on. But AMD has had massive success with Ryzen, and AMD knew what it had on its hands long before we did. If they chose to pour the fruits of that success (money, talent, R&D, etc.) into the graphics division, we really should expect that to pay off right about now. So maybe big RDNA 2 GPUs will actually manage something we haven't seen in a while: a clear win for AMD's graphics division... outside of APUs of course, where AMD really has done well to make integrated graphics more than just the most basic tech capable of animating UIs and playing back video.

When it's time to upgrade my 2080, I'll still be reading the reviews to determine the best card for my needs. I really don't see a downside to reading deep dives and extensive testing to make that choice, so I'd like these rumors to come true.


----------



## BoboOOZ (Jul 15, 2020)

Assimilator said:


> Oh good, it's time to spin up the "2x faster than NVIDIA" rumour again. This one seems to come up every year and it never turns out to be true, yet people are happy to parrot it relentlessly regardless. Even better when it comes from an arbitrary YouTube personality with zero credibility!


To be accurate, the rumor is that Navi 2x means 2x the performance of Navi 10, i.e. the 5700 XT, not the 2080 Ti.

You can infer what that would mean relative to the 2080 Ti, but it's definitely not 2x.

On topic, I would expect Big Navi to have 2x the performance; that's really easy, given that Navi 10 is only 40 CUs and that they are moving to a better node.
Does that mean it will be better than Nvidia's 3080 Ti? No idea, Nvidia is not wasting time either. What matters in the end for most people is how much these cards will cost, and that is decided at the last minute.

Should people be selling their 2080 Ti now? Yes, they should. I hear they can still get a great price right now due to shortages, and I bet in 6 months they will be worth half as much.


----------



## Divide Overflow (Jul 15, 2020)

I'll wait for the hands-on reviews and advise everyone else to do the same. You are responsible for managing your own expectations.
I do hope that AMD has finally caught up to Nvidia in the performance GPU market.  Healthy competition is a win for consumers!


----------



## Kanan (Jul 15, 2020)

wickerman said:


> The gpu market is littered with examples of giant leaps in performance over a companies previous gen products, and it's also full of titanic failures where one competitor fell behind for generations at a time. So really nothing should come as a surprise to us anymore, regardless of which side of that line RDNA 2 lands. But, AMD has had massive success with Ryzen and AMD knew what it had on its hands long before we did. If they chose to dump the fruits of that success (money, talent, R&D, etc) into the graphics division we really should be expecting that to pay off right about now. So maybe big RDNA 2 gpus will actually manage to something we havn't seen in a while - a clear win for AMDs graphics division... outside of APUs of course, where AMD really has done well to make integrated graphics more than just the most basic tech capable of animated UIs and video playback.
> 
> When its time to upgrade my 2080, I'll still be reading the reviews to determine the best card for my needs. I really don't see a downside to getting to read deep dives and extensive testing to make that choice, so I'd like these rumors to come true.


RDNA 2 will be good, because it is the now-finished RDNA architecture, or what RDNA should've been from the start. The GCN elements are gone, it will be completely new, so big improvements are to be expected; just how big is what remains to be seen.


----------



## EarthDog (Jul 15, 2020)

I'm still sticking with my initial guess that Big Navi/RDNA 2 will fall somewhere between a 2080 Ti and Ampere's flagship (non-Titan). My biggest question is the power use to get there on the same process...

Whereas on the NV side, it's a new arch plus a die shrink.


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> I'm still sticking with my initial guess that big navi/rdna2 will fall somewhere between a 2080ti and Ampre's flagship (non titan). My biggest question is power use to get there on the same process...
> 
> Whereas on the NV side, its a new arch plus a die shrink.


7nm EUV is 20% denser than 7nm, so there is a gain for RDNA2, too








7 nm lithography process - WikiChip (en.wikichip.org)

The 7 nanometer (7 nm) lithography process is a technology node semiconductor manufacturing process following the 10 nm process node. Mass production of integrated circuits fabricated using a 7 nm process began in 2018. The process technology will be phased out by leading-edge foundries by...


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> 7nm EUV is 20% denser than 7nm, so there is a gain for RDNA2, too
> 
> 
> 
> ...


That's good. They'll need it.


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> That's good. They'll need it.


We need it, we are the ones who want moar performance


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> We need it, we are the ones who want moar performance


I'm not worried about them being in the ballpark for flagship performance, more so the power it takes to get there comparatively. We saw how the reference 5700 XT seemed to use more power (at least 10% more) than an FE (faster-than-reference) 2070 while being just as fast (that changed down low with the 5500 XT, IIRC). While that isn't a deal breaker for many, that was at 7nm vs 12nm(?) for Nvidia. Now NV is shrinking and AMD is tweaking... I worry that gap will grow, which could turn off some users.


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> I'm not worried about them being in the ball park for flagship performance, more so the power it takes to get there comparitively. We see how the reference 5700XT seemed to be using more power (at least 10% more) than a FE (faster than reference) 2070 and just as fast (that changed down low with 5500XT IIRC). While that isn't a deal breaker for many, that was at 7nm vs 12nm(?) for Nvidia. Now NV is shrinking and AMD is tweaking...I worry that gap will grow which could turn off some users.


Well, AMD really OC'd the crap out of the 5700XT, I guess it was their strategy to make some money after selling the Radeon VII at a loss.

If competition remains high, I imagine both AMD and Nvidia will leave less and less performance on the table, so we will see TDPs going higher and higher.
Personally I don't mind having a 350W card, I just need to read the reviews to make sure the cooling solution is adequate. In practice, I tend to restrict the TDP of my card depending on the game I play, with the option of having the full performance only when it's truly needed.


----------



## bug (Jul 15, 2020)

HD64G said:


> Have said months ago that if Navi21 has 80CUs it will end up being 30-40% faster than stock 2080Ti while consuming ~250W. If clocks are pushed higher it will get ~10% more performance and will consume 300W. Maybe a limited WC and more expensive model will be made to reach max clockspeeds.


Wow, you're so good you actually know the power draw. Kudos, dude.

I, for one, hope the rumor is true (yeah, yeah, I know better), because we really need more HP in the mid-range for 4k gaming to actually become mainstream.


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> Personally I don't mind having a 350W card


That's you and likely not some of these users who double talk through power (complain when NV uses more, but OK when AMD does, lol). Surely that is from both sides of the track, but AMD users here are surely a vocal group... lol.

EDIT: Cute comment


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> That's you and likely not some of these users who double talk through power (complain when NV uses more, but OK when AMD does, lol). Surely that is from both sides of the track, but AMD users here are surely a vocal group... lol.


Well, for a general consumer product, or a pro-oriented product, I understand the problem, but high-end gaming cards are, well, for gamers, so why complain about a few extra watts? 

And there's somebody driving the boat? I hadn't noticed; I've seen so much fanboy flame around here, I was certain it was only the wind


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> Well, for a general consumer product, or a pro-oriented product, I understand the problem, but high-end gaming cards are, well, for gamers, so why complain about a few extra watts?
> 
> And there's somebody driving the boat? I haven't noticed then, I've seen so much fanboy flame around here, I was certain it's only the wind


It's like wallpaper... changes with the times and who's doing it... but I digress. 

A few extra watts? My man... Nvidia's flagship was 225W and you're good up to 350W? People just don't want to deal with cooling such loads... most gamers rock 1080p and are good with mid-range and below. So yeah, if flagships start at 350W, lol on that.

That said, I know you used 350W as an example... and I agree that if it is within 'a few watts' people won't care. But it wasn't within a few watts (5700 XT to 2070), it was 20W/10%. If that gap increases, anyone without bias will complain.


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> A few extra watts? My man... Nvidia's flagship was 225W and you're good up to 350W? People just don't want to deal with cooling such loads... most gamers rock 1080p and good with mid-range and below. So yeah, if the flagship start at 350W, lol on that.
> 
> That said, I know you used 350W as an example... and I agree that if it is within 'a few watts' people won't care. But, it wasn't within a few watts (5700XT to 2070) it was 20W/10%. If that gap increases, anyone who doesn't have bias will complain.


Well, I agree that for 1080p/144FPS 150W is enough. But we don't need flagships for that.

But for high refresh ultra-wide or for 4k/120 FPS I am prepared to accept 350W or even more, as long as the card is reasonably silent and it doesn't cost an arm. 
An arm is 700 euro for me, btw  .


----------



## bug (Jul 15, 2020)

BoboOOZ said:


> Well, I agree that for 1080p/144FPS 150W is enough. But we don't need flagships for that.
> 
> But for high refresh ultra-wide or for 4k/120 FPS I am prepared to accept 350W or even more, as long as the card is reasonably silent and it doesn't cost an arm.
> An arm is 700 euro for me, btw  .


You have to account for what a 350W high-end card does to the mid-range, too 
Fortunately, 375W is the most you can get out of a PCIe card (and that's with two 8 pin connectors), no one will go so close to that limit.


----------



## TheLostSwede (Jul 15, 2020)

Calmmo said:


> When it comes to radeon performance - rumours have been proven to be fake. At least my memory doesnt go back enough to recall a point in time when they were true.
> They boil down to "radeon amazing otherworldly performance incoming" only to be proven false.
> 
> I'll believe it when i see it.
> (it'd be great if it does to be clear)


Rumours ≠ fact. 
Fake news is something else entirely.

Rumours should always be taken with a healthy dose of NaCl.


----------



## BoboOOZ (Jul 15, 2020)

bug said:


> You have to account for what a 350W high-end card does to the mid-range, too
> Fortunately, 375W is the most you can get out of a PCIe card (and that's with two 8 pin connectors), no one will go so close to that limit.


Hahah, you say that because you don't listen to rumors.

But I was hearing yesterday night some crazy rumors about Nvidia using a new power connector on some high-end card just because of that (going above 350W).


----------



## HD64G (Jul 15, 2020)

bug said:


> Wow, you're so good you actually know the power draw. Kudos, dude.
> 
> I, for one, hope the rumor is true (yeah, yeah, I know better), because we really need more HP in the mid-range for 4k gaming to actually become mainstream.


We have enough elements to calculate both performance and power draw: official slides from AMD showed +50% efficiency vs Navi gen 1, and the 80 CUs of Navi 21 have been known for some time now. The minimum clocks of RDNA 2 are also given by the next-gen consoles.

I hope AMD manages to increase their market share in the top tier of the GPU market, as that will push prices down both for new and used GPUs. We as customers need that as much as we needed Zen to arrive and create a healthy CPU market, like the one we have now with $400 for the 12C/24T 3900X.
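Taking those figures at face value, the claimed numbers can be combined in a quick back-of-envelope estimate. Everything below is a rumor or a marketing claim, not a confirmed spec, and the ~225W baseline for the 5700 XT is an assumption:

```python
# Rough estimate from the rumored numbers: if Big Navi doubles the
# 5700 XT's performance and RDNA 2 delivers the claimed +50% perf/watt,
# what board power would that imply? Purely illustrative.

navi10_power_w = 225        # assumed typical board power, RX 5700 XT
perf_target = 2.0           # rumored: 2x Navi 10 performance
perf_per_watt_gain = 1.5    # AMD's claimed +50% perf/watt for RDNA 2

# perf/watt = perf / power  =>  power = perf / (perf/watt)
navi21_power_w = navi10_power_w * perf_target / perf_per_watt_gain
print(f"Implied board power: {navi21_power_w:.0f} W")  # -> 300 W
```

Which is why estimates in this thread cluster around the 250-300W mark depending on how aggressively the chip is clocked.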


----------



## cucker tarlson (Jul 15, 2020)

BoboOOZ said:


> Hahah, you say that because you don't listen to rumors.
> 
> But I was hearing yesterday night some crazy rumors about Nvidia using a new power connector on some high-end card just because of that (going above 350W).


I heard about the new connector too, it'll charge your phone at 225W straight off the single 12-pin



HD64G said:


> We have enough elements to calculate on both performance and power draw after official slides from AMD showed the +50% efficiency vs Navi gen1 and 80CUs of Navi21 is known for some time now. Minimum clocks of RDNA2 is given from the next-gen consoles also.
> 
> I hope AMD manages to increase their market share on the top tier of the GPU market as that will push prices down both for new and used GPUs. We as customers need that as much as we needed Zen to arrive and create a healthy CPU market as the one we have now with $400 for 12C/24T for 3900X.


If +50% perf/watt is the only known fact, it'd be wise to look at what AMD achieved the last time they claimed +50% perf/watt. You may be disappointed: it wasn't 2.25x (1.5x the performance of a card that was itself 1.5x faster), not even 1.5x actually.
And no, 80 CUs is not "known for some time", it's "rumored for some time", with no basis except clickbait tech sources on TT/YT


----------



## BoboOOZ (Jul 15, 2020)

cucker tarlson said:


> I heard about the new connector too,it'll charge your phone at 225w striaght off the single 12-pin


As long as the RT performance of your phone is 10x that of a 2080 Ti, who cares? We're trolling too much at this point.


cucker tarlson said:


> if 50% perf/wat is the only known fact it'd be wise to see what amd achieved with +50% perf/wat last time they had it.
> and no,80cu is not "known for some time".it's "rumored for some time" with no basis except clickbait tech sources on TT/YT


The thing is, RDNA 2 doesn't need 80 CUs to have 2x the performance of Navi 10; there's also IPC and clock speed in the equation. 64 CUs could be enough.
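As a toy illustration of that point (assuming, unrealistically, perfect scaling with CU count, and using made-up IPC and clock figures, with only the ~2.15 GHz boost clock taken from the rumor above):

```python
# Toy scaling model: relative performance ~ CUs * clock * IPC.
# Real GPUs scale worse than linearly with CU count, so treat this
# as an upper bound on gains, not a prediction.

def rel_perf(cus, clock_ghz, ipc_gain, base_cus=40, base_clock=1.9):
    """Performance relative to a Navi 10 / RX 5700 XT-like baseline."""
    return (cus / base_cus) * (clock_ghz / base_clock) * ipc_gain

# 64 CUs at the rumored ~2.15 GHz with an assumed 10% IPC uplift:
print(round(rel_perf(64, 2.15, 1.10), 2))  # -> 1.99, i.e. roughly 2x
```

So under those (optimistic) assumptions, 64 CUs would indeed land near 2x Navi 10 without needing 80.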


----------



## kapone32 (Jul 15, 2020)

bug said:


> Wow, you're so good you actually know the power draw. Kudos, dude.
> 
> I, for one, hope the rumor is true (yeah, yeah, I know better), because we really need more HP in the mid-range for 4k gaming to actually become mainstream.


Exactly, we need a card to serve the masses. I have a 1.2KW PSU, so power draw is a non-factor for me. The price has to be right though, because if it is too high in the stratosphere it will fail, regardless of whether the 2080 Ti is faster or not. I actually do not like when people reference the 2080 Ti vs RDNA 2; that is not AMD's official guidance.


----------



## cucker tarlson (Jul 15, 2020)

BoboOOZ said:


> As long as the RT performance of your phone is 10X that of a 2080Ti, who cares . we're trolling too much at this point.
> 
> The thing is, RDNA2 doesn't need 80CU to have 2X the performance of the Navi10, there's also IPC and clock speeds in the equations. 64 CU could be enough.


doesn't perf/watt include IPC?


----------



## bug (Jul 15, 2020)

BoboOOZ said:


> Hahah, you say that because you don't listen to rumors.
> 
> But I was hearing yesterday night some crazy rumors about Nvidia using a new power connector on some high-end card just because of that (going above 350W).


Using two 8 pin connectors is already outside PCI-SIG specs, I doubt anything on top of that will make its way into consumer space.



cucker tarlson said:


> doesn't perf/wat include ipc ?


Nope. IPC is perf/clock. Higher clocks mean higher power draw, but one does not include the other.


----------



## cucker tarlson (Jul 15, 2020)

bug said:


> Using two 8 pin connectors is already outside PCI-SIG specs, I doubt anything on top of that will make its way into consumer space.
> 
> 
> Nope. IPC is perf/clock. Higher clocks mean higher power draw, but one does not include the other.


but IPC is a way to achieve higher perf/watt


----------



## BoboOOZ (Jul 15, 2020)

cucker tarlson said:


> doesn't perf/wat include ipc ?


Perf already includes everything: IPC gains, efficiency from shrinking, better efficiency from the maturing process, architecture improvements, clock speed improvements, etc.
But you don't necessarily need IPC gains to improve perf/watt. The simplest way is just to downclock the chip.

Anyway, what I pointed out to you is that 80 CUs are not needed for RDNA 2 to achieve 2x Navi 10 performance.
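The reason downclocking helps perf/watt is that dynamic power scales roughly with frequency times voltage squared, and near the top of the voltage/frequency curve voltage climbs with clocks, so power falls much faster than performance. A crude sketch with made-up coefficients and voltages (illustrative only, not measured 5700 XT data):

```python
# Dynamic power ~ C * f * V^2; performance ~ f, to first order.
# Downclocking from a point high on the V/f curve cuts power more
# than performance, so perf/watt improves.

def dyn_power(f_ghz, v_volts, c=100.0):
    return c * f_ghz * v_volts ** 2

stock = dyn_power(1.9, 1.2)   # high clock needs high voltage
eco   = dyn_power(1.6, 1.0)   # lower clock allows lower voltage

perf_ratio = 1.6 / 1.9        # ~0.84: you lose ~16% performance
power_ratio = eco / stock     # ~0.58: but you save ~42% power
print(round(perf_ratio / power_ratio, 2))  # -> 1.44x perf/watt gain
```

The exact numbers depend entirely on the card's V/f curve, but the direction is general: perf/watt almost always improves as you move down the curve.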


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> Well, I agree that for 1080p/144FPS 150W is enough. But we don't need flagships for that.
> 
> But for high refresh ultra-wide or for 4k/120 FPS I am prepared to accept 350W or even more, as long as the card is reasonably silent and it doesn't cost an arm.
> An arm is 700 euro for me, btw  .


That's you bud... you. 350W+ single GPU... lol.. I hope not.



BoboOOZ said:


> The simplest way is just to downclock the chip.


No?

You are also lowering performance this way...


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> No?
> 
> You are also lowering performance this way...


The second part is true, but the first part is not.
You are indeed lowering performance, but you are lowering power consumption more. The 5700 XT is most efficient in its Apple variant at slightly above 1GHz, it's still decent at 1.6GHz, and well, we know how "glorious" it is at 1.9GHz and above...


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> Second part is true, but first part is not.
> You are indeed lowering performance, but you are lowering electrical consumption more. The 5700XT is most efficient in its Apple variant at slightly above 1GHz, it's still decent at 1.6GHz and well, we know how "glorious" it is at 1.9GHz and above...



More? How do you know? You cite the 5700 XT... I can buy that one because of its power use compared to its performance... but what about every other card?

So again I say NO: you lower performance and power this way, but there's no way to tell which drops "more" or how much power each MHz/GHz actually saves. No, lol.


----------



## cucker tarlson (Jul 15, 2020)

EarthDog said:


> That's you bud... you. 350W+ single GPU... lol.. I hope not.
> 
> No?
> 
> You are also lowering performance this way...


Yup.
Even if you keep it quiet, which is actually easy on 300W cards with how good coolers have become, you still have to deal with the heat output.
I'd say I'd draw the line at 250W, but ideally I want to be closer to 200W. Can't afford that with a 2070S, it has to be overclocked for 1440p. Maybe next time it'll be a better idea to get a bigger card but run it stock.


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> More? How do you know? You list the 5700XT.... I can buy that because of its power use compared to performance... but what about every other card?
> 
> So again I say, NO as you lower performance and power this way... no way to tell which is "more" or how much per MHz/GHz it is actually lowering power. No. lol.


Well, yes, just look at mobile graphics cards. Look at Nvidia's 2060, for instance: which variant is the most efficient? The 80W Max-Q version, of course. The numbers are out there, this isn't a difficult question to answer. I only took the example of the 5700 XT because it's the most telling; you have results available across a 2x range of clock speeds.


----------



## moproblems99 (Jul 15, 2020)

EarthDog said:


> I'm still sticking with my initial guess that big navi/rdna2 will fall somewhere between a 2080ti and Ampre's flagship (non titan). My biggest question is power use to get there on the same process...
> 
> Whereas on the NV side, its a new arch plus a die shrink.



Weren't the "rumors" of "Big" Ampere already over 300w, hence the 12 pin power connector?

I hope AMD isn't worse... on the flip side, I'll be able to cook eggs and bacon while gaming. I can see it now: RGB grease catch. I should patent that.


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> Well, yes, just look at the mobile graphic cards. Look at Nvidias 2060, for instance, which variant is the mùost efficient? The 80W MaxQ version, of course. The numbers are out there, this isn't a difficult question to answer. I only took the example of the 5700XT because it's the most telling, you have the results for 2X the clock speed increase available.


Sure, but each card is different, man... you can't just lower the clocks and accomplish that goal. Sorry.


moproblems99 said:


> Weren't the "rumors" of "Big" Ampere already over 300w, hence the 12 pin power connector?
> 
> I hope AMD isn't worse...on the flip side I'll be able to cook eggs and bacon while gaming.  I can see it now:. RGB grease catch.  I should patent that.


I haven't come across that rumor... however, I've seen you (someone) say it twice now in this thread.

3 hours ago... https://www.pcgamer.com/nvidia-ampere-12pin-power-connector/

They aren't buying it...

And think about it... what kind of a BEAST would Ampere have to be at over 350W? If we consider what we have now at 225W... plus a node shrink, plus a new arch, plus 50% more power to use... I don't think any competitor stands a chance on the performance front if that is true.

So many rumors... so many people taking them as gospel and standing by them... god bless the lemmings... every one!


----------



## BoboOOZ (Jul 15, 2020)

moproblems99 said:


> I hope AMD isn't worse...on the flip side I'll be able to cook eggs and bacon while gaming.  I can see it now:. RGB grease catch.  I should patent that.


High cholesterol AND gaming? That's a really unhealthy life you plan on leading...


----------



## Vayra86 (Jul 15, 2020)

cucker tarlson said:


> but ipc is a way to achieve higher perf/wat



No, higher perf/watt is. IPC is instructions per clock, but if you can do more within a single tick that tick is likely also drawing more power.


----------



## BoboOOZ (Jul 15, 2020)

EarthDog said:


> Sure, but each card is different man.. you just can lower the clocks and accomplish that goal. Sorry.


No need to be sorry, just show me that card: the card which is more efficient at a higher clock speed.


----------



## EarthDog (Jul 15, 2020)

BoboOOZ said:


> No need to be sorry, just show me that card. the card which is more efficient at a higher clock speed.


You're missing the point I think?

All I am saying is that you cannot simply lower the clocks to achieve what you want. Power doesn't drop linearly with clocks. Yes, it goes down... that's DUH obvious, but it doesn't scale linearly.


----------



## moproblems99 (Jul 15, 2020)

EarthDog said:


> Sure, but each card is different man.. you just can lower the clocks and accomplish that goal. Sorry.
> I haven't come across that rumor... however, I've seen you (someone) say it twice now in this thread.
> 
> 3 hours ago... https://www.pcgamer.com/nvidia-ampere-12pin-power-connector/
> ...



I believe there were pictures of the card, the connector, and a pinout... I'll see if I can find it. I may not have been paying close enough attention.

Edit: Yeah, I'll retract this. Definitely not enough substance. The pic of the connector is not convincing enough to think it wasn't shopped. No card pic, no pinout.


----------



## cucker tarlson (Jul 15, 2020)

EarthDog said:


> You're missing the point I think?
> 
> All I am saying is that you cannot simply lower the clocks to achieve what you want. Power doesn't drop linearly with clocks. Yes, it goes down... that's DUH obvious, but it doesn't scale linearly.


same as CU count doesn't scale.
even nvidia found that out painfully.
4,300 CUDA cores to achieve 22% over 3,072.
ouch.

well, ouch for us. they probably made tons of money off TU102 2080 Tis, Titans and Quadros.


----------



## EarthDog (Jul 15, 2020)

moproblems99 said:


> I believe there was a pictures card, connector, and pin out... I'll see if I can find it.  I may not have been paying close enough attention.
> 
> Edit:. Yeah, I'll retract this.  Definitely not enough substance. The pic of the connector is not convincing enough to think it wasn't shopped.  No card pic, no pinout.


I have no idea if it is true or not but think about it.........

225W now... (2080 Ti)

350W... a 50%+ increase of power + node shrink + IPC increases from the new arch.... This thing would be a 4k/120 card if all that were true.


----------



## cucker tarlson (Jul 15, 2020)

EarthDog said:


> I have no idea if it is true or not but think about it.........
> 
> 225W now... (2080 Ti)
> 
> 350W... a 50%+ increase of power + node shrink + IPC increases from the new arch.... This thing would be a 4k/120 card if all that were true.


dude,2080Ti is 280 at least


----------



## EarthDog (Jul 15, 2020)

cucker tarlson said:


> dude,2080Ti is 280 at least


At reference speeds, its 225W. FE = 250W. That doesn't change my point, however. 

EDIT: Oops... 250/260W according to NV website (reference and FE). So let me update for accuracy.................

250W now... (2080 Ti)

350W... a ~40% increase of power + node shrink + IPC increases from the new arch.... This thing would be a 4k/120 card if all that were true.


----------



## cucker tarlson (Jul 15, 2020)

EarthDog said:


> At reference speeds, its 225W. FE = 250W. That doesn't change my point, however.
> 
> EDIT: Oops... 250/260W according to NV website (reference and FE). So let me update for accuracy.................
> 
> ...


well,a 10nm node shrink.64MT/mm
it's not 7nm tsmc at close to a 100


----------



## EarthDog (Jul 15, 2020)

cucker tarlson said:


> well,a 10nm node shrink.64MT/mm
> it's not 7nm tsmc at close to a 100


In English, plz..


----------



## cucker tarlson (Jul 15, 2020)

EarthDog said:


> In English, plz..


they're going with samsung's 8nm (10nm extension). 64 MegaTransistor/mm2. tsmc is much denser.


----------



## EarthDog (Jul 15, 2020)

cucker tarlson said:


> they're going with samsung's 8nm (10nm extension). 64 MegaTransistor/mm2. tsmc is much denser.


Gotcha... but still improvements to be had versus current process, yes?


----------



## cucker tarlson (Jul 15, 2020)

EarthDog said:


> Gotcha... but still improvements to be had versus current process, yes?


yuge

almost twice. tsmc's 12nm is a 16nm extension at 33 MT/mm².
still more to gain than amd going from 7nm DUV to 7nm EUV.
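The density figures being thrown around can be compared directly. A quick sketch (the numbers are the ballpark marketing figures quoted in this thread, not measured die data):

```python
# Ballpark logic-density figures (million transistors per mm^2), as quoted
# in the thread; treat them as marketing-level estimates.
density = {
    "TSMC 12nm (16nm-class)": 33,
    "Samsung 8nm (10nm-class)": 64,
    "TSMC 7nm": 96,
}

baseline = density["TSMC 12nm (16nm-class)"]
for node, mtr in density.items():
    print(f"{node}: {mtr} MTr/mm^2 -> {mtr / baseline:.2f}x vs TSMC 12nm")
```

Samsung 8nm lands at roughly 1.9x TSMC 12nm, hence "almost twice", while TSMC 7nm is about 1.5x denser again than Samsung 8nm.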


----------



## HD64G (Jul 15, 2020)

Efficiency=Perf/W and Performance is the sum of all things that result in it (arch, IPC, core and VRAM clockspeeds, VRAM bandwidth, etc).


----------



## moproblems99 (Jul 15, 2020)

EarthDog said:


> At reference speeds, its 225W. FE = 250W. That doesn't change my point, however.
> 
> EDIT: Oops... 250/260W according to NV website (reference and FE). So let me update for accuracy.................
> 
> ...



I hope it is true and frankly I think they are both true-ish.  If NV has 40% or more increase in performance over 2080ti then I don't think AMD can surpass them - the gap is just so large.  They'll get to a very competitive spot and it will be wonderful for all.


----------



## bug (Jul 15, 2020)

cucker tarlson said:


> but ipc is a way to achieve higher perf/wat


It's tempting to think that, but no. The concepts are actually different. Look who's offering the highest perf/W: https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-tuf-evo/29.html
It's mid-rangers which don't have the highest IPC by a long shot. And the explanation is very simple: as you increase clocks, IPC goes up, but power draw goes up faster. Inevitably you'll reach a point where IPC goes up (it always does with clock increase), but perf/W goes down. That's where high-end cards sit.

Edit: see below about IPC.


----------



## EarthDog (Jul 15, 2020)

bug said:


> It's mid-rangers which don't have the highest IPC by a long shot.


But IPC wouldn't change with the same architecture. What does change are clock speeds and ROP/TMU/shader counts. 

When you increase clocks, IPC does NOT go up. Remember, IPC is Instructions Per CLOCK cycle. IPC doesn't change when you increase clock speeds. If I can do 100 instructions per clock at XXX MHz, it's still 100 instructions per clock at a higher MHz... the difference there is ONLY in clock speeds, not how much work gets done within each cycle.
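The distinction can be made concrete with a toy model: throughput is IPC multiplied by clock, so raising the clock raises throughput while IPC itself is untouched (all numbers here are purely illustrative):

```python
def throughput_gips(ipc, clock_ghz):
    """Billions of instructions per second: per-cycle work (IPC) times cycle rate."""
    return ipc * clock_ghz

ipc = 100  # hypothetical instructions retired per clock cycle

base = throughput_gips(ipc, 1.5)  # 150 GIPS at 1.5 GHz
oc = throughput_gips(ipc, 1.8)    # 180 GIPS at 1.8 GHz: +20%, all from clocks

# IPC is identical at both operating points; only the cycle rate changed.
print(base, oc)
```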


----------



## bug (Jul 15, 2020)

EarthDog said:


> But IPC wouldn't change with the same architecture. What does change are clock speeds and ROP/TMU/shader counts.
> 
> When you increase clocks, IPC does NOT go up. If I can do 100 instructions per clock at XXX MHz, it's still 100 instructions per clock at a higher MHz... the difference there is ONLY in clock speeds.


Yeah, brain fart. IPC goes up when you add hardware resources, but it doesn't with clock changes.


----------



## TheoneandonlyMrK (Jul 15, 2020)

AMD are going to win this round I think, until next year when Nvidia ports Ampere to TSMC and calls them Supers.

Yeah, Nvidia might get near or pip the performance of Big Navi at the high end, but at 50-100 watts more and pushed to its limits. I think AMD might hold back a ltd edition counter punch.

Nvidia would just roll with Samsung if they were that good, but they did run back to TSMC in the end, and that'll take time to fit in now.

Mad rumours of 350-450 watt amperes abound.


----------



## Fluffmeister (Jul 15, 2020)

The 2080 Ti is already as good as 2 years old, I'd hope for more than 40-50% more performance frankly.


----------



## Metroid (Jul 16, 2020)

A "50% faster than the Nvidia 2080 Ti" rumor does not sound impossible, even given AMD's misleading track record in the past. Remember that AMD is on 7nm, and RDNA 2 can indeed deliver that kind of jump in performance. People should not underestimate AMD; look at Ryzen.


----------



## kings (Jul 16, 2020)

We have to wait and see, there is always a lot of hype with each new AMD generation. It will always be the greatest thing ever. Fury was going to be the Titan killer and the overclockers dream, Vega were the "Poor Volta" thing... In every new generation, Nvidia is "doomed" due to something special that AMD is preparing.

And frankly, in my opinion these rumors thrown into the air by random YouTubers to get clicks end up damaging AMD's image more than benefiting it. We've seen this in the past: people start to create an idea of what the product will be, and then when it finally arrives, it disappoints, not because the product is bad, but because the expectations were unrealistic.


----------



## cucker tarlson (Jul 16, 2020)

time to go back to amd
they'll beat 2080Ti by 50% and their drivers have no issues
they'll trounce nvidia's 450w cards


----------



## GoldenX (Jul 16, 2020)

I expect the biggest bugfest since the HD2900 was announced, followed by 8 years of small and slow improvements.


----------



## bug (Jul 16, 2020)

GoldenX said:


> I expect the biggest bugfest since the HD2900 was announced, followed by 8 years of small and slow improvements fine wine.


Ftfy


----------



## cucker tarlson (Jul 16, 2020)

bug said:


> Ftfy


lol,you won tpu for today.


----------



## BoboOOZ (Jul 16, 2020)

cucker tarlson said:


> they'll trounce nvidia's 450w cards


It's not 450W, it reads "up to 600W"


----------



## kayjay010101 (Jul 16, 2020)

BoboOOZ said:


> Well, I agree that for 1080p/144FPS 150W is enough. But we don't need flagships for that.
> 
> But for high refresh ultra-wide or for 4k/120 FPS I am prepared to accept 350W or even more, as long as the card is reasonably silent and it doesn't cost an arm.
> An arm is 700 euro for me, btw  .


For me specifically, I don't care how much power draw there is. If it's the top end card, I'm watercooling it anyway, and in that case it doesn't matter since I have upwards of a thousand watts of cooling capacity with my rads.


----------



## Kanan (Jul 16, 2020)

If you wanna be on the safe side, simply await the reviews, but on top of that, look at the postings in the AMD subreddit. If it has a lot of "I have a problem with the RX 6700 XT" posts, you know what to do.


----------



## cucker tarlson (Jul 16, 2020)

5700 XT was supposed to be $250 and 150W. they were just $200 and 100W off.
nuff said. ppl never learn.


----------



## Kanan (Jul 16, 2020)

cucker tarlson said:


> 5700xt was supposed to be $250 150W.they were just $200 and 100w off
> nuff said.ppl never learn.


The 5700 XT was hiked up, according to performance and pricing vs Nvidia GPUs. Life is hard. 2020 will bring improvements to the GPU situation.


----------



## cucker tarlson (Jul 16, 2020)

Kanan said:


> Life is hard


no it isn't
unless you mean in Africa.
then yes,it's hard.


----------



## EarthDog (Jul 16, 2020)

theoneandonlymrk said:


> AMD are going to win this round I think , until next year when Nvidia port ampere to Tsmc and call them supers.
> 
> Yeah Nvidia might get near or pip the performance of big Navi at the high end but at 50-100 watts more and pushed to it's limit's, I think AMD might hold back a ltd edition counter punch.
> 
> ...


bookmarking this post....


----------



## Super XP (Jul 16, 2020)

AMD has been really focused on RDNA2 just like they've been for all the ZEN releases.
I can see RDNA2 having the same effect as ZEN2 had in the CPU world. Or perhaps the upcoming ZEN3, which should be a much larger punch upside the head.


----------



## bug (Jul 16, 2020)

Super XP said:


> AMD has been really focused on RDNA2 just like they've been for all the ZEN releases.
> I can see RDNA2 having the same effect as ZEN2 had in the CPU world. Or perhaps the upcoming ZEN3, which should be a much larger punch upside the head.


I've been hearing this story (in various forms) for literally _every single launch_ since Polaris.
I've grown to hope for the best, but expect the worst.


----------



## EarthDog (Jul 16, 2020)

bug said:


> I've been hearing this story (in various forms) for literally _every single launch_ since Polaris.
> I've grown to hope for the best, but expect the worst.


Some people's cups runneth over, others take a more realistic approach.


----------



## oxrufiioxo (Jul 16, 2020)

I feel like this happens towards the end of every GPU lifecycle. Tons of rumors about crazy performance at lower prices etc. etc., with the hype train getting out of control. I'm guessing Ampere will be out first, and regardless of how it performs it'll be expensive; at a minimum, cards will be priced tier-for-tier with Turing if we're lucky. Then a month or two later AMD will come out with something that is competitive with the second-tier Nvidia GPU (assuming Nvidia doesn't hold back big Ampere) and will price it $50-100 less.... 

Then fanboys will fight for another 2 years with AMD is a better value, nvidia is faster and more efficient, amd drivers suck, nvidia is mean, AMD is dumb till we get rumors about RDNA 3 and nvidia 4000 lol.


I still feel most people just want AMD competitive so that they can buy an nvidia GPU for cheaper defeating the whole point of AMD being competitive.... 

Me I don't really care who makes the fastest card but whoever does will get my $$$


----------



## EarthDog (Jul 16, 2020)

oxrufiioxo said:


> Then fanboys will fight for another 2 years with AMD is a better value, nvidia is faster and more efficient, amd drivers suck, nvidia is mean, AMD is dumb till we get rumors about RDNA 3 and nvidia 4000 lol.


This has the potential to be my first signature ever at TPU... 

Forums, especially this one and several participants, are getting so tiring.....


----------



## BoboOOZ (Jul 16, 2020)

EarthDog said:


> Forums, especially this one and several participants, are getting so tiring.....


I used to spend time on Sherdog, a MMA forum a while ago. The insults and the flame wars were much, much worse than here. Don't hesitate to use the "ignore" function, it cleans up the threads quite nicely.


----------



## EarthDog (Jul 16, 2020)

BoboOOZ said:


> I used to spend time on Sherdog, a MMA forum a while ago. The insults and the flame wars were much, much worse than here. Don't hesitate to use the "ignore" function, it cleans up the threads quite nicely.


I do already.. trust me. :/


----------



## Super XP (Jul 16, 2020)

bug said:


> I've been hearing this story (in various forms) for literally _every single launch_ since Polaris.
> I've grown to hope for the best, but expect the worst.


RDNA 2 has nothing to do with previous launches. The main difference is that AMD separated the Gaming GPU and the Server GPU. Just like the ATI days. 

That's the main difference, and that's why it should have an industry effect.


----------



## bug (Jul 16, 2020)

Super XP said:


> RDNA 2 has nothing to do with previous launches. The main difference is that AMD separated the Gaming GPU and the Server GPU. Just like the ATI days.
> 
> That's the main difference, and that's why it should have an industry effect.


Yes, I'm guessing that's why it's called RDNA2, because it has nothing to do with RDNA 
Wait and see, I guess?



BoboOOZ said:


> I used to spend time on Sherdog, a MMA forum a while ago. The insults and the flame wars were much, much worse than here. Don't hesitate to use the "ignore" function, it cleans up the threads quite nicely.


Ignoring leaves you with quotes seemingly out of nowhere. I honestly don't know which is worse.


----------



## BoboOOZ (Jul 16, 2020)

bug said:


> Ignoring leaves you with quotes seemingly out of nowhere. I honestly don't know which is worse.


I'll take quotes instead of biased, redundant, aggressive content any day. Keeps me in a better mood.


----------



## moproblems99 (Jul 16, 2020)

kayjay010101 said:


> For me specifically, I don't care how much power draw there is. If it's the top end card, I'm watercooling it anyway, and in that case it doesn't matter since I have upwards of a thousands watts of cooling power with my rads.



Yes, who loves dumping 1000 watts of heat into a room for hours on end.  No one in a warm climate.


----------



## kayjay010101 (Jul 16, 2020)

moproblems99 said:


> Yes, who loves dumping 1000 watts of heat into a room for hours on end.  No one in a warm climate.


I don't live in a warm climate


----------



## xkm1948 (Jul 16, 2020)

This is just an attempt to hyper drive the RDNA2 hype train. There was one for Navi, one for Vega, one for Polaris. Oh wait, there was almost one for every gen of RTG cards!

Vega:








						AMD Vega Discussion Thread
					

Friendly notice, this is not your daily hype train. VEGA WILL NOT run on rainbow power and output 16K@60fps.   Source: https://www.chiphell.com/forum.php?mod=viewthread&tid=1716524&page=1&authorid=42930  Using google translate, here is what I can get out of it.  1. Consumer version VEGA will...




					www.techpowerup.com
				




Navi:








						CHOO CHOOOOO!!!!1! Navi Hype Train be rollin'
					

Didn't see this pop up on the forums so thought it should be shared.  https://wccftech.com/amd-navi-20-radeon-rx-graphics-card-ray-tracing-gcn-architecture-rumor/




					www.techpowerup.com
				




And I will just leave the froggy overlord @R-T-B 's word here:









						Editorial - Hype Trains and You:  A PSA
					

Hype Trains are bad. They are not just bad because a random frog on the internet told you so either, they are bad because they build upon themselves to the point that you would believe a random frog on the internet if he said something beneficial about your chosen product.  It's not just...




					www.techpowerup.com


----------



## moproblems99 (Jul 16, 2020)

xkm1948 said:


> This is just an attempt to hyper drive the RDNA2 hype train. There was one for Navi, one for Vega, one for Polaris. Oh wait, there was almost one for every gen of RTG cards!



Hey, my choo choo hype train was a joke.


----------



## TheoneandonlyMrK (Jul 16, 2020)

EarthDog said:


> bookmarking this post....


I'll accept I'm optimistic.


----------



## Fluffmeister (Jul 16, 2020)

Yeah maybe Nvidia don't want to disrupt 4K gaming, they want to blow it out of the frikin water.


----------



## TheoneandonlyMrK (Jul 16, 2020)

Fluffmeister said:


> Yeah maybe Nvidia don't want to disrupt 4K gaming, they want to blow it out of the frikin water.


They're being forced to push the envelope, that's clear; like with Intel, efficiency is out the window. Perhaps competition is turning up.


----------



## Fluffmeister (Jul 16, 2020)

theoneandonlymrk said:


> They're being forced to push the envelope, that's clear, like with Intel efficiency is out the window, perhaps competition Is turning up.



Yeah that 12pin lark in the other thread is odd to say the least, interesting how it will play out.


----------



## TheoneandonlyMrK (Jul 16, 2020)

Fluffmeister said:


> Yeah that 12pin lark in the other thread is odd to say the least, interesting how it will play out.


Hopefully. I can't wait for reviews, probably not on my buy list though.
The Rona f'd my savings right the f up, man's skint, just started a new contract though so you never know.


----------



## GoldenX (Jul 16, 2020)

xkm1948 said:


> This is just an attempt to hyper drive the RDNA2 hype train. There was one for Navi, one for Vega, one for Polaris. Oh wait, there was almost one for every gen of RTG cards!



BuT vEgA iS nOt GcN!


----------



## EarthDog (Jul 16, 2020)

theoneandonlymrk said:


> I'll accept I'm optimistic.


Isn't it odd, with the same information (rumors) out there that two people can come up with nearly polar opposite opinions?  

I don't think big Navi beats big Ampere. I think it gets close enough to be competitive for sure, however... which is all anyone asks. I don't believe it will both win and use less power (it's one or the other). The hurdles there are too tall IMO.


----------



## TheoneandonlyMrK (Jul 16, 2020)

EarthDog said:


> Isn't it odd, with the same information (rumors) out there that two people can come up with nearly polar opposite opinions?
> 
> I don't think big Navi beats big Ampere. I think it gets close enough to be competitive for sure, however... which is all anyone asks. I don't believe it will win and use less power (one or the other). The hurdles there are too tall IMO.


Nah, AMD are at the start of an improvement curve, Nvidia at the end, and it seems like a few decisions could go against Nvidia.

Regardless, it's all rumours and waffle; I sure as shit wouldn't put a fiver on who will win, I'm that confident I'm right.


----------



## EarthDog (Jul 16, 2020)

theoneandonlymrk said:


> Nah, AMD are at the start of an improvement curve, Nvidia at the end, and it seems like a few decisions could go against Nvidia.
> 
> Regardless, it's all rumours and waffle; I sure as shit wouldn't put a fiver on who will win, I'm that confident I'm right.


I wouldn't bet on my thoughts being right either... 

Seriously... how about a gentleman's bet? Maybe one of us have to wear an avatar of the other's choice for who is right or wrong? LOLOLOL. I think you'd look great in an "I  Earthdog" custom avatar for a month. 

EDIT: That said, I'm starting a scorched earth program here so, I doubt I'll be around long enough!


----------



## TheoneandonlyMrK (Jul 16, 2020)

EarthDog said:


> I wouldn't bet on my thoughts being right either...
> 
> Seriously... how about a gentleman's bet? Maybe one of us have to wear an avatar of the other's choice for who is right or wrong? LOLOLOL. I think you'd look great in an "I  Earthdog" custom avatar for a month.
> 
> EDIT: That said, I'm starting a scorched earth program here so, I doubt I'll be around long enough!


Shame, you do what you must.

You'd be missed.

I'll take that bet if it's for a week or month , eternity might be too much.

I may not always agree, but respect was always present, and I hope I didn't annoy you.
Never the intent.


----------



## xkm1948 (Jul 16, 2020)

*theoneandonlymrk* care to explain your "Nvidia bad decision" a bit more? Now I am curious.


----------



## moproblems99 (Jul 16, 2020)

Fluffmeister said:


> Yeah that 12pin lark in the other thread is odd to say the least, interesting how it will play out.



I don't see why they would do it.  No PSUs have it.  Unless it is meant to adapt two 6-pins together... but we already have that!?


----------



## trparky (Jul 17, 2020)

I'll believe it when I see it (the benchmarks, that is).


----------



## TheoneandonlyMrK (Jul 17, 2020)

xkm1948 said:


> *theoneandonlymrk *care to explain your "Nvidia bad decision" a bit more? Now I am curious.


Trying to wring savings out of Tsmc, I don't think that's going to work in their favour, we will see.


----------



## xkm1948 (Jul 17, 2020)

Ahhh, I see. I thought something was wrong with their uArch design.


----------



## TheoneandonlyMrK (Jul 17, 2020)

xkm1948 said:


> ahhh i see. I thought something wrong with their uArc design


No, it speaks for itself; it's selling well. But like I said, tech progression curves are not in Nvidia's favour either, and obviously things are going to change with three to as many as five discrete GPU makers. The future is looking absurdly unreadable, and I mean the future in ten years, not the next two to three.


----------



## Easo (Jul 17, 2020)

Sorry, but sounds too good to be true. Catching up to Nvidia alone would be an achievement for AMD, a start to crawl back. But this is just a fantasy, nothing more.


----------



## moproblems99 (Jul 17, 2020)

Easo said:


> Sorry, but sounds too good to be true. Catching up to Nvidia alone would be an achievement for AMD, a start to crawl back. But this is just a fantasy, nothing more.



I think they end up within 10-15% of Nvidia.  Navi was like Zen, and this is going to be like Zen+.  Maybe Zen 2 if we are lucky.

If we add and multiply rumors: Nvidia said Ampere was +50% over Turing, and AMD says about +40% over the 2080 Ti.  Simple rumor math results in AMD ending up within about 10%.
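That rumor math, spelled out (both inputs are unverified rumors; the RTX 2080 Ti is normalized to 1.0):

```python
# Both figures below are rumors, not confirmed specs; 2080 Ti = 1.0 baseline.
baseline = 1.0
big_ampere = baseline * 1.50  # rumored +50% over the Turing flagship
big_navi = baseline * 1.40    # rumored +40% over the 2080 Ti

gap = 1 - big_navi / big_ampere
print(f"Big Navi at {big_navi / big_ampere:.1%} of big Ampere, ~{gap:.0%} behind")
```

Taking both rumors at face value puts Big Navi around 93% of big Ampere, i.e. roughly 7% behind, comfortably "within about 10%".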


----------



## bug (Jul 17, 2020)

Easo said:


> Sorry, but sounds too good to be true. Catching up to Nvidia alone would be an achievement for AMD, a start to crawl back. But this is just a fantasy, nothing more.


Well, 5700 has caught up so well with 2060, it's got it beat in both perf and perf/W. Sure, the 5700 doesn't do RTRT, but AMD catching up to Nvidia isn't so far fetched. I mean, I don't expect that from RDNA2, but I wouldn't be totally surprised if it happens.


----------



## ARF (Jul 18, 2020)

If this is delivered as promised, I will try to get something with it, perhaps Navi 21:
the largest GPU and fastest graphics card.


----------



## ARF (Jul 20, 2020)

*AMD aims Big Navi launch for November as 'show of strength' for RDNA 2*

Read more: https://www.tweaktown.com/news/7388...ovember-as-show-of-strength-rdna-2/index.html

*AMD Big Navi GPU could launch alongside the PlayStation 5 and Xbox Series X in November*








						AMD Big Navi GPU could launch alongside the PlayStation 5 and Xbox Series X in November
					

Would be a busy time for AMD




					www.techradar.com
				




*AMD’S ‘NVIDIA-KILLER’ BIG NAVI GPU IS NO NEXT-GEN XBOX, PS5 BUZZKILL*





						AMD’s ‘Nvidia-Killer’ Big Navi GPU is No Next-Gen Xbox, PS5 Buzzkill | CCN.com
					






					www.ccn.com


----------



## ARF (Jul 30, 2020)

*AMD Radeon Instinct MI100 ‘CDNA GPU’ Alleged Performance Numbers Show Its Faster Than NVIDIA’s A100 in FP32 Compute, Impressive Perf/Value*








						AMD Radeon Instinct MI100 'CDNA GPU' Alleged Performance Numbers Show Its Faster Than NVIDIA's A100 in FP32 Compute, Impressive Perf/Value
					

Alleged performance numbers of AMD's next-gen CDNA GPU based Radeon Instinct MI100 have leaked out, and it's faster than NVIDIA's Ampere A100.




					wccftech.com
				




*NVIDIA RTX 3090 Will Allegedly Offer A Massive 50% Performance Increase*








						NVIDIA RTX 3090 Will Allegedly Offer A Massive 50% Performance Increase
					

We have a very quintessential rumor on the alleged performance of the RTX 3090 with us today. A couple of credible leakers on twitter have commented that the performance of the upcoming NVIDIA RTX 3090 is shaping up to be as much as a 50% increase over the RTX 2080 Ti. According to the same, […]




					wccftech.com
				





Wccftech had an article some time ago about nVidia's new naming scheme. They speculated that if nVidia drops the Ti from its names, that means it loses this round.


----------



## arbiter (Jul 30, 2020)

Just going by AMD's history of "look how much better our GPU is than Nvidia's": then reviewers get their hands on it, and it's not. People should expect a flop from the start. I remember one GPU AMD touted in their press conference as 10-20% faster than the competing Nvidia card. Then AMD released the in-game settings they used, and within hours reviewers tore their claims apart, because AMD had turned off things in the game that used ROPs and cranked up everything that used shaders, so they ended up with settings no person would ever use to play a game. Comparing the hype around these new cards to the hype of years past, it looks like another round of disappointment is coming.

On a side note, I wonder if AMD will come up with something to compete with NVENC, which current Nvidia cards have and which makes them a huge selling point for people looking for entry-level HQ streaming.


----------



## moproblems99 (Jul 30, 2020)

arbiter said:


> Looking like another round of disappointment is abound.



Disappointment is mostly in the eye of the beholder.  If they buy the train ticket, they're along for the crash.


----------



## fritolayduck (Jul 31, 2020)

R0H1T said:


> Yup not one one time have I seen tax savings, when the govt decided to lower taxes, being passed on to the consumers. Same goes in any & every industry out there, let's just say *capitalism, free markets & trickle down* is overrated. Companies just increase their margins whenever they can, Intel only reduced their prices because they were forced (with declining sales) to & unless people shift to buying AMD en masse Nvidia will continue their trope & price Ampere to heavens this fall.
> 
> I'm not pointing this just at Nvidia users, I'm also one, it's the reality. The DIY market probably flipped towards majority(?) AMD even before zen2 launched, if Mindfactory numbers & Amazon bestsellers list is anything to go by. So the excuse that you need top performance to be able to lead a particular sector is just BS, having said that AMD right now probably covers only two thirds of the market & price range that Nvidia's selling into. They don't have a competitive sub $200 GPU & nothing above $800, would be interesting to see if non Nvidia fans choose AMD provided RDNA2 based GPU's are competitive across the board. You know put your *$* where your mouth is?



NVIDIA is far ahead of AMD, there is no comparison, just look at the technologies, it must be bringing DLSS 3.0, there is still no reason to buy AMD.



BoboOOZ said:


> Well, AMD really OC'd the crap out of the 5700XT, I guess it was their strategy to make some money after selling the Radeon VII at a loss.
> 
> If competition remains high, I imagine both AMD and Nvidia will leave less and less performance on the table, so we will see TDP's going higher and higher.
> Personally I don't mind having a 350W card, I just need to read the reviews to make sure the cooling solution is adequate. In practice, I tend to restrict the TDP of my card depending on the game I play, with the option of having the full performance only for when it's truly needed.



The trend now is software technologies like DLSS that will make the cards age better.


----------



## Fry178 (Jul 31, 2020)

Well, how many times in the past 10 years was an AMD card supposed to beat the crap out of an NV card?
How many times did it actually happen?

Love how ppl want to see a price drop on certain cards. Sure, if it's through competition,
but why would a company lower their prices just because they were lower in the past?
Virtually everything costs more than the same item did 10 or 20 years ago.
And if "you" can't buy a Ti, guess what: buy an xx80 or xx70. As with everything else, you should buy with your wallet and brain,
not just go for "what's the top model...".
I mean, no one goes to a dealership and says, "I want that 700+ HP, 2-door, RWD super sports car for the price of your 4-door cookie box,
because it's way overpriced compared to other cars/brands...."


----------



## Vayra86 (Jul 31, 2020)

Fluffmeister said:


> Yeah maybe Nvidia don't want to disrupt 4K gaming, they want to blow it out of the frikin water.



Not at all, they want to drag it out as long as possible. That is why they introduced RT, to make even 1080p a renewed struggle.


----------



## Vya Domus (Jul 31, 2020)

Fry178 said:


> Well and how many times in the past 10y was an amd card supposed to be beating the crap out of an Nv card?



Can you tell me how many times AMD came out and told everyone "they're going to beat the crap out of Nvidia" or was it just that avid trolls and fanboys claimed such things ?


----------



## Vayra86 (Jul 31, 2020)

moproblems99 said:


> I think they end within 10-15% of Nvidia.  Navi was like Zen and this is going to like Zen+.  Maybe Zen 2 if we are lucky.
> 
> If we add and multiply rumors, Nvidia said Ampere was +50% over Turing.  AMD says about 40% over 2080ti.  Simple rumor math results in AMD ending up within about 10%.



Navi was like Zen? How? The only similarity is the node advantage, the technology is ancient at its core. It is in no way anything like Zen on an architectural level. It is not set to scale up, look at die size versus power draw. It also does not even remotely compare to Zen's position on the performance stack - Navi can barely catch up to upper midrange. RDNA2 might change those things but when I read they're already exploding the die size... not a good sign, even if you consider some 15-20% reserved for RT/increased shader sizes.

Navi 2 might be AMD's GPU Zen moment, but it's not a given; Ampere is still vague. And then they will have to keep delivering. Navi 1 was not delivering. It was crawling out of the grave, and only barely. So far RDNA has only been good for providing AMD with a competitive, good-margin product in the midrange... and when I hear that, I hear Polaris all over again, and Raja's echo of "focus" when he launched the RX 480.



Vya Domus said:


> Can you tell me how many times AMD came out and told everyone "they're going to beat the crap out of Nvidia" or was it just that avid trolls and fanboys claimed such things ?



You know they have an army of blind idiots for that, come on. If anything works in AMD's PR, it is their ability to mobilize them.


----------



## Vya Domus (Jul 31, 2020)

Vayra86 said:


> Navi was like Zen? How? The only similarity is the node advantage, the technology is ancient at its core.



What ? How is it "ancient" ?


----------



## Vayra86 (Jul 31, 2020)

Vya Domus said:


> What ? How is it "ancient" ?



Its just GCN with different letters and new VRAM.


----------



## Vya Domus (Jul 31, 2020)

Vayra86 said:


> Its just GCN



No, it's not. How many times will we go over this ?


----------



## Vayra86 (Jul 31, 2020)

Vya Domus said:


> No, it's not. How many times will we go over this ?



Okay, it's not, but the difference is irrelevant to the market it caters to; maybe that explains it better. Technical differences need to translate into customer advantages, and Navi does not do this. It just manages to keep up. Nvidia, on the other hand, translates new architecture into new features. Hopefully Navi 2 does similar.


----------



## Vya Domus (Jul 31, 2020)

The market it catered to was midrange, where AMD could make the most money. That market doesn't care whether it's GCN or not, or even much about features; it just cares about perf/price.


----------



## Vayra86 (Jul 31, 2020)

Vya Domus said:


> The market it catered to was mid range, where the most money AMD could have made. That market does not care if it's GCN or not or even about much in the way of features, they just care about perf/price.



The market definitely does care about features; that seems to be a recurring mistake in comparisons between the two camps. More importantly, it cares about the feature set in the midrange. There are segments, and one feeds the other; Navi 1 is in no way feeding anything but revenue. It does not make up a stack, because Navi 2 will make Navi 1 obsolete with a bigger feature set. It's basically the odd one out, and it is based on updated, old technology.

But you know all this and in a basic sense I don't think we see different things here. We just appraise them differently.


----------



## Super XP (Jul 31, 2020)

Vayra86 said:


> The market definitely does care about features, that seems to be a returning mistake in considerations between the two camps. More importantly, the feature set in the mid range actually. There are segments and one feeds the other, Navi 1 is in no way feeding anything but revenue. It does not make up a stack, because Navi 2 will make 1 obsolete with a bigger feature set. Its basically an odd one out, and it is based on updated, old technology.
> 
> But you know all this and in a basic sense I don't think we see different things here. We just appraise them differently.


I really didn't want to get into the history of why VEGA and why NAVI etc., so short and sweet:
AMD put most of its resources into ZEN back in 2012 when Jim Keller was hired. So they dabbled just enough in the GPU department to stay afloat and compete on a price/performance level. That's basically all they could do without fully losing the server market to Nvidia.
As soon as ZEN was released in 2017, a lot more resources went into the Radeon Technology Group to develop RDNA 1. After ZEN's success, especially ZEN+ & 2's, AMD moved several key ZEN engineers over to assist the Radeon Technology Group and help finish off the new GPU design known as RDNA2.

AMD's bread and butter is CPUs, and AMD's strategy actually worked out quite well. Now we will soon find out if this strategy worked to pull its GPU department out of the gutter and into Nvidia's fat face.

*AMD Processor Designs Leading up to the Superior ZEN Micro-Architecture*







----------



## EarthDog (Jul 31, 2020)

Super XP said:


> I really didn't want to get into the history as to why VEGA and why NAVI etc., short and sweet.
> AMD put most of its resources into ZEN back in 2012 when Jim Keller was hired. So they dabbled just enough in the GPU department just to stay afloat and compete on a price/performance level. That's basically all they could do, without losing the server market fully to Nvidia.
> As soon as ZEN was released in 2017, a lot more resources went into the Radeon Technology Group to develop RDNA 1. After ZEN's success, especially ZEN+ & 2's success, AMD moved several key ZEN engineers to assist the Radeon Technology Group and to help finish off the new GPU design known as RDNA2.
> 
> ...


Interesting take. For all intents and purposes, RTG and the CPU group are separate, right? Meaning they could focus on CPUs and GPUs concurrently, as they are two different things with different people, etc. Which Zen engineers were moved over for GPU work?


----------



## Super XP (Jul 31, 2020)

EarthDog said:


> Interesting take. For all intents and purposes RTG and the CPU group are separate, right? Meaning they could focus, concurrently, on CPUs and GPUs at the same time as they are two different things too. Different people, etc. What Zen engineers were moved over for GPU?


With regards to other companies, that may be so, but I was referring to an article I read way back, and an interview with Dr. Lisa Su or one of the company's executives, in which they explained what I posted. If I come by this information again, I will be happy to share it. I have a great memory, I just need to make it a habit of linking sources.


----------



## Vayra86 (Jul 31, 2020)

Super XP said:


> I really didn't want to get into the history as to why VEGA and why NAVI etc., short and sweet.
> AMD put most of its resources into ZEN back in 2012 when Jim Keller was hired. So they dabbled just enough in the GPU department just to stay afloat and compete on a price/performance level. That's basically all they could do, without losing the server market fully to Nvidia.
> As soon as ZEN was released in 2017, a lot more resources went into the Radeon Technology Group to develop RDNA 1. After ZEN's success, especially ZEN+ & 2's success, AMD moved several key ZEN engineers to assist the Radeon Technology Group and to help finish off the new GPU design known as RDNA2.
> 
> ...



But I don't question that and I do agree that their _overall_ strategy was good when it comes to Zen. I don't agree that it was a wise move regarding GPU - the moment they announced their focus on midrange with Polaris was the moment they fell behind so far that they still struggle to keep up - even with said midrange. Navi only underlines this so far, being clocked out of its efficiency curve to be competitive at its price point. They really did play a dangerous game the past years, giving Nvidia every reason and chance to come out on top.

As far as focus is concerned _for AMD_ then yes, even their GPU strategy was wise because they simply didn't have resources for more and their focus for performance was targeted around maximum use of their technology - in consoles, with Intel (KabyLake with their IGP), towards dGPU and professional markets. But that does not affect me as a customer. That is an important distinction and one reason for the disconnect I often see in these discussions.


----------



## Assimilator (Jul 31, 2020)

Regardless of whether RDNA is a rebrand of GCN or not, it can't be denied that Navi made massive efficiency strides against NVIDIA. Recall that the previous best AMD GPUs, Fury and Vega, were such power hogs that they had to use HBM, and even then weren't power-competitive with NVIDIA - but Navi uses GDDR and competes with Turing in perf/W. It's a pretty remarkable improvement.

The big question is whether AMD is able to continue this trend with RDNA2, especially considering the inclusion of RT hardware.


----------



## EarthDog (Jul 31, 2020)

Assimilator said:


> Regardless of whether RDNA is a rebrand of GCN or not, it can't be denied that Navi made massive efficiency strides against NVIDIA. Recall that the previous best AMD GPUs, Fury and Vega, were such power hogs that they had to use HBM, and even then weren't power-competitive with NVIDIA - but Navi uses GDDR and competes with Turing in perf/W. It's a pretty remarkable improvement.
> 
> The big question is whether AMD is able to continue this trend with RDNA2, especially considering the inclusion of RT hardware.


5500/5600 XT, sure. The 5700 XT was punching above its weight class and a bit out of its sweet spot. It loses to the 2070S by 12% (1440p) while that card uses 5% less power. It isn't a big difference, and agreed it's better than before, but the 5700 XT wasn't a mark of efficiency.

I'm wondering if we'll see more of the same with bNavi.
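As a rough sketch of what those two figures imply for perf/W (my own arithmetic from the 12% and 5% numbers quoted above, not a benchmark):

```python
# Relative perf/W at 1440p, 5700 XT normalized to 1.0 (illustrative numbers).
perf_2070s, power_2070s = 1.12, 0.95    # ~12% faster, ~5% less power
perf_5700xt, power_5700xt = 1.00, 1.00

ratio = (perf_2070s / power_2070s) / (perf_5700xt / power_5700xt)
print(round(ratio, 2))  # 1.18 -> roughly an 18% perf/W edge for the 2070S
```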


----------



## Super XP (Jul 31, 2020)

Vayra86 said:


> But I don't question that and I do agree that their _overall_ strategy was good when it comes to Zen. I don't agree that it was a wise move regarding GPU - the moment they announced their focus on midrange with Polaris was the moment they fell behind so far that they still struggle to keep up - even with said midrange. Navi only underlines this so far, being clocked out of its efficiency curve to be competitive at its price point. They really did play a dangerous game the past years, giving Nvidia every reason and chance to come out on top.
> 
> As far as focus is concerned _for AMD_ then yes, even their GPU strategy was wise because they simply didn't have resources for more and their focus for performance was targeted around maximum use of their technology - in consoles, with Intel (KabyLake with their IGP), towards dGPU and professional markets. But that does not affect me as a customer. That is an important distinction and one reason for the disconnect I often see in these discussions.


Agreed. 
Sometimes, when you are shedding $$$$ because of the Bulldozer debacle, you have no choice but to GUT the GPU department and pour all resources into the company's bread and butter, the CPU.

Great point though.


----------



## RedelZaVedno (Jul 31, 2020)

Never sit on a GPU hype train, if you don't want to derail


----------



## Super XP (Jul 31, 2020)

RedelZaVedno said:


> Never sit on a GPU hype train, if you don't want to derail


RDNA2 is the real deal. Finally!!!!!!


----------



## EarthDog (Jul 31, 2020)

Super XP said:


> RDNA2 is the real deal. Finally!!!!!!


^^

Spurs dug in that horse..........


----------



## moproblems99 (Jul 31, 2020)

EarthDog said:


> ^^
> 
> Spurs dug in that horse..........



Conductor you mean?


----------



## kapone32 (Jul 31, 2020)

When I look at AMD GPUs I only compare them to themselves. Was the 7950 noticeably faster than the 6850? Yes. Was Polaris faster than Tahiti? Not really, but at noticeably less power. Was Vega noticeably faster than Polaris? Yes. Is RDNA faster than Vega? Yes, and at 2/3 the power draw with 1/2 the processors. If Big Navi has the refinements of a year's more knowledge of the node and the same or a similar number of processors as Vega 64, it should be noticeably faster than the 5700 XT. This could be why Nvidia is talking about a 12-pin connector for their upcoming GPUs. What I don't like is how quick people are to focus on the high-end crown, forgetting that there is a huge, and I mean huge, price difference between the 2080 Ti (propaganda's darling) and the 5700 XT, which costs less than half in some cases. If AMD does produce a Navi card specced similarly to the Vega 64, I'd venture to say it should be a very decent card.


----------



## EarthDog (Jul 31, 2020)

kapone32 said:


> it should be noticeably faster than the 5700XT.


Of course. But the 5700XT is also nearly 50% slower than the 2080Ti. Price to performance doesn't scale and never has... especially with true flagships and high-performance.

But only comparing anything to itself/previous generations feels myopic to me. Akin to a horse running a race with its blinders on so they can't see the other horses next to it and get freaked out (with ponies, that is good, not so with products). To get a good idea of the market and a product's placement, you have to consider the competition as part of the metric. Then you filter the data by your own personal needs. Some couldn't care less about power, for example, others do (and yet others only care when the competitor uses more, LOL).


----------



## moproblems99 (Jul 31, 2020)

EarthDog said:


> Of course. But the 5700XT is also nearly 50% slower than the 2080Ti. Price to performance doesn't scale and never has... especially with true flagships and high-performance.
> 
> But only comparing anything to itself/previous generations feels myopic to me. Akin to a horse running a race with its blinders on so they can't see the other horses next to it and get freaked out (with ponies, that is good, not so with products). To get a good idea of the market and a product's placement, you have to consider the competition as part of the metric. Then you filter the data by your own personal needs. Some couldn't care less about power, for example, others do (and yet others only care when the competitor uses more, LOL).



In a competition, the competitor very much matters.  AMD doesn't need to win this round to start crawling out of the gutter.  A good start is to make it look like their GPU group is actually trying.  Within 10-20% of NV's top card is all they need this round, and that isn't going to be easy.  They are going to have to double Navi 1.


----------



## FinneousPJ (Jul 31, 2020)

Would be great if true. However, I'm expecting 20-30% better than the 2080 Ti, which is already HUGE.


----------



## EarthDog (Jul 31, 2020)

moproblems99 said:


> In a competition, the competitor very much matters.  AMD doesn't need to win this round to start crawling out if the gutter.  A good start is to make it look like their GPU group is actually trying.  Within 10-20% of NV top card is all they need this round and that isn't going to be easy.  They are going to have to double Navi 1.


That's what I'm saying too, more or less.

The 5700 XT is 45% behind a 2080 Ti. The RDNA2 flagship has to overcome that deficit with a process tweak and a new architecture, while NV has a big process shrink and a new architecture. Even if Ampere's non-Titan flagship is 30% faster than a 2080 Ti (the same margin as 1080 Ti to 2080 Ti), that's a 75% increase to match it. I don't recall ever seeing a 75% increase from flagship to flagship, do you? I'd be happy if it was within 10% of Ampere honestly... power be damned. I don't recall a 65% increase either.

Then... there are these rumors of Ampere using an arseload of power (300W nominal, 400W OC) on a new arch and process shrink... I'd have to imagine it's going to be well over 30%... approaching 50% if that all pans out.

Where does that leave Big Navi? Needing to be almost 100% faster? Yikes.
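One wrinkle in this kind of rumor math: a percentage deficit doesn't invert directly into the gain needed to close it. A quick sketch of the conversion (illustrative arithmetic only):

```python
def needed_gain(pct_behind):
    """If card A is pct_behind% slower than card B, the % gain A needs to tie B."""
    return (1 / (1 - pct_behind / 100) - 1) * 100

print(round(needed_gain(45), 1))  # 81.8 -> a 45% deficit takes ~82% more perf to close
print(round(needed_gain(50), 1))  # 100.0 -> "half as fast" means doubling to tie
```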


----------



## moproblems99 (Jul 31, 2020)

EarthDog said:


> Thats what im saying too more or less.
> 
> 5700xt is 45% behind a 2080ti. RDNA2 flagship has to overcome that deficit with a process tweak and new architecture while NV has a big process shrink and new architecture. Even if Ampre's non titan flagship is 30% faster than a 2080ti (same margin as 1080ti to 2080ti) that's 75% increase to match it. I dont recall ever seeing a 75% increase from flagship to flagship, do you? Id be happy if it was within 10% of Ampre honestly... power be damned.



I know what you meant, just extending.  My memory isn't good enough (nor do I care enough) to remember past yesterday   

There isn't any reason they can't do it.  It is just a massive task and will be super impressive if they even come close.


----------



## Dante Uchiha (Jul 31, 2020)

AMD could already hit/tie the 2080 Ti using an RDNA1 GPU with 64 CUs and better memory; there's an article on TechSpot that shows Turing and RDNA1 are architectures of a similar level. AMD's problem is software, not hardware, and they need to focus on that point... I'm not just talking about fixing bugs, I'm talking about optimizing more games to maintain more uniform superiority/performance.


----------



## Fluffmeister (Jul 31, 2020)

Yeah it's interesting, one of the common myths that was peddled was AMD would pull ahead when they got both major consoles using their hardware, thus all games would run great on AMD hardware, and slowly and surely Nvidia would suffer till eventually they died. The world would then be a better place and gamers and children alike would play with gumdrop smiles.


----------



## dragontamer5788 (Jul 31, 2020)

Vayra86 said:


> Its just GCN with different letters and new VRAM.



Navi is actually a huge change in architecture.

Navi is Wave32 native. While GCN was Wave64 native. A huge amount of lower-level code doesn't work anymore. The assembly language is grossly different, especially with regards to cross-lane operations. I'd say RDNA / Navi is the biggest leap in 10 years for AMD... strictly from an assembly-language perspective.
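To make the Wave32 vs. Wave64 difference concrete, here's a toy model (my own sketch, not real GPU behavior) of why narrower waves can waste fewer lanes on a divergent branch:

```python
# Toy divergence model: a wave whose lanes disagree on a branch executes both
# sides of it, so every lane in that wave idles for one of the two paths.
def divergence_cost(predicate, wave_width):
    """Idle lane-cycles across all waves for one two-way branch."""
    cost = 0
    for i in range(0, len(predicate), wave_width):
        wave = predicate[i:i + wave_width]
        taken = sum(wave)
        if 0 < taken < len(wave):   # mixed wave -> both paths execute
            cost += len(wave)
    return cost

# 64 work-items where the first 32 take the branch and the last 32 don't:
pred = [True] * 32 + [False] * 32
print(divergence_cost(pred, 64))  # 64 -> the single wave64 diverges
print(divergence_cost(pred, 32))  # 0  -> both wave32s are uniform
```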



			https://developer.amd.com/wp-content/resources/RDNA_Shader_ISA.pdf
		




			https://developer.amd.com/wp-content/resources/Vega_Shader_ISA_28July2017.pdf
		


---------

NVidia's stuff is all PTX-compatible, but that doesn't change the huge leaps they made from Maxwell through Turing. Similarly, AMD has kept large parts of GCN compatible (ie: there's still a Multiply, Dot-product, and XOR instruction). But the major changes to architecture (Wave32, independent load/stores, etc. etc.) make Navi a hugely different beast.

Different enough that Navi isn't supported by AMD's ROCm platform. Because ROCm makes a lot of Wave64 assumptions. Navi literally broke a bunch of code that AMD has been trying for months to fix.

---------

AMD's weaker software team has shown the advantage NVidia has. Something like Navi (with huge leaps) comes out, but the software isn't really ready to take off (at least in the compute sector. DirectX / Vulkan seemed ready). NVidia can be more agile since they have a leaner, quicker moving, more mature software team. This also hit them when Vega came out, and their software team was unable to get primitive shaders implemented. Maybe it was the hardware team's fault. Or maybe it was the software team's fault. But a more agile software process would have figured out that primitive shaders weren't ready long before the release.

Overall, AMD has hardware that's as strong, or maybe even stronger, than NVidia's. But their weak software is holding them back.


----------



## xkm1948 (Jul 31, 2020)

Fluffmeister said:


> Yeah it's interesting, one of the common myths that was peddled was AMD would pull ahead when they got both major consoles using their hardware, thus all games would run great on AMD hardware, and slowly and surely Nvidia would suffer till eventually they died. The world would then be a better place and gamers and children alike would play with gumdrop smiles.




I have come to accept this constant re-occurrence of "underdog wins" for RTG. If RTG does get on top and remain on top for a while I bet most folks will be singing a different tune.


----------



## moproblems99 (Jul 31, 2020)

xkm1948 said:


> I have come to accept this constant re-occurrence of "underdog wins" for RTG. If RTG does get on top and remain on top for a while I bet most folks will be singing a different tune.



I probably shouldn't say this, to prevent a shit show... but it's a mirror image of American politics, just Red vs Green instead.  I don't get it.  How could something like the Red Green Show devolve into this?







Edit:. Sorry for the huge image.... doesn't seem to want to let me edit it from my phone.


----------



## xkm1948 (Jul 31, 2020)

moproblems99 said:


> I probably shouldn't say this to prevent a shit show....but mirror image of American politics...just Red vs Green instead.  I don't get it.  How could something like the Red Green show devolve in this?
> 
> 
> 
> Edit:. Sorry for the huge image.... doesn't seem to want to let me edit it from my phone.




Humans LOVE picking an easy side and fighting. It rewards our primate brains with that delicious dopamine. Not to mention the superiority folks feel because these are "high-tech gadgets", while in reality we are doing no more than advanced Lego building.


----------



## John Naylor (Jul 31, 2020)

AMD GFX card rumors are replacing "The Boy Who Cried Wolf" as the most popular nursery rhyme.   "AMD is gonna ..." has been the rallying cry for 8 years now.

Mantle is going to change everything ....
HBM2 is going to change everything ....
The 2xx series is going to change everything ....
The 4xx series is going to change everything ....
The 5xx series is going to change everything ....
Fury is going to change everything ....
Vega is going to change everything ....
The 5xxx series is going to change everything ....

For a while, with each generation ... AMD lost a tier.  With all cards OC'd, nVidia had the top 2 spots with the 7xx / 2xx ... then they took a 3rd with the 950 ... then a 4th with the 1060.  And while AMD narrowed the gap significantly performance-wise, they also caught up on efficiency, temps and noise.  The 5600 XT wins at all these things in its performance tier.

It's like the ever-cheating spouse who says "this time it's going to be different."  The only shining light keeping some hope alive in my head over all this time is the 5600 XT.   Not interested in rumors, not interested in die size or technology ... the only things that affect our purchase decisions / recommendations are a) performance in applications we or our "customer" actually use on a daily basis, b) noise, c) temps, d) power consumption, e) total system cost, f) total ownership cost including PSU size, power costs, and additional fans if heat demands it.   Everything else is just noise.


----------



## moproblems99 (Jul 31, 2020)

John Naylor said:


> Mantle is going to change everything ....



This wasn't really false.



John Naylor said:


> AMD lost a tier. With all cards OC'd, nVidia had the top 2 spots with 7xx / 2xx ... then they took a 3rd with the 950 ... then a 4th with the 1060.



Lost a tier in what?  I don't recall the 950 beating AMD's best.


----------



## TheoneandonlyMrK (Jul 31, 2020)

John Naylor said:


> AMD GFX card rumors are replacing "the Boy who cried Wolf" as the most popular nursery rhyme.   "AMD is gonna ..." has been nthe rallying cry for 8 years now.
> 
> Mantle is going to change everything ....
> HBM2 is going to change everything ....
> ...


What are you on about? Which of the millions of SKUs of the 1060 was better than Polaris? Because I didn't see it.

And you're mentioning TCO like we all run servers; less than 1% gives a shit about TCO, or power use for that matter.

And cost comes first: initial cost for at least 80% of buyers, then brand by all accounts, then performance, and that's about as far as 99% go.


----------



## HD64G (Jul 31, 2020)

AMD Sienna Cichlid (Navi 21 "Big Navi") to feature up to 80 Compute Units? - VideoCardz.com
					

AMD “Sienna Cichlid” GPU features 80 Compute Units? It has been discovered by @_rogame and @komachi_ensaka that upcoming Navi 21 GPU could feature up to 80 Compute Units, a doubling of what the Navi 10 has to offer. The data has been found and decoded by the leakers, to whom we promised not to...




					videocardz.com


----------



## ARF (Jul 31, 2020)

HD64G said:


> AMD Sienna Cichlid (Navi 21 "Big Navi") to feature up to 80 Compute Units? - VideoCardz.com
> 
> 
> AMD “Sienna Cichlid” GPU features 80 Compute Units? It has been discovered by @_rogame and @komachi_ensaka that upcoming Navi 21 GPU could feature up to 80 Compute Units, a doubling of what the Navi 10 has to offer. The data has been found and decoded by the leakers, to whom we promised not to...
> ...



What does 80 CUs actually represent?


----------



## moproblems99 (Jul 31, 2020)

ARF said:


> What does 80 CUs actually represent?



Each one represents a 1% chance they can catch Nvidia.


----------



## dragontamer5788 (Jul 31, 2020)

ARF said:


> What does 80 CUs actually represent?



NVidia and AMD GPUs are composed of "compute units". For those familiar with CPUs, a "CU" is similar to an AVX512 unit, except NVidia / AMD GPUs are much wider. I forget if AMD CUs were composed of 32 SIMD cores, or 64 SIMD cores. For this post, I'm assuming 64 SIMD-cores per CU.

NVidia calls them "SMs" (Streaming Multiprocessors), but it's basically the same concept. NVidia's SMs change grossly between generations. AMD's CU for Navi is extremely different from Vega (and earlier chips), so I don't quite recall RDNA's details right now.

--------

From a programming perspective, a CU controls the instruction pointer. So a CU is close to a CPU-core. If we pretended that it was a CPU, we'd call it a 80-core CPU. AMD / NVidia market based on the number of shaders however, so we'd probably call it a 5120-shader GPU. (64 shaders per CU x 80 CUs == 5120 shaders, or CUDA Cores)
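For the arithmetic above, a tiny sketch (assuming 64 shaders per CU as in this post; the 1.71 GHz clock is just a hypothetical value chosen to reproduce the 17.5 TFLOPs figure from the rumors):

```python
SHADERS_PER_CU = 64            # assumption from the post
cus = 80
shaders = cus * SHADERS_PER_CU
print(shaders)                 # 5120

# FP32 throughput: 2 ops per clock per shader (fused multiply-add).
clock_ghz = 1.71               # hypothetical clock speed
tflops = shaders * 2 * clock_ghz / 1000
print(round(tflops, 1))        # 17.5
```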


----------



## TheoneandonlyMrK (Jul 31, 2020)

dragontamer5788 said:


> NVidia and AMD GPUs are composed of "compute units". For those familiar with CPUs, a "CU" is similar to an AVX512 unit, except NVidia / AMD GPUs are much wider. I forget if AMD CUs were composed of 32 SIMD cores, or 64 SIMD cores. For this post, I'm assuming 64 SIMD-cores per CU.
> 
> NVidia calls them "SM" (Symmetric Multiprocessors), but its basically the same concept. NVidia SMs change grossly between generations. AMD's CU for Navi is extremely different from Vega (and earlier chips). So I don't quite recall RDNA's details right now.
> 
> ...


On GCN, AMD used four 16-lane SIMD units to execute a 64-wide wavefront; RDNA uses a pair of 32-lane SIMD units which can either pair up to run a 64-wide wavefront or just run two 32-wide wavefronts.


----------



## Othnark (Jul 31, 2020)

Nope.

Xx% faster WHEN YOU'RE NOT FIGHTING DRIVER ISSUES.

Faster, but non-functional, hardware is slower.  Step 1: write good drivers.

AMD cannot; see the tech support areas on the web for examples. It's easy to find a horde of people STILL having driver issues on 5xxx cards.


----------



## Chrispy_ (Jul 31, 2020)

I haven't read back through all nine pages yet but one thing I will add to this "AMD hasn't taken the performance crown" discussion is that they haven't even been trying.

Navi 10 is a small, cheap, profitable chip by Nvidia standards. The 5700XT has fewer transistors than the RTX 2060S and obviously at 7nm is a much smaller die that is cheaper to produce (despite the higher cost of TSMC's 7nm process).

Yes, the 2070S/2080S and 2080Ti are faster, but they're also eye-wateringly expensive. Big Navi is also going to be expensive - both to produce and for you and me as consumers.

I somehow doubt Big Navi is it, but I really like the idea of GPUs following the Zen2 design of gluing chiplets together to make a big chip. A "Threadripper" or "EPYC" GPU based on 4 or 8 Navi cores could give us linear performance/$ increases, compared to traditional single-die solutions that become exponentially more expensive as the dies get larger. Imagine the performance of 8 5700XTs working in tandem. Clockspeeds would need to be dialled back to keep power consumption in check, but it'd still probably be 5-6x faster than a 5700XT....


----------



## dragontamer5788 (Jul 31, 2020)

Chrispy_ said:


> I somehow doubt it, but I really like the idea of GPUs following the Zen2 design of glueing chiplets together to make a big chip. A "Threadripper" or "EPYC" Gpu based on 4 or 8 Navi cores could give us linear performance/$ increases, compared to traditional single-die solutions that become exponentially more expensive as the dies get larger.



It's strange, isn't it? Although GPUs are super-parallel machines, it's far harder to make a chiplet GPU design than a chiplet CPU design. It's probably because of the stupid-high memory bandwidth of GPUs: Zen 2's Infinity Fabric is only 40 or 50 GB/s per IFOP, while GPUs have 500 or 1000 GB/s of bandwidth to GDDR6 or HBM2.

GPU programmers would need another "tier" of RAM, slower than VRAM (~500 GB/s on higher-end GPUs right now) but faster than PCIe x16 (15 GB/s), to support chiplets. Or some new architecture with a lot of thought put into memory channels needs to be made.

EDIT: NVidia's NVLink and DGX-2 computers are the right kind of thinking for this problem.
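The bandwidth gap is easy to put in numbers (rounded figures from this post):

```python
# Rough bandwidth tiers in GB/s; why a CPU-style chiplet link is a poor fit
# for GPU memory traffic.
pcie_x16 = 15        # PCIe 3.0 x16
ifop     = 50        # Zen 2 Infinity Fabric, upper per-IFOP figure
vram     = 500       # higher-end GDDR6/HBM2

print(vram // ifop)  # 10 -> an IFOP-class link is ~10x too slow for VRAM traffic
```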


----------



## Fluffmeister (Jul 31, 2020)

xkm1948 said:


> I have come to accept this constant re-occurrence of "underdog wins" for RTG. If RTG does get on top and remain on top for a while I bet most folks will be singing a different tune.



Hey, the underdog apparently might beat your two year old graphics card. Hahaha, take that Nvidia!


----------



## nguyen (Aug 1, 2020)

LOL, a lot of "industry experts" here keep comparing Navi's die size to Turing's, but they are quick to forget Vega 56's die size.
With a 495mm2 die + 8GB of HBM2 (GamersNexus estimated the HBM2 modules and interposer alone cost 175usd), AMD was happy to sell Vega 56 for 400usd in 2017 and 300usd in 2018-2019.
It probably cost Nvidia less to produce the 2080 and 2080 Super compared to Vega 56, but Nvidia can charge 700usd for them just because there is no competition. But hey, at least having a healthy profit margin means you can fund a proper driver developer team that makes proper drivers.


----------



## DuxCro (Aug 1, 2020)

I'm sure AMD will make something that competes with the 2018 RTX 2080 Ti. But Nvidia will again have the fastest cards, and probably far more capable ray tracing. It would be great if AMD caught up to Nvidia's highest-end RTX 3000 series, but it's probably not gonna happen.


----------



## Chrispy_ (Aug 1, 2020)

nguyen said:


> LOL a lot of "industry experts" here keep comparing Navi die size to Turing but they are quick to forget Vega 56 die size
> With a die size of 495mm2 + 8GB HBM2 (that GamersNexus estimated HBM2 modules and interposer alone cost 175usd), yet AMD was happy to sell Vega 56 for 400usd in 2017 and 300usd in 2018-2019.
> It probably cost Nvidia less to produce 2080 and 2080 Super compare to Vega 56, but Nvidia can charge 700usd for them just because there are no competition. But hey at least having a healthy profit margin mean you can get a proper driver developer team that make proper drivers .


Exactly. AMD was making less profit on Vega than Nvidia has in a long time; simple cost to produce vs. sale price.

Anyone using die sizes in mm² to compare chips fabricated at different facilities on different process nodes needs a slap in the face and a reality check. I use transistor count, but even that is only half of the picture because it's a very generalised figure that can't be directly compared between architectures:

AMD's stream processors are more capable vector units with scalar calculations offloaded to dedicated scalar-only logic (2x INT32 scalar units per 64-SP CU), whilst Nvidia's 'CUDA cores' are simpler scalar units and not much else.

Meanwhile, Nvidia dedicates a good chunk of the transistor budget to Tensor cores and DXR calculations, something RDNA1 doesn't. _"But what about __that Crytek Raytracing demo__ running on AMD?"_ That's because it's using the far more powerful vector capabilities of GCN to do raytracing with shader compute power instead of dedicated DXR-specific hardware like RTX does. One is not inherently better than the other, it's just that GCN's (and RDNA's) vector 'cores' were around long before DXR or RTX, so obviously not optimised _specifically for raytracing_. If anything, the tensor cores in Turing are actually more GCN-like than anyone with green-tinted spectacles is willing to admit.

Regardless of the underlying tech or how well it's implemented by drivers and software developers, the simple fact is that one 10bn-transistor GPU is going to cost approximately the same to make as another 10bn-transistor GPU, provided they are on the same node at the same foundry.

I do not actually know exactly how competitive the relative costs of each process are. All things being equal, smaller nodes are cheaper to make chips on. However, the market is subject to supply and demand, and there's a lot of demand for the smallest nodes and only so much supply. I think it was Ian Cutress talking about how, in the initial two quarters on TSMC's 7nm, almost all of the cost savings from die shrinkage were offset by the extra cost of that process node. I guess that's what inspired the "chiplet" design of Zen2: mixing and matching cheap and expensive silicon on the same package. Hopefully, now that TSMC has improved and scaled up the 7FF node, it's closer to being back in line with the generally-accepted 'smaller = cheaper' rule that's been pushing Moore's Law forwards for almost three decades now....


----------



## nguyen (Aug 1, 2020)

Chrispy_ said:


> Exactly. AMD were making less profit on Vega than Nvidia have in a long time, simple cost to produce vs sale cost.
> 
> Anyone using die sizes in mm² to compare chips fabricated at different facilities on different process nodes needs a slap in the face and reality check. I use transistor count, but even that is only half of the picture because it's a very generalised figure that can't be directly compared between architectures:



Pretty bold words for someone who doesn't know that TSMC charges Nvidia and AMD per wafer, not per working die, smh...
Say that for each 12nm wafer, TSMC charges 300usd more than GF, according to ICInsights.

Spread that 300usd difference across the roughly 60 dies per 300mm wafer (at ~500mm2 per die), and the difference is only 5usd per die.
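The arithmetic above can be sketched in a couple of lines (the $300 premium and 60-dies-per-wafer figures are the post's assumptions, not measured values):

```python
# Spread a per-wafer price premium across the dies cut from that wafer.
WAFER_PREMIUM_USD = 300   # TSMC 12nm premium over GF, per the ICInsights figure above
DIES_PER_WAFER = 60       # ~500mm2 dies on a 300mm wafer, as assumed in the post

premium_per_die = WAFER_PREMIUM_USD / DIES_PER_WAFER
print(f"Premium per die: ${premium_per_die:.2f}")  # → Premium per die: $5.00
```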


----------



## ARF (Aug 1, 2020)

nguyen said:


> LOL, a lot of "industry experts" here keep comparing the Navi die size to Turing, but they are quick to forget the Vega 56 die size.
> With a die size of 495mm2 + 8GB HBM2 (GamersNexus estimated the HBM2 modules and interposer alone cost 175usd), AMD was still happy to sell Vega 56 for 400usd in 2017 and 300usd in 2018-2019.
> It probably cost Nvidia less to produce the 2080 and 2080 Super compared to Vega 56, but Nvidia can charge 700usd for them just because there is no competition. But hey, at least having a healthy profit margin means you can get a proper driver development team that makes proper drivers.



Vega is a compute-oriented architecture which behaves badly if you put it in gaming, which is not its primary application area to begin with.

Probably several years ago, under the threat of bankruptcy, AMD cancelled most of its meaningful GPU projects, including a larger first-gen Navi chip, and renamed the remaining 251 mm² die Navi 10. Navi 10 should have been at least 495 mm², but hey, no...

Totally faulty strategy in the graphics department.


----------



## Fry178 (Aug 2, 2020)

@Vya Domus
that was exactly my question, not the opposite, as you try to make it.


----------



## Chrispy_ (Aug 2, 2020)

nguyen said:


> Pretty bold words for someone who doesn't know that TSMC charges Nvidia and AMD per wafer, not per working die, smh...
> Say that for each 12nm wafer, TSMC charges 300usd more than GF, according to ICInsights.
> 
> 
> ...


It's not that simple though. If a wafer contains 20 defects and the process node is large enough that you only get 40 chips out of it, then you get 20 working parts.

With the same 20 defects on a wafer and the same GPU design, but on a smaller node that lets you get 80 chips out of it, you still lose 20 dies to defects but now have 60 working parts instead of 20.

All else being equal, smaller nodes are cheaper because they double-dip into the beneficial attributes: more parts per wafer means lower cost per part, and smaller parts mean a smaller percentage of them will be defective, so better yields.
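A minimal sketch of the double-dip described above, using the post's simplified assumption that each defect kills exactly one die:

```python
# Same wafer, same number of defects; only the die count per wafer changes.
def working_parts(dies_per_wafer: int, defects_per_wafer: int) -> int:
    # Simplified model from the post: one defect ruins exactly one die.
    return max(dies_per_wafer - defects_per_wafer, 0)

large_node = working_parts(dies_per_wafer=40, defects_per_wafer=20)  # → 20 good parts
small_node = working_parts(dies_per_wafer=80, defects_per_wafer=20)  # → 60 good parts
print(large_node, small_node)  # → 20 60
```

In reality a defect kills only the die it lands on, so the model understates the advantage slightly, but it captures why the smaller node wins twice.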


----------



## nguyen (Aug 2, 2020)

Chrispy_ said:


> It's not that simple though. If a wafer contains 20 defects and the process node is large enough that you only get 40 chips out of it, then you get 20 working parts.
> 
> With the same 20 defects on a wafer and the same GPU design, but on a smaller node that lets you get 80 chips out of it, you still lose 20 dies to defects but now have 60 working parts instead of 20.
> 
> All else being equal, smaller nodes are cheaper because they double-dip into the beneficial attributes: more parts per wafer means lower cost per part, and smaller parts mean a smaller percentage of them will be defective, so better yields.



You can get a rough idea of how much a silicon die costs from this calculator (I learned of it from AdoredTV).

Here is how much a wafer costs (from Wccftech).

Enter the die size, wafer price (6000usd for 16nm and 10000usd for 7nm) and yield to get the cost per silicon die.
Now the only unknown value is yield. With TSMC 16/12nm being a much more mature node, I would think the TU104 (545mm2) and TU106 (445mm2) get around 80% yield, while TSMC 7nm would be in the 60% range (AdoredTV's estimate of Navi 10 yield was around 50% at launch).

So from my calculation:
TU104: 75usd per die
TU106: 60usd per die
Navi 10: 70usd per die

As you can see, AMD doesn't get any cost benefit from going to TSMC 7nm, but rather the performance and efficiency associated with the superior node (which AMD sorely needs).

Note: there are also defective dies that can be laser-cut and salvaged into a lower variant, like the RTX 2060 for TU106, the 2070 Super for TU104, and the 5700 and 5600 XT for Navi 10. But for the sake of simplicity I only take into account the perfect dies that make it into the RTX 2070 (TU106), 2080 Super (TU104) and 5700 XT (Navi 10).
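The per-die figures above can be approximated with the standard dies-per-wafer estimate (gross dies minus an edge-loss correction); the wafer prices and yields are the ones assumed in the post, and the exact numbers depend on which approximation the linked calculator uses:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Gross dies per wafer: wafer area over die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return wafer_area / die_area_mm2 - edge_loss

def cost_per_good_die(die_area_mm2: float, wafer_price_usd: float, yield_rate: float) -> float:
    # Wafer price divided by the number of *working* dies on it.
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Assumptions from the post: $6000 per 16nm wafer at 80% yield,
# $10000 per 7nm wafer at 60% yield.
print(f"TU104:   ${cost_per_good_die(545, 6000, 0.80):.0f}")   # ≈ $74
print(f"TU106:   ${cost_per_good_die(445, 6000, 0.80):.0f}")   # ≈ $59
print(f"Navi 10: ${cost_per_good_die(251, 10000, 0.60):.0f}")  # ≈ $70
```

These land within a few dollars of the post's $75/$60/$70 figures.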


----------



## Chrispy_ (Aug 2, 2020)

nguyen said:


> You can get a rough idea of how much a silicon die costs from this calculator (I learned of it from AdoredTV).
> 
> Here is how much a wafer costs (from Wccftech).
> 
> ...


Nice calculator link, but your argument only holds up if you use production estimates from around 16 months ago, near the start of the 7nm process.

Zen2 chiplet yields were 92% flawless in December (8 months ago), and feeding that 8-month-old data into the calculator at a 250mm² die size gives me 47 defective dies out of 240, for a Navi 10 yield estimate of 80.4%, making each Navi 10 die $51.64 from the calculator you just linked.

You can see from posts like this that 7FF yielded better and sooner than its predecessors too, so the lower yields of TSMC's 7FF that you're talking about and basing your reasoning on were pretty much out of date by the time Navi AIB cards were on store shelves a year ago.

Your logic is sound, but you're using data so old that it's giving you useless, incorrect results.
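One way to sanity-check this line of reasoning: treat the 92% Zen2 chiplet yield as a measurement of TSMC 7nm defect density via the simple Poisson yield model Y = exp(-D*A), then apply that density to Navi 10's die area. The ~74mm² Zen2 chiplet area is an assumption here, and real foundry models are more involved, so this is only a ballpark:

```python
import math

ZEN2_CHIPLET_AREA_CM2 = 0.74   # ~74mm2 Zen2 CCD (assumed, not from the post)
ZEN2_YIELD = 0.92              # "92% flawless" figure from the post

# Poisson yield model: Y = exp(-D * A)  =>  D = -ln(Y) / A
defect_density = -math.log(ZEN2_YIELD) / ZEN2_CHIPLET_AREA_CM2

NAVI10_AREA_CM2 = 2.51
navi10_yield = math.exp(-defect_density * NAVI10_AREA_CM2)
print(f"Implied defect density: {defect_density:.3f}/cm2")
print(f"Estimated Navi 10 yield: {navi10_yield:.1%}")  # ≈ 75%, same ballpark as the 80.4% above
```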


----------



## Lindatje (Aug 2, 2020)

RDNA 2 is supposedly 50% faster than the 2080 Ti, just like the 3080 Ti.
But the problem is the drivers....

I'll still buy the RTX 3070.


----------



## TheoneandonlyMrK (Aug 2, 2020)

Lindatje said:


> RDNA 2 is supposedly 50% faster than the 2080 Ti, just like the 3080 Ti.
> But the problem is the drivers....
> 
> I'll still buy the RTX 3070.


Strange input: so mythical cards are better than old cards, with no proof yet, and driver issues on an unreleased card mean you're buying a different unicorn.

Great input, come again.

In fact I'm not sure you should bother.

Your mind's made up of unicorn poop.


----------



## EarthDog (Aug 2, 2020)

Lindatje said:


> RDNA 2 is supposedly 50% faster than the 2080 Ti, just like the 3080 Ti.
> But the problem is the drivers....
> 
> I'll still buy the RTX 3070.


----------



## Lindatje (Aug 2, 2020)

theoneandonlymrk said:


> Strange input: so mythical cards are better than old cards, with no proof yet, and driver issues on an unreleased card mean you're buying a different unicorn.
> 
> Great input, come again.
> 
> ...


Strange guy .


----------



## TheoneandonlyMrK (Aug 2, 2020)

Lindatje said:


> Strange guy .


You could have just said I'm buying Nvidia regardless.

That represents what you said better and encompasses all the facts and known data points in your post.

But it is still pointless to this thread.


----------



## Vayra86 (Aug 2, 2020)

Lindatje said:


> Strange guy .



Nah... he's seeing things pretty clearly, if you ask me. If you're buying rumours that card XYZ is some percentage faster than some other card at this point in time, you simply didn't get it, and you probably won't this gen either. Live and learn; come back for the next round and maybe you'll be wiser.


----------



## Vya Domus (Aug 2, 2020)

John Naylor said:


> For a while, with each generation ... AMD lost a tier.  With all cards OC'd, nVidia had the top 2 spots with 7xx / 2xx ... then they took a 3rd with the 950 ... then a 4th with the 1060.



You are delirious.



dragontamer5788 said:


> Its strange, isn't it? That although GPUs are super-parallel machines, that its far harder to make a chiplet-GPU design than a chiplet-CPU design.



GPUs are far easier to implement using chiplets: threads only need to communicate with other threads within the same CU, which greatly simplifies everything compared to a CPU, where you need to ensure performance when communicating across chiplets.



nguyen said:


> but Nvidia can charge 700usd for them just because there are no competition



It has been proven time and time again that this is not how it works. AMD could come out with a $2000 GPU faster than anything else out there and Nvidia would still sell something like a 2080 for $700; prices are set by what consumers are willing to pay, not by the competition. How is it that Apple charged, and still charges, so much for their phones all these years? Can you say they had no competition?

Here is another fallacy for you: if prices went down every time there was "competition", profits would slowly tend towards zero, because eventually there will always be competition and prices would have to drop no matter what, according to this wonderful logic. Obviously that doesn't happen, since these companies grow with each passing year, even the ones that aren't "competitive". Prices go down when consumers no longer buy the same volume.

Nvidia's volume kept increasing, and so did their prices.
AMD's volume either stagnated or went down, and their products either stayed the same price or became cheaper.

Coincidence? Nvidia's prices will only go down when they hit a plateau in volume sold, which they inevitably will at some point. There is a limited pool of potential consumers and their cash.


----------



## Lindatje (Aug 2, 2020)

theoneandonlymrk said:


> But it is still pointless to this thread.


Uhmm, look at the title and the history of AMD and Nvidia....
If you had a little more experience with the hardware, you would have known why I react the way I do. 
So it is not pointless.


----------



## nguyen (Aug 3, 2020)

Vya Domus said:


> It has been proven time and time again that this is not how it works. AMD could come out with a $2000 GPU faster than anything else out there and Nvidia would still sell something like a 2080 for $700; prices are set by what consumers are willing to pay, not by the competition. How is it that Apple charged, and still charges, so much for their phones all these years? Can you say they had no competition?
> 
> Here is another fallacy for you: if prices went down every time there was "competition", profits would slowly tend towards zero, because eventually there will always be competition and prices would have to drop no matter what, according to this wonderful logic. Obviously that doesn't happen, since these companies grow with each passing year, even the ones that aren't "competitive". Prices go down when consumers no longer buy the same volume.
> 
> ...



Aren't you mixing up MSRP and price gouging? Retailers price-gouge all the time on products in high demand; Nvidia, AMD and even AIBs don't get to pocket that money.

Well, there have been rumours that Nvidia and AMD have been mingling in some price-fixing scheme to maintain healthy profit margins. It's also vital for Nvidia to keep their opponent alive; otherwise they would be subject to antitrust law. That is the sad reality of a duopoly: had Intel been successful with their GPU, you would see much better price/performance.

Currently there is no competitor to the 2080 Super and 2080 Ti in terms of performance, so Nvidia can set whatever MSRP they want on those. But of course they have to know what price customers are willing to pay for that kind of GPU performance, which is done through market research. The 2080 Ti has been selling at over 1200usd for almost 2 years now; do you think that's because it sells in big volume?


----------



## ARF (Aug 6, 2020)

*AMD Navi 21 ‘Big Navi’ GPU Rumored To Be Featured In 16 GB & 12 GB Radeon RX Gaming Graphics Cards*

AMD's Radeon RX enthusiast gaming graphics cards featuring Big Navi (the Navi 21 GPU) are rumored to come in 16 GB and 12 GB VRAM configurations.

					wccftech.com


----------



## dragontamer5788 (Aug 6, 2020)

Vya Domus said:


> GPUs are far easier to implement using chiplets, threads only need to communicate among other threads within the same CU, this greatly simplifies everything compared to a CPU where you need to ensure performance when communicating across chiplets.



I'll believe it when I see it.

There have been multiple CPUs implemented as chiplets. Not only the recent Zen chips, but also IBM's POWER5 back in 2004 used a chiplet design.

						POWER5 - Wikipedia

					en.wikipedia.org

In contrast, there hasn't been a GPU or large-scale SIMD system, to my knowledge, that was ever made using a chiplet design.

Ergo: GPU/SIMD systems as chiplets must be harder; otherwise we would have made a SIMD chiplet by now. It's not as if we're short on demand, either. The fastest multi-GPU/SIMD system available today is NVidia's DGX-2, and even that doesn't use chiplets yet.

----------

Again: this is because CPU-to-CPU communication is lower-bandwidth than GPU-to-GPU communication. The DGX-2 NVLink/NVSwitch system offers 300GBps of chip-to-chip bandwidth. In contrast, AMD Zen CPUs are only around ~50GBps for IFOP (Infinity Fabric On Package) links.

GPUs need more bandwidth in their core-to-core communications than CPUs do. I expect this higher bandwidth requirement makes GPU chiplets harder in practice.


----------



## ARF (Aug 6, 2020)

How so? AMD and others have always claimed that they haven't found a way to implement GPU chiplets because of CrossFire-type issues.


----------



## TheoneandonlyMrK (Aug 6, 2020)

dragontamer5788 said:


> I'll believe it when I see it.
> 
> There have been multiple CPUs implemented as chiplets. Not only the recent Zen chips, but also IBM's Power5 back in 2004 used a chiplet.
> 
> ...


Xe, Hopper?!

Intel uses tiles, i.e. chiplets.

Nvidia's Hopper features something similar but called something else, no doubt, like "game cores"!?

Some rumours about RDNA3 point to chiplets.


----------



## dragontamer5788 (Aug 6, 2020)

theoneandonlymrk said:


> Xe, Hopper?!



I do think that Intel is onto something with its EMIB technology. If Intel is the first one to figure something out for GPU-chiplets, I wouldn't be surprised.

Hopper is pretty secretive. I haven't found much information on it.

------

I'm certain that GPUs will eventually be chiplets. The issues at 7nm and 5nm have made the chiplet methodology the clear path forward. But I don't believe it will be an easy journey; there will be architectural changes and new issues to solve.


----------



## ARF (Aug 6, 2020)

theoneandonlymrk said:


> Xe, Hopper?!
> 
> Intel uses tiles ie chiplets
> 
> ...



The Zen chiplets are good for EPYC, not for Ryzen. For CDNA, not for RDNA.
Renoir (monolithic) is faster and more efficient than Matisse (chiplets).


----------



## Super XP (Aug 6, 2020)

ARF said:


> The Zen chiplets are good for EPYC, not for Ryzen. For CDNA, not for RDNA.
> Renoir (monolithic) is faster and more efficient than Matisse (chiplets).


Chiplets are good for Ryzen; Zen 2 proved this by being a massive success. 
But Zen 3 is going to be a new design, probably not fully chiplet-based. 

If AMD can utilize chiplets with RDNA3 while keeping performance high and power draw low, it's a good thing.


----------



## ARF (Aug 6, 2020)

Super XP said:


> The Chiplets are good for Ryzen, ZEN2 proved this as being a massive success.
> But ZEN3 is going to be a new design. Probably not fully Chiplets based.
> 
> If AMD can utilize the Chiplets with RDNA3 and has high performance and low power draw, it's a good thing.



Zen 2 proved it's better than the mediocre Intel counterparts.
But if you are wiser, just get your mighty Renoir laptop with a 15-watt APU that is as fast as a 65-watt desktop counterpart.


----------



## TheoneandonlyMrK (Aug 6, 2020)

ARF said:


> The Zen chiplets are good for EPYC, not for Ryzen. For CDNA, not for RDNA.
> Renoir (monolithic) is faster and more efficient than Matisse (chiplets).


Economies of scale weigh against that argument once you scale up chiplets on the smallest node.

Monolithic may be better, but its scaling is limited, and possibly output too, and it doesn't help with heat management. It will remain the choice for most devices though, due to cost.

And Ryzen works fine.


----------



## ARF (Aug 6, 2020)

theoneandonlymrk said:


> Economies of scale weigh against that argument, once you scale up chiplets on the smallest node.
> 
> Monolithic may be better but it's scaling is limited, so is output possibly and it doesn't help with heat management, it will remain the choice used for most device's though due to costs.
> 
> N Ryzen works fine.



What do you mean by economies of scale?
Ryzen with chiplets may or may not be cheaper to produce than the monolithic Renoir.
Renoir is 156 mm². Ryzen 3000 is 80 + 120 mm².

You could offer the 15-watt Ryzen 7 4800U as a full replacement for the 65-watt Ryzen 5 3600, and then use the chiplets for everything above it.


----------



## TheoneandonlyMrK (Aug 6, 2020)

ARF said:


> What do you mean by economies of scale?
> Ryzen with chiplets may be or may not be cheaper for production than the monolithic Renoir.
> Renoir is 156 mm^2. Ryzen 3000 is 80 + 120 mm^2.
> 
> You can offer the 15-watt Ryzen 7 4800U as a full replacement to the 65-watt Ryzen 5 3600, and then use the chiplets for everything above.


We're talking about graphics cards!?

Attaining 8K 120Hz frame rates in a raytraced Battlefield 7 is not going to be easy, yet some are working towards it, and tile-based rendering helps because the tiles split the workload.

And as EPYC is proving for high-throughput computation, chiplets can work well.

On-chip mGPU that's invisible to the user.


----------



## ARF (Aug 6, 2020)

theoneandonlymrk said:


> We're talking about graphics cards!?
> 
> To attain 8K 120Hz frame rates of a raytraced version of battlefield 7 we are not going to find that easy to run, yet some are working towards this, tile based rendering helps by the fact the tiles split the workload.
> 
> ...



Let's first focus on 4K, because it hasn't become popular enough just yet.


----------



## Vya Domus (Aug 6, 2020)

dragontamer5788 said:


> I expect this higher bandwidth requirement makes GPU chiplets harder in practice.



If you begin with 1 chiplet and n memory chips, you can then move on to m chiplets and n * m memory modules. It can all scale linearly if need be; this isn't a problem, especially now that we have HBM. In fact, if you look at existing GPUs you'll see that bandwidth usually does not need to increase linearly with the number of compute units; it's sublinear. I won't go into why that's the case, but the point is that memory bandwidth is not the reason MCM GPUs haven't been made.



dragontamer5788 said:


> I'll believe it when I see it.
> 
> There have been multiple CPUs implemented as chiplets. Not only the recent Zen chips, but also IBM's Power5 back in 2004 used a chiplet.
> 
> ...



A SIMD chiplet doesn't even make sense. SIMD needs centralized instruction dispatch and logic, and modern GPUs aren't SIMD, meaning there isn't a 64*32-bit-wide vector register physically on the chip. It's all scalar and compartmentalized into CUs, which is why you can easily spread CUs across multiple chiplets. With a GPU you are guaranteed that a CU in one chiplet does not need to communicate with a CU in another chiplet. In other words, they wouldn't suffer from the same issues CPUs do, where a core needs to read/write a cache line owned by another core, or worse, runs into false sharing.

CPUs are undoubtedly harder to implement using chiplets than GPUs, but here's why they became a thing on CPUs first: there was a need for it that couldn't be resolved any other way. With GPUs, because of the way software is written for them, you can just scale the problem up by using multiple GPUs that don't necessarily need to communicate with each other (the way GPGPU algorithms have to be implemented makes it a requirement from the start that you can't communicate above a certain level, which is the CU).

Therefore you can stuff a lot of GPUs onto a single motherboard. CPUs, on the other hand, are intended for tasks that can't be scaled up in the same way; socket-to-socket communication is basically a death sentence for high performance, so the only solution is to stuff as many CPU dies as possible into a single socket. Those would be the "chiplets".



dragontamer5788 said:


> this is because CPU-to-CPU communication is lower-bandwidth than GPU-to-GPU communication.



GPUs don't have to communicate with each other the way CPUs do, as I explained above. That's why AMD and Nvidia largely gave up on putting multiple GPUs on a single board, where you could theoretically have achieved higher GPU-to-GPU bandwidth: because it's a waste of time.


----------



## dragontamer5788 (Aug 6, 2020)

Vya Domus said:


> If you begin with 1 chiplet and n memory chips, you can then move on to m chiplets and n * m memory modules.



While CUDA seems to have the programming API for this, I don't believe this is common in DirectX (11 or 12), OpenGL, or Vulkan code. Even then, I don't think people typically use CUDA's memory management interface like this, because it's only relevant on the extremely niche DGX class of computers.

In contrast, CPU shared memory is almost completely transparent to the programmer. The OS can easily migrate your process to other chips (affinity settings notwithstanding); in fact, affinity settings were invented to prevent the OS from moving your process around.

I say "almost" completely transparent because NUMA does exist if you really want to go there. But CPU programmers have gotten surprisingly far without ever worrying about NUMA details (unless you're a cream-of-the-crop optimizer; it's a very niche issue where most programmers simply trust the OS to do the right thing).

The software ecosystem that would support a multi-chiplet architecture, with each chiplet having an independent memory space, simply does not exist. Therein lies the problem: we either have to make a NUMA-like API where each chiplet has a NUMA-like memory space that the programmer has to manage, *OR* we build a crossbar, similar to AMD's Infinity Fabric (IFOP), which transparently copies data between chips... providing the programmer the illusion that all of the memory is in the same space.

50GBps is sufficient for AMD's Infinity Fabric. For the same thing to happen on GPUs, NVidia has demonstrated that 300GBps is needed in their DGX-2 computers.

This isn't an easy problem, by any stretch of the imagination. I do imagine it will be solved eventually, but I'm hugely interested in seeing how it's done. I'm betting that NVidia will shrink their NVLink and NVSwitch system down and somehow make it cheaper.



Vya Domus said:


> A SIMD chiplet doesn't even make sense. SIMD needs centralized instruction dispatch and logic, and modern GPUs aren't SIMD, meaning there isn't a 64*32-bit-wide vector register physically on the chip.



We've discussed this before Vya. Your understanding of GPU architecture is off.



https://developer.amd.com/wp-content/resources/Vega_Shader_ISA_28July2017.pdf

AMD Vega (and all GCN processors) uses the register layout described in that ISA document: the 256 VGPR registers are arranged in a 64 x 32-bit array called "SIMD 0". Vega's compute units are pretty complicated, and there are also SIMD1, SIMD2 and SIMD3 with independent instruction pointers.

The entire class of VGPRs operates in SIMD fashion, as demonstrated by chapter 6 of that document.

So when you execute v_add_f32 v0, v0, v1, all 64 values in VGPR#1 are added to all 64 values in VGPR#0, and the result is stored into VGPR#0. It's a 64-wide SIMD operation. All "v" operations on Vega are 64-wide. RDNA changed this to a 32-wide operation instead, but the concept is similar.
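A toy model of that wavefront-wide add might look like the following. This is only a sketch: real GCN VGPRs are hardware register banks, and details like execution masks and instruction encodings are ignored here.

```python
WAVEFRONT_SIZE = 64  # GCN wavefront width; RDNA narrowed this to 32

# Model the VGPR file as 256 registers, each holding one value per lane.
vgpr = [[0.0] * WAVEFRONT_SIZE for _ in range(256)]

def v_add_f32(dst: int, src0: int, src1: int) -> None:
    """One SIMD instruction: a lane-wise add across the whole wavefront."""
    for lane in range(WAVEFRONT_SIZE):
        vgpr[dst][lane] = vgpr[src0][lane] + vgpr[src1][lane]

# Fill v0 and v1 with per-lane data, then issue a single 64-wide add.
vgpr[0] = [float(lane) for lane in range(WAVEFRONT_SIZE)]
vgpr[1] = [2.0] * WAVEFRONT_SIZE
v_add_f32(0, 0, 1)
print(vgpr[0][:4])  # → [2.0, 3.0, 4.0, 5.0]
```

The point is that one instruction touches all 64 lanes at once, which is what "SIMD fashion" means here.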

I'm less familiar with NVidia's architecture, but I assume something similar happens with their PTX instructions.



> With a GPU you are guaranteed that the CU in one chiplet does not need to communicate with a CU in another chiplet.



At a minimum, video games share textures. If chiplet #1 has the texture for Gordon Freeman's face but chiplet #2 doesn't, how do you expect chiplet #2 to render Gordon Freeman's face?

GPUs, as currently architected, have a unified memory space where all information is shared. Crossfire halved the effective memory because, to solve the above issue, it simply copied every texture to both GPUs (i.e. two 4GB GPUs have a total of 4GB of usable VRAM, because every piece of data is replicated between the two). It was a dumb, crappy solution, but it worked for the purposes of Crossfire.
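A minimal sketch of that mirroring scheme (the class and method names are hypothetical; it just models why two 4GB GPUs end up with 4GB of effective capacity):

```python
# Model Crossfire-style memory: every allocation is duplicated into each
# GPU's local VRAM, so capacities do not add up across GPUs.
GPU_VRAM_MB = 4096

class MirroredVram:
    def __init__(self, num_gpus: int, vram_mb: int):
        self.per_gpu_used = [0] * num_gpus
        self.vram_mb = vram_mb

    def alloc_texture(self, size_mb: int) -> bool:
        # The texture must fit on *every* GPU, since each needs its own copy.
        if any(used + size_mb > self.vram_mb for used in self.per_gpu_used):
            return False
        for gpu in range(len(self.per_gpu_used)):
            self.per_gpu_used[gpu] += size_mb
        return True

    def effective_capacity_mb(self) -> int:
        # Mirroring means effective capacity is one GPU's VRAM, not the sum.
        return self.vram_mb

pool = MirroredVram(num_gpus=2, vram_mb=GPU_VRAM_MB)
pool.alloc_texture(1024)
print(pool.per_gpu_used)             # → [1024, 1024]
print(pool.effective_capacity_mb())  # → 4096
```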

This is why inter-chip communication might happen. If you want Gordon Freeman's face to be rendered on chiplet #1 and chiplet #2 in parallel, you need a way to share that face texture between the two chips. This is the approach of NVidia's NVSwitch in the DGX-2 computer.

Alternatively, you could tell the programmer that chiplet #2 *cannot* render Gordon Freeman's face because the data is unavailable. This would be a NUMA-like solution (the data exists only on chiplet #1). It's a harder programming model, but it can be done.

Or maybe a mix of the two approaches happens, or maybe a new system is invented in the next year or two. I dunno, but it's a *problem*. And I'm excited to wait and see what the GPU architects invent to solve it whenever chiplets arrive.


----------



## Vya Domus (Aug 6, 2020)

dragontamer5788 said:


> In contrast, CPU shared memory is almost completely transparent to the programmer.



Which is why it's slow and why GPU kernels can run orders of magnitude faster.



dragontamer5788 said:


> We've discussed this before Vya. Your understanding of GPU architecture is off.



SIMD "fashion" does not mean physical SIMD hardware. TeraScale was the last SIMD-like architecture, which is why it also relied on VLIW to work effectively. My understanding of GPU architecture isn't off; yours is simply outdated. You have to understand that the way CUs work in both Nvidia and AMD hardware is analogous to SIMD but not the same; there are things that are impossible to do with regular SIMD. A CU can issue the same instructions in lock-step but on data addressed indirectly from multiple places, and a single instruction can also generate multiple execution paths, which can't be done in regular SIMD. For this very reason there is physically no single "2048-bit" ALU, that would be insane; that's why they say "32 x 64", because that's how it's implemented: there are 64 separate ALUs/FPUs/etc. that execute wavefronts.

Think for a moment: in Turing you can have both integer and floating point issued within the same clock cycle using that "2048-bit unit", which wouldn't be possible with a SIMD arrangement.



dragontamer5788 said:


> If Chiplet#1 has the texture for Gordon Freeman's face, but Chiplet#2 doesn't have it, how do you expect Chiplet#2 to render Gordon Freeman's face?



It seems that you don't understand how any of this works at all, or you are making a colossal confusion. Chiplet #1 or #2 has no problem accessing both textures because, as you said, global memory is shared; what isn't shared is the memory each CU has. There is no reason a CU would need to access the memory of a CU in another chiplet, because the programming model prohibits this. That's what you don't seem to understand: the premise, from the beginning, is that none of this can happen. If you need a texture for something, why would two chiplets try to apply the same instructions to the same data? They wouldn't; at worst they would each pull the portion of the texture they need, because that's how shaders/kernels work.

No sharing of CU memory means no synchronization across CUs, which, in the case of GPUs where you can have thousands of threads in flight, means no performance penalty for using chiplets.


----------



## ARF (Oct 7, 2020)

Chiphell leaks it:

						My own estimates for Navi 21, 22 and 23 - PC Discussion - Chiphell

"Here are my own estimates for Navi 21, 22 and 23, as pictured: 196.8% * 221W / 340W / 84.7% = 1.51, so roughly a 50% efficiency gain. At a 350W TDP the performance won't be far off, but keeping pace with Nvidia's IPC improvements will be hard; RDNA1 only just caught up with Turing, so this time they're using clock speed to make up a flagship..."

					www.chiphell.com


----------



## Vya Domus (Oct 7, 2020)

ARF said:


> Chiphell leaks it:
> 
> View attachment 171150
> 
> ...



What the hell is "non-RTX performance"? RTX is a brand name. Maybe non-_DXR_? Even then, what's that supposed to include?


----------



## bug (Oct 7, 2020)

ARF said:


> Chiphell leaks it:
> 
> View attachment 171150
> 
> ...


Oh good, they have a 3070 Super and 3070 Ti in there. Definitely not made up.


----------



## ixi (Oct 7, 2020)

lynx29 said:


> Even if it is true, it doesn't mean it will be worth a buy; AMD has a lot of trust to gain on the driver side of things for GPUs... even going to Amazon or Newegg now and filtering reviews by "most recent" shows scary stuff. I play a lot of older games, so I need to know drivers are stable even for games from a long, long time ago, not just what is in the spotlight.



At least in the past I have never had driver issues with AMD drivers. Lucky me, I guess.


----------



## ARF (Oct 7, 2020)

I think I will be getting the RX 6900


----------



## Fluffmeister (Oct 7, 2020)

Official slides already mention RDNA3, best wait for that.


----------



## Metroid (Oct 7, 2020)

ARF said:


> Chiphell leaks it:
> 
> View attachment 171150
> 
> ...



Fake benchmarks, nothing makes sense in those images.


----------



## FinneousPJ (Oct 7, 2020)

ARF said:


> Chiphell leaks it:
> 
> View attachment 171150
> 
> ...


Without any test specs it's pretty meaningless. But I'm hopeful there is truth in this.


----------



## docnorth (Oct 7, 2020)

Somebody wants a cheap 2080 Ti.
Anyway, such a card would make the 3090 look small and power-efficient.


----------



## arbiter (Oct 7, 2020)

I get the feeling this is another one of those AMD leaks that claimed faster performance but did some iffy stuff to get there. Looking at the "specs" listed in the linked post, it doesn't make sense in light of history. And for people wanting lower power draw, the 6900 XT is rated at around the same TDP as the 3090, so.

The biggest thing is "Non RTX *Relative* perf". The "Relative" is the biggest question: relative to what? My guess is it's not using the same baseline. Raytracing (DXR) is part of the DirectX standard now, so games with it enabled could start being the norm in benchmarking. With no info on the chart about the games used, the baseline, etc., it's very suspect.


----------



## DemonicRyzen666 (Oct 7, 2020)

arbiter said:


> I get the feeling this is another one of those AMD leaks that claimed faster performance but did some iffy stuff to get there. Looking at the "specs" listed in the linked post, it doesn't make sense in light of history. And for people wanting lower power draw, the 6900 XT is rated at around the same TDP as the 3090, so.
> 
> The biggest thing is "Non RTX *Relative* perf". The "Relative" is the biggest question: relative to what? My guess is it's not using the same baseline. Raytracing (DXR) is part of the DirectX standard now, so games with it enabled could start being the norm in benchmarking. With no info on the chart about the games used, the baseline, etc., it's very suspect.



It's most likely rasterization, with no DLSS and no ray tracing.

Iffy stuff like 8K gaming at 60fps.....


----------



## EarthDog (Oct 7, 2020)

ARF said:


> Chiphell leaks it:
> 
> View attachment 171150
> 
> ...


Bring it!

High clocks... high power... and it's 3090 perf if that is true!

*wonders if those b1tching about power use will still b1tch about power use...lol


----------



## Caring1 (Oct 8, 2020)

Vya Domus said:


> What the hell is non RTX performance, RTX is a brand name. Maybe non-_DXR _? Even then, what's that supposed to include.


I'd say they meant RTX off, for a fair comparison.



EarthDog said:


> *wonders if those b1tching about power use will still b1tch about power use...lol


Yes I will.


----------



## Wolfkin (Oct 8, 2020)

arbiter said:


> I get feeling this is another one those amd leaks that claimed faster but they did some iffy stuff to get there. Looking at "specs" listed on linked post of the gpu. It doesn't make sense looking at history. Looking at people wanting lower draw, 6900xt is rated at around same TDP as 3090 so.
> 
> Biggest thing is "Non RTX *Relative* perf". The Relative is the biggest question as to what is the relative to? My guess is its not using same base line. RTX is a standard so games having it enabled could start being norm in bench-marking since its part of the standard. with no info on the chart is very suspect to not give details about game used, base line, etc.


With *Relative* I'm guessing they mean relative to the RTX 2080, as that card is used as the 100% baseline in the chart. So to read the chart, the percentages show how much faster or slower each card is than the RTX 2080.

Even so, I'll sure take the numbers with some salt until we have official word from AMD and independent reviews from trusted reviewers.
But yes I'm hoping, if true it will sure make for some interesting GPU competition, something we haven't had for a long time now.
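That way of reading a relative-performance chart can be sketched in a few lines. A minimal illustration; the chart values below are made up, not taken from the leak:

```python
# Hypothetical chart values: each card's score as a percentage of the
# RTX 2080 baseline (100%). These numbers are made up for illustration,
# not taken from the Chiphell leak.
chart = {
    "RTX 2080": 100,
    "RTX 2080 Ti": 125,
    "RTX 3080": 165,
    "Leaked card": 180,
}

baseline = chart["RTX 2080"]
for card, pct in chart.items():
    # A chart entry of e.g. 180% means 1.80x the baseline card's frame rate.
    print(f"{card}: {pct / baseline:.2f}x the RTX 2080")
```

The point is just that every bar only means something once you know which card sits at 100%.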


----------



## CrAsHnBuRnXp (Oct 8, 2020)

Vya Domus said:


> What the hell is non RTX performance, RTX is a brand name. Maybe non-_DXR _? Even then, what's that supposed to include.


Non ray tracing most likely.


----------



## Super XP (Oct 9, 2020)

The RTX 2080Ti seems so SLOW nowadays


----------



## Fluffmeister (Oct 9, 2020)

Super XP said:


> The RTX 2080Ti seems so SLOW nowadays



It's over two years old after all.


----------



## Super XP (Oct 9, 2020)

Fluffmeister said:


> It's over two years old after all.


Yes, but it was a beast of a GPU.


----------



## ratirt (Oct 9, 2020)

Super XP said:


> Yes but it was a beast of a GPU though.


The Moon seems big when you look at it, but if you look at Earth from the Moon, you realize how big Earth is.
The 2080 Ti is a beast until something else comes out and puts its performance in perspective. I'm sure the same thing will happen with NVIDIA's 30 series when something much faster comes out.


----------



## bug (Oct 9, 2020)

Super XP said:


> Yes but it was a beast of a GPU though.


Can't complain much when performance evolves this fast. But I'll still complain about prices


----------



## aQi (Oct 9, 2020)

1 Jan 2021: AMD RDNA 3 graphics that beat the hell out of anything on the planet; even Covid-19 doesn't stand a chance.


----------



## Rei (Oct 9, 2020)

Aqeel Shahzad said:


> 1 Jan 2021. Amd RDNA 3 graphics that beats hell out of anything on the planet even Covid-19 doesn’t stand a chance.


A rather biased comment, considering the specs haven't come out yet; and considering this is AMD, it's unlikely to beat the RTX 3080. We have yet to see it.
Also, the bit about Covid-19 is rather disrespectful to those affected by it.


----------



## bug (Oct 9, 2020)

Rei said:


> A rather biased comment considering the specs hasn't come out yet but considering this is AMD it is unlikely to beat the RTX 3080. We have yet to see it.
> Also the bit about Covid-19 is rather disrespectful to those affected by it.


While I'm the first to advocate for caution, nobody expected 5700 to beat the 2060. Or the 5700XT to basically tie the 2070, while selling for $100 less. So yes, AMD at this point has a solid tradition of lagging behind, but they can still surprise us.


----------



## aQi (Oct 9, 2020)

Rei said:


> A rather biased comment considering the specs hasn't come out yet but considering this is AMD it is unlikely to beat the RTX 3080. We have yet to see it.
> Also the bit about Covid-19 is rather disrespectful to those affected by it.



Well, AMD might put up a fair fight, but all this ridiculous hype isn't going to help either way, just like it didn't for the 5700.

You're on the same page as me: it's all theoretical claims, and nothing that may surprise us until they finally show something.

And I meant no disrespect to the Covid-19 victims. You may have read my words differently, but the literal meaning was that one should not claim something one is not capable of.

Every single gamer knows NVIDIA rules and leads graphics technology, and AMD just follows in its footsteps while claiming to be better.

If their graphics division were so innovative, they would have been the first to deliver hardware-level ray tracing.

For months now AMD has promised to deliver ray tracing in NVIDIA's footsteps. If they had such a feature ready, they would have announced it already; it still lags behind. Again: claimed, but not delivered.

There is a reason people buy NVIDIA rather than lower-priced AMD graphics with the same specs.


----------



## Rei (Oct 9, 2020)

bug said:


> While I'm the first to advocate for caution, nobody expected 5700 to beat the 2060. Or the 5700XT to basically tie the 2070, while selling for $100 less. So yes, AMD at this point has a solid tradition of lagging behind, but they can still surprise us.





Aqeel Shahzad said:


> well amd might stand a for fair fight but all this ridiculous hype is not going to help either way just like the 5700 did.
> 
> You are on the same page as i stated all theoretical claims and nothing new that may surprise until they finally show up something.
> 
> ...


Yes, well, fair enough... I did not put price-to-performance ratio into consideration with my previous comment. My bad...


----------



## aQi (Oct 9, 2020)

Rei said:


> Yes, well, fair enough... I did not put price-to-performance ratio into consideration with my previous comment. My bad...



It's OK that you didn't mention it. Most of us know AMD has done quite uniquely in their CPU line, and might in graphics too. But there is something you're missing: Intel and NVIDIA. They spend the most on R&D to bring technological breakthroughs.

AMD is just another competitor: it watches what's going on, keeps an eye on releases, and brings out the same stuff at lower prices. The ones who create out of the blue are the others.

The real fight is over AI and AR. Intel is behind AMD in most respects by now, but still ahead in hardware-level instructions. They are not too bothered about who has the world's best gaming processor; they are leaning more towards a graphics world mixed with AI and AR.

Of course AMD has taken away data-center business margin too, but neither AMD nor even Intel can take away what NVIDIA delivers. That's the very reason Intel is jumping into graphics in the first place: the future belongs to AI and AR.


----------



## Rei (Oct 9, 2020)

Aqeel Shahzad said:


> it ok if you did not mention. Most of us know amd has done quite uniquely in their cpu line, might be in graphics too. But there is something you are missing. Intel and Nvidia. They spend most in RnD to bring technological break through
> 
> Amd is just another competitor. Just sees whats going on, keeps an eye on releases, brings out same stuff with lower prices.
> The ones who create out of the blue are others.
> ...


It seems like Intel is also coming out with their own line of discrete GPUs, but I don't know its progress, when it's going to come out, or if it ever will. Can anyone elaborate on this?

It seems to me that NVIDIA has its toes dipped deeper into AI tech than Intel or AMD. Even though Intel has their Mobileye division for that, the news hasn't reached my ears.


----------



## Mr Bill (Oct 9, 2020)

One thing is for sure: the everyday search for more speed will keep NVIDIA, AMD, and Intel very happy, and help us poor folks buy the "just" outdated hardware at a 50% discount.


----------



## Ja.KooLit (Oct 9, 2020)

I'll be happy even if it is 40% faster. Either way, competition is good.


----------



## HD64G (Oct 9, 2020)

Yesterday's official benchmarks already confirmed that the rumour about Big Navi being 40-50% faster than the 2080 Ti was true from the start. Price is the big question mark now; smaller ones are the power draw (estimates place it at 250-270W) and the thermals/noise of the new cooler shroud.


----------



## cueman (Oct 9, 2020)

Rumors leak a lot, and now AMD or AMD's partners are doing the leaking.

The RX 6900 XT can't be anything other than 2x RX 5700 XT in CrossFire with RT support. RDNA 2 is still the same 7nm core with GDDR6 memory; I mean both RDNA 1 and RDNA 2.

AMD's info says the RX 6900 XT TDP is 300W. I'm sure those watts are no match for RTX 3080 performance, and there's no way it's 45% faster than the RTX 2080 Ti, or AMD is working a miracle!
We must remember the fact that the RTX 2080 Ti IS 45% faster than the fastest RX 5700 XT, which is the Sapphire RX 5700 XT Nitro+, official TDP 285W (google it).
And even that Nitro+ loses to the RTX 2070 Super FE. Enough said.

So remember: the RX 6900 XT will lose to the 10GB RTX 3080 by about 20-25%, and to the 20GB RTX 3080 paired with a Ryzen CPU by 5-10% more at least.

Against the RTX 3090 in those rigs, the difference is about 45%, yes.

And the RX 6900 XT's TDP, or should I say power draw, is over 360W.

If, I repeat, if the RX 6900 XT were even close to 10GB RTX 3080 performance, you'd 100% be seeing more AMD 'leaks' on the internet by now.
The reason is that the RTX 3000 series is selling like hell, and really earns it by the way, and AMD doesn't like it. But have you seen any leaks? Why not? Can't do it?

Remember my predictions and see for yourself on 28 October. But as I say, a 20GB RTX 3080 Super and an RTX 3080 Ti are coming, if needed.

The Ryzen 5900 CPU is excellent for all PC rigs, and AMD used it for its own odd teaser leak, with an odd choice of games no site benchmarks.
And from what I see, the RX 6900 XT's 16GB of memory only helps at 4K resolution, so a 20GB RTX 3080 is the perfect comparison.


Wait and see, you too.

Anyway, we need AMD, NVIDIA, and also Intel Xe GPUs; a great battle is coming. Where is Intel Xe? I think it goes to 2021, as does Intel's first 10nm desktop CPU.

Great weekends, all!


----------



## Super XP (Oct 9, 2020)

Umm, the 5700 XT wasn't hyped last I checked, but Vega was hyped by the AMD fan community. RDNA 2 isn't being hyped as much as Vega was; I'd call it cautiously hyped, because I believe AMD has something really good.


----------



## Fouquin (Oct 9, 2020)

cueman said:


> rx 6900 xt is and cant be nothing else than 2 x rx 5700 xt CF with Rt support.
> 
> rdna2 is still same 7nm core and gddr6 mems, i mean both rdna1 and rdna2....



You are forgetting that an architecture is not the process node. As AMD literally just showed everyone with their Ryzen 5000 announcement, you can build a better chip without changing process node or even package type.


----------



## Maye88 (Oct 9, 2020)

ARF said:


> Chiphell leaks it:
> 
> View attachment 171150
> 
> ...



If you translate the site, it literally says ESTIMATES in the title. I saw this earlier and as much as I want it to be true I can't take it as anything more than speculation. Unless someone can provide an accurate translation of what was said by the poster that states otherwise.

Navi 21 XE is an OEM only variant. Just like Navi 10 XE was OEM only. There are only 3 consumer SKUs based on Device IDs: Navi21 XTX, Navi21 XT and Navi21 XL. Everything else is OEM, Apple or Pro variants. Comparing these with the Navi 10 series the Navi 10 XT was the 5700XT and the Navi 10 XL was the 5700. XTX was used for Navi 10 Anniversary Edition and before that the Vega Founder's Edition. Thus the XTX should be the highest end SKU.

The 6800 and 6800XT cards are likely cut down to 72 CUs for yields while only the 6900 XTX has the full fat 80 CUs.

My theory. The clocks are a bit of a guess but you get the idea.

Navi 21 XTX = 6900XTX: 80CU 2300Mhz (+11% perf over 6800XT) [Basic math 80CUs/72Cus = 1.11x faster.]
Navi 21 XT = 6800XT: 72CU 2300Mhz (+12% perf)
Navi 21 XL = 6800: 72 CU 2050Mhz (base)
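Those +11%/+12% figures can be sanity-checked with naive throughput math (CUs times clock). A sketch only: the clocks are the guesses above, and real games rarely scale linearly, so treat the ratios as upper bounds, not predictions.

```python
# Naive raw-throughput scaling: performance ~ CU count x clock speed.
# Clocks are guesses from the post above; linear scaling is an assumption.
def relative_perf(cus, clock_mhz, base_cus, base_clock_mhz):
    """Ratio of raw shader throughput (CUs x clock) versus a base config."""
    return (cus * clock_mhz) / (base_cus * base_clock_mhz)

# 6800XT vs 6800: same 72 CUs, clock-only gain (2300 vs 2050 MHz) -> ~1.12x
print(f"6800XT  vs 6800:   {relative_perf(72, 2300, 72, 2050):.2f}x")
# 6900XTX vs 6800XT: same clock, CU-only gain (80 vs 72 CUs) -> ~1.11x
print(f"6900XTX vs 6800XT: {relative_perf(80, 2300, 72, 2300):.2f}x")
```

The two ratios reproduce the +12% and +11% steps in the theory above.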

Many speculate that AMD did not tease their highest end SKU. I agree. Now whether they showed Navi 21 XT or Navi 21 XL we do not know. That AMD did not show their highest model is supported by an interview Scott Herkelman gave in which he stressed that AMD did not state which GPU they showed.

If the card teased was Navi 21 XT (6800XT) then the Navi 21 XTX (6900 XTX), likely ~11% faster, should trade blows with the 3090.

However, If the card teased was the Navi 21 XL (6800) then the higher clocked 6800XT will beat the 3080 (!) and is positioned to compete with any 3080 Super/Ti variant Nvidia introduces. More importantly the 6900XTX should beat the 3090 in most games.

Edit: I want to add that Lisa Su stated that they showed Big Navi which the internet itself named. This moniker has only ever applied to Navi 21 and its variants. Thus the GPU they teased can only be one of these 3 models. Any speculation that this was Navi 22 they showed is wrong as it has never been considered Big Navi.


----------



## Zach_01 (Oct 9, 2020)

HD64G said:


> It is already confirmed by the yesterday shown official benchmarks that the rumour about Big Navi being 40-50% faster than 2080Ti was true from the start. Price is the big ? now and a smaller one is the power draw (estimations placing it at 250-270W) and thermals/noise of the new cooler shroud.


If we're going to believe the rumors, the best AMD can offer in almost a month is close-to-3080 performance: mostly losing to it and maybe winning in some titles, with less real-time ray-tracing performance, since it allegedly runs that workload inside the same architecture rather than on dedicated RT cores, plus some DLSS equivalent, probably with somewhat lower quality, and around 300W of draw, maybe a little more.

Price-wise it will have to be very competitive with the 3080, and if all the above is true, AMD knows it must price accordingly and aggressively, otherwise it won't stand a chance. Hence all the following: the use of GDDR6 (cheaper and less power-hungry than GDDR6X), the narrow memory bus (256-bit? = a less expensive and complicated design, less core power utilization, fewer transistors), and of course the implementation of a very large cache to compensate. And none of this can be true on the current 7nm node; the rumored 2.2GHz boost plus the slightly higher transistor density suggests TSMC's next enhanced node, 7NP (not 7nm+).

Again, if all this is true, AMD has chosen a very different path that may give it the chance to compete on price while offering performance really close to the 3080.

More to come with RDNA3
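The memory-bus trade-off can be put into rough numbers. A minimal sketch, assuming typical per-pin data rates (16 Gbps GDDR6 for the rumored card, 19 Gbps GDDR6X for the 3080; both rates are assumptions for illustration):

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) bytes transferred
# per cycle across the bus, multiplied by the per-pin data rate in Gbps.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s for a bus width and per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

print(f"256-bit GDDR6  @ 16 Gbps: {bandwidth_gb_s(256, 16):.0f} GB/s")  # rumored config
print(f"320-bit GDDR6X @ 19 Gbps: {bandwidth_gb_s(320, 19):.0f} GB/s")  # RTX 3080
```

On these assumptions the narrower bus delivers roughly a third less raw bandwidth (512 vs 760 GB/s), which is the gap the rumored large on-die cache would have to hide.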


----------



## ARF (Oct 10, 2020)

Super XP said:


> The RTX 2080Ti seems so SLOW nowadays



It's made on TSMC's old 16/12nm process, back in September 2018, while AMD got its first 7nm card just a few months later, in February 2019, with its then-top card, the Radeon VII.


----------



## cueman (Oct 11, 2020)

Well, I can't believe it's 45% faster than the RTX 2080 Ti. I want to see it first, and then see what the TDP is, the real gaming TDP I mean.

The fact is that AMD's fastest RX 5700 XT, the Sapphire Nitro+ Special Edition, is 44% slower than the RTX 2080 Ti.

And everyone knows the RX 6900 XT is clearly 2x 5700 XT in CrossFire mode, so it just can't work. In any case the TDP would then be very high, over 450W.

And even then it could be maybe 10%, max 20% faster than the RTX 2080 Ti, and it would have to run sky-high clocks.

AMD also put out info that the RX 6900 XT's TDP is 300W. Only 300W? I can't believe that TDP.

I mean, if the RX 6900 XT's TDP were somewhere around 360-400W, I'd believe it.


Because there is a tested example on the internet of an RX 5600 XT and RX 5700 in CF mode, and that setup drew 464W, yes, 464W.
No matter how much AMD has supposedly optimized its 7nm core, it can't cut 120W off.
Even RT support eats more juice, I'd guess 25-50W.

The RTX 3080's 320W is excellent value when you look at its performance. A true 4K GPU.

Sure AMD can't beat it; that's clear from calculation, research and comparison work, nothing else. AMD can't break the laws of physics.

I mean, if it could, we'd 100% be seeing many more AMD 'leaks' on the net by now.

The reason is clear: the RTX 3000 series is selling like hell, and AMD doesn't like it at all and wants to stop it,
and the best way to stop it is to lay out 'leaked' RX 6900 XT performance figures, but only if they're good ones.
Seen any? No.

Except Lisa Su's odd choice of cherry-picked games and rigs that no one can test, for example the 12-core Ryzen 5900X CPU. Hmm.
And from what I see, the RX 6900 XT has 16GB of memory, which only helps at high resolutions in 4K games, so a 20GB RTX 3080 with a Ryzen 5900X CPU is the best choice for the battle,
at least until Intel's Rocket Lake CPU comes in Q1/2021, lol.

And by the way, everyone wants Intel's Alder Lake hybrid CPU when it comes in Q2/2021: Intel's first 10nm desktop CPU, and a hybrid at that, with blood-freezing performance and efficiency! Wait a little, lol.

So I say: the RX 6900 XT loses to the 10GB/20GB RTX 3080 by 20-35%,
and if it can get that close, its TDP will be over 350W and the price about $699-799, maybe more, if AMD continues its zero-to-3% profit politics.

Well, blah blah. Let's wait a few weeks and we'll all see.


It would be nice to have the Intel Xe GPU in the battle line as well; that would make for a good soap opera. 2021 they promise, alongside Alder Lake?


----------



## Zach_01 (Oct 11, 2020)

cueman said:


> well i cant belive its 45% faster than rtx 2080 ti...i will anyway want see it first...aand then what is tdp,real one gaming tdp i mean.
> 
> fact is that amd FASTEST rx 5700 xt gpu is sapphire nitro+ Special Edition gpu and its 44% slower than rtx 2080 ti.
> 
> ...


Please... 6900-series GPUs are not two 5700 XTs in CF mode; you do not know what you are talking about. It will be about 300W with double or more the raw performance of a 5700 XT.
How?

Please be my guest:

AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

Makes me wonder what was the whole point of releasing RDNA1, something with feature parity with Vega, so already obsolete at launch. Primitive shaders are dead by now. Both Microsoft and Khronos went for Mesh shaders. Two generations of products marketed to have something that never worked and...

www.techpowerup.com


----------



## aQi (Oct 11, 2020)

Rei said:


> It seems like Intel is also coming out with their own line of discrete GPU but I don't seem to know it's progress, when it's gonna come out, if it ever will. Can anyone elaborate more on this?
> 
> It's seems to me that Nvidia has it's toe dip deeper into AI tech than Intel or AMD. Even though Intel has their Mobileye division for that, the news hasn't reached my ears.



As far as the situation goes, Intel is struggling to meet supply demand. NVIDIA bothers less with gaming, despite being on top; they lean more towards AI, and dominate the automotive market in self-driving electric vehicles (Tesla & Volvo) and deep machine learning.
On the other hand, Intel is slowly tightening its grip. The launch of 10th-gen chips brought hardware-level AI instructions, but that's not enough; we need graphical punch too.
A decade ago Intel dropped the budget for its internal discrete-graphics division and subsequently promoted integrated graphics, pushing past its limitations within a low TDP.
Meanwhile NVIDIA, inspired by the Crytek engine, worked consistently on ray tracing all these years and conquered modern AI designs ahead of their time.
Now the cupboards are open again at Intel to get into graphics, because the iGP was never going to compete anyway. Looking at the new Xe iGP demo on an 11th-gen prototype, it's good, even better than before, but not enough.

It's all just a chance for each company to deliver not merely a product but the future for the consumer.

AMD has impressively dominated Intel. Intel is just hanging onto third-party TSMC for 10nm fabrication, so for the moment everyone has a chance to step on Intel. NVIDIA buying ARM, then no Apple contract this year, plus a couple of lawsuits and infringements, pushed Intel to split its internal divisions too.

Intel has a lot to mend before they can come up with anything better than at least Ryzen and RDNA 2.
It will be fun, hold on......


----------



## P4-630 (Oct 15, 2020)

__ https://twitter.com/i/web/status/1316145669741051905


----------



## EarthDog (Oct 15, 2020)

P4-630 said:


> __ https://twitter.com/i/web/status/1316145669741051905
> View attachment 171891


Awwwwwwwwwwwwwwwwwwwwwwwwwwwwwwww...I'm tellllllllllllllling.................you posted news........................................................ lolol

On a serious note, I was wondering about RT perf and if they were able to keep up on the first attempt vs Nvidia who is on gen 2.

Cue users minimizing RT because it's supposedly going to be slower on AMD and only on the flagship .........


----------



## arbiter (Oct 16, 2020)

P4-630 said:


> __ https://twitter.com/i/web/status/1316145669741051905
> View attachment 171891


So I guess the problem here is "Rumors: #Navi21 #RX6000". It even says RT is less than the 3080; if the AMD card doesn't have dedicated hardware, it could take a big hit, kind of like the 1000 series does, though maybe not to that degree.


----------



## Shatun_Bear (Oct 16, 2020)

This is my prediction and I'll be interested to see where the chips fall in about a month's time after independent reviews:

At 4K:

*3080 is 30% faster than 2080 Ti.
16GB 6800X is 25% faster than 2080 Ti.

24GB 6900X is ~35% faster than 2080 Ti* (this card is coming next year, not in November; it is not the one AMD benchmarked).

Anyone expecting +50% is going to be disappointed. The 3090 is not even close to that kind of performance.
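For what it's worth, speedups quoted against the same 2080 Ti baseline divide directly into head-to-head gaps between cards. A quick sketch using the predicted numbers above (predictions, not measurements):

```python
# Predicted speedups over the RTX 2080 Ti (from the post above, not measured).
pred = {"RTX 3080": 1.30, "6800X": 1.25, "6900X": 1.35}

# Dividing two speedups over a common baseline gives the card-vs-card gap.
gap = pred["RTX 3080"] / pred["6800X"]
print(f"3080 vs 6800X: {(gap - 1) * 100:.0f}% faster")   # ~4%

gap = pred["6900X"] / pred["RTX 3080"]
print(f"6900X vs 3080: {(gap - 1) * 100:.1f}% faster")   # ~3.8%
```

So on these predictions the three cards would land within a few percent of each other, which is why independent reviews will matter more than any single headline number.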


----------



## TheoneandonlyMrK (Oct 16, 2020)

EarthDog said:


> Awwwwwwwwwwwwwwwwwwwwwwwwwwwwwwww...I'm tellllllllllllllling.................you posted news........................................................ lolol
> 
> On a serious note, I was wondering about RT perf and if they were able to keep up on the first attempt vs Nvidia who is on gen 2.
> 
> Cue users minimizing RT because it's supposedly going to be slower on AMD and only on the flagship .........


What's Raytracing !?  ,Lol


----------

