# AMD Radeon RX 3080 XT "Navi" to Challenge RTX 2070 at $330



## btarunr (May 7, 2019)

Rumors of AMD's next-generation performance-segment graphics card are gaining traction following a leak of what is possibly its PCB. TweakTown put out a boatload of information on the so-called Radeon RX 3080 XT graphics card, bound for an E3 2019 launch shortly after a Computex unveiling. Based on the 7 nm "Navi 10" GPU, the RX 3080 XT will feature 56 compute units based on the faster "Navi" architecture (3,584 stream processors) and 8 GB of GDDR6 memory across a 256-bit wide memory bus. 

The source makes two very sensational claims: one, that the RX 3080 XT performs competitively with NVIDIA's $499 GeForce RTX 2070; and two, that AMD could start a price war against NVIDIA by aggressively pricing the card around the $330 mark, or about two-thirds the price of the RTX 2070. If either, let alone both, holds true, AMD will fire up the performance segment once again, forcing NVIDIA to revisit the RTX 2070 and RTX 2060. 





*View at TechPowerUp Main Site*


----------



## sergionography (May 7, 2019)

Interesting. I wonder how power efficiency will play out on this though. Hopefully AMD doesn't squeeze things way above the sweetspot efficiency curve.


----------



## Brusfantomet (May 7, 2019)

Is the “performs competitively with NVIDIA's $499 GeForce RTX 2070” the same as the Radeon VII comparison to the 2080? Because to me it looks like the VII needs water-cooling to actually get close to the 2080.
One can hope that's the case. Having 93% of the compute units of the Radeon VII (if frequencies stay the same) should place it on par with the 2070. But I am not holding my breath.


----------



## Caring1 (May 7, 2019)

Probably run hotter and use a lot more power too.


----------



## VulkanBros (May 7, 2019)

Reminds me....



 
And it is still running - just changed the thermal paste and the thermal pads.


----------



## kastriot (May 7, 2019)

Well, this decoy is necessary ahead of the upcoming 5 nm AMD GPUs, and then the fun begins.


----------



## sutyi (May 7, 2019)

Caring1 said:


> Probably run hotter and use a lot more power too.



RTX 2070s average around 190-210 W under gaming. If Navi cannot close a 20-25% performance gap while being literally two nodes lower than Vega, and ends up at 220-240 W TDP... it won't be funny...


----------



## R0H1T (May 7, 2019)

This needs to be merged with the *choo choo* thread


----------



## Vayra86 (May 7, 2019)

If this is true, it's a pretty neat price/perf move for the midrange. With this, AMD is fast cannibalizing Nvidia's profits and market, with minimal effort. Nvidia has to match this in some way; they either do it with slightly higher performance (likely) at a slightly higher price point, or they do some weak refresh of Turing and compete on price instead (highly unlikely given where Turing is at on this node). But that still makes it a pretty sweet drop from the 2060, which was already a decent bang/buck performance card.

And, it does make Navi rather interesting for as long as it will last (that is until more 7nm GPUs come to market). Very curious about the power this thing will need.



R0H1T said:


> This needs to be merged with the *choo choo* thread



Please no, let's keep hype and inflated expectations out of here for once...


----------



## The Quim Reaper (May 7, 2019)

Hmmm...is that a 2060 Ti with 8GB VRAM release, suddenly being prepared at Nvidia HQ, I see?


----------



## R0H1T (May 7, 2019)

Isn't this just a rehash of the WCCF leak/rumor, which itself was probably based on Jim's *reveal*? It is.


----------



## THANATOS (May 7, 2019)

If it has 56 CUs then RTX 2070 performance is possible, but they wouldn't price it at $330 if the Nvidia card is selling for $499. Nvidia's card would be ~50% more expensive, and AMD is not a charity. Either the performance and parameters will be worse or the price higher. $449 would be acceptable for this kind of performance.


----------



## Upgrayedd (May 7, 2019)

They cut Intel with X399 and now Nvidia with 3xxx, if true. Starting to feel like they are just doing it to seem competitive; more bark than bite.


----------



## spnidel (May 7, 2019)

isn't the rtx 2070 pretty much a 1080 performance-wise? if so, then god damn, AMD... node shrink, and you still can't beat a 1080 ti with something that isn't HBM memory with power consumption that isn't trash. sad.


----------



## NdMk2o1o (May 7, 2019)

spnidel said:


> isn't the rtx 2070 pretty much a 1080 performance-wise? if so, then god damn, AMD... node shrink, and you still can't beat a 1080 ti with something that isn't HBM memory with power consumption that isn't trash. sad.


So you're assuming the rumor is true; then you must also assume the other part, about the cost, is true, or is it just the bits that suit you? If so, then 1080/2070 performance for $330 is pretty darn good and better value/performance than anything Nvidia has to offer, not to mention that Navi 10 was always slated to be a midrange part from initial press statements going back as much as two years. So what exactly is trash about a $330 card that competes with a $500 one? Nothing, to me.


----------



## M2B (May 7, 2019)

Why RX? RX means R10 and it makes sense as the successor of the R9.
It should be called R11 (RXI).


----------



## erocker (May 7, 2019)

THANATOS said:


> If it has 56 CUs then RTX 2070 performance is possible, but they wouldn't price it at $330 if the Nvidia card is selling for $499. Nvidia's card would be ~50% more expensive, and AMD is not a charity. Either the performance and parameters will be worse or the price higher. $449 would be acceptable for this kind of performance.


But AMD has done this before several times with both CPUs and GPUs.


----------



## PanicLake (May 7, 2019)

NdMk2o1o said:


> So you're assuming the rumor is true; then you must also assume the other part, about the cost, is true, or is it just the bits that suit you? If so, then 1080/2070 performance for $330 is pretty darn good and better value/performance than anything Nvidia has to offer, not to mention that Navi 10 was always slated to be a midrange part from initial press statements going back as much as two years. So what exactly is trash about a $330 card that competes with a $500 one? Nothing, to me.


Exactly!
Some people are evidently bad at math or something...

Anyway, let's hope for the consumers' sake that AMD has a decent product. And if someone wishes AMD to fail just because they are an nVidia "follower", they are a complete R _ _ _ _ D!


----------



## medi01 (May 7, 2019)

*Let's repeat the AdoredTV "leak", shall we?* A screenshot from the vid:


sutyi said:


> being literally two nodes lower


It's one node, not two.


----------



## Manu_PT (May 7, 2019)

Why we are still talking about AdoredTV "leaks" is beyond me... I need to create an AMD rumours YouTube channel, quick.


----------



## medi01 (May 7, 2019)

THANATOS said:


> Nvidia's card would be ~50% more expensive and AMD is not a charity


Remind me, why 570 (3 years old) ivs slower 1050Ti/1650 works.



Manu_PT said:


> Why we are still talking about AdoredTV "leaks" is beyond me...


He got a number of things right, some of which were quite unusual to expect, so he likely has an actual source within AMD.


----------



## Assimilator (May 7, 2019)

Pretty obvious that this rumour is someone taking the Radeon VII's performance vs the RTX 2080 and extrapolating it to the 2070. But that's inane, because GDDR uses a lot more watts than HBM (one of the reasons why Polaris is such a power hog), so claiming only 15W more than the 2070 for the same performance is laughable at best. TSMC 7nm is obviously far superior to GloFo 14nm, but there is only so much it can do with the dog that is (hopefully) GCN's last hurrah.



erocker said:


> But AMD has done this before several times with both CPUs and GPUs.



They've never undercut their competitor by such a significant amount, and they have zero incentive to do so this time around, unless they want to go all-out on trying to regain market share. NVIDIA putting high margins on its GPUs is good news for AMD's cashflow too (assuming the latter can produce a competitive product), and RTG certainly needs money if it's going to have any hope of competing into the future.

My prediction: almost-RTX 2070 performance at $450 and 220W TDP, quickly dropping to under $400 as NVIDIA drops its own prices in retaliation.


----------



## medi01 (May 7, 2019)

Assimilator said:


> They've never undercut their competitor by such a significant amount


8 core Zen CPUs.
290/290x.
1050Ti vs 470/570.


----------



## THANATOS (May 7, 2019)

erocker said:


> But AMD has done this before several times with both CPUs and GPUs.


When did they release a GPU with the same performance but only 2/3 of the price? Even the super cheap RX 570 has "only" a 1/3 (33%) better performance/price ratio than the ~14% faster GTX 1060, and this Navi should have a 1/2 (50%) better price/performance ratio than the RTX 2070 while performing the same? I call this nonsense; there is no reason for them to sell it so cheaply unless the performance is lower.


----------



## the54thvoid (May 7, 2019)

If the naming conventions are true, they've done this to piss off Nvidia. Unlikely to matter, though; Nvidia can make their next-gen line 3080 etc.


----------



## Vayra86 (May 7, 2019)

the54thvoid said:


> If the naming conventions are true, they've done this to piss off Nvidia. Unlikely to matter, though; Nvidia can make their next-gen line 3080 etc.



Inb4 Nvidia GTX 10000

You get a free Dragonball Z sticker with it.



NdMk2o1o said:


> So you're assuming the rumor is true; then you must also assume the other part, about the cost, is true, or is it just the bits that suit you? If so, then 1080/2070 performance for $330 is pretty darn good and better value/performance than anything Nvidia has to offer, not to mention that Navi 10 was always slated to be a midrange part from initial press statements going back as much as two years. So what exactly is trash about a $330 card that competes with a $500 one? Nothing, to me.



The price/perf was not his point, it was efficiency, and yes, that is pretty meh for 7nm, wouldn't you agree? Navi really needs that shrink to keep it somewhat reasonable, but this can't really last long on 7nm in the current state.

 But... we haven't seen the actual numbers yet


----------



## THANATOS (May 7, 2019)

medi01 said:


> 8 core Zen CPUs.
> 290/290x.
> 1050Ti vs 470/570.


Intel 8C(16T) CPUs are much costlier than the ones from AMD but also noticeably faster, and we are talking about similar performance for only 2/3 of the price!
The competitor for the 470/570 was (is) the 1060 3GB and not the 1050 Ti! We are talking about similar performance once again.

So I ask for some example where the performance was *the same* but it cost 2/3 of the competitor's price!


----------



## phill (May 7, 2019)

As always, I'll hang on and wait to see what is going to be released, but I would like to hope that something better is coming. Hopefully....


----------



## Zendo911 (May 7, 2019)

I do genuinely think that given the rumored performance, $330 is a very reasonable and competitive price. Let's not forget the RTX 2070 will be nearly a year old by the time this is released, and that the RTX 2070 comes with a larger die (hence more expensive to manufacture), and with the bells and whistles of RTX and DLSS. Now, RTX and DLSS have divided people a lot; I personally enjoy Metro with the ray-traced AO/HDR, but I know not everybody would be willing to shell out an extra amount of money for that. 

I still see two major problems with this card. First, make no mistake, this is a 1440p card. Nobody would be gaming at 4K on this card, but the problem is, there are a number of cards that already do this job (barring DXR/DLSS) at similar price points (the 1070 Ti is a prime example). The second problem is, it took AMD so much time (almost 2 years) and a two-node advantage to provide a worthy competitor that still grants them some profit (barring the mining craze, Vega 56 wasn't that profitable at MSRP, given the HBM memory and the limited quantities it was manufactured in).

Nvidia can easily counter this card. There are stop-gap solutions, like lowering the price of the RTX 2070, or introducing a 2060 Ti at a similar price point. And in no more than 6 months after the release of the RX 3080 XT, Nvidia would be ready with the RTX 3xxx generation on 7 nm.


----------



## bug (May 7, 2019)

Caring1 said:


> Probably run hotter and use a lot more power too.


Three fans and two PCIe power connectors, as pictured.


----------



## THANATOS (May 7, 2019)

medi01 said:


> Remind me, why 570 (3 years old) ivs slower 1050Ti/1650 works.


I don't understand what you wrote(meant).


----------



## Totally (May 7, 2019)

THANATOS said:


> Intel 8C(16T) CPUs are much costlier than the ones from AMD but also noticeably faster, and we are talking about similar performance for only 2/3 of the price!
> The competitor for the 470/570 was (is) the 1060 3GB and not the 1050 Ti! We are talking about similar performance once again.
> 
> So I ask for some example where the performance was *the same* but it cost 2/3 of the competitor's price!



The 480/580 was the card pitted against the 1060, not the 1050. IIRC the 1050 wasn't even announced when Polaris was released at $200/$229 vs the 1060's $249/$299. So I am confused now, as such an obvious example is what you are asking for. The 470 came later, and the 1050 was released as a response to that. 580/570, same thing all over again: 1060 variant/1050 Ti.


----------



## medi01 (May 7, 2019)

THANATOS said:


> Intel 8C(16T) CPUs are much costlier than the ones from AMD but also noticeably faster, and we are talking about similar performance for only 2/3 of the price!



Intel CPU prices when Ryzen came out. Note the cheapest 8-core.

AMD's Ryzen 7 1800X released at $499, *half the price*.


----------



## notb (May 7, 2019)

Only 8GB?
I was just convinced in another thread that this is not enough and future proof cards should have 16GB.


----------



## bug (May 7, 2019)

medi01 said:


> How much was the cheapest Intel 8 Core, before AMD had ryzen?


$599. But Intel didn't have plain 8 cores, they had 8c/16t.


----------



## medi01 (May 7, 2019)

bug said:


> $599



In your face, fanboi:


----------



## Tsukiyomi91 (May 7, 2019)

$330 sounds quite right since: 1. they ditched HBM to save cost; 2. putting it under RTX 2060 pricing, minus the specialized cores, means this may be AMD's return to spicing up the GPU market; and 3. if $330 is the MSRP, Nvidia is going to have a hard time keeping up, since they've been dominating the GPU market.


----------



## THANATOS (May 7, 2019)

Totally said:


> The 480/580 was the card pitted against the 1060, not the 1050. IIRC the 1050 wasn't even announced when Polaris was released at $200/$229 vs the 1060's $249/$299. So I am confused now, as such an obvious example is what you are asking for.


Did I mention in that quotation that the 480/580 was pitted against the 1050? I don't think so.
Is it really such an obvious example?
RX480 4GB($199) vs GTX1060FE 6GB($299) -> 50%
RX480 8GB($229) vs GTX1060FE 6GB($299) -> 31%
RX480 4GB($199) vs GTX1060 6GB($249) -> 25%
RX480 8GB($229) vs GTX1060 6GB($249) -> 9%
As you can see, at best it was a 50% and at worst a 9% difference.
I am inclined to ignore the GTX 1060 FE because for a lower price you could have the same GPU with the same amount of VRAM and better cooling to boot; it was just Nvidia's attempt to sell the same card at a premium.
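The percentages in that list are just the Nvidia launch price over the AMD launch price; a quick sketch reproducing them (MSRPs as stated in the post):

```python
# Nvidia price premium over the AMD card for each pairing listed above.
pairs = {
    "RX480 4GB vs GTX1060 FE": (199, 299),
    "RX480 8GB vs GTX1060 FE": (229, 299),
    "RX480 4GB vs GTX1060":    (199, 249),
    "RX480 8GB vs GTX1060":    (229, 249),
}
for name, (amd, nvidia) in pairs.items():
    premium = nvidia / amd - 1  # fractional price premium
    print(f"{name}: {premium:+.0%}")
# Rounds to +50%, +31%, +25%, +9%, matching the list.
```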


----------



## PanicLake (May 7, 2019)

the54thvoid said:


> If naming conventions are true, they've done this to piss off Nvidia. Unlikely, Nvidia can make their next gen line 3080 etc.


Not necessarily. I believe the main reason is to help unsavvy customers better identify their products against the other brand's range, as they did with CPUs (Ryzen 3 - 5 - 7)... it makes it easier to compare prices too.


----------



## Midland Dog (May 7, 2019)

It's GCN, don't get your hopes up: a 250 W furnace just to chase a card that performs the same or better, has more features, is far more efficient (even being a node behind), and it will be LOUD. Looking at you, VII.


----------



## bug (May 7, 2019)

medi01 said:


> In your face, fanboi:


In my face, what? Why fanboy? Why do you have to make a fuss about Intel CPUs in a thread about an upcoming AMD GPU?


----------



## Tsukiyomi91 (May 7, 2019)

Well, all I know is we all have to wait it out and see if the supposed "RX 3080 XT" is as good as or better than the RTX 2070 for $330, or if this is all just a fluke from TweakTown and other "leak sites".


----------



## IceShroom (May 7, 2019)

Remember people,* IT IS A RUMOR. Not word from AMD.*


----------



## bug (May 7, 2019)

IceShroom said:


> Remember people,* IT IS A RUMOR. Not word from AMD.*


You don't have to be mean about it. For years, people who rooted for AMD only had rumors to live on, because once AMD released their GPUs, the disappointment quickly set in.


----------



## Frick (May 7, 2019)

bug said:


> You don't have to be mean about it. For years, people who rooted for AMD only had rumors to live on, because once AMD released their GPUs, the disappointment quickly set in.



Not only rumours; people also suck at managing their expectations. See No Man's Sky.


----------



## oxidized (May 7, 2019)

Again with the "inferiority complex naming scheme". I totally expect this from AMD after what they've done with CPU naming and chipset naming; maybe this time AdoredTV is kinda right...


----------



## THANATOS (May 7, 2019)

medi01 said:


> Intel CPU prices when Ryzen came out. Note the cheapest 8-core.
> 
> 
> 
> ...


First of all, I was talking about GPUs; this news is only about GPUs, so I don't understand why you had to bring up CPUs.
Let's see what Intel did to counter 8C Zen.
Ryzen 7 1800X
Price: US $499
Release date: March 2, 2017

Core i7 7820X
Price: US $599
Release date: June 19, 2017
A difference in price of "only" ~20% ($599 vs $499), three months after Zen's release, with Intel being ~15% faster.


----------



## EarthDog (May 7, 2019)

Well, this is an interesting name...

Can't say I believe that performance and pricing, but if so, it seems like a winner. I wonder what the power consumption will be? Computex, please get here already.



Lol @ TPU being TPU again. Same clowns, different tent (thread).


----------



## kings (May 7, 2019)

I'll believe it when I see it. Every new AMD GPU comes surrounded with a lot of hype and, looking back, it usually does not materialize.

Personally, I don't expect anything faster than the Vega 64 for midrange Navi this year. And that would already be a 60% improvement in performance compared to the RX 580. That's a perfectly fine job for a Polaris successor.

But we will see... It would be good if the rumors came true for a change.


----------



## THANATOS (May 7, 2019)

Tsukiyomi91 said:


> $330 sounds quite right since 1. they ditched HBM to save cost, 2. putting it under RTX2060 pricing minus specialized cores means this may be AMD's return in spicing up the GPU market space & 3. IF $330 is the MSRP, Nvidia is going to have a hard time keeping up with the heat since they've been dominating the GPU market.


Does it?
1. So because they ditched HBM for GDDR6, the price is right? HBM is costlier, but then they can have higher margins thanks to the lower production cost per card.
2. So because Navi doesn't have specialized cores for RT, it should be priced under the much weaker RTX 2060? RT is not a great selling point at this point.
3. Why should Nvidia have a hard time? Nvidia can cut prices on Turing no problem; it's not like the RTX 2070 is much costlier to make than Navi. The VRAM is the same, and even though the RTX 2070 has a bigger die, it's built on a much cheaper manufacturing process.
I don't see anything here to justify such a low price if the performance is comparable to the RTX 2070.


----------



## Super XP (May 7, 2019)

notb said:


> Only 8GB?
> I was just convinced in another thread that this is not enough and future proof cards should have 16GB.


Forget about future-proof cards; this methodology has been proven wrong time and again. In 5 years' time, for example, today's 16GB won't be able to play the PC games of the future the way they are meant to be played.

Leave the 16GB to the high end; what this industry needs is cost-effective 8GB GPUs with some solid horsepower. Navi may not be perfect, but it just might give the gaming industry what it badly needs until AMD finishes up its new GPU design. That is price/performance.

The RTX 2070 when released was about $700. Well overpriced lol, ya no Thank You,

It seems AMD is stuck on GCN, the age-old design that single-handedly hobbled AMD's ability to properly compete in performance per watt. Hopefully their new GPU design dumps GCN in the garbage once and for all.


----------



## medi01 (May 7, 2019)

THANATOS said:


> First of all, I was talking about GPUs, this news is only about GPUs so I don't understand why you had to bring up CPUs.


Because the argument went "AMD is a company" and it "never undercut by so much", not "AMD is a company only when selling GPUs".


----------



## THANATOS (May 7, 2019)

Super XP said:


> The RTX 2070 when released was about $700. Well overpriced lol, ya no Thank You,


Next time check the actual prices:
RTX 2070: $499
RTX 2070 FE: $599
The price you mentioned was for the RTX 2080 non-FE.


----------



## Super XP (May 7, 2019)

THANATOS said:


> Does It?
> 1. So because they ditched HBM for GDDR6, the price is right? HBM is costlier, but then they can have higher margins thanks to the lower production cost per card.
> 2. So because Navi doesn't have specialized cores for RT, it should be priced under the much weaker RTX 2060? RT is not a great selling point at this point.
> 3. Why should Nvidia have a hard time? Nvidia can cut prices on Turing no problem; it's not like the RTX 2070 is much costlier to make than Navi. The VRAM is the same, and even though the RTX 2070 has a bigger die, it's built on a much cheaper manufacturing process.
> I don't see anything here to justify such a low price if the performance is comparable to the RTX 2070.


You are assuming Nvidia priced its GPUs correctly. That is a wrong assumption.


----------



## medi01 (May 7, 2019)

THANATOS said:


> Ryzen 7 1800X
> Price: US $499
> Release date: March 2, 2017
> 
> ...


Not sure if trolling or simply stupid.
*At the moment AMD dropped 8 core Ryzen, cheapest 8 core Intel was above 1k$.*


----------



## Super XP (May 7, 2019)

THANATOS said:


> Next time check the actual prices:
> RTX 2070: $499
> RTX 2070 FE: $599
> The price you mentioned was for the RTX 2080 non-FE.


I was quoting Canadian pricing, lol.
I'll quote USD from now on.


----------



## medi01 (May 7, 2019)

THANATOS said:


> non FE.


At this very moment, from NV's site, non-Fools Edition:


----------



## zo0lykas (May 7, 2019)

Look guys what I found


----------



## Ibotibo01 (May 7, 2019)

THANATOS said:


> The competitor for the 470/570 was (is) the 1060 3GB and not the 1050 Ti


I want to say that when the GTX 1050 Ti was released in 2016, its price was $140. The RX 470 cost *$180* and it only beats the GTX 1050 Ti by about 30%. The GTX 1050 Ti is only 75W; well, how much power does the RX 470 draw?
*R9 390X/390 = RX 580 for 2 years. I think Nvidia defeats AMD. The GTX 980 is on par with the GTX 1060 6GB.*
AMD built Ryzen over 4 years with the help of Jim Keller, and he has since left AMD. So what will we see in 2021?
In the same way, Raja Koduri left AMD. I don't expect these GPUs to be as good as an RTX 2070 for $330. Maybe it will be on par with the GTX 1660 Ti (RX 3080 non-XT). 




*Why do you defend AMD? 
Everyone is right about Nvidia's pricing method*. I think the GTX 1650 should cost $130.


----------



## THANATOS (May 7, 2019)

medi01 said:


> Because the argument went "AMD is a company" and it "never undercut by so much", not "AMD is a company only when selling GPUs".


You were quoting Assimilator, as shown in your link, and he was only talking about GPUs; he even wrote *competitor* and *not competitors*, as he would have if he were talking about both CPUs (Intel) and GPUs (Nvidia).


----------



## AnarchoPrimitiv (May 7, 2019)

sergionography said:


> Interesting. I wonder how power efficiency will play out on this though. Hopefully AMD doesn't squeeze things way above the sweetspot efficiency curve.



Yeah, power efficiency is super important, because at an average cost of $0.12 - $0.15 per kWh for electricity, an additional 100 watts consumed by a video card could end up costing you an egregious $30/year extra... and we can't have that
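Sarcasm aside, the arithmetic behind that ~$30/year figure is easy to sketch. The 5 hours/day of gaming below is an assumption, not a number from the post:

```python
# Rough yearly cost of an extra 100 W of GPU power draw.
def yearly_cost(extra_watts, hours_per_day, usd_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

for rate in (0.12, 0.15):
    # At 5 h/day this lands in the $22-27/year range, i.e. roughly the $30 quoted.
    print(f"${yearly_cost(100, 5, rate):.2f}/year at ${rate}/kWh")
```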


----------



## Manu_PT (May 7, 2019)

AnarchoPrimitiv said:


> Yeah, power efficiency is super important, because at an average cost of $0.12 - $0.15 per kWh for electricity, an additional 100 watts consumed by a video card could end up costing you an egregious $30/year extra... and we can't have that



Nice prices over there in the US! Try again in EU countries and you might as well save enough for a new GPU in 3 years. Also, I love my hardware to be ultra silent while maintaining good temps, without overspending on watercooling.


----------



## Assimilator (May 7, 2019)

medi01 said:


> Intel CPUs, when Ryzen have come. Note the cheapest 8 core.
> 
> 
> 
> ...



Good job comparing Ryzen (mainstream) with -E series (HEDT)!


----------



## ensabrenoir (May 7, 2019)

.....the "I'll name my thing +1 higher than your thing" game is getting lame to me. Have we become that gullible? Why is the whole "my new thing is just as good / almost as good as your current thing" the norm? Where did the leapfrogging each other every generation go? -- grumpy old man who hasn't had a cup of coffee.


----------



## THANATOS (May 7, 2019)

AnarchoPrimitiv said:


> Yeah, power efficiency is super important, because at an average cost of $0.12 - $0.15 per kWh for electricity, an additional 100 watts consumed by a video card could end up costing you an egregious $30/year extra... and we can't have that


You have two similarly performing GPUs: GPU A and GPU B. GPU A is $50 cheaper, so you buy it because it has the better performance/price ratio and you get more for your money; but on the other hand, after 2 years you have paid $60 more in bills than if you had chosen GPU B. So now, which is the better card?
If you are not the one paying the bills then it's still GPU A, but if you pay them then it should be GPU B.
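The total-cost-of-ownership comparison above can be sketched in a few lines. The absolute prices are made up for illustration; only the $50 and $60 gaps come from the post:

```python
# Two-year total cost of ownership for the hypothetical GPUs A and B above.
price_a, price_b = 280, 330            # assumed prices; only the $50 gap matters
power_cost_a, power_cost_b = 120, 60   # assumed 2-year electricity cost; $60 gap

total_a = price_a + power_cost_a
total_b = price_b + power_cost_b
# The cheaper card up front ends up $10 more expensive overall if you pay the bill.
print("GPU A total:", total_a)
print("GPU B total:", total_b)
```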


----------



## spnidel (May 7, 2019)

NdMk2o1o said:


> So you're assuming the rumor is true; then you must also assume the other part, about the cost, is true, or is it just the bits that suit you? If so, then 1080/2070 performance for $330 is pretty darn good and better value/performance than anything Nvidia has to offer, not to mention that Navi 10 was always slated to be a midrange part from initial press statements going back as much as two years. So what exactly is trash about a $330 card that competes with a $500 one? Nothing, to me.



according to the rumor, the top-end part doesn't beat the 2080; I don't care about the price.



NdMk2o1o said:


> So what exactly is trash about a $330 card that competes with a $500 one?


the efficiency. learn to read. if they need 2 8-pin power connectors with a god damn 7nm NODE ADVANTAGE to be on par with a 2070, then what is there to say about the efficiency of a potential higher-end card? nothing. it's going to suck even more ass unless they bump up the budget for their R&D department big time.



GinoLatino said:


> Anyway, let's hope for the consumers' sake that AMD has a decent product. And if someone wishes AMD to fail just because they are an nVidia "follower", they are a complete R _ _ _ _ D!


good job calling a person that has an R7 1700, a Vega 64 alongside a Fury X, an R9 290, and an HD 6950 prior to it a "retard" for wanting better performance from AMD LOL


----------



## Midland Dog (May 7, 2019)

Vayra86 said:


> Inb4 Nvidia GTX 10000
> 
> You get a free Dragonball Z sticker with it.
> 
> ...


unwise to piss off a sleeping giant, cant wait to see them bring out RTX4___ series and skull rape RTG on 7nm EUV


----------



## BorgOvermind (May 7, 2019)

Brusfantomet said:


> Is the “performs competitively with NVIDIA's $499 GeForce RTX 2070” the same as the Radeon VII comparison to the 2080? Because to me it looks like the VII needs water-cooling to actually get close to the 2080.
> One can hope that's the case. Having 93% of the compute units of the Radeon VII (if frequencies stay the same) should place it on par with the 2070. But I am not holding my breath.


Depends on what you look at. In raw compute power the VII beats the 2080 by 30%.


----------



## spnidel (May 7, 2019)

BorgOvermind said:


> Depends on what you look at. In raw compute power the VII beats the 2080 by 30%.


come now, we all know what people mean when they say "performs competitively with [another graphics card]"


----------



## Kissamies (May 7, 2019)

Reminds me of Polaris in many ways. Well, in a good way.


----------



## THANATOS (May 7, 2019)

spnidel said:


> the efficiency. learn to read. if they need 2 8-pin power connectors with a god damn 7nm NODE ADVANTAGE to be on par with a 2070, then what is there to say about the efficiency of a potential higher-end card? nothing. it's going to suck even more ass unless they bump up the budget for their R&D department big time.


They don't really need two 8-pin connectors (375W including the PCI Express slot) for Navi. The Radeon VII has a 300W TBP, so this should be under 250W at worst. If they have a brain then they won't clock it too high, so it will be at the level of Vega 56, which is 210W. That wouldn't be so bad.


----------



## bug (May 7, 2019)

THANATOS said:


> They don't really need two 8-pin connectors (375W including the PCI Express slot) for Navi. The Radeon VII has a 300W TBP, so this should be under 250W at worst. If they have a brain then they won't clock it too high, so it will be at the level of Vega 56, which is 210W. That wouldn't be so bad.


But there are two connectors in the picture. I can't tell what type, but even if they're both 6 pin, it's still more than what the RTX 2070 reference design needs.


----------



## AltCapwn (May 7, 2019)

It's always the same hype: great performance for a great price. Unfortunately, it always ends up the same: sold at a higher price, no availability, performance not that good. 

Well, at Canadian prices.


----------



## THANATOS (May 7, 2019)

Super XP said:


> I was quoting Canadian pricing, lol.
> I'll quote USD from now on.


Please do, so there won't be any misunderstanding next time.


----------



## Space Lynx (May 7, 2019)

I thought I read somewhere that we are getting some official AMD announcements today, May 7th? Am I remembering wrong? Maybe I was thinking of June 7th and just got the dates wrong, eh...


----------



## THANATOS (May 7, 2019)

bug said:


> But there are two connectors in the picture. I can't tell what type, but even if they're both 6 pin, it's still more than what the RTX 2070 reference design needs.


The Radeon VII also has two 8-pin connectors and it doesn't have a 375W TBP. I think it will have a higher TDP than the RTX 2070.


----------



## bug (May 7, 2019)

lynx29 said:


> I thought I read somewhere that we are getting some official AMD announcements today, May 7th? Am I remembering wrong? Maybe I was thinking of June 7th and just got the dates wrong, eh...


That's May 27


----------



## Penev91 (May 7, 2019)

Manu_PT said:


> Nice prices over there in the US! Try again in EU countries and you might as well save enough for a new GPU in 3 years. Also, I love my hardware to be ultra silent while maintaining good temps, without overspending on watercooling.


I live in an EU country and electricity here is just under $0.13/kWh.


----------



## THANATOS (May 7, 2019)

medi01 said:


> Not sure if trolling or simply stupid.
> *At the moment AMD dropped 8 core Ryzen, cheapest 8 core Intel was above 1k$.*


So what? I already said we were talking about GPUs not CPUs, can't you read?


----------



## cucker tarlson (May 7, 2019)

Challenge is a broad term.
AMD showed the Radeon VII challenging the 2080, and it turned out to lose by 10-15%.
Since it's GCN and allegedly has trouble clocking, IMO this is going to be 2060 performance at $330.
Recent AMD releases are more like AMD supporters eventually getting NVIDIA performance than AMD really challenging their lineup.


----------



## THANATOS (May 7, 2019)

Super XP said:


> You are assuming Nvidia priced its GPUs correctly. That is a wrong assumption.


We can also compare it to AMD's current lineup. The RTX 2070 is ~30% faster than Vega 56.
Vega 56 is sold for $320, and at the time of release it was $399.
The Navi 3080 XT will be $330 with RTX 2070-level performance, according to rumors.
The performance/price ratio would be about 27% better, and if we compare launch prices, about 58% better.
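For the record, the perf/price arithmetic sketches out as below; all inputs are the rounded, rumored figures from this post rather than measured data, and the exact percentages land a touch under the quoted ones depending on rounding:

```python
# Perf/price comparison using the rounded figures quoted above.
# All inputs are rumors/estimates from the thread, not benchmarks.
vega56_perf = 1.00            # baseline
navi_perf = 1.30              # rumored RTX 2070-level, ~30% above Vega 56

vega56_price_now = 320        # current street price (USD)
vega56_price_launch = 399     # launch price (USD)
navi_price = 330              # rumored price (USD)

def perf_per_dollar(perf: float, price: float) -> float:
    """Relative performance units per dollar spent."""
    return perf / price

gain_now = perf_per_dollar(navi_perf, navi_price) / perf_per_dollar(vega56_perf, vega56_price_now) - 1
gain_launch = perf_per_dollar(navi_perf, navi_price) / perf_per_dollar(vega56_perf, vega56_price_launch) - 1

print(f"vs. Vega 56 today:     +{gain_now:.0%}")     # roughly +26%
print(f"vs. Vega 56 at launch: +{gain_launch:.0%}")  # roughly +57%
```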


----------



## dinmaster (May 7, 2019)

AMD needs to step up in GPUs; enough with these mid-range additions. What happened to beating NVIDIA at the highest card level? Come out with a better card, bringing prices down from the top. I understand the price is good and all, but by the time it comes out, NVIDIA will have 2nd-gen ray tracing cards on the way. I personally don't mind waiting for AMD to release something right after NVIDIA does that crushes them at all levels.


----------



## nguyen (May 7, 2019)

Knowing AMD, this 3080 XT will most likely be as fast as an overclocked RTX 2060 (~RTX 2070), while having no overclocking headroom and consuming 250W. AMD fans would still go on about how you can undervolt it to save 20-30W of power and lower fan noise, and about the additional 2GB of VRAM compared to the RTX 2060.


----------



## Auer (May 7, 2019)

nguyen said:


> Knowing AMD, this 3080 XT will most likely be as fast as an overclocked RTX 2060 (~RTX 2070), while having no overclocking headroom and consuming 250W. AMD fans would still go on about how you can undervolt it to save 20-30W of power and lower fan noise, and about the additional 2GB of VRAM compared to the RTX 2060.



I hope not. Undervolting capability is not exactly a huge selling point....


----------



## oxidized (May 7, 2019)

Auer said:


> I hope not. Undervolting capability is not exactly a huge selling point....



For many it is, as it's going to decrease consumption (as if it doesn't affect performance).


----------



## Gasaraki (May 7, 2019)

sutyi said:


> RTX 2070's avg around 190-210W under gaming.  If NAVI can not close a 20-25% performance gap while being literally two nodes lower than VEGA and going up to 220-240W TDP per se... it won't be funny...



Radeon VII is 7nm...   still hot as balls, just saying.



GinoLatino said:


> Not necessarily, I believe the main reason is to help unsavvy customer to better identify their product with the other brand range, as they did with CPUs Ryzen 3 - 5 - 7... it makes it easier to compare prices too.



LOL, I love AMD fanboys. Always some excuse. AMD did this on purpose to piss off Intel and NVIDIA: R3, R5, R7 CPUs vs. i3, i5, i7; X470 and X399 chipsets vs. Intel's Z370 and X299 chipsets. Now they are doing the same thing to NVIDIA: RX 3060, RX 3070, RX 3080 vs. RTX 2060, RTX 2070, RTX 2080.

Hate AMD for this. They can't be unique? They just have to confuse end users for no reason?


----------



## Auer (May 7, 2019)

oxidized said:


> For many it is, as it's going to decrease consumption (as if it doesn't affect performance).



So what is a better selling point, Ability to undervolt or no need to undervolt?


----------



## oxidized (May 7, 2019)

Auer said:


> So what is a better selling point, Ability to undervolt or no need to undervolt?


Hey i agree with you, i'm just pointing out what some people (mostly AMD blind fans) think why it's a good selling point.


----------



## cucker tarlson (May 7, 2019)

oxidized said:


> For many it is, as it's going to decrease consumption (as if it doesn't affect performance).


Not only did you (or they) get it backwards, but you also forgot that any card can be undervolted,
as if AMD had some special UV abilities.
A portion of Radeon GPUs crashes when using Wattman's auto-UV feature; it's a silicon lottery.


----------



## oxidized (May 7, 2019)

cucker tarlson said:


> Not only did you (or they) get it backwards, but you also forgot that any card can be undervolted,
> as if AMD had some special UV abilities.
> A portion of Radeon GPUs crashes when using Wattman's auto-UV feature; it's a silicon lottery.



Read the post above yours please.


----------



## SIGSEGV (May 7, 2019)

I ain't happy with this leak. It truly shows there's stagnancy in their GPU R&D department.


----------



## B-Real (May 7, 2019)

https://wccftech.com/amd-navi-radeon-rx-gpu-rumors-navi-20-2020-rx-navi-graphics-cards/

$200 for Vega 56 performance with 130W TDP.
$330 for RTX 2070 performance with 190W TDP.

Looks like (in mid and high tier) AMD gets very close to NV in terms of performance-power consumption ratio, while still having the price-performance advantage. Very nice.



SIGSEGV said:


> I ain't happy with this leak. It truly shows there's stagnancy in their GPU R&D department.


WUT? It nearly reaches the performance-power consumption ratio of NV while costing 30-35% less.



spnidel said:


> isn't the rtx 2070 pretty much a 1080 performance-wise? if so, then god damn, AMD... node shrink, and you still can't beat a 1080 ti with something that isn't HBM memory with power consumption that isn't trash. sad.


Still, the RTX 2070 only consumes 10W less and may cost $170 more. And there will be 3090 and 3090XT. So what are you talking about?


----------



## spnidel (May 7, 2019)

B-Real said:


> Still, the RTX 2070 only consumes 10W less and may cost $170 more. And there will be 3090 and 3090XT. So what are you talking about?


consumes 10W less on a larger node.


B-Real said:


> And there will be 3090 and 3090XT.


well, if there's going to be a 3090 and 3090XT - hooray! hopefully this time with 300W of power they'll be able to reach 2080 ti levels of performance. preferably at a lower pricepoint too.


B-Real said:


> So what are you talking about?


go read my post again if you've forgotten already LOL


----------



## Auer (May 7, 2019)

B-Real said:


> https://wccftech.com/amd-navi-radeon-rx-gpu-rumors-navi-20-2020-rx-navi-graphics-cards/
> 
> $200 for Vega 56 performance with 130W TDP.
> $330 for RTX 2070 performance with 190W TDP.
> ...



Also, the RTX2070 will be a year old this fall.....


----------



## dirtyferret (May 7, 2019)

not only are the new AMD CPUs & GPUs offering better performance and price in every test against their intel/nvidia counterparts but they also do the following;

*regrow hair
*solve male impotence
*cure cancer
*help the environment
*are vegan safe to eat (ok this may be real in theory)
*play crysis


----------



## Auer (May 7, 2019)

dirtyferret said:


> not only are the new AMD CPUs & GPUs offering better performance and price in every test against their intel/nvidia counterparts but they also do the following;
> *play crysis



Don't push it man..


----------



## vega22 (May 7, 2019)

Manu_PT said:


> Nice prices over there on US! Try again on EU countries now and you might aswell save enough for a new GPU in 3 years. Also I love my hardware to be ultra silent while maintaining good temps and not overspending on watercooling.




Sucks to live in PT, sorry. UK prices are close to US ones, if not slightly better.

So you don't mind paying over the odds for a good card, but won't save money on the card and upgrade its cooling to match performance, while still saving money on your power bill?

Interesting.


----------



## efikkan (May 7, 2019)

Impressive: such a huge pile of wild expectations spawned from a random guy's guesses about performance and price.

Setting pricing and bins with final specs is the last step of the qualification process, which usually happens weeks ahead of release. No one, including the engineering team at AMD, knows the final specs 6 or 12 months ahead. Anyone who claimed to know the specs and pricing of Navi 1x or Zen 2 last year, or claims to know Navi 2x pricing/specs now is lying, because you can't know a fact that doesn't exist yet. For products far into the future AMD operates with rough targets, not precise pricing, product naming, core config or clocks.

And one last thing about Navi: by the time it arrives, Nvidia will soon be preparing their next generation.



BorgOvermind said:


> Depends on what you look at. In direct computing power the VII beats the 2080 by 30%.


Performance matters, not specs. The RTX 2080 has far fewer "cores", far fewer GFlop/s, and is made on an "inferior" node, and yet it outperforms the Radeon VII.


----------



## HD64G (May 7, 2019)

Auer said:


> Also, the RTX2070 will be a year old this fall.....


But still too expensive to make many customers buy it...


----------



## EarthDog (May 7, 2019)

Auer said:


> Undervolting capability is not exactly a huge selling point....


That depends on who you ask. When VII came out, that was all the rage.. 'hey, undervolt and overclock for best results'.


HD64G said:


> But still too expensive to make many customers buy it...


One might think that... but, oddly enough, it's the most-used Turing card on Steam. It is behind the RX 580, but with a higher growth rate.


----------



## medi01 (May 7, 2019)

THANATOS said:


> So what?


*So "AMD never cut price like that" is BS. *
*The poster above found $330 vs $499 unbelievable, but AMD did $499 vs $1089.*



efikkan said:


> from a random guy.


AdoredTV is "a random guy", also don't miss more breaking news on BentOverBackwards network, at 6pm, in "Green and Greener".



Assimilator said:


> Good job comparing Ryzen (mainstream) with -E series (HEDT)!


I'm all ears about a "non-HEDT" 8-core CPU by Intel, or about exactly what makes a CPU HEDT (or not).
Why is it so f*cking hard to admit you said something that turned out to be wrong and just move the f*ck on? Why do you need to go full pathetic?


----------



## f22a4bandit (May 7, 2019)

Going to treat this rumor with a mountain of salt. However, is the pricing really that unbelievable? It's not as if AMD will have dedicated hardware for raytracing on these things, nor the RnD costs of the RTX cards to make up. It's plausible at least.


----------



## medi01 (May 7, 2019)

THANATOS said:


> You were quoting @Assimilator as shown in your link and he was only talking about GPUs,


Oh, really? Only about GPUs? Even barring the idiocy of a commercial company undercutting competitors in the CPU but not the GPU market, just how pathetic can the argument get:


----------



## bug (May 7, 2019)

f22a4bandit said:


> Going to treat this rumor with a mountain of salt. However, is the pricing really that unbelievable? It's not as if AMD will have dedicated hardware for raytracing on these things, nor the RnD costs of the RTX cards to make up. It's plausible at least.


It's not the pricing that should be taken with a grain of salt, but the alleged performance and power efficiency. Plus, this is all a rumor, nothing from AMD directly.


----------



## prtskg (May 7, 2019)

f22a4bandit said:


> Going to treat this rumor with a mountain of salt. However, is the pricing really that unbelievable? It's not as if AMD will have dedicated hardware for raytracing on these things, nor the RnD costs of the RTX cards to make up. It's plausible at least.



I think NAVI has RT cores as Sony has said their next console has. And their next console uses NAVI, which I think you already know.



bug said:


> It's not the pricing that should be taken with a grain of salt, but the alleged performance and power efficiency. Plus, this is all a rumor, nothing from AMD directly.



I think they'll achieve that performance thanks to 7nm and factory overclocking. Power efficiency is something I'll believe when I see it.


----------



## THANATOS (May 7, 2019)

medi01 said:


> *So "AMD never cut price like that" is BS. *
> *The poster above found $330 vs $499 unbelievable, but AMD did $499 vs $1089.*


As I already mentioned, it's about GPUs, not CPUs. So my reply





> So what?


 was correct, because I was talking about GPUs.
Scroll lower to my next post for more details.


----------



## kastriot (May 7, 2019)

War... war between brands and their supporters never changes.


----------



## Dave65 (May 7, 2019)

Caring1 said:


> Probably run hotter and use a lot more power too.



Don't care; if it is as fast and cheaper, the cost of power is a minor concern.


----------



## THANATOS (May 7, 2019)

medi01 said:


> Oh, really? Only about GPUs? Even barring idocity of commercial company undercutting competitors in CPU but not GPU market, just how pathetic can the argument get:
> 
> View attachment 122494


As I mentioned already, you quoted a part of what Assimilator said, and in the rest that you didn't quote he only mentioned Nvidia and GPUs.
Did Assimilator directly mention CPUs or Intel somewhere in that post? No.
Did Assimilator quote and reply to erocker, who mentioned both CPUs and GPUs? Yes.
You simply assumed he was talking about both of them. If you want to know whether he really meant both or just GPUs, then ask him. It's that simple!

BTW, erocker was quoting and replying to my post, and, surprise, I was talking only about Nvidia vs. AMD and GPUs. So the whole debate was started by me and was about GPUs, not both.


----------



## EarthDog (May 7, 2019)

Dave65 said:


> Don't care, if it is as fast and cheaper the cost of power is slight.


Indeed, but heat mitigation and noise are also part of the equation.


----------



## lexluthermiester (May 7, 2019)

sergionography said:


> Interesting. I wonder how power efficiency will play out on this though.


I was wondering this also.


----------



## medi01 (May 7, 2019)

THANATOS said:


> ...you quoted a part of what Assimilator said...


No, I literally put entire screenshot with entire post.
Are you OK?



THANATOS said:


> Did he directly mention cpus or Intel somewhere in that post?


You mean, when saying "*AMD never undercut their competitor by such a significant amount*", he meant only some competitors in some of the markets, but just forgot to put "some" into the sentence?
Could it be he also meant a particular time period?


----------



## Legacy-ZA (May 7, 2019)

I don't think I have seen any mention of this yet: if the new AMD cards support CrossFire, AMD might have a big advantage over NVIDIA.

NVIDIA doesn't support NVLink/SLI on their RTX 2060/2070 and GTX 16-series cards.


----------



## Casecutter (May 7, 2019)

AnarchoPrimitiv said:


> videocard could end up costing you an egregious $30/year extra...and we can't have that


And being so attentive, you've already invested in a PSU that is at least 80 Plus Gold, right?



THANATOS said:


> So what? I already said we were talking about GPUs not CPUs, can't you read?


On Jun 25th, 2008 the HD 4870 released at $299, while the GTX 280 released about a week earlier, on Jun 16th, 2008, at $650, and only offered 13% more performance; on perf/$ that was half the 4870.
https://www.techpowerup.com/reviews/Diamond/HD_4870/23.html

Having efficiency is great unless you have to pay someone else that "monthly savings" all up front.
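The perf/$ figure above checks out; a quick sketch using the launch prices in this post and the ~13% performance gap from the linked review:

```python
# HD 4870 vs. GTX 280 at launch (mid-2008), launch prices in USD.
hd4870_price, gtx280_price = 299, 650
hd4870_perf, gtx280_perf = 1.00, 1.13   # GTX 280 ~13% faster per the review

# GTX 280 performance-per-dollar relative to the HD 4870.
ratio = (gtx280_perf / gtx280_price) / (hd4870_perf / hd4870_price)
print(f"{ratio:.2f}")  # ~0.52, i.e. about half the perf/$ of the 4870
```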


----------



## THANATOS (May 7, 2019)

medi01 said:


> No, I literally put entire screenshot with entire post.
> Are you OK?


I wasn't talking about that. I was talking about this.


medi01 said:


> > They've never undercut their competitor by such a significant amount
> 
> 
> 8 core Zen CPUs.
> ...





medi01 said:


> You mean, when saying "*AMD never undercut their competitor by such a significant amount*" he meant only some competitors in some of the markets, but just forgot ot put "some" into the sentence?
> Could it be he also meant particular time period?


Just ask him; then you will know whether he meant, in this part of his post:





> They've never undercut their competitor by such a significant amount, and they have zero incentive to do so this time around, unless they want to go all-out on trying to regain marketshare.


both GPUs and CPUs or not, because in his whole post he explicitly mentioned only GPUs and Nvidia; there was no mention of CPUs or Intel.


----------



## cucker tarlson (May 7, 2019)

f22a4bandit said:


> Going to treat this rumor with a mountain of salt. However, is the pricing really that unbelievable?


lol, NVIDIA themselves have a 2070 challenger at $350, and it comes with RT features too.


----------



## THANATOS (May 7, 2019)

Casecutter said:


> And being so attentive you've already invested in a PSU that is at least GOLD+ right?
> 
> 
> Jun 25th, 2008 the 4870 released at $299, and the GTX 280 released about a month earlier Jun 16th, 2008 at $650 and only offered 13% more performance, or on perf/$ that was half the 4870.
> ...


It was almost 11 years ago, but I accept it. Thanks for providing this. I still think it won't happen with Navi, but that's just my opinion and I could be wrong.


----------



## EarthDog (May 7, 2019)

Legacy-ZA said:


> I don't think I have seen any mention of this yet; If the new AMD cards support Crossfire, AMD might have a big advantage over nVidia.
> 
> nVidia doesn't support nVlink/SLi on their RTX2060 / 2070 and 1600 series cards.


MEH... multi-GPU has lost its luster and has been losing favor for what seems like years now. Scaling isn't always there, double the power, double the heat, double the price... unless you cannot reach the FPS you want, single card is the way to go.


----------



## bug (May 7, 2019)

kastriot said:


> War.. war between  brands and their supporters never change


This isn't war, this is just medi01 going bananas over CPUs in a thread about GPU rumors.


----------



## Super XP (May 7, 2019)

Ibotibo01 said:


> I want to say that when the GTX 1050 Ti released in 2016, its price was 140 dollars. The RX 470 cost *180 dollars* and only beat the GTX 1050 Ti by 30%. The GTX 1050 Ti is only 75W; how much power does the RX 470 draw?
> *R9 390X/390 = RX 580 for 2 years. I think Nvidia defeats AMD. The GTX 980 is the same as the GTX 1060 6GB.*
> AMD built Ryzen over 4 years because Jim Keller helped AMD, and then he left. So what will we see in 2021?
> In the same way, Raja Koduri left AMD. I don't expect these GPUs to be as good as an RTX 2070 for 330 dollars. Maybe it will match the GTX 1660 Ti (RX 3080 non-XT).
> ...


Jim Keller never stays with one company for very long. He's hired to help design CPU microarchitectures, then moves on.

What Jim Keller did for AMD was not only assist with the Zen design, but also have a hand in the CPU design after Zen 4's completion, in 2021 or so. Basically he's set AMD up several years in advance.

He will do the same for Intel, then move on to other ventures or simply retire.

It took AMD about 5 years to design and launch Zen. In 2015 AMD started development on a new GPU design, built from the ground up. That GPU should be available sometime in 2020, though depending on how well Navi performs as a last GCN-based effort, AMD may well push the new GPU design to 2021 to further enhance the architecture.

With Dr. Lisa Su leading AMD and the success of Zen, I am confident Navi will impress just enough to satisfy the average gamer, all while they prepare the brand-new GPU.



EarthDog said:


> MEH... multi-GPU has lost its luster and has been losing favor for what seems like years now. Scaling isn't always there, double the power, double the heat, double the price... unless you cannot reach the FPS you want, single card is the way to go.


As much as I would like to disagree, I cannot, lol.

Perhaps one day games will take better advantage of multiple GPUs.


----------



## Casecutter (May 7, 2019)

EarthDog said:


> Indeed, but heat mitigation and noise are also part of the equation.


How's that helping the "de-contented" GTX 1660 Ti? Even with a low TDP, many run hot and are no better on dBA; they just cheapen up the cooler to the point it looks like something you used to expect from a $120 budget card. Oh, but they "gussy it up" with a plastic backing plate.
https://www.techpowerup.com/reviews/MSI/GeForce_GTX_1660_Ti_Ventus_XS/32.html



THANATOS said:


> It was almost 11 years ago, but I accept It. Thanks for providing this. I still think It won't happen with Navi, but that's just my opinion and I could be wrong.


These are just rumors, but I think AMD wants to grab market share. I think AMD will be doing all it takes to get RTG back to competitive, and working off Nvidia's inflated margins is one place to do it. They'll run this 7nm derivative of GCN for at best 18 months, then it will move to the "next-gen" architecture. I think their "Ryzen" moment will be early 2022. Right now they need to build on what is/was a rebirth after their lull in R&D, moving past pruning the dead wood from the Raja Koduri era.


----------



## cucker tarlson (May 7, 2019)

medi01 said:


> *So "AMD never cut price like that" is BS. *
> *The poster above found $330 vs $499 unbelievable, but AMD did $499 vs $1089.*
> 
> 
> ...


you're getting yourself so worked up before we have any tangible indication of performance and price.
let's wait and see.
do you really think rtg is in the position to undercut nvidia that much?then what's all the fuss you're making?nvidia themselves have a $350 competitor for 2070,it's called 2060.


----------



## EarthDog (May 7, 2019)

Casecutter said:


> How's that helping the "de-contented" GTX 1660 Ti, even with low TDP many run hot and not better on dBA, they just cheapen-up what they give as a cooler to the point it look like something you use to expect from $120 budget construction. Oh, but "gussy it up" with a plastic backing-plate.
> https://www.techpowerup.com/reviews/MSI/GeForce_GTX_1660_Ti_Ventus_XS/32.html


Well, heat and temperature are different things, mind you. There are other factors involved outside of TDP. 

That said, higher-TDP cards can be quieter than lower-TDP cards; it just depends on other factors. But typically, and in general, a lower TDP with the same cooler/fans and a heat load from the same size source should run cooler and quieter.


----------



## dicktracy (May 7, 2019)

Adored already did a 180 on this bogus rumor. LOL. Don't set yourselves up for another major disappointment.


----------



## EarthDog (May 7, 2019)

dicktracy said:


> Adored already did a 180 to this bogus rumor. LOL Don’t set yourselves up for another major disappointment.


A link would be good.


----------



## ToxicTaZ (May 7, 2019)

Nvidia will lower the cost of the TU106 (2060 and 2070).

Nvidia will counter Navi 10/20 with the TU104.

Nvidia will release the RTX 2070 Ti (TU104-300A), which is (1080 Ti / Radeon VII) performance for a lower price, possibly with 8GB and 16GB options. 

I heard talk about an RTX 2080+ (unlocked TU104-475A) @ 2GHz with 8GB & 16GB models (3072 CUDA cores). 

The "RTX 2070 Ti" will blow away the RX 3080 XT,

as the "RTX 2080 Plus" blows away the RX 3090 XT come February 2020.

This is most likely what's going to happen.


----------



## Markosz (May 7, 2019)

I still heavily doubt this will be the series name...
NVIDIA's next gen would be named the same in that case, plus the 'XT' is weird.


----------



## Auer (May 7, 2019)

Am I the only one who thinks that AMD producing an RTX 2070 competitor for $330, a year after the RTX 2070 launched, is really not all that remarkable?
And without RTX and DLSS as well. I know some people don't care about that, but there it is all the same.


----------



## notb (May 7, 2019)

Markosz said:


> I still heavily doubt this will be the series name...


If you read forum/reddit discussions about Ryzen chipset names, you'd notice this... well... impresses the hardcore followers.
Doubt not. AMD is capable of doing this. It's sad, but that's how they do business.


> NVIDIA's next gen would be named the same in that case, plus the 'XT' is wierd.


"XT" suffix was used by ATI and AMD as well. The latest "XT" card was China-only RX560 XT.


Casecutter said:


> How's that helping the "de-contented" GTX 1660 Ti, even with low TDP many run hot and not better on dBA, they just cheapen-up what they give as a cooler to the point it look like something you use to expect from $120 budget construction. Oh, but "gussy it up" with a plastic backing-plate.
> https://www.techpowerup.com/reviews/MSI/GeForce_GTX_1660_Ti_Ventus_XS/32.html


Temperature has little to do with heat emission, which is still very low. But this is a small, cheap-ish cooler; it's expected to provide less cooling than the "flagships", which it does.

Also, the load temperature is still perfectly fine. Many expensive cards reach over 70°C (still a GPU's comfort zone).
In other words: there is some potential to limit the Ventus' RPM, leading to lower noise.
And there was hardly any sacrifice performance-wise. It's almost an MSI 1660 Ti Gaming (within the measurement error margin).


> moving past the pruning dead wood from the Raja Koduri era.


I remember perfectly well that Koduri was an AMD-fanboy hero not so long ago. Funny how quickly things change.
Lisa Su will quit at some point (likely for an AMD competitor). I wonder what will happen to all those signed CPUs then...



Auer said:


> Am I the only one that thinks that AMD producing a RTX2070 competitor for $330 a year after the RTX2070 launched is really not all that remarkable?


As far as selling goes, they can ask $10. Making a profit is another story.


> And without RTX and DLSS as well. I know some ppl dont care about that but there it is all the same.


I think everyone cares now. It's just that AMD fans are still reluctant to admit it (they mocked RTX just a few months ago).
It only takes Lisa Su announcing an RT acceleration chip and they'll all praise the idea.


----------



## Nkd (May 7, 2019)

spnidel said:


> isn't the rtx 2070 pretty much a 1080 performance-wise? if so, then god damn, AMD... node shrink, and you still can't beat a 1080 ti with something that isn't HBM memory with power consumption that isn't trash. sad.



It's the same damn number of CUs or less; what else do you expect? Miracles? If the price is wrong, then complain.


----------



## Casecutter (May 7, 2019)

EarthDog said:


> Well, heat and temperature are different things, mind you.


Correct, in that two things matter: what heat ends up inside your case (or keeps you warm), and perhaps noise, if you don't play with a headset or someone else in the room is getting annoyed. (But what does any of that matter... "I'm gaming here!")



Auer said:


> Am I the only one that thinks that AMD producing a RTX2070 competitor for $330 a year after the RTX2070 launched is really not all that remarkable?


Sure, not that remarkable, although when you consider that AMD/RTG hasn't spent anywhere near as much on graphics engineering R&D, and has made it to TSMC and their 7nm process, they are scrappy and showing they may still have competitive ability.



notb said:


> Koduri was an AMD-fanboy hero


To me he was never more than a "show-boat" in all things, master of none... a self-made celebrity.



Nkd said:


> If the price is wrong then complain


Complain to whom? Those floating the rumor... this is not AMD/RTG spreading any of it...


----------



## Manoa (May 7, 2019)

I read all this, but I still don't know: is this GCN or not?


----------



## GreiverBlade (May 7, 2019)

Interested... at $330.

Well, if the RTX 2070 weren't $550+ for me...

Oh well, wait and see then.


----------



## notb (May 7, 2019)

Manoa said:


> I read this all  but I still don't know: is this a GCN or not ?


GCN.


----------



## Rexolaboy (May 7, 2019)

The title of the article suggests that AMD or a third party has tested the RX 3080 XT and it has benchmarked similarly to an RTX 2070. So far AMD hasn't said anything like this, and there are no numbers from the leak to suggest it. I think stuff like this is a little childish, or even dangerous for the release of the product, creating a bigger letdown than needed. People should understand that what to expect is still a GCN GPU, and that should speak for itself.


----------



## Totally (May 7, 2019)

THANATOS said:


> Did I mention in that quotation that 480/580 was pitted against 1050? I don't think so.
> Is It really such an obvious example?



I didn't say you said that either. I stated a fact and implied you were wrong because you are.



THANATOS said:


> *470/570 was(is) 1060 3GB and not 1050Ti!*



I can't say you suck at basic math, but you must have serious difficulty with word problems; the answer is right, but that's not the solution. To paraphrase, the statement is: similar performance at two-thirds *of the cost**. What's two-thirds of $299? So (2/3) × $NVIDIA = $AMD -> (2/3) × $299 = $199.

Let's see: 1060 6GB, $299; RX 480 4GB, $199; at 1080p they perform more or less the same, and they're GPUs, not CPUs.

**Hint: the keyword there is "of", not "more/less."
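Spelled out, the arithmetic in dispute is a single scaling step:

```python
# "Similar performance at two-thirds of the cost": scale the NVIDIA price by 2/3.
nvidia_price = 299                     # GTX 1060 6GB launch price (USD)
amd_price = (2 / 3) * nvidia_price
print(round(amd_price))                # 199, matching the RX 480 4GB launch price
```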


----------



## Casecutter (May 7, 2019)

Manoa said:


> I read this all  but I still don't know: is this a GCN or not ?


Short answer: yes, yes it still is... That said, back in 2016 some of the last work on GCN was to unburden it from a lot of the compute and professional requirements that had remained all this time, while the move to GDDR6 will help with GCN being somewhat needy for bandwidth. Just don't expect a lot, or an OMG moment: more stripping out unused bits games didn't use, a memory bump while lowering power, and at 7nm a mix of increased clocks balanced against power improvements.

All that said, a 56 CU part (aka Vega 56) needs to add 30% in performance to be nipping at the 2070. If you say that none of that is aided by GDDR6 (Vega had that covered with HBM), they'll need 15% from chip tweaks and 15% from 7nm, about what pushed the Radeon VII above the basically existing Vega 64 with all its compute still there. I'm seriously *not seeing this Navi really besting an RTX 2070 @ 1440p* in performance, but if it's close at, say, $350, it will be competitive and can't come soon enough.
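Those two 15% estimates compound multiplicatively rather than add, which is how they roughly cover the ~30% gap; a sketch using the guesses above:

```python
# Compounding the estimated gains: ~15% from chip tweaks, ~15% from the 7nm node.
# Both figures are the thread's guesses, not measured uplifts.
chip_tweaks = 1.15
node_shrink = 1.15
combined = chip_tweaks * node_shrink
print(f"combined uplift: {combined - 1:.1%}")  # ~32%, just over the ~30% needed
```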


----------



## Manoa (May 7, 2019)

This sounds like the Bulldozer-to-Excavator improvements xD
I think it's not worth it, guys; it will give very little. Yeah, there may be some price changes from Nvidia, and maybe a low-cost version, but like you say, it doesn't give an OMG.
Maybe for people with old cards who want a higher resolution, or more speed at the same lower resolution, but it's not going to be for 3840x2160 :x
It's best to wait for something much faster, something that can give a good 60+ at 3840x2160.
I have a 780 Ti and I run anything, even new games, at 1920x1080, and I don't have any performance problems. 
If you want to upgrade, I think it's best to do it for something significant and worthwhile, 
or wait... unless it can be faster than the Radeon VII? Like 30% or 50%?



Casecutter said:


> I'm seriously *not seeing this Navi really besting an RTX 2070 @ 1440p* in performance


Wait, I don't understand: the Radeon VII is also 2070-level, no? So why make a new card that is the same or lower performance than an old card?! What's the point?


----------



## bug (May 7, 2019)

Rexolaboy said:


> Title of article suggests that AMD or a 3rd party has tested the rx 3080xt and it has benchmarked similarly to an rtx 2070. So far AMD hasn't said anything like this, and there are no numbers from the leak to suggest this. I think stuff like this is a little childish or even dangerous for the release of the product creating a bigger let down than needed. People should understand what to expect is still a GCN GPU and that should speak for itself.


You are right, but for the past few generations, raising false expectations through "unsanctioned" leaks is all that AMD could do in the GPU space. To the point that some people now pay over a grand for a video card.


----------



## Casecutter (May 7, 2019)

Manoa said:


> but not going to be for 3840x2160


Sure, no one should consider these for "stellar top-shelf 4K" at some supposed $350-ish price, but it's better than having to pay a 45% higher price to see basically similar immersive play.



bug said:


> through "unsanctioned" leaks is all that AMD could do in the GPU space


Blame the victim much? I never see this as AMD tamping down or igniting expectations, just folks soliciting click-bait from someone willing to start some speculation.


----------



## efikkan (May 7, 2019)

Casecutter said:


> Short answer yes, yes it still is... That said, back in 2016 some of the last work on GCN was to unburden it from a lot of the compute and professional requirements that have all this time remained, while moving to GDDR6 will help, as GCN is somewhat needy for bandwidth.


Neither bandwidth nor computational power has been the problem for GCN.
The RX 580 has 256 GB/s of memory bandwidth compared to the similarly performing GTX 1060 at 192 GB/s,
or
the Radeon VII at a massive 1 TB/s vs. the RTX 2080's 448 GB/s (which is still overkill).
The problem for GCN has always been resource utilization, and improvements to resource management will be the deciding factor for Navi, if there is anything substantial at all.
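A quick back-of-envelope check of those bandwidth figures. The bandwidth numbers are from the post; the TFLOPS figures are approximate reference specs I'm assuming for illustration:

```python
# (approx. peak TFLOPS, memory bandwidth GB/s) -- TFLOPS values are assumed
# reference specs, not from the post.
cards = {
    "RX 580":     (6.17, 256),
    "GTX 1060":   (4.4,  192),
    "Radeon VII": (13.4, 1024),
    "RTX 2080":   (10.1, 448),
}

# Bandwidth per unit of compute: GCN parts carry as much or more than
# their NVIDIA counterparts, so raw bandwidth isn't the bottleneck.
for name, (tflops, gbps) in cards.items():
    print(f"{name}: {gbps / tflops:.1f} GB/s per TFLOP")
```

The Radeon VII's ratio comes out far above the RTX 2080's, which is the "still overkill" point in numbers.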


----------



## Manoa (May 7, 2019)

efikkan said:


> The problem for GCN have always been resource utilization, and improvements to resource management will be the deciding factor for Navi, if there is anything substantial at all.



yeah, that's why it's the "async monster" xD


> it's better than having  pay a 45% higher price to see basically the similar immersive play.


+1, but it's still a waste of a card. Making a card that is the same as or worse than your own older Radeon VII, I really don't understand the point of it. Maybe better not to make the card at all, lel


----------



## eidairaman1 (May 7, 2019)

Just wait till it's out.


----------



## Caring1 (May 8, 2019)

eidairaman1 said:


> Just Wait till Its out.


But then we miss out on fun threads like this


----------



## Camm (May 8, 2019)

notb said:


> I think everyone cares now. It's just that AMD fans are still reluctant to admit it (they mocked RTX just few months ago).



I'll still mock the shit out of it, and I own a 2080 Ti. No other way to cut it, image fidelity tanks with them enabled, with the only game I even consider turning on these technologies for being Metro Exodus, with every other implementation not worth the hit to FPS and fidelity.

There needs to be a more efficient way of doing RT without chunking out a huge part of the die to do so. As for DLSS, most testing with it shows that you get better fidelity just lowering the res for the same FPS. Could it be better someday? Maybe. But not this gen.


----------



## cucker tarlson (May 8, 2019)

Camm said:


> I'll still mock the shit out of it, and I own a 2080 Ti. No other way to cut it, image fidelity tanks with them enabled, with the only game I even consider turning on these technologies for being Metro Exodus, with every other implementation not worth the hit to FPS and fidelity.
> 
> There needs to be a more efficient way of doing RT without chunking out a huge part of the die to do so. As for DLSS, most testing with it shows that you get better fidelity just lowering the res for the same FPS. Could it be better someday? Maybe. But not this gen.


Yup, except no one reasonable could even imagine just having RT like that, going from rasterization to RT in just one generation.
You took the first step, a $1200 one at that.


----------



## steen (May 8, 2019)

efikkan said:


> Neither bandwidth nor computational power has been the problem for GCN.
> RX 580 have a 256 GB/s memory bandwidth compared to the similarly performing GTX 1060 at 192 GB/s
> or
> Radeon VII at a massive 1 TB/s vs. RTX 2080's 448 GB/s (which is still overkill).
> ...



No, continual incremental updates like Fermi->Turing will do that for you. It must be acknowledged NV has executed, especially given the relative competition vacuum across the entire product stack. Not to say GCN8/9 made no improvements over earlier GCN, but without new RTL the key shortcoming of GCN is tough to work around. IMO the front end and the entire register/cache/pipeline are 3 generations behind NV. Performance of Vega is reasonable in that context. Until AMD sorts out their front-end 4 tris/clk vs. 6 tris/clk deficit, they will likely have to keep pushing their silicon harder and lose out in perf/watt. DSBR/primitive shaders didn't pan out with Vega, and TU now has more flexible mesh shaders and VRS. We'll see what Raja Koduri's fixes for GCN amount to. Other than leveraging 7nm and supporting VRS, I am a bit sceptical.

I disagree about Vega 20/TU104 bandwidth being overkill in the context of compute use. Vega 20 is the MI50. In ML there are many cases where a neural network is bandwidth-limited, not compute-limited. More bandwidth for TU102/4 would get it closer to its theoretical max. Frame buffer size is also an issue, where >6GB neural-net models make TU104/6 marginal. You can never have enough frame buffer or bandwidth.



Camm said:


> There needs to be a more efficient way of doing RT without chunking out a huge part of the die to do so.



The whole point is that the RTX/tensor silicon is only ~10% of the die space. It's the redesigned TU uarch that was beefed up to support the register/cache/pipeline demands of RTX. That's why TU is 10-20% faster than GP at the same clock, but die area has blown out.



> As for DLSS, most testing with it shows that you get better fidelity just lowering the res for the same FPS. Could it be better someday? Maybe. But not this gen.



I keep asking the question: what do people think DLSS is? My not-so-humble view is that it's a misnomer on NV's part. They should have left it at MLAA.


----------



## sutyi (May 8, 2019)

Gasaraki said:


> Radeon VII is 7nm...   still hot as balls, just saying.



Still Vega, mostly unchanged, which was not really efficient in the first place, especially for gaming-type workloads. They used 7nm to clock it sky high, pushing it out of the clock/power sweet spot yet again, so it can serve as a bridging product on the desktop roadmap till Navi takes its place. I'm hoping Navi improves perf/watt somewhat, but it won't be anything revolutionary in the power department, as it's still based on GCN, and that has had some pretty hefty shortcomings, especially in the geometry engines/Shader Engines. I just hope it will be competitive on price/performance with a tad lower power; that would be well enough till Arcturus, or whatever the new Super-SIMD GPU design will be called, arrives.

New tech products excite me enough to stay interested and informed about them, but I've never really ridden the hype train over the years, and I would urge others to stay off it too.


----------



## Camm (May 8, 2019)

steen said:


> The whole point is that the RTX/tensor silicon is only ~10% of the die space. It's the redesigned TU uarch that was beefed up to support the register/cache/pipeline demands of RTX. That's why TU is 10-20% faster than GP at the same clock, but die area has blown out.
> 
> I keep asking the Q, what do people think DLSS is? My not so humble view is that it's a misnomer on NV's part. They should have left it at MLAA.



What's this 10% of die space then? From basic bucket math you can see that tensor/RT takes up about half of an SM, and with SMs taking up about half of the die, you have at least 20% of the die dedicated to making RT work (on the basis that RT is currently too slow to run without supersampling).

As for DLSS, I don't really care about the how, more that it should be providing better fidelity at the same FPS than just using a lower resolution in the first place. Which, bluntly, it doesn't.
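The "bucket math" in the first paragraph, written out under its stated assumptions (both one-half fractions are the poster's estimates, and the ~10% counter-figure elsewhere in the thread disputes them):

```python
# Poster's assumed fractions -- not measured die-shot data.
sm_share_of_die = 0.5        # assumed: SMs occupy about half the die
rt_tensor_share_of_sm = 0.5  # assumed: tensor/RT units are about half an SM

rt_die_share = sm_share_of_die * rt_tensor_share_of_sm
print(f"estimated RT/tensor die share: {rt_die_share:.0%}")  # prints: 25%
```

Strictly, those fractions multiply out to 25%, which is why the post hedges to "at least 20%".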


----------



## medi01 (May 8, 2019)

Super XP said:


> He will do the same for Intel, then move onto other ventures or simply retire.


Jim Keller is at Tesla now.



cucker tarlson said:


> what's all the fuss





cucker tarlson said:


> before we have any tangible indication of performance and price.


*Let's not pretend that you are an impartial bystander, shall we?*

It is about greenboi's inability to accept a "water is wet" kind of fact.
"*AMD* never *undercut a competitor* like that" is apparent bullshit: the cheapest 8-core from Intel was $1089 when AMD's 1800X hit at $499. End of story.
The "let's spin it" effort has come up, so far, with:

1) "But $1089 is a HEDT chip" (yay)
2) "But he meant GPUs" (actually the 290(X) vs. 780 wasn't far off: $549 vs. a slower chip at $650)
3) "But we are not talking about GPUs" (coming from the "cartels are fine" guy)

Yay. Pathetic.



cucker tarlson said:


> do you really think rtg is in the position to undercut nvidia that much?


I believe AdoredTV does have an actual insider connection, and as I see it, AdoredTV just dropped *BAD NEWS, NOT GOOD NEWS*. Namely:
1) Not meeting target clocks
2) Power hungry <= the worst part
3) Losing to the VII, CU for CU

As for whether a 7nm chip with GDDR memory and 2070-ish performance is possible at $330: uh, is that even a question?



cucker tarlson said:


> nvidia themselves have a $350 competitor


Could you guys at least hide your BH? I mean, what the fuck does "oh, but my great company has an answer to this, I don't need to hide and cry" have to do with it? Jesus.




dicktracy said:


> Adored already did a 180 to this bogus rumor.


It's the 180 we are discussing here, "I don't even read the first page"/"I only read reddit titles" kid.
The video is linked on the very first page, with the most relevant parts of it as screenshots.




Manoa said:


> is this a GCN or not ?


Do you even understand what "is GCN" means?



Camm said:


> There needs to be a more efficient way of doing RT without chunking out a huge part of the die to do so.


I recall someone estimated that 22% of die are dedicated to it, not that much.


----------



## Caring1 (May 8, 2019)

medi01 said:


> Do you even understand what "is GCN" means?


Great Card Now?


----------



## steen (May 8, 2019)

Camm said:


> What's this 10% of die space then? From basic bucket math you can see that tensor/RT takes up about half of an SM, and with SMs taking up about half of the die, you have at least 20% of the die dedicated to making RT work



Are you looking at comparative die shots or marketing slides?



> (on the basis that RT is currently too slow to run without supersampling).



Sorry, what? If RTX ran with SSAA it would be a slide show.



> As for DLSS, I don't really care on the how, more that it should be providing better fidelity at the same FPS than just using a lower resolution in the first place. Which bluntly, it doesn't.



I can appreciate that. DLSS x2 runs at native res, but performance suffers. I actually think MLAA tech is interesting and has scope for future IQ/perf advancement, especially on the TAA front. Even MS's "super resolution" via DirectML results in undersampled edges, with gaps remaining. The best description of DLSS is a lossy neural-network image compressor with a reconstruction-process side channel (the DLSS profile).


----------



## cucker tarlson (May 8, 2019)

medi01 said:


> Jim Keller is at Tesla now.
> 
> 
> 
> ...


I was just pointing out that this is a rumor. Whether AdoredTV is your guru changes nothing. What's with your rabid attitude? Did you piss your pants hearing AdoredTV and it started to itch now?
You're only active on TPU to attack people you disagree with and to bait.


----------



## Vayra86 (May 8, 2019)

Caring1 said:


> Great Card Now?



No, Great Card Next - it's always coming!


----------



## Spencer LeBlanc (May 8, 2019)

I miss the days of the XTs and GTs, etc. Bring back the good ole days.


----------



## bug (May 8, 2019)

Spencer LeBlanc said:


> I miss the days of the XT's and GTs etc. Bring back the good ole days.


I miss the days of Trio64 and Unreal demos


----------



## medi01 (May 8, 2019)

cucker tarlson said:


> adtv is your guru


Because stating "I believe he has insider links" makes him "my guru".





cucker tarlson said:


> <A bunch of childish insults>


Oh, get lost.


----------



## cucker tarlson (May 8, 2019)

medi01 said:


> Because stating "I believe he has insider links" makes him "my guru".
> 
> 
> 
> ...


Sir, come on.
Let's not pretend.


----------



## jabbadap (May 8, 2019)

medi01 said:


> Jim Keller is at Tesla now.
> 
> *Let's not pretend that you are an impartial bystander, shall we?*
> 
> ...



Jim Keller moved to Intel from Tesla. Not that it really has anything to do with GPUs, let alone Navi or RTX...


----------



## Rexolaboy (May 8, 2019)

bug said:


> You are right, but for the past few generations, raising false expectations through "unsanctioned" leaks is all that AMD could do in the GPU space. To the point some people pay now over a grand for a video card



You honestly think AMD would leak this information to the public? It's not even impressive information; I'm sure AMD would "leak" something a little more titillating. Clocking issues, an IPC decrease, and a silly naming scheme sound like rumors and hearsay spread by non-technical staff. Just like the MSI rep who was telling someone that his B350 board wouldn't work with Ryzen 3rd gen.


----------



## bug (May 8, 2019)

Rexolaboy said:


> You honestly think AMD would leak this information to the public? Its not even impressive information, I'm sure AMD would "leak" something a little more titillating. Clocking issues, IPC decrease, and silly naming scheme sound like rumors and hearsay spread by non technical staff. Just like the MSI rep that was telling someone that his b350 board wouldn't work with Ryzen 3rd gen.


I don't care much about what AMD would or wouldn't leak. But at the same time I can't help noticing they never act on these leaks, therefore they must be ok with the generated word of mouth.

And just look at how the world was taken by surprise by Turing or Zen. That proves when companies want to keep something from the public, that's what they'll do.


----------



## Auer (May 8, 2019)

bug said:


> I don't care much about what AMD would or wouldn't leak. *But at the same time I can't help noticing they never act on these leaks, therefore they must be ok with the generated word of mouth.*
> 
> And just look at how the world was taken by surprise by Turing or Zen. That proves when companies want to keep something from the public, that's what they'll do.



Well it's free publicity.


----------



## Rexolaboy (May 8, 2019)

Zen's leaks were pretty accurate; they were based on engineering samples. We already know what Navi should perform like considering it's based on GCN, but it looks worse lol


----------



## efikkan (May 8, 2019)

steen said:


> No continual incremental updates like Fermi->Turing will do that for you. It must be acknowledged NV has executed, esp given the relative competition vacuum across the entire product stack. Not to say GCN8/9 made no improvements to earlier GCN, but without new RTL the key shortcoming of GCN is tough to work around. IMO the front end & entire register/cache/pipeline are 3 gens behind NV. Performance of Vega is reasonable in that context. Until AMD sort their front end 4tris/clk to 6tris/clk deficit they will likely have to keep pushing their silicon harder & lose out in perf/watt.


The problem is not ROP performance, it's management of resources.
GCN has changed very little over the years, while Kepler -> Maxwell -> Pascal -> Turing have continued to advance and achieve more performance per core and per GFlop, to the point where they have about twice the performance per watt and 30-50% more performance per GFlop.



steen said:


> You can never have enough frame buffer or bandwidth.


More is usually better, except when it comes at a great cost.
16 GB of 1 TB/s HBM2 is just pointless for gaming purposes. AMD could have used 8 or even 12 GB, and priced it lower.



Rexolaboy said:


> You honestly think AMD would leak this information to the public? Its not even impressive information, I'm sure AMD would "leak" something a little more titillating.


AMD certainly does leak information when they see a reason to. But no, these specs are not leaked by AMD, as they don't leak specs until they are finalized. This thread is just another "victim" of someone's speculation on a YouTube channel…


----------



## Casecutter (May 8, 2019)

efikkan said:


> GCN have always been resource utilization


Yea, that's an oversimplification on my part, or the wrong way of saying it... Correct, it's not "bandwidth" so much as how that memory speed/throughput is utilized in GCN, and correct that this wasn't fixed in the architecture with GDDR5; even with Vega/HBM I'm not sure what changes were brought in to free that up.

And true, the "computational power" of GCN is not its problem; however, it never truly got employed in gaming engines (DX11). Today I'm not sure its extent, even for DX12, is aiding gaming enough to warrant such a huge dependence on it in the architecture. I understood that they tasked engineering (2015-16) to trim that back in a way that saves power, and this "Navi" is the first chip to have that.



bug said:


> I can't help noticing they never act on these leaks, therefore they must be ok with the generated word of mouth.


So they are supposed to come out and confirm/deny (argue, attest, authenticate, bear out, certify, corroborate, substantiate, support, validate, verify, vindicate) every Tom, Dick and Harry story? That's not how any smart individual or company does it. Once you start, you're giving away "something" every time you open your mouth, and where does it stop? And yes, any discussion generated from nothing but rumor is still talk keeping you or your company relevant.
The Kardashians built an empire on just that kind of crap.


----------



## efikkan (May 8, 2019)

Casecutter said:


> And true the "computational power" of GCN is not its' problem, however never truly got employed in gaming engines (DX11).  Today I'm not sure its' extent even for DX12, is aiding gaming enough to warrant a huge dependence in the architecture.  I understood that they tasked engineering (2015-16) to trim that back in a way to save power, and this "Navi" is the first chip to have that.


Game engines don't implement low-level GPU scheduling, dependency analysis or low-level resource management; not even the driver can do this, as it's managed on chip. While you can tune some aspects of a game engine and see how it impacts performance, you can't solve GCN's underlying problem in software.


----------



## Casecutter (May 8, 2019)

efikkan said:


> you can't solve GCN's underlying problem in software


I don't see that I said anything is fixed by the software/driver. It was game-engine developers who never saw the value, or had the tools, to construct engines in a way that makes use of such "on chip" resources. Much like the Bulldozer core implementation, the fault was going in a direction nobody else was looking to go.


----------



## Super XP (May 8, 2019)

ToxicTaZ said:


> Nvidia will lower the cost of the TU106 (2060 and 2070)
> 
> Nvidia will counter Navi 10/20 with the TU104
> 
> ...


If Navi (which is based on the old 2011 GCN design) ends up anywhere near the RTX 2070 at a $300 to $350 cost, it would be an Nvidia embarrassment.

Can AMD give us one more crack at GCN before trashing it? We'll soon find out.


----------



## Auer (May 8, 2019)

Super XP said:


> If NAVI (That is based on old 2011 GCN Design) ends up anywhere near the RTX 2070 for a $300 to $350 cost, would be an Nvidia embarrassment.
> 
> Can AMD give us one more crack at GCN before trashing it? We'll soon find out.



nV's level of embarrassment shrinks every day that goes by with no sign of competition. The RTX 2070 has been out since October, with RT and DLSS.

If anything, if AMD doesn't have a clear rival out by the end of summer for $350, the embarrassment will be all theirs.

The RTX 2070 is current production, not new production. Shouldn't AMD at this stage release something better than an RTX 2070 for the same $$$?


----------



## ToxicTaZ (May 8, 2019)

Nvidia will lower their cost on the TU106 (2060/2070).

Nvidia will use the TU104 to fight against AMD's Navi 10/20 GPUs.

Nvidia is releasing an RTX 2070 Ti (TU104-300A) with the same performance as the 1080 Ti/Radeon VII at a lower cost, to deal with Navi 10.

Nvidia also has an RTX 2080U model coming (a fully unlocked TU104 with the full 3072 CUDA cores @ 2 GHz) to deal with Navi 20.

Both the RTX 2070 Ti and RTX 2080U come with optional 16GB-plus models.

Like always, AMD has nothing to go against Nvidia's TU102 (RTX Titan/RTX 2080 Ti).


----------



## Casecutter (May 8, 2019)

Auer said:


> for the same $$$


Let me fix that: IF AMD/RTG releases a 7nm part before Nvidia that comes close to or nips at the RTX 2070, at a price that's 30% less, you'll see that as embarrassing?

All the while they'd be doing it with a pittance of the R&D/financials, during staff restructuring, and with a basically new relationship with a foundry. And all while using an architecture that taped out in 2010 and ultimately saw only slight revisions until this (might be) first major overhaul. I'm not seeing it... unless you mean embarrassing for Nvidia?


----------



## Auer (May 8, 2019)

Casecutter said:


> Let me fix that, IF... AMD/RGT releases a 7nm part before Nvidia, even if close or nips at RTX 2070, at a price that's 30% less you'll see that as embarrassing?
> 
> Though all the while they do it with a pittance of the R&D/financials, while staff restructuring, and basically new relation with a foundry.  All while using an architecture that taped out in 2010 and ultimately only saw slight revisions until this (might be) the first major overhaul.  I'm not seeing it... unless you mean embarrassing for Nvidia?



No one cares about AMD's staff restructuring etc. except market analysts and investors. Gamers don't give a damn about that. Corporate heroics don't impact in-game FPS.

7nm means nothing unless it's cheaper and faster. Equal won't be good enough. AMD's market share for GPUs is abysmal atm, and they need a much bigger splash than the R7 was.

Meanwhile, I doubt nV is just doing nothing. AMD is never going to compete by releasing a matching product 6 months later for 30% less. Actually, it feels like AMD doesn't even really care that much atm. And that's a shame.


----------



## Casecutter (May 8, 2019)

Auer said:


> Actually it feels like AMD doesn't even really care that much atm


On that we agree: AMD/RTG really has not cared to vie with Nvidia, especially in the enthusiast market. They kept up appearances, but as Intel is now having to "sit up straight," you can never be caught slouching in the seat of the postulating king.


----------



## vega22 (May 8, 2019)

THANATOS said:


> BTW erocker was quoting and replying to my post and surprise I was talking only about Nvidia vs AMD and GPUs. So the whole debate was started by me and was about GPUs and not both of them.



The 290X beating the first Titan at half the price?


----------



## efikkan (May 8, 2019)

Casecutter said:


> I don't see that I said it anything that is fixed by the software/driver?  It was game engine developers that never saw/knew the value or tools to constructed in a way to make use of such "on chip" resources.  And much like Bulldozer core implementation, something that was a fault of their going toward a direction nobody was looking to go.


They couldn't even if they wanted to.
The APIs we use (Direct3D, OpenGL and Vulkan) are GPU-architecture agnostic.
When it comes to "optimizing" game engines, there is very little developers can do; they certainly can't control the GPU's internal scheduling even if they wanted to. Optimization is largely limited to tweaking buffer sizes, resource sizes and generic operations to see what performs better, not the true low-level GPU-specific optimization most people imagine.



Super XP said:


> If NAVI (That is based on old 2011 GCN Design) ends up anywhere near the RTX 2070 for a $300 to $350 cost, would be an Nvidia embarrassment.


How would anyone be embarrassed by Navi coming close to Nvidia?
AMD is the one who should be embarrassed if a "better" node and a "newer" design, with probably more cores and GFlops, can't beat last year's contender from Nvidia, which I don't expect it to do…


----------



## Manoa (May 9, 2019)

medi01 said:


> Do you even understand what "is GCN" means?


Funny you would say that, implying that you do. Hey fellas, check this out: *medi01 is an AMD engineer, how cool is that?*


----------



## Midland Dog (May 9, 2019)

New game: guess the TDP.
1: Pleasant surprise, 150W
2: Disappointing for 7nm, 225W
3: Good Ole GCN, 300W
4: Meme status, 400W+



Manoa said:


> funny you would say that, implying that you do. hey fellas check this out *medi01 is an AMD engineer how cool is that ?*


Graphics Core Next. And as far as I can tell, the CU layout hasn't changed since the HD 7000 series, inclusive of Vega's "NCU".


----------



## jabbadap (May 9, 2019)

Midland Dog said:


> new game guess the tdp
> 1: Pleasant surprise, 150w
> 2: Dissapointing for 7nm, 225w
> 3: Good Ole GCN, 300w
> 4: Meme status, 400w+



Well, if it has a VirtualLink connector, the card's TDP might be lower than its PCIe power connectors suggest. I would say one thing though: if it is a 300W TDP, it will beat the Radeon VII.


----------



## Auer (May 9, 2019)

Midland Dog said:


> new game guess the tdp
> *1: Pleasant surprise, 150w*
> 2: Dissapointing for 7nm, 225w
> 3: Good Ole GCN, 300w
> ...



Hoping for #1

Would be nice for AMD to finally shake the "Hot, Loud and Slow" image. True or false, public perceptions matter.


----------



## rvalencia (May 9, 2019)

efikkan said:


> The problem is not ROP performance, it's management of resources.
> GCN have changed very little over the years, while Kepler -> Maxwell -> Pascal -> Turing have continued to advance and achieve more performance per core and GFlop, to the point where they have about twice the performance per watt and 30-50% more performance per GFlop.
> 
> 
> ...


The GeForce RTX 2080 Ti has 88 ROPs with 6 GPCs, each containing a geometry/raster engine.

Vega II has 64 ROPs with 4 Shader Engines, each containing a geometry/raster engine.

In terms of GPU basics, NVIDIA has geometry and raster superiority.

NVIDIA also has memory-compression superiority over AMD.

For the GPU role, TFLOPS are nothing without geometry/raster engines and ROP read/write units. Reminder for AMD: GPUs are not DSPs.
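A rough sense of what those ROP counts imply for peak pixel fillrate. The ROP counts are from the post; the boost clocks are assumed reference figures, not from the post:

```python
def fillrate_gpix(rops: int, clock_mhz: float) -> float:
    """Peak pixel fillrate in Gpixels/s: ROPs x clock."""
    return rops * clock_mhz / 1000.0

# Assumed reference boost clocks for illustration.
rtx_2080_ti = fillrate_gpix(88, 1545)  # assumed ~1545 MHz reference boost
radeon_vii = fillrate_gpix(64, 1750)   # assumed ~1750 MHz boost

print(f"RTX 2080 Ti: {rtx_2080_ti:.0f} Gpix/s, Radeon VII: {radeon_vii:.0f} Gpix/s")
```

Even with the VII's higher clock, the ROP deficit leaves it behind on raw fillrate, which is the post's "raster superiority" point.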


----------



## BorgOvermind (May 9, 2019)

VulkanBros said:


> Reminds me....
> View attachment 122424
> And is still running - just changed the thermal paste and the thermal paddings


I still have one too.
It was the most long-lasting card from a performance perspective, remaining among the top cards for many years.


----------



## rvalencia (May 9, 2019)

Super XP said:


> If NAVI (That is based on old 2011 GCN Design) ends up anywhere near the RTX 2070 for a $300 to $350 cost, would be an Nvidia embarrassment.
> 
> Can AMD give us one more crack at GCN before trashing it? We'll soon find out.










Vega 56 at 1710 MHz rivals or beats the RTX 2070, and so does Vega 64 at 1590 MHz.


----------



## medi01 (May 9, 2019)

jabbadap said:


> Jim Keller moved to Intel from Tesla.



Oh, wow.



Midland Dog said:


> 1: Pleasant surprise, 150w
> 2: Dissapointing for 7nm, 225w


#2 is most likely, given how easily AMD pushes power consumption up by a third for single-digit gains.




Midland Dog said:


> Good Ole GCN


It's both an instruction set that is 7 years old (CUDA is 11 years old, x86 is 40 years old) and a microarchitecture.
The former evolves; the latter can be completely different or mostly the same. Only AMD knows.



cucker tarlson said:


> Let's not pretend


Projections like that from "cartels are fine" camp are particularly appalling.


----------



## Assimilator (May 9, 2019)

rvalencia said:


> Vega 56 at 1710 Mhz rival or beat RTX 2070 and Vega 64 at 1590Mhz



In some games.
... compared to stock 2070
... consuming DOUBLE the power of overclocked RTX 2070
... using a procedure that is, to quote the guy in the video, "not recommended" because he has no idea what exposing the card to a couple hundred extra watts will do to it over time.

*slow clap*

I really wonder why RTG isn't hiring you guys for its engineering department.


----------



## medi01 (May 9, 2019)

rvalencia said:


> Vega 56 at 1710 Mhz rival or beat RTX 2070 and Vega 64 at 1590Mhz



Very impressive min frame rates. The V56 starts at 275 Euro at Mindfactory (e.g. the MSI custom-cooler one), but ouch at the power consumption.


----------



## Totally (May 9, 2019)

vega22 said:


> 290x beating the first Tiran at half the price?



He's gone Jim.



Midland Dog said:


> new game guess the tdp
> 1: Pleasant surprise, 150w
> 2: Dissapointing for 7nm, 225w
> 3: Good Ole GCN, 300w
> ...



1. Not happening; even NV cards don't touch this.
2. Hopefully they land under here. Not disappointing at all, because of diminishing returns on power savings when dropping nodes; if there is any saving, it will probably be eaten up by the increased transistor count and GDDR6.
3. Really hope not.


----------



## Casecutter (May 9, 2019)

Auer said:


> Hoping for #1



If you start with a 56 CU part (Vega 56), the TDP was 210W, though TDP is not the end-all be-all. If we look at power usage under gaming, a 1070 was better by 57% and a 1080 by 27%, while a 2070 is 17% better than a Vega 56.

Now, sure, the 2070 is higher performance, although if we postulate that this Navi might nip at the heels of a 2070, it comes down to performance per watt. Yes, we should see power savings moving to 7nm, but we know Navi at its heart is GCN architecture. AMD will be going more for clocks, but I don't think at the "damn the efficiency savings" extreme they went with on Radeon VII.

Someone can find the real numbers (I looked around), but if the mix is 15% higher clocks with a 15% saving in power (12nm vs. 7nm), what does that look like? Boost clock on the Vega 56 + 15% = ~1700 MHz (around what Gamers Nexus was needing). Then 15% lower power would put it at ~195W in normal gaming, the same as the 2070 Founders Edition. And if the Vega 56 TDP likewise drops 15%, that equates to ~178W TDP, right in line with the 175W TDP claimed for the 2070 FE. I think a range of 170-180W is plausible.

Working from that, there's still chip-architecture resource utilization, and improvements to resource management in Navi could bring that little extra shine. Consider that a determined team of AMD engineers has been tasked over the last 3 years to wring out GCN and implement GDDR6: is a 7-10% gain from architecture that much of a stretch goal?

I found this about CLN16FF+ (TSMC's most widely used FinFET process technology): CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).
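The clock/power estimate above, written out as a sketch. The 210W TDP and the two 15% figures are the post's numbers; the Vega 56 reference boost clock is my assumption:

```python
vega56_boost_mhz = 1471  # assumed reference boost clock for Vega 56
vega56_tdp_w = 210       # Vega 56 TDP from the post
clock_gain = 1.15        # assumed 7 nm frequency uplift (post's figure)
power_saving = 0.15      # assumed 7 nm power reduction (post's figure)

est_clock = vega56_boost_mhz * clock_gain
est_tdp = vega56_tdp_w * (1 - power_saving)
print(f"estimated boost: {est_clock:.0f} MHz, estimated TDP: {est_tdp:.0f} W")
```

That lands at roughly 1690 MHz and ~178W, matching the post's "~1700 MHz / 170-180W" range.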








TSMC Kicks Off Volume Production of 7nm Chips (www.anandtech.com)
				






rvalencia said:


> For GPU role


For a role as a "Gaming" GPU...


----------



## Vario (May 9, 2019)

Once again, I really dislike AMD's brand-numbering system, where they take the competitor's model series and one-up it. It just creates confusion for the customer.


----------



## rvalencia (May 10, 2019)

Assimilator said:


> In some games.
> ... compared to stock 2070
> ... consuming DOUBLE the power of overclocked RTX 2070
> ... using a procedure that is, to quote the guy in the video, "not recommended" because he has no idea what exposing the card to a couple hundred extra watts will do to it over time.
> ...


... Vega 56 is on the older 14 nm process tech, not 7 nm
... and with no undervolting










Real-time power consumption comparison between a stock VII and a Vega 64 LC at a +1700 MHz overclock with undervolt.

My point with Vega 56 at a 1710 MHz OC rivaling or beating the RTX 2070 is a gaming performance estimate, NOT a power consumption one, i.e. cut the VII down to a "VII 56" and attach GDDR6 memory chips to estimate the power consumption of the RX 3080 XT. Navi seems to have GCN-like raster performance behavior.




medi01 said:


> Very impressive min frame rates, V56 starts at 275 Euro at mindfactory (e.g. MSI custom cooler one), but ouch at power consumption.
> 
> View attachment 122606


AMD should release a cut-down VII with 56 CUs and 256-bit GDDR6 memory and offer competition to the RTX 2070. Navi has memory compression improvements.


----------



## Midland Dog (May 10, 2019)

rvalencia said:


> Vega 56 at 1710 Mhz rival or beat RTX 2070 and Vega 64 at 1590Mhz


but at over 300 watts, and with reg edits


----------



## medi01 (May 10, 2019)

rvalencia said:


> Vega 56 at 1710 Mhz OC rivaling or beating RTX 2070 is gaming performance


It's not really rivaling, it's outright winning; check the framerate stability. Even where the 2070 has a higher average, it fluctuates like crazy.


----------



## rvalencia (May 10, 2019)

Midland Dog said:


> but at over 300 watts, and with reg edits


Power consumption is not the point, i.e. it's a Vega with 56 CUs at 1710 MHz reaching the frame rate levels that the speculated 56-CU RX 3080 would hit relative to the RTX 2070.

Are you arguing the RX 3080 is built on the old 14 nm process tech?

1st-gen 7 nm process tech helps reduce power consumption, hence the RX 3080 is like a cut-down Vega II 56 with 256-bit GDDR6 memory.


----------



## Assimilator (May 10, 2019)

medi01 said:


> It's not really rivaling, but outright winning, check framerate stability, even where 2070 has bigger avg, it's fluctuating like crazy.





Midland Dog said:


> but at over 300 watts, and with reg edits


----------



## John Naylor (May 10, 2019)

vega22 said:


> 290x beating the first Titan at half the price?



This was the 1st instance of AMD very aggressively clocking cards out of the box. While the 290X was faster than the 780 out of the box, the teeny OC headroom left it unable to compete with the 780 ... with both cards overclocked, it was all 780 ... even Linus figured that out.

See 8:40 mark









It gets worse under water...4:30 mark









Aside from these pre-release announcements never living up to the hype ever since AMD's 2xx series, there's one thing about this announcement that gives me great pause: the name. If you want to distinguish your product from the competition because you have a better one, Marketing 101 teaches you to "distinguish your product". The "RX 3080 XT" ... they copied the RX, they went from 2 to 3 and from 70 to 80, and threw in an XT for "extra", I guess. We saw the same thing with mobos mimicking the Intel naming conventions, switching Z to an X. When you mimic the competition, it says "I wanna make mine sound like theirs, so buyers will see RX 3 next to their RTX 2, note that 80 is bigger than 70, and infer that it's like theirs but newer, bigger, badder, faster." That was NVIDIA's whole goal with the partnering idea ... "we will loosen up restrictions on our cards if you agree to lock down the naming so this type of thing won't cut into our sales." Regardless of what the new card line actually does, I wish they'd stake out their own naming conventions.

I do hope that AMD can actually deliver on this kind of performance ... But if they're gonna push the value claim, let's do apples to apples for a change. Right now the 2060 is faster for 100 watts less ... 100 watts at 30 hours a week costs me $44.20 a year. If the new RX 3080 XT draws 100 watts more ... from a cost PoV ...

+100 watts would add +$20 to PSU cost (Focus Gold Plus)
+100 watts would warrant an extra $15 case fan
+$44.20 a year over 4 years is $176.80 ... $211.80 total ... I'd rather pay the extra $170 for the 2070.

Now, my cost for electricity is way higher than most folks' in the USA, comparable to many European countries and a lot cheaper than some of them. I pay 24 cents per kWh versus the average US price of about $0.11 ... for those folks the cost would be $81.03 over 4 years.
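As a sanity check, the cost reasoning above is just watts × hours × rate; a minimal sketch (note that at the stated 30 h/week and $0.24/kWh the formula gives closer to $37/yr, so the $44.20 figure implies a few more gaming hours):

```python
def annual_cost_usd(extra_watts, hours_per_week, usd_per_kwh):
    # kWh per year = kW * hours/week * 52 weeks
    return extra_watts / 1000 * hours_per_week * 52 * usd_per_kwh

# 100 W extra, 30 h/week of gaming
print(round(annual_cost_usd(100, 30, 0.24), 2))  # 37.44 at $0.24/kWh
print(round(annual_cost_usd(100, 30, 0.11), 2))  # 17.16 at the ~$0.11 US average
```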

The reality is that most folks won't consider electric cost, and if that's the case, the "value" argument is no longer apples to apples. Many live in apartments where it's in the rent, some live at their parents' house ... but if you're gonna make the "well, it's not as fast but it's the best value" claim, it isn't valid without including all associated costs. Those would be mine; others may not mind the extra heat and the extra load/inefficiency on the PSU, but whatever they are in each instance, all impacts should be considered.

Now with "apples to apples" having been considered, I would very much welcome a card that was comparable in performance, comparable in power usage, and comparable in sound and heat generated ... but in each instance I'm only interested in comparisons with both cards at max overclock. I hope against hope that AMD can deliver one, but I'm wary of pre-release fanfare that consistently fails to deliver. I hope that this time they can manage to put out something that fulfills the promise, but I'm wary of following pre-release news for 6 months only to be disappointed.


----------



## Casecutter (May 10, 2019)

I couldn't follow all that...


John Naylor said:


> Right now the 2060 is faster for 100 watts less


But using a PSU calculator I delved into what you discussed. Working from an 8700K (not OC'd) and a normal system: 2x8 GB, SSD/HDD, combo drive, WLAN card, a nice H100i AIO cooler, some 120 mm fans, all on a Gold+ PSU. 8 hr/day gaming; 24¢ (US) per kWh. Yeah, that's silly high; here in So. Cal. we're at 14¢.
The 2060 (500 W recommended) would use $361 a year; stepping up to a 2080 (550 W) = $404. That's $43/yr, or about $3.50 per month.
Same system with a Vega 56 (550 W) = $400; a Vega 64 = $467. Even at $63 a year more, if the V64 is $420 and the 2080 is $700, it takes over 4 years of that $63 difference before the 2080 pays for itself.
That's all at 40 hr/wk. If you're gaming at that level, it's really a job!
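The payback arithmetic in that last comparison can be sketched; the card prices and the $63/yr power-cost delta are the figures from the post:

```python
def payback_years(cheap_card_usd, pricey_card_usd, yearly_power_savings_usd):
    # years for the pricier-but-thriftier card's power savings
    # to absorb its purchase-price premium
    return (pricey_card_usd - cheap_card_usd) / yearly_power_savings_usd

# Vega 64 at $420 vs RTX 2080 at $700, with a $63/yr power-cost difference
print(round(payback_years(420, 700, 63), 1))  # ~4.4 years
```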








Power Supply Calculator - PSU Calculator | OuterVision (outervision.com)
Select computer parts and the online PSU calculator will calculate the required power supply wattage and amperage for your PC.
				






John Naylor said:


> I'm weary of pre-release fanfare that consistenty fails to deliver. I hop that this time they can manage to out out something that fullfills the promise, but weary of followinmg pre-release news for 6 months only to be disappointed.


Wow, you understand this isn't AMD/RGT, right? It's purely a mountain of salt. Why do you listen to folks who project the "utmost" and then get disappointed? There's a malfunction somewhere, and it's not just this rumor crap.


----------



## Midland Dog (May 11, 2019)

rvalencia said:


> Power consumption is not the point i.e. it's Vega with 56 CU  at 1710 Mhz reaching frame rate performance levels like the speculated RX 3080 with 56 CU relative to RTX 2070.
> 
> Are you arguing RX 3080 is built on the old 14 nm process tech?
> 
> 1st gen 7nm process tech helps reduce the power consumption, hence RX 3080 is like cut-down Vega II 56 with 256 bit GDDR6 memory.


GCN is always bandwidth-starved; 7 nm or not, it's GDDR, not HBM, and AMD will struggle to do any better.


----------



## EarthDog (May 11, 2019)

Midland Dog said:


> GCN is always bandwidth starved regardless of being 7nm its gddr not hbm, amd will struggle to get it any better


HBM isn't a difference maker unless it's high res... and even then GDDR6 is plenty. HBM really hasn't played out well yet, IMO.


----------



## Manoa (May 11, 2019)

Is the card going to be "7 nm" DUV or EUV?


----------



## HenrySomeone (May 11, 2019)

So, if past hyping is anything to go by (Crapeon 7 billed as 2080-level perf and only ending up around an overclocked 2070), and considering that the 2060 and 2070 are closer together than the 2070 and 2080, this new turdeon will be below the 2060 while consuming over 200 W, lmao!


----------



## Midland Dog (May 11, 2019)

EarthDog said:


> Hbm isnt a difference maker unless it's high res.. and even then gddr6 is plenty. HBM really hasnt played out well yet IMO.


I'm here to tell you it ain't: GTX 960 vs R9 380, GTX 1060 vs RX 480, GTX 1080 vs Vega 64, RTX 2080 vs Radeon VII. The proof is there; GCN doesn't scale with teraflops and hits a bandwidth wall quickly.

The biggest bottleneck in any and every GPU is bandwidth.


----------



## Manoa (May 11, 2019)

It's true, Henry, AMD has made plenty of dumb decisions :x
If it's DUV, it will be like you and many others say,
but if it's EUV, there's a chance it may be good.


----------



## EarthDog (May 11, 2019)

Midland Dog said:


> im here to tell you it aint, gtx 960 vs r9 380, gtx 1060 vs rx 480, gtx 1080 vs vega 64, gtx 2080 vs vega 7. proof is there, GCN doesnt scale with teraflops, and hits a bandwidth wall quickly
> 
> the biggest bottleneck in any and every gpu is bandwidth



And if you look at benchmarks, HBM-enabled cards don't catch up until high res. That's really not a good thing when most game at 1080p or 2560x1440, and half the cards you listed aren't more than 1080p or 2560x1440 cards anyway. They only catch up at higher res; they would have been better served using GDDR5.


----------



## RichF (May 13, 2019)

spnidel said:


> isn't the rtx 2070 pretty much a 1080 performance-wise? if so, then god damn, AMD... node shrink, and you still can't beat a 1080 ti with something that isn't HBM memory with power consumption that isn't trash. sad.


AMD seems to be interested in tiny dies these days to maximize profits.


Manoa said:


> this sounds like the bulldozer improves to excavator xD


Excavator didn't have enough cores nor cache nor clockspeed potential (due to low-grade 28nm process) to impress. It was designed to be cheap to produce. At the very least we're looking at a process improvement. The 28nm bulk Excavator used was actually inferior to GF 32nm SOI in terms of high performance.

AMD didn't develop its Bulldozer architecture the way it could have, had it chosen to go for high performance. We have no idea what a Keller-level talent could have done with it, let alone what more ordinary engineers could have done had AMD chosen to upgrade from Piledriver with a high-performance node (e.g. 22nm IBM or even 32nm GF) successor designed with things that were missing from Piledriver, like better microop caching, more capable individual cores, better AVX performance (e.g. fixing the regression from Bulldozer) and AVX-2 support, and L3 cache with decent performance. I have also heard anecdotally that Linux runs Piledriver much more efficiently than Windows when tuned for the architecture, so there may still be a Windows performance obstacle that could have been overcome.

People praised SMT and condemned CMT but we've seen enough examples recently of Intel not even enabling SMT in CPUs that offer good performance. I think it's therefore dubious to assume that SMT is needed for high performance, making the _SMT is vastly superior to CMT_ argument questionable. I wonder if it's possible/worthwhile to do the opposite of what AMD did and have two FPU units for every integer unit.

One of the worst things about Bulldozer is that we'll never know what the architecture could have been had it been developed more effectively. It should have never been released in its original state ("Bulldozer") and Piledriver wasn't enough of an improvement either. 8 core consumer CPUs were also premature considering the primitiveness of Windows and most software.


----------



## HwGeek (May 14, 2019)

Looks like until Navi comes out, the mining craze will be back; miners would love the new 7 nm parts :-(.


----------



## rvalencia (May 14, 2019)

Midland Dog said:


> GCN is always bandwidth starved regardless of being 7nm its gddr not hbm, amd will struggle to get it any better


256-bit GDDR6-14000 would yield about *448 GB/s* of memory bandwidth.

Vega 56 has *410 GB/s* of memory bandwidth.
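Both figures follow from bus width × per-pin data rate; a quick sketch (the 2048-bit bus and 1.6 Gbps per-pin rate for Vega 56's HBM2 are reference-spec assumptions, not from the thread):

```python
def mem_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    # GB/s = (bus width / 8 bits per byte) * per-pin data rate in Gbps
    return bus_width_bits / 8 * gbps_per_pin

print(mem_bandwidth_gb_s(256, 14))    # 448.0 -> rumored 256-bit GDDR6-14000
print(mem_bandwidth_gb_s(2048, 1.6))  # 409.6 -> Vega 56's HBM2, the ~410 GB/s figure
```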

A Vega 64 LC, overclocked and undervolted at ~1750 MHz, yields results similar to the VII despite the VII having 2X the Vega 64 LC's memory bandwidth.



https://www.reddit.com/r/Amd/comments/9du2w4
NAVI has memory compression improvements.

Fact remains, the RTX 2080 Ti has 88 ROPs and six GPC blocks (each GPC with its own raster engine), a superiority over the VII's 64 ROPs and four raster engines.

TFLOPS is nothing without raster engines and ROPs (the graphics read/write units). Note why AMD is pushing the compute shader path, i.e. using TMUs as read/write units.


----------



## Midland Dog (May 15, 2019)

rvalencia said:


> 256bit GDDR6-14000  would yield about *448 GB/s* memory bandwidth.
> 
> Vega 56 has * 410 GB/s* memory bandwidth.
> 
> ...


I'll stay skeptical until release; AMD hype has always fallen short of the truth (at least since Fiji tried to take on the Titan X but couldn't even beat the 980 Ti).


----------



## vega22 (May 15, 2019)

John Naylor said:


> This was the 1st instance of AMD very aggressively clocking cards in the box.   While the 290x was faster then the 780 out of the box, the teeny OC headroom left it unable to compete with the 780 ....  with both cards overclocked... it was all 780 ... even Linus figured that out.
> 
> See 8:40 mark
> 
> ...



You used lots of words to say very little there, dude.

You compared cards which are not the ones I mentioned, and then went on to waffle about things which are less relevant to most.


----------



## Super XP (May 20, 2019)

RichF said:


> AMD seems to be interested in tiny dies these days to maximize profits.
> 
> Excavator didn't have enough cores nor cache nor clockspeed potential (due to low-grade 28nm process) to impress. It was designed to be cheap to produce. At the very least we're looking at a process improvement. The 28nm bulk Excavator used was actually inferior to GF 32nm SOI in terms of high performance.
> 
> ...


I agree. Bulldozer was a major issue because AMD relied more on automation for the core design of that interesting CPU. In the past, AMD's CPU architects were a lot more intimate with their designs, such as the Athlon & Athlon 64 for example. Several years before Bulldozer was designed & launched, there were internal struggles & changes in AMD's upper management, which ultimately allowed "a Bulldozer-type decision". Of course, most of what I just said is from memory, but I remember reading multiple articles about this. I won't put the entire blame on Rory Read, as he became CEO when Bulldozer had just launched; CEO Dirk Meyer was a computer engineer and was the decision maker on Bulldozer. And after Lisa Su was appointed CEO (again, she's an electrical engineer), things turned for the better. Bulldozer failed on Rory Read's watch, but it did not SINK the company. Lisa Su was quick to hire Jim Keller to start the Zen project. And so on, bla bla bla, all from memory lol

Piledriver was a much more efficient version of Bulldozer, which did significantly increase the overall performance. AMD had no choice but to do this, at least for the Desktop Gaming segment.

Bulldozer -> Piledriver -> Steamroller -> Excavator -> ZEN -> ZEN+ -> ZEN 2...

EDITED. 
*I got my CEO's confused and made corrections.  *


----------



## steen (May 22, 2019)

efikkan said:


> The problem is not ROP performance, it's management of resources.
> GCN have changed very little over the years, while Kepler -> Maxwell -> Pascal -> Turing have continued to advance and achieve more performance per core and GFlop, to the point where they have about twice the performance per watt and 30-50% more performance per GFlop.



Sorry, I missed this earlier.

Where did you see me mentioning RBE/ROP performance? Fermi was performant not simply due to the GS yielding 50%+ perf/clk, but due to the follow-on uarch benefits of the polymorph engines allowing decoupling of the front end, resulting in far greater extraction of parallelism. This gave better utilization and fewer bubbles/stalls in the pipeline. GF's silicon implementation didn't match the expected RTL, but each iteration since has led to improvements.



> More is usually better, except when it comes at a great cost.



Does that also extend to die area? 



> 16 GB of 1 TB/s HBM2 is just pointless for gaming purposes. AMD could have used 8 or even 12 GB, and priced it lower.



It's a repurposed MI50, whattayagonnado? As a low-volume gaming SKU, it's probably the bottom-of-the-barrel 7 nm working chips that might be marginal under thermal load. The cost saving from packaging it as a lower frame-buffer/bandwidth SKU would be marginal, and the full spec can be exploited by marketing vs. the competition.



rvalencia said:


> Facts remains RTX 2080 Ti has 88 ROPS with six GPC blocks (with each  GPC has at least a raster engine) superiority over VII's 64 ROPS and four raster engines.



There's a simple metric really: TU102's 18b transistors outperform Vega 20's 13b transistors because the silicon is deployed in a much better uarch, e.g. Vega's 3.3 TFLOPs of FP64 is no benefit to gamers.



> TFLOPS is nothing without raster engines and ROPS (graphics read/write units). Note why AMD is pushing for compute shader path i.e. using TMUs for read/write units



The traditional GS/HS/DS geometry stages may well be deprecated in favor of more flexible & performant primitive/mesh shaders, but don't conflate GF->TU with GCN 1->9. It's not just the ROPs/TMUs in NV's favour; it's the decoupling of the front end and the ability to extract much more parallelism that allows higher utilization from lower peak FLOPs. We also need to consider better bandwidth utilization, data reuse (register/cache), etc.


----------



## medi01 (May 27, 2019)

Assimilator said:


> They've never undercut their competitor by such a significant amount













Assimilator said:


> unless they want to go all-out on trying to regain marketshare.


Ah, ok then.


----------



## Mephis (Jun 11, 2019)

Nope. Maybe btarunr wants to start thinking about not writing headlines that declare leaks as if they are factual. Just a thought.


----------



## HenrySomeone (Jun 20, 2019)

Hehe, well the prices will probably crash to those kinds of levels soon enough anyway, provided that they actually want to sell any, lol


----------



## Assimilator (Jun 21, 2019)

Mephis said:


> Nope. Maybe btarunr wants to start thinking about not writing headlines that declare leaks as if they are factual. Just a thought.



Yeah, good luck with that.


----------



## lexluthermiester (Jun 21, 2019)

Mephis said:


> Nope. Maybe btarunr wants to start thinking about not writing headlines that declare leaks as if they are factual. Just a thought.


All he did was report something interesting that was discovered.


----------

