# AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs



## btarunr (Oct 9, 2020)

In its October 8 online launch event for the Ryzen 5000 "Zen 3" processors, AMD provided a teaser of the company's next flagship graphics card, slotted in the Radeon RX 6000 series. This particular SKU has been referred to by company CEO Lisa Su as "Big Navi," meaning it could be the top part of AMD's upcoming client GPU lineup. As part of the teaser, Su held up the reference-design card and provided three performance numbers for the card as tested on a machine powered by a Ryzen 9 5900X "Zen 3" processor. We compared these performance numbers, obtained at 4K UHD, with our own testing data for the games to see how the card compares to other current-gen cards in its class. Our testing data for one of the games is from the latest RTX 30-series reviews; find details of our test bed here. We obviously have a different CPU, since the 5900X is unreleased, but we use the highest presets in our testing.

With "Borderlands 3" at 4K, using the "badass" performance preset and DirectX 12 renderer, AMD claims a frame rate of 61 FPS. We tested the game with its DirectX 12 renderer in our dedicated performance review (test bed details here). AMD's claimed performance ends up 45.9 percent higher than that of the GeForce RTX 2080 Ti as tested by us, which yields 41.8 FPS on our test bed. The RTX 3080 ends up 15.24 percent faster than Big Navi, with 70.3 FPS. It's important to note here that AMD may be using a different/lighter test scene than us, since we don't use games' internal benchmark tools and design our own test scenes. It's also important to note that we tested Borderlands 3 with DirectX 12 only in the game's launch-day review, and use the DirectX 11 renderer in our regular VGA reviews.
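For reference, the relative-performance figures above are simple FPS ratios. A quick sketch (in Python, using the numbers quoted in this article) of how the deltas are computed:

```python
def pct_faster(a: float, b: float) -> float:
    """How much faster, in percent, a score of `a` FPS is than `b` FPS."""
    return (a / b - 1.0) * 100.0

big_navi = 61.0     # AMD's claimed Borderlands 3 4K figure
rtx_2080_ti = 41.8  # TPU's DX12 launch-review result
rtx_3080 = 70.3     # TPU's result

print(f"Big Navi vs RTX 2080 Ti: +{pct_faster(big_navi, rtx_2080_ti):.1f}%")  # +45.9%
print(f"RTX 3080 vs Big Navi:   +{pct_faster(rtx_3080, big_navi):.1f}%")      # +15.2%
```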

With Gears 5, AMD claims 73 FPS at 4K with the DirectX 12 renderer and the "Ultra" preset. This number ends up 16.24 percent higher than that of the RTX 2080 Ti, which scores 62.8 FPS on our test bed. The RTX 3080 is 15.61 percent faster than the AMD card, at 84.4 FPS.

Call of Duty: Modern Warfare was never added to our VGA review test selection, but we tested the game separately at launch (find its test bed information here). In that testing, we found the RTX 2080 Ti to score 77.9 FPS at Ultra settings with RTX off. In comparison, AMD claims Big Navi scores 88 FPS, making it 12.96 percent faster.

We know this is a very coarse and unscientific way to compare AMD's numbers to ours, and AMD has probably cherry-picked games that are most optimized for its GPUs, but it lends plausibility to the theory that Big Navi may end up comparable to the RTX 2080 Ti and trade blows with the upcoming RTX 3070, which NVIDIA claims outperforms the RTX 2080 Ti. The RX 6000 "Big Navi" ends up with a geometric mean of 21% higher frame rates than the RTX 2080 Ti in these three tests, which would imply almost double the performance of the RX 5700 XT. Big Navi is rumored to feature double the CUs of the RX 5700 XT, so the claims somewhat line up.
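The geometric mean above is the cube root of the product of the three per-game ratios. A small sketch using the figures quoted in this article; note that recomputing from these three ratios gives roughly 24%, so the quoted 21% presumably reflects additional rounding or a slightly different baseline:

```python
import math

# (AMD's claimed FPS, TPU's RTX 2080 Ti FPS) for the three games
results = [
    (61.0, 41.8),  # Borderlands 3, 4K DX12
    (73.0, 62.8),  # Gears 5, 4K DX12
    (88.0, 77.9),  # CoD: Modern Warfare, 4K
]

ratios = [claimed / tested for claimed, tested in results]
geomean = math.prod(ratios) ** (1.0 / len(ratios))
print(f"Geometric mean advantage over RTX 2080 Ti: +{(geomean - 1) * 100:.0f}%")  # +24%
```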

*View at TechPowerUp Main Site*


----------



## jesdals (Oct 9, 2020)

I could live with that if the price is below the 3080's.


----------



## TheLostSwede (Oct 9, 2020)

But which SKU did they show off? Everyone assumes it's the top tier card...


----------



## puma99dk| (Oct 9, 2020)

TheLostSwede said:


> But which SKU did they show off? Everyone assumes it's the top tier card...



That's possible and what most people believe.

We don't believe it to be a regular RX 6900, but rather the XT or XTX (the last one AMD reference only); if it is a regular RX 6900, though, that would be wow.


----------



## R0H1T (Oct 9, 2020)

TheLostSwede said:


> But which SKU did they show off? Everyone assumes it's the top tier card...


Do you know something that we don't, any insider scoops?


----------



## kayjay010101 (Oct 9, 2020)

Yeah... as you say yourself, comparing between two different benchmark passes means nothing. You'd have to have the least amount of variables possible, including using the exact same benchmark pass. It seemed like they used the internal benchmark in their presentation, so comparing to your own custom route in the game isn't a valid form of comparison.

And this might not be the top SKU. A lot of other news publications have stated AMD stressed to them that they didn't tell which SKU this was, which to me seems like they're trying to imply it's not the top dog.


----------



## Anymal (Oct 9, 2020)

puma99dk| said:


> That's possible and what most people believe.
> 
> We don't believe it to be like a RX 6900 but the XT or XTX (Last one AMD ref only) but if it's a regular RX 6900 it would be wow


Maybe 6700xt ?


----------



## TheLostSwede (Oct 9, 2020)

puma99dk| said:


> That's possible and what most people believe.
> 
> We don't believe it to be like a RX 6900 but the XT or XTX (Last one AMD ref only) but if it's a regular RX 6900 it would be wow


Personally I think it would be a bad move by AMD to show its full hand this early, as it potentially allows Nvidia to make adjustments. I guess it might be too late for that though, but who knows. Even so, all we can do for now is speculate. That said, I doubt AMD will have a product that beats the RTX 3080, but I could be wrong.


----------



## PerfectWave (Oct 9, 2020)

"AMD has probably cherry-picked games that are most optimized to its GPUs "
I don't think the engine is optimized for AMD; instead it is for Nvidia...


----------



## puma99dk| (Oct 9, 2020)

PerfectWave said:


> "AMD has probably cherry-picked games that are most optimized to its GPUs "
> i dont think the engine is optimized for AMD instead it is for Nvidia...



Every company cherry-picks games and benchmarks; that's nothing new...



TheLostSwede said:


> Personally I think it would be a bad move by AMD to show its full hand this early, as it potentially allows Nvidia to make adjustments. I guess it might be too late for that though, but who knows.



Well, AMD doesn't really compete with the RTX 3090, and I don't believe they will, because the majority of people using a graphics card are still at 1920x1080. Why not make something that crushes this resolution instead of focusing on 4K@120fps+ and 8K@60Hz? The majority of people in 2020, and probably 2021, don't game at 4K or 8K yet.



Anymal said:


> Maybe 6700xt ?



I haven't heard much about this card; I've only been focusing on the RX 6900 series because that's probably the card to replace my RX 5700 XT, since the RTX 3080 20GB sadly won't be in stock until 2021, according to Nvidia.


----------



## Chrispy_ (Oct 9, 2020)

If that's the 80CU variant I'd be surprised since that implies lower IPC for RDNA2 than RDNA1, and AMD have been touting their IPC gains as a big deal for RDNA2.

I would imagine that's a downclocked 72CU variant with cheaper GDDR6, perhaps it's the 12GB variant that's effectively double the 5600XT.

I honestly don't care what card it is as long as AMD can move the price/performance curve forwards and beat whatever Nvidia are offering, and if they can do it at a lower power draw (not exactly the hardest challenge given Nvidia's shocking TDP this generation), then that's just the cherry on the cake.


----------



## Erazor6000 (Oct 9, 2020)

Finally, they made a better graphics card than my RTX 2080.

*Now* AMD have both excellent CPUs and (probably) GPUs when compared to Intel and Nvidia.
...NOT with Ryzen 3000, which was nothing special if you are into gaming only.
For me, the RX 5700 XT was nothing to consider, especially because it was announced as late as Q3 2019 and wasn't even better than the RTX 2080 (non-Super).


----------



## puma99dk| (Oct 9, 2020)

Erazor6000 said:


> Finally, they made a better graphic card that my RTX 2080.
> 
> *Now *AMD have both excellent CPUs and (probably) GPUs when compared to Intel and nVidia.
> ...NOT with Ryzen 3000, which were nothing special if you are into gaming only.
> For me, RX 5700 XT was nothing to consider, especially because it was announced as late as Q3 2019, and it wasn't even better than RTX 2080 (non super).



You cannot compare the 2 cards because AMD made the RX 5700 series for 1080p.


----------



## john_ (Oct 9, 2020)

I guess the dilemma will be like this:

3080: Faster, Raytracing performance, DLSS, CUDA, (theory or fact you decide) better drivers

Big Navi: Lower price, 16GB of VRAM, FreeSync (if I am not mistaken NOT all FreeSync monitors are supported by Nvidia).

PS. Su called the card "Big Navi," so probably the top model. People gave that nickname to the big model, not some second mid-range model.


----------



## docnorth (Oct 9, 2020)

puma99dk| said:


> That's possible and what most people believe.
> 
> We don't believe it to be like a RX 6900 but the XT or XTX (Last one AMD ref only) but if it's a regular RX 6900 it would be wow


It is rumored that Big Navi will at some point get a memory bandwidth upgrade, probably GDDR6X. Some reasons for launching with GDDR6 might be:
1) Limited availability of GDDR6X
2) Cost
3) No time or resources to develop the memory controller. GDDR6X uses PAM4, a four-state signaling format instead of binary coding, and needs a different memory controller.
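To illustrate the PAM4 point: four voltage levels carry two bits per symbol, versus one bit per symbol for binary (NRZ) signaling, which is why a GDDR6X memory controller and PHY differ from GDDR6's. A toy sketch follows; the level mapping below is a plain illustrative assignment, not Micron's actual GDDR6X line coding:

```python
# Four-level PAM4 symbols carry 2 bits each; NRZ carries 1 bit per symbol.
PAM4_LEVELS = {0b00: -3, 0b01: -1, 0b10: +1, 0b11: +3}

def to_pam4(data: bytes) -> list:
    """Encode a byte stream as PAM4 symbol levels, MSB-first."""
    return [PAM4_LEVELS[(byte >> shift) & 0b11]
            for byte in data for shift in (6, 4, 2, 0)]

def to_nrz(data: bytes) -> list:
    """Encode the same stream as binary (NRZ) levels, MSB-first."""
    return [(byte >> shift) & 1
            for byte in data for shift in range(7, -1, -1)]

payload = b"\xa5"           # 0b10100101
print(to_pam4(payload))     # [1, 1, -1, -1]: half as many symbols
print(len(to_nrz(payload))) # 8
```

Same payload, half the symbols on the wire, which is how GDDR6X raises per-pin bandwidth without doubling the clock.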


----------



## demu (Oct 9, 2020)

Please check your RTX 3080 Borderlands 3 results.
You are saying that the RTX 3080 FE's score (70.3 FPS) was tested using DX12:
_We tested the game with its DirectX 12 renderer in our dedicated performance review (test bed details here)._

However, in the RTX 3080 FE review you state that due to instability problems, you tested Borderlands 3 with DX11:

_Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and Direct X 11 with some AMD-specific features, such as FidelityFX. In our testing, *we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability.* We will switch to the new renderer once these issues are ironed out. _









NVIDIA GeForce RTX 3080 Founders Edition Review - Must-Have for 4K Gamers (www.techpowerup.com)


----------



## Erazor6000 (Oct 9, 2020)

puma99dk| said:


> You cannot compare the 2 cards because AMD made the RX 5700 series for 1080p.


I am aware of that.
My point is that we will have a choice between GeForce and Radeon *for high-end gaming* at 2K/4K in the years to come.


----------



## ratirt (Oct 9, 2020)

I think these games were not cherry-picked with the aim of showing the best performance. If that were the case, I'm sure AMD would have picked different ones, more like Battlefield V. Also, the assumption is that the tests were performed with the top-model card, but "Big Navi" doesn't necessarily mean the top tier, just the new 6000 series.
If I were AMD, I wouldn't use the top-tier card for that showcase. NV did something similar with Ampere. Although it might have been AMD's top card. I guess in a month we will all know what sort of performance the 6000-series RDNA2 Navi will bring.


----------



## JAB Creations (Oct 9, 2020)

ratirt said:


> I think these games were not cherry picked in the means of showing the best performance. If that was the case I'm sure AMD would have picked different, more like Battlefield V. Also the assumption is that the tests were performed with the top model card but "big navi" not necessarily mean top tier but the new 6000 series.
> If I were AMD, i wouldn't use top tier card for that showcase. NV did similar with Ampere. Although it might have been top AMD's card. I guess, in a month, we will all know what sort of performance the 6000 RDNA2 navi will bring.


Are you replying to me? That is a Facebook thing; leave confusing responses there, please.

I'm saying TPU cherry-picked. Gamers Nexus said that AMD was very honest in the benchmarks they picked. I'm saying whoever wrote this article clearly has a bias, as they presented a sneak peek as the be-all and end-all.


----------



## DuxCro (Oct 9, 2020)

PerfectWave said:


> "AMD has probably cherry-picked games that are most optimized to its GPUs "


Borderlands 3 was used for that very early performance demonstration of the RTX 3080 by Digital Foundry, sponsored by Nvidia. So the game is just cherries then, since both companies used it for a performance demonstration.


----------



## ratirt (Oct 9, 2020)

JAB Creations said:


> Are you replying to me? That is a Facebook thing, leave confusing responses there please.
> 
> I'm saying TPU cherry picked. Gamers Nexus said that AMD was very honest in the benchmarks they picked. I'm saying whoever wrote this article clearly has a bias as they presented a sneak peak as the end-all-be-all.


Why do you assume I replied to your post? Did I quote anything you've posted? It's not just you posting here; I replied in general after reading other people's comments.


----------



## Vya Domus (Oct 9, 2020)

john_ said:


> PS. Su called the card "Big Navi", so probably the top model. People gave that nick name to the big model, not some second mid range model.



People gave that name to the chip itself, not to a card, as in the largest die. You really think they'll have just one product based on that chip? I don't; that has never happened.



ratirt said:


> I think these games were not cherry picked in the means of showing the best performance.



Indeed, notably Borderlands 3 was one of the main games shown on the RTX 3080. They didn't choose it by mistake; it's a direct "confrontation" of sorts.


----------



## Bytales (Oct 9, 2020)

Anymal said:


> Maybe 6700xt ?


Nah, the benched numbers are for the entry-level Navi 6300XT with 64-bit 16GB GDDR7x.
Nvidia is faqed!


----------



## john_ (Oct 9, 2020)

Vya Domus said:


> *You really think they'll just have 1 product based on that chip ?*



Oh come on. Don't put words in my mouth that I never said. And especially don't put words in my mouth that in a way call me a stupid noob who bought his first PC yesterday.
Damn, I am not even going to answer that BS question. I know that your average posts are much better than this.


----------



## cueman (Oct 9, 2020)

Naah, I say the difference is about 20-25% versus the RTX 3080.

AMD used its new Ryzen 5900 CPU, which helps a lot; it would also help the RTX 3080, and even more the RTX 3090.

20% faster than the RTX 2080 Ti is the max it can reach... I wonder what Big Navi's TDP is at those speeds... surely over 300W.

Well, let's see, but that's how the real world is...


----------



## Bytales (Oct 9, 2020)

Bytales said:


> Nah, the benched numbers are for the entry-level Navi 6300XT with 64-bit 16GB GDDR7x.
> Nvidia is faqed!



The 6500 series uses 128-bit memory, still 16GB, because hey, the memory is so cheap and the performance is so big that it was worth it to give the entry-level cards 16GB of video RAM.

The 6700 uses 256-bit memory with 32GB, the 6800 series uses 384-bit memory with 48GB, and the 6900 series uses 512-bit memory with 64GB, with a special "Lisa Su edition" XTX version with 8192-bit HBM3 memory, double the bandwidth of the normal 6900 Navi, and 128GB of video memory.

Nvidia can file for bankruptcy!


----------



## nguyen (Oct 9, 2020)

AMD CPU Slides: woohoo we are number one
AMD GPU slides: yeahhhh, here are our FPS, take it or leave it


----------



## Betty (Kung Pow) (Oct 9, 2020)

Did you apply any percentage reduction, given the AMD numbers were obtained on the 5900X, which might be some percent up from the TPU test bench?


----------



## r.h.p (Oct 9, 2020)

Not sure whether to call it Red vs. Blue and Green now...


----------



## AnarchoPrimitiv (Oct 9, 2020)

I'm a fan of Moore's Law Is Dead (if you're going to be critical of him, provide evidence of why he's wrong and abstain from ad hominem attacks; otherwise it's completely irrelevant). Based on his sources and other things, he thinks the numbers Lisa Su shared might be from the second-to-top card, as his sources, which have been proven correct again and again, have said that "AMD is going all out" and that the final clocks on the fastest SKU haven't been decided yet.


----------



## Mats (Oct 9, 2020)

JAB Creations said:


> Lame, no mention of TPU's test rig and then they claim AMD cherry picked?


Which part of *"a very coarse and unscientific way to compare AMD's numbers to ours"* don't you understand? Are you even able to click the link in the OP?



> Our testing data for one of the games is from the latest RTX 30-series reviews, find details of our test bed here.


----------



## Dristun (Oct 9, 2020)

puma99dk| said:


> You cannot compare the 2 cards because AMD made the RX 5700 series for 1080p.


what? They didn't tout it as such - in _their own slides._


----------



## Turmania (Oct 9, 2020)

From the looks of it, it will contest the RTX 3070, not the 3080. Basically a 2080 Ti competitor. Obviously we know very little about it, power consumption etc., but Lisa said it is Big Navi, therefore it is safe to assume it will be their leading card. Unless they come up with an "Uber Navi" etc.


----------



## Mats (Oct 9, 2020)

Turmania said:


> Obviously we know very little about it and power consumption etc... but Lisa said it is big navi, therefore it is safe to assume that it will be its leading card.


I dunno, they focused on CPUs and mostly talked about the 5900X; the top model only got a few minutes at the end. They could have picked the graphics card to preview in a similar way.
I don't expect AMD's fastest card to run faster than a 3080; all I'm saying is we just don't know what they showed us.


----------



## fynxer (Oct 9, 2020)

*Lisa clearly said Big Navi optimization was done with the Ryzen 5000 series.* Because Ryzen 5000 is faster, it will yield more performance in games for the 3080 too, so you cannot make a direct comparison like that.

You need to do benchmarks with the 3080 and Ryzen 5000; it should add another 5-10% to the 3080.

Same thing with the 2080 Ti: if you test the 2080 Ti with Ryzen 5000, you should get values closer to Big Navi, so Big Navi will be competing with the 3070 or 3070 Ti.

*FORGET THAT IT WILL COMPETE WITH THE 3080, IT WON'T HAPPEN.*

Nvidia will also optimize their drivers and unlock another 10-15% performance; they've done this many times in the past.

*I have long suspected Nvidia by default locks away a certain % of performance in their drivers that they can access in case of emergency.* Too many times in the past, Nvidia just magically found 5-15% performance through a so-called "driver optimization" just when they needed it most to compete with AMD/ATI.


----------



## Vya Domus (Oct 9, 2020)

john_ said:


> Oh come on. Don't put words in my mouth that I never said. And especially don't put words in my mouth that in a way call me a stupid noob that bought his first PC yesterday.
> Damn, I am not even going to answer that BS question. I know that your average posts are much better than this.



I didn't put any words in your mouth, you gotta chill. You said this is "probably the top model" because she mentioned Big Navi, and I pointed out that "Big Navi" was never thought of as a specific model but rather simply the silicon. You basically phrased it in a way that implies Big Navi could only mean the top-end card, and I don't think so; there are going to be at least two cards on that "Big Navi" die.

If you have any reason to believe "Big Navi" is a specific card/model, go ahead and explain it.


----------



## kapone32 (Oct 9, 2020)

At the end of the day, we have no better understanding of what exactly Big Navi is at 9 AM this morning than we did 24 hours ago. Everything anyone says is pure conjecture. The only thing I know is that Oct 28 should be very interesting. November will be a good month for AMD, Intel and Nvidia.


----------



## Vya Domus (Oct 9, 2020)

I've also just noticed that for some reason they paired it with a 5900X, not a 5950X, and also slower memory; you'd think they would use the fastest system. Hmm.


----------



## renz496 (Oct 9, 2020)

kapone32 said:


> At the end of the day we in the world have no better understanding of what exactly Big Navi is at 9 AM this morning vs 24 hours ago. Everything that anyone says is pure conjecture. The only thing I know is that Oct 28 should be very interesting. November will be a good month for AMD, Intel and Nvidia.



That day will probably give us more questions than answers (along with more hype, depending on how AMD wants to play it). We need to see the card being reviewed ASAP; hopefully we will see one before mid-November. Personally, I hope this high-end GPU battle ends already. Give us the performance increase that we want to see in sub-$250 cards already.


----------



## DemonicRyzen666 (Oct 9, 2020)

AnarchoPrimitiv said:


> I'm a fan of moore's law is dead (if you're going to be critical of him, provide evidence of why he's wrong and abstain from ad hominem personal attacks, otherwise it's completely irrelevant), and based on his sources and other things, he thinks the numbers Lisa Su shared might be the second to top card, as his sources, which have been proven correct again and again, have said that "AMD is going all out", and that the final clocks on the fastest sku haven't been decided yet



Considering that he was talking about the prosumer/workstation cards a short while ago, saying Big Navi was competing with Nvidia's workstation cards, I don't really believe it, because those prosumer/workstation cards are supposed to be CDNA 1.0 and not RDNA 2.0.

CDNA 1.0 has 3 times the number of CUs vs. the 5700 XT.


----------



## Vya Domus (Oct 9, 2020)

renz496 said:


> give us the performance increase that we want to see in sub $250 already.



Then you're going to be disappointed; I don't think any sub-$300 cards will be announced.


----------



## kapone32 (Oct 9, 2020)

renz496 said:


> that day probably will give us more question than answer (along with more hype depending on how AMD wants to play it). we need to see the card being reviewed ASAP. hopefully we will see one before mid november. personally i hope this high end GPU battle to end already. give us the performance increase that we want to see in sub $250 already.


If Big Navi pushes the 5700 into the $250 range, your wish would be fulfilled. The only thing I am looking at is whether that chip is as much faster in performance than the 5700 XT as the 5700 XT was vs. Polaris and Vega.


----------



## Turmania (Oct 9, 2020)

Will AMD ever make a 75w GPU?


----------



## kapone32 (Oct 9, 2020)

Turmania said:


> Will AMD ever make a 75w GPU?


There is no market for that; you are better off buying an APU today or in the near future for low-budget gaming.


----------



## Turmania (Oct 9, 2020)

kapone32 said:


> There is no market for that you are better off buying a APU today or in the near future for low budget Gaming.



The GTX 1650 is selling like hot cakes though, even with a $150 price tag. There is a huge market and profit there.


----------



## kapone32 (Oct 9, 2020)

Turmania said:


> Gtx 1650 is selling like hot cakes though even with 150 USD price tag. There is a huge market and profit there.


That competes with the 5500XT.


----------



## delshay (Oct 9, 2020)

I'm just going to leave this link here. For the record, I upvoted the top two users in the sub-thread; read why.


https://www.reddit.com/r/Amd/comments/j7jx6z


----------



## Turmania (Oct 9, 2020)

kapone32 said:


> That competes with the 5500XT.



The 5500 XT competes with the 1650 Super.

There is no AMD solution for the 1650 vanilla, which I find odd; perhaps they will fix that this generation.


----------



## bug (Oct 9, 2020)

One thing people gloss over is that AMD has just announced "the fastest gaming CPUs". I bet these results were obtained running said CPUs.


----------



## Viilutaja (Oct 9, 2020)

It is weird that only TPU gets a +10 FPS boost at 4K in Borderlands, but Nvidia's own official slide on their webpage shows the 3080 FE card scoring exactly 61 FPS in that game... hmmmm
Also, YouTuber "joker" tested his 3080 FE card and got 57.7 FPS at 4K, which is in line with what Nvidia officially got. And now TPU gets 70 FPS. Suspicious.

I made a couple of modified slides from available benchmarks with the RTX 3080, showing where that Navi 6000 series stacks up against it.
The card AMD showed yesterday is dead-on 1:1 with the RTX 3080 FE model!

Everybody is saying that the RTX 3080 will also gain performance with a Ryzen 5000 series CPU... no it won't, because at 4K you are still very GPU-limited.
But if I am in a giving mood, I would say that in the best-case scenario the RTX 3080 may benefit by 1-2 FPS at most with AMD's new best-of-the-best CPU: the 5950X.


----------



## kapone32 (Oct 9, 2020)

Turmania said:


> 5500XT competes with 1650 super.
> 
> there is no AMD solution for 1650 Vanilla. Which I find it odd, and perhaps they fix it this generation.


I would say Polaris, and you may be on to something. Polaris is rather old, but I am sure it has sold well for AMD. Navi did go through a rough patch, but everything seems to be roses now (the memory clock jumps all over the place). A budget refresh with the same GPU as the PS5 or Xbox would be nice for $169 US.


----------



## wheresmycar (Oct 9, 2020)

fynxer said:


> Nvidia will also optimize their drivers and unlock another 10-15% performance, they done this many times in the past.



Cheeky monkeys. Not only are these cards SUPER expensive, they also hold back the higher performance ceiling.


----------



## nguyen (Oct 9, 2020)

bug said:


> One thing people gloss over is AMD has just announced "the fastest gaming CPUs". I bet these results are obtained running said CPUs.



Yeah, pretty funny that AMD used a 2080 Ti for benchmarking their CPU against the 10900K, but somehow can't bench the 2080 Ti against their RX 6000. Pretty much a dead giveaway that AMD isn't so confident about the RX 6000 GPU.
AMD is now trying to crank as many clocks out of these chips as they can so that they're a justifiable purchase.


----------



## EarthDog (Oct 9, 2020)

If we average the average of the average, the numbers are real.


----------



## bug (Oct 9, 2020)

Viilutaja said:


> It is weird that only TPU gets +10FPS boost on 4K in Borderlands, but Nvidias own official slide on their webpage shows 3080FE card scoring exactly 61 fps in that game... hmmmm
> Also Youtuber "joker" tested his 3080FE card and got 57.7fps at 4K, which is in line what Nvidia officialy got. And now TPU gets 70fps  Suspicious.


Yup, very weird. It's almost as if people are testing using different systems or something.



nguyen said:


> Yeah pretty funny that AMD used a 2080 Ti for benchmarking their CPU against 10900K, but somehow can't bench the 2080 Ti against their RX 6000. Pretty much a dead giveaway that AMD aren't so confident about their RX 6000 GPU.


Well, of course they can't bench against RX 6000, they're also under NDA.


----------



## nguyen (Oct 9, 2020)

bug said:


> Well, of course they can't bench against RX 6000, they're also under NDA.



Huh, they are the ones who give out NDAs, not the ones under NDA.


----------



## Viilutaja (Oct 9, 2020)

They (AMD) are the one who knocks!


----------



## pal (Oct 9, 2020)

I believe there is only one Big Navi.


----------



## Vya Domus (Oct 9, 2020)

nguyen said:


> Yeah pretty funny that AMD used a 2080 Ti for benchmarking their CPU against 10900K, but somehow can't bench the 2080 Ti against their RX 6000. Pretty much a dead giveaway that AMD aren't so confident about their RX 6000 GPU.



Pretty funny that they didn't want to benchmark GPUs that aren't even announced yet. Yeah, funny and strange ...



nguyen said:


> AMD is now trying to crank as much clocks outta these chip so that they can be justifiable purchase.



You mean like Ampere, which basically has zero frequency headroom left? Anyway, those aspects are decided early in the development process, as they are closely linked with the manufacturing process, so they aren't cranking up anything.


----------



## kapone32 (Oct 9, 2020)

pal said:


> I belive there is only 1 BIg Navi.


That is counter intuitive.


----------



## SIGSEGV (Oct 9, 2020)

Simple: give me a good price/perf ratio and I will gladly switch to RED for my 2nd workstation rig (2x the 5700 XT's performance at the 5700 XT's MSRP, or $600 territory, and I will take two of your 6000 series or even your XTX version).
Otherwise, I will stay with green.


----------



## Daven (Oct 9, 2020)

nguyen said:


> Yeah pretty funny that AMD used a 2080 Ti for benchmarking their CPU against 10900K, but somehow can't bench the 2080 Ti against their RX 6000. Pretty much a dead giveaway that AMD aren't so confident about their RX 6000 GPU.
> AMD is now trying to crank as much clocks outta these chip so that they can be justifiable purchase.



The 2080 Ti is probably the fastest commercially available GPU that AMD could get its hands on (the 3080/3090 are too hard to get, even for AMD). As to why they would use commercially available hardware: so that all of us readers, reviewers and tech news analysts can make sense of the results based on our own experience with the hardware. According to you, it would be okay for AMD to take a GPU that's a year down the road, working in their labs, and release public numbers with it. That makes no sense. With the 2080 Ti, we can all verify AMD's claims about how the CPU competition performs.


----------



## bug (Oct 9, 2020)

nguyen said:


> Huh, they are the one who give out NDA , not under NDA.


Hm, somebody has never worked on unreleased products...


----------



## dicktracy (Oct 9, 2020)

EPYC fail.


----------



## john_ (Oct 9, 2020)

Vya Domus said:


> I didn't put any words in your moth, you gotta chill. You said this is "probably the top model" because she mentioned Big Navi and I pointed out that "Big Navi" was never thought of as a specific model but rather simply the silicon. You basically phrased in a way that implies Big Navi could only mean the top end card and I don't think so, there are going to be at least two cards on that "Big Navi" die.


Oh come on now. Give me a break. Yes, you did, because you want to create an argument based on which you can come out and say that this is NOT the top model. Well, you can't say that, I can't say that; most people can't say which model it is.

But we can speculate.

And when people talk about "Big Navi," they talk about the TOP model, not the silicon. You can have 15 models based on the biggest core, but that "Big Navi" nickname was always, in discussions between people or on tech sites, the TOP MODEL. Not one of the top models, but *THE* TOP MODEL. You don't start conversations about the second model of an unknown line of cards.

And let me ask you a question. Almost everyone is asking the same things: will Big Navi be faster than the 2080 Ti? And by how much? Can it be close, equal, or faster than the 3080?
Tell me: do those questions make any sense if they are about the SECOND-in-line top model?

You ignore logic and, in my case, make me look stupid, just so you can build an argument about something that you also don't know.



> If you have any reason to believe "Big Navi" is a specific card/model go ahead and explain that.



See what you are doing here? Again you put words in my mouth by distorting what I mean, because you should know, and probably do know, that saying "Big Navi" is a reference to the TOP model does not have ANYTHING to do with how many models will be based on the biggest chip.


----------



## Blueberries (Oct 9, 2020)

Maybe we'll see a Crossfired Pro Duo 6000 series competing with green team in Port Royal

... or AMD did some engineering voodoo to offer a 4k 60 FPS card cheaper than NVIDIA


----------



## TheLostSwede (Oct 9, 2020)

nguyen said:


> Huh, they are the one who gives out the NDA, not the one under NDA.


Note the part highlighted in red...
Until a product is announced or launched, it tends to be confidential.


----------



## nguyen (Oct 9, 2020)

TheLostSwede said:


> Note the part highlighted in red...
> Until a product is announced or launched, it tends to be confidential.



Then why bother showing anything at all? These FPS figures are useless by themselves.

"Hey, let's just bench the lightest scenes in these games and get some useless FPS numbers to hype our product." - AMD CEO to the PR guys

Sure, that worked out well for Polaris and Vega: just some bullshit numbers here and there until the actual release, and then, boom, our GPUs are so bad we are selling them for almost nothing.


----------



## B-Real (Oct 9, 2020)

Gears 5 results from different sources (all using DX12):

Guru: 76 fps
Techspot: 72 fps
Eurogamer: 80 fps

Average of the 3 is 76 fps. AMD shows 73 fps.
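As a quick sanity check, the arithmetic works out like this (a minimal sketch; the three review figures are the ones quoted above, and the site names are just dictionary labels):

```python
# RTX 3080 results for Gears 5 at 4K (DX12), as quoted from three review sites
results = {"Guru": 76, "Techspot": 72, "Eurogamer": 80}

average = sum(results.values()) / len(results)   # (76 + 72 + 80) / 3 = 76.0
amd_claim = 73                                   # AMD's claimed "Big Navi" FPS

# How far AMD's claim sits below the three-site 3080 average, in percent
deficit_pct = (average - amd_claim) / average * 100

print(f"3080 average: {average:.0f} fps")
print(f"AMD claim: {amd_claim} fps ({deficit_pct:.1f}% below)")
```

So by these (not directly comparable) numbers, AMD's claim lands roughly 4% under the averaged 3080 result.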




cueman said:


> amd use its new ryzen 5900 cpu,its help alot,also it help rtx 3080 nd evn more rtx 3090.
> 
> well, let see, but real world is that kind...


What the heck are you talking about? At 4K there is no difference between CPUs, as there is a GPU bottleneck.
I would bet there is not much manipulation in the results. Why? Because in the CPU gaming results they showed, there was one game in which Intel was faster.




bug said:


> One thing people gloss over is AMD has just announced "the fastest gaming CPUs". I bet these results are obtained running said CPUs.



Huge difference, isn't it?


----------



## efikkan (Oct 9, 2020)

PerfectWave said:


> "AMD has probably cherry-picked games that are most optimized to its GPUs "
> I don't think the engine is optimized for AMD; instead it is for Nvidia...


It's neither.
You would need to have special API features to optimize for one of them.
Sponsored titles might have exclusive graphical features, but not vendor-specific performance optimizations.



docnorth said:


> It is rumored that Big Navi will at some point get a memory bandwidth upgrade, probably GDDR6X. Some reasons for the launch with GDDR6 might be:
> 1)Limited availability of GDDR6X
> 2)Cost
> 3)No time or resources to develop the memory controller. GDDR6X uses PAM4, a four-state signaling format instead of binary coding, and needs a different memory controller.


The time required to implement it is measured in years. Regardless of the choice of memory controller, this was decided a long time ago.


----------



## medi01 (Oct 9, 2020)

jesdals said:


> I could live with that if the price is below the 3080


3080 is a virtual product, so don't hold your breath.

I love how performance between 2080Ti and 3080 is just "trades blows with 3070".


----------



## neatfeatguy (Oct 9, 2020)

Viilutaja said:


> It is weird that only TPU gets a +10 FPS boost at 4K in Borderlands, but Nvidia's own official slide on their webpage shows the 3080 FE card scoring exactly 61 fps in that game... hmmmm
> Also, YouTuber "joker" tested his 3080 FE card and got 57.7 fps at 4K, which is in line with what Nvidia officially got. And now TPU gets 70 fps. Suspicious.
> 
> I made a couple of modified slides from available benchmarks with the RTX 3080, showing where that Navi 6000 series stacks up against it.
> ...



TPU did a benchmark/performance test on Borderlands 3 when it came out, pre-RTX 3080. They tested the game in DX12 with the cards available at the time, and the 2080 Ti was the top-tier card then.

When the 3080 launched, Borderlands 3 was benched with it, but *only* in DX11. It says so above the charts in the 3080 review for Borderlands 3; just go here and read for yourself, or click the spoiler button below.


Spoiler



Borderlands 3 is brought to life with Unreal Engine 4, which takes advantage of DirectX 12 and DirectX 11 with some AMD-specific features, such as FidelityFX. In our testing, we used DirectX 11 because the DirectX 12 renderer has extremely long loading times and some instability. We will switch to the new renderer once these issues are ironed out.



AMD showed that their "Big Navi" card was running at 61 FPS in DX12 for Borderlands 3.
TPU has not run the 3080 in DX12 for Borderlands 3. They only ran DX11, and that is the benchmark that shows 70.3 FPS.

I don't know where your confusion is, other than that you probably overlooked the fact that TPU's RTX 3080 benchmark for Borderlands 3 is DX11 only, and maybe you thought it was DX12.


----------



## d0x360 (Oct 9, 2020)

I would imagine AMD wouldn't give out the numbers for their top-end card. They want to wow people in November, and while coming close to the 3080 is awesome, I have a feeling they were talking about a card that's one model down from the current high end. That doesn't mean they couldn't one-up it again later, like Nvidia does with Supers and Tis.


----------



## nguyen (Oct 9, 2020)

B-Real said:


> Gears 5 results from different sources (all using DX12): Guru 76 fps
> Techspot: 72 fps
> Eurogamer: 80 fps
> Average of the 3 is 76 fps. AMD shows 73 fps.
> ...



You are just wasting your time; I could just look at the ground in those games and get 200 FPS at 4K.

There are too many variables involved to compare results from one review site to another. Even results from the same review site can't be used if there are changes to the test system (new CPU, RAM, motherboard, drivers, etc.).

Those FPS figures are useless without any basis for comparison, which AMD is withholding.


----------



## medi01 (Oct 9, 2020)

fynxer said:


> *Lisa clearly said Big Navi optimization was done with the Ryzen 5000 series*, SO because Ryzen 5000 is faster, it will yield more performance in games for the 3080 too, so you cannot make a direct comparison like that.



Which of the tested games is CPU-bottlenecked at 4K, pretty please, for the CPU to even matter?
Of course AMD tested with their own CPU, as they now claim the gaming CPU top spot.


----------



## TheLostSwede (Oct 9, 2020)

nguyen said:


> Then why bother showing anything at all? These FPS figures are useless by themselves.
> 
> "Hey, let's just bench the lightest scenes in these games and get some useless FPS numbers to hype our product." - AMD CEO to the PR guys
> 
> Sure, that worked out well for Polaris and Vega: just some bullshit numbers here and there until the actual release, and then, boom, our GPUs are so bad we are selling them for almost nothing.


Are you saying other companies have done it differently in the past?
This isn't something new as far as I'm concerned.
It's all about the hype train...


----------



## pcminirace (Oct 9, 2020)

jesdals said:


> I could live with that if the price is below the 3080


Oh yes.

But how much will it cost? Will it happen as with the Ryzen 5000?


----------



## Darmok N Jalad (Oct 9, 2020)

The way I took Lisa's comment in the live stream was that it was nicknamed "Big Navi" not internally, but affectionately, because that's what the internet at large was calling it with all the speculation. It's a nuanced distinction, but I think what they demoed was the best they have. They did the same with the 5700 XT. The only ace they may have left is clock speeds, and that is probably what they are tweaking right now, since NVIDIA has already played their hand, at least initially anyway.


----------



## Franzen4Real (Oct 9, 2020)

neatfeatguy said:


> When the 3080 launched, Borderlands 3 was benched with it, but *only* in DX11. It says so above the charts on the 3080 review for the Borderlands 3 game, just go here and read for yourself or click the spoiler button below.
> 
> 
> Spoiler
> ...


And couple that fact with this statement from the OP: _"It's important to note here that AMD may be using a different/lighter test scene than us, since we don't use internal benchmark tools of games, and design our own test scenes."_
We now have a bench that rendered a completely different scene on a different API. I didn't realize that when first seeing the slides on the live stream. So now we are essentially comparing apples to baseballs. Sure, they are both round, but...


----------



## kapone32 (Oct 9, 2020)

nguyen said:


> Then why bother showing anything at all? These FPS figures are useless by themselves.
> 
> "Hey, let's just bench the lightest scenes in these games and get some useless FPS numbers to hype our product." - AMD CEO to the PR guys
> 
> Sure, that worked out well for Polaris and Vega: just some bullshit numbers here and there until the actual release, and then, boom, our GPUs are so bad we are selling them for almost nothing.


To stir excitement. This is the official notification to the public that these are coming and what you can expect. You also do not know what scenes they used, as there is no video of those benchmarks in existence. I don't know why you are quoting architectures from before 7 nm (Vega 7 excluded) when even the 5600 XT competes with the Vega 64. Just because we have been pining for information on Big Navi doesn't mean it will be a fail.


----------



## Arpeegee (Oct 9, 2020)

People must either be in denial or lack good analytical skills if they think AMD would show their BEST card at a CPU event when they have a GPU event at the end of the month.

They've been mum on Navi details all year, and if I were to place a bet, I would think the performance is closer to the RTX 3080 than people want to admit. People also need to keep in mind that even if it doesn't outright win over the 3080, if it's priced $100 cheaper than the "MSRP $699" then Nvidia has lost this battle. No regular person would pay $100-$200 more for only 5% to 10% better performance from a power-hungry card (though the RTX 3090 sales kinda undermine my point).


----------



## DuxCro (Oct 9, 2020)

I hope the availability of the 6000 series will be a lot better than the RTX 3000 series; you can't find those anywhere. And those launch prices are also imaginary. I can't find a store listing in the EU for the RTX 3080 below €800.


----------



## Dante Uchiha (Oct 9, 2020)

Turmania said:


> The 5500 XT competes with the 1650 Super.
> 
> There is no AMD solution for the vanilla 1650, which I find odd; perhaps they fix that this generation.



They have: https://www.techpowerup.com/gpu-specs/radeon-rx-5300.c3584


----------



## Jism (Oct 9, 2020)

They did very well; we got competition.


----------



## TheoneandonlyMrK (Oct 9, 2020)

Honestly, this is a bit apples to mystery bananas; TPU used an Intel platform.
Hopefully this was the cut-down Big Navi: still Big Navi, but not as big.


----------



## Nkd (Oct 9, 2020)

I have seen numbers all over the place. The TPU review is the only one getting 80+ in Gears 5 at 4K. RedGamingTech showed video of his 3080 benchmark at 75 FPS on Ultra with an overclocked 10900K, and it's similar at other reviews. I'm not sure what settings or map you are testing, but AMD is just giving numbers from the built-in benchmarks here, and it seems to get within spitting distance in both Borderlands 3 and Gears 5 when you run the same built-in benchmarks on a 3080. Call of Duty is up for debate, because I have seen numbers in the 80s and in the 120s, probably due to the maps people are testing.



Arpeegee said:


> People must either be in denial or lack good analytical skills if you think AMD would show their BEST card at a CPU event when they have a GPU event at the end of the month.
> 
> They've been mum on Navi details all year, and if I were to place a bet, I would think the performance is closer to the RTX 3080 than people want to admit. People also need to keep in mind that even if it doesn't outright win over the 3080, if it's priced $100 cheaper than the "MSRP $699" then Nvidia has lost this battle. No regular person would pay $100-$200 more for only 5% to 10% better performance from a power-hungry card (though the RTX 3090 sales kinda undermine my point).



Actually, in most reviews it matches the 3080 in Borderlands 3 and Gears 5. It really boils down to maps and such, but these numbers are straight from the built-in benchmarks, and the 2080 Ti is in the 60s and 70s in both the Borderlands 3 "Badass" and Gears 5 Ultra benchmarks.

Also, my gut tells me this is not the biggest Navi. I doubt they would show their best at the CPU event and then have nothing special for the GPU event. This might be the 72 CU part right under the 3080, sitting between the 3070 and 3080 (closer to the 3080 than the 3070), with an 80 CU part another 10-15% or so on top.



kayjay010101 said:


> Yeah... as you say yourself, comparing between two different benchmark passes means nothing. You'd have to have the least amount of variables possible, including using the exact same benchmark pass. It seemed like they used the internal benchmark in their presentation, so comparing to your own custom route in the game isn't a valid form of comparison.
> 
> And this might not be the top SKU. A lot of other news publications have stated AMD stressed to them that they didn't tell which SKU this was, which to me seems like they're trying to imply it's not the top dog.



THIS! Internal benchmarks show it closer to the 3080 in both Borderlands 3 and Gears 5. Call of Duty depends on the scene, so we'll have to wait for reviews.


----------



## Vayra86 (Oct 9, 2020)

TheLostSwede said:


> Personally I think it would be a bad move by AMD to show its full hand this early, as it potentially allows Nvidia to make adjustments. I guess it might be too late for that though, but who knows. Even so, all we can do for now is speculate. That said, I doubt AMD will have a product that beats the RTX 3080, but I could be wrong.



I don't think Nvidia gets the news from TPU headlines though, or r/AMD.

I think Nvidia already knows mostly what they'll be getting, and I reckon this is part of the reason 3070 is priced at 500.

Between 3070~3080 range perf... as expected, as predicted, and not a bad result at all. Price will make it count, as usual. Same shit different day...yep.


----------



## Space Lynx (Oct 9, 2020)

jesdals said:


> I could live with that if the price is below the 3080



And in stock, lol. I'm sure AMD will undercut Nvidia on price in the GPU market; they have to, to make up for their Navi driver debacle.

I'll probably get Big Navi. I might get a 3070, honestly; it all depends on who lets me check out and ships first, lol. With so many scalpers, we will be lucky to get anything this year: PS5, etc. It's sad the industry has not figured out a better way to stop scalpers and limit buyers to one each.


----------



## ador250 (Oct 9, 2020)

If it's around 5% slower than the 3080 now, then it will eventually be 5% faster with mature drivers later. Plus, there is no way Lisa Su will show their top dog this early; it's literally a jebait. If this is a 64 or 72 CU card, then it's GG for Nvidia.


----------



## sergionography (Oct 9, 2020)

Franzen4Real said:


> And, couple that fact with this statement from OP _"It's important to note here that AMD may be using a different/lighter test scene than us, since we don't use internal benchmark tools of games, and design our own test scenes." _
> We now have a bench that rendered a completely different scene and on a different API. I didn't realize that when first seeing the slides on the live stream. So now we are essentially comparing apples to baseballs. Sure, they are both round, but...


I checked online and noticed that Techpowerup has higher FPS averages in some games compared to other websites, and I couldn't understand why until today. Most sites use the internal benchmark suites that come with the games, and that's most likely what AMD did. If you take that into consideration, then yes, this whole article is useless and misleading.


----------



## Space Lynx (Oct 9, 2020)

AMD did trick Nvidia last time; they duped them on a couple of things, if I recall correctly, on pricing. So it is possible this isn't their top-dog card, lol. We will see soon enough.


----------



## TheLostSwede (Oct 9, 2020)

Vayra86 said:


> I don't think Nvidia gets the news from TPU headlines though, or r/AMD.
> 
> I think Nvidia already knows mostly what they'll be getting, and I reckon this is part of the reason 3070 is priced at 500.
> 
> Between 3070~3080 range perf... as expected, as predicted, and not a bad result at all. Price will make it count, as usual. Same shit different day...yep.


Are you saying Nvidia has an insider at AMD? That's what it sounds like. Industrial espionage is illegal in the US, afaik.


----------



## Vayra86 (Oct 9, 2020)

TheLostSwede said:


> Are you saying Nvidia has an insider at AMD? As that is what it sounds like. Industrial espionage is illegal in the US afaik.



I think they get enough information and have enough insight to make a pretty educated guess.


----------



## RedelZaVedno (Oct 9, 2020)

2080 Ti +21% is not bad at all, IF the pricing is right. IF AMD makes 12 GB and 16 GB variants and prices them at $499 and $549, it would be a blast of a launch. On the other hand, IF AMD decides to be greedy like with the Zen 3 announcement and prices them at $649 and $699, then it's gonna be DOA. Remember the Radeon VII debacle, competing with the 1080 Ti two years later and charging $700? I'm afraid AMD will shoot itself in the foot once more and go with the suicidal greed option. I can still remember Lisa trying to sell us the Polaris replacement (Navi 1) for $499 first and then for $449.


----------



## Shatun_Bear (Oct 9, 2020)

The Videocardz guy (reliable), Moore'sLawisDead, and some others are saying AMD has a card above the one demonstrated at the Ryzen 5000 reveal, likely ready for Q1 next year or later. October 28th is all about the one she was holding up and the ones below it.


----------



## Space Lynx (Oct 9, 2020)

RedelZaVedno said:


> 2080 Ti +21% is not bad at all, IF the pricing is right. IF AMD makes 12 GB and 16 GB variants and prices them at $499 and $549, it would be a blast of a launch. On the other hand, IF AMD decides to be greedy like with the Zen 3 announcement and prices them at $649 and $699, then it's gonna be DOA. Remembering the Radeon VII debacle, I'm afraid AMD will shoot itself in the foot once more and go with the suicidal greed option. I can still remember Lisa trying to sell us the Polaris replacement (Navi 1) for $499 first and then for $449. Greed is infectious as hell.



I doubt it will be DOA, and Zen 3 won't be either; both will sell out instantly. Covid brought a lot of new people into this niche hobby. Demand is going to outpace stock for a good 6 months still, imo, especially if we have a bad flu/covid season.


----------



## RedelZaVedno (Oct 9, 2020)

lynx29 said:


> I doubt it will be DOA, and Zen 3 won't be either, both will sell out instantly. Covid brought a lot of new people into this niche hobby.  Demand is going to outpace stock for a good 6 months still imo, especially if we have a bad flu/covid season.


You might be right. It will all come down to the supply chain. Let me rephrase: if Nvidia can deliver enough $699 3080s to saturate the market and AMD wants to charge the same amount or 50 bucks less, then Big Navi will be DOA. I've always been a fan of value, and currently I see no value in the new AMD products, with the exception of the 5900X. It's strange to see Zen 3 making the 10600K and 10700K valid "value" options. I really hope RDNA2 offers us awesome price-to-performance progress, but I'm not holding my breath. Personally, I'm gonna keep my 3600 and 1080 Ti until I see something worth upgrading to without breaking the bank.


----------



## Zach_01 (Oct 9, 2020)

RedelZaVedno said:


> You might be right. It will all come down to the supply chain. Let me rephrase: if Nvidia can deliver enough $699 3080s to saturate the market and AMD wants to charge the same amount or 50 bucks less, then Big Navi will be DOA. I'm a fan of value and currently I see no value in the new AMD products, with the exception of the 5900X. It's strange to see Zen 3 making the 10600K and 10700K valid "value" options. I really hope RDNA2 offers us awesome price-to-performance progress, but I'm not holding my breath. Personally, I'm gonna keep my 3600 and 1080 Ti until I see something worth upgrading to without breaking the bank.


I would wait for reviews first, to see where the whole Zen 3 line falls in the swarm: price-to-performance against everything else (Zen 2 and Intel).

I have the 3600, and gaming is the heaviest workload I use it for. I could look at the 5600X, as it has ~20% more single-thread performance per clock and another 400 MHz (+11%) on top of that overall IPC increase. Is +30% CPU performance worth the $300 price? That depends on your perspective and the outcome of reviews. But because Zen 3 is the last line that fits into AM4, and I want this X570 system to last at least the next 5 years, I'm looking towards the 5800X/5900X. Just not now; in a year, after prices settle down, we can talk about it.


----------



## Space Lynx (Oct 9, 2020)

RedelZaVedno said:


> If Nvidia can deliver enough $699 3080 to saturate the market



Nvidia already announced they won't be able to keep up with supply until the first few months of 2021. So yeah, everything will be sold out for 6+ months, imo.


----------



## GhostRyder (Oct 9, 2020)

If true, this will be disappointing to me regardless of price. I will say that if it's cheaper than the 3080 at this performance, it can be a good value; but since I failed to get a 3090/3080, I was thinking I might pick this up if it at least matched or bested the 3080. Again, we will have to see, because we don't know their testing methodology.


----------



## Vayra86 (Oct 9, 2020)

Turmania said:


> From the looks of it, it will contend with the RTX 3070, not the 3080. Basically a 2080 Ti competitor. Obviously we know very little about it, power consumption etc., but Lisa said it is Big Navi, therefore it is safe to assume it will be their leading card. Unless they come up with an uber Navi, etc.



Well, if they can *average* only a 10% performance deficit, and knowing the 3080 has zero OC headroom to speak of (even an undervolt is good on it), they're actually pretty competitive with the 3080. Now if they also provide 12-16 GB to overtake the 10 GB of the 3080... I think I'd know what card to pick, even at 650-720 ish for the AMD one. That's definitely competing with the 3080, even if the net perf is a bit lower. It'll likely age a lot better and could be another 680 vs 7970 moment.

I'm usually hypercritical of what's on the table, but this looks pretty good. If the TDP is kept in check and it runs smoothly... AMD has a great release on their hands. And let's pray for no driver or BIOS oopsies this time, or at least a rapid and definitive hotfix scenario. That's a big thing, and it didn't do RDNA1 any favors.

We also don't know how close the 3070 will get to 2080 Ti performance, but I reckon they will match it, not handily beat it; that only happens situationally. In that case... it's certainly not competing with the 3070, it's on another tier entirely. This is definitely shaping up to be on the upper end of what I had expected from RDNA2.

Additionally, I think it's realistic to expect this not to be a full chip, so there's that too.


----------



## Luminescent (Oct 9, 2020)

I think Nvidia correctly estimated what AMD will launch, based on what AMD did with the consoles, and pushed the hell out of their own silicon, reaching a new record for GPU power consumption.
If you build a computer with a 10900K and an RTX 3080 now, you can skip buying a heater for the winter; these two can actually heat up the room.


----------



## efikkan (Oct 9, 2020)

ador250 said:


> If it's around 5% slower than 3080 now, then it will eventually be 5% faster with mature driver in later. Plus, there is no way Lisa Su will show their top dog this early, it's literally a jebait. If it's a 64 or 72CU card then it's GG for Nvidia.


The fine wine argument again; that's a nonsensical argument, and more than a little misleading.
"Buy an inferior product today, which may be superior sometime later" - but it never happens.
Nvidia's drivers improve performance just as much as AMD's over time.



lynx29 said:


> AMD did trick nvidia last time, they duped them on a couple things if I recall correctly. On pricing.  so it is possible this isn't their top dog card. lol  we will see soon enough


Even if they did, I don't believe misleading competitors or consumers is a good thing. We need healthy competition and fair play.



Vayra86 said:


> Well, if they can average only a 10% performance deficit, and knowing the 3080 has zero OC headroom to speak of (even an undervolt is good on it), they're actually pretty competitive with a 3080.


If the performance lands between the RTX 3070 and RTX 3080 and the price does too, then it's fair. But if it's less performant at the same price, then people shouldn't buy it.
Even if "big Navi" happens to have more OC headroom, that shouldn't be a general sales argument; most gamers want stable computers for gaming, not overclocking to set records.



Vayra86 said:


> Now if they also provide 12-16GB to overtake the 10GB of the 3080... I think I'd know what card to pick, even at 650-720 ish for the AMD one. That's definitely competing with 3080, even if the net perf is a bit lower. It'll likely age a lot better and could be another 680 vs 7970 moment.


Aah, the eternal "future proofing" argument; buy inferior stuff with superior "specs" that will win over time. 
Our best guidance is always real world performance, not anecdotes about what will perform better years from now.


----------



## Vayra86 (Oct 9, 2020)

efikkan said:


> The fine wine argument again, that's a nonsensical argument, and more than a little misleading.
> Buy an inferior product today, which may be superior sometime later, but it never happens.
> Nvidia's drivers improve performance just as much as AMD's over time.
> 
> ...



You're forgetting that even at 720 they will still undercut Nvidia's MSRP for any AIB version, really. Like you say: if the price is right, as always. Thing is, I don't like the balance the 3080 strikes between core and VRAM cap. You're at liberty to think otherwise, but the 680 was in a remarkably similar place; I didn't pull that comparison out for nothing. We're at the eve of a new console gen here, and those consoles also carry somewhat higher VRAM capacities than Nvidia's finest, and I think it needs no discussion that the 2 GB Kepler/refresh cards were obsolete faster than many would have liked. Faster than I liked, personally, at least: I bought a 770 and replaced it with a 780 Ti only to get the 3 GB of VRAM, as 2 GB was falling short even at 1080p in quite a few titles. Only one gen later, the VRAM caps doubled, and for good reason; the consoles had more too.

Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.

Ironically, the best handle on future proofing is gained by looking back. History repeats.


----------



## Luminescent (Oct 9, 2020)

You know why AMD ages ("fine wine"...) better than Nvidia? Because Nvidia stops optimizing drivers for older cards and AMD doesn't.
They can afford this tactic because they rule the desktop market.
New games will widen the gap between old and new more and more.


----------



## Vayra86 (Oct 9, 2020)

Luminescent said:


> You know why AMD ages ( fine wine....) better than Nvidia ? because Nvidia stops optimizing drivers for older cards and AMD doesn't.
> They can afford this tactic because they rule desktop market.
> New games will widen the gap between old and new more and more.



Wrong, both competitors keep adding support for new titles on all active GPU families.


----------



## efikkan (Oct 9, 2020)

Vayra86 said:


> You're forgetting that even at 720 they will still undercut Nvidia on MSRP for any AIB version really.


There are plenty of solid AIB models at the same MSRP as Nvidia's Founders Edition ($700).



Vayra86 said:


> Like you say. If the price is right, as always. Thing is, I'm not liking the balance the 3080 has with core / VRAM cap. You're at liberty to think otherwise, but in a similar way, the 680 was in a remarkably similar place.


And I always say the truth is in good (real-world) benchmarking, not in anecdotes about how many cores, GBs, ROPs, TMUs etc. _feel_ right.
As I've told you before, the fact remains that increased VRAM usage nearly always requires both more bandwidth and more computational performance to utilize it, which is why you usually hit other bottlenecks long before VRAM.



Vayra86 said:


> We're at the eve of a new console gen here and those also carry somewhat higher VRAM capacities than Nvidia's finest


Not really, that's combined system RAM and VRAM, so it's not a fair comparison.



Vayra86 said:


> Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.


No objection there. ~250W seems to be the spot where cooling becomes a hassle.



Luminescent said:


> You know why AMD ages ( fine wine....) better than Nvidia ? because Nvidia stops optimizing drivers for older cards and AMD doesn't.
> They can afford this tactic because they rule desktop market.


Exactly how many years do you have to wait, then?
The fact is Nvidia retains legacy support while AMD drops it.
AMD can't even get their current driver support right; how would they be better at legacy support?



Luminescent said:


> New games will widen the gap between old and new more and more.


Yeah right.
I'm sure those old Radeon 200/300 will finally be unleashed any day now.


----------



## Zach_01 (Oct 9, 2020)

Vayra86 said:


> Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.


Actually, I'm OK with a 300-320 W draw, as I owned an R9 390X from 2015 to 2018. But I'm also running case-less, so that helps.


----------



## Luminescent (Oct 9, 2020)

Vayra86 said:


> Wrong, both competitors keep adding support for new titles on all active GPU families.


Yeah, but the new GPU gets the better treatment; just look at Horizon Zero Dawn, which launched a bit before the RTX 3080/3090: a perfect candidate for a "driver fix". I think this is the title where the RTX 3080 distances itself the most from the 2080 Ti, not so much in old games.
Driver tricks will make the RTX 3000 series more and more appealing in new games.
I agree, *efikkan*, AMD sucks. They will launch a mediocre GPU with bad drivers at first, and it will probably consume a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and don't care much about this market. They care about the millions of orders from Microsoft and Sony much more than the very, very few "enthusiasts" who might buy these high-end cards. That's why the first drivers will be so bad you'll be lucky to boot into Windows; why bother, for practically no money compared to what they sell to consoles?
They are here to piss off Nvidia a bit, but not to really compete.


----------



## TheoneandonlyMrK (Oct 9, 2020)

Luminescent said:


> Yeah but the new gpu gets the better treatment, just look at horizon zero dawn which launched a bit before rtx 3080/3090, perfect candidate for a "driver fix", i think in this title rtx3080 distances the most from 2080ti, not so much in old games.
> Driver tricks will make RTX 3000 more and more appealing in new games.
> I agree *efikkan*, AMD sucks, they will launch a mediocre GPU with bad drivers at first and probably consume a lot of power, but, they won't handicap a good card to force you to buy a new generation because they already suck and they don't care much about this market, they care about the millions of orders from Microsoft and Sony much more than the very very few "enthusiast" that might buy these high end cards, that's why first drivers will be so bad you are lucky if you boot into windows, why bother for practically no money compared to what they sell to consoles.
> They are here to piss off Nvidia a bit but not really compete.


Okay, so, an interesting way to spend a Friday. I disagree with some of that.
No, sorry: all of it.

Opaquifying the B's for mortals, good work.


----------



## evernessince (Oct 9, 2020)

I'm not sure this is cherry picking.  Many of the games chosen favor Nvidia, not AMD.  The same could be said of AMD's recent Zen 3 announcement: Far Cry Primal, for example, heavily favors Intel, yet AMD chose to use it.

Of course I am going to wait for third-party benchmarks, but it's nice to see a company that isn't trying to insult your intelligence with "2X the Performance!!!!!     *At 8K with RTX enabled vs a 2080".


----------



## Space Lynx (Oct 10, 2020)

I just checked on AMD's bicycles. They are sold out as well, lmao.  wow.  just wow.


----------



## Kaleid (Oct 10, 2020)

Some of you are really overstating the importance of having the fastest GPU. The 3080 is so expensive that it won't matter to most gamers. Ever.
I'll go for bang for buck, as always.


----------



## nguyen (Oct 10, 2020)

Kaleid said:


> Some of you are really overstating the importance of having the fastest GPU. 3080 is so expensive that it won't matter to most gamers. Ever..
> I'll go for bang for buck as always



Why do you have an RX 5700 when the RX 580/570 have way better bang for buck? Asking for some consistency in your reasoning here.


----------



## sepheronx (Oct 10, 2020)

nguyen said:


> Why do you have a RX 5700 when the RX 580/570 have way better bang for buck ? asking for some consistency in your reasoning here.



Because the RX 5700 was a good bang for your buck GPU as well.


----------



## nguyen (Oct 10, 2020)

sepheronx said:


> Because the RX 5700 was a good bang for your buck GPU as well.



Saying you only go for bang for buck means you would choose the highest bang-for-buck GPU, right?

But if you want more performance while still getting good bang for buck, then the 3080 is just as good bang for buck as the 5700; they come out to around the same FPS per dollar (provided you can buy a 3080 right now).

So had @Kaleid said "Most people can't afford the 3080", I wouldn't have any issue with that, since Ampere is rolling out at every price point, just like Turing.
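
The FPS-per-dollar comparison being made here is just a ratio. As a quick sketch (the prices and frame rates below are illustrative assumptions picked to make the ratios land where the post claims, not benchmark data or real street prices):

```python
# Rough FPS-per-dollar comparison. All figures are illustrative
# assumptions, not measurements.
cards = {
    "RX 5700":  {"price_usd": 350, "avg_fps": 90},
    "RTX 3080": {"price_usd": 700, "avg_fps": 180},
}

for name, c in cards.items():
    fps_per_dollar = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {fps_per_dollar:.3f} FPS per dollar")
```

With these assumed numbers both cards land on the same FPS per dollar, which is the shape of the argument: "bang for buck" alone doesn't pick a card until you also fix a performance target or a budget.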


----------



## sepheronx (Oct 10, 2020)

nguyen said:


> Saying you only going for bang for buck meaning you would choose the highest bang for buck GPU right ?
> 
> But when you would like to have more performance while having good bang for buck, then the 3080 is just as good bang for buck as the 5700, they come to around the same FPS/dollar (provided you can buy a 3080 right now ).
> 
> So had @Kaleid say "Most people can't afford the 3080", then I wouldn't have any issue with that, since Ampere is rolling out at every price point just like Turing anyways.


Not even close to the same, especially depending on location and prices. My location doesn't have a 3080 below $1000 CAD. And since the two cards are from two different generations, it isn't worth comparing them; the RX 5700 was released to compete around the RTX 2060.

Better to wait and see RDNA 2 results to compare bang for buck properly.


----------



## Blueberries (Oct 10, 2020)

Kaleid said:


> Some of you are really overstating the importance of having the fastest GPU. 3080 is so expensive that it won't matter to most gamers. Ever..
> I'll go for bang for buck as always



So where do they cut costs? Performance inherently improves price/perf, but they can't just sell cards at any price they want and still earn a profit. There has to be some engineering improvement to undercut NVIDIA if the performance is similar, and even more so if it's lower.

The cards are in NVIDIA's hands until they release the 3060.


----------



## GoldenX (Oct 10, 2020)

Could it be? A decent Radeon release after so long?
Nah, they'll find a way to screw it up, and it had better not be disabling ray tracing due to "reasons".


----------



## nguyen (Oct 10, 2020)

sepheronx said:


> Not even close to same regard. Especially depending on location and prices. My location doesn't have a 3080 below $1000 cad.  And since the two cards are of two different generations, it isn't worth comparing the two since the RX 5700 was released to compete around RTX 2060.
> 
> Better wait and see RDNA 2 results to compare better bang for your buck.



Bang for buck is only an excuse for lower-performing products. Zen CPUs have gone from best bang for buck to being premium CPUs now. If AMD had the best GPU, they would price it accordingly. At the end of the day, nobody wants to get paid minimum wage.

I reckon RDNA2 had better offer 30% more perf/dollar than Ampere in rasterization; only then does it make sense to overlook its lack of finesse in other areas (RT/DLSS/HW encoding/drivers/Reflex).


----------



## Kaleid (Oct 10, 2020)

nguyen said:


> Why do you have a RX 5700 when the RX 580/570 have way better bang for buck ? asking for some consistency in your reasoning here.



I'm also, obviously, aiming for a certain performance at a certain resolution.
Besides, I bought the 5700 used, so I saved 100 USD; bought that way, the 580 wouldn't stand a chance on bang for buck.
And since the 3080 has been inserted into the discussion: sure, it also provides a lot for the money, but since I'm at 1440p, its performance would be utterly wasted on my monitor.

AMD raised the 5000 series CPU prices, but I'm quite certain very few expect AMD to do the same with their new GPUs.

I'll change GPU next year, but not until I get to see MS DirectStorage in action.


----------



## puma99dk| (Oct 10, 2020)

Dristun said:


> what? They didn't tout it as such - in _their own slides._



That's really where the card shines; after it was released, I tried it at both 1440p and 4K compared to my old GTX 1080 Ti.


----------



## cueman (Oct 10, 2020)

Hmm, once you get past AMD's cherry-picked games, the help from the Ryzen 5900X CPU, and the help from 16 GB of memory (especially at 4K), it looks fairly clear that the RTX 3080 is 20-25% faster than Big Navi.

So: take a 20 GB RTX 3080 and use the same Ryzen 5900X for the test... no doubt about it.

But sure, it seems clear enough that AMD can't pull off any miracles with what it's building: still the same old parts and the usual updates. I mean, the RX 6900 XT is roughly two 5700 XTs in CrossFire, so it can't deliver more performance than that, and AMD's quoted 300 W TDP for that performance is a joke. I'd say it will take over 350 W, and that's if it even reaches 80% of RTX 3080 speed.

RT speed is also interesting, and in that category AMD is still in its infancy. RT support, by the way, eats more power... 25 W?

We can imagine, or rather wait for the TechPowerUp review:

RTX 3080 FE 10GB/20GB + Ryzen 5900X  vs  RX 6900 XT 16GB + Ryzen 5900X

I'd bet on a 25% win for the RTX 3080, and I'd add at least another 20% difference when the RTX 3090 + Ryzen 5900X is put up against the RX 6900 XT.

I think the higher-end RTX 3000 series GPUs need a powerful CPU as a partner: right now I'd say a Ryzen 5900X or so, later Intel's 10 nm and of course its 7 nm parts.

Hmm, an RTX 3070/Ti against the RX 6900 XT... what's sure is that the RTX 3070 is cheaper and might fight hard against the RX 6900 XT. Lose 10% and be $200 cheaper...?

Let's see... waiting for the tests. Only a few weeks!

It would also be nice to get Intel's Xe GPUs onto the battle line; then we'd have three GPU makers in the soap opera!


----------



## Chomiq (Oct 10, 2020)

lynx29 said:


> I just checked on AMD's bicycles. They are sold out as well, lmao.  wow.  just wow.


Well here's hoping AMD gpus aren't as shitty as those bikes.


----------



## 300BaudBob (Oct 10, 2020)

Luminescent said:


> I think Nvidia estimated correctly what AMD will launch based on what they did with the consoles and they pushed the hell out of that silicon reaching a new record of GPU power consumption.
> If you now build a computer with a 10900k and RTX 3080 you can skip buying a heater in the winter, these two can actually heat up the room .


Stick the water-cooler radiator out the window near heat-loving plants; or use a thermocouple to convert the heat to run fans; or use it to pre-heat the cold water feed for your water heater... all sorts of fun things to do with that heat.


----------



## INSTG8R (Oct 10, 2020)

I have nothing to say about the numbers, but I can say the BL3 presentation was using the built-in benchmark; I recognized it.


----------



## Nephilim666 (Oct 10, 2020)

cueman said:


> also,is it nice to get battleline also intel Xe gpu, then we have 3 gpus for soap!



Sums up your post well. Nonsense.


----------



## Vayra86 (Oct 10, 2020)

Luminescent said:


> Yeah but the new gpu gets the better treatment, just look at horizon zero dawn which launched a bit before rtx 3080/3090, perfect candidate for a "driver fix", i think in this title rtx3080 distances the most from 2080ti, not so much in old games.
> Driver tricks will make RTX 3000 more and more appealing in new games.
> I agree *efikkan*, AMD sucks, they will launch a mediocre GPU with bad drivers at first and probably consume a lot of power, but, they won't handicap a good card to force you to buy a new generation because they already suck and they don't care much about this market, they care about the millions of orders from Microsoft and Sony much more than the very very few "enthusiast" that might buy these high end cards, that's why first drivers will be so bad you are lucky if you boot into windows, why bother for practically no money compared to what they sell to consoles.
> They are here to piss off Nvidia a bit but not really compete.



No. Just... no, sorry. You can leave this drivel elsewhere.



efikkan said:


> There are plenty of solid AIB models at the same MSRP as Nvidia's founders edition ($700).
> 
> 
> And I always say the truth is in good (real world) benchmarking, not in anecdotes about how many cores, GBs, ROPs, TMUs etc. _feels_ right.
> ...



You can keep repeating the blanket statements, but I'm referring specifically to the balance of this particular 3080. It's a typical Nvidia move to position it like this, and after the 670/680/770 and numerous other examples, I'm pretty capable of predictions in this regard.

The GPU has poor balance, like it or not, and $700 isn't exactly a steal for such a product.


----------



## medi01 (Oct 10, 2020)

GoldenX said:


> A decent Radeon release after so long?



If 5700 wasn't a decent release for you, neither will this one.


----------



## springs113 (Oct 10, 2020)

So much ignorance in this thread. One thing I loved, though, is that all the discussion was as civil as possible. Now, if anyone here understands marketing, then you know this: that was not AMD's best foot forward. I'd be worried if I were Nvidia (more so if that's a 64 CU card). People love to talk about track record (especially as AMD hasn't competed at the high end for years), but fail to realize AMD's recent track record: they've "jebaited" several times recently with Zen 2, the 5700 XT, the 5600 XT, and technically Zen 3.

Whether we agree or disagree, Lisa Su and co. have shown they can deliver on their promises, and I for one think AMD would be considered morons to hold a separate event just to release a "flop" of a GPU. It's clear to me, and a select few, that they've got something special on their hands and are being coy about it. Do you think the teaser at the end wasn't on purpose? They've been doing it all along. Look at the rumor mill when it came to Zen 3 vs RDNA2: it was clear to me what was prioritized and where. While Zen 3 is a big deal, RDNA2 is where the statement needed to be made.

IIRC no one here has mentioned that these cards are cheaper to make on 7 nm overall vs Ampere. These cards will be very efficient and priced well. AMD is also doing what very few companies have done: being honest. These benchmarks clearly show AMD used games they struggle with (both CPU and GPU). It seems to me they are showing worst-case scenarios to gauge the internet's reactions, and it's working. The GPU event was obviously last for a reason: they have the fastest CPUs in town and thus needed the GPUs to be tested with the best. This also shows off their ecosystem, and lastly surprises all the naysayers who can't seem to understand that they can compete. RDNA1 matched Turing's IPC, and Nvidia regressed in that department with Ampere.

A "halo" product ain't a damn 3070 competitor.
Why is that so hard for fanboys to understand?


----------



## nguyen (Oct 10, 2020)

springs113 said:


> Whether we agree or disagree Lisa Su and Co have shown that they can deliver on their promises and I for one clearly think that AMD would be considered morons to have a separate event to release a "flop" of a gpu.  It's clear to me and a select few that they've got something special on their hands and are being real coy about it.  Do you think the teaser at the end wasn't on purpose?  They've been doing it all along.   Look at the rumor mill when it came to Zen 3 vs RDNA2.  It was clear to me what was prioritized and where.   While Zen 3 is a big deal, RDNA2 is where the statement needed to made.
> 
> IIRC no one here mentioned that these cards are cheaper to make on 7nm overall vs Ampere.  These cards will be very efficient and priced well.   AMD also is doing what very few companies have done,  be honest.   These benchmarks clearly shows AMD used games they struggle with (both CPU & GPU).  Seems to me like they are showing worse case scenarios to gauge the internets reactions and it's working.  The GPU event was obviously last for a reason,  they have the fastest CPUs in town and thus needed the GPUs to be tested with the best.   This also shows their ecosystem and lastly to surprise all the naysayers who can't seem to understand that they can compete.  RDNA1 matched Turings IPC and Nvidia regressed in that department with Ampere.
> 
> ...



Fury X - Flopped
Vega - Flopped
Radeon VII - Flopped
AMD did have special events announcing all of those GPUs, mind you. The Radeon VII in particular flopped so hard it was dead on arrival.

Well, considering Navi 10 is only slightly faster than TU106 (the 2070), it would already be a remarkable improvement if Navi 21 is faster than GA104; I will give you that. GA102, however, is a different beast.

And Navi 21 is cheaper to produce than GA102? Do you know that AMD pays ~$9k per 7 nm wafer at TSMC, while Nvidia pays only ~$3k per 8N wafer at Samsung? GA102 is dirt cheap to produce; Nvidia has a 60%+ profit margin after all (last quarter was 66%).

Navi 21's best silicon will be reserved for prosumer cards, where AMD makes more profit. So expect cut-down Navi 21 competing with and beating GA104, which is still good.


----------



## Dyatlov A (Oct 10, 2020)

Will it have a good Vulkan driver for Red Dead Redemption 2, or will it freeze like the previous generation?


----------



## Luminescent (Oct 10, 2020)

Vayra86 said:


> No. Just:.. no, sorry. You can leave this drivel elsewhere.


I understand: some people can't comprehend that they can do that, handicap cards through drivers. Nah, impossible; corporations operating as a monopoly when they don't have any competition? This is nonsense, apparently.
While some people here can't get their heads around that, the majority don't see a reason to ever upgrade beyond a GTX 1060 (or something equivalent from AMD) until it's dead. The lack of good games makes the PC so unattractive, and the rare good ones are so well made they don't need an RTX 3080 or whatever AMD makes. Just look at the recent Star Wars: Squadrons, or previous ones like Jedi: Fallen Order; they run fine on a GTX 1060 and its AMD equivalent. Big publishers like EA or Blizzard don't care about GPU sales, and it really hurts Nvidia when they put something like that on the market. Imagine gaming on a four-year-old card with all settings turned up and top-notch graphics.


----------



## Mussels (Oct 10, 2020)

If they get stock before my 3080 eventually arrives (late NOVEMBER! FOR A PREORDER!), I'll just damn well switch to team red again.


----------



## okbuddy (Oct 10, 2020)

The 6900 XT is about 2% faster than the 3080, and $50 cheaper.

An OC'd 3080 is exactly 200% of a 5700 XT.
The 6900 XT is double a 5700 XT and then some.


----------



## Vayra86 (Oct 10, 2020)

Luminescent said:


> I understand, some people can't comprehend they can do that, handicap cards from drivers, neah, impossible, corporations operating as a monopoly when they don't have any competition, this is nonsense.
> While some people here can't get around this, the majority don't see a reason to ever upgrade beyond a GTX 1060 or something equivalent from Amd until it's dead,  lack of good games makes PC so unattractive and the rare good ones are so well made they don't need RTX 3080 or whatever Amd makes, just look at recent star wars squadrons or previous ones like jedi fallen order, runs fine on gtx 1060 and equivalent AMD, big publishers like EA or blizzard.... they don't care about GPU sales, it really hurts Nvidia when they put something like that on the market, imagine gaming on a 4 year old card with all settings turned up and graphics are top notch



Did it occur to you that I never even asked you to provide a source for these old and tired claims? And that you never provided one either?

This is because I have already seen them all, and none of them holds any water whatsoever. You can spare yourself the trouble. And again: you can either learn a thing or two here or you can live in fantasy land, but, again I'd say, do it elsewhere. I'll torch any nonsense I see, and so far you've been full of it.

Consider carefully what you might post next. This subject has been debunked a hundred times already; I'm not going there again.


----------



## dinmaster (Oct 10, 2020)

Did it occur to anyone that the numbers could be totally false? Made up? Or maybe it's one of their lower cards and not the best one. She mentioned "this is the Radeon 6000 series, which we now affectionately call Big Navi, thanks to the many of you who nicknamed it for us", but is it really Big Navi? Maybe. We call it Big Navi based on all the specs, but what if it's not? Anyway, just some thoughts on something we will find out more about in 18 days.


----------



## Vayra86 (Oct 10, 2020)

nguyen said:


> Bang for buck is only an excuse for lower performing products. Zen CPU has gone from best bang for buck to being premium CPU now. If AMD had the best GPU they would be pricing them accordingly. At the end of the day nobody want to get paid minimum wages  .
> 
> I reckon RDNA2 better has 30% more perf/dollar than Ampere in rasterization, only then it makes sense overlooking its lack of finesses in other area (RT/DLSS/HW Encoding/Drivers/Reflex)



Not so sure about that. Bang/buck, or just price/perf, is always a factor. Otherwise every 3080 buyer would have ordered a 3090 instead, right? The fact that they do not, but still spend $700-800 on a GPU, is directly related to bang/buck. Or are you now gonna say the 3080 is midrange?


----------



## nguyen (Oct 10, 2020)

Vayra86 said:


> Not so sure about that. Bang/buck or just price/perf is always a factor. Otherwise every 3080 buyer would have ordered a 3090 instead, right? The fact they do not but still spend 700-800 on a GPU is directly related to bang/buck. Or are you now gonna say 3080 is midrange



Oh, don't you worry, the e-peen crowd already gobbles up any 3090 stock they can find. I too would like a piece, but all the vendors in my country are price gouging the hell out of the 3080/3090, going as high as $400-500 over MSRP for both; somehow that is legal in third-world countries.
Not that there were few 2080 Ti buyers either, with the 2080 Ti registering almost 1% in the Steam hardware survey.
There are many more rich kids out there than you might think.


----------



## kingDR (Oct 10, 2020)

jesdals said:


> I could live with that if the price is below the 3080


Agree.


----------



## DuxCro (Oct 10, 2020)

lynx29 said:


> I just checked on AMD's bicycles. They are sold out as well, lmao.  wow.  just wow.


Probably because people think AMD bicycles have IPC (ImProved Cycling) over other bicycles.


----------



## Totally (Oct 10, 2020)

Luminescent said:


> I agree *efikkan*, AMD sucks, they will launch a mediocre GPU with bad drivers at first and probably consume a lot of power, but, they won't handicap a good card to force you to buy a new generation because they already suck and they don't care much about this market, they care about the millions of orders from Microsoft and Sony much more than the very very few "enthusiast" that might buy these high end cards, that's why first drivers will be so bad you are lucky if you boot into windows, why bother for practically no money compared to what they sell to consoles.
> They are here to piss off Nvidia a bit but not really compete.



Could we lay off the "bad drivers" and "power hungry" nonsense? Last I checked, Nvidia's cards have been ever so power hungry, and the 30-series launch and the current state of those cards are something to think about before pointing out the very same flaws in the competitor.


----------



## GoldenX (Oct 10, 2020)

medi01 said:


> If 5700 wasn't a decent release for you, neither will this one.


Black screens of death, no new features, once again marketed features that had to be disabled... how was it decent? It managed to be a worse launch than Turing's.


----------



## springs113 (Oct 10, 2020)

nguyen said:


> Fury X - Flopped
> Vega - Flopped
> Radeon VII - Flopped
> AMD did have special events annoucing all of those GPU mind you. Radeon VII in particular flopped so hard it was dead on arrival.
> ...


You seem to be one of the main ones who can't comprehend. Do you know memory costs as well? A GPU does not consist of just a die. According to MLID, who seems to know more than both of us, Nvidia is making maybe $50-60 on each 3080. Sounds like an expensive-ass GPU to me. It does not, and will not, cost AMD only $50 below MSRP to make their cards. With winter approaching, I guess one of the bigger justifications for current Ampere is: no space heater needed. 

The Fury X did not flop; loud-mouthed AIBs leaked word to Nvidia, which allowed Nvidia to drop its pricing and rain on the Fury X's parade, so your argument there is a wash.

Funny how you forget the 290X vs Titan fiasco... I won't mention who took that L.

Vega didn't flop, it was just late, and judging by AMD's predicament at the time, Zen was the more important product (now we're about to reap the reward for Vega's sacrifice). It's also funny that Vega is as good as its Nvidia counterpart now. But if you want to call Vega an L, I'll give you that. 

The Radeon VII was never meant for the gaming segment; Navi had a bug and was going to be delayed, and they needed a "tide me over" card. And if the Radeon VII was trash, then so was the 2080... at least the Radeon VII could be used for compute.
RDNA1 (the 5700 XT) was a good bit faster, enough that Nvidia had to do a refresh. It did all this while being substantially smaller. 

Now to the present: if RDNA2 is so bad, why did Nvidia rush a launch? (No stock till 2021.) Why did they push out those space heaters? The xx80 chip is now back on the big die. Hmm, I wonder. It seems Nvidia knows more about AMD than you or I, and they clearly jumped the gun. The 3090 is just two SMs shy of the full die, and it is only a mere 10% faster than the 3080, all while being basically double the cost or more. That's doom and gloom in my book. If AMD wants the crown, it's right there for the taking. If I were them, I'd look to take both the CPU and GPU crowns in the same month. Why did Nvidia delay the 3070 and then give that BS excuse? They wanted to one-up AMD the day after. If you think AMD wasn't ready this go-around, you're delusional. Too many irrational decisions by Nvidia this go-around, and it says one thing to me: AMD has arrived.


----------



## Icon Charlie (Oct 10, 2020)

GoldenX said:


> Black screens of death, no new features, once again marketed features that had to be disabled... How was it decent? It managed to be worse than Turing's.



I wouldn't say worse than Turing, as I never had an issue with my 5700. As a matter of fact, my 5700 turned out to be a pretty good video card. It's my 5700 XT that turned out to be utter crap as far as heat issues go. The customer is not supposed to be handed a card they have to undervolt because of how it's built. This applies to both Nvidia and AMD, and both have been caught in marketing lies.

Overall, IMHO, if you've got last generation's video card, you do not need this generation's. I'll even go further and say that if you've got any video card from 2016 onwards, you do not need a new one, because the tech advances you need to play your favorite games will not amount to much of an increase.

And contrary to what the tech talking heads want to say, the world does not revolve around 4K. It is 1080p and less.

So enjoy what you have, buy only what you absolutely need, and save some money in the end.


----------



## Shatun_Bear (Oct 10, 2020)

Luminescent said:


> I think Nvidia estimated correctly what AMD will launch based on what they did with the consoles and they pushed the hell out of that silicon reaching a new record of GPU power consumption.
> If you now build a computer with a 10900k and RTX 3080 you can skip buying a heater in the winter, these two can actually heat up the room .



I know this is a semi-joke, but that is the first combination of components in a PC that could genuinely suffice as a heater during the winter when gaming.

500 W is the same as my small eco electric office heater. The heat a 500 W PC produces can't vanish into thin air, of course; it gets soaked up by your cooler's fins until it's spat out into your room.
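
For a sense of scale, taking the 500 W figure above at face value (the rest is just unit conversion, since essentially all the electrical power a PC draws ends up as heat in the room):

```python
# Convert sustained electrical draw into heater-equivalent units.
BTU_PER_HOUR_PER_WATT = 3.412142  # standard W -> BTU/h conversion factor

def watts_to_btu_per_hour(watts):
    return watts * BTU_PER_HOUR_PER_WATT

pc_draw_w = 500  # figure quoted in the post above
print(f"{pc_draw_w} W is about {watts_to_btu_per_hour(pc_draw_w):.0f} BTU/h")
print(f"A 4-hour session dumps {pc_draw_w * 4 / 1000:.1f} kWh of heat into the room")
```

That puts a 500 W gaming load in the same ballpark as a small plug-in space heater, which is exactly the comparison being made.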


----------



## evernessince (Oct 10, 2020)

RedelZaVedno said:


> 2080TI +21% is not bad at all, IF pricing is right. IF AMD makes 12GB and 16GB variants and price them $499 and $549, it would be blast of a launch.  On the other hand IF AMD decides to be greedy like with Zen3 announcement and price them $649 and $699, then it's gonna be DOA. Remember Radeon 7 debacle, competing with 1080TI 2 years later and charging $700? I'm afraid AMD will shoot itself in the foot once more and go with the suicidal greed option. I can still remember Lisa trying to sell us Polaris replacement (Navi 1) for 499 first and then for 449. Greed is infectious as hell:
> View attachment 171323



Are you talking about that time AMD jebaited Nvidia and then lowered their price?  That's not greed, it's AMD playing Nvidia for once.  Not to mention, it's a $50 price difference.  If you want to talk about greed, let's talk about the $500 price increase from the 1080 Ti to the 2080 Ti, Nvidia's past staggered launches (the "flagship" 980 followed shortly by the 980 Ti), or the GeForce Partner Program.

I'm not for any price increases, but the double standard being applied is hilarious: $50 makes AMD greedy fucks, while $500 apparently doesn't even register for Nvidia.


----------



## Blueberries (Oct 11, 2020)

TIL AMD fanboys actually believe the "jebait" happened


----------



## Nkd (Oct 11, 2020)

john_ said:


> I guess the dilemma will be like this:
> 
> 3080: Faster, Raytracing performance, DLSS, CUDA, (theory or fact you decide) better drivers
> 
> ...



There are multiple SKUs on the big Navi chip. It's long been rumored, with model numbers showing up in drivers, that there are a few SKUs for the Navi 21 chip, so this is Big Navi, just not necessarily the biggest Navi. It could go either way, but I wouldn't be surprised if this is the SKU right below the top one: 72 CUs.



nguyen said:


> Fury X - Flopped
> Vega - Flopped
> Radeon VII - Flopped
> AMD did have special events annoucing all of those GPU mind you. Radeon VII in particular flopped so hard it was dead on arrival.
> ...



AMD did not have a special event for the Radeon VII. It was at the tail end of another event, and no one expected it.  You sound very sure, like you are Lisa Su, lol.



Vayra86 said:


> I think they get enough information and have enough insight to make a pretty educated guess.



Yeah, they must have had some guess; that's why they made a power-hungry card with no OC headroom. They probably had an idea what AMD is targeting, but not exactly. Videocardz, RedGamingTech, Not An Apple Fan, and Moore's Law Is Dead have all basically confirmed there is a TOP card that absolutely no AIB has wind of. AMD is keeping the top chip in-house, reference-only, until launch. So no one knows about the biggest chip that will be launched, because AMD has kept it so close to the chest.



nguyen said:


> Fury X - Flopped
> Vega - Flopped
> Radeon VII - Flopped
> AMD did have special events annoucing all of those GPU mind you. Radeon VII in particular flopped so hard it was dead on arrival.
> ...



Yield, my friend. Let's say your numbers are 100% accurate for what a wafer costs: that $9k wafer still yields more chips. Seriously, your math ignores everything that's true here. AMD will pump out more chips per wafer even if they paid more per wafer. Look at Nvidia struggling to make chips. And the FE model will not be sold at the Nvidia store, lol. The rumor was true that Nvidia won't be making many of those and will let AIBs sell cards at higher prices, since the FE model costs too much to make. Now all of that is coming true.

GA102 is not dirt cheap to produce when the chips are so damn big. Come on now, lol. You are assuming 100% yield and that Nvidia fits more chips on a wafer than AMD. Come on now; no matter how you cook it, AMD gets more chips from a smaller die on a better node and comes out cheaper overall.

Oh, by the way, Nvidia is not paying $3k per wafer; it's more like close to $6k. So yes, they are making a profit, sure, but not so much on the FE model, it seems, since they are now only selling it through Best Buy in the US, and the rumors are coming true that they don't want to make too many of those.
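
The wafer-price argument is easy to sanity-check with a back-of-the-envelope model. Everything below is an assumption for illustration (the wafer prices are the figures argued back and forth in this thread, the die sizes are the commonly reported ~520 mm² for Navi 21 and ~628 mm² for GA102, and the defect density is a guess), using the classic gross-dies-per-wafer approximation and a simple Poisson yield model:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic approximation for gross dies on a round wafer:
    wafer area / die area, minus an edge-loss term.
    Ignores scribe lines and edge exclusion details."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defect_density_per_cm2):
    """Poisson yield model: Y = exp(-A * D0), with A in cm^2."""
    return math.exp(-(die_area_mm2 / 100) * defect_density_per_cm2)

# Assumed inputs -- disputed/rumored figures, for illustration only.
chips = [("Navi 21 (TSMC N7)",   9000, 520),   # (name, wafer $, die mm^2)
         ("GA102 (Samsung 8N)",  6000, 628)]
DEFECT_DENSITY = 0.1  # defects per cm^2, pure assumption

for name, wafer_usd, area in chips:
    gross = dies_per_wafer(300, area)          # 300 mm wafer
    good = gross * yield_rate(area, DEFECT_DENSITY)
    print(f"{name}: ~{gross} gross dies, ~{good:.0f} good dies, "
          f"~${wafer_usd / good:.0f} per good die")
```

Under these assumptions the smaller die on the pricier wafer lands in the same cost-per-good-die ballpark as the bigger die on the cheaper wafer, which is the point being argued: wafer price alone doesn't settle it; die size and yield matter just as much.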









Why Did the NVIDIA RTX 3000 Arrive in 8nm with Samsung? | ITIGIC (itigic.com)
Until the very last moment, until the last day before the presentation, there was debate about the fact that NVIDIA could have used the 7nm EUV manufacturing process to get ahead of AMD. As we saw, they finally had to opt for the 8 nm and this is a very important performance handicap compared to...


----------



## Th3pwn3r (Oct 11, 2020)

The best thing AMD can do is release their video cards and take away 3080 sales while Nvidia has no available stock. Going off AMD's history, I shouldn't have high hopes for Big Navi, but I'm always hoping AMD, Nvidia, or Intel has something great coming out. That's how unbiased enthusiasts are, us guys who go way back, back when Al Gore invented the internet.


----------



## evernessince (Oct 11, 2020)

nguyen said:


> Fury X - Flopped
> Vega - Flopped
> Radeon VII - Flopped
> AMD did have special events annoucing all of those GPU mind you. Radeon VII in particular flopped so hard it was dead on arrival.
> ...



None of those GPUs were really designed for gaming, though, and the Fury X was made because it was the only thing the graphics department could put together under Rory Read, who wanted to move AMD away from high-end graphics altogether.

Vega and the Radeon VII are based on the Vega architecture, which is clearly focused on professional and compute workloads.  They have a large part of the die dedicated to the HBCC, which does nothing for gaming.  The only upside to the VII cards is that they still sell for $700, because they do very well in professional workloads.

In essence, "Big Navi" will be AMD's first return to high-end gaming graphics cards in a LONG time.



GoldenX said:


> Black screens of death, no new features, once again marketed features that had to be disabled... How was it decent? It managed to be worse than Turing's.



I suggest you read up on Navi before saying it has no new features: https://www.techpowerup.com/256660/...ecks-ryzen-3000-zen-2-radeon-rx-5000-navi-etc

Among the features is localized L1 cache sharing.  Sharders in a group can share content among the L1 caches.  This not only increases effective bandwidth and cache size (by removing the need for duplicated data) but reduces latency and cache misses.  I would not be surprised if RDNA2 enables sharing across shader groups.  Anything that spares the GPU from having to access the L2 cache, which is significantly slower to access than L1.

Given the boost in performance Navi has over prior AMD GPUs, and that it brought AMD on par with Nvidia efficiency-wise, I would certainly not call it a Turing.  Just because AMD only released the mid-range certainly doesn't make it bad.

Also, what is your source for marketed features that had to be disabled?  That was only Vega as far as I'm aware, and I've looked over all the marketing slides AMD has provided.


----------



## GoldenX (Oct 11, 2020)

I'm not totally sure architecture changes can count as features. Sure, that's the secret behind the performance uplift, but AMD still lacks variable rate shading, mesh shaders, and ray tracing.
Primitive shaders are also in hardware in RDNA1, same as Vega, and were never enabled. Imagine developing a whole new stage in the graphics pipeline, _and never using it._


----------



## Deleted member 190774 (Oct 11, 2020)

Blueberries said:


> TIL AMD fanboys actually believe the "jebait" happened


Perhaps you're young - after all, you did just use 'TIL', which, after Googling it, I've just learned means 'Today I learned'. Perhaps you're very insightful, or perhaps naive...

Either way, you'd have to be living under a rock to miss the mind-games that are going on in the industry right now. As such you can choose to believe that Lisa Su just chucked her hand in and said this is all we've got - at an event that was about CPU launches, or you can be open to the possibility that AMD is playing the game, and leaving something in the tank for the main GPU launch event in a few weeks.

If they had nothing to challenge Nvidia, I believe they would have teased a 'wait and see' line. Of course, there is also the possibility that AMD are softening the blow.

In my job, which involves being very analytical, I would fail dismally every time if I made rash decisions without considering all possibilities. Then again, I trust my gut instinct.

I'm not saying you're wrong in terms of the eventual outcome, but I am saying you're wrong to mock and be definitive in your opinion. Time will tell.


----------



## GoldenX (Oct 11, 2020)

Something I've learned after being disappointed in AMD's marketing for years is that the more quiet they are, the better the product tends to be.


----------



## INSTG8R (Oct 11, 2020)

GoldenX said:


> Something I've learned after being disappointed in AMD's marketing for years is that the more quiet they are, the better the product tends to be.


Well this one has definitely been kept under wraps and had the absolute minimum of hype in recent memory.


----------



## GoldenX (Oct 11, 2020)

That's my only hope for this release, after all the lame products we had to tolerate.


----------



## Blueberries (Oct 11, 2020)

beedoo said:


> Perhaps you're young, after all you did just use 'TIL' - which after Googling it, I've just learned means 'Today I learned'. Perhaps you're very insightful, or perhaps naive...
> 
> Either way, you'd have to be living under a rock to miss the mind-games that are going on in the industry right now. As such you can choose to believe that Lisa Su just chucked her hand in and said this is all we've got - at an event that was about CPU launches, or you can be open to the possibility that AMD is playing the game, and leaving something in the tank for the main GPU launch event in a few weeks.
> 
> ...



The "jebait" people are referring to is last year, when AMD purportedly pretended that they planned to drop prices after the release of the Super cards and "tricked Nvidia" by launching the 5700 at a higher MSRP than they intended.

But no, I'll call their bluff on this one too. They've got nothing impressive in the GPU space. They blew Intel out of the water in HEDT, but Big Navi is a flop.


----------



## INSTG8R (Oct 11, 2020)

Blueberries said:


> The "jebait" people are referring to is last year, when AMD purportedly pretended that they planned to drop prices after the release of the Super cards and "tricked Nvidia" by launching the 5700 at a higher MSRP than they intended.
> 
> But no, I'll call their bluff on this one too. They've got nothing impressive in the GPU space. They blew Intel out of the water in HEDT, but big Navi is a flop.


It's a flop based on what? I'm curious what benchmarks and specs you have to share to have come to this conclusion three weeks before anyone else really knows anything.


----------



## nguyen (Oct 11, 2020)

evernessince said:


> None of those GPUs were really designed for gaming, though, and the Fury X was made because it was the only thing the graphics department could put together under Rory Read, who wanted to move AMD away from high-end graphics altogether.
> 
> Vega and Radeon VII are based on the Vega architecture, which is clearly focused on professional and compute work.  They have a large part of the die dedicated to the HBCC, which does nothing for gaming.  The only upside to the VII cards was that they still sell for $700, because they do very well in professional workloads.
> 
> In essence, "Big Navi" will be AMD's first return to high-end gaming graphics cards in a LONG time.



Yeah, Navi 21 selling at $600 ±$50 can already be called high-end gaming.
The elephant in the room here is that AMD would like to make a profit that can later turn into R&D budget, and AMD can't do that if they go into a price war against Nvidia (who already maintain godly profit margins). Therefore the only thing AMD can do is slot their GPUs into the gaps between Nvidia's price brackets, selling as many as they can while maintaining as high a profit margin as possible (5700/5700 XT ring any bells?).

Nvidia already knew the 3080/3090 would have no direct competition; the 3070 is the one Nvidia is prepping to counter Big Navi. If AMD decides to price Navi 21 at $500, then Nvidia would have to lower the 3070's price accordingly, though I doubt AMD would do that.

Overall, Nvidia is touting the 3080 as their flagship gaming GPU, so there is no shame in Navi 21 being slightly slower. Remember that Nvidia only needed TU104 to beat AMD's flagship before.


----------



## PooPipeBoy (Oct 11, 2020)

INSTG8R said:


> It’s a flop based on what? I’m curious what benchmarks and  specs you have to share to have come to this conclusion 3 weeks before anyone else really knows anything.



A misinformed opinion, nothing more. He's a fool for making such a sweeping judgement at this early point in time. Not worth the time of day.


----------



## INSTG8R (Oct 11, 2020)

PooPipeBoy said:


> A misinformed opinion, nothing more. He's a fool for making such a sweeping judgement at this early point in time. Not worth the time of day.


I mean, TPU shouldn't even be using their own charts for comparison - different API, different quality settings, and not the built-in benchmark AMD used. It just adds more FUD and speculation.


----------



## Blueberries (Oct 11, 2020)

INSTG8R said:


> It’s a flop based on what? I’m curious what benchmarks and  specs you have to share to have come to this conclusion 3 weeks before anyone else really knows anything.



What do you have to say it isn't?

I don't have any benchmarks to say Zen 3 will beat TGL either, but you don't seem to have a problem with that speculation.


----------



## INSTG8R (Oct 11, 2020)

Blueberries said:


> What do you have to say it isn't?
> 
> I don't have any benchmarks to say Zen3 will beat TGL either but you don't seem to have a problem with that speculation


So we've gone from your flagrant speculation to shifting focus onto equally flagrant speculation about a completely different product in a completely different hardware space, with about as much info to go on as your Navi "flop". Pick a lane....


----------



## R0H1T (Oct 11, 2020)

Based on the IPC numbers AMD gave, it's a reasonable guesstimate: TGL will win some, while Zen 3 should win most. CPU benchmarks are mostly predictable, with very few outliers, and given it's the *same x86* based *uarch*, I doubt we'll see too many surprises. With GPUs you can already see how Ampere does vs Turing at lower resolutions - even a minor change can make a huge difference in performance, and at different resolutions there's no guarantee the same trend will apply!


----------



## INSTG8R (Oct 11, 2020)

R0H1T said:


> Based on the IPC numbers AMD gave, it's a reasonable guesstimate: TGL will win some, while Zen 3 should win most. CPU benchmarks are mostly predictable, with very few outliers, and given it's the *same x86* uarch, I doubt we'll see too many surprises. With GPUs you can already see how Ampere does vs Turing at lower resolutions - even a minor change can make a huge difference in performance, and at different resolutions!


But we've moved the goalposts to unreleased mobile parts now?...


----------



## R0H1T (Oct 11, 2020)

I'm assuming he's talking about absolute IPC-based numbers, so ISO performance, and it's only a fair comparison in that regard. If we want to talk apples to apples - mobile SoCs - then it's a hard sell, because Zen 3 mobile will have much less cache and probably a monolithic die. Unless you meant he's changing tack from GPU to CPU, in which case you're right.


----------



## INSTG8R (Oct 11, 2020)

R0H1T said:


> I'm assuming he's talking about absolute IPC based numbers, so ISO performance & it's only a fair comparison in that regard. If we want to talk apples to apples, mobile SoC, then it's a hard sell because zen3 mobile will have much less cache & probably monolithic die.


But he came in ranting with authority that Navi is a flop. I don't give a flying fig about the mobile space or speculation on another unreleased product. I want to hear why Navi is a flop, not how Intel might finally pull themselves out of the hole they've dug in a hardware space completely unrelated to the topic at hand.


----------



## jamexman (Oct 11, 2020)

jesdals said:


> I could live with that if the price is belowe the 3080



Can you live with their trash drivers though?


----------



## HD64G (Oct 11, 2020)

As many have written, AMD hype was minimal to none this time. Navi 10 was an unknown quantity until its official reveal, as Big Navi is now, and that first one turned out better than expected. A repeat, maybe?


----------



## ZoneDymo (Oct 11, 2020)

jamexman said:


> Can you live with their trash drivers though?



If you can live with Nvidia's trash drivers, you can live with AMD's trash drivers.


----------



## INSTG8R (Oct 11, 2020)

jamexman said:


> Can you live with their trash drivers though?


Cant say I’ve had any show stopping drivers on my 5700XT and I have a Freesync HDR monitor to boot so I have all the variables. Just Enhanced Sync being broken is all that comes to mind


----------



## jesdals (Oct 11, 2020)

jamexman said:


> Can you live with their trash drivers though?


I've never used anything other than ATI/AMD since the Rage Fury, so I do live fine with them - but I'm not so much of a FanATIc that I didn't consider the 3090. Due to lack of availability, I decided to wait for Big Navi before making a choice. Using a Radeon VII, I do know about poor drivers - on the other hand, there have been the same issues over the years with the green cards.


----------



## evernessince (Oct 11, 2020)

GoldenX said:


> I'm not totally sure architecture changes can count as features. Sure, that's the secret behind the performance uplift, but AMD still lacks variable rate shading, mesh shaders, and ray tracing.
> Primitive shaders are also in hardware in RDNA1, same as Vega, and were never enabled. Imagine developing a whole new stage in the graphics pipeline, _and never using it._



RDNA2 already has all of those features confirmed.

RDNA2 supports primitive shaders out of the box as well.

All I have to say is wait for the benchmarks.


----------



## INSTG8R (Oct 11, 2020)

evernessince said:


> RDNA2 already has all of those features confirmed.
> 
> RDNA2 supports primitive shaders out of the box as well.
> 
> All I have to say is wait for the benchmarks.


Basically, to be compliant with DX12 Ultimate, or whatever it will be called, they have to have most of that list.


----------



## GoldenX (Oct 11, 2020)

Makes me wonder what the whole point of releasing RDNA1 was - something with feature parity with Vega, so already obsolete at launch.
Primitive shaders are dead by now; both Microsoft and Khronos went for mesh shaders. Two generations of products marketed to have something that never worked and never will.


----------



## INSTG8R (Oct 11, 2020)

GoldenX said:


> Makes me wonder what the whole point of releasing RDNA1 was - something with feature parity with Vega, so already obsolete at launch.
> Primitive shaders are dead by now; both Microsoft and Khronos went for mesh shaders. Two generations of products marketed to have something that never worked and never will.


AMD is always chasing the latest features but sometimes they seem to lose focus.


----------



## Blueberries (Oct 11, 2020)

INSTG8R said:


> So we've gone from your flagrant speculation to shifting focus onto equally flagrant speculation about a completely different product in a completely different hardware space, with about as much info to go on as your Navi "flop". Pick a lane....



No, I'm speculating just like yourself, but for some reason you think the burden of proof is on *me* to disprove your theory that they're hiding a more powerful GPU behind a curtain. You're the one throwing out flagrant speculation; I'm being realistic.


----------



## cueman (Oct 11, 2020)

Just my own research, calculations, and comparisons...


Well, I can't see the RX 6900 XT getting even close to RTX 3080 performance. I'd say the RTX 3080 10/20 GB is 20-25% faster.


AMD used a 5900X 12-core CPU in its teaser video. If an RTX 3080 were tested with that CPU, it would help too, because one of the things limiting/bottlenecking those GPUs is the CPU, and the 12-core 5900X helps.

Also, the RX 6900 XT has 16 GB of memory, which helps precisely in high-resolution gaming - like the 4K tests in AMD's teaser video.


Still, the difference is quite clear, even in AMD's own cherry-picked games - an odd choice from them, because almost no test site uses those games... hmm. Nearly impossible to compare. Is that what AMD wants, or why?

Come on, AMD... lol.


Summary:

An RTX 3080 20 GB paired with a 5900X, or with an Intel Rocket Lake 11th-gen CPU (coming in Q1 2021, by the way), gains, I dare say, another 5-10%.

So the difference between the RX 6900 XT and the RTX 3080 10 GB/20 GB is 20-25% - in practical terms, RX 6900 XT at 50 FPS vs RTX 3080 at 60 FPS.

I feel I'm doing Big Navi a favor with that estimate... lol.

Well, let's see.


Finally, I can't believe AMD's 300 W TDP for the RX 6900 XT. Sure, AMD can say the TDP is 300 W, but that seems impossible in an actual gaming session...

Because even a single RX 5700 XT - its fastest variant, the Sapphire Nitro+ SE AIB card - has an official TDP of 285 W and measured 285 W in "average gaming" (yes, in TechPowerUp's test), with a peak of 296 W... wow.

That's more than some RTX 2080 Ti AIB versions eat: the MSI RTX 2080 Ti Lightning averages 279 W in gaming - 6 W less than the RX 5700 XT Sapphire Nitro+ SE - yet the 2080 Ti Lightning is 44% faster at 4K... hmm.

(TechPowerUp's test again - thanks.)


So I don't think an RX 6900 XT, which is practically two 5700 XTs in CrossFire mode, can hit the same values... lol, no way.

We must remember there is no big difference between RDNA2 and RDNA1 on the most important points: both use a 7nm core and both use GDDR6 memory. RT support also eats juice - 25 W?

I'd dare say 360 W average gaming for the RX 6900 XT, peaking near 400 W - this IF the RX 6900 XT gets even within 20% of the RTX 3080's speed.

I say this because there are several tests on the internet of RX 5000 GPUs in CrossFire mode; for example, a CrossFire test of an RX 5600 XT and an RX 5700 measured 464 W - yes, 464 W. Google it and you'll find it.

If the RX 6900 XT loses by about 35%, then I'd believe a 300 W TDP.

Today's top GPUs simply need a lot of power.


We must also remember that the RX 5700 XT is already a 7nm GPU, so AMD can't save much more power there. The RTX 3080 is an 8nm GPU, while its little brothers in the RTX 2000 series were made on 12nm - the difference is big, a roughly 50% smaller die, and that's a lot!

So when we see that the RTX 3080 has amazing performance for its TDP - 303 W measured in average gaming - that's excellent.

Well, we'll see soon.

The RTX 3000 series really earns all its glory, yes.


Also, when the RTX 3070 and RTX 3060 Ti come out (I heard the 3070 Ti is cancelled?), I bet those GPUs will overclock very well, and by my calculations at least an RTX 3070 AIB Lightning would be a hard case for the RX 6900 XT... maybe. What's sure is that it will be a lot cheaper - by $200, I guess.

Well, let's see that too, soon.


P.S. The RTX 3090 is its own category, made for 8K gaming; the RTX 3080 10 GB/20 GB for 4K gaming; the RTX 3060 Ti and RTX 3070/Super for FHD and WQHD gaming; the RTX 3070 Super (16 GB) also for 4K gaming.


Third GPU:

But where is the Intel Xe GPU?!

It would be great to see a three-way battle line!


RX 6900 XT price:

I can't believe AMD would sell the RX 6900 XT under the RTX 3080's price - a little more, I'd say $699-799.


Notice: AMD needs cash more than Nvidia does. It can't keep running an overdraft forever on its zero to 2-3% profit price politics.


A great end of the weekend to all!


----------



## INSTG8R (Oct 11, 2020)

cueman said:


> Just my own research, calculations, and comparisons...
> 
> 
> Well, I can't see the RX 6900 XT getting even close to RTX 3080 performance. I'd say the RTX 3080 10/20 GB is 20-25% faster.
> ...


There's so much garbage in that post that I'll just correct one piece of misinformation in that wall of misinformation: I have a Sapphire 5700 XT Nitro+, and unless you're raising the power limit it's never over 220 W. There was a "special edition" that is irrelevant, since the boost clocks were adjusted, making its minor OC nothing. Even if it were using more power, it would still never be over 235 W unless forced.


----------



## Deleted member 190774 (Oct 11, 2020)

@cueman; remember, mate, Winners don't do drugs!


----------



## Vayra86 (Oct 11, 2020)

nguyen said:


> Oh don't you worry, the e-peen crowd already gobble up any 3090 stock they can find . Well I too would like a piece but all the vendors in my country are price gouging the hell outta 3080/3090, going as high as +400-500usd over MSRP for both, somehow that is legal in third world countries.
> Not that there were few 2080 Ti buyers either, with 2080 Ti registering almost 1% in steam hardware survey.
> There are many more rich kids out there than you might think



That is just a whole lot of cognitive dissonance on your part. 1% is as niche as it gets.


----------



## minami (Oct 11, 2020)

It is interesting that SONY adopted a primitive shader for PS5.


----------



## INSTG8R (Oct 11, 2020)

Blueberries said:


> No, I'm speculating just like yourself but for some reason you think the burden of proof is on _*me* to disprove your theory that they're hiding a more powerful GPU beyond a curtain. You're the one throwing flagrant speculation and I'm being realistic._


Where anywhere on this forum have I made any claims, let alone the outrageous claims you've made? We've seen one short benchmark run and some infamous AMD graphs (yes, I think their graphs are always hilarious). So apparently, from that small amount of information, you can declare it a flop without any hard proof, while accusing me of claiming, I guess, the opposite of your flop - an all-out victory? I don't see anybody making those kinds of claims but you, all while denying any burden of proof for your own theory.


----------



## nguyen (Oct 11, 2020)

Vayra86 said:


> That is just a whole lot of cognitive dissonance on your part. 1% is as niche as it gets.



Again with the insults already? Do you know what size Steam's player base is?
Even the 5700 XT has not reached 1% yet - talking about cognitive dissonance?


----------



## Razbojnik (Oct 11, 2020)

So basically a 3070, only more expensive and with worse drivers... mkay.


----------



## Vayra86 (Oct 11, 2020)

nguyen said:


> Again with the insults already ? do you know what size the player base of Steam is ?
> Even the 5700 XT has not reached 1% yet, talking about cognitive dissonance ?



Insult? Observation. Are you saying 2080ti is doing 1% on Steam as well? Source?


----------



## nguyen (Oct 11, 2020)

Vayra86 said:


> Insult? Observation. Are you saying 2080ti is doing 1% on Steam as well? Source?



If you had half a brain you would know where to find it.
So the 2080 Ti is so niche, yet it still outsold the 5700 XT - talk about cognitive dissonance.


----------



## Vayra86 (Oct 11, 2020)

nguyen said:


> If you had half a brain you would know where to find it.
> So the 2080 Ti is so niche, yet it still outsold the 5700 XT - talk about cognitive dissonance



Again... source?! You're the one presenting cool facts here, don't run away now. I'm not going to search for your realities...

Here is what I see. Perspective matters: as of October the share is 2% for just the 5700 XT and 1.29% for the 2080 Ti. So quite a gap still, and it's growing - 61% *more* for the 5700 XT.

UserBenchmark: AMD RX 5700-XT vs Nvidia RTX 2080-Ti

Then we take a decent high-end card for some perspective: the 2070S, at 3.38% - nearly 3x as many cards as the 2080 Ti.

UserBenchmark: AMD RX 5700-XT vs Nvidia RTX 2070S (Super)

So... come again?


----------



## nguyen (Oct 11, 2020)

Vayra86 said:


> again... Source?! Youre the one presenting cool facts here, dont run away now. Im not going to search for your realities



Can't read? Too poor for an education?
What part of "almost 1% of the Steam Hardware Survey" is too hard to understand?
Steam had a player base of 90 million active users as of April 2019; 1% of that means 900,000 players own a 2080 Ti. Yeah, it's niche alright.


----------



## Vayra86 (Oct 11, 2020)

nguyen said:


> Can't read ? too poor for education ?
> What part of "almost 1% of Steam Hardware Survey" is too hard to understand ?
> Steam has a player base of 90 millions active users as of april 2019, 1% of that means 900 000 players own 2080 Ti. Yeah it's niche alright.



Don't dig yourself a deeper hole; read the above.

Besides... 900K out of 90 million is still a niche - it's still 1%, and that means there's a whopping 99% not using said card. I'm not sure what education you've enjoyed, but I do think I prefer mine. I think cognitive dissonance is the perfect term here, and again it's NOT an insult; I'm trying to get across a point that simple stats should have already shown you.

You also completely gloss over the fact that the 5700 XT is competing with numerous other cards in its performance and price bracket - even the past two generations have cards that match it. The 2080 Ti does not.


----------



## INSTG8R (Oct 11, 2020)

Vayra86 said:


> Don't dig yourself a deeper hole; read the above.
> 
> Besides... 900K out of 90 million is still a niche - it's still 1%, and that means there's a whopping 99% not using said card. I'm not sure what education you've enjoyed, but I do think I prefer mine. I think cognitive dissonance is the perfect term here, and again it's NOT an insult; I'm trying to get across a point that simple stats should have already shown you.


I mean, without even checking, I know the 1060 is the most used GPU going by the Hardware Survey. The conclusions you can draw from that can be interpreted differently, I suppose, but it tells me the average user runs an average card.


----------



## nguyen (Oct 11, 2020)

Vayra86 said:


> Don't dig yourself a deeper hole; read the above.
> 
> Besides... 900K out of 90 million is still a niche - it's still 1%, and that means there's a whopping 99% not using said card. I'm not sure what education you've enjoyed, but I do think I prefer mine.



Certainly less niche than all AMD cards but the RX 570/580. Bravo for calling AMD fans out.
Being the 25th most popular out of hundreds of GPUs is niche.
Yup.
Tell me who has cognitive dissonance again?


----------



## Vayra86 (Oct 11, 2020)

INSTG8R said:


> I mean without even checking I know the 1060 is the most used GPU using the Hardware Survey as a metric. The conclusions you can draw from that can be interpreted differently I suppose but it tells me the average user runs an average card.



People forget Steam is polluted with tons of dead accounts, and legacy GPUs and IGP.

Userbenchmark is more accurate as it reflects actual run benches per time frame.



nguyen said:


> Certainly less niche than all AMD cards but the RX 570/580. Bravo for calling AMD fans out.
> Being the 25th most popular out of hundreds of GPUs is niche.
> Yup.
> Tell me who has cognitive dissonance again?



Are you that dense, or unable to swallow your pride?! The facts are clear, and this was never an AMD pissing contest - this was about how niche a top-end GPU like the 2080 Ti, and now the 3090, is. Remember?!

Wow, man. Just wow. It's very clearly you. Just stop here; it won't end well.


----------



## nguyen (Oct 11, 2020)

Vayra86 said:


> People forget Steam is polluted with tons of dead accounts, and legacy GPUs and IGP.
> 
> Userbenchmark is more accurate as it reflects actual run benches per time frame.
> 
> ...



Explain how UserBenchmark is more trustworthy than Steam? Because their users are verified? People can't do duplicate benchmarks?
Talking about digging graves 

Out of hundreds of GPUs, the 2080 Ti being the 25th most popular is considered niche. See how flawed your reasoning is?

From https://dictionary.cambridge.org/dictionary/english/niche

Niche:
interesting to, aimed at, or affecting only a small number of people

Somehow you can't distinguish between a "small number" and a "small percentage", can you?
Glad I wasn't any part of your education


----------



## INSTG8R (Oct 11, 2020)

Vayra86 said:


> People forget Steam is polluted with tons of dead accounts, and legacy GPUs and IGP


While I agree - that would be back when Intel iGP was top dog! At least the 1060 makes for a sensible majority card. I have never once used UserBenchmark; I just run 3DMark on driver changes for comparisons.


----------



## Totally (Oct 11, 2020)

Idc either way; I don't plan to upgrade this cycle. Unless those numbers have merit and they manage to make it a sub-300 W card - sub-250 W if I were to be greedy, but that's unlikely since Nvidia has pushed the envelope by setting up camp at 400 W+. AMD is going to be like "400 W? Don't mind if we do," which I feel is likely to be the case.



beedoo said:


> @cueman; remember, mate, Winners don't do drugs!



Unless it's tiger blood. Then for sure you know that you're "winning!".


----------



## Zach_01 (Oct 11, 2020)

cueman said:


> Just my own research, calculations, and comparisons...
> 
> 
> Well, I can't see the RX 6900 XT getting even close to RTX 3080 performance. I'd say the RTX 3080 10/20 GB is 20-25% faster.
> ...


So much misinformation in that post, I don't know where to begin...
I have an MSI 5700 XT Gaming X AIB card; it hits a 2080 MHz core boost (VRAM at 1800 MHz) and a total peak of 240 W (GPU + VRAM). So it's a little overclocked from base specifications and still a 240 W card. Nominal 5700 XTs are about 220-225 W.
If you simply double that, you get 440-450 W. But that is NOT the case. RDNA2 is on TSMC's enhanced 7nm node, with perf/watt improvements and 15-20% more density - it's not on the same 7nm that RDNA1/Zen 2/Zen 3 use.

So, enhanced 7nm plus whatever redesign AMD has done to the cores... we are facing +50% performance per watt. That means a GPU with the same performance as the 5700 XT, with all these enhancements, would draw 145-150 W. Double that and you are in the 290-300 W region.

BUT... you must account for the fact that not all parts of the chip are doubled. The CUs, yes (40 >> 80), but what about the memory controller (still 256-bit?) and some other blocks I won't go into? So, from that 290-300 W you can cut another chunk of, let's say, 20-30 W. Now you are at 260-290 W for double the raw power of a 5700 XT (220-225 W) at the same max clock speed (2000-2050 MHz).

Also, it's rumored that the top RDNA2 GPU boosts to at least 2200 MHz. That's another increase over 2050 MHz of, let's say, +7-8% power draw.
Again: 260-290 W + 7-8% = 280-310 W.

You end up with a 280-310 W card with more than double the raw power of the 5700 XT.
When gaming, is that close enough to a 3080? We will see in 17 days...
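The arithmetic above can be sketched in a few lines of Python - to be clear, every number here is this post's own guess, not measured data:

```python
# Back-of-envelope "Big Navi" power estimate (all figures are assumptions).
base_power = 222.5          # ~220-225 W: nominal 5700 XT board power
perf_per_watt_gain = 1.50   # claimed +50% perf/W for RDNA2

# Same performance as a 5700 XT on RDNA2 would need roughly:
rdna2_equiv = base_power / perf_per_watt_gain    # ~148 W
doubled = 2 * rdna2_equiv                        # ~297 W for 2x the CUs

# Not everything on the chip doubles (memory controller, etc.):
shared_logic_saving = 25                         # the 20-30 W guess
est = doubled - shared_logic_saving              # ~272 W

# Higher boost clock (~2050 -> ~2200 MHz) adds roughly 7-8% draw:
est *= 1.075
print(round(est))  # 292 -> inside the 280-310 W window above
```

Right in the middle of the 280-310 W range, which is the whole point: double the raw hardware does not have to mean double the power.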

-----------

For the price of this card...
AMD appears to have made some drastic choices to keep costs low: the narrower memory bus (256-bit?) and the use of GDDR6. Both are cheaper to make, and they also significantly reduce the cost of the core because of the reduced complexity.
So this top Big Navi card won't be as expensive to build as the 3080 is, with its premium GDDR6X chips and complex 320-bit controller.

-----------

Before anyone makes assumptions about the next AMD GPU out of thin air, without knowing what they're talking about... they should think first.

And something else...

We don't know what CPU and what card those gaming numbers AMD provided were obtained with. Remember that at 4K it won't matter much whether it was a 5900X/5950X or a 10850K/10900K.


----------



## Vayra86 (Oct 11, 2020)

nguyen said:


> Explain how UserBenchmark is more trustworthy than Steam? Because their users are verified? People can't do duplicate benchmarks?
> Talking about digging graves
> 
> Out of hundreds of GPUs, the 2080 Ti being the 25th most popular is considered niche. See how flawed your reasoning is?
> ...



A small percentage is always a small number, because it relates to a larger whole. This isn't about education anymore, is it - and it never was. But keep twisting... you are defending the point that a top SKU does not serve a niche. Good luck. Surely you won't look like a nutcase keeping that up.



INSTG8R said:


> While I agree that would be when Intel iGP was top dog! At least the 1060 makes for a sensible majority card. I have never once used userbench. I just run 3Dmark on driver changes for comparisons.



UserBenchmark gives the 2080 Ti even more share than Steam does, so it only supports the 1% statement.  Nguyen is apparently so blinded by whatever that he fails to see that, even with that acknowledgment, he is still reading the statistics entirely wrong...

Oh well


----------



## INSTG8R (Oct 11, 2020)

Vayra86 said:


> A small percentage is always a small number, because it relates to a larger whole. This isn't about education anymore, is it - and it never was. But keep twisting... you are defending the point that a top SKU does not serve a niche. Good luck. Surely you won't look like a nutcase keeping that up.
> 
> 
> 
> ...


Yeah, I only know of it in passing. I bought 3DMark years ago, so a Time Spy run is all I need to see if anything is happening between software/BIOS/driver changes.


----------



## sepheronx (Oct 11, 2020)

I do not know why it's so hard for you guys to wait until reviews are out to pass judgement.


----------



## ZoneDymo (Oct 11, 2020)

sepheronx said:


> I do not know why it's so hard for you guys to wait until reviews are out to pass judgement.



Humanity never fails to disappoint


----------



## nguyen (Oct 11, 2020)

Vayra86 said:


> A small percentage is always a small number because it relates to a larger whole



"a billion dollars is a small amount of money because that is less than 1% of what Bezos made" 
this is what your statement implied.
Does Bezos himself think 1bil is a small amount of money ? definitely not, since he has sound mind after all.

oh well since I dont bully disabled kid I just forgive all your insults, gave me good laugh also.


----------



## Maye88 (Oct 11, 2020)

Can you correct your benchmarks for the 3080? You used frame rates for the 3080 in BL3 on Ultra, not Badass...

I also think AMD purposely chose from among their worst games. Historically, AMD has always underperformed on UE4 compared to Nvidia.


----------



## b1k3rdude (Oct 11, 2020)

Until TPU gets their hands on an actual RX 6000 card and runs it in the same system as a 2080 Ti/3080, this article is nothing more than pointless fluff.


----------



## INSTG8R (Oct 11, 2020)

The only things that can be compared are that it was the same game and at 4K; the rest is irrelevant as a comparison.


----------



## Jism (Oct 12, 2020)

sepheronx said:


> I do not know why it's so hard for you guys to wait till reviews are out before passing judgement.



So many people wrote that Big Navi wouldn't be anything more than 2x a 5700 in terms of performance. Well, they apparently got more out of it at this point. And the "large" cache plays a significant role along with it. I'm curious about the tech; only a few more weeks to see what rabbit AMD pulls out of the hat this time.


----------



## Nkd (Oct 12, 2020)

nguyen said:


> Yeah, Navi21 selling at $600 +-$50 can already be called high-end gaming.
> The elephant in the room here is that AMD would like to make a profit that could later turn into R&D budget; AMD can't do that if they go into a price war against Nvidia (who already maintains godly profit margins). Therefore the only thing AMD can do is slot their GPU into the gap between Nvidia's price brackets, selling as many as they can while maintaining as high a profit margin as they can (5700/5700XT ring any bells?).
> 
> Nvidia already knew the 3080/3090 would have no direct competition; the 3070 is the one that Nvidia is prepping to counter Big Navi. If AMD decides to price Navi21 at $500 then Nvidia would have to lower the 3070 price accordingly, though I doubt AMD would do that.
> ...



If it's 5% slower on average than the 3080, that doesn't actually make it a 3070 competitor. So not sure what everyone is getting at when they call it a 3070 competitor. It's more like a 3080 competitor.
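For the record, the "percent faster" figures being argued over in this thread are simple ratios. A quick Python sketch, using the FPS numbers quoted in the article (the function itself is nothing more than a ratio minus one):

```python
# How the article's "percent faster" figures are derived.
# FPS values below are taken from the article; the math is just a ratio.

def percent_faster(a_fps: float, b_fps: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (a_fps / b_fps - 1) * 100

# Borderlands 3 @ 4K: Big Navi claim (61 FPS) vs TPU's 2080 Ti (41.8 FPS)
print(round(percent_faster(61.0, 41.8), 1))   # ~45.9

# RTX 3080 (70.3 FPS) vs Big Navi claim (61 FPS)
print(round(percent_faster(70.3, 61.0), 1))   # ~15.2
```

Note the asymmetry: "A is 15% faster than B" is not the same as "B is 15% slower than A", which is where a lot of forum arithmetic goes wrong.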


----------



## Zach_01 (Oct 12, 2020)

Jism said:


> So many people wrote that Big Navi wouldn't be anything more than 2x a 5700 in terms of performance. Well, they apparently got more out of it at this point. And the "large" cache plays a significant role along with it. I'm curious about the tech; only a few more weeks to see what rabbit AMD pulls out of the hat this time.


And the biggest mistake a lot of people made is that they thought it was 2x 5700 XTs glued together in some kind of CrossFire mode, and we know that can't even yield a +70~80% performance gain. I'm expecting it to be a real close 3080 competitor, with a bigger gap in price than in performance, and with less power draw as well. It won't beat it in all aspects, but still.

Edit:
typo


----------



## INSTG8R (Oct 12, 2020)

Nkd said:


> If it's 5% slower on average than the 3080, that doesn't actually make it a 3070 competitor. So not sure what everyone is getting at when they call it a 3070 competitor. It's more like a 3080 competitor.


And these numbers are being extrapolated from a completely inaccurate chart so there are no reliable comparisons yet...


----------



## hathoward (Oct 12, 2020)

Can't wait to see AMD GPU event. If they can dethrone the 3090 it'll be the first full team red rig I've built in over 20 years. (Yes I know that was before the Radeon acquisition...)


----------



## quadibloc (Oct 12, 2020)

I know there are a lot of people saying that the card they teased wasn't their absolute top-end card.
On the one hand, why not tease the absolute top-end card in a situation like that, since doing so would give bigger numbers, which would be more impressive? And nothing else, really, was revealed but those three numbers.
On the other hand, one would expect AMD to make an effort, if only for prestige purposes, to have something out there that would beat the 3090 or at least the 3080.
Nvidia's 3090 is "overpriced" at least in the sense that it is significantly behind the 3080 in price versus performance. So if you want the very best, you will have to pay a premium - and the 3090's performance will only yield dividends if one is gaming at a really high resolution.
So if nearly all the sales, and hence, nearly all the profits, are going to be made from sales of cards like the 3070, then AMD does only really need to compete in that arena.
And just because the new Nvidia cards are better bargains than Turing doesn't mean that AMD couldn't undercut them the same way they undercut Turing: skimp on the ray tracing, since frame rate, and not good looks, is what gamers tend to be concerned about.
Big, but budget priced, and better focused. Nvidia can have the ultra top-end to itself.
But that's only one possibility, and I think AMD will try to offer some ray-tracing capability that sort of looks comparable, and it will try as hard as it can to have something comparable even to Nvidia's top-tier offerings if it can. Maybe in the form of dual-GPU cards if it has to.


----------



## arbiter (Oct 12, 2020)

Even if the card is, say, within 15%, which I think AMD's numbers claim, selling it for around the same price, or even hell, $100 less, still doesn't make it the better buy without question. All you've gotta do is look at the software suite Nvidia is offering with their cards right now: the green screen, DLSS, NVENC, RTX Voice, and that's only naming a few. Unless you don't need those things; for people that stream, for example, all those features make the Nvidia card the better buy. AMD needs more than just coming close to Nvidia atm. Truthfully, the "within 15% of the RTX 3080" is suspect at best, just 'cause of AMD's history of things.


----------



## Vayra86 (Oct 12, 2020)

nguyen said:


> "a billion dollars is a small amount of money because it is less than 1% of what Bezos made"
> this is what your statement implied.
> Does Bezos himself think $1 billion is a small amount of money? Definitely not, since he has a sound mind after all.
> 
> oh well, since I don't bully disabled kids, I'll just forgive all your insults; you gave me a good laugh too.



So the financial situation of an individual, relative to all the money in the world, is the same as a market segment now?

You're totally not grasping at straws now, are you? And you're totally not bullying at all, following up with a clear insult as you just did.
All I gave you was a perspective on your view of statistics. You took that as an insult and ran with it, and you still haven't let it go. And you still don't have the slightest idea about what Steam stats do and don't tell you. Now I'm a "disabled kid"? Cool, man. You're the real adult here.


----------



## r9 (Oct 12, 2020)

Personally, I liked the comparison, as I had thought of doing it myself, so this article saved me some work.
I did not expect it to be hard proof but a ballpark figure, and it should be taken as that.

The fact is, if the test had shown that Big Navi was 20% faster than the RTX 3080, it would have been taken as rock-solid proof that it is faster than the RTX 3080 by the same people that bitch and moan in this thread right now.


----------



## INSTG8R (Oct 12, 2020)

r9 said:


> Personally, I liked the comparison, as I had thought of doing it myself, so this article saved me some work.
> I did not expect it to be hard proof but a ballpark figure, and it should be taken as that.
> 
> The fact is, if the test had shown that Big Navi was 20% faster than the RTX 3080, it would have been taken as rock-solid proof that it is faster than the RTX 3080 by the same people that bitch and moan in this thread right now.


That's the problem though: it's not even a ballpark-close comparison, yet it's being used to declare victory/defeat when there is zero comparison data available from the short test we saw.


----------



## Zach_01 (Oct 12, 2020)

It was just a "tease". Perhaps to show people that the 6000 series is not only a 2080Ti/3070 competitor but something more.
If they wanted to be clear about the 3080 they would have done it already. I'm thinking they aren't rushing things because they know the RTX 30 series is in short supply right now.

I fully understand that many do not believe that AMD may have something worthy at hand, because of the so-called history. But who believed 3-4 years ago that AMD would catch up and get ahead in the CPU game in such a short time? Intel's failures also helped, but still, AMD alone has done some worthy work, increasing their CPU performance per watt by more than 200% since 2017.

Yes, we should be prudent about AMD's next announcement, but there is a lot of evidence that things are working well inside AMD, and that they are focused on their goals.

By the claims that rumors state about the enhanced node (density) and the enhanced/refined architecture (+50% performance/watt), they could actually be very close to challenging nVidia at the top.

No, I don't believe they can repeat the CPU Zen 3 story, but RDNA2 could be their Zen 2 moment in the GPU world, as I've said a few times.


----------



## wolf (Oct 13, 2020)

Arpeegee said:


> People must either be in denial or lack good analytical skills if you think AMD would show their BEST card at a CPU event when they have a GPU event at the end of the month.
> 
> They've been mum on details all year with Navi and if I were to place a bet I would think the performance is closer to the RTX 3080 than people want to admit. People also need to keep in mind that even if it doesn't absolutely win over the 3080, if it's priced $100 cheaper than the "MSRP $699" then Nvidia has lost this battle. No regular person would pay $100-$200 more for only 5% to 10% better performance with a power hungry card (though the RTX 3090 sales kinda undermine my point ).


Whether it's the fastest Big Navi card or not, who knows; opinion is pretty divided on that, and for now you can believe what you want based on your interpretation and perception.

As for if it goes punch for punch with a 3080 but sells for cheaper, "then Nvidia has lost this battle. No regular person would pay $100-$200 more for only 5% to 10% better performance" - that might be true if both offerings were from the same company, but even then people are willing to pay more than double for ~10-15% more with the 3090. Then don't forget that Nvidia has the market and mindshare; even at a price/performance disadvantage, they will sell a _buttload_ of cards. Beyond that, we have no indication of relative RT performance, or whether they can answer DLSS with anything better than a 70% render scale with contrast adaptive sharpening. If AMD steps it up and can match the performance (or even take the crown) and match the feature set, even then I'd wager Nvidia cards will still fly off the shelves; these things take a long time to slowly change.

So many people here seem so *sure* of AMD and this product, and I've seen this cycle release after release. "Nvidia is worried about AMD this time!", "Nvidia will lose this battle, [insert architecture here] is so powerful/efficient this time!" etc etc... and I just find it so strange that release after release the cycle continues with the same flavour of statements. I guess if they do it enough times, they're bound to be right eventually. Don't get me wrong, I desperately want AMD to contend with the top cards and even be top dog again, but I've learned to be cautiously optimistic rather than talking so surely when really, I actually *know* next to nothing.


----------



## INSTG8R (Oct 13, 2020)

I hate the current attude that the battle is already over from a 5 second clip
i will only share my ”worst case”  
it will perform between 70 and 8 6and90n  the b3070    nd 80 will be more efficient actros the board  A competitive privce they filled a gap they exclusively


----------



## Totally (Oct 13, 2020)

INSTG8R said:


> I hate the current attude that the battle is already over from a 5 second clip
> i will only share my ”worst case”
> it will perform between 70 and 8 6and90n  the b3070    nd 80 will be more efficient actros the board  A competitive privce they filled a gap they exclusively



This is what I'm thinking too. Thinking back on past releases, they probably have two SKUs. First card with two variants: the first variant with similar to slightly better performance than the 3070, better efficiency at load (I've never seen AMD beat Nvidia's efficiency at idle or basic tasks), priced accordingly; and a second, slightly beefier variant with higher clocks and/or more VRAM. Second card: full die, heavily OC'd out of the box, worse efficiency due to the OC and increased power limits, performance a few % lower than the 3080, priced accordingly.


----------



## Vayra86 (Oct 13, 2020)

INSTG8R said:


> I hate the current attude that the battle is already over from a 5 second clip
> i will only share my ”worst case”
> it will perform between* 70 and 8 6and90n  the b3070    nd 80 will be more efficient actros the board  A competitive privce they filled a gap they exclusively*



I think something went wrong there, but I agree


----------



## INSTG8R (Oct 13, 2020)

Vayra86 said:


> I think something went wrong there, but I agree


i

Yeah  don’t know hat/ happened there...eloquent in bonsense... I already have issue typing on thite. predicttive spell spell checkm, formatting all esch other. it wouldn’t churn out something alooetue


----------



## Vayra86 (Oct 13, 2020)

INSTG8R said:


> i
> 
> Yeah  don’t know hat/ happened there...eloquent in bonsense... I already have issue typing on thite. predicttive spell spell checkm, formatting all esch other. it wouldn’t churn out something alooetue



I stundtanerd comelptyle!


----------



## kapone32 (Oct 13, 2020)

nguyen said:


> Can't read? Too poor for education?
> What part of "almost 1% of the Steam Hardware Survey" is too hard to understand?
> Steam has a player base of 90 million active users as of April 2019; 1% of that means 900,000 players own a 2080 Ti. Yeah, it's niche alright.


I have 1 laptop and 2 PCs on my Steam account. Which one does Steam report as mine? Steam is not a reliable source for GPU numbers. One of the best examples of a place to look would be a German retailer, to see historical sales data. You can find it for CPUs, so the same should be true for GPUs.


----------



## EarthDog (Oct 13, 2020)

kapone32 said:


> I have 1 laptop and 2 PCs on my Steam account. Which one does Steam report as mine? Steam is not a reliable source for GPU numbers. One of the best examples of a place to look would be a German retailer, to see historical sales data. You can find it for CPUs, so the same should be true for GPUs.


It should ask from the same PC it was installed on. Cookies. 

Link?


----------



## kapone32 (Oct 13, 2020)

EarthDog said:


> it should ask from the same pc it was installed on. Cookies.
> 
> Link?


Sorry, I can't even copy the link from my work laptop. Another way to gauge GPU sales is Amazon's top lists. I think it only covers the US right now, but I still think it is better than Steam.






Amazon Best Sellers: Best Computer Graphics Cards (www.amazon.com)

Discover the best Computer Graphics Cards in Best Sellers. Find the top 100 most popular items in Amazon Computers & Accessories Best Sellers.


----------



## nguyen (Oct 13, 2020)

kapone32 said:


> I have 1 laptop and 2 PCs with my Steam account. Which one does Steam report as mine? Steam is not a reliable source for GPU numbers. One of the best examples of a place to look would be a German retailer to see historical data on sales. You can find it for CPUs so the same should be true for GPUs.



Steam chooses the user base for the survey at random at the beginning of each month. Steam will ask if you would like to participate in the survey.
Steam has only asked for my participation like once a year.

Does it matter which machine was registered? Each account only gets one participation after all, so there's less chance of duplicates.
Amazon or Mindfactory vs Steam? How about you just go and ask your local store. Steam has over 100 million active accounts globally; its statistics should be the closest to the truth out there.

Germany has about equal GPU allocation between Nvidia/AMD, so Mindfactory data is a good representation, but only for a first-world country. Here in a third-world country, Nvidia outsells AMD 9:1 because AMD's distribution is very poor. 

So what would you prefer ? Regional data or global ?


----------



## EarthDog (Oct 13, 2020)

If you went by this, people's disbelief in demand looks even more asinine, lol.


----------



## nguyen (Oct 13, 2020)

I would say demand for the 3080 is about the same as for the 1080 Ti during the mining era in 2018; everyone just wants a piece of that sweet cake.


----------



## kapone32 (Oct 13, 2020)

nguyen said:


> Steam chooses the user base for the survey at random at the beginning of each month. Steam will ask if you would like to participate in the survey.
> Steam has only asked for my participation like once a year.
> 
> Does it matter which machine was registered? Each account only gets one participation after all, so there's less chance of duplicates.
> ...


I can appreciate what you are saying, and we could find cases to support both arguments depending on parameters, but you also have to look at the user base per region. I have no doubt that Nvidia outsells AMD in terms of GPUs. I just believe, based on the amount of reddit posts about Navi, that they have done quite well and have a higher user base than you suggest.


----------



## INSTG8R (Oct 13, 2020)

so sandbagging....


----------



## nguyen (Oct 13, 2020)

kapone32 said:


> I can appreciate what you are saying, and we could find cases to support both arguments depending on parameters, but you also have to look at the user base per region. I have no doubt that Nvidia outsells AMD in terms of GPUs. I just believe, based on the amount of reddit posts about Navi, that they have done quite well and have a higher user base than you suggest.



You mean the amount of Navi black-screen reports on reddit.

Steam could be biased against non-gaming-specific GPUs like Vega or the Radeon VII, but Navi is not one of those.

Steam does not use aggregated data, so it doesn't care how many Navi cards are sold in total, just how many users have a Navi card in their system at the beginning of the month, divided by the total number of users who participated in the survey. Therefore each month's data is not complete by itself; it should be compared to previous months in order to reveal the trend in hardware ownership.

Anyways, 1% of Steam users means 1 million people bought the 5700 XT, and that already is a significant number of sales. Reaching 1% on Steam should equate to going platinum (1 million sales).
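The arithmetic behind that estimate is easy to sketch. A minimal Python example, assuming a hypothetical base of 100 million active Steam accounts (the real figure isn't published precisely, so treat the output as a rough estimate):

```python
# Back-of-the-envelope Steam Hardware Survey math (assumed figures).
# The survey reports each GPU as a share of surveyed users; multiplying
# that share by an assumed active-user base estimates total owners.

ACTIVE_USERS = 100_000_000  # assumption: ~100M active Steam accounts

def estimated_owners(share_percent: float, user_base: int = ACTIVE_USERS) -> int:
    """Estimate how many users own a GPU given its survey share in percent."""
    return round(user_base * share_percent / 100)

print(estimated_owners(1.0))   # 1% share -> about 1,000,000 users
print(estimated_owners(0.9))   # 0.9% share -> about 900,000 users
```

The key caveat, as noted above, is that the share is a fraction of *surveyed* users, not a sales count, so single-month figures only become meaningful as a trend.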


----------



## kapone32 (Oct 13, 2020)

nguyen said:


> You mean the amount of Navi black-screen reports on reddit.
> 
> Steam could be biased against non-gaming-specific GPUs like Vega or the Radeon VII, but Navi is not one of those.
> 
> ...


Yes, the number of people that helped users that had issues from being noobs or not being used to AMD GPUs. As far as I know, Vega is a gaming GPU. It doesn't matter though; you are resigned (based on your experiences, I hope) to preferring Nvidia. It doesn't matter to me; I love my 5700, which gives me 144+ FPS full time in The Division 2. I am also confident that anyone who bought a Navi card (and found a way around those purported issues) would have that small smile we all get when we experience killer performance for a good price ($429.99 CAD). It doesn't matter what either of us thinks though, as AMD's stock price alone means they have money to respond to anything Nvidia does in the short term. The sales will tell, but AMD needs a budget card that is as fast as the Vega 64 for $200 US.


----------



## ratirt (Oct 13, 2020)

kapone32 said:


> Yes, the number of people that helped users that had issues from being noobs or not being used to AMD GPUs. As far as I know, Vega is a gaming GPU. It doesn't matter though; you are resigned (based on your experiences, I hope) to preferring Nvidia. It doesn't matter to me; I love my 5700, which gives me 144+ FPS full time in The Division 2. I am also confident that anyone who bought a Navi card (and found a way around those purported issues) would have that small smile we all get when we experience killer performance for a good price ($429.99 CAD). It doesn't matter what either of us thinks though, as AMD's stock price alone means they have money to respond to anything Nvidia does in the short term. The sales will tell, but AMD needs a budget card that is as fast as the Vega 64 for $200 US.


I never had an issue with my 5600 XT. I bought it to replace my Vega 64 (it died due to the PSU, or so I think; the PSU was changed as well), and the 5600 XT was the one I chose so that I could wait for the new stuff coming to market. Couldn't be happier with the 5600 XT. Absolutely no issues, and the performance is great. Just like the V64, but with way less power consumption.
The entire Navi problem is blown out of proportion, unless you are a total rookie in PC tech, in which case you will even have mouse pad problems.



nguyen said:


> Anyways 1% in Steam users means 1 million people bought the 5700XT, that already is a significant number of sale.


I think it's more than 1% as of today.


----------



## kapone32 (Oct 13, 2020)

ratirt said:


> I never had an issue with my 5600 XT. I bought it to replace my Vega 64 (it died due to the PSU, or so I think; the PSU was changed as well), and the 5600 XT was the one I chose so that I could wait for the new stuff coming to market. Couldn't be happier with the 5600 XT. Absolutely no issues, and the performance is great. Just like the V64, but with way less power consumption.
> The entire Navi problem is blown out of proportion, unless you are a total rookie in PC tech, in which case you will even have mouse pad problems.



I actually stumbled onto the 5700 (a customer build fell through), put it in my HTPC, and wow, colour me impressed. I love Squadrons: up to 200 FPS, no stuttering or tearing @ 1440p. The power draw is crazy; I have the 5700 paired with a 600 W PSU, an OC'd 3600 & 32 GB of DDR4.


----------



## ratirt (Oct 13, 2020)

kapone32 said:


> I actually stumbled onto the 5700 (a customer build fell through), put it in my HTPC, and wow, colour me impressed. I love Squadrons: up to 200 FPS, no stuttering or tearing @ 1440p. The power draw is crazy; I have the 5700 paired with a 600 W PSU, an OC'd 3600 & 32 GB of DDR4.


I bought a 750 W Seasonic PSU because I'm definitely changing the card for a stronger one to get proper 4K gaming with my monitor. Wonder what RDNA2 will bring. I had been thinking about the 5700, but I just wanted to get a card that would let me play some, and I didn't want to spend a lot considering I had spent so damn much on two Threadrippers. Anyway, no matter how you call it or see it, the 5000-series AMD GPUs are solid in performance and price, if you ask me.


----------



## kapone32 (Oct 13, 2020)

ratirt said:


> I bought the 750Watt PSU seasonic cause I'm definitely changing the card for a stronger one to get proper 4k gaming with my monitor. I been thinking about the 5700 though but I just wanted to get a card that would allow me to play some and not spend much considering I'd spent so damn much on two ThreadRippers. Anyway, no matter how you call it or see it. The 5000 series AMD GPUs are solid in performance and price. If you ask me.


Agreed totally on TR4. I was prepared to sell my X399 MB & CPU and get a TRX40 setup, but the cost of the 3960X brought a tear to my eye. So I built myself an HTPC with an inexpensive X570. The 5700 is great, but not a great 4K card. I am definitely interested to see if just adding the 5000-series CPUs improves performance; that single CCX is no joke. Getting back to the thread: I expect Big Navi to be in and around the performance gap that existed between Polaris and Vega, so that should be the card you look for to get that 4K goodness you want.


----------



## ratirt (Oct 13, 2020)

kapone32 said:


> Agreed totally on TR4. I was prepared to sell my X399 MB & CPU and get a TRX40 setup, but the cost of the 3960X brought a tear to my eye. So I built myself an HTPC with an inexpensive X570. The 5700 is great, but not a great 4K card. I am definitely interested to see if just adding the 5000-series CPUs improves performance; that single CCX is no joke. Getting back to the thread: I expect Big Navi to be in and around the performance gap that existed between Polaris and Vega, so that should be the card you look for to get that 4K goodness you want.


Off topic: I got two 3970Xs and it was quite a change. These are so damn fast in basically anything I throw at them; they crunch data freakishly fast. That's what I really needed. I went a bit overboard with the budget, but not by much, and it was worth it.
About RDNA2 and the new Navi: I really have some expectations, and I hope these cards will end up around 3080 performance at least. If it turns out these are even faster, that's even better.
Waiting patiently to see whether RDNA2 is really a game changer or not. The benchmarks (shown by AMD) bring some optimism, but it's better to wait for the final product and reviews. Besides, we don't know what card AMD teased anyway. Hopefully the RX 6000 series keynote will bring more information and shed some light on the matter.


----------



## evernessince (Oct 13, 2020)

INSTG8R said:


> AMD is always chasing the latest features but sometimes they seem to lose focus.


 
Both companies have at some point been "chasing the latest features"; this is nothing new. Recent examples that come to mind are adaptive sync, anti-lag, and CAS. Nvidia came out with G-Sync, AMD released FreeSync. AMD released anti-lag and CAS; Nvidia released ULLM and FreeStyle (which is literally a copy of CAS).

This has been going on for decades in this industry, and it's not something you can say only one party partakes in.


----------



## Zach_01 (Oct 13, 2020)

The difference is that Nvidia most of the time tries to push its own proprietary standards to shut everyone else out.


----------



## vctr (Oct 15, 2020)

Anymal said:


> Maybe 6700xt ?


The 6700 XT is probably close to 40/50 CUs, and I doubt they have shown the 6700 XT, because that would mean the 6900 XT (if it's 80 CUs) would be almost double the performance of the 3080, and I don't think in one gen they can go from getting close to a 2070S to beating the 3080 by 2x.


----------



## Zach_01 (Oct 15, 2020)

We can drop the naming speculation, because not a single person knows how many GPUs there will be, what they'll be called (other than the 6000 series), or what the price will be. No leaks about it.
We can only assume that it's going to be more than one, probably 2 or 3.

Also, one can roughly estimate the performance of the 80 CU one based on leaks about the full die size, the improvements (density and perf/watt) of the enhanced 7nm node, the clocks, and so on...


----------



## EarthDog (Oct 15, 2020)

Zach_01 said:


> The difference is that Nvidia most of the time tries to push its own proprietary standards to shut everyone else out.


Was the last time that happened with PhysX, a decade and change ago? It isn't RTRT...


----------



## medi01 (Oct 15, 2020)

nguyen said:


> Steam could be biased against non-gaming-specific GPUs like Vega or the Radeon VII, but Navi is not one of those.


Steam is positively biased towards internet cafes, where AMD has close to zero presence.

But whatever makes you feel better.


----------



## mtcn77 (Oct 15, 2020)

Zach_01 said:


> Also one can roughly estimate the performance of the 80CU one based on leaks about the full die size,


Hi, I wanted to drop by because AMD's very own Timothy Lottes - who is a great engineer, by the way - has said that GPUs stopped scaling due to scheduling constraints.


----------



## nguyen (Oct 15, 2020)

medi01 said:


> Steam is positively biased towards internet cafes, where AMD has close to zero presence.
> 
> But whatever makes you feel better.



The internet cafe thing has been fixed since May 2018.








Steam Hardware Survey Fix - Overcounting of Asia Cybercafes :: Steam Discussions (steamcommunity.com)



Now Steam is a completely neutral party; they have nothing to gain by supporting either AMD or Nvidia.
You just listen to every piece of nonsense that the Scott "jebaited" dude is feeding you, don't you?


----------



## medi01 (Oct 15, 2020)

nguyen said:


> Now Steam is a completely neutral party; they have nothing to gain by supporting either AMD or Nvidia.


They have zero interest in spending money on fixing shit either; they are not a "survey company".



nguyen said:


> You just listen to every piece of nonsense that the Scott "jebaited" dude is feeding you, don't you?



No, I listen to random anons on the internet, who just happen to know things by coincidence.

Steam wildly contrasts with actual retailer sales (e.g. Mindfactory) and most market share reports.


----------



## nguyen (Oct 15, 2020)

medi01 said:


> They have zero interest in spending money on fixing shit either; they are not a "survey company".
> No, I listen to random anons on the internet, who just happen to know things by coincidence.
> Steam wildly contrasts with actual retailer sales (e.g. Mindfactory) and most market share reports.



Sure, you can post Mindfactory sales too; let's see if they support any of your points. FYI, they don't.
Or better yet, AMD's own Q1 and Q2 earnings, where they barely made any money on GPU sales?
Q1 2020
Q2 2020
Or Jon Peddie Research?

Anyways, provide whatever source you want; as long as it's not from the "Jebaited" dude, it's good info.


----------



## medi01 (Oct 15, 2020)

The next-gen Xbox (the naming is so freaking confusing) with RDNA2 and, I think, still Zen 2 @ 3.2 GHz consumes *160-165W* in Dirt 5 (and that's a 56 CU part at well under 2 GHz).


nguyen said:


> Anyways provide whatever source you want, as long as they are not from the "Jebaited" dude then they are good info.


The Jebaited dude claimed exactly zero figures; he just mentioned that Steam figures are skewed.


----------



## nguyen (Oct 15, 2020)

medi01 said:


> The Jebaited dude claimed exactly zero figures; he just mentioned that Steam figures are skewed.



It's called a conflict of interest; whatever he says is meant to improve AMD's image.
He knew Steam had already fixed the counting algorithm, and he used old data to try to discredit the Steam survey.
What else is new? Water is wet?


----------



## Bronan (Oct 16, 2020)

LOL if AMD were to launch a much faster card than what we assume was the top-dog card, I am sure the second-hand market would get flooded with very young and cheap Nvidia cards.
I actually see constant improvements with the current 5700 XT on the latest 20.9.2, albeit it sometimes crashes.
When I turned off v-sync and compared to my mate's 2080 Ti system, we were both stunned.
Running several games at insanely high fps baffled me; my friend was sitting next to me with his 2080 Ti machine, and I got between 60 and 400 fps while his machine ran between 80 and 240 fps with the same settings.
Even besides those numbers, we noticed that the game looked much better on my machine at ultra; we really could not believe what we saw with our own eyes.
He actually demanded to see the inside of my machine, and checked all settings several times.
The average fps when playing Horizon Zero Dawn is 257.6 FPS, however it shows some tearing when it goes beyond 300 fps.

Running the silly synthetic benchmarks of course shows much lower numbers than the Nvidia counterparts.



nguyen said:


> Sure, you can post Mindfactory sales too; let's see if they support any of your points. FYI, they don't.
> Or better yet, AMD's own Q1 and Q2 earnings, where they barely made any money on GPU sales?
> Q1 2020
> Q2 2020
> ...


Even though I agree that the net profit is not so big, they are slowly gaining more ground in every aspect of the market.
I also saw so many people jump on the latest Nvidia release without even thinking about what AMD is going to offer.
Many of my friends have already ordered and paid in advance for their new Nvidia toy without even knowing when they will get it.
I actually decided to wait and see what happens; so far we assume that what AMD has shown, or what has been leaked, will be the top card.
However, that's pure speculation.
Especially now that AMD has already shown they are back in the race, and even surpassed Intel with their latest 5950X, with the big surprise of a 4.9 GHz boost clock at the end of the show.
That shows that what has been leaked should not always be accepted as reality, and that a pleasant surprise like this is possible at any moment.
I am actually sure to set aside some money to buy into AMD products as soon as there are very good motherboards with more PCIe lanes than the previous not-so-good gaming motherboards.
Yes, I know I am darn spoiled and demanding, but hey, I want to spend a small fortune only on something really good, and not on a PCIe-sharing crap product every time I buy a new machine.


----------



## medi01 (Oct 16, 2020)

nguyen said:


> He knew Steam already fixed the counting algorithm and he used old data trying to discredit Steam Survey.
> What else is new ?


He mentioned Steam fixed some but not all issues reported by AMD.
He wasn't specific about what issues were not addressed.

Steam figures contrast with other sources, major retailer stats being one of them, I don't get why you don't let that dead horse just be.
Mindfactory had AMD (unit) market share between 29%-39% for this year, with 5700 series and 2070 series being the most popular models.


----------



## nguyen (Oct 16, 2020)

medi01 said:


> He mentioned Steam fixed some but not all issues reported by AMD.
> He wasn't specific about what issues were not addressed.
> 
> Steam figures contrast with other sources, major retailer stats being one of them, I don't get why you don't let that dead horse just be.
> Mindfactory had AMD (unit) market share between 29%-39% for this year, with 5700 series and 2070 series being the most popular models.



Let me tell you that you don't understand the fundamental difference between Steam data and Mindfactory data.
Steam data represent which GPUs are being used by Steam users.
Mindfactory data are the numbers of GPUs being sold.
Steam is the absolute percentage while Mindfactory is the relative change, get it?

Let's say there are 100 million dGPUs circulating in the market, with Nvidia at 80% and AMD at 20%. With an additional 1 million GPUs being introduced into the market every month, Nvidia and AMD selling equal amounts of cards would barely change that market share (79.7% vs 20.3%), get it?
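The arithmetic above can be sketched in a few lines (the function name and starting numbers are just illustrative, taken from the example in the post):

```python
# Installed-base vs. monthly-sales market share: even a 50/50 monthly sales
# split barely moves the installed base, which is the Steam-vs-Mindfactory
# distinction being argued here.

def installed_share(base_nvidia, base_amd, monthly_sales, amd_split, months):
    """Return (nvidia_share, amd_share) of the installed base after
    `months` of sales split `amd_split` : (1 - amd_split)."""
    nvidia = base_nvidia + monthly_sales * (1 - amd_split) * months
    amd = base_amd + monthly_sales * amd_split * months
    total = nvidia + amd
    return nvidia / total, amd / total

# The post's numbers: 100M installed (80/20), 1M new cards/month, sold 50/50.
nv, amd = installed_share(80e6, 20e6, 1e6, 0.5, months=1)
print(f"{nv:.1%} / {amd:.1%}")  # one month of equal sales: 79.7% / 20.3%
```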

So stop comparing apples to oranges.

Also, Mindfactory is a single retailer; that is like asking who would win the POTUS election by counting votes in a single city.


----------



## medi01 (Oct 16, 2020)

nguyen said:


> Steam is the absolute percentage while Mindfactory is the relative change, get it ?


Shocking story indeed, except Steam shows changes too.


----------



## Adam Krazispeed (Oct 18, 2020)

jesdals said:


> I could live with that if the price is belowe the 3080




Sorry TechPowerUp, but this is my bet: Big Navi XT/XTX (6900 XTX/XT), that is, a 2400 MHz game clock, possibly 2.6/2.7 GHz max boost clocks, and 135-137% relative performance. Sorry, I've got more info I can't share yet... hehe, this is all I can do for now!!!


----------



## hat (Oct 18, 2020)

2.4GHz?


----------



## medi01 (Oct 18, 2020)

hat said:


> 2.4GHz?



Overhyping is a method to make the launch of an excellent product a disappointment.
One of the FUD arsenal moves.

The 36CU RDNA2 chip (PS5) has a 1.8GHz base and boosts to only 2.23GHz; there is no way an 80CU chip would have a "base clock" of 2.4GHz.

The 56CU RDNA chip (XSeX) clocks at 1.825GHz (no boosting).

2.4+ GHz as base is just a *FUD campaign*.


----------



## Zach_01 (Oct 18, 2020)

medi01 said:


> Overhyping is a method to make launch of an excellent product a disappointment.


I agree! There is no need for this...



medi01 said:


> The 36CU RDNA2 chip (PS5) has a 1.8GHz base and boosts to only 2.23GHz; there is no way an 80CU chip would have a "base clock" of 2.4GHz.
> 
> The 56CU RDNA chip (XSeX) clocks at 1.825GHz (no boosting).
> 
> 2.4+ GHz as base is just a *FUD campaign*.


You actually cannot base clock assumptions on consoles. High-end dGPUs have a power budget higher than an entire console, which includes a CPU and other stuff too.
I'm not saying that an 80CU Navi21 will boost to 2.7GHz, but maybe boosting to 2.3~2.4GHz is viable.


----------



## Adam Krazispeed (Oct 18, 2020)

Based on TechPowerUp's specs and bench


----------



## INSTG8R (Oct 18, 2020)

Adam Krazispeed said:


> Based on TechPowerUp's specs and bench (attachment 172230)


The problem with that is all those cards were using completely different settings, APIs, and benchmark scenarios. All they have in common is the same game and the same resolution.


----------



## Adam Krazispeed (Oct 18, 2020)

I modified this to where I believe the 6900 XT/XTX will sit in the stack on release.


----------



## INSTG8R (Oct 18, 2020)

Adam Krazispeed said:


> I modified this to where I believe the 6900 XT/XTX will sit in the stack on release.


I like your optimism. I'm total Red Team, but I have to remain sceptical. The leaked numbers floating around look promising, but until they turn into benchable performance numbers they don't mean much.


----------



## TheoneandonlyMrK (Oct 18, 2020)

nguyen said:


> Let me tell you that you don't understand the fundamental difference between Steam data and Mindfactory data.
> Steam data represent what GPU are being used by Steam users.
> Mindfactory data are number of GPU being sold.
> Steam is the absolute percentage while Mindfactory is the relative change, get it ?
> ...


Pot, kettle, black: you see Steam stats as that golden single source you're berating others for seeing in Mindfactory.

Neither can be the last word, clearly.


----------



## medi01 (Oct 19, 2020)

Zach_01 said:


> I agree! There is no need for this...
> 
> 
> You actually cannot base the clock assumptions upon consoles. High-end dGPUs have a power budget higher than an entire console that includs a CPU and other stuff too.
> I'm not saying that an 80CU Navi21 will boost to 2.7GHz. Maybe boosting to 2.3~2.4GHz is viable.



If an 80CU chip could have a 2.4GHz base, then the 56CU chip would definitely go beyond 1.8GHz.


----------

