# AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+



## Raevenlord (Oct 18, 2020)

AMD's RDNA2-based cards are just around the corner, with the company's full debut of the secrecy-shrouded cards set for October 28th. Rumors of high clocks on AMD's new architecture - unsubstantiated until now - have seemingly been confirmed, with Patrick Schur posting on Twitter some specifications for the upcoming RDNA2-based Navi 21 XT. Navi 21 XT falls under the Big Navi chip, but likely isn't the top performer from AMD - the company is allegedly working on a Navi 21 XTX solution, which ought to be exclusive to its reference designs, with higher clocks and possibly more CUs.

The specs outed by Patrick are promising, to say the least; that AMD's Big Navi can reach clocks in excess of 2.4 GHz with a 250 W+ TGP (quoted at around 255 W) is certainly good news. The 2.4 GHz (game clock) speeds are being associated with AIB cards; AMD's own reference designs should run at a more conservative 2.3 GHz. A memory pool of 16 GB GDDR6 has also been confirmed. AMD's assault on the NVIDIA 30-series lineup should comprise three models carved from the Navi 21 chip: the higher-performance, AMD-exclusive XTX; the XT; and the lower-performance Navi 21 XL. All of these are expected to ship with the same 256-bit bus and 16 GB of GDDR6 memory, whilst taking advantage of AMD's (rumored, for now) Infinity Cache to make up for the lower memory speeds and narrower bus. Hold on to your hats; the hype train is going full speed ahead, luckily stopping in a smooth manner come October 28th.



 





----------



## INSTG8R (Oct 18, 2020)

Just want the 6700XT to replace my 5700XT.


----------



## Verpal (Oct 18, 2020)

Good; best case scenario, the XTX might be competitive against the RTX 3090, if this "Infinity Cache" thing actually works well.

Considering Zen 3 pricing, I won't bet on AMD undercutting too much this time around.


----------



## cueman (Oct 18, 2020)

Interesting. Looks like it can't beat the RTX 3080, so AMD dropped its TDP down.
Those clocks sound high, very high.

Under 260 W and RTX 3080 speed? No way. Not at 4K.


----------



## Flanker (Oct 18, 2020)

If RDNA2 can do what the HD4xxx series did at the time, I think that will be good news for consumers.


----------



## xman2007 (Oct 18, 2020)

Verpal said:


> Good; best case scenario, the XTX might be competitive against the RTX 3090, if this "Infinity Cache" thing actually works well.
> 
> Considering Zen 3 pricing, I won't bet on AMD undercutting too much this time around.


I don't get this notion that, for the same performance, they should be $50-100 cheaper than their NVIDIA counterpart, though it's a fairly common presumption.


----------



## INSTG8R (Oct 18, 2020)

xman2007 said:


> I don't get this notion that, for the same performance, they should be $50-100 cheaper than their NVIDIA counterpart, though it's a fairly common presumption.


Well, that's usually what AMD does historically. But if they really have a 3080 contender, who knows how pricing will go. It's a pretty conventional card, no exotic HBM etc.


----------



## john_ (Oct 18, 2020)

2.4 GHz is a great frequency for a GPU, but in the end, performance is what matters.


----------



## ZoneDymo (Oct 18, 2020)

xman2007 said:


> I don't get this notion that, for the same performance, they should be $50-100 cheaper than their NVIDIA counterpart, though it's a fairly common presumption.



Because... that is a reason for consumers to buy their card over the competitor's? AKA make them some money?

Like, if NVIDIA and AMD both had a card that performed identically and had the same power consumption etc., then personally, ATM I would probably go for NVIDIA purely for that well-done NVENC, which AMD has no answer for as of yet.

But if the AMD card is a good chunk of change cheaper, then I don't care about NVENC and would get AMD.


----------



## BMfan80 (Oct 18, 2020)

Flanker said:


> If RDNA2 can do what the HD4xxx series did at the time, I think that will be good news for consumers.



That would be awesome. I had a 4770, and when overclocked it was a beast of a card.
I have a screenshot of it back in the day with a FurMark score that is higher than a GTX 470's.

I'm waiting to see what the new cards can do; I would like to replace my 1080 Ti with one.


----------



## PrEzi (Oct 18, 2020)

ZoneDymo said:


> I would probably go for NVIDIA purely for that well-done NVENC, which AMD has no answer for as of yet.


Wait, what? What about the encoding engine with on-par features that has been there for ages? VCE?
I use A's Video Converter (free) for simple recodings if the quality doesn't matter (and speed is preferred... on the other hand, with a 3960X it doesn't really make that much difference...).
If I want to achieve high quality, then no HW encoder is able to deliver it. Only high-quality 2-pass encoding on a CPU.


----------



## EarthDog (Oct 18, 2020)

Nice... I'm imagining a 3080 competitor (within a few %, give or take) and cheaper. Sounds like a winner!


----------



## Max(IT) (Oct 18, 2020)

Hopefully they didn't go the "9800XT way".
We need a reliable product from AMD after the "5700XT fiasco", not just something faster. Even if it is slightly slower than the Ampere competitor, with 16 GB of VRAM and a lower price I would consider it.
IF (and it's a big IF) it is reliable.
The 5700XT wasn't.


----------



## TheoneandonlyMrK (Oct 18, 2020)

Looking good let's hope it benches well.


----------



## Vya Domus (Oct 18, 2020)

High clocks mean really high performance from the ROPs, something almost all large AMD GPUs seem to have lacked over the years. You can't really implement "wide ROPs", as the work they perform isn't easily parallelizable, so increasing their clock speed is really the only option.


----------



## Finners (Oct 18, 2020)

Who is Patrick Schur? A quick Google just brings up a Twitter account with 1,000 followers.


----------



## Fabio (Oct 18, 2020)

BMfan80 said:


> That would be awesome. I had a 4770, and when overclocked it was a beast of a card.
> I have a screenshot of it back in the day with a FurMark score that is higher than a GTX 470's.
> 
> I'm waiting to see what the new cards can do; I would like to replace my 1080 Ti with one.


I need to replace my 1080 too. A card like that would be awesome.



Max(IT) said:


> Hopefully they didn't go the "9800XT way".
> We need a reliable product from AMD after the "5700XT fiasco", not just something faster. Even if it is slightly slower than the Ampere competitor, with 16 GB of VRAM and a lower price I would consider it.
> IF (and it's a big IF) it is reliable.
> The 5700XT wasn't.


Why do you say the 5700XT was a fiasco?


----------



## INSTG8R (Oct 18, 2020)

Fabio said:


> Why do you say the 5700XT was a fiasco?


Well, it did have a rough launch driver-wise, so the first month or so wasn't great, but "fiasco" is a bit of an exaggeration.


----------



## PTyop (Oct 18, 2020)

BMfan80 said:


> That would be awesome. I had a 4770, and when overclocked it was a beast of a card.
> I have a screenshot of it back in the day with a FurMark score that is higher than a GTX 470's.
> 
> I'm waiting to see what the new cards can do; I would like to replace my 1080 Ti with one.



That would be a hell of an overclock, as the GTX 470 was 2 to 3 times more powerful than the HD 4770!


----------



## bug (Oct 18, 2020)

xman2007 said:


> I don't get this notion that, for the same performance, they should be $50-100 cheaper than their NVIDIA counterpart, though it's a fairly common presumption.



It's a psychological thing: if you offer something on par with the competition, only later, you're at a disadvantage in the buyer's mind. So you "have to" offset that somehow, which usually means price.
Not very logical, but we aren't Vulcans either.


----------



## BMfan80 (Oct 18, 2020)

PTyop said:


> That would be a hell of an overclock, as the GTX 470 was 2 to 3 times more powerful than the HD 4770!



I found the old screenshot and compared it against my then-7950.


----------



## fynxer (Oct 18, 2020)

Yeah, this guy must have been high on GPU dust for sure, and he thinks unicorns exist too.

Why this is BULLSH!T: he is saying 2.4 GHz as a base frequency, which would make the boost much higher than that. Not a chance in hell this is true.


----------



## dj-electric (Oct 18, 2020)

It's the end of 2020, it's been who-knows-how-many years of overhyped GPUs, yet people still bite...
"Seemingly confirmed" is truly just the icing on this cake.


----------



## Totally (Oct 18, 2020)

INSTG8R said:


> Well that’s usually what AMD does historically. But if they really have a 3080 contender who knows how pricing will go. It’s a pretty conventional card, no exotic HBM etc



You're in the dark ages, bro; price wars are a thing of the past. I put those days behind me when AMD announced they were no longer the value brand when they released Zen. Since NVIDIA is currently dictating prices, whatever ceiling NVIDIA sets, AMD is going to do their best to match.


----------



## INSTG8R (Oct 18, 2020)

dj-electric said:


> It's the end of 2020, it's been who-knows-how-many years of overhyped GPUs, yet people still bite...
> "Seemingly confirmed" is truly just the icing on this cake.


I would say this has been the least-hyped GPU release in years. No leaks, no "Raja hype" (poor Intel); we got a little tease at the CPU launch, and that's about it.



Totally said:


> You're in the dark ages, bro; price wars are a thing of the past. I put those days behind me when AMD announced they were no longer the value brand when they released Zen. Since NVIDIA is currently dictating prices, whatever ceiling NVIDIA sets, AMD is going to do their best to match.


Definitely want to see proper competition for sure.


----------



## Totally (Oct 18, 2020)

Max(IT) said:


> Hopefully they didn't go the "9800XT way".
> We need a reliable product from AMD after the "5700XT fiasco", not just something faster. Even if it is slightly slower than the Ampere competitor, with 16 GB of VRAM and a lower price I would consider it.
> IF (and it's a big IF) it is reliable.
> The 5700XT wasn't.



Fiasco? Can you help my memory?

Did the cards crash to desktop frequently when gaming or pushed hard?
Were the cards sold out day 1, hour 1, minute 1 because of extremely limited supply, or non-existent supply in some places?
Very power hungry?


----------



## bug (Oct 18, 2020)

Totally said:


> Fiasco? Can you help my memory?


AMD's trademark: driver problems, to this day not solved.
And yes, it is power hungry. The 5700 has better perf/W than its direct competitor; the 5700XT, pushed a little harder, doesn't.
Personally, I wouldn't call the 5700XT a fiasco (if you didn't care about RTRT, it was basically a 2070 for $100 less), but it certainly seemed to have more problems than usual. One problem actually, but a very annoying one that AMD couldn't solve.


----------



## HD64G (Oct 18, 2020)

So, we have specs confirmed for the Navi 21 XT (many doubted those before, and some of us were almost sure about them): 80 CUs, 16 GB VRAM, close to 2.3 GHz boost clocks. Performance competitive with the 3080 for the reference stock 6900XT. Price, driver stability/performance, and launch-week volume are unknown. If AMD wants to become very competitive with RDNA2, it is in their hands now, as NVIDIA has shown their cards (a 20 GB 3080 won't be significantly faster than the 10 GB 3080, with the 3090 being only 10% faster).


----------



## Totally (Oct 18, 2020)

bug said:


> AMD's trademark: driver problems, to this day not solved.
> And yes, it is power hungry. The 5700 has better perf/W than its direct competitor; the 5700XT, pushed a little harder, doesn't.
> Personally, I wouldn't call the 5700XT a fiasco (if you didn't care about RTRT, it was basically a 2070 for $100 less), but it certainly seemed to have more problems than usual. One problem actually, but a very annoying one that AMD couldn't solve.



I was pointing out that the 30-series launch was worse, and even that isn't being described as a fiasco. Even you say yourself the 5700XT launch couldn't be described as one.


----------



## INSTG8R (Oct 18, 2020)

Totally said:


> I was pointing out that the 30-series launch was worse, and even that isn't being described as a fiasco. Even you say yourself the 5700XT launch couldn't be described as one.


Best part, which I'm sure many may not have even noticed: when "the fixes" driver was released, the boost clocks were raised, so basically everyone got a decent OC for their troubles.


----------



## EarthDog (Oct 18, 2020)

INSTG8R said:


> Best part, which I'm sure many may not have even noticed: when "the fixes" driver was released, the boost clocks were raised, so basically everyone got a decent OC for their troubles.


If a tree falls in the forest...? 

Kidding! But yeah, the 5700XT was a bit unique. Worse performance per watt than its little brothers while trying to punch up. Driver issues at launch... then Adrenalin 2020 was quite a rough release as well. Black screen issues... it's a minefield, but not something that should sway an enthusiast (others, perhaps).


----------



## INSTG8R (Oct 18, 2020)

EarthDog said:


> If a tree falls in the forest...?
> 
> Kidding! But yeah, the 5700XT was a bit unique. Worse performance per watt than its little brothers while trying to punch up. Driver issues at launch... then Adrenalin 2020 was quite a rough release as well. Black screen issues... it's a minefield, but not something that should sway an enthusiast (others, perhaps).


Well, you know I literally punish myself with drivers; even so, I've had a great experience overall. No denying the Adrenalin release did not go well.


----------



## LFaWolf (Oct 18, 2020)

Who is this Patrick Schur? A quick search shows this Patrick Schur joined Twitter in August 2020 and is a software engineer, maybe someone who just graduated from college. Is this even credible at all?


----------



## EarthDog (Oct 18, 2020)

INSTG8R said:


> Well, you know I literally punish myself with drivers; even so, I've had a great experience overall. No denying the Adrenalin release did not go well.


You = enthusiast. So, like I said...

And you're a beta tester, for Pete's sake. Lol.

There isn't any denying that the waters are a bit rougher on the AMD side as of late. Again, it shouldn't sway most users, but if you're looking for a better chance of set-it-and-forget-it, the nod goes to the other side, if only by a small margin.


----------



## ZoneDymo (Oct 18, 2020)

PrEzi said:


> Wait, what? What about the encoding engine with on-par features that has been there for ages? VCE?
> I use A's Video Converter (free) for simple recodings if the quality doesn't matter (and speed is preferred... on the other hand, with a 3960X it doesn't really make that much difference...).
> If I want to achieve high quality, then no HW encoder is able to deliver it. Only high-quality 2-pass encoding on a CPU.



I'm talking about using it for livestreaming with, for example, OBS. EposVox has done multiple comparisons where VCE just does not touch NVENC ATM, not even close.



fynxer said:


> Yeah, this guy must have been high on GPU dust for sure, and he thinks unicorns exist too.
> 
> Why this is BULLSH!T: he is saying 2.4 GHz as a base frequency, which would make the boost much higher than that. Not a chance in hell this is true.



2.4 GHz game clock, he says.


----------



## AusWolf (Oct 18, 2020)

The XTX chip with more CUs is AMD-exclusive? Does this mean that we mere mortals (= Europeans) won't get fully unlocked chips, just like we can't have NVIDIA FE cards either? If that's the case, I'm not happy.


----------



## TheoneandonlyMrK (Oct 18, 2020)

EarthDog said:


> You = enthusiast. So, like I said...
> 
> And you're a beta tester, for Pete's sake. Lol.
> 
> There isn't any denying that the waters are a bit rougher on the AMD side as of late. Again, it shouldn't sway most users, but if you're looking for a better chance of set-it-and-forget-it, the nod goes to the other side, if only by a small margin.


Most users didn't experience any issues with AMD drivers; it's hyped, and we're over it for now.

The forums would have been flooded had the majority had issues instead of some.


----------



## INSTG8R (Oct 18, 2020)

theoneandonlymrk said:


> Most users didn't experience any issues with AMD drivers; it's hyped, and we're over it for now.
> 
> The forums would have been flooded had the majority had issues instead of some.


Well, the recently added Bug Report Tool makes it easier than ever to report any bugs that are encountered, sent right to AMD to get looked at. Even I use it when testing drivers.


----------



## Chomiq (Oct 18, 2020)

Confirmed by nobody on Twitter, wow much reliable info.


----------



## okbuddy (Oct 18, 2020)

must be 10% faster than 3080


----------



## Punkenjoy (Oct 18, 2020)

On the CPU side, AMD have proven themselves with Zen 1 and Zen+; at first, people were reluctant to jump in. Zen 1 also had a few hiccups at launch with memory support and such. But now they have the brand recognition on the CPU side, so they can be the most expensive solution if they have more performance.

On the GPU side, they aren't there yet. If they price-match or are priced higher than NVIDIA, people will get the NVIDIA card anyway. Just look at how many 1030/1050/1650s got sold when there was a far more performant card on the AMD side for the same price.

I think this is the generation where AMD has the potential, like Zen 2, to earn some good recognition and brand name. If they do, they will probably be able to price-match or charge more for Navi 3 cards.


----------



## DeathtoGnomes (Oct 18, 2020)

bug said:


> AMD's trademark: driver problems, to this day not solved.
> And yes, it is power hungry. The 5700 has better perf/W than its direct competitor; the 5700XT, pushed a little harder, doesn't.
> Personally, I wouldn't call the 5700XT a fiasco (if you didn't care about RTRT, it was basically a 2070 for $100 less), but it certainly seemed to have more problems than usual. One problem actually, but a very annoying one that AMD couldn't solve.


I have yet to see any day-1 drivers, from either camp, be perfect. Both camps have early driver problems. Team Green has done a better job at updating drivers than AMD has, but both are far from perfect.


----------



## Vayra86 (Oct 18, 2020)

If they can push 2.4 GHz, that is definitely going to change my perception of this release. A 3080 equivalent?


----------



## Anymal (Oct 18, 2020)

cueman said:


> Interesting. Looks like it can't beat the RTX 3080, so AMD dropped its TDP down.
> Those clocks sound high, very high.
> 
> Under 260 W and RTX 3080 speed? No way. Not at 4K.


TGP is under 260 W; that is the GPU only. You have to add the memory modules and all the other components on the PCB to get TBP; the 3080 has a 320 W TBP.



Vayra86 said:


> If they can push 2.4 Ghz that is definitely going to change my perception of this release. 3080 equivalent?


61 FPS vs. the 3080's 65 FPS in Borderlands 3, 4K ultra.


----------



## Icon Charlie (Oct 18, 2020)

bug said:


> AMD's trademark: driver problems, to this day not solved.
> And yes, it is power hungry. The 5700 has better perf/W than its direct competitor; the 5700XT, pushed a little harder, doesn't.
> Personally, I wouldn't call the 5700XT a fiasco (if you didn't care about RTRT, it was basically a 2070 for $100 less), but it certainly seemed to have more problems than usual. One problem actually, but a very annoying one that AMD couldn't solve.



As posted previously, the RX 5700 is a very good card, and I prefer it over the 5700XT that I have due to temp/wattage issues.

If AMD's new card comes in around the reported 250 W range, I WILL BUY ONE. It will easily fit into my current rig without adding any increased stress to it, because I build my computers with future upgradeability in mind.

Most people do not build their rig with future upgrade potential in mind and therefore skimp in certain areas, such as the PSU.

This is great if their wattage vs. performance is true.


----------



## Vya Domus (Oct 18, 2020)

I don't think it'll be just 250 W; caches consume a lot of power, in fact they are usually the biggest consumers of power in a chip. And if the rumors are true, a lot of this chip will just be memory.


----------



## DeathtoGnomes (Oct 18, 2020)

Anymal said:


> *TGP* is under 260 W; that is the GPU only. You have to add the memory modules and all the other components on the PCB to get *TBP*; the 3080 has a 320 W *TBP*.


I have to say, not everyone knows every acronym. TBP is The Pirate's Bay or The Burning Process?


----------



## Zach_01 (Oct 18, 2020)

Anymal said:


> TGP is under 260 W; that is the GPU only. You have to add the memory modules and all the other components on the PCB to get TBP; the 3080 has a 320 W TBP.
> 
> 
> 61 FPS vs. the 3080's 65 FPS in Borderlands 3, 4K ultra.


Not exactly...

TGP stands for "Total Graphics Power", and that includes the GPU + VRAM + the PCB components that power them.
TBP stands for "Total Board Power", and that includes TGP + the cooling solution + lights (if any).

And the FPS numbers we saw at the Zen 3 presentation were a tease from AMD, to show that they have something better than a 2080 Ti competitor and to stop the false hype.
They are not stupid enough to show everything they have.
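To make the TGP/TBP distinction concrete, here's a minimal sketch; every wattage figure below is purely illustrative (none of them come from the leak):

```python
# Hypothetical power breakdown illustrating TGP vs. TBP.
# All numbers are made up for illustration only.

def total_graphics_power(gpu_core_w: float, vram_w: float, power_delivery_w: float) -> float:
    """TGP: GPU core + VRAM + the PCB components powering them."""
    return gpu_core_w + vram_w + power_delivery_w

def total_board_power(tgp_w: float, cooling_w: float, lighting_w: float = 0.0) -> float:
    """TBP: TGP plus the cooling solution and any lighting."""
    return tgp_w + cooling_w + lighting_w

# An imaginary 255 W TGP card ends up drawing a bit more at the board level.
tgp = total_graphics_power(gpu_core_w=200, vram_w=35, power_delivery_w=20)  # 255 W
tbp = total_board_power(tgp, cooling_w=12, lighting_w=3)                    # 270 W
print(tgp, tbp)
```

The point is just that TBP is always a superset of TGP, so comparing one card's TGP against another card's TBP understates the gap.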


----------



## INSTG8R (Oct 18, 2020)

Anymal said:


> 61 FPS vs. the 3080's 65 FPS in Borderlands 3, 4K ultra.


Navi was running DX12 and the "Bad Ass" preset, so it'd be even higher.


----------



## EarthDog (Oct 18, 2020)

theoneandonlymrk said:


> Most users didn't experience any issues with AMD drivers; it's hyped, and we're over it for now.
> 
> The forums would have been flooded had the majority had issues instead of some.


I get it. That doesn't mean it still wasn't a significant issue. If 98% of people were fine before and 97% during the peak, that's 50% more issues. It was enough of a problem that AMD had to publicly address it. There were threads all over here, there, Reddit, and everywhere; more than the usual background static, for sure... for _several_ months. The black screen issue was "known" (listed) in their release notes for months. I don't call that hype. I would call it a non-issue at this point.
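The arithmetic in that post does check out: going from 2% of users with issues to 3% is a 50% relative increase even though the absolute change is a single percentage point. A quick sketch (the 98%/97% figures are the post's own hypothetical, not measured data):

```python
# Relative increase in issue rate when the share of unaffected users
# drops from 98% to 97% (i.e., the issue rate goes from 2% to 3%).

def relative_increase(old_rate: float, new_rate: float) -> float:
    """Fractional growth of a rate, e.g. 0.5 means 50% more."""
    return (new_rate - old_rate) / old_rate

issues_before = 1 - 0.98   # 2% of users had issues
issues_during = 1 - 0.97   # 3% of users had issues
growth = relative_increase(issues_before, issues_during)
print(f"{growth:.0%}")
```

Small absolute shifts in a rare-event rate can look dramatic in relative terms, which is part of why the two posters talk past each other.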


----------



## Searing (Oct 18, 2020)

INSTG8R said:


> Well, it did have a rough launch driver-wise, so the first month or so wasn't great, but "fiasco" is a bit of an exaggeration.



No, the 5700 XT had game crashes (random black screen, then the display turning off) for 6+ months. Then it took them 13 months to finally solve my Netflix and YouTube crashes from hardware-accelerated decode. That is not acceptable for any company. I was 0/5 with AMD across 5 different customers (had to return or sell all of them) until I bought a 5600 XT from MSI a month ago.


----------



## INSTG8R (Oct 18, 2020)

Searing said:


> No, the 5700 XT had game crashes (random black screen, then the display turning off) for 6+ months. Then it took them 13 months to finally solve my Netflix and YouTube crashes from hardware-accelerated decode. That is not acceptable for any company.


Hardware acceleration has had issues on both sides with certain apps, on and off. Definitely not just an AMD issue.


----------



## evernessince (Oct 18, 2020)

Totally said:


> I was pointing out the 30 series is a worse launch and even that isn't being described as a fiasco. Even you say it yourself the 5700xt launch couldn't be described as a fiasco.



It's funny, that AMD fiasco. Not a single reviewer could replicate the issues. Not saying it didn't happen to some degree, but it seems it was more a bandwagon than a fact-finding investigation. I can't tell you how many times I've read "turns out it wasn't my 5700 XT that was the issue" on Reddit. It comes as no surprise that random anecdotal evidence on Reddit does not qualify as fact. I guess the more people spend, the more they feel it necessary to defend that purchase. Makes sense, given Turing was the worst generation in NVIDIA history in terms of price-to-performance improvement.

The 30xx-series launch has certainly been a fiasco, and we can all verify the facts of that.


----------



## TumbleGeorge (Oct 18, 2020)

INSTG8R said:


> Hardware acceleration has had issues on both sides with certain apps, on and off. Definitely not just an AMD issue.


Exactly; everything goes through the OS. Microsoft also has a hand in the quality and efficiency of hardware acceleration. Maybe I'm wrong?


----------



## Searing (Oct 18, 2020)

I build computers for a living. I built 5 different RX 5700 XT computers, all with completely different parts, and all of them had some kind of severe crashing issue with the RX 5700 XT. I don't need to rely on YouTubers for info; I'm more of an expert than they are. Over 100 video cards have passed through my hands in the last year. I know people want AMD to be healthy, but their driver team really messed up in the last year. Every computer, without exception, was fixed by switching to NVIDIA or an OLD AMD card. The black screen bug was caused by many, many different things, and they were playing whack-a-mole one at a time all year long.

I'll be first in line for the latest AMD card, but believe me, I'm only giving them one or two chances.


----------



## squallheart (Oct 18, 2020)

PrEzi said:


> Wait, what? What about the encoding engine with on-par features that has been there for ages? VCE?
> I use A's Video Converter (free) for simple recodings if the quality doesn't matter (and speed is preferred... on the other hand, with a 3960X it doesn't really make that much difference...).
> If I want to achieve high quality, then no HW encoder is able to deliver it. Only high-quality 2-pass encoding on a CPU.


You do realize it doesn't perform as well as NVENC in both speed and quality, right? This fanboyism is hilarious to me.


----------



## TheoneandonlyMrK (Oct 18, 2020)

EarthDog said:


> I get it. That doesn't mean it still wasn't a significant issue. If 98% of people were fine before and 97% during the peak, that's 50% more issues. It was enough of a problem that AMD had to publicly address it. There were threads all over here, there, Reddit, and everywhere; more than the usual background static, for sure... for _several_ months. The black screen issue was "known" (listed) in their release notes for months. I don't call that hype. I would call it a non-issue at this point.


I didn't say there wasn't a significant issue, just that not everyone was affected. You're playing it up, and clearly you think I'm downplaying it, so the truth is likely somewhere between the two, eh?
We both agree those issues are in the past, and I'm sure we both agree they could return at any moment, for any brand; the quality control on software isn't what it could be at most companies, IMHO.


----------



## deltaseven (Oct 18, 2020)

I'm very much curious about the CU count on the XTX variant, as the XT should have 80 CUs and is supposed to be the one shown on October 8th during the Zen 3 announcement.

If the price is right and drivers are also fine at launch, the card lineup should be a success.

I've read some more info here and there about the clock speeds across the Navi 21 variants.

I'll leave this here in addition to the "tweeted rumors" of the day:

https://twitter.com/i/web/status/1317731737347248129


----------



## INSTG8R (Oct 18, 2020)

The thing is, the black screen issue still creeps up randomly, and even with systems set up to collect crash dumps, and with the crash mitigations currently in place, getting data to send to AMD has proven difficult. Without some kind of log/dump files to pinpoint the issue, it remains random and elusive. I've personally never had one, but I was trying to help someone who was, hoping he'd have a dump or a log to forward, but alas, nope...


----------



## evernessince (Oct 18, 2020)

Searing said:


> I build computers for a living. I built 5 different RX 5700 XT computers, all with completely different parts, and all of them had some kind of severe crashing issue with the RX 5700 XT. I don't need to rely on YouTubers for info; I'm more of an expert than they are. Over 100 video cards have passed through my hands in the last year. I know people want AMD to be healthy, but their driver team really messed up in the last year. Every computer, without exception, was fixed by switching to NVIDIA. The black screen bug was caused by many, many different things, and they were playing whack-a-mole one at a time all year long.
> 
> I'll be first in line for the latest AMD card, but believe me, I'm only giving them one or two chances.



Statistically speaking, having 5 cards with problems is extremely unlikely. The typical GPU RMA rate is 0.8%; AMD's 5700 XT return rate was 1.7% at the height of the driver issues.

Just saying, if your failure rate for AMD is 100%, either you are one-in-a-billion unlucky or there is something else going on.
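The "one in a billion" line can be sanity-checked. Assuming each card fails independently with the quoted 1.7% peak return rate (independence is itself an assumption, and returns aren't the same thing as driver crashes, as the reply below points out):

```python
# Probability that n independently chosen cards all exhibit problems,
# if each card fails with probability p. With p = 0.017 (the quoted
# peak 5700 XT return rate), five-for-five lands around 1.4e-9.

def all_fail_probability(p: float, n: int) -> float:
    """Chance that all n independent trials fail, each with probability p."""
    return p ** n

prob = all_fail_probability(0.017, 5)
print(prob)  # roughly 1.4e-09, i.e. on the order of one in a billion
```

If the real-world rate of *driver* complaints was higher than the return rate, the odds improve, but they'd still be long; that gap is exactly what the two posters are arguing about.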


----------



## Steevo (Oct 18, 2020)

bug said:


> AMD's trademark: driver problems, to this day not solved.
> And yes, it is power hungry. The 5700 has better perf/W than its direct competitor; the 5700XT, pushed a little harder, doesn't.
> Personally, I wouldn't call the 5700XT a fiasco (if you didn't care about RTRT, it was basically a 2070 for $100 less), but it certainly seemed to have more problems than usual. One problem actually, but a very annoying one that AMD couldn't solve.




Sorry, but what?

I pre-ordered my 9600XT and had zero issues with it. The 9xxx series was pretty good, and it came with HL2, where the ATI drivers worked (without washed-out colors) unlike NVIDIA's, with better performance, and their AF actually worked at the time.






*Half Life 2 GPU Roundup Part 2 - Mainstream DX8/DX9 Battle* - www.anandtech.com


----------



## Emu (Oct 18, 2020)

Searing said:


> No the 5700 XT had game crashes (random black screen then turn off) for 6+ months. Then it took them 13 months to finally solve my Netflix and Youtube crashes from hardware accelerated decode. That is not acceptable for any company. I was 0/5 for AMD for 5 different customers (had to return or sell all of them) until I bought a 5600 XT from MSI a month ago.



Funnily enough, I still have that same issue with my RTX 2080 Ti: crashing to desktop and/or the screen turning off. It is random enough that I cannot just return the card, though.


----------



## Jism (Oct 18, 2020)

evernessince said:


> It's funny, that AMD fiasco. Not a single reviewer could replicate the issues. Not saying it didn't happen to some degree, but it seems it was more a bandwagon than a fact-finding investigation. I can't tell you how many times I've read "turns out it wasn't my 5700 XT that was the issue" on Reddit. It comes as no surprise that random anecdotal evidence on Reddit does not qualify as fact. I guess the more people spend, the more they feel it necessary to defend that purchase. Makes sense, given Turing was the worst generation in NVIDIA history in terms of price-to-performance improvement.
> 
> The 30xx-series launch has certainly been a fiasco, and we can all verify the facts of that.



Yep. PCIe 4.0 and certain overclocked memory profiles caused issues. When people turned down their memory settings, suddenly the card worked perfectly. It's not AMD's fault that overclocking voids stability.


----------



## Punkenjoy (Oct 18, 2020)

Jism said:


> Yep. PCIe 4.0 and certain overclocked memory profiles caused issues. When people turned down their memory settings, suddenly the card worked perfectly. It's not AMD's fault that overclocking voids stability.


Indeed

Many people were strongly GPU limited and didn't realised that their CPU Overclock wasn't stable at all. By putting a much faster GPU, the cpu finally had enough work to clock high enough to cause stability issue. It do not mean that Windows say your cpu is clocking at x frequency that it's actually doing that. By example my R5 3600 don't even clock at base clock in game due to me being GPU limited (no wonder why i am commenting on GPU news... lol)

The truth is, people who had a similar experience on the Nvidia side suspected their CPU, whereas on AMD, with their bad driver reputation, the GPU automatically got the blame.

That doesn't mean nobody had real driver issues. But those aren't exclusive to AMD; you can hit rare issues on the Nvidia side too. It's just that on AMD, people blame the GPU; on Nvidia, they blame everything else in the system.

In the end, Navi was a new architecture and the drivers had to mature. I suspect the changes from Navi 10 to Navi 21 aren't large enough to cause big driver issues. That doesn't mean there won't be any; even Nvidia had some with the 3080.


But the good news is that this market is getting very competitive, and that is the best thing that can happen for us. Much better than AMD destroying Nvidia or vice versa (oh god... those clickbait videos on YouTube...)


----------



## Searing (Oct 18, 2020)

evernessince said:


> Statistically speaking having 5 cards with problems is extremely unlikely.  Typical GPU RMA rate is 0.8%.  AMD's 5700 XT return rate was 1.7% at the height of the driver issues.
> 
> Just saying, if your return rate for AMD is 100% either you are one in a billion unlucky or there is something else going on.



Having 5 cards with problems from the drivers is absolutely likely. The cards are not broken. It isn't about returns; it is about not working properly. People keep stuff that crashes too much, even though it wouldn't crash with an RX 580 or nVidia card. RMA or not means nothing. If I build a computer for someone and get called back to fix it, then I know the issue. The card isn't returned, and a replacement would have the same problem, because the hardware isn't the issue; the drivers are.


----------



## PrEzi (Oct 18, 2020)

squallheart said:


> You do realize it doesn’t perform as well as NVENC both in speed and quality right? This fanboism is hilarious to me.



And here you are completely missing my point (trouble reading with comprehension?). Where did I say that VCE is better? Where did I state that I find this solution superior?
I am not a streamer; I don't give a damn about OBS or any of that.
If I want quality video material, then by all means I use neither NVENC nor VCE; I use tuned 2-pass profiles and encode on the Threadripper 3960X.

Calling someone a fanboy without understanding the point is hilarious to me.


----------



## Searing (Oct 18, 2020)

Emu said:


> Funnily enough, I have that same issue still with my RTX 2080 ti with crashing to desktop and/or the screen turning off.  It is random enough that I cannot just return my card though.



The AMD issue is very specific. The screen goes black, the audio continues, then it hard crashes. There were MANY different things causing that to happen. I'd love for AMD to give an explanation but they won't. Crashing to desktop is very different and not as severe. I hope you can find a fix. Try underclocking your card, crashes to desktop are a lot easier to get a handle on than hard faults.


----------



## TheoneandonlyMrK (Oct 18, 2020)

Searing said:


> It isn't about returns. It is about not working properly. People keep stuff that doesn't work and that crashes, but it wouldn't crash with an RX 580 or nVidia card.


I had a different experience. I have a GTX 460 X2, and a GTX 560 too, that were always a nightmare and just dead, respectively; they're in the drawer of doom. I kept them. My first Polaris and this Vega will be there before long, but all the AMD cards I've had in my life have continuously crunched, folded, or mined for a bit, on AMD drivers, both stock and heavily overclocked on air and water, solidly. I wouldn't knock your experiences though; I have not used an RDNA card, and there were architectural differences that required massive driver changes, so I wouldn't be surprised if they had issues. Evidently they did. Excuse the pre-emptive post.


----------



## Searing (Oct 18, 2020)

INSTG8R said:


> Best part, which I'm sure many may not even have noticed: when “The Fixes” driver was released, the boost clocks were raised, so basically everyone got a decent OC for their troubles.



I'll take a fix after one week over a fix after 13 months any day though 



theoneandonlymrk said:


> I had a different experience. I have a GTX 460 X2, and a GTX 560 too, that were always a nightmare and just dead, respectively; they're in the drawer of doom. I kept them. My first Polaris and this Vega will be there before long, but all the AMD cards I've had in my life have continuously crunched, folded, or mined for a bit, on AMD drivers, both stock and heavily overclocked on air and water, solidly. I wouldn't knock your experiences though; I have not used an RDNA card, and there were architectural differences that required massive driver changes, so I wouldn't be surprised if they had issues. Evidently they did. Excuse the pre-emptive post.



I've had tons and tons of AMD cards with no problems. The 5700 XT is different. In fact I used to consider nVidia's drivers to be worse historically (the Vista problems). I also had all nVidia cards crashing Hearthstone with certain high FPS monitors, and GTA5 so I used to actually replace nVidia cards with AMD cards to solve those problems (it took Blizzard a few years, but the Hearthstone launch crash has been fixed now).

I'm just worried the partisanship has really led AMD fans and even AMD itself to not take their driver issues seriously. (Hopefully they've reorganized their team and hired more people). They can't afford the 5700 XT fiasco to happen again with their next launch. I also found AMD's support to be dismissive. Like I'll show the "gpu accelerated bug" caused my GPU to go to 100 percent usage and drop frame rates in youtube to a few fps, and file a report and they'll say "we can't replicate that issue" and that's it. Like one guy tries it, discounts it, and gives up. Never fixes it. People won't be AMD customers for long with that kind of quality.


----------



## evernessince (Oct 19, 2020)

Searing said:


> Having 5 cards with problems from the drivers is absolutely likely. The cards are not broken. It isn't about returns; it is about not working properly. People keep stuff that crashes too much, even though it wouldn't crash with an RX 580 or nVidia card. RMA or not means nothing. If I build a computer for someone and get called back to fix it, then I know the issue. The card isn't returned, and a replacement would have the same problem, because the hardware isn't the issue; the drivers are.



No. Polls from Hardware Unboxed three months back suggest that only 7% of card owners were having severe issues like what you were describing (and let's be honest here, a number of those respondents were likely Nvidia owners checking the box that makes AMD look worse).  You are suggesting that 100% of 5700 / 5700 XTs had issues, and that simply isn't true.  I shouldn't even have to say it: if 100% had issues, drivers or otherwise, AMD would have been decimated.  Why would you assume that 5/5 cards having issues is "likely"?  In no universe is it.  None of the data supports that, and we have never seen anywhere near 100%.

My prior statement stands: either you are extraordinarily unlucky or there is something else going on.
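For what it's worth, the "one in a billion" figure checks out if you model each card as an independent draw at the quoted 1.7% return rate (the independence of the five purchases is an assumption here):

```python
# Probability that 5 independently bought cards all turn out bad,
# assuming each has a 1.7% chance of being a returner (independence assumed).
rma_rate = 0.017
all_five_fail = rma_rate ** 5
print(f"P(5/5 bad cards) ~ {all_five_fail:.1e}")  # roughly 1.4e-09, i.e. ~1.4 in a billion
```

Of course, if the failures share a common cause (same driver bug, same system), the draws aren't independent and the real probability is much higher, which is exactly the point of contention.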



Searing said:


> I'm just worried the partisanship has really led AMD fans and even AMD itself to not take their driver issues seriously. (Hopefully they've reorganized their team and hired more people). They can't afford the 5700 XT fiasco to happen again with their next launch. I also found AMD's support to be dismissive. Like I'll show the "gpu accelerated bug" caused my GPU to go to 100 percent usage and drop frame rates in youtube to a few fps, and file a report and they'll say "we can't replicate that issue" and that's it. Like one guy tries it, discounts it, and gives up. Never fixes it. People won't be AMD customers for long with that kind of quality.



No one's saying AMD shouldn't improve its drivers.  Navi clearly had / has issues.

The problem is when people come into threads making claims like 100% of AMD cards have issues.  You can't complain about partisanship when you yourself added to it.


----------



## dinmaster (Oct 19, 2020)

The issues would have been front-page news, like the Nvidia 3000-series shortage, if everyone was having the problems... move on


----------



## EarthDog (Oct 19, 2020)

theoneandonlymrk said:


> I didn't say there wasn't a significant issue, just that not everyone was affected. You're playing it up and clearly you think I'm downplaying it, so it's likely somewhere between the two, eh?
> We both agree those issues are in the past, and I'm sure we both agree they could return at any moment, for any brand. The quality control on software isn't what it could be at most companies, IMHO.


lol, I'm not playing it up. I'm a realist. The issue wasn't hyped. I'll leave it at that.


----------



## renz496 (Oct 19, 2020)

Flanker said:


> If RDNA2 can do what the HD4xxx series did at the time, I think that will be good news for consumers.



Good news for consumers, but it could be more damaging to AMD.


----------



## TheoneandonlyMrK (Oct 19, 2020)

renz496 said:


> Good news for consumers, but it could be more damaging to AMD.


How can a good card be bad for AMD? As XFX have shown, there's a market for all their old cards; as for the new ones, we'll see.

Unlike some, I think there's room for more profitable GPU vendors, not fewer.

Back at the dawn of 3D the viable market was small, so many a maker went under or was bought. Now two is far from enough, or balanced.


----------



## INSTG8R (Oct 19, 2020)

theoneandonlymrk said:


> How can a good card be bad for AMD? As XFX have shown, there's a market for all their old cards; as for the new ones, we'll see.


Yeah, talking about all these “Navi issues”, XFX really didn't help with their initial release... Thankfully they came out with a redesign that is now one of the better-regarded cards.


----------



## TheoneandonlyMrK (Oct 19, 2020)

INSTG8R said:


> Yeah, talking about all these “Navi issues”, XFX really didn't help with their initial release... Thankfully they came out with a redesign that is now one of the better-regarded cards.


Definitely for mining, apparently! (Rumoured!)


----------



## INSTG8R (Oct 19, 2020)

theoneandonlymrk said:


> Definitely for mining, apparently! (Rumoured!)


Nah, I mean the terrible first Thicc they released that was basically useless at cooling...


----------



## TheoneandonlyMrK (Oct 19, 2020)

INSTG8R said:


> Nah, I mean the terrible first Thicc they released that was basically useless at cooling...


The thing looked like it should have cooled well; half-assed final build, not enough clamp pressure, and more besides.


----------



## INSTG8R (Oct 19, 2020)

theoneandonlymrk said:


> The thing looked like it should have cooled well; half-assed final build, not enough clamp pressure, and more besides.


Yeah, but you have to give them credit: the replacement turned out to be an excellent card.


----------



## renz496 (Oct 19, 2020)

theoneandonlymrk said:


> How can a good card be bad for AMD as Xfx have shown, there's a market for all their old cards, the new, well see.
> 
> Unlike some I think there's room for more profitable GPU vendors not less.
> 
> Back in the dawn of 3d the pliable market was small so many a maker went under or we're bought, now two is far from enough, or balance.



The problem is when Nvidia starts retaliating in that price war. At some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margins further rather than take more of Nvidia's market share.


----------



## INSTG8R (Oct 19, 2020)

renz496 said:


> The problem is when Nvidia starts retaliating in that price war. At some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margins further rather than take more of Nvidia's market share.


Well, this is definitely their chance, with Nvidia's supply issues and inflated prices. If Big Navi is truly competitive and available, they have a chance to really take some market share back.


----------



## TheoneandonlyMrK (Oct 19, 2020)

renz496 said:


> The problem is when Nvidia starts retaliating in that price war. At some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margins further rather than take more of Nvidia's market share.


Except with chips as big as Nvidia's, on the latest nodes, with the very latest memory, all in tight supply, that's not going to happen. The prices are going, and have gone, up.


----------



## Mussels (Oct 19, 2020)

Hell, if it's really 250 W, it might be worth it over my 3080 pre-order...


----------



## Cheeseball (Oct 19, 2020)

Still going to stick with my guess about the RX 6900 XT/6800 XT (whatever Navi 21 will be):

$599 and just within 5% to 15% to the RTX 3080, just like how the RX 5700 XT was to the 2070 Super. Enough RAM and optimized memory bandwidth (from the supposed Infinity Cache) to run games at 4K pretty well.

For sure it should best the RTX 2080 Ti and beat the upcoming RTX 3070 in 1080p/1440p performance.


----------



## Mussels (Oct 19, 2020)

fynxer said:


> Yea, this guy must been high GPU DUST for sure and he thinks unicorns exists too.
> 
> Why this is BULLSH!T is because he is saying 2.4 MHz in base frequency which would make boost much higher than that. Not a chance in hell this is true.



GHz, not MHz... and he says game frequency not base clock which we don't know much about yet.


----------



## Mysteoa (Oct 19, 2020)

renz496 said:


> The problem is when Nvidia starts retaliating in that price war. At some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margins further rather than take more of Nvidia's market share.



Nvidia makes more money because they have a bigger market share in the consumer and professional/server GPU markets, whereas AMD still has a small CPU and GPU market share. They do make a decent amount of money from consoles, but they also have a huge debt that they are paying off. Let's not forget that Nvidia still has more employees than AMD. This kind of thing takes time; you can't compare a company that has been doing well all along with one that is recovering and expect the latter to be more profitable.


----------



## Bruno_O (Oct 19, 2020)

Cheeseball said:


> Still going to stick with my guess about the RX 6900 XT/6800 XT (whatever Navi 21 will be):
> 
> $599 and just within 5% to 15% to the RTX 3080, just like how the RX 5700 XT was to the 2070 Super. Enough RAM and optimized memory bandwidth (from the supposed Infinity Cache) to run games at 4K pretty well.
> 
> For sure it should best the RTX 2080 Ti and beat the upcoming RTX 3070 in 1080p/1440p performance.



Recent benchmarks from HW Unboxed show the 5700 XT tied with the 2070 Super, both first-party cards.
AMD's fine wine strikes again (new uArch, so of course big gains over time).


----------



## rtwjunkie (Oct 19, 2020)

EarthDog said:


> Nice... I'm imagining a 3080 competitor (within a few %, give or take) and cheaper. Sounds like a winner!


Depending on how true the rumor is, it certainly sounds competitive.



fynxer said:


> Why this is BULLSH!T is because he is saying 2.4 MHz in base frequency which would make boost much higher than that.


Read it again. It says “can reach”, not base clocks.


----------



## Cheeseball (Oct 19, 2020)

Bruno_O said:


> Recent benchmarks from HW Unboxed show the 5700 XT tied with the 2070 Super, both first-party cards.
> AMD's fine wine strikes again (new uArch, so of course big gains over time).



Still 5% to 15% behind in most games. It only beats it in COD Warzone and Borderlands.

Also I own a 5700 XT and it does match up.


----------



## INSTG8R (Oct 19, 2020)

Cheeseball said:


> Still 5% to 15% behind in most games. It only beats it in COD Warzone and Borderlands.
> 
> Also I own a 5700 XT and it does match up.


For me it’s been a decent 1440 card so I’m aiming to get the 6700XT when the market settles and there’s a Nitro version. I’m not leaving 1440 anytime soon


----------



## Camm (Oct 19, 2020)

I would not have liked to be an early adopter of a 3080/3090, with what the scuttlebutt on AMD's performance is looking like.

Nvidia can't really make a Super/Ti variant, so their only option will be price cuts (depending on where AMD prices, of course, but I'm expecting AMD to want to recoup market share, and with a product with a likely lower BOM cost, now's the time).

Even then, with how expensive many of those coolers look to make, price cuts would eat seriously into Nvidia's margins.


----------



## nguyen (Oct 19, 2020)

Bruno_O said:


> Recent benchmakrs from HW unboxed show the 5700xt tied with the 2070 S, both first parties.
> The AMD fine wine attacked again (new uArch so of course big gains over time).



It was benchmarked on a 3950X test rig with PCIe 4.0.
The 5700 XT gains a few % with PCIe 4.0, and the 2070S loses a few % with the 3950X vs the 10900K; some games even run very poorly with the 3950X.

Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use in any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same with the other (DLSS).

Now that Cyberpunk 2077 is about to release, I wonder how 5700 XT owners are gonna feel. Imminent upgrade to Big Navi? The CP2077 hype train is going stronger and stronger every day.


----------



## Camm (Oct 19, 2020)

nguyen said:


> Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use in any title that supports it.
> Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same with the other (DLSS).



PCIe runs regardless of the game.

DLSS 2.0 is still limited to a very small number of games. Until DLSS 2.0 (as, let's be honest, 1.0 is trash) gets good penetration across a variety of games, it's kinda dubious to include it in benchmarking.


----------



## Totally (Oct 19, 2020)

INSTG8R said:


> For me it’s been a decent 1440 card so I’m aiming to get the 6700XT when the market settles and there’s a Nitro version. I’m not leaving 1440 anytime soon



Same but I'll probably skip over this gen


----------



## INSTG8R (Oct 19, 2020)

Totally said:


> Same but I'll probably skip over this gen


Well, I'm certainly done with flagships, so it's not so hard on the wallet, and I want to stay current for testing purposes. I want the performance uplift and the new features.



nguyen said:


> It was benchmarked on a 3950X test rig with PCIe 4.0.
> The 5700 XT gains a few % with PCIe 4.0, and the 2070S loses a few % with the 3950X vs the 10900K; some games even run very poorly with the 3950X.
> 
> Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use in any title that supports it.
> ...


PCIe 4.0 will not make one iota of difference for a 5700 XT or a 3090... DLSS is a performance ”trick” and would totally skew any benchmarks; settings have to be like for like to be a fair, equal benchmark.


----------



## nguyen (Oct 19, 2020)

Camm said:


> PCIe runs regardless of the game.
> 
> DLSS 2.0 is still limited to a very small number of games. Until DLSS 2.0 (as, let's be honest, 1.0 is trash) gets good penetration across a variety of games, it's kinda dubious to include it in benchmarking.



If so, then HUB should have benched the 5700 XT with PCIe 3.0 just to be fair, no? Just go into the BIOS and choose PCIe 3.0.

Just saying, it's not really a fair comparison when you use a feature that is available on one (PCIe 4.0) but refuse to do the same for the other (DLSS). It's not like RTX users are leaving DLSS 2.0 off in any game; 5700 XT owners, however, might have to use PCIe 3.0 to fix the black screen bug.



INSTG8R said:


> Well, I'm certainly done with flagships, so it's not so hard on the wallet, and I want to stay current for testing purposes. I want the performance uplift and the new features.
> PCIe 4.0 will not make one iota of difference for a 5700 XT or a 3090... DLSS is a performance ”trick” and would totally skew any benchmarks; settings have to be like for like to be a fair, equal benchmark.



Not really; you can check HUB's PCIe 3.0 vs 4.0 benchmarks. Even the 5700 XT can gain 5% with PCIe 4.0.

Every optimization is a performance trick; do you really care when it gives the same image quality?
If no one had explained how DLSS works, you would just consider it an optimization like anything else.


----------



## Camm (Oct 19, 2020)

nguyen said:


> Just saying, it's not really a fair comparison when you use a feature that is available on one (PCIe 4.0) but refuse to do the same for the other (DLSS)



I got what you were saying; I said that DLSS, unlike PCIe, isn't available across all titles. Considering game choice for benchmarking is also subjective, should HWU remove DLSS titles from their benchmarking suite?

Besides, at the end of every benchmark of theirs I've seen of late, HWU have included DLSS numbers separately in the same review, so people can work out for themselves whether they would prefer the performance at an upscaled resolution. DLSS is very good, but by its nature it isn't lossless.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> If so, then HUB should have benched the 5700 XT with PCIe 3.0 just to be fair, no? Just go into the BIOS and choose PCIe 3.0.
> 
> Just saying, it's not really a fair comparison when you use a feature that is available on one (PCIe 4.0) but refuse to do the same for the other (DLSS). It's not like RTX users are leaving DLSS 2.0 off in any game; 5700 XT owners, however, might have to use PCIe 3.0 to fix the black screen bug.


Again, find me one single benchmark or test (W1zzard has done the tests) where 4.0 makes any difference in performance. You can't; it's a non-issue with GPUs. 4.0 is only beneficial to NVMe drives. DLSS is a performance trick, and it's technically not actually running at the resolution it's being benched at. You are comparing apples and pumpkins here. One does nothing to affect performance to anyone's advantage; the other is absolutely a method to gain performance. If it can't win without “tricks” on, then it just can't win on outright performance, which is what benchmarks measure.

Edit: feel free to check this for any kind of advantage PCIe 4.0 offers...








NVIDIA GeForce RTX 3080 PCI-Express Scaling (www.techpowerup.com):
NVIDIA Ampere finally brings PCI-Express 4.0 support to the high-end graphics market. The new interface promises twice the bandwidth of PCI-Express 3.0. We've set up an AMD Ryzen 3900XT system to test how various PCIe generations and lane widths affect gaming performance.


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> Again, find me one single benchmark or test (W1zzard has done the tests) where 4.0 makes any difference in performance. You can't; it's a non-issue with GPUs. 4.0 is only beneficial to NVMe drives. DLSS is a performance trick, and it's technically not actually running at the resolution it's being benched at. You are comparing apples and pumpkins here. One does nothing to affect performance to anyone's advantage; the other is absolutely a method to gain performance. If it can't win without “tricks” on, then it just can't win on outright performance, which is what benchmarks measure.



So you go to an F1 race and demand that every team must spend the same time on every optimization? Like changing car tires must take the same amount of time? Everyone must use the same fuel additives?
It's the end result that matters; as long as image quality is equal, people couldn't care less about the optimization.
Work smarter, not harder


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> So you go to an F1 race and demand that every team must spend the same time on every optimization? Like changing car tires must take the same amount of time? Everyone must use the same fuel additives?
> It's the end result that matters; as long as image quality is equal, people couldn't care less about the optimization.


Uh, again you're totally off track with F1, yes pun intended. F1 has a strict set of rules that all cars must abide by, from ride heights to fuel tank size, and yes, they all run the same fuel. You're asking for what is basically a cheat to be included as a valid benchmark. That would be like one driver getting an extra turbo. Benchmarking is about comparing cards under equal settings and equal conditions; DLSS is the furthest thing from equal conditions. By your logic every card's 4K results are accurate while DLSS is actually 720p upscaled to 4K. How you can even consider that an equal bench result is truly laughable...
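(For reference on the internal resolutions being argued about: the per-axis scale factors below are the commonly cited DLSS 2.0 mode ratios, assumed here purely for illustration. At a 4K output, a 720p internal render corresponds to the most aggressive Ultra Performance mode, while Quality mode renders at 1440p.)

```python
# Sketch: internal render resolutions DLSS 2.0 would use at a 3840x2160 output,
# given the commonly cited per-axis scale factors (assumed values for illustration).
SCALE = {
    "Quality": 2 / 3,            # ~2560x1440 internal
    "Balanced": 0.58,            # ~2227x1253 internal
    "Performance": 1 / 2,        # 1920x1080 internal
    "Ultra Performance": 1 / 3,  # 1280x720 internal
}

def internal_res(out_w, out_h, scale):
    """Round the scaled output dimensions to whole pixels."""
    return round(out_w * scale), round(out_h * scale)

for mode, s in SCALE.items():
    w, h = internal_res(3840, 2160, s)
    print(f"{mode:>17}: renders {w}x{h}, upscales to 3840x2160")
```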


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> Uh, again you're totally off track with F1, yes pun intended. F1 has a strict set of rules that all cars must abide by, from ride heights to fuel tank size, and yes, they all run the same fuel. You're asking for what is basically a cheat to be included as a valid benchmark. That would be like one driver getting an extra turbo. Benchmarking is about comparing cards under equal settings and equal conditions; DLSS is the furthest thing from equal conditions. By your logic every card's 4K results are accurate while DLSS is actually 720p upscaled to 4K. How you can even consider that an equal bench result is truly laughable...



Well, if 720p upscaled to 4K has image quality equal to native 4K, I say go for it.
Every hardware editorial seems to agree that if DLSS is available, use it. They didn't say anything about trading visual quality for performance (at least with DLSS in Quality mode).

The only thing with DLSS is that it's unfair to AMD, who has no answer yet. But it's a competition out there, not some fun race between friends.

Let's think of it this way: while AMD continues to optimize rasterization performance for their GPUs, Nvidia's effort goes into incorporating DLSS into new games. But somehow benching DLSS is too unfair to AMD? It makes no fricking sense.

Also, equal settings and equal conditions were never actually equal; some games just have optimizations for a specific brand, just like Far Cry 5 features rapid packed math.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> Well, if 720p upscaled to 4K has image quality equal to native 4K, I say go for it.
> Every hardware editorial seems to agree that if DLSS is available, use it. They didn't say anything about trading visual quality for performance (at least with DLSS in Quality mode).
> 
> The only thing with DLSS is that it's unfair to AMD, who has no answer yet. But it's a competition out there, not some fun race between friends.
> ...


Okay, again, for the last time... that's not how benchmarks work! You compare every card under equal conditions or else it's not a valid benchmark, PERIOD. If AMD had a similar feature it ALSO could not be used... No one's denying some games are optimized better for one brand, but the cards are run at DEFAULT settings and the results are based on their raw performance at those settings, as equal as possible. I'm glad you really like DLSS, but it's a “feature”, and nobody gets to run features in benchmarks; all cards are run stock/default, always. Why do you think W1zzard has a single test bench where the only difference is the card he's testing and the drivers installed for it? So every card is tested under EQUAL conditions.


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> Okay, again, for the last time... that's not how benchmarks work! You compare every card under equal conditions or else it's not a valid benchmark, PERIOD. If AMD had a similar feature it ALSO could not be used... No one's denying some games are optimized better for one brand, but the cards are run at DEFAULT settings and the results are based on their raw performance at those settings, as equal as possible. I'm glad you really like DLSS, but it's a “feature”, and nobody gets to run features in benchmarks; all cards are run stock/default, always. Why do you think W1zzard has a single test bench where the only difference is the card he's testing and the drivers installed for it? So every card is tested under EQUAL conditions.



I have been saying this all along: the 5700 XT is being benched in a PCIe 4.0 config in HUB's testing, while the 2070S is on PCIe 3.0, get it?
Are those equal testing conditions?


----------



## rtwjunkie (Oct 19, 2020)

nguyen said:


> I have been saying this all along: the 5700 XT is being benched in a PCIe 4.0 config in HUB's testing, while the 2070S is on PCIe 3.0, get it?
> Are those equal testing conditions?


It will be tested against every other card under the same standard. The test is card to card, not card 1 under this standard and card 2 under that standard plus some hardware the other can't have.

Where you WILL see 4.0 tested with bells and whistles is in the AAA game performance reviews that W1zz does occasionally. In those he DOES test what a game can do with the special abilities or features that different cards have, because it is not a card-to-card comparison.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> I have been saying this all along: the 5700 XT is being benched in a PCIe 4.0 config in HUB's testing, while the 2070S is on PCIe 3.0, get it?
> Are those equal testing conditions?


And we're back to apples and pumpkins. It does NOT affect the results in any way. Did you read the 3080 PCIe scaling tests? I saw 1 FPS difference across all resolutions; that is literally margin of error. W1zzard's current test bench is 4.0, but again it makes absolutely zero difference. You're trying to invent a metric that doesn't exist to justify your favourite new pet feature somehow making for valid benchmarks. You know where PCIe 4 would be an unfair advantage? Testing NVMe drives: something that can actually utilize the extra bandwidth. Graphics cards still don't fully utilize 3.0. It was also pointed out that in the HUB reviews, if DLSS was available the results were listed, but would NEVER be included in the benchmark results.


----------



## Camm (Oct 19, 2020)

nguyen said:


> I have been saying this all along, 5700XT is being benched with PCIe 4.0 config in HUB testing, while 2070S is PCIe 3.0, get it ?
> Are those equal testing conditions ?



I'm not sure why you have such a hard-on for this; testing shows negligible to no difference, and all Nvidia GPUs are now at PCIe 4.0.

Furthermore, DLSS and PCIe are not comparable. One is a lossy upscaling technology only available in a few games; the other is a ubiquitous connection standard.

But if you really want to split hairs, everyone testing on an Intel system tested at PCIe 3.
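For context on what "twice the bandwidth" actually means in absolute terms, here is a back-of-envelope sketch; the per-lane transfer rates are from the PCIe spec, and the 128b/130b line encoding applies to both gen 3 and gen 4:

```python
# Back-of-envelope effective one-way bandwidth of a PCIe x16 link.
# Per-lane rates: gen 3 = 8 GT/s, gen 4 = 16 GT/s; both use 128b/130b encoding.
def x16_bandwidth_gb_s(gt_per_s):
    lanes = 16
    encoding = 128 / 130  # usable payload bits per transferred bit
    return lanes * gt_per_s * encoding / 8  # divide by 8 bits per byte

gen3 = x16_bandwidth_gb_s(8)   # ~15.75 GB/s
gen4 = x16_bandwidth_gb_s(16)  # ~31.51 GB/s, i.e. exactly double gen 3
print(f"PCIe 3.0 x16: {gen3:.2f} GB/s, PCIe 4.0 x16: {gen4:.2f} GB/s")
```

The benchmarks cited in this thread show current GPUs barely saturating gen 3 x16, which is why the measured gen 3 vs gen 4 gaming difference sits within margin of error.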


----------



## dicktracy (Oct 19, 2020)

2.4 GHz but slower than the 3080, especially when RT is turned on. Insta-passss!


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> And we're back to apples and pumpkins. It does NOT affect the results in any way. Did you read the 3080 PCIe scaling tests? I saw 1 FPS difference across all resolutions; that is literally margin of error. W1zzard's current test bench is 4.0, but again it makes absolutely zero difference. You're trying to invent a metric that doesn't exist to justify your favourite new pet feature somehow making for valid benchmarks. You know where PCIe 4 would be an unfair advantage? Testing NVMe drives: something that can actually utilize the extra bandwidth. Graphics cards still don't fully utilize 3.0. It was also pointed out that in the HUB reviews, if DLSS was available the results were listed, but would NEVER be included in the benchmark results.



Holy jeez, since when was I talking about TPU benchmarks?
I was responding to a post about HUB testing, not TPU.
In the HUB testing, Steve also said that PCIe 4.0 contributed a few % net gain for the 5700 XT, which the 2070S and 2060S were left out of.

This is the vid I was responding to, not some PCIe scaling benchmark at TPU.

Take a hint, will you?


----------



## Camm (Oct 19, 2020)

dicktracy said:


> 2.4Ghz but slower than the 3080 especially when RT is turned on. Insta-Passss!



Sauce? Since you have links to prerelease benchmarks across the product stack.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> Holy jeez, since when was I talking about TPU benchmarks?


I'm just informing you that TPU is now benchmarking on PCIe 4.0 as well, so you'd better clutch your pearls at all the benches done here too. You literally can't accept that PCIe 4.0 offers no advantage. All the Ampere cards are 4.0; I'll be sure to cry foul when they have an unfair advantage over your 2080 Ti...


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> I'm just informing you that TPU is now benchmarking on PCIe 4.0 as well, so you'd better clutch your pearls at all the benches done here too. You literally can't accept that PCIe 4.0 offers no advantage. All the Ampere cards are 4.0; I'll be sure to cry foul when they have an unfair advantage over your 2080 Ti...



So you are comparing apples to pumpkins and accusing me of doing the same. The TPU benchmark suite is different from HUB's: different games, different testing configs.
If TPU's testing shows no difference, must that be true for every other testing condition?
Pretty small-minded, aren't you, and you call yourself an enthusiast?

Well, since you have an X570 and a 5700 XT, you might as well test them yourself against HUB.

Yeah, sometime next month once I get a 5950X + 3090 I might do some PCIe scaling benchmarks for you.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> So you are comparing apples to pumpkins and accusing me of doing the same. The TPU benchmark suite is different from HUB's: different games, different testing configs.
> If TPU's testing shows no difference, must that be true for every other testing condition?
> Pretty small-minded, aren't you, and you call yourself an enthusiast?
> 
> ...


The problem is YOU cannot let go of this literally insignificant deviation, like 4.0 is somehow cheating the 3.0 cards. Still not seeing any advantage here? W1zzard has always done PCIe scaling tests and he's tested this as well; I trust his methods, and he literally used the most recent flagship 4.0 card. I just picked one bench at 4K, but you can pick any title or resolution he tested and they all look like this... I don't use HUB for hardware reviews or benchmarks, so no, I don't know their methods, but I do know W1zzard's: there is no more than 2 FPS difference in any resolution and game he tested with a 4.0 3080, and if any card could possibly use the extra bandwidth it would be that one. You're chasing a metric that has zero impact on benchmark scoring.


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> The problem is YOU cannot let go of this literally insignificant deviation, like 4.0 is somehow cheating the 3.0 cards. Still not seeing any advantage here? W1zzard has always done PCIe scaling tests and he's tested this as well; I trust his methods, and he literally used the most recent flagship 4.0 card. I just picked one bench at 4K, but you can pick any title or resolution he tested and they all look like this... I don't use HUB for hardware reviews or benchmarks, so no, I don't know their methods, but I do know W1zzard's: there is no more than 2 FPS difference in any resolution and game he tested with a 4.0 3080, and if any card could possibly use the extra bandwidth it would be that one. You're chasing a metric that has zero impact on benchmark scoring.
> View attachment 172318



I literally talked about how flawed HUB's testing was, yet here you are defending TPU's?
When did I even talk about TPU?

OK, sure, mister "zero impact", "equal testing conditions", whatever you said.


----------



## Camm (Oct 19, 2020)

Let's just pretend for a sec that PCIe 4.0 made more than a statistical anomaly of a difference.

Why should a PCIe 4.0 card be tested at 3.0 when it's a review of its performance? It's like testing USB 3 drives against a USB 2 drive in a USB 2 slot; it doesn't make any sense.

And before you utter DLSS: PCIe 4.0 is (a) a standard available regardless of game, and (b) not a lossy upscaling solution rendering below the benchmarked resolution.


----------



## nguyen (Oct 19, 2020)

Camm said:


> Let's just pretend for a sec that PCIe 4.0 made more than a statistical anomaly of a difference.
> 
> Why should a PCIe 4.0 card be tested at 3.0 when it's a review of its performance? It's like testing USB 3 drives against a USB 2 drive in a USB 2 slot; it doesn't make any sense.
> 
> And before you utter DLSS: PCIe 4.0 is (a) a standard available regardless of game, and (b) not a lossy upscaling solution rendering below the benchmarked resolution.



C: should you disable DLSS where it is available to you?

It's like DX12 vs DX11: you use whichever API gives you better FPS and frametimes. Forcing DX12 benchmarks onto NVIDIA cards where it is detrimental is quite unfair, no? Same as disabling DLSS.

I believe the point of benchmarking is that it should resemble real-world usage?


----------



## Camm (Oct 19, 2020)

nguyen said:


> C: should you disable DLSS where it is available to you?
> 
> It's like DX12 vs DX11: you use whichever API gives you better FPS and frametimes. Forcing DX12 benchmarks onto NVIDIA cards where it is detrimental is quite unfair, no?



*How ATI's drivers 'optimize' Quake III* (techreport.com): "MOST OF YOU ARE probably familiar by now with the controversy surrounding the current drivers for ATI's Radeon 8500 card. It's become quite clear, thanks to this article at the..."

A lossy image isn't the same as a lossless image, and suggesting otherwise reminds me of the Quake 3 driver shenanigans from ATi and NVIDIA back in the day. Or should we benchmark AMD cards running at the same internal resolution as DLSS, apply sharpening with a native render target, and call it comparable?


----------



## nguyen (Oct 19, 2020)

Camm said:


> How ATI's drivers 'optimize' Quake III
> 
> 
> MOST OF YOU ARE probably familiar by now with the controversy surrounding the current drivers for ATI’s Radeon 8500 card. It’s become quite clear, thanks to this article at the...
> ...



DLSS vs FidelityFX


----------



## phanbuey (Oct 19, 2020)

Camm said:


> Lets just pretend for a sec that PCIE 4 made more than a statistical anomaly of a difference for a sec.
> 
> Why should a PCIE 4.0 card be tested at 3.0 when its a review of its performance? It's like testing USB 3 drives against a USB 2 drive on a USB 2 slot, it doesn't make any sense.
> 
> And before you utter DLSS, PCIE 4.0 is A: A standard available regardless of game, B: Isn't a lossy upscaling solution that isn't rendering at the benchmarked resolution.



Except drives are limited by the bandwidth offered by USB 2.0, whereas the 5700 XT doesn't come anywhere near saturating the bandwidth of a PCIe 2.0 slot, never mind 3.0 or 4.0. It's more like testing a USB 1 drive in a 2 or 3 slot.

Also, PCIe devices are specced to be backwards compatible, or are supposed to be. So if the 5700 XT can't properly operate at 3.0 even though it has more than enough bandwidth, then there is something wrong with the design/implementation of that standard on the card.
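To put rough numbers on that analogy, here is a quick sketch using ballpark per-direction figures from the interface specs (approximate theoretical rates, not measurements):

```python
# Ballpark per-direction bandwidth from the specs (approximate, not measured).
# PCIe rates are per lane, after 128b/130b encoding overhead.
pcie_per_lane_gbs = {"3.0": 0.985, "4.0": 1.969}  # GB/s per lane
usb2_mbs = 480 / 8  # USB 2.0 high-speed: 480 Mb/s -> 60 MB/s theoretical

for gen, rate in pcie_per_lane_gbs.items():
    print(f"PCIe {gen} x16: {rate * 16:.1f} GB/s")
print(f"USB 2.0: {usb2_mbs:.0f} MB/s")
```

Even a GPU pushing several GB/s over the bus sits comfortably inside PCIe 3.0 x16's roughly 15.8 GB/s, while a USB 2.0 drive hits its ceiling almost immediately.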


----------



## Camm (Oct 19, 2020)

nguyen said:


> DLSS vs Fidelity FX



So now one lossy solution isn't allowed to be benchmarked, but another one is? At least be consistent in your bullshit.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> I literally talked about how flawed HUB's testing was, yet here you are defending TPU's?
> When did I even talk about TPU?
> 
> OK, sure, mister "zero impact", "equal testing conditions", whatever you said.


How many more charts showing zero difference do you need to see before you drop the idea that PCIe 4.0 is an advantage, which you keep typing up just to justify DLSS?


----------



## nguyen (Oct 19, 2020)

Camm said:


> So now one lossy solution isn't allowed to be benchmarked, but another one is? At least be consistent in your bullshit.



Sure, at equal image quality: let's benchmark DLSS Performance (or Ultra Performance) vs FidelityFX.


----------



## Camm (Oct 19, 2020)

nguyen said:


> Sure, at equal image quality: let's benchmark DLSS Performance (or Ultra Performance) vs FidelityFX.



Bit hard to say it's equal image quality, since DLSS's processing of the image uses graphical techniques that aren't exposed by the game engine, making comparison difficult. Secondly, not all frames have equal graphical fidelity: when new data is introduced to the scene, the AI algorithm has to 'race' to gather info from multiple frames to reconstruct the image.

Is DLSS good? Yup. Will we probably play all games like this eventually? Yup (if NVIDIA bothers to make a solution that works across all games, which is another reason why it shouldn't be used in benchmarking scenarios). But it has fatal flaws that preclude it from being compared apples to apples, and benchmarking data with DLSS should be shown, but called out separately so the reader can make a value call on the limited support and performance subset...

Which both HWU and TPU do. So again, what's your problem?


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> Sure, at equal image quality: let's benchmark DLSS Performance (or Ultra Performance) vs FidelityFX.


Again, neither of those things would be used in any benchmark except comparing them to each other, and you literally found the only example where it's even possible, because it's the only game that has both. Still gives zero weight to your argument for DLSS. But hey, I can actually argue for FidelityFX being the better of the two in that game. I own it, I finished it, and I used FidelityFX, but that comparison shows DLSS with some obvious visual issues FidelityFX didn't have, and overall it looked better. I know my performance in the game was fantastic and it looked totally amazing. Still, can't use it in benchmarks.


----------



## nguyen (Oct 19, 2020)

Camm said:


> Bit hard to say it's equal image quality, since DLSS's processing of the image uses graphical techniques that aren't exposed by the game engine, making comparison difficult. Secondly, not all frames have equal graphical fidelity: when new data is introduced to the scene, the AI algorithm has to 'race' to gather info from multiple frames to reconstruct the image.
> 
> Is DLSS good? Yup. Will we probably play all games like this eventually? Yup (if NVIDIA bothers to make a solution that works across all games, which is another reason why it shouldn't be used in benchmarking scenarios). But it has fatal flaws that preclude it from being compared apples to apples, and benchmarking data with DLSS should be shown, but called out separately so the reader can make a value call on the limited support and performance subset...
> 
> Which both HWU and TPU do. So again, what's your problem?



DLSS is like a game-specific customization; that doesn't mean it should be excluded though.
I was responding to a recent HUB test that did not include DLSS results, so there is that.
TPU did some DLSS 2.0 testing; where is that?



INSTG8R said:


> Again, neither of those things would be used in any benchmark except comparing them to each other, and you literally found the only example where it's even possible, because it's the only game that has both. Still gives zero weight to your argument for DLSS. But hey, I can actually argue for FidelityFX being the better of the two in that game. I own it, I finished it, and I used FidelityFX, but that comparison shows DLSS with some obvious visual issues FidelityFX didn't have, and overall it looked better. I know my performance in the game was fantastic and it looked totally amazing. Still, can't use it in benchmarks.



Kinda funny that your conclusion is entirely contrary to what the author was saying. But hey, I was saying HUB was pretty unfair too; reviewer vs normal user, yeah?
You can apply FidelityFX to any game: just create a custom resolution and use the GPU scaler to upscale it to fit the screen, a 5-second custom job.


----------



## Searing (Oct 19, 2020)

nguyen said:


> DLSS is like a specific customization for a game, doesn't mean it should be excluded though.
> I was responding to a recent HUB testing that did not include DLSS result, so there is that.
> TPU did some DLSS 2.0 testing, where is that ?



DLSS is terrible and doesn't work in 99 percent of the games I play. Next.



evernessince said:


> No, because polls from Hardware Unboxed 3 months back suggest that only 7% of card owners were having severe issues like what you were describing (and let's be honest here, a number of those users are likely NVIDIA owners checking the box that makes AMD look worse). You are suggesting that 100% of 5700 / 5700 XTs had issues, and that simply isn't true. I shouldn't have to say this, but if 100% had issues, drivers or otherwise, AMD would have been decimated. Why do you assume that 5/5 cards having issues is "likely"? No, in no universe is it. None of the data supports that, and we haven't ever seen anywhere near 100%.
> 
> My prior statement stands: either you are extraordinarily unlucky or there is something else going on.
> 
> ...



I didn't contribute to any partisanship. I bought 5 RX 5700 XT cards over 13 months for 5 different clients, and all of them had problems. 5/5. I'm pointing out it was not rare at all. There was mass misery online, with those people knowledgeable enough about the problem trying to bring it to AMD's attention. Many people using those cards didn't even notice if they didn't play the same games or didn't watch Netflix on Windows, for example.

Lived experiences are better than random people talking about something they haven't experienced.

AGAIN: RMA of a broken card, vs. unfixable crashing from driver issues... those two things are totally different.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> DLSS is like a game-specific customization; that doesn't mean it should be excluded though.
> I was responding to a recent HUB test that did not include DLSS results, so there is that.
> TPU did some DLSS 2.0 testing; where is that?
> 
> ...


Funny you try to cast off FidelityFX as so simple. It's a little bit more than that, but guess what: it looks good enough to compare to your precious DLSS, and it looks better doing it, by your own crude description of what it's doing. Meanwhile your tech is rendering at a lower res then upscaling, with that fancy AI working hard on the fly to hide what's actually going on so it doesn't look like crap. Apparently I just need to make a custom res and mine looks just as good, performs as well or better, and apparently I can apply it to any game I want... Sounds to me like FidelityFX should be added to more games, being pretty easy to do. See, the weakness of DLSS is that it's constantly having to keep the illusion up on the fly, and you can see where it struggles to keep up with fast changes; FidelityFX shows no such visual "artifacts".


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> Funny you try to cast off FidelityFX as so simple. It's a little bit more than that, but guess what: it looks good enough to compare to your precious DLSS, and it looks better doing it, by your own crude description of what it's doing. Meanwhile your tech is rendering at a lower res then upscaling, with that fancy AI working hard on the fly to hide what's actually going on so it doesn't look like crap. Apparently I just need to make a custom res and mine looks just as good, performs as well or better, and apparently I can apply it to any game I want... Sounds to me like FidelityFX should be added to more games, being pretty easy to do. See, the weakness of DLSS is that it's constantly having to keep the illusion up on the fly, and you can see where it struggles to keep up with fast changes; FidelityFX shows no such visual "artifacts".



I could already tell which one was FidelityFX 5 seconds into your vid without looking at the title bar; it looks like shit. See the big rock?
And this is with YT compression; IRL the difference is probably massive.

I would rather just lower the detail settings than use FidelityFX, tbh; it looks too blurry.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> I could already tell which one was FidelityFX 5 seconds into your vid without looking at the title bar; it looks like shit. See the big rock?
> And this is with YT compression; IRL the difference is probably massive.


Well, you say it's "simple" but good enough to compare to your fancy system, which can look equally shitty when it can't keep up with fast-changing scenes and textures. I'm not gonna dig through the videos to find the one with literal shimmering lines in the air and defenders saying it's just part of the game. If FidelityFX can look more than half as good as DLSS with just "custom res and GPU scaling", that kinda makes your precious DLSS a lot of high-tech trickery when AMD can do the same thing with just "simple settings". Oh, and you also said I could do it with any game. Score another point for AMD's simple solution.
Leave it to NVIDIA to bring a backhoe to a job when all AMD needs to bring is a shovel. Might take a little longer, but it's still gonna dig the same hole.


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> Well, you say it's "simple" but good enough to compare to your fancy system, which can look equally shitty when it can't keep up with fast-changing scenes and textures. I'm not gonna dig through the videos to find the one with literal shimmering lines in the air and defenders saying it's just part of the game. If FidelityFX can look more than half as good as DLSS with just "custom res and GPU scaling", that kinda makes your precious DLSS a lot of high-tech trickery when AMD can do the same thing with just "simple settings". Oh, and you also said I could do it with any game. Score another point for AMD's simple solution.



Yeah, sure, a solution that can be discerned within 5 seconds in a blind test is "good enough".
I don't really agree with that.
That is like telling me to play at low settings, which I would rather do than use FidelityFX if I didn't have DLSS and needed more FPS.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> Yeah, sure, a solution that can be discerned within 5 seconds in a blind test is "good enough".
> I don't really agree with that.
> That is like telling me to play at low settings, which I would rather do than use FidelityFX if I didn't have DLSS and needed more FPS.


LOL, the entire basis of DLSS is "low settings" upscaled and cleverly trying to cover it up... you have way too much of a hard-on for basically playing games rendered at 720p made to look like 4K... At least I played Death Stranding somewhere close to my native res; I didn't in fact use any AA, and despite your sensitive eyes it looked fantastic maxed out and ran fantastic as well. I even got HDR.


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> LOL, the entire basis of DLSS is "low settings" upscaled and cleverly trying to cover it up... you have way too much of a hard-on for basically playing games rendered at 720p made to look like 4K... At least I played Death Stranding somewhere close to my native res; I didn't in fact use any AA, and despite your sensitive eyes it looked fantastic maxed out and ran fantastic as well. I even got HDR.



Idk why you are so against innovation.
DLSS is not an upscaling technique, it is an image reconstruction technique. Basically, NVIDIA trains the AI network on what objects look like at 16K; in game, the Tensor cores recognize those low-res objects and reconstruct them to imitate 16K images.
It's the "work smarter, not harder" motto.

If the end results are equal, why do you care so much how it is done?
Do you care if a race car only has a V6 engine but is just as fast as those with a V8 or V12?


----------



## INSTG8R (Oct 19, 2020)

It's lowering the resolution to a more playable level, then upscaling it to your desired resolution, and the AI is just working hard to keep it looking good. I think it's quite clever, but DLSS 1 made the technique pretty obvious to me; it's meant for that poor guy who wants to game on his crappy laptop at a decent frame rate. And frankly, I'm not sure why DLSS got added to Death Stranding; I was running it maxed out at 1440p getting 120-144 FPS most of the time, and it ran and looked amazing. My favourite game I've played this year. Maybe PCIe 4.0 was giving me an advantage 
But yes, I much prefer pure GPU grunt to get my FPS. While I did have FidelityFX on, I'm not sure how much difference it made for performance.


----------



## Greenfingerless (Oct 19, 2020)

It will be a winner if a new power supply isn't needed.


----------



## nguyen (Oct 19, 2020)

INSTG8R said:


> It's lowering the resolution to a more playable level, then upscaling it to your desired resolution, and the AI is just working hard to keep it looking good. I think it's quite clever, but DLSS 1 made the technique pretty obvious to me; it's meant for that poor guy who wants to game on his crappy laptop at a decent frame rate. And frankly, I'm not sure why DLSS got added to Death Stranding; I was running it maxed out at 1440p getting 120-144 FPS most of the time, and it ran and looked amazing. My favourite game I've played this year. Maybe PCIe 4.0 was giving me an advantage
> But yes, I much prefer pure GPU grunt to get my FPS. While I did have FidelityFX on, I'm not sure how much difference it made for performance.



Yes, sure, so my crappy laptop with a 2070 Super Max-Q can enjoy higher IQ and FPS than your desktop with a 5700 XT; sounds good to me.
Anyway, we can talk again after CP2077 releases.


----------



## INSTG8R (Oct 19, 2020)

Bought it almost a year and a half ago now. I have a brand new 1TB empty NVME just waiting for CP2077.


----------



## nguyen (Oct 19, 2020)

Same, bought it on GOG too, so more money for CDPR.


----------



## INSTG8R (Oct 19, 2020)

nguyen said:


> Same, bought it on GOG too, so more money for CDPR.


Yeah, I even got a 40% Displate discount, so I grabbed a cool Star Wars poster with it. Could have taken a better pic...


----------



## jigar2speed (Oct 19, 2020)

Flanker said:


> If RDNA2 can do what the HD4xxx series did at the time, I think that will be good news for consumers.



The HD 4xxx series was an uppercut NVIDIA never saw coming, and it forced a string of re-releases of the same silicon.


----------



## renz496 (Oct 19, 2020)

INSTG8R said:


> Well this is definitely their chance with Nvidia's supply issue and inflated prices. If big Navi is truly competitive and available they have a chance to really take some market share back.



They probably will gain some if they don't have supply issues themselves, but this problem is only temporary; one quarter at most, most likely. When things get back to normal, NVIDIA will probably start gaining their market share back.


----------



## INSTG8R (Oct 19, 2020)

renz496 said:


> They probably will gain some if they don't have supply issues themselves, but this problem is only temporary; one quarter at most, most likely. When things get back to normal, NVIDIA will probably start gaining their market share back.


Exactly, they have one shot at this; they'd better be ready with a great product and the supply to back it up while NVIDIA fumbles to make up literally thousands of orders.


----------



## Max(IT) (Oct 19, 2020)

Fabio said:


> why you say 5700xt was a fiasco?


A lot of driver-related issues (the situation is better now, but not completely solved) and far from stellar sales.
And the lack of RT made it "old" from the beginning...



INSTG8R said:


> Well it did have a rough launch driver wise so first month or so wasn’t great but fiasco is a bit of an exaggeration.


Launch drivers? Drivers were terrible until last May, and even today they are far from being rock solid.



Totally said:


> Fiasco? can you help my memory?
> 
> Did the cards crash to desktop frequently when gaming or pushed hard?
> Were the cards sold out day 1, hour 1, minute 1 because of extremely limited supply or non-existent supply in someplaces?
> Very power hungry?


I don't want to have an argument with a fanboy, which clearly you are, so I will cut it short.
I have NO brand loyalty at all.
I have a PC with a Ryzen 3900X and another PC with a 5700XT, and the card is terrible. Performance and price were good, but the drivers made me mad for a whole year. Even today, with the situation improved, I sometimes experience black screens and freezes.

I am going to give AMD another chance with the new GPU, if it is a good product, but they have to make it better than the crappy 5700XT...


----------



## INSTG8R (Oct 19, 2020)

Max(IT) said:


> A lot of driver-related issues (the situation is better now, but not completely solved) and far from stellar sales.
> And the lack of RT made it "old" from the beginning...
> 
> 
> Launch drivers? Drivers were terrible until last May, and even today they are far from being rock solid.


I would tend to disagree, considering what I do on the side. Any specific issues you'd like to raise? If I can confirm or repro them, I can get them looked at. But I literally run every driver and then some, and I can't think of any issue affecting stability. The only "big one" for me is that Enhanced Sync has not worked reliably for quite some time, and little has been done to fix it. When combined with FreeSync it was a perfect combo.


----------



## Max(IT) (Oct 19, 2020)

INSTG8R said:


> I would tend to disagree, considering what I do on the side. Any specific issues you'd like to raise? If I can confirm or repro them, I can get them looked at. But I literally run every driver and then some, and I can't think of any issue affecting stability. The only "big one" for me is that Enhanced Sync has not worked reliably for quite some time, and little has been done to fix it. When combined with FreeSync it was a perfect combo.


The web is literally FULL of people complaining about 5700XT freezes, and if you deny it I will automatically put you on the "fanboy list". In my long experience I have learned that AMD fanboys are the worst on the web, by far. They are in complete denial. No point in arguing with them.

To be crystal clear, I have been a big AMD supporter since the beginning (since the AMD K6 200), and I loved a lot of the ATI cards I bought in the past (the 9700 Pro and 9800 Pro being my all-time favorites), but that doesn't make me tied to a brand no matter what. My main PC is Ryzen-based and I'm building another Ryzen-based one for my little son.
I bought a 5700XT because it had a good price (it was ~160€ less than the 2070 Super at the time) and the promise of good performance. But it gave me a lot of issues; I even returned one (hoping it was defective) and restored the PC several times trying to solve the issues, with no success.
Now I'm waiting for this new generation to get rid of this crappy card and buy something else.

If your experience was better, I'm happy for you. But for sure I'm not alone on the web regarding 5700XT issues...


----------



## DonKnotts (Oct 19, 2020)

ZoneDymo said:


> Like if Nvidia and AMD both had a card that performed identical and had the same power consumption etc, then personally atm I would probably go for Nvidia purely for that well done Nvenc which AMD has no answer for as of yet.


Same thing with CUDA. A lot of the things that I use my GPU for besides gaming, will definitely benefit more from an Nvidia card than an AMD card, so AMD needs to add value by lowering their price.


----------



## medi01 (Oct 19, 2020)

INSTG8R said:


> Well, that's usually what AMD does historically. But if they really have a 3080 contender, who knows how pricing will go. It's a pretty conventional card, no exotic HBM etc.


There is absolutely no reason for AMD to price a chip with 3080 performance below $699, as the 3080 is nothing but a placeholder with no availability for months to come (and, I suspect, until a new arch comes from NV).



Max(IT) said:


> the web is literally FULL of people complaining


Chuckle.


----------



## Vya Domus (Oct 19, 2020)

nguyen said:


> I could already tell which one is FidelityFX 5 seconds into your vid without looking at the title bar, it looks like shit , see the big rock ?



Suuuuuuuure.



nguyen said:


> DLSS is not an upscaling technique, it is an image resconstruction technique.



You are wrong; it's an upscaling algorithm. Reconstruction happens when the image is missing information, or portions of it are marked as unusable (for instance due to noise), and you fill in those gaps; that's not what DLSS does. DLSS takes a fully rendered image and upscales it. Checkerboarding, for instance, is a reconstruction algorithm, because the initial image is missing half of the vertical columns of pixels.
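To make the distinction concrete, here is a toy Python sketch (purely illustrative; real upscalers and checkerboard renderers are far more sophisticated, and the function names are made up for this example). Upscaling enlarges an image in which every pixel already exists, while reconstruction fills in pixels that were never rendered:

```python
def upscale_nearest(img, factor):
    """Upscaling: every input pixel exists; we only enlarge the grid."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in img for _ in range(factor)]

def checkerboard_fill(img):
    """Reconstruction: some pixels are missing (None) and must be
    estimated, here crudely, by averaging horizontal neighbours."""
    out = [row[:] for row in img]
    for row in out:
        for x, px in enumerate(row):
            if px is None:
                neighbours = [row[i] for i in (x - 1, x + 1)
                              if 0 <= i < len(row) and row[i] is not None]
                row[x] = sum(neighbours) / len(neighbours)
    return out

low = [[10, 20], [30, 40]]
print(upscale_nearest(low, 2))    # a 4x4 image; no pixel was ever missing

sparse = [[10, None, 30], [None, 20, None]]
print(checkerboard_fill(sparse))  # missing pixels estimated from neighbours
```

The difference the post describes is exactly where the missing information sits: the upscaler starts from a complete (small) image, while the reconstructor starts from a full-size grid with holes in it.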


----------



## medi01 (Oct 19, 2020)

renz496 said:


> AMD should understand that a decade of price war with NVIDIA will only reduce their profit margins further rather than stealing more NVIDIA market share.


Yeah, AMD, which is used to 40% (and below) margins, should be afraid of NV, which is used to 60%+.
How could AMD possibly take on Huang, lol.



nguyen said:


> Also HUB refuses to include DLSS 2.0 results


How dare they not run the game at a lower resolution and claim the upscaled one.

I've seen Digital Totally-Not-Shills-Getting-Exclusive-Favourable-Previews-For-Some-Strange-Reason Foundry telling me that if I look at my monitor from another room, I would barely notice it is upscaled!


----------



## nguyen (Oct 19, 2020)

Vya Domus said:


> Suuuuuuuure.



Need more circles?
Some people just have better visual acuity, you know.


----------



## Vya Domus (Oct 19, 2020)

nguyen said:


> need more circles ?



Yes, and a static image please. Are you seriously screenshotting frames from a compressed YouTube video that aren't even in sync?

You're something else man.


----------



## medi01 (Oct 19, 2020)

nguyen said:


> DLSS is *not an upscaling* technique, it is an *image reconstruction* technique. Basically Nvidia trains the AI network on *what the objects look like in 16K*; in game, the *Tensor cores will recognize* those low-res objects and restructure them to *imitate 16K images*.



Oh, you sweet summer child, is there anything about this tech that you got right...


----------



## Zach_01 (Oct 19, 2020)

medi01 said:


> Yeah, AMD that is used to 40% (and below) margins should be afraid of NV who is used to 60%+.
> How could AMD possibly take on Huang, lol.


The 3080 10GB does not have a 60% margin... it's far less, and that is why NVIDIA used it as a placeholder and marketing piece and never intended to sell it widely.
Wait, and you will see the real cards shortly...


----------



## nguyen (Oct 19, 2020)

Vya Domus said:


> Yes, and a static image please. Are you seriously screenshotting frames from a compressed YouTube video that aren't even in sync?
> 
> You're something else man.



Yes, and that image is not even zoomed in; even with YT's terrible compression, the difference is night and day.
Oh well, you can live with the illusion that FidelityFX is equal to DLSS; kinda hard to argue with AMD fans.
AMD drivers were flawless, right? Right?


----------



## lesovers (Oct 19, 2020)

Max(IT) said:


> The web is literally FULL of people complaining about 5700XT freezes, and if you deny it I will automatically put you on the "fanboy list". In my long experience I have learned that AMD fanboys are the worst on the web, by far. They are in complete denial. No point in arguing with them.
> 
> To be crystal clear, I have been a big AMD supporter since the beginning (since the AMD K6 200), and I loved a lot of the ATI cards I bought in the past (the 9700 Pro and 9800 Pro being my all-time favorites), but that doesn't make me tied to a brand no matter what. My main PC is Ryzen-based and I'm building another Ryzen-based one for my little son.
> I bought a 5700XT because it had a good price (it was ~160€ less than the 2070 Super at the time) and the promise of good performance. But it gave me a lot of issues; I even returned one (hoping it was defective) and restored the PC several times trying to solve the issues, with no success.
> ...



Yes, I started by purchasing the AMD K6 3DNow! 300 MHz CPU in 1998 and have had ATI/AMD graphics cards since the HD 5850 in 2009. I'm just back to AMD CPUs after a long time with Intel and now have the 3900X. I purchased an MSI 5700 XT Gaming X graphics card in 2019 after a short time with an RX 580, which had replaced my R9 280X.


I must say the image quality and deep, saturated colours on AMD graphics cards are just great; this is why I really like these cards. The only issue for me is that, after no driver issues at all since 2009, there definitely is a driver issue with the 5700 XT. The black screen reboot (multi-monitor?) issue can be seen when running the Heaven Benchmark (using DirectX 11), where the worst driver is the latest (20.9.2), at about 2 to 15 minutes before a black screen reboot, and the best driver is from February 2020 (20.2.2), at about 1 to 2 hours before a black screen reboot. Using OpenGL, the 5700 XT is absolutely completely stable with the Heaven Benchmark (tested for over a week).


The only game I have had an issue with is Rust, with drivers 20.5.1 through 20.9.2, where after about 8 to 12 hours of play, on 1 in 4 days, you may have a single black screen reboot. With 20.2.2 there are no issues at all after playing and testing for months.



So there is a driver issue with games; however, it may take days of playing the game with the worst driver to show any problems at all.


----------



## Vya Domus (Oct 19, 2020)

nguyen said:


> Yes and that image is not even zoomed in, even with YT terrible compression, the difference is night and day.
> Oh well you can live with the illusion that FidelityFX is equal to DLSS, kinda hard to argue with AMD fans.



No man, I just lack "visual acuity"; let's leave it at that.

This is only for the _200-Giga-IQ-image-comparisons-of-compressed-screenshots-from-compressed-videos-not-in-sync_ ™ aficionados.

Anyway, I find it comforting that at least I am not the only one visually impaired: https://arstechnica.com/gaming/2020...of-death-stranding-is-the-definitive-version/



> But FidelityFX CAS preserves a slight bit more detail in the game's particle and rain systems, which ranges from a shoulder-shrug of, "yeah, AMD is a _little_ better" most of the time to a head-nod of, "okay, AMD wins this round" in rare moments. AMD's lead is most evident during cut scenes, when dramatic zooms on pained characters like Sam "Porter" Bridges are combined with dripping, watery effects. Mysterious, invisible hands leave prints on the sand with small puddles of black water in their wake, while mysterious entities appear with zany swarms of particles all over their frames.


----------



## kapone32 (Oct 19, 2020)

cueman said:


> interesting.looks it cant beat rtx 3080 so amd drop good its tdp down.
> thouse clocks heard high,very high.
> 
> under 260w and rtx 3080 speed? noway.not 4K speed


You cannot say that a Navi card running at 2.4 GHz would not be fast; the 5700 is already plenty fast.


----------



## Max(IT) (Oct 19, 2020)

lesovers said:


> Yes started by purchasing the AMD K6-3DNow 300Mhz CPU in 1998 and have had ATI/AMD graphic cards since the HD5850 in 2009. Just back to AMD CPUs after a long time with Intel and now have the 3900X CPU. Purchased a MSI 5700XT Gaming X graphic card in 2019 after a short time with a RX580 which replaced the R9 280X graphic card.
> 
> 
> I must say the image quality and deep saturated colours on the AMD graphic cards are just great this is why I real like these cards. The only issue for me is after no driver issues at all since 2009 there is definitely is a driver issue with the 5700XT. The black screen reboot (multi-monitor?) issue can be seen when running the Heaven Benchmark (using DirectX11) where the worst is the latest driver (20.9.2) at about a 2 to 15 mins before a black screen reboot and the best driver is from February 2020 (20.2.2) about 1 to 2 hours before a black screen reboot. Using OpenGL the 5700XT is absolutely completely stable with the Heaven Benchmark (tested for over a week).
> ...



Yes, I have nothing to complain about image quality (AMD/ATi usually are better on that).
I found the most stable drivers for me to be 20.4.2 (IIRC).
The black screen/freeze bug is very subtle: I can play the same game for weeks (literally) without any issue and then... bang! Three crashes in a row on the same day!
It is driving me crazy...

The situation was terrible from December to March, when the freezes were quite frequent, but now it is much better (it happens mostly in Warzone, and not very often).
I'm afraid to upgrade to the latest 20.9.2: I read about big performance improvements but also about some crashes.


----------



## TumbleGeorge (Oct 19, 2020)

I read many comments about hype. This is a very normal condition in a consumer society; hype should not be considered a violation of the rules, especially in topics discussing more-or-less rumors. Maybe I'm wrong?


----------



## Max(IT) (Oct 19, 2020)

TumbleGeorge said:


> I read many comments about hype. This is very normal condition in consumer kind society. A.k.a hype should not be considered a violation of the rules, especially in topics discussing more or less rumors. Maybe I'm wrong?


Hype is part of the marketing strategy of any company.
Many rumors are leaked on purpose...


----------



## INSTG8R (Oct 19, 2020)

kapone32 said:


> You cannot say that a Navi card running at 2.4 GHZ would not be fast the 5700 is already plenty fast.


His speciality seems to be pulling wild, crazy numbers out of the air as facts. He had some great "numbers" on the 5700 XT that were quite amusing.


----------



## medi01 (Oct 19, 2020)

Zach_01 said:


> The 3080 10GB does not have a 60% margin... Its far less, and that is why nVidia used it as a placeholder and marketing and never intended to sell this widely.



Indeed, and that is why AMD should simply ignore 3080 and its price point when pricing own cards.


----------



## kapone32 (Oct 19, 2020)






INSTG8R said:


> His speciality seems to be pulling wild crazy numbers out of the air as facts. He had some great “numbers n the 5700XT that are quite amusing


Haha, indeed. This is one of the problems that persists with AMD GPU launches: some people think the cards are not as fast, and some will say they are much faster. The question for me is how good the card is versus my current card, including price. Period.


----------



## INSTG8R (Oct 19, 2020)

kapone32 said:


> Haha Indeed, this is one of the problems that subsist with AMD GPU launches. Either people think they are not as fast and some will say that they are much faster. The argument for me is how good is the card vs my current card including  price period.


Yeah, I didn't know my card could do the amazing things he said it could do. It was quite "educational".


----------



## Max(IT) (Oct 19, 2020)

kapone32 said:


> Haha Indeed, this is one of the problems that subsist with AMD GPU launches. Either people think they are not as fast and some will say that they are much faster. The argument for me is how good is the card vs my current card including  price period.


numbers don't lie. Just wait for the reviews....

The only problem with the reviews is they can't say anything about reliability.


----------



## nguyen (Oct 19, 2020)

Vya Domus said:


> No man, I just lack "visual acuity" let's leave it at that.
> 
> This is only for the _200-Giga-IQ-image-comparisons-of-compressed-screenshots-from-compressed-videos-not-in-sync_ ™ aficionados.
> 
> Anyway, I find it comforting that at least I am not the only one visually impaired : https://arstechnica.com/gaming/2020...of-death-stranding-is-the-definitive-version/



Cherry-picking paragraphs, aren't ya? Typical. Did you even read the entire article down to the end?
That was comparing IQ in a pre-release state of the game.

I can cherry-pick paragraphs too; this time it's the official release of Death Stranding:








*Death Stranding - Native 4K vs FidelityFX Upscaling vs DLSS 2.0* (www.dsogaming.com): "Death Stranding supports both DLSS 2.0 and FidelityFX Upscaling, so we've decided to test these re-construction techniques."






> *All in all, DLSS 2.0 is slightly better than both Native 4K and FidelityFX Upscaling*. Performance-wise, both FidelityFX and DLSS 2.0 perform similarly. FidelityFX Upscaling comes with a sharpening slider via which it can provide a sharper image than both Native 4K and DLSS 2.0. However, there is more aliasing with FidelityFX Upscaling than in both Native 4K and DLSS 2.0. On the other hand, DLSS 2.0 can eliminate more jaggies, but also introduces some visual artifacts.


----------



## kapone32 (Oct 19, 2020)

Max(IT) said:


> numbers don't lie. Just wait for the reviews....
> 
> The only problem with the reviews is they can't say anything about reliability.


Why would I wait for reviews if I know the card is faster than my 5700? Only the price will hold me back.


----------



## Vya Domus (Oct 19, 2020)

nguyen said:


> Cherry picked paragraph ain't ya, typical. Did you even read the entire article down to the end



Did you just read the text? I posted the link for a reason; the images in question are included and the differences are explained. The FidelityFX image does indeed retain more detail in the examples provided, that is, if your _visual acuity_ is high enough to notice it.



nguyen said:


> Comparing IQ in a pre-released state of the game, yeah only AMD fans can do that.



You are failing so hard it's not even funny anymore; are you trying to be this obtuse on purpose, or what? The images in question were added on the 14th of July, when the game was released. I'll tell you what fanboys really do, and that is posting a video and saying that it "looks like shit".


----------



## nguyen (Oct 19, 2020)

Vya Domus said:


> Do you just read the text ? I posted the link for a reason, the images in question are include and the differences are explained. The FIdelityFX image does indeed retain more detail in the examples provided, that is if your _visual acuity _is high enough to notice it.
> You are failing so hard it's not even funny anymore, are you trying to be this obtuse on purpose or what ? The images in question were added on 14 of July when the game was released. I'll tell you what fanboys really do, and that is posting a video and saying that it "looks like shit".



I didn't post that vid, you know, I only quoted it.
And you would be lying if you said you don't see that one is much blurrier than the other.
Oh well, so it's Ars Technica vs DSOGaming vs Digital Foundry:
DLSS vs FidelityFX, 2 to 1.


----------



## Vya Domus (Oct 19, 2020)

nguyen said:


> DLSS vs FidelityFX 2 to 1



What is this, a football match?


----------



## nguyen (Oct 19, 2020)

Vya Domus said:


> What is this a football match ?



Hey, I wasn't the first one to quote a reviewer, you know; in fact, I was against biased reviewers in the first place.
But hey, if you like to quote reviewers, then go ahead.


----------



## olymind1 (Oct 19, 2020)

Searing said:


> I've had tons and tons of AMD cards with no problems.



Same here. I saw a Rage 128 with its nice colors, and then later came the Radeons: I started with a 7500, a 9500@9700 and a 3850, then went to the green side with a GTX 460, and came back to the reds with a 6770, 6850, 7850, RX 480 4 GB and RX 570 8 GB, with no major problems except a buggy driver here and there. Then came the mining craze, and afterwards prices started to drop back to normal levels, but Nvidia increased their prices with the RTX lineup and AMD followed them too. Now I'm waiting for a cheaper RDNA2 card without driver issues for around 200-300 €, which would be a worthy upgrade for my RX 570, but as things currently are, I'll probably wait half a year or a year to buy a graphics card, and I'll be sticking to 1080p for a long time.


----------



## medi01 (Oct 19, 2020)

AMD seems to have had hardware issues with the 5700 XT series that were "black screening" cards, akin to the 3080s.
The difference is that the former was sold in much higher numbers.

That being said, I expect AMD to up its QA so this kind of shit doesn't happen again.
NV is forced to go bananas with Ampere's clocks; AMD is not.


----------



## nguyen (Oct 19, 2020)

medi01 said:


> AMD seems to have had hardware issues with 5700XT series, that were "black screening" cards, akin to 3080s.
> The difference is that the former was sold in much higher number.
> 
> That being said, I expect AMD to up it's QA for this kind of shit not to happen again.
> NV is forced to go bananas OC with Ampere, AMD is not.



I bet you that AMD is currently clocking the shit outta Big Navi as we speak. Remember the 5600 XT?


----------



## Max(IT) (Oct 19, 2020)

kapone32 said:


> Why would I wait for reviews if I know the card is faster than my 5700? Only the price will hold me back.


The new card will be faster than the card it is going to replace. Hardly a surprise.
The point is how much faster it will be and how it will compare with Nvidia's offering in the same class.



nguyen said:


> I bet you that AMD is currently clocking the shit outta Big Navi as we speak , member 5600XT ?


As they did with 5700XT... and 5600XT


----------



## INSTG8R (Oct 19, 2020)

Max(IT) said:


> the web is literally FULL of people complaining about 5700XT freezes, and if you deny it I would automatically put you on the "fanboy list". In my long experience I learned that AMD fanboys are the worst on the web, by far. They are in complete denial. No point in arguing with them.
> 
> To be crystal clear, I am a big AMD supporter since the beginning (since AMD K6 200), and I love a lot of ATI cards I bought in the past (9700Pro and 9800Pro being my all time favorites), but that doesn't make me linked to a brand no-matter-what. My main PC is Ryzen based and I'm building another one for my little son Ryzen based too.
> I bought a 5700XT because it had a good price (it was ~160€ less than the 2070 Super at the time) and a promise for good performance. But it gave me a lot of issues and I also returned one (hoping it was defective) and restored the PC several times in order to solve the issues. With no success in doing that.
> ...


As I said, give me a reproducible bug/issue and I can pass it along. "The internet" is an easy cop-out, and you can't believe everything you read. I've been on the red side since the 9600 XT, I ran Crossfire with "the dongle", and I owned the beautiful but absolutely terrible 2900 XT. I am aware of the regurgitated issues, but give me something current or outstanding that needs to be fixed. I'm on a few other big tech sites and I haven't seen any recent chatter jumping out at me regarding current issues; as a beta tester, bugs are literally what I'm looking for.
Edit: The 9800 Pro is what brought me here so long ago; I became the Pro-to-XT flash guru here and I've been here ever since.


----------



## Markosz (Oct 19, 2020)

Ah yes, a random Twitter post from a random person == confirmed.


----------



## BoboOOZ (Oct 19, 2020)

Markosz said:


> Ah yes, a random twitter post from a random person == confirmed.


The info is confirmed by other, more established leakers (NAAF, RedGamingTech). Choo, choo, choo, choo!


----------



## cueman (Oct 19, 2020)

"Infinity cache"

Yeah, it looks like all the AMD fans are talking about it. Well, it won't help; of course, if it were that simple to do, Nvidia would surely be using it too, so forget it.

Well, Big Navi gets higher clocks, OK, so what? Performance is what counts, and clock speed alone never did it for a top GPU before; take the RTX 2080 Ti, for example.

The RTX 3080 is a beast, more than many know before they use it for gaming... especially at 4K. That performance needs power!

For Big Navi, even 300 W is surely not enough.

Some wait for the RX 6700 XT; I wait for the RTX 3080 Super 20 GB and the RTX 3070, and, I guess, the most popular will be the RTX 3060 Ti.

Also, the RTX 3080 Ti is coming, as a 7 nm TSMC version with 20 GB of GDDR6X memory.

Note: AMD leaks out a lot of specs and names and so on... erhh...
But how about a few FPS results, real ones, even a few from AMD "leaks"? Why not show them?
That would be much better marketing than 2,400 MHz clocks. Can't AMD do it?


----------



## Ahhzz (Oct 19, 2020)

Boils and Ghouls, please keep in mind that the forums are to be a place to express opinions, share ideas and knowledge, and interact with others of the tech community, *while maintaining respect for each other. *Name calling and other schoolyard behavior is unacceptable. Do not take personal digs at each other, do not call names, and if you just flat can't stand someone else's opinion, put them on your ignore list and move on. 
We can't have a forum without you guys and your knowledge and participation. However, we _*will not*_ have a forum where people feel they are attacked for politely sharing their opinions.
Feel free to review the forum guidelines if you can't remember your behavioral rules for being in polite society, and remember the unwritten golden rule: Don't be a Dick.


----------



## BoboOOZ (Oct 19, 2020)

cueman said:


> note: alot specks and name or so amd leaks out...erhh..


I love speck, especially on my pizzas


----------



## r9 (Oct 19, 2020)

xman2007 said:


> I don't get this notion that for the same performance they should be 50-100 cheaper than their nvidia counterpart though its fairly common presumption



I'm a fanboy of paying less, and that's only possible with Big Navi being competitive, but unfortunately I don't see it matching the RTX 3080.
Big Navi has 2x the RX 5700 XT's CUs: 80. But the thing is that performance doesn't scale 100%, especially if you keep the same memory bus that now has to feed double the CUs. Higher clocks and the cache should help with the performance scaling, but it's very optimistic to think they can achieve 2x the 5700 XT's performance, as that is what you need to match the RTX 3080. Plus the biggest unknown of all: how does AMD's ray tracing stack up in quality/performance vs Nvidia's? I really hope AMD pulls off a miracle.


----------



## BoboOOZ (Oct 19, 2020)

r9 said:


> Plus all the biggest unknown how does AMD raytracing stacks up quality/per vs the nvidia one. I really hope AMD pulls a miracle.


AMD does ray tracing the way it will be done in all coming next-gen games, except for a handful of games paid by Nvidia to do it RTX-style. That's more than enough for me. It's ray tracing that will actually work in most games.


----------



## EarthDog (Oct 19, 2020)

BoboOOZ said:


> AMD does raytracing the way it will be done in all coming next-gen games, excepting a handful of games paid by Nvidia to do it RTX style. That's more than enough for me. It's raytracing that will actually work in most games.


...you're implying RT doesn't work with Nvidia? I'm confused about what your point is here.

Don't they both use MS DXR?


----------



## BoboOOZ (Oct 19, 2020)

EarthDog said:


> ...you're inferring RT doesn't work with Nvida? I'm confused at what your point is  here.


They're two different implementations; chances are (99.99%) that they require different inputs to work.

Developers need to put some things in their games in order for them to work with RTX, and they need to put in different things for RDNA2 RT to work. I have no idea how difficult the translation from the AMD to the Nvidia format is, but what we know is that most developers will make the effort for the consoles; whether they will put in further effort for the Nvidia format remains to be seen.


----------



## INSTG8R (Oct 19, 2020)

BoboOOZ said:


> It's 2 different, implementations, most chances (99.99%) are that they require different inputs to work.
> 
> Developers need to put some things in the games in order for them to work with RTX, they need to put different things for RDNA2 RT to work. I have no idea how difficult is the translation from the AMD to the Nvidia format, but what we know is that most developers will do the effort for the consoles, whether they will add further effort for the Nvidia format remains to be seen.


You're complicating it. RTX still uses base DXR, which is what AMD will use. Nvidia doesn't have an exclusive RT implementation, just a hardware solution to make use of it.


----------



## TheoneandonlyMrK (Oct 19, 2020)

BoboOOZ said:


> It's 2 different, implementations, most chances (99.99%) are that they require different inputs to work.
> 
> Developers need to put some things in the games in order for them to work with RTX, they need to put different things for RDNA2 RT to work. I have no idea how difficult is the translation from the AMD to the Nvidia format, but what we know is that most developers will do the effort for the consoles, whether they will add further effort for the Nvidia format remains to be seen.


It's true to say that each implementation is architecturally different, but as EarthDog says, they talk through an API: DX12 and DX12 Ultimate, using DXR.
So optimizing a game's use of features will be necessary, not changing the API.

RTX is Nvidia's implementation of DXR.


----------



## BoboOOZ (Oct 19, 2020)

INSTG8R said:


> Your complicating it RTX still using base DXR which is what AMD will use. Nvidia doesn’t have an exclusive RYT implementation just a a hardware solution to make use of it


True, my bad.

Still, next-gen games will be developed, optimized and tested so that they look good with RDNA2 RT implementations.


----------



## EarthDog (Oct 19, 2020)

I'm curious, what in that statement made you two (@theoneandonlymrk, @INSTG8R) thank that post? What was correct about any of it? (I'm genuinely asking; judging by both of your responses it seems patently false?)

AMD and Nvidia have 'ways' to do things, but it's all DXR-based with hardware support. "The way it will be done in all next-gen consoles..." is the same way it will be done now... right?


----------



## VPII (Oct 19, 2020)

I think it all depends on what AMD's answer to DLSS 2.1 will be. Look, it would be great to have competition, as we need it.


----------



## BoboOOZ (Oct 19, 2020)

EarthDog said:


> I'm curious, what in that statement made you two (@theoneandonlymrk, @INSTG8R) thank that post? What was correct about any of it?  (I'm genuinely asking. Judging by both of your responses it seems patently false?)


I think this: 


BoboOOZ said:


> Still, next-gen games will be developed, optimized and tested so that they look good with RDNA2 RT implementations.


----------



## EarthDog (Oct 19, 2020)

BoboOOZ said:


> I think this:


It is still based on DXR... and any improvements will take time. Look at how RT has come along: slow and steady. I can see more momentum since it's in consoles, but, like the use of more than a few cores/threads we've been waiting for since 2010, I won't hold my breath that it will make a significant difference soon/this generation.

I have to imagine this first-gen implementation, like NV's, will need some bugs worked out and isn't as efficient/effective as the second-gen hardware driving DXR that NV is running. I also would have imagined AMD letting loose some DXR performance metrics like they did with raster, but either it isn't great, or they are holding the info close to their vest; either is perfectly viable.

Only time will tell.

Only time will tell.


----------



## TheoneandonlyMrK (Oct 19, 2020)

EarthDog said:


> I'm curious, what in that statement made you two (@theoneandonlymrk, @INSTG8R) thank that post? What was correct about any of it?  (I'm genuinely asking. Judging by both of your responses it seems patently false?).
> 
> AMD and Nvidia have 'ways' to do things, but its all DXR based with hardware support. "The way it will be done in all next-gen console...." is the same way it will be done now...right?


It's still not clear; what is clear is that the Xbox and PC implementations will be similar.


----------



## BoboOOZ (Oct 19, 2020)

VPII said:


> I think it all depends what AMD's answer will be for DLSS 2.1.  Look it would be great to have competition as we need it.


I haven't heard any rumors about a DLSS competitor coming from AMD. AMD users might have to stick with native resolutions this gen.


----------



## INSTG8R (Oct 19, 2020)

BoboOOZ said:


> True, my bad.
> 
> Still, next-gen games will be developed, optimized and tested so that they look good with RDNA2 RT implementations.


Which will pretty much have to be based on DXR (though Sony kinda throws a parity wrench into that; pretty sure MS won't be handing over DX).


----------



## EarthDog (Oct 19, 2020)

theoneandonlymrk said:


> It's still not clear , what is clear is that the Xbox and pc implementation will be similar.


Ok, so, basically, don't buy into that unsupported statement. Gotcha.


----------



## BoboOOZ (Oct 19, 2020)

EarthDog said:


> I have to imagine this 1st gen implementation, like NV, will need some bugs worked out and isn't as efficient/effective as second gen hardware driving DXR like NV is running. I also would have imagined AMD to let loose some DXR performance metrics like they did with raster based, but, it either isn't great, or they are holding the info close to their vest... either are perfectly viable.
> 
> Only time will tell.


Frankly, if you look at the benchmark results with RT on and off between Turing and Ampere, the performance hit seems to be the same or worse for Ampere. The only relative performance increase seems to come in fully path-traced games, which are games with very light graphical requirements. For normal AAA games, the gains seem to come from the raw performance increase.

But yeah, maybe we'll see something different in the future.


----------



## INSTG8R (Oct 19, 2020)

Well, I wonder if AMD ditching the compute cores may have been premature; they could have repurposed them like the RT cores NV has baked in. But that is me trying to find a use for them on gaming cards, where they were just taking up die space.


----------



## EarthDog (Oct 19, 2020)

BoboOOZ said:


> Frankly, if you look at the benchmark results with RT on and off between Turing and Ampere, the performance hit seems to be the same or worse for Ampere.


You may want to do the math on that... at least looking at TPU's results with Metro and Control, those results do not speak to your point (assuming my math was correct). I'd stop digging the hole you're in if I were you.


----------



## TheoneandonlyMrK (Oct 19, 2020)

EarthDog said:


> Ok, so, basically, don't buy into that unsupported statement. Gotcha.


There are likely to be exclusively supported features in some games on some hardware, but that's going to be a dev-time or marketing decision, IMHO.
It's also clear to me that there are going to be performance differences between the, let's say, four possible implementations: Xbox, PS5, RTX DXR and RDNA DXR.

These subtle differences will show themselves with time; for example, RTX might shine at global illumination with rays while the PS5 does reflections marginally better. Those are purely hypothetical examples, not my opinions of reality, just to describe the possibility of different outcomes.


----------



## Zach_01 (Oct 19, 2020)

r9 said:


> I'm a fanboy of paying less and that only possible with big navi being competitive but unfortunately I don't see it matching rtx 3080.
> Big navi has 2x rx5700 = 80. But the thing is that the performance doesn't scale 100% especially if you keep the same mem bus that now has to feed double the CUs. Higher clocks and the cache should help with the performance scaling but it's very optimistic that they can achieve 2x 5700xt perf as that what you need to match rtx 3080. Plus all the biggest unknown how does AMD raytracing stacks up quality/per vs the nvidia one. I really hope AMD pulls a miracle.


This is where you're wrong: thinking that the 80 CU GPU is 2x the 5700 XT in some kind of Crossfire. Understand that RDNA2 is a different architecture, an improved one... It's not how you think it is. The architecture will have better IPC, better performance/watt and at least 10% higher boost clocks. In raw numbers its performance is higher than 2x the 5700 XT; in real life it will be around 2x the 5700 XT, and that places it matching the 3080.
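The scaling argument above can be put into a quick back-of-envelope model; this is a minimal sketch with illustrative inputs (the 85% scaling efficiency, +10% IPC and clock figures are assumptions for the example, not AMD specifications):

```python
# Naive estimate: performance scales with CUs x clock x IPC, discounted
# by a scaling-efficiency factor, since doubling CUs never doubles FPS.
def relative_perf(cu_ratio: float, clock_ratio: float,
                  ipc_ratio: float, scaling_eff: float) -> float:
    """Performance relative to the baseline card (RX 5700 XT = 1.0)."""
    return cu_ratio * clock_ratio * ipc_ratio * scaling_eff

# Illustrative inputs: 80 vs 40 CUs, 2.3 vs ~1.9 GHz game clock,
# a guessed +10% IPC and 85% CU-scaling efficiency.
estimate = relative_perf(cu_ratio=80 / 40,
                         clock_ratio=2.3 / 1.9,
                         ipc_ratio=1.10,
                         scaling_eff=0.85)
print(f"~{estimate:.2f}x an RX 5700 XT")  # roughly 2.26x with these guesses
```

With these made-up factors the estimate lands near 2.3x a 5700 XT, which is why "around 2x, maybe matching a 3080" is a plausible reading of the same inputs.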



cueman said:


> ''infinity cache'
> 
> yea, looks all amd fans speek it..well it dont help, of coz, if so nvidia use it for sure,its so simple to do, so forget it.
> 
> ...


And you are wrong in your assumptions, because when nVidia doesn't use something, that doesn't mean it's without value. nVidia is not the ruler of tech nor the god of GPUs. nVidia can't use big caches right now because the architecture in use is compute-based/oriented and not game-oriented. It happens to do well in games, particularly at 4K; if you look, the gains at 1080p/1440p over the 2080 Ti are smaller.

AMD and nVidia are choosing different paths. That does not make one or the other better or worse. It will come down to users to choose what is best for them and what they want in the end.


----------



## BoboOOZ (Oct 19, 2020)

INSTG8R said:


> Well I wonder if AMD ditching the compute cores may have been premature they could have repurposed them as RT cores NV has baked in. But that is me trying to find a use for them on gaming cards where they were just taking up die space


If what is required is just floating-point operations, running them on the shaders would be much more efficient. It sounds to me like Nvidia took the reverse approach: they had some compute cores lying around, so let's find them a job to do; but the problem is that in many situations they're still a waste of die space.


----------



## Zach_01 (Oct 19, 2020)

EarthDog said:


> You may want to do the math on that... at least looking at TPUs results with Metro and Control, those results do not speak to your point (assuming my math was correct). I'd stop digging the hole you're in if I was you.


I will agree: Ampere's RTRT performance hit is less than Turing's.

But to be honest, it's way less than the 2x that Jensen advertised.


----------



## BoboOOZ (Oct 19, 2020)

EarthDog said:


> You may want to do the math on that... at least looking at TPUs results with Metro and Control, those results do not speak to your point (assuming my math was correct). I'd stop digging the hole you're in if I was you.


Meh, we're discussing. I discuss to learn things; being wrong is a natural part of learning. I'm an adult, I don't need to be right all the time to feel good.
But I'll look again at those benchmarks and see if the numbers I remember are wrong.

Edit:

Just looked at this: the performance hit of RT+DLSS is the same on the 2080 Ti and the 3080 at 4K, as I remembered:


			https://tpucdn.com/review/evga-geforce-rtx-3080-ftw3-ultra/images/control-rtx-3840-2160.png
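The "same performance hit" claim comes down to a simple ratio; here is a minimal sketch of the arithmetic using hypothetical FPS figures (not TPU's actual measurements, which are in the chart above):

```python
# The relevant number is the *relative* cost of enabling RT:
# (fps_off - fps_on) / fps_off. Absolute FPS drops are misleading,
# because a faster card loses more raw frames for the same relative hit.
def rt_hit(fps_off: float, fps_on: float) -> float:
    """Fraction of performance lost when RT is enabled."""
    return (fps_off - fps_on) / fps_off

# Hypothetical Control-style 4K numbers, purely for illustration:
turing_hit = rt_hit(fps_off=60.0, fps_on=40.0)  # loses 20 of 60 frames
ampere_hit = rt_hit(fps_off=90.0, fps_on=63.0)  # loses 27 of 90 frames

print(f"Turing: {turing_hit:.0%} hit, Ampere: {ampere_hit:.0%} hit")
```

In this made-up example Ampere drops more raw frames (27 vs 20) yet takes a smaller relative hit (30% vs 33%), which is why the two sides of the argument can read the same chart differently.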


----------



## EarthDog (Oct 19, 2020)

BoboOOZ said:


> Meh, we're discussing, I discuss to learn things, being wrong is a natural part of learning. I(m an adult, I don't need to right all the time to feel good.
> But I'll look again at those benchmarks and see if the numbers I remember are wrong.


It is a discussion indeed. I was simply noting that your record in this thread isn't stellar this morning. We all do it (are wrong sometimes)... but the goal is to get the correct information out so users can make their own decisions based on facts. Don't take it personally.


----------



## BoboOOZ (Oct 19, 2020)

EarthDog said:


> It is a discussion indeed. I was simply noting your record in this thread isn't stellar this morning. We all do it (are wrong sometimes)... but the goal is to get the correct information out so users can make their own decisions based on facts. Don't take it personally.


I don't. I know some need to be right more than others; we're all different, and that's fine.

The fact is that most next-gen games will be developed, tested and optimized for RDNA2 GPUs, and it would be naive to imagine this will have no influence.


----------



## EarthDog (Oct 19, 2020)

BoboOOZ said:


> I don't. I know some need to be right more than others; we're all different, that's fine.
> 
> The fact is that most next-gen games will be developed, tested and optimized for RDNA2 GPU's, and it would be naive to imagine this will have no influence.


I'll ignore the passive-aggressive barb there....

I used the 3080 reference review... and just RT, since you never mentioned DLSS until now. DLSS is different from RT, bud. Tough to hit a moving target.


BoboOOZ said:


> Frankly, if you *look at the benchmark results with RT on and off between Turing and Ampere, the performance hit seems to be the same or worse for Ampere. *The only relative performance increase seems to come for fully path-traced games, which are games with very light graphical requirements. For normal AAA games, the gains seem to come from raw performance increase.




Anyway, I digress. The point(s) has(have) been made and users can do what they please with the information out there.


----------



## BoboOOZ (Oct 19, 2020)

EarthDog said:


> I'll ignore the passive aggressive barb there....


It wasn't directed at you specifically; it's most important for people upgrading in the coming months, and I'm one of them.


EarthDog said:


> I used the 3080 reference review.... and just RT since you never mentioned DLSS until now. DLSS is different than RT, bud. Tough to hit a moving target.


I used this review because it has the latest drivers, as you had pointed out that my info was stale. Also, reviewers turn DLSS and RTX on because otherwise framerates are still too low, so it's normal to look at them that way. Anyway, are you trying to imply that DLSS performance has actually dropped while RTX performance has increased with Ampere? Or do you have numbers that show a different performance hit in Control than what I pointed out? Please share; don't worry about statistics on how many times somebody was right or wrong this morning, we might just learn something new in the process.


----------



## Zach_01 (Oct 19, 2020)

I’m not going to spend time and space on the specifics, but from what I've seen so far from reviews, the Ampere architecture is improved over Turing in all aspects, though most of the improvements are marginal. It’s mostly the larger number of processing units (FP units) that makes the 3080 at best ~32% faster than the 2080 Ti.

What bothers me is that nVidia’s advertising and presentation make it look like a huge leap forward, when it certainly is not.


----------



## EarthDog (Oct 19, 2020)

BoboOOZ said:


> It wasn't directed at you specifically; it's most important for people upgrading in the coming months, and I'm one of them.
> 
> I used this review because it has the latest drivers, as you had pointed out that my info was stale. Also, reviewers turn DLSS and RTX on because otherwise framerates are still too low, so it's normal to look at them that way. Anyway, are you trying to imply that DLSS performance has actually dropped while RTX performance has increased with Ampere? Or do you have numbers that show a different performance hit in Control than what I pointed out? Please share; don't worry about statistics on how many times somebody was right or wrong this morning, we might just learn something new in the process.


I just answered your question as you asked it; moving the goalposts from there was on you. A 3080 runs over 100 FPS in Metro and almost 60 FPS in Control without DLSS (1440p)... just saying. If you go up to 4K it's needed in some titles, but who's talking about 4K and the 2% of users who game at it (according to Steam)? Not me. I specifically mentioned 1440p (a good balance between number of users and a GPU-heavy resolution).

I'm not implying, nor did I infer (or straight up say), anything about DLSS. That was you. Please feel free to look at the results in the TPU 3080/3090 review of your choice; again, DLSS was never mentioned by me. Please take the time to do that. I did the math and came up with a different answer, and others support that assertion. The onus isn't on me; the math is there in the review I mentioned. I don't see it any different in the latest review (so long as you stay in the lane of RT/1440p/no DLSS), considering all 3080 reviews here at TPU were on 9/16 or 9/17 (according to Google). Again, DLSS isn't RT, so I'm not sure why it's even being brought up at this point.

PS - It holds true in 4K on your review.


----------



## Blueberries (Oct 19, 2020)

TPU... where news articles are based entirely on a since-deleted tweet from some random nobody with 100 followers, with random specs and no evidence or explanation of where the numbers came from, followed by 10 pages of people arguing about literally nothing.


----------



## r9 (Oct 19, 2020)

Zach_01 said:


> This is where you're wrong: thinking that the 80 CU GPU is 2x 5700 XT in some kind of CrossFire. Understand that RDNA2 is a different, improved architecture... It's not how you think it is. The architecture will have better IPC, better performance/watt, and at least 10% higher boost clocks. In raw numbers its performance is higher than 2x 5700 XT; in real life it will be around 2x 5700 XT, and that places it matching the 3080.
> 
> 
> And you are wrong in your assumptions, because when nVidia doesn't use something, it doesn't mean it's without value. nVidia is not the ruler of tech nor the god of GPUs. nVidia can't use big caches right now because the architecture in use is compute-oriented, not game-oriented. It happens to do well in games, particularly at 4K; the gains at 1080p/1440p over the 2080 Ti are smaller.
> ...


Let's hope I'm wrong.



BoboOOZ said:


> AMD does raytracing the way it will be done in all coming next-gen games, excepting a handful of games paid by Nvidia to do it RTX style. That's more than enough for me. It's raytracing that will actually work in most games.


It will be impossible to compare apples to apples with nvidia's implementation. I hope enabling it on AMD won't cut the FPS in half like on nvidia GPUs. Fun times ahead... can't wait for the reviews to come out! Usually reviews put an end to this type of discussion, but this time I think it will be just the beginning. Lol


----------



## BoboOOZ (Oct 19, 2020)

r9 said:


> It will be impossible to compare apples to apples with nvidia's implementation. I hope enabling on amd won't cut the fps in half like on nvidia gpus. Fun times ahead ... can't wait for the reviews to come out! Usually the reviews put an end to this type of discussions but this time I think it will be just the beginning. Lol


 Discussions never end


----------



## r9 (Oct 19, 2020)

EarthDog said:


> ...you're inferring RT doesn't work with Nvida? I'm confused at what your point is  here.
> 
> Don't they both use MS DXR?


That's a good point; if RT is enabled via DX, we might get an apples-to-apples comparison after all.


----------



## MarcingMarshmallow (Oct 19, 2020)

cueman said:


> Interesting. Looks like it can't beat the RTX 3080, so AMD dropped its TDP down a good bit.
> Those clocks sound high, very high.
> 
> Under 260 W and RTX 3080 speed? No way. Not 4K speed.



Keep in mind that this is only the GPU's wattage; the whole card is estimated at 300 W, so the gap is much smaller.


----------






## medi01 (Oct 19, 2020)

MarcingMarshmallow said:


> Keep in mind that this is only gpu wattage, the whole card is estimated to be 300 watts, so the gap is much smaller.



Thanks for registering to say that. Very helpful indeed.


----------



## Totally (Oct 19, 2020)

nguyen said:


> It was benchmarked with 3950X test rig with PCIe 4.0
> 5700XT gain a few % with PCIe 4.0 and 2070S lose a few % with 3950X vs 10900K, some game even run very poorly with 3950X
> 
> Also HUB refuse to include DLSS 2.0 results, which RTX users will gladly use on any title that support it.
> ...



You make it sound as if the game only runs at 4K60 on max settings and is unplayable with anything less than that.








Max(IT) said:


> a lot of driver related issues (the situation is better now, but not completely solved) and far from stellar sales.
> And the lack of RT that made it "old" since the beginning...



As pointed out by many in this thread, a very small portion of users were actually affected, and many retracted their claims after the fact once they found out that something else was causing the issue, not the card or drivers. Others were just hopping on the bandwagon and exaggerating things. The latter group is YOU.



> launch drivers ? Drivers were terrible until last may, and even today they are far from being very stable.



I'm not dismissing that there are issues, but other people's problems aren't mine.



> I don't want to have an argue with a fanboy, which clearly you are, so I will cut it short. I have NO brand loyalty at all.



I'm a fanboy for disagreeing and pointing out your gross exaggeration? If that's what it takes to be a fanboy then I'm a fanboy. Vote me.



> I have NO brand loyalty at all.



No one asked, no one cared, and how does that preclude you from being biased as hell? Sus, if you asked me.



> I have a PC with a Ryzen 3900X and another PC with a 5700XT and the VGA is terrible. Performance and price were good, but drivers made me mad for a whole year. Even today, when the situation improved, sometimes I'm experiencing black screens and freezes.
> 
> I am going to give AMD another chance with the new GPU, if it will be a good product, but they have to make it better than the crappy 5700XT...



Says he had a crappy experience with a trash product, sticks with the same brand, and accuses others of being fanboys. I'm gonna vote fanboy.


----------



## Max(IT) (Oct 20, 2020)

Totally said:


> You make it sound as if the game only runs at 4k60 on max settings and is unplayable with anything less than that.
> 
> 
> 
> ...


A small portion of users... lol...

As I said: a waste of time.


----------



## DeeJay1001 (Oct 20, 2020)

BMfan80 said:


> That would be awesome,I had a 4770 and when overclocked it was a beast of a card.
> I have a screenshot of it back in the day with a furmark score that is higher than a GTX470
> 
> I'm waiting to see what the new cards can do,I would like to replace my 1080ti with one.



I also had a 4770; it was one of the first 4770s Newegg shipped out. It was my first real GPU and I've been hooked ever since. I put over 900 hours of COD MW2 on that card.

That thing still runs like a champ; my little brother plays Minecraft on it nearly every day.

That little 4770 has been pumping out frames and generating joy for over 10 years.


----------



## r9 (Oct 20, 2020)

I don't know how you guys pick which one to buy, but the performance summary from Wizz's reviews has been my bible. I cross-reference it with price and availability. And since AMD/ATi has historically been the underdog, they always undercut nvidia on price, so I end up mostly with AMD/ATi cards.


----------



## Totally (Oct 20, 2020)

Max(IT) said:


> A small portion of users ... lol ...
> 
> as I said : a waste of time.



Sounds like you have actual concrete data on the number of cards sold vs. affected. Care to share?



r9 said:


> I don't know how you guys pick which one to buy, but the performance summary from Wizz's reviews has been my bible. I cross-reference it with price and availability. And since AMD/ATi has historically been the underdog, they always undercut nvidia on price, so I end up mostly with AMD/ATi cards.



same


----------



## TheoneandonlyMrK (Oct 20, 2020)

Totally said:


> Sounds like you have actual concrete data on the numbers of cards sold vs affected, care to share.
> 
> 
> 
> same


Seconded, Data please.


----------



## r9 (Oct 20, 2020)

theoneandonlymrk said:


> Seconded, Data please.


----------



## Aquinus (Oct 20, 2020)

Raevenlord said:


> The specs outed by Patrick are promising, to say the least; that AMD's Big Navi can reach clocks in excess of 2.4 GHz with a 250 W+ TGP (quoted at around 255 W) is certainly good news.


I predict that "Big Navi" is going to be a 20-30% smaller die than GA102 if these power and performance figures are accurate. I guess we'll only have to wait a little more than a week to find out.


----------



## Zach_01 (Oct 20, 2020)

Aquinus said:


> I predict that "Big Navi" is going to be a 20-30% smaller die than GA102 if these power and performance figures are accurate. I guess we'll only have to wait a little more than a week to find out.


There is already a leaked picture of the big die, estimated at around 536 mm². It is smaller than GA102 (628 mm²). Still, if all the info is true, the actual RDNA2 GPU will draw more power than GA102.
The two rival cards may have the same Total Board Power (~320 W), but let's not forget that the 3080 uses GDDR6X, which draws a hefty ~70 W, versus the 20-30 W that "simple" GDDR6 draws.
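The board-power split above is trivial arithmetic; the ~320 W TBP and memory-power figures come from the post and are rough estimates, not measurements:

```python
# Rough split of Total Board Power (TBP) into memory vs. everything else,
# using the approximate figures from the post (estimates, not measurements).

def non_memory_power(tbp_w: float, mem_w: float) -> float:
    """Watts left for the GPU die, VRM losses, fans, etc. after memory."""
    return tbp_w - mem_w

TBP = 320.0     # ~320 W assumed for both cards
GDDR6X = 70.0   # RTX 3080 memory, per the post
GDDR6 = 25.0    # midpoint of the 20-30 W estimate for Big Navi

print(non_memory_power(TBP, GDDR6X))  # 250.0 -> 3080 budget outside memory
print(non_memory_power(TBP, GDDR6))   # 295.0 -> Big Navi budget outside memory
```

On these assumptions, Big Navi's die would have roughly 45 W more headroom than GA102 at the same board power.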


----------



## r9 (Oct 21, 2020)

Zach_01 said:


> There is already a leaked picture of the big die, estimated at around 536 mm². It is smaller than GA102 (628 mm²). Still, if all the info is true, the actual RDNA2 GPU will draw more power than GA102.
> The two rival cards may have the same Total Board Power (~320 W), but let's not forget that the 3080 uses GDDR6X, which draws a hefty ~70 W, versus the 20-30 W that "simple" GDDR6 draws.



Just to put things in perspective: the Ryzen 3800X's 74 mm² 7 nm chiplet sells for $399 and doesn't come with VRAM, a PCB, or a VRM.
Big Navi at 536 mm² will sell for ~$699 and needs a PCB, VRAM, and a VRM. On top of that, add the much lower yield due to the much larger die size.
In case anyone wonders why AMD puts so much more effort into the CPU business.
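The yield point can be illustrated with the standard Poisson die-yield model, yield ≈ exp(-D0·A); the defect density below is an assumed illustrative value, not an actual foundry figure:

```python
import math

# Poisson yield model: fraction of good dies ~ exp(-defect_density * area).
# D0 below is an assumed illustrative value, not an actual TSMC N7 figure.
D0 = 0.1  # defects per cm^2 (assumption)

def die_yield(area_mm2: float, d0: float = D0) -> float:
    """Estimated fraction of defect-free dies for a given die area."""
    return math.exp(-d0 * area_mm2 / 100.0)  # convert mm^2 to cm^2

for name, area in [("Zen 2 chiplet (~74 mm^2)", 74.0),
                   ("Big Navi (~536 mm^2)", 536.0)]:
    print(f"{name}: ~{die_yield(area):.0%} good dies")
```

Even at a modest assumed defect density, the big die loses a third or more of its candidates while the tiny chiplet loses only a few percent, which is the cost argument in a nutshell.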


----------



## lesovers (Oct 22, 2020)

Max(IT) said:


> Yes, I have nothing to complain about image quality (AMD/ATi usually are better on that).
> I found the most stable drivers for me to be 20.4.2 (IIRC).
> The black screen/freeze bug is very subtle: I can play the same game for weeks (literally) without any issue and then ... bang ! Three crashes in a row on the same day !
> It is driving me crazy ...
> ...




Thanks. The 20.2.2 driver still gives me the best results so far, so this is my final driver before purchasing the new Navi 6800 XT or 6900 XT.
I hope the new hardware just works like the RX 580 and has no driver bugs, please!


----------



## DeeDee Ranged (Oct 27, 2020)

Aquinus said:


> I predict that "Big Navi" is going to be a 20-30% smaller die than GA102 if these power and performance figures are accurate. I guess we'll only have to wait a little more than a week to find out.



The RX 6800 XT performs better than the RTX 3080 FE. Have a look at Igor's Lab; he's usually well informed.








3DMark in Ultra-HD - Benchmarks of the RX 6800XT with and without Raytracing appeared | igor'sLAB (www.igorslab.de)




cheers


----------



## TumbleGeorge (Dec 26, 2020)

Did AMD sacrifice color/contrast quality or something else in RDNA2 to get a little more FPS? And did they remove some options in settings (AMD Fluid Motion)?

Or are these just rumors?


----------



## jigar2speed (Jan 6, 2021)

TumbleGeorge said:


> Did AMD sacrifice color/contrast quality or something else in RDNA2 to get a little more FPS? And did they remove some options in settings (AMD Fluid Motion)?
> 
> Or are these just rumors?



No such evidence yet


----------



## mtcn77 (Jan 6, 2021)

TumbleGeorge said:


> Did AMD sacrifice color/contrast quality or something else in RDNA2 to get a little more FPS? And did they remove some options in settings (AMD Fluid Motion)?
> 
> Or are these just rumors?


Are you from the land down under? Because it's the other way around...


----------

