# AMD Radeon RX 6800 XT



## W1zzard (Nov 18, 2020)

The Radeon RX 6800 XT is a worthy opponent to the NVIDIA GeForce RTX 3080. It offers similar performance at better power efficiency and lower noise levels. In our Radeon RX 6800 XT review, we also take a closer look at raytracing and "Rage Mode", a new 1-click method for overclocking.

*Show full review*


----------



## claylomax (Nov 18, 2020)

Awesome; feels like we've been waiting ages for this!


----------



## spnidel (Nov 18, 2020)

that power consumption, and noise levels - wow!


----------



## Sandbo (Nov 18, 2020)

Not surprisingly, the ray-tracing performance is quite a bit behind. Makes it difficult to decide for me as I want a card for ray-traced Minecraft.
Hope they will catch up in the future.


----------



## z1n0x (Nov 18, 2020)

spnidel said:


> that power consumption, and noise levels - wow!


Daaamn!

I think AIBs are going to have a lot of fun with these babies.


----------



## okbuddy (Nov 18, 2020)

how on earth 5% worse than 3080


----------



## birdie (Nov 18, 2020)

TLDR:

- Truly stellar rasterization performance per watt thanks to the 7nm node.
- Finally solved multi-monitor idle power consumption!!
- Quite poor RTRT performance (I expected something close to the RTX 2080 Ti; nope, far from it).
- No tensor cores (they are not just for DLSS but for various handy AI features like background noise removal, background image removal and others).
- Horrible H.264 hardware encoder.
- Almost no attempt at creating decent competition. I remember when AMD used to fiercely compete with NVIDIA and Intel. AMD 2020: profits and brand value (fueled by Ryzen 5000) first.

Overall: kudos to the RTG marketing machine, which never fails to overhype and underdeliver. In terms of being future-proof, people should probably wait for RDNA 3.0 or buy ... NVIDIA. Even 3-5 years from now you'll be able to play the most demanding titles thanks to DLSS just by lowering resolution.


----------



## Rais (Nov 18, 2020)

What a destructive launch, welcome back AMD


----------



## dicktracy (Nov 18, 2020)

It's a 3080 in old games and a 3070 in new games (RT). No thanks.


----------



## MxPhenom 216 (Nov 18, 2020)

...and I'm not impressed. Exactly where I thought these cards would sit, and ray tracing performance is dog shit.

Performance per watt runs circles around Nvidia though.


----------



## birdie (Nov 18, 2020)

Rais said:


> What a destructive launch, welcome back AMD



What or whom have they "destructed"? I'm truly curious. As far as I can see, these cards are for hardcore AMD fans who will continue to say that RTRT and hardware AI acceleration are irrelevant. Yes, they are irrelevant on RX 6000 hardware, but they are very much viable on the RTX 3000 series, which thanks to Tensor Cores lets you enable RTRT/DLSS, game at 4K, and implement some cool AI features.


----------



## Sandbo (Nov 18, 2020)

okbuddy said:


> how on earth 5% worse than 3080


While being 9% cheaper, I consider that a win!
Then again, they might just end up being the same (and ridiculously high) price.
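If you work out the value math (the ~5% gap is from the review; the $649 vs $699 launch MSRPs are assumed here for illustration), the 6800 XT still comes out ahead per dollar:

```python
# Rough performance-per-dollar comparison.
# Relative performance is the review's ~5% gap; launch MSRPs are assumed.
cards = {
    "RX 6800 XT": {"perf": 0.95, "price": 649},
    "RTX 3080":   {"perf": 1.00, "price": 699},
}

for name, c in cards.items():
    value = c["perf"] / c["price"] * 100  # performance points per $100 spent
    print(f"{name}: {value:.3f} perf per $100")
```

Of course the whole comparison collapses if street prices land well above MSRP.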


----------



## Vya Domus (Nov 18, 2020)

Almost 100W less on average with that insane chunk of power hungry cache, confirmation that Samsung's node is utter crap, my God how the tables have turned. And all of that just so that Nvidia could save probably a couple of bucks per chip.


----------



## Night (Nov 18, 2020)

On the Spec page, it's mentioned 6800 and 6800 XT have 23 billion transistors, but on AMD's spec sheet they both have 26.8 billions.


----------



## wolf (Nov 18, 2020)

Well done AMD, I feel like these are the first products in a few gens that really deliver. Truly competitive and right up to scratch. The next gen from both camps is going to be *HOT*.


----------



## puma99dk| (Nov 18, 2020)

Vya Domus said:


> Almost 100W less on average with that insane chunk of power hungry cache, confirmation that Samsung's node is utter crap, my God how the tables have turned. And all of that just so that Nvidia could save probably a couple of bucks per chip.



Yeah, it's awesome seeing AMD rise again. I am waiting for Sapphire's Nitro card this time around; it should be released on the 25th of November.


----------



## W1zzard (Nov 18, 2020)

Night said:


> On the Spec page, it's mentioned 6800 and 6800 XT have 23 billion transistors, but on AMD's spec sheet they both have 26.8 billions.


Fixed, thanks! GPU-Z has the wrong value, too


----------



## Shatun_Bear (Nov 18, 2020)

Impressive almost 10% gaming performance increase from overclocking, taking it above the 3080 and level with 3090... Nvidia in trouble indeed.


----------



## BakerMan1971 (Nov 18, 2020)

As my current 1080Ti is plenty for my 1440p gaming, the upgrade to consider would have been for Raytracing and other benefits, so still Nvidia for future me at the moment but those results are very impressive


----------



## MxPhenom 216 (Nov 18, 2020)

Shatun_Bear said:


> Impressive almost 10% gaming performance increase from overclocking, taking it above the 3080 and level with 3090... Nvidia in trouble indeed.



You can overclock a 3080 too...


----------



## Khonjel (Nov 18, 2020)

Shatun_Bear said:


> Impressive almost 10% gaming performance increase from overclocking, taking it above the 3080 and level with 3090... Nvidia in trouble indeed.


Yep. Wanted to post just about that. These things are gonna be overclocking monsters.



MxPhenom 216 said:


> You can overclock a 3080 too...


I checked TPU's FE review just now. 3.9% performance boost from OC.


----------



## Frick (Nov 18, 2020)

The problem will be price, at least in Sweden. No store would make a profit on MSRP (according to Sweclockers anyway), and it's unclear what prices we'll be looking at in future.


----------



## MxPhenom 216 (Nov 18, 2020)

Frick said:


> The problem will be price, at least in Sweden. No store would make a profit on MSRP (according to Sweclockers anyway), and it's unclear what prices we'll be looking at in future.



If you can ever get one at retail that isn't $1000+ off eBay.


----------



## z1n0x (Nov 18, 2020)

Good thing copium is legal worldwide, i see some are fixing themselves already.


----------



## spnidel (Nov 18, 2020)

dicktracy said:


> It's a 3080 in old games and a 3070 in new games (RT). No thanks.


you weren't going to buy it anyway, mr shill



z1n0x said:


> Good thing copium is legal worldwide, i see some are fixing themselves already.


legal, but at this rate nvidia users are going to buy up all the supply and end up scalping it


----------



## TheLostSwede (Nov 18, 2020)

Frick said:


> The problem will be price, at least in Sweden. No store would make a profit on MSRP (according to Sweclockers anyway), and it's unclear what prices we'll be looking at in future.


You always have to add the VAT (moms) in Sweden on top of whatever price you see. I think the review sites in Sweden forget about this.
Also, sometimes Sweclockers are a bit too full of themselves imho.

Looks like a $40-ish price premium here, even after VAT has been added.
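For anyone wondering how the conversion works: Swedish moms is 25%, and US MSRPs are quoted without sales tax. A rough sketch (the MSRP and premium figures here are assumptions for illustration, not store quotes):

```python
# Swedish VAT ("moms") is 25%; US MSRPs exclude sales tax,
# so a rough local price estimate is (MSRP + premium) * 1.25.
VAT_RATE = 0.25

def price_incl_vat(msrp_usd: float, premium_usd: float = 0.0) -> float:
    """Estimated price with Swedish VAT added on top."""
    return (msrp_usd + premium_usd) * (1 + VAT_RATE)

print(price_incl_vat(649))       # 6800 XT MSRP with VAT: 811.25
print(price_incl_vat(649, 40))   # with a ~$40 premium: 861.25
```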


----------



## SIGSEGV (Nov 18, 2020)

birdie said:


> TLDR:
> 
> Truly stellar rasterization performance per watt thanks to the 7nm node.
> Quite poor RTRT performance (I expected something close to RTX 2080 Ti, nope, far from it).
> ...



lmao. I like your insecure comments.
well, as I predicted after I read the scientific article about infinity cache.
Hi Wizz, do you have a plan to bench this card alongside the Ryzen 5000 CPU series? 
It's already there.


----------



## Colddecked (Nov 18, 2020)

Sandbo said:


> Not surprisingly, the ray-tracing performance is quite a bit behind. Makes it difficult to decide for me as I want a card for ray-traced Minecraft.
> Hope they will catch up in the future.



If you need the best RT performance RIGHT NOW, then Nvidia is clearly the way to go. That being said, I won't be surprised if, after a year or two, RT performance of the 6800 and 6800 XT is noticeably better than the 2080 Ti.


----------



## moproblems99 (Nov 18, 2020)

So, did these actually go on sale yet or also sell out in seconds?


----------



## KarymidoN (Nov 18, 2020)

thanks again for the amazing review @W1zzard .
AMD delivered exactly what they promised: a worthy competitor. In some games it beats the 3080 but loses in others, it consumes less power, and it costs less money. I'm not disappointed. I just hope AMD continues the trend and keeps improving to keep NVIDIA in check, making us consumers the winners in this battle.


----------



## Halo3Addict (Nov 18, 2020)

SIGSEGV said:


> lmao. I like your insecure comments.
> well, as I predicted after I read the scientific article about infinity cache.
> Hi Wizz, do you have a plan to bench this card alongside the Ryzen 5000 CPU series?

AMD Radeon Resizable BAR / Smart Access Memory Review - 22 Games Tested (www.techpowerup.com)

AMD's Smart Access Memory feature is highly interesting. It promises a performance boost when the new Radeon RX 6800 Series cards are paired with an AMD Ryzen Zen 3 processor. We extensively test this in a whole article with 22 games at three resolutions, at up to 4K Ultra HD.




Shame about RT performance, but expected since it's their first attempt. Not sure how they are going to handle their SAM marketing after Nvidia came out saying they're going to support a similar feature on BOTH Intel and AMD if AMD is willing to play ball. They must have known it could be implemented on a driver level...


----------



## fancucker (Nov 18, 2020)

So subpar raytracing performance and near raster parity? Congratulations on getting out of the gutter AMD but this is with a node advantage (SS 8nm sucks) and the absence of a proper DLSS alternative. 3080 looks like more value. A shortlived victory before a 7nm ampere refresh or hopper restores the status quo


----------



## tomc100 (Nov 18, 2020)

I am getting this just for the multi-monitor power consumption alone.  That is just beautiful.  Also, I hate Nvidia.  Been buying ATI then AMD gpus since the first 9700 pro.  Had to jump ship to the 970 then 1080ti then rtx 2080 super because AMD didn't have what I needed at the time.  Now I just have to wait for a water cooled version.


----------



## oxrufiioxo (Nov 18, 2020)

Both impressed and disappointed.


----------



## MxPhenom 216 (Nov 18, 2020)

KarymidoN said:


> thanks again for the amazing review @W1zzard .
> AMD delivered exactly what they promised: a worthy competitor. In some games it beats the 3080 but loses in others, it consumes less power, and it costs less money. I'm not disappointed. I just hope AMD continues the trend and keeps improving to keep NVIDIA in check, making us consumers the winners in this battle.



Yep, keeping Nvidias imaginary cards in check with their own imaginary cards...

This year sucks for consumers.


----------



## B-Real (Nov 18, 2020)

Wow, really.

- 4-6% behind the 3080 without SAM
- Costs $50 less
- +6 GB VRAM
- Significantly less power consumption, and therefore just crushing the 3080 in terms of efficiency
- Slightly better temperatures than the reference 3080
- Much quieter than the 3080
- OCs MORE THAN TWICE BETTER than the 3080 (WTF)

And note that 4 new benchmarks (Dirt 5, AC: Valhalla, WD: Legion and Godfall) are not yet included here, and in 3 of those AMD just crushes the 3080.

The only difference is in RT performance. But to be honest, given the quality of RT in most games (and even the number of games supporting it), it isn't a decisive factor given the earlier points.


----------



## R0H1T (Nov 18, 2020)

fancucker said:


> with a node advantage (SS 8nm sucks) and the absence of a proper *DLSS alternative*


Node (full node?) disadvantage, huh? Care to enlighten us, if you have Ampere's perf or perf/W numbers on 7nm?

While SS might be inferior, by how much no one really knows. Anyone claiming otherwise is just blowing hot air off the wrong end.

Yeah sure, people are dying to go DLSS when *4K native* is already an option in most games.


----------



## Ja.KooLit (Nov 18, 2020)

Impressive, AMD. Welcome back. I salute AMD for taking Intel and Nvidia head on. And now a lot of people, including me, can finally say with their heads up that they're proud to run a full AMD system. I'm pretty sure AIBs will have fun OCing this chip. I'm going full AMD. I'm sure patches will come for the ray tracing and the so-called super resolution feature.


----------



## techguymaxc (Nov 18, 2020)

I'm glad AMD has once again returned to competing at the high-end of performance in rasterization.  The ray-tracing performance is unusable, however, at all but the lowest resolutions and without a DLSS competitor, I won't be purchasing an RX 6000 card.  Guess it's back to waiting for NV to make 3000 series cards available in meaningful quantity.


----------



## horsemama1956 (Nov 18, 2020)

birdie said:


> What or who have they destructed? I'm truly curious. As far as I can see these cards are for hardcore AMD fans who will continue to say that RTRT is irrelevant. Yes, it is irrelevant on RTX 6000 hardware. It's very much viable for the RTX 3000 series which thanks to DLSS allows to enable RTRT and game at 4K.


I feel like this is why Nvidia rushed their RT with the 20 series so people would say this. 

Them being first to market allowed them to set a certain benchmarking floor. The problem is like all nVidia tech, support is limited and generally requires their involvement or investment to implement in games. 

The AMD implementation will be limited in comparison, BUT every multiplatform game developed can easily utilize it in some capacity, which I think is good enough for most people. Consoles were always going to dictate how RT would be used since very few devs make high end PC only games any more.

Personally, the "low" setting in RTX-optimized games looks best to me.

nVidia have mastered marketing limited use tech as a selling point knowing few games/software will actually take advantage of it. These replies confirm that.

This applies to DLSS as well. It's great tech but purposely limited to marketable games. There would be no reason to buy a 3080 if a 3060 was leveraging DLSS in every game.


----------



## Jinxed (Nov 18, 2020)

@W1zzard

So exactly why is Nvidia in trouble? I don't get the title. Nvidia has the faster product both in rasterization and raytracing (much faster in raytracing, in fact). There is no AI support on the AMD card that would be similar to DLSS, and no alternative to Tensor Cores. All that on a worse 8nm process (really just an improved 10nm). I'd actually look at it the other way: even with the best available process technology and much work spent on improving power efficiency, AMD was still unable to beat Nvidia. What happens when Nvidia moves to 5nm next gen? Where will AMD find any reserves for improvement? It's not like they can ditch GCN for the second time and get the same improvements (they've finally done it now, removing the last remains of that horrible architecture).

For longevity of the cards, the Tensor Cores and DLSS are actually a perfect solution. You can easily play games in native 4K on an RTX 3080. Games that come out 4 years from now, you might still be able to play comfortably with DLSS on. I agree with you that memory is not as important, simply because you run out of shading power before you run out of memory. But with DLSS, you can actually extend the useful life of a card.

I also couldn't find any DLSS ON raytracing results, while there were in most previous Nvidia reviews. Why?

Otherwise thanks for a great review!


----------



## Dirt Chip (Nov 18, 2020)

In a world without DXR, AMD is GRAND!
But DXR is where everybody seems to be going these days... so nothing new. NV will still dominate, and the whole 10 vs 16 GB RAM thing will be excellent food for endless, worthless forum wars.

If you don't care for DXR (I for one don't care at all), go get that 6800 XT. It is quite perfect.
If you do, then NV; no other choice.


----------



## R0H1T (Nov 18, 2020)

Jinxed said:


> For longevity of the cards, the TensorCores and DLSS are actually a perfect solution.


You think RDNA2-based cards will not improve their performance over time? I know the dank memes about AMD's future potential uplift and GCN on consoles, but do you really think the massive gains for *games* we saw on *Zen 3* chips were just a coincidence?


----------



## medi01 (Nov 18, 2020)

Combining RT on and off results in one chart is one weird decision.


----------



## Night (Nov 18, 2020)

Great to see AMD back in the competition game and on par with Nvidia's finest, but too bad there's no price competition anymore; the 6800 XT and RTX 3080 cost almost the same. I'd personally go with AMD just for the sake of features/upgrades this round (SAM, more VRAM). I'll think twice about whether I want to spend that much on a GPU this time around, and the prices in Europe will be through the roof anyway. 4K 60 fps gaming is still quite expensive.


----------



## medi01 (Nov 18, 2020)

Sandbo said:


> ray-tracing performance


In all those 20 games, chuckle, how could one live without that...


----------



## pantherx12 (Nov 18, 2020)

Good performance, however it seems the Infinity Cache approach to accelerating memory performance can't keep up at 4K.

Based on the reviews I've seen, it loses a lot of performance in most cases at 4K and loses ground to Nvidia.

I mean, it could be a lack of pure grunt, but this is the case even in games where it outperforms at 1440p.

I hope a board partner attempts a GDDR6X version just to see if my theory is correct.

AMD need to get their upscaling tech out ASAP though.


----------



## birdie (Nov 18, 2020)

horsemama1956 said:


> I feel like this is why Nvidia rushed their RT with the 20 series so people would say this.
> 
> Them being first to market allowed them to set a certain benchmarking floor. The problem is like all nVidia tech, support is limited and generally requires their involvement or investment to implement in games.
> 
> ...



NVIDIA's RTRT implementation is *bog standard* and *standardized* as DX12 DXR and the Vulkan RT extensions. Xbox Series X uses D3D12 DXR as well, so there's nothing NVIDIA'esque about RTRT. Not sure about Sony, as they use their own graphics API, but the RDNA 2.0 hardware is very similar.

DLSS on the other hand is 100% proprietary - that's true.


----------



## W1zzard (Nov 18, 2020)

medi01 said:


> Combining RT on and off results in one chart is one weird decision.


How else would you present the data?



pantherx12 said:


> gddr6x


G6X is NVIDIA exclusive from what I understand, also AMD's board partners cannot change memory technologies


----------



## Jinxed (Nov 18, 2020)

Vya Domus said:


> Almost 100W less on average with that insane chunk of power hungry cache, confirmation that Samsung's node is utter crap, my God how the tables have turned. And all of that just so that Nvidia could save probably a couple of bucks per chip.


Then you really have to wonder what AMD is gonna do when Nvidia moves to 5nm TSMC. I don't think AMD is looking forward to that.


----------



## Frick (Nov 18, 2020)

TheLostSwede said:


> You always have to add the VAT (moms) in Sweden on top of whatever price you see. I think the review sites in Sweden forgets about this.
> Also, somethings Sweclockers are a bit too full of themselves imho.
> 
> Looks like a $40-ish price premium here, even after VAT has been added.



I meant the recommended price from AMD, which usually includes VAT. And if past releases are anything to go by, it'll be a while before prices fully settle.


----------



## mb194dc (Nov 18, 2020)

I'd be interested to see what AIB 6800 XTs can do if they push the clocks and power consumption up; there looks to be some room to play with via different BIOS limits and card designs. From Igor's review, the 6800 XT is only hitting 2200 MHz max at 4K and can't push this design much higher. I was thinking we will get versions of the 6800 XT that will do 2400-2500 MHz.


----------



## Hossein Almet (Nov 18, 2020)

Apparently, all the AIB RX 6800 XTs have sold out at my local store. They were all priced at A$1049, which is significantly cheaper than the AIB RTX 3080s. The Strix OC 3080 is $1799, the Trio is $1469. At this rate, the RX 6800 XT is the better buy.


----------



## RainingTacco (Nov 18, 2020)

Compared to Nvidia: shit
Compared to AMD's own shit: impressive gain

Overall for consumers: absolutely no change in pricing schemes, as price fixing continues [for the worse for the 6800, which is complete crap for the price compared to the RTX 3070].


----------



## spnidel (Nov 18, 2020)

W1zzard said:


> How else would you present the data?


longer bar with RT off, shorter bar within longer bar with RT on
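something like this quick mock-up, with made-up fps numbers just to show the overlay idea:

```python
# Text mock-up of the suggested chart: the RT-off result is the full bar,
# and the RT-on result is drawn inside it as the filled portion.
# The game names and fps numbers are invented purely for illustration.
results = {"Game A": (110, 62), "Game B": (96, 58)}  # (RT off, RT on)

SCALE = 2  # fps per character

for game, (off, on) in results.items():
    bar = "#" * (on // SCALE) + "-" * ((off - on) // SCALE)
    print(f"{game:>8} |{bar}| {on} fps RT on / {off} fps RT off")
```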


----------



## ViperXTR (Nov 18, 2020)

Is this like AMD's Pascal moment?


----------



## W1zzard (Nov 18, 2020)

spnidel said:


> longer bar with RT off, shorter bar within longer bar with RT on


Not supported by our charting engine


----------



## B-Real (Nov 18, 2020)

Has anyone realized that the *6800 XT OCs MORE THAN TWICE BETTER than the 3080*? WTF, really. I can't even recall a single AMD card in a decade (maybe more?) that was able to OC better by even a 0.1% margin, not to speak of more-than-double numbers.


----------



## medi01 (Nov 18, 2020)

W1zzard said:


> How else would you present the data?


Split them up and have RT Off only and RT On only?

Or if not, use a color scheme that makes it easy to see when RT is on and when it's not.


----------



## W1zzard (Nov 18, 2020)

medi01 said:


> Or if not, use a color scheme that makes it easy to see when RT is on and when it's not.


I like that idea, let me try


----------



## shk021051 (Nov 18, 2020)

Good review!
great price/performance 
but lack of dlss technology and ray tracing performance make me go with team green


----------



## nguyen (Nov 18, 2020)

Looking forward to Asus TUF 3080 vs TUF 6800 XT battle, can't exactly buy the 3080 FE anywhere outside the US...


----------



## Cheeseball (Nov 18, 2020)

ViperXTR said:


> Is this like AMD's Pascal moment?



This is AMD's Pascal moment, but slightly more affordable compared to NVIDIA's offerings.

They finally fixed multi-monitor now! @W1zzard - Are the memory clocks low when running multi-monitor on both same refresh rate and varying refresh rates? This was a problem with the 5000-series and the older GCN tech (HD 7000 series).


----------



## B-Real (Nov 18, 2020)

fancucker said:


> So subpar raytracing performance and near raster parity? Congratulations on getting out of the gutter AMD but this is with a node advantage (SS 8nm sucks) and the absence of a proper DLSS alternative. 3080 looks like more value. A shortlived victory before a 7nm ampere refresh or hopper restores the status quo



So a technology realistically supported by a handful of games makes the 3080 better value for you? Because that is the only thing you can cling to. Can you list the games whose RT support makes you say "Wow, that is gorgeous!"? So funny, really.


----------



## Turmania (Nov 18, 2020)

AMD leading in performance and power consumption? I never thought that day was a possibility, but it is now. Well done AMD. Competition is perfect for us consumers, so I'm very happy with it.


----------



## Vya Domus (Nov 18, 2020)

RainingTacco said:


> Compared to nvidia: shit
> Compared to AMD own shit: impressive gain
> 
> Overall for consumers: absolutely no change in pricing schemes, as price fixing continues[for the worse for 6800 which is complete crap for the price compared to rtx 3070].



Nice joke, got any more ?


----------



## yeeeeman (Nov 18, 2020)

Vya Domus said:


> Almost 100W less on average with that insane chunk of power hungry cache, confirmation that Samsung's node is utter crap, my God how the tables have turned. And all of that just so that Nvidia could save probably a couple of bucks per chip.


Thing is, nvidia cards are pushed to the max and with a decent undervolt you can quickly get 50W less with the same performance.


----------



## TheLostSwede (Nov 18, 2020)

Frick said:


> I meant the recommended price from AMD, which usually includes VAT. And if past releases are anything to go by, it'll be a while before prices fully settle.


The MSRP doesn't really apply outside of the US though, because, reasons.


----------



## RainingTacco (Nov 18, 2020)

TBH, people who hold either Nvidia or AMD stock should be forbidden from commenting on these reviews. It just becomes an insincere shouting match of pseudo-arguments.


----------



## R0H1T (Nov 18, 2020)

Imagine buying a 3080 at or above $1k or 3090 around $2k or above 

JHH really did a number on Nvidia fans this time around.


----------



## B-Real (Nov 18, 2020)

yeeeeman said:


> Thing is, nvidia cards are pushed to the max and with a decent undervolt you can quickly get 50W less with the same performance.


So undervolting becomes a thing now for people who laughed at AMD owners when they mentioned it for Vega, maybe the 290/390, RX 480, etc.? Interesting (I'm not saying you are part of that group, but you were the one who mentioned it).


----------



## Jinxed (Nov 18, 2020)

Turmania said:


> AMD leading in performance and power consumption? I never thought that day was a possibility, but it is now. Well done AMD. Competition is perfect for us consumers, so I'm very happy with it.


Except that AMD is not leading in performance. Looking at the article, the 6800 XT is 6% behind the RTX 3080.


----------



## birdie (Nov 18, 2020)

B-Real said:


> So a technology realistically supported by a margin of games makes the 3080 better value for you?  Can you list the games that has a RT support that makes you say "Wow, that is gorgeous!"? So funny, really.



RTRT will be implemented in most triple-A titles from now on. AMD fans love to boast about how AMD GPUs are future-proof, only this time around they are anything but.

It's highly unlikely "fine wine" will allow AMD's RTRT performance to increase substantially given the current performance difference.


----------



## RainingTacco (Nov 18, 2020)

Who cares about wattage, as long as it can be cooled with acceptable noise? Anyone who buys a $1k GPU doesn't care about 50W more on the electricity bill.


----------



## horsemama1956 (Nov 18, 2020)

birdie said:


> NVIDIA's RTRT implementation is *bog standard* and *standardized* as DX12 DXR and the Vulkan RT extensions. Xbox Series X uses D3D12 DXR as well, so there's nothing NVIDIA'esque about RTRT. Not sure about Sony, as they use their own graphics API, but the RDNA 2.0 hardware is very similar.
> 
> DLSS on the other hand is 100% proprietary - that's true.


Of course it's standard, I never said it wasn't. It's being implemented to nVidia's standards right now on PC, though. I personally just expect an RT on/off toggle in the future, with "special" extra effects for nVidia.


----------



## Vipeax (Nov 18, 2020)

@W1zzard, would you happen to know why the average power consumption for the new AMD cards is much lower than the average other (also reputable) sites are reporting? Is this a Metro: Last Light-specific thing? Stellar review as always.


----------



## birdie (Nov 18, 2020)

horsemama1956 said:


> Of course it's standard, I never said it wasn't. It's being implemented to nVidia's standards right now on PC, though. I personally just expect an RT on/off toggle in the future, with "special" extra effects for nVidia.



Even if it's called "RTX" in games in reality it's D3D12 DXR - there's no such thing as "nVidia standards right now on PC though". It's like saying that there are two different DirectX'es for AMD and NVIDIA.


----------



## mb194dc (Nov 18, 2020)

B-Real said:


> Has anyone realized that the *6800 XT OCs MORE THAN TWICE BETTER than the 3080*? WTF, really. I can't even recall a single AMD card in a decade (maybe more?) that was able to OC better by even a 0.1% margin, not to speak of more-than-double numbers.



The 5600 XT will also clock about 25-30% over reference if you have the right BIOS.

Let's see what AIBs can do with the 6800 XT for a start; I'm convinced we'll see AIB cards running at least 10% and maybe even 20% faster clocks than this reference version. How that will translate into real-world performance, I guess we have to wait a week until the cards are out.

In Igor's review he also thinks the vanilla 6800 will do 2.5 GHz easily out of the box.


----------



## B-Real (Nov 18, 2020)

birdie said:


> RTRT will be implemented in most triple-A titles from now on. AMD fans love to boast about how AMD GPUs are future-proof, only this time around they are anything but.
> 
> It's highly unlikely "fine wine" will allow AMD's RTRT performance to increase substantially given the current performance difference.


"RTRT will be implemented in most triple-A titles from now on. "

Sorry, is this a fact? Can you quote it from somewhere/someone?



RainingTacco said:


> Who cares about wattage, as long as it can be cooled with acceptable noise? Anyone who buys a $1k GPU doesn't care about 50W more on the electricity bill.



Read back through the Vega reviews and you will see how many green-eyed people cared about it back in the day.



mb194dc said:


> The 5600 XT will also clock about 25-30% over reference if you have the right BIOS.



Yeah, but that was a totally different story.


----------



## W1zzard (Nov 18, 2020)

Vipeax said:


> @W1zzard, would you happen to know why the average power consumption for the new AMD cards is much lower than the average other (also reputable) sites are reporting?


Because Metro Last Light isn't a constant 100% load. It is very dynamic, with scenes that don't always use the GPU at 100%. This gives clever GPU algorithms a chance to make a difference. I find this a much more realistic test than just FurMark or standing still in Battlefield V at highest settings.


----------



## Kallan (Nov 18, 2020)

As a 5900X owner, I found this article to be lackluster as I wanted to see AMD 5000 series CPUs used on the testbed, not an old Intel CPU. I will have to go elsewhere for my benchmark results. But I am extremely pleased to see the AMD Radeon RX 6800 XT perform as well as it did and I expect it will only get better over time.


----------



## Vipeax (Nov 18, 2020)

W1zzard said:


> Because Metro Last Light isn't a constant 100% load. It is very dynamic, with scenes that don't always use the GPU at 100%. This gives clever GPU algorithms a chance to make a difference. I find this a much more realistic test than just FurMark or standing still in Battlefield V at highest settings.


Thanks for the prompt reply. You already replied before I even modified my post haha.


----------



## RainingTacco (Nov 18, 2020)

Kallan said:


> As a 5900X owner, I found this article to be lackluster as I wanted to see AMD 5000 series CPUs used on the testbed, not an old Intel CPU. I will have to go elsewhere for my benchmark results.



Intels are still slightly better in gaming at the current moment, deal with it.


----------



## W1zzard (Nov 18, 2020)

Kallan said:


> As a 5900X owner, I found this article to be lackluster as I wanted to see AMD 5000 series CPUs used on the testbed, not an old Intel CPU. I will have to go elsewhere for my benchmark results.


I have this article in the works, already got 200 results for it, just 60 more test runs or so


----------



## ExcuseMeWtf (Nov 18, 2020)

Would easily take it over the 3080 at the same price: similar or slightly lower rasterization, I don't care about RT (especially if it looks like it does in WD: Legion, LMAO), and the lower power consumption more than makes up for it for me. If there are no weird consumption spikes like on the RTX 3080, you can actually save a little by going for a slightly lower-wattage PSU.


----------



## Searing (Nov 18, 2020)

birdie said:


> TLDR:
> 
> Truly stellar rasterization performance per watt thanks to the 7nm node.
> Finally solved multi-monitor idle power consumption!!
> ...



This kind of complete crap on a site with a review that says the opposite is why forums are a waste of time. Laughable.


----------



## B-Real (Nov 18, 2020)

RainingTacco said:


> Intels are still slightly better in gaming at the current moment, deal with it.


I suggest you stop lying. Lying is bad.






Or check ANY written/video reviews around.


----------



## spnidel (Nov 18, 2020)

W1zzard said:


> Not supported by our charting engine


then add support for it
surely displaying a second bar of a different color can't be that hard to write some code for


----------



## RainingTacco (Nov 18, 2020)

Haven't you seen that this is with better memory? Compare stock AMD vs. stock Intel, smoothbrain xD


----------



## Xuper (Nov 18, 2020)

yeeeeman said:


> Thing is, nvidia cards are pushed to the max and with a decent undervolt you can quickly get 50W less with the same performance.


Well, true, but AMD can do the same. Looking at the perf/watt chart, the difference between the RTX 3070 (best of Nvidia) and the RX 6800 (best of AMD) is large: 69% vs. 100%. Undervolting won't close that gap for them.


----------



## Vya Domus (Nov 18, 2020)

RainingTacco said:


> Who cares about wattage, as long as it can be cooled with acceptable noise? Anyone who buys 1k usd gpu doesnt care about 50W more on electricity bill.



The 6800 XT actually has overclocking headroom. That's why 50 W (more like 100 W, really) matters.

Nice try though.


----------



## Jinxed (Nov 18, 2020)

B-Real said:


> I suggest you stop lying. Lying is bad.
> 
> 
> 
> ...


There is no difference in 4K gaming between those CPUs. Why do you post 1080p results? Are you going to use a Ryzen 5900 for 1080p gaming? For your sake, I hope not.


----------



## Sandbo (Nov 18, 2020)

B-Real said:


> Has anyone realized that the *6800XT OCs MORE THAN TWICE AS WELL as the 3080*?  WTF, really. I can't even recall a single AMD card in a decade (maybe more?) that was able to OC better by even a 0.1% margin, not to speak of more than double the numbers.


Not to argue with you, but if you water cool a Vega 64, it overclocks pretty well.
I got mine sustaining 1700 MHz, while the stock cooler averages only 1400-1500 MHz.
Certainly, I believe you meant without water cooling.


----------



## birdie (Nov 18, 2020)

Kallan said:


> As a 5900X owner, I found this article to be lackluster as I wanted to see AMD 5000 series CPUs used on the testbed, not an old Intel CPU. I will have to go elsewhere for my benchmark results.



Fanboyism at its finest. There's almost zero difference between the 10900K and 5950X at resolutions above 1080p, and the 10900K is as fast or faster than the 5950X at 1080p.


----------



## B-Real (Nov 18, 2020)

RainingTacco said:


> Haven't you seen that this is with better memory? Compare stock AMD vs. stock Intel, smoothbrain xD


Better memory? It's better memory WITH Intel and better memory WITH AMD.

And, as I said, check EVERY review available online, written or in video format. Every review says AMD beats Intel in gaming. I understand you can't stand this situation, but it is a fact.




Jinxed said:


> There is no difference in 4K gaming between those CPUs. Why do you post 1080p results? Are you going to use a Ryzen 5900 for 1080p gaming? For your sake, I hope not.


Ok, so now that AMD is finally able to beat Intel in gaming, 1080p results are irrelevant... That is truly ingenious! You know, the only resolution at which a reviewer can distinguish CPUs from one another is FHD (or even lower). That was the case with Ryzen 2000 vs. Coffee Lake and Ryzen 3000 vs. Comet Lake, and it is the case with Ryzen 5000 vs. Comet Lake. A flagship GPU at FHD brings out the only difference between CPUs; with a flagship GPU at 1440p or 4K, or with a non-flagship GPU even at FHD, performance is the same on every CPU (at least across several generations).


----------



## HD64G (Nov 18, 2020)

So, Big Navi and Zen 3 are both the best architectures for gaming now (both for efficiency and for performance up to 1440p), and they bring even better results when combined. What a tech revolution from AMD. Kudos to them.


----------



## Jinxed (Nov 18, 2020)

B-Real said:


> Better memory? Its better memory WITH Intel and better memory WITH AMD.
> 
> And, as I said, check EVERY review available online. Written or in video format. Every review says AMD beats Intel in gaming. I understand you can't stand this situation, but this is the fact.


Then link actual 4K results where AMD beats top-of-the-line Intel CPUs in gaming. OR even the 9900K.


----------



## RainingTacco (Nov 18, 2020)

Wonder how AMD's stock will fare today. The important question is whether the market considers these GPUs a boom or a bust.


----------



## Xuper (Nov 18, 2020)

Vya Domus said:


> The 6800 XT actually has overclocking headroom. That's why 50 W (more like 100 W, really) matters.
> 
> Nice try though.


On every forum, NV people were focusing on perf/watt and screaming like a missile... right now their gun is DLSS/RT.


----------



## yeeeeman (Nov 18, 2020)

Xuper said:


> Well, true, but AMD can do the same. Looking at the perf/watt chart, the difference between the RTX 3070 (best of Nvidia) and the RX 6800 (best of AMD) is large: 69% vs. 100%. Undervolting won't close that gap for them.


I don't know about the 3070, but the 3080 can go from 320 W to 220 W just by undervolting. Performance will decrease by 5% tops. This means Nvidia runs these chips well over their sweet spot.
But yeah, no doubt about it, AMD has a clear advantage. This efficiency also lets them overclock better, and 10% better performance from a basic OC is very good. Quite sure it can reach 3090 performance via OC, which is awesome. Still, if you want RT in your games... I wouldn't buy a Radeon card.
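A quick sanity check of the numbers above (this just reuses the quoted 320 W → 220 W and ~5% figures, not new measurements; stock performance is normalized to 1.0):

```python
def perf_per_watt_gain(watts_before: float, watts_after: float,
                       perf_retained: float) -> float:
    """Relative perf/W improvement after an undervolt.

    perf_retained is the fraction of stock performance kept
    (0.95 == a 5% performance loss).
    """
    before = 1.0 / watts_before            # stock perf/W (performance = 1.0)
    after = perf_retained / watts_after    # undervolted perf/W
    return after / before

# The 320 W -> 220 W undervolt at a ~5% performance cost:
print(f"{perf_per_watt_gain(320, 220, 0.95):.2f}x")  # ~1.38x better perf/W
```

So even with the 5% loss, the undervolt would net roughly a 38% efficiency improvement, which is why the "sweet spot" argument keeps coming up.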


----------



## Prince Valiant (Nov 18, 2020)

Thanks for the review.



techguymaxc said:


> I'm glad AMD has once again returned to competing at the high-end of performance in rasterization.  The ray-tracing performance is unusable, however, at all but the lowest resolutions and without a DLSS competitor, I won't be purchasing an RX 6000 card.  Guess it's back to waiting for NV to make 3000 series cards available in meaningful quantity.


RT is the antithesis of DLSS. You increase image quality only to decrease resolution, and IQ, for a performance boost.



Jinxed said:


> Then you really have to wonder what AMD is gonna do when Nvidia moves to 5nm TSMC. I don't think AMD is looking forward to that.


I imagine AMD will make the same move as soon as the node is available. Do you have insider information that TSMC plans on barring AMD from using the node to make Nvidia look better?


----------



## W1zzard (Nov 18, 2020)

medi01 said:


> Split them up and have RT Off only and RT On only?
> 
> Or if not, have some color scheme that allows to easily grasp when RT is on and when not.











AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble
www.techpowerup.com



Reload. How's that?


----------



## Vya Domus (Nov 18, 2020)

Xuper said:


> right now their gun is DLSS/RT.



What's funny is that almost everyone lamented those back then, even Nvidia fanboys.



yeeeeman said:


> I don't know about the 3070, but the 3080 can go from 320 W to 220 W just by undervolting. Performance will decrease by 5% tops. This means Nvidia runs these chips well over their sweet spot.



I am sure the 6800 XT undervolts well too, but I guess we'll pretend that's not the case, right?


----------



## Pumper (Nov 18, 2020)

How can it be such a power hog when playing simple videos but eat 100W less when gaming?


----------



## Sandbo (Nov 18, 2020)

W1zzard said:


> AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble
> 
> 
> The Radeon RX 6800 XT is a worthy opponent to the NVIDIA GeForce RTX 3080. It offers similar performance at better power efficiency and lower noise levels. In our Radeon RX 6800 XT review, we also take a closer look at raytracing and "Rage Mode", a new 1-click method for overclocking.
> ...


Definitely preferred; it might be good enough that the graphs don't have to be split.


----------



## Prince Valiant (Nov 18, 2020)

W1zzard said:


> AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble
> 
> 
> The Radeon RX 6800 XT is a worthy opponent to the NVIDIA GeForce RTX 3080. It offers similar performance at better power efficiency and lower noise levels. In our Radeon RX 6800 XT review, we also take a closer look at raytracing and "Rage Mode", a new 1-click method for overclocking.
> ...


It's immediately noticeable.


----------



## RainingTacco (Nov 18, 2020)

Pumper said:


> How can it be such a power hog when playing simple videos but eat 100W less when gaming?



As someone said: horrible hardware encoder.


----------



## W1zzard (Nov 18, 2020)

Pumper said:


> How can it be such a power hog when playing simple videos but eat 100W less when gaming?


Memory runs at full speed during media playback; see the table on page 32.

Previous architectures didn't do that, so I think it'll be fixed soon.


----------



## hardcore_gamer (Nov 18, 2020)

This would have been an impressive card at $500.


----------



## Kallan (Nov 18, 2020)

RainingTacco said:


> Intels are still slightly better in gaming at the current moment, deal with it.



Sure, keep believing that! LOL


----------



## Frick (Nov 18, 2020)

TheLostSwede said:


> The MSRP doesn't really apply outside of the US though, because, reasons.



AMD's recommended prices were specifically 6990 and 6249 respectively. Those are the numbers they gave for PR purposes, which aren't just US MSRP + VAT.


----------



## spnidel (Nov 18, 2020)

hardcore_gamer said:


> This would have been an impressive card at $500.


yeah, as much as I love competition, and as fast as the card might be, I'm not comfortable with prices climbing past $500 over the last couple of years


----------



## Vya Domus (Nov 18, 2020)

Pumper said:


> How can it be such a power hog when playing simple videos but eat 100W less when gaming?



I can hardly call that a power hog, but there are almost always media power consumption issues with new cards, and they usually get fixed sooner or later.



RainingTacco said:


> As someone said: horrible hardware encoder.



Man, does Nvidia's damage control task force consist of just you? Rough times to be an Nvidia fan, huh?


----------



## Jinxed (Nov 18, 2020)

Prince Valiant said:


> Thanks for the review.
> 
> 
> RT is the antithesis of DLSS. You increase image quality only to decrease resolution, and IQ, for a performance boost.
> ...


Oh hello??? Is anybody home? The RTX 3000 series is on the 8nm Samsung node, which is just an improved 10nm. AMD's RX 6000 series is on the best and latest node from TSMC, a true 7nm node. So right now AMD has the node advantage, but they still trail Nvidia in rasterization, and significantly so in raytracing. If both AMD and Nvidia move to TSMC 5nm, AMD's node advantage is gone. Do you now understand AMD's situation?


----------



## B-Real (Nov 18, 2020)

Jinxed said:


> Then link actual 4K results where AMD beats top-of-the-line Intel CPUs in gaming. OR even the 9900K.


Why should I link 4K results? As I mentioned, there is no real difference between CPUs that you can experience in real life above FHD, or even at FHD with a non-flagship GPU. But you were the ones screaming about those same differences at FHD when Intel was in the lead. I'm sorry you can't understand it.


----------



## Cheeseball (Nov 18, 2020)

Vya Domus said:


> I am sure the 6800XT undervolted as well but I guess we'll pretend like that not the case, right ?



I'd actually like to see this. I undervolted my 3080 to a locked 0.825 V and only lost around 3 FPS in Cold War (Zombies mode, DLSS off) and PUBG, and my UPS reported a nice 85 W reduction from 355 W. It'd be nice to see the 6800 XT running below 180 W at stock performance.



Pumper said:


> How can it be such a power hog when playing simple videos but eat 100W less when gaming?



Drivers.


----------



## moproblems99 (Nov 18, 2020)

Vaporware again. Sigh.



yeeeeman said:


> I don't know about the 3070, but the 3080 can go from 320 W to 220 W just by undervolting. Performance will decrease by 5% tops. This means Nvidia runs these chips well over their sweet spot.



Said Every Vega owner.


----------



## anachron (Nov 18, 2020)

B-Real said:


> "RTRT will be implemented in most triple-A titles from now on. "
> 
> Sorry, is this a fact? Can you quote it from somewhere/someone?


I'm not pro-Nvidia or pro-AMD; I was really interested in the results of this new card generation, but I have to say the RT performance is quite disappointing.
I enjoyed RT in Control, I enjoyed RT in Legion, and I look forward to RT in Minecraft and maybe Cyberpunk 2077 if I like the game. But even if the rasterization performance were vastly superior to my 2070 Super, I would feel bad spending that much money on a 6800/6800 XT only to get worse performance when it comes to RTX+DLSS, even if it's only in a few games. In the case of Legion, I think my 2070 Super without RT but with DLSS may even get better results than the 6800, although with a little quality loss.

I understand that not everyone is into RT, and the new AMD cards are indeed very good in that case, but if you have any interest in RT, they are definitely not up to the task at the moment.


----------



## Jinxed (Nov 18, 2020)

B-Real said:


> Why should I link 4K results? As I mentioned, there is no real difference between CPUs that you can experience in real life above FHD, or even at FHD with a non-flagship GPU. But you were the ones screaming about those same differences at FHD when Intel was in the lead. I'm sorry you can't understand it.


Me? Where?


----------



## dayne878 (Nov 18, 2020)

I'm excited for Big Navi because of how it's going to affect Nvidia. I'm not going to rush out and buy an AMD card (my g-sync monitor won't support AMD cards with g-sync on and I value that far more than better performance). It's the same excitement when I see AMD's CPUs coming out, though I'm more likely to try AMD with my next build in a few years than I am to try AMD GPUs.

I'm still going to seek out a 3080 when stock is no longer an issue and it can be had at close to MSRP, but it's exciting to see competition in the GPU realm at the high end.


----------



## B-Real (Nov 18, 2020)

hardcore_gamer said:


> This would have been an impressive card at $500.


So you're saying the 3080 isn't impressive for $700 either, right?


----------



## RainingTacco (Nov 18, 2020)

RT shines with DLSS; without DLSS, RT is just plain bad. AMD has no DLSS and bad RT, therefore Nvidia's RT+DLSS combo VASTLY outmatches AMD's RT. Maybe next time; as AMD fans love to say, next time we'll surely get those Nvidia scrubs!


----------



## TheLostSwede (Nov 18, 2020)

Frick said:


> AMD's recommended prices were specifically 6990 and 6249 respectively. Those are the numbers they gave for PR purposes, which aren't just US MSRP + VAT.


Ah, ok, I had no idea they'd released local MSRP pricing, that's not very common. Yeah, that's not really going to cut it.


----------



## z1n0x (Nov 18, 2020)

RTRT perf in Metro is decent, but in Control it's crap, hmm...
I'm curious what RTRT performance will look like in new titles, ones that are built not only with Nvidia in mind.
I feel like there is more to the RTRT story than what we're seeing here.


----------



## Vya Domus (Nov 18, 2020)

Jinxed said:


> Do you now understand AMDs situation?



Do you ?

AMD is making do with a narrower bus and slower memory; they can always bump those up and gain a considerable chunk of performance just by doing that and nothing more. Also, caches scale really well with newer nodes in terms of power, so there is a very good chance AMD will trash Nvidia in performance/watt even harder when both get to the next node.

Also, you do realize AMD is probably already working on a future 5nm design by now. Meanwhile, I am led to believe Nvidia is figuring out how to bring Ampere to 7nm? Do I have to explain how things aren't exactly going well for them if that's the case?

Seems like Nvidia has just done an Intel-style upsy-daisy by screwing up their node situation.


----------



## B-Real (Nov 18, 2020)

So what we have learned from the 5000 series CPU and the 6000 series GPU launch is that

-FHD results in CPU benchmarks are not relevant any more (in fact they weren't really relevant earlier either, but for some reason they were important to the blue team)
-NV cards can be UV'd to get better power consumption
-NV fans hold on to the very last point they can (RT, with only a few games available), when a rival generation supporting RT for the first time is able to match the last-gen (first-gen RT) flagship green card

Love it, really.


----------



## regs (Nov 18, 2020)

Could the software scheduler be affecting Nvidia's performance in DX12 and Vulkan at 1080p? Is it software or hardware being tested?


----------



## Fluffmeister (Nov 18, 2020)

Certainly an impressive effort from AMD; shame that stock levels for these seem to be non-existent too.


----------



## Fleurious (Nov 18, 2020)

Actually impressed with what AMD has managed to pull off with these cards.  Performance is where I expected them to land.  I fully expect nvidia to respond with their Ti cards on a better node.


----------



## Colddecked (Nov 18, 2020)

Vya Domus said:


> Do you ?
> 
> AMD is making do with a narrower bus and slower memory; they can always bump those up and gain a considerable chunk of performance just by doing that and nothing more. Also, caches scale really well with newer nodes in terms of power, so there is a very good chance AMD will trash Nvidia in performance/watt even harder when both get to the next node.
> 
> ...



Intel-style is a bit harsh, but yeah, not paying up for TSMC 7nm might have been a mistake... We'll see if Big Navi stock improves, which has been predicted to happen once AIB cards start rolling in...


----------



## anachron (Nov 18, 2020)

B-Real said:


> So what we have learned from the 5000 series CPU and the 6000 series GPU launch is that
> 
> -FHD results in CPU benchmarks are not relevant any more (in fact they really weren't earlier too, but for some reason it was important for the blue team)
> -NV cards can be UVd to get better power consumption
> ...


Or you know, regarding the last part, some people just like RT for what it is. A lot of people in this thread have praised AMD for the achievement with this card even if they prefer an Nvidia card because of RT. I can't understand why it always ends in an AMD vs. Nvidia war. I was disappointed with Nvidia's choice to go with 8 GB of RAM on the 3070 just as much as I'm disappointed with the RT performance of the new AMD cards.

Edit: for the record, I will quite probably recommend an AMD card to people in my entourage who are not interested in RT.


----------



## Gan77 (Nov 18, 2020)

You have an error in the description: "AMD is using *Alpha & Omega Semiconductor DrMOS* components throughout the VRM".

*In fact it's the TDA21472 from Infineon Technologies.*


----------



## Colddecked (Nov 18, 2020)

anachron said:


> Or you know, regarding the last part, some people just like RT for what it is. A lot of people in this thread have praised AMD for the achievement with this card even if they prefer an Nvidia card because of RT. I can't understand why it always ends in an AMD vs. Nvidia war. I was disappointed with Nvidia's choice to go with 8 GB of RAM on the 3070 just as much as I'm disappointed with the RT performance of the new AMD cards.



The difference is, 8 GB will always be 8 GB, while RT performance will likely improve, and AMD does have their DLSS-like solution in the works, which will also help.


----------



## horsemama1956 (Nov 18, 2020)

birdie said:


> Even if it's called "RTX" in games in reality it's D3D12 DXR - there's no such thing as "nVidia standards right now on PC though". It's like saying that there are two different DirectX'es for AMD and NVIDIA.


Good lord.


Slow down.

You're too focused on a single word. It was called RTX in games because Nvidia paid the developers money and helped to implement it so they could market their hardware for this purpose. Like they do with all of their tech, right?

What my initial post was saying is that Nvidia themselves rushed RT onto the 20 series so they could pretty much become synonymous with ray tracing (which they have), even though they themselves are the only ones implementing it, usually with incentives. Still with me?

As of right now, RTX is what people associate as a standard on PC. Not THE standard, but how Nvidia thinks it should work.

A lower setting for cheaper hardware, a medium setting for higher-end stuff, and finally the ultra setting for their enthusiast market.

In the future, now that RDNA2 is here and the new consoles have arrived, we will likely just have a simple setting to enable or disable, BUT there will be an RTX setting in specific games (just like PhysX) where buying an Nvidia card gets you "better" effects.

Do you understand? The scope of RT we will see on consoles is what PC is going to get, since no one makes huge, graphics-heavy PC games anymore.

RTX will still be a thing. Just a premium thing. I can totally see Nvidia releasing GTX cards that do the simpler RT that consoles will target, and selling RTX cards that handle a heavier load in Nvidia-sponsored games (again, like PhysX).


----------



## z1n0x (Nov 18, 2020)

Fleurious said:


> I fully expect nvidia to respond with their Ti cards on a better node.


That's not how it works. Maybe a new stepping with a bit better yields and perf/W, but it won't be anything major.


----------



## Vya Domus (Nov 18, 2020)

anachron said:


> I can't understand why it always end in an AMD vs Nvidia war. I was disappointed with the nvidia choice to go with 8gb of ram on the 3070 just as much as i'm disapointed with the RT performance of the new AMD card.



I'll tell you why: because some insist a 50% performance hit is so much better than a 60% hit.

No, they're both crap. RT is still not ready for prime time.


----------



## anachron (Nov 18, 2020)

Colddecked said:


> The difference is, 8 GB will always be 8 GB, while RT performance will likely improve, and AMD does have their DLSS-like solution in the works, which will also help.


Indeed. I don't intend to change my graphics card right now, so if RT performance does improve and the AMD equivalent of DLSS proves to be worth it by the time I do, I would most certainly be interested in an AMD graphics card.


----------



## kings (Nov 18, 2020)

Performance is as expected: in rasterization slightly below the 3080 (on average), in RT it loses a lot, but in compensation it's a little cheaper, has more VRAM, and consumes significantly less power.

Overall, it's a very solid product from AMD, the likes of which hasn't been seen in years. This was long overdue after several disappointments.

On another note, Nvidia definitely messed up going with Samsung. It is not surprising that Nvidia is already placing orders for TSMC 5nm.


----------



## N3M3515 (Nov 18, 2020)

MxPhenom 216 said:


> You can overclock a 3080 too...



6800 XT OC: +9.1%
RTX 3080 OC: +3.9%

+5% for the Radeon; now add a Ryzen 5 and an X570 or B550 mobo...
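The arithmetic behind that +5% (a trivial check using the two OC gains above; it assumes the cards start at rough performance parity before overclocking):

```python
# Relative standing after both cards are overclocked, starting from parity.
radeon_oc = 1.091   # 6800 XT: +9.1% from overclocking
geforce_oc = 1.039  # RTX 3080: +3.9% from overclocking

net_advantage = radeon_oc / geforce_oc - 1.0
print(f"{net_advantage:.1%}")  # ~5.0% in the Radeon's favor
```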


----------



## Charcharo (Nov 18, 2020)

"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."

W1zzard, no offence, but this part is wrong.
There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein II with manually enabled maximum settings (above Mein Leben) at 1440p.

For people who mod, the actual PCMR users, not just the ones who think opening an ini file every 5 years makes them hardcore, this matters too.

More VRAM is always good. Personally, 8 GB is obsolete for me at 1440p and 4K. 10 GB is not, but it's close, since I can break it if I want to. Still, just because you never used something does not mean it isn't important.


----------



## W1zzard (Nov 18, 2020)

medi01 said:


> Split them up and have RT Off only and RT On only?
> 
> Or if not, have some color scheme that allows to easily grasp when RT is on and when not.





Sandbo said:


> Definitely preferred; it might be good enough that the graphs don't have to be split.





Prince Valiant said:


> It's immediately noticeable.


and RT charts updated once again. I like, thanks for the feedback


----------



## B-Real (Nov 18, 2020)

anachron said:


> Or you know, regarding the last part, some people just like RT for what it is. A lot of people in this thread have praised AMD for the achievement with this card even if they prefer an Nvidia card because of RT. I can't understand why it always ends in an AMD vs. Nvidia war. I was disappointed with Nvidia's choice to go with 8 GB of RAM on the 3070 just as much as I'm disappointed with the RT performance of the new AMD cards.
> 
> Edit: for the record, I will quite probably recommend an AMD card to people in my entourage who are not interested in RT.


There is no problem when someone says he is disappointed with the RT performance (in my opinion, matching a rival's first-gen flagship with your own first-gen non-flagship card is not disappointing, at least not the way it is portrayed) but that in other aspects it's a great card, and of course these people are not the ones to whom I addressed my comment.


----------



## steen (Nov 18, 2020)

Colddecked said:


> If you need the best RT performance RIGHT NOW, then Nvidia is clearly the way to go.  That being said, I won't be surprised if after a year or two, RT performance of the 6800 and 6800xt is noticeably better than 2080ti.


A lot will depend on which arch is better suited to inline RT with future titles. I'd hazard most titles are optimised for Nv ATM (obviously) where run-time optimisation will be trickier for AMD. I guess if AMD have a stable software stack they can build on performance from there.



W1zzard said:


> G6X is NVIDIA exclusive from what I understand, also AMD's board partners cannot change memory technologies


Exclusive insofar as implementing the tech in their memory controllers. I think AMD is unlikely to use GDDR6X even for their next gen.



pantherx12 said:


> Good performance, however it seems the infinity cache approach to accelerating memory performance can't keep up with 4k performance.


It's a bit early to tell, and it may still require driver tuning. It may be a case of frametime being more ALU-limited at 4K, giving Ampere the advantage.



> I hope a board partner attempts a GDDR6X version just to see if my theory is correct.


Not possible.


----------



## okbuddy (Nov 18, 2020)

Sandbo said:


> While being 9% cheaper, I consider that a win!
> Certainly, they might just end up being the same (and ridiculously high) price.



Wait, if AMD keeps fixing the drivers, will the 6800 XT finally be as good as the 3080, like a 0% margin?


----------



## Jinxed (Nov 18, 2020)

Vya Domus said:


> Do you ?
> 
> AMD is making do with a narrower bus and slower memory; they can always bump those up and gain a considerable chunk of performance just by doing that and nothing more. Also, caches scale really well with newer nodes in terms of power, so there is a very good chance AMD will trash Nvidia in performance/watt even harder when both get to the next node.
> 
> ...


Actually, you don't. AMD in fact HAD to go for a feature like Infinity Cache. It was not an option, it was a requirement, because of their raytracing solution. If you look at the RDNA2 presentation slides, there is a clear line that says:
"4 texture OR ray ops/clock per CU"







https://www.techpowerup.com/img/m7WjdguDI7kJLI8C.jpg


Now what do you think that means? I'll enlighten you. Remember there is such a thing as the bounding volume hierarchy (BVH). That is a big chunk of data, as it holds the bounding boxes for all objects in the scene to help optimize ray intersections. Unfortunately for AMD, as you see in the slide, their cores cannot perform raytracing operations at the same time as texturing operations, unlike in Nvidia's design. Even worse, they are using the same memory (as AMD repeatedly stated) as they use for texture data. If AMD GPUs did not have the Infinity Cache, they would be in huge trouble, as their per-core cache would keep being invalidated all the time, having to dump the BVH data and replace it with texture data (and vice versa). And you can see Big Navi paying the price for that in 4K and raytracing.
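To put a rough number on "a big chunk of data", here is a back-of-envelope estimate of BVH size. The node layout is an illustrative assumption (a binary BVH with 32-byte nodes), not AMD's or anyone's actual format:

```python
def bvh_size_bytes(num_triangles: int, node_bytes: int = 32) -> int:
    """Approximate memory footprint of a binary BVH.

    A binary BVH over N primitives has about 2N - 1 nodes; 32 bytes per
    node assumes a compact AABB (6 floats) plus two child indices.
    Real formats differ -- this is only an order-of-magnitude sketch.
    """
    num_nodes = 2 * num_triangles - 1
    return num_nodes * node_bytes

# A scene with one million triangles:
mb = bvh_size_bytes(1_000_000) / 1e6
print(f"~{mb:.0f} MB of BVH data")  # ~64 MB
```

On those assumptions, even a mid-sized scene's BVH is far larger than any per-CU cache but would fit comfortably in a 128 MB last-level cache, which is consistent with the argument above.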


----------



## birdie (Nov 18, 2020)

Charcharo said:


> More VRAM is always good.



More VRAM is always more expensive, especially for NVIDIA, which uses GDDR6X that is exclusive to it.

FTFY.

And these two games are outliers, and I presume this could have been fixed if they had been released recently. Lastly, good luck finding any visual differences between Uber and Ultra textures on your 4K monitor.


----------



## B-Real (Nov 18, 2020)

birdie said:


> More VRAM is always more expensive, especially for NVIDIA, which uses GDDR6X that is exclusive to it.


And why is it our problem? NV chose this way, AMD chose another. Anyway, the 3080 probably won't have a problem with its 10 GB of VRAM, but the 3070 will.


----------



## regs (Nov 18, 2020)

Charcharo said:


> "Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."
> 
> W1zzard, no offence, but this part is wrong.
> There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein II with manually enabled maximum settings (above Mein Leben) at 1440p.
> ...


10 GB does affect DOOM Eternal performance a bit, but at 8K. At 4K, 8 GB is more than sufficient for the coming years.








Review of the NVIDIA GeForce RTX 3080/3090 Founders Edition and tests on an 8K screen: "The NVIDIA 30-series reviews we published last month departed from our usual format." (3dnews.ru)


----------



## Cheeseball (Nov 18, 2020)

B-Real said:


> -NV cards can be UVd to get better power consumption



To add clarity to this, they can be undervolted (I would assume 0.850 V, as I'm at 0.825 V for a 1% loss) to just slightly below the peak gaming power usage of the 6800 XT (284 W) and still retain stock performance. Samsung 8nm is definitely inferior to TSMC 7nm, but not by much.

I'd like to see someone try undervolting (not overclocking) the 6800 XT. I would be seriously impressed if it can get below 240W.


----------



## Chomiq (Nov 18, 2020)

6800 is surprisingly in stock. 6800 XT - not so much, but price matches 3080 FE.


----------



## W1zzard (Nov 18, 2020)

Charcharo said:


> "Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."
> 
> Wizzard, no offence, but this part is wrong.
> There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein 2 with manually enabled maximum settings (above Mein Leben) at 1440p.
> ...


Yeah, no doubt, you can always make games run badly by changing settings or replacing textures. These are very much edge cases, maybe 1,000 gamers out of 100 million? (making up random numbers)

More = good, but more = $, so more != good


----------



## Cheeseball (Nov 18, 2020)

birdie said:


> More VRAM is always more expensive, especially for NVIDIA, which uses GDDR6X that is exclusive to it.



This is probably the reason why they stuck with a 320-bit bus and limited the RTX 3080 to just 10 GB. Having more exclusive/proprietary/expensive GDDR6X chips would've pushed up the price.


----------



## anachron (Nov 18, 2020)

birdie said:


> More VRAM is always more expensive, especially for NVIDIA, which uses GDDR6X that is exclusive to it.
> 
> FTFY.
> 
> And these two games are outliers; I presume they could have been fixed had they been released more recently. Lastly, good luck finding any visual differences between Uber and Ultra textures on your 4K monitor.


The thing is, benchmarks are usually run with nothing else on the test bench. I did run into the 8 GB VRAM limit when using the high-res pack in WD:L while W1zz did not, because I usually have a bunch of things running on my other monitor, and some (a YouTube video playing, for example) add to VRAM usage. So, while I could do without the high-res pack, it does not seem very future-friendly if I already potentially hit the limit.


----------



## Kallan (Nov 18, 2020)

birdie said:


> Fanboyism at its finest. There's almost zero difference between 10900K and 5950X at resolutions above 1080p and 10900K is as fast or faster than 5950X at 1080p.


That might not be as true as you think as SAM will benefit AMD CPU owners, here is one example (starts at 16:57);


----------



## Jinxed (Nov 18, 2020)

anachron said:


> Edit : for the record, i will quite probably recommend an AMD card for people in my entourage that are not interested in RT.


That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of the cards: just as you can play games at native 4K today on an RTX 3080 (DLSS off), you will be able to play new games 4 years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.


----------



## Chomiq (Nov 18, 2020)

Chomiq said:


> View attachment 176134
> 6800 is surprisingly in stock. 6800 XT - not so much, but price matches 3080 FE.


----------



## Jinxed (Nov 18, 2020)

Vya Domus said:


> No, they're both crap. RT is still not ready for prime time.


That's interesting  Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing their own raytracing implementation (even though inferior to Nvidia's)? Why do the consoles now support raytracing?


----------



## TheinsanegamerN (Nov 18, 2020)

spnidel said:


> yeah, as much as I love competition, and as fast as the card might be, I'm not comfy with the price hike over $500 over the last couple of years


$500 was a long time ago, when we didn't need expensive GDDR6/6X buses, more expensive ray tracing hardware, or more complicated coolers to deal with hotspots from tiny nodes, and new nodes were regularly coming out. And of course you can't forget inflation: with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time, inflation is going to occur.

Also, remember that the 8800 Ultra was over $800 in 2007.






New Ultra High End Price Point With GeForce 8800 Ultra (www.anandtech.com)
				




These high prices are nothing new.


----------



## anachron (Nov 18, 2020)

Jinxed said:


> That would be a mistake. AMD cards do not have anything similar to DLSS. DLSS adds a significant boost to the longevity of the cards: just as you can play games at native 4K today on an RTX 3080 (DLSS off), you will be able to play new games 4 years from now with DLSS on. You will not be able to do that on AMD's RX 6000 series.


It will only really be true if all new games support DLSS (which is not the case currently; look at Valhalla, for example) and if the equivalent solution AMD announced isn't up to the task. It's not an easy choice, as it depends on a lot of things you can't really predict.


----------



## TheinsanegamerN (Nov 18, 2020)

Charcharo said:


> "Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 offers 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. GTX 1060 will not drive 4K, no matter if it has 3 GB or 6 GB. Game developers will certainly make use of the added memory capacity on the new consoles, we're talking 8 GB here, as a large portion of the console's total memory is used for the OS, game and game data. I think I'd definitely be interested in a RX 6700 Series with just 8 GB VRAM, at more affordable pricing. On the other hand, AMD's card is cheaper than NVIDIA's product, and has twice the VRAM at the same time, so really no reason to complain."
> 
> Wizzard, no offence, but this part is wrong.
> There is already a difference between 8 GB and 10/11/12 GB in DOOM Eternal at 4K, and in Wolfenstein 2 with manually enabled maximum settings (above Mein Leben) at 1440p.
> ...



Wolfenstein: The New Order was one of the two games to expose the limitations of the 2 GB framebuffer on the 680/770 cards way back when, the other being Forza 4. But most other games ran perfectly fine; by the time the 2 GB limit actually became significant, the 680 performance class was in the range of the 4 GB 560s and 770 usage was nearly nonexistent.

Not saying RAM limits can't happen, but the doom-and-gloom over Nvidia's 10 GB is way overhyped. A handful of games with manual settings that obliterate VRAM usage != 10 GB not being enough for 99% of PC gamers, even at the high end.


----------



## RainingTacco (Nov 18, 2020)

anachron said:


> It will only really be true if all new games support DLSS (which is not the case currently; look at Valhalla, for example) and if the equivalent solution AMD announced isn't up to the task. It's not an easy choice, as it depends on a lot of things you can't really predict.



It will only matter whether demanding games, i.e. most AAA titles, get DLSS. For the rest (indie or AA games) you don't need it, as they will run at 4K native.


----------



## Sandbo (Nov 18, 2020)

TheinsanegamerN said:


> $500 was a long time ago, when we didnt need expensive DDR6/X buses, more expensive tracing, more complicated coolers to deal with hotspots from tiny nodes, and new nodes were regularly coming out. And of course you cant forget inflation, with two massive stimulus packages (in the US) in the last 10 years and tanking interest rates for most of that time inflation is goign to occur.
> 
> Also, remember that the 8800 ultra was over $800 in 2007.
> 
> ...


LOL. I was a proud 8800 GTS 320 MB owner back then, until a month later they dropped the 8800 GT with hardware HD decoding.


----------



## Cheeseball (Nov 18, 2020)

Jinxed said:


> That's interesting  Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing their own raytracing implementation (even though inferior to Nvidia's)? Why do the consoles now support raytracing?



While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If enabling RT cost only a 10-20% performance hit (like doing 8x AA in previous years) while giving a noticeable uplift in graphics fidelity, it would be acceptable. But a drop of 50% or more? No.

Also, you can safely assume that games on consoles will not be using RT at their max, but only implement subtle visual improvements (like light mirror effects and such).


----------



## Jinxed (Nov 18, 2020)

Kallan said:


> That might not be as true as you think as SAM will benefit AMD CPU owners, here is one example (starts at 16:57);
> 
> 
> 
> ...


You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet, HardwareUnboxed? Oh come on. You can't trust their results at all. Their results have been significantly tilted in favor of AMD compared to the rest of the internet for their entire existence.


----------



## MxPhenom 216 (Nov 18, 2020)

Prince Valiant said:


> Thanks for the review.
> 
> 
> RT is the antithesis of DLSS. You increase image quality only to decrease resolution, and IQ, for a performance boost.
> ...



So far TSMC is barring everyone from using 5nm except for Apple...


----------



## Space Lynx (Nov 18, 2020)

honestly I am glad I got the 6800 non-XT. I imagine with my super-tuned RAM at 3600 CL14-14-14, which I already have stable on my 5600X, rage mode enabled, a medium OC, and Smart Access Memory on, I will be nipping at the heels of a 3080 even with the non-XT.

but mainly, since I game at 1080p 165 Hz or 1440p 144 Hz, the 6800 with all that stuff mentioned above maxes out the frame rate anyway... so yeah, I'm set, and I saved $80 on top of that. would have liked a 6800 XT for sure, but I am just thankful I got what I got.

also love the title... "nvidia is in trouble" haha indeed, glorious times.


----------



## Jinxed (Nov 18, 2020)

anachron said:


> It will only really be true if all new games support DLSS (which is not the case currently; look at Valhalla, for example) and if the equivalent solution AMD announced isn't up to the task. It's not an easy choice, as it depends on a lot of things you can't really predict.


You can't expect an AMD sponsored game to support DLSS. That is a very bad (and minority) example.


----------



## Turmania (Nov 18, 2020)

I'm genuinely very surprised at AMD catching Nvidia in the GPU segment. I always said they were two generations behind in both performance and power consumption. Now they've not only caught up to but passed Nvidia in both.


----------



## TheinsanegamerN (Nov 18, 2020)

RDNA 2 is shaping up to be the next Evergreen series. Wouldn't surprise me to see AMD ripping a significant chunk of market share back from Nvidia.

The AIB 6800 XTs are going to be awesome with larger power buses and limits. And given how slowly the fans spin at stock, there is plenty of thermal headroom as well.

Now I really want to see what the 6900 XT is capable of, with the 6800 XT OC tickling the 3090 in Nvidia's golden samples.


----------



## Jinxed (Nov 18, 2020)

Cheeseball said:


> While RT is being implemented now, it is still not at an acceptable performance level in currently released games. If enabling RT cost only a 10-20% performance hit (like doing 8x AA in previous years) while giving a noticeable uplift in graphics fidelity, it would be acceptable. But a drop of 50% or more? No.
> 
> Also, you can safely assume that games on consoles will not be using RT at their max, but only implement subtle visual improvements (like light mirror effects and such).


I can play Control in 4K with full raytracing right now and get a massive quality increase out of it. Now add Minecraft and Cyberpunk 2077, also with massive quality improvements gained from raytracing.


----------



## Tech Ninja (Nov 18, 2020)

Your own review shows the 6800 XT as 5% slower in raster vs the 3080 and getting stomped in DXR.

How is Nvidia in trouble over a $50 price difference?


----------



## anachron (Nov 18, 2020)

Jinxed said:


> You can't expect an AMD sponsored game to support DLSS. That is a very bad (and minority) example.


There will always be AAA game sponsored by AMD, my point was just that you can't count on DLSS for every demanding game that will be out in the future. So while i agree with you that it is a nice edge for nvidia to take into account, it's not a very reliable one.


----------



## Space Lynx (Nov 18, 2020)

Tech Ninja said:


> Your own review shows 6800xt as 5% slower in raster vs 3080 and getting stomped out in DXR.
> 
> how is Nvidia in trouble over $50 price difference?



Nvidia cards don't OC, for one thing; the new ones don't, anyway. And AMD OCs very well, surpassing the 3080 pretty much across the board even with both OC'd.

also, competition is just great for the PC gaming industry... so just be happy and move on with life?


----------



## Jinxed (Nov 18, 2020)

anachron said:


> There will always be AAA game sponsored by AMD, my point was just that you can't count on DLSS for every demanding game that will be out in the future. So while i agree with you that it is a nice edge for nvidia to take into account, it's not a very reliable one.


AMD-sponsored games are a small percentage of all the games coming out. So recommending Nvidia is actually very reliable for GPU longevity.


----------



## TheinsanegamerN (Nov 18, 2020)

Tech Ninja said:


> Your own review shows 6800xt as 5% slower in raster vs 3080 and getting stomped out in DXR.
> 
> how is Nvidia in trouble over $50 price difference?


Nvidia has practically zero OC headroom. AMD has decent headroom and is restricted by power limits; AIB products with higher limits and more memory OCing will expand that further. With OC factored in, the 6800 XT totally closes the gap with Nvidia.

Raytracing continues to be a selling point only for the tiny minority of users who love the game Control; outside of that game, raytracing is practically vaporware. A $50 difference means AMD gets more attention; that's enough to convince people with the cards being so close, and Nvidia doesn't have a lot of headroom to cut prices on a GA die they can't seem to make in any large numbers.


----------



## Jinxed (Nov 18, 2020)

lynx29 said:


> Nvidia cards don't OC, for one thing; the new ones don't, anyway. And AMD OCs very well, surpassing the 3080 pretty much across the board even with both OC'd.


The best samples were sent for reviews. Now try again with actual off-the-shelf cards.


----------



## TheinsanegamerN (Nov 18, 2020)

Jinxed said:


> AMD-sponsored games are a small percentage of all the games coming out. So recommending Nvidia is actually very reliable for GPU longevity.


Both brands' sponsored titles are a small percentage; most non-sponsored games are designed around the consoles, which are running RDNA 2.


----------



## Calmmo (Nov 18, 2020)

Yeah, no, the prices on these here in the EU were crazy, over €800. At sub-€700 I could have justified the 2018-level RT performance.
Tempted to buy if "properly" priced, but that doesn't seem likely. So back to waiting for a 3080 Ti, I guess.


----------



## Colddecked (Nov 18, 2020)

Jinxed said:


> The best samples were sent for reviews. Now try again with actual off-the-shelf cards.



So only one company sends the best samples for review?  Come on buddy...


----------



## W1zzard (Nov 18, 2020)

Tech Ninja said:


> how is Nvidia in trouble over $50 price difference?


Because they no longer have a huge technological advantage? Not saying NVIDIA will die, but they will have to innovate, something that almost nobody on this planet does better than NVIDIA. In the end, we the consumers will benefit.


----------



## Jinxed (Nov 18, 2020)

TheinsanegamerN said:


> Both brands' sponsored titles are a small percentage; most non-sponsored games are designed around the consoles, which are running RDNA 2.


Yeah, that is a complete lie. The actual data are completely different.

Steam Catalog size 23000+








The best Steam games 2022: "New gems and bona fide classics among the best Steam games" (www.techradar.com)
				




Xbox Catalog size is 1001








List of Xbox games (en.wikipedia.org)
				




Xbox One Catalog size is 2645





List of Xbox One games (A–L) (en.wikipedia.org)
				




Xbox One X Catalog size is 428





List of Xbox One X enhanced games (en.wikipedia.org)
				




Yeah, most games are actually made for PC. Stop repeating this long-ago debunked AMD fanboy crap already.


----------



## spnidel (Nov 18, 2020)

Jinxed said:


> That's interesting  Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing their own raytracing implementation (even though inferior to Nvidia's)? Why do the consoles now support raytracing?


...because it's new tech...
you think tessellation performed well back in 2010 when the first couple of generations supporting it came out? fuck no, performance was dogshit.
same thing with RTRT.


----------



## W1zzard (Nov 18, 2020)

Jinxed said:


> Yeah, most games are actually made for PC.


Your "Most" argument is of course accurate, but nearly all major titles are designed for consoles first. Just play them, you'll immediately notice things in most of them that scream "uh didn't any beta tester on PC notice this?"


----------



## MrGRiMv25 (Nov 18, 2020)

Great to see AMD back on form; their CPU division is K.A.A.T.N, and their GPU group isn't much further behind. Also looks like the 10 GB of VRAM on the 3080 isn't much of an issue *for now*. Their perf/W is great, and it's pretty cool to see a GPU running at 2.4-2.5 GHz on standard cooling, properly insane; wonder what they'd be like on exotic cooling.

With the 6800 XT doing that well against the RTX 3080/3090 in a lot of games, the 6900 XT will be a fun one to bench and OC.

Well done AMD, you're finally back in the CPU and GPU game.


----------



## MikeMurphy (Nov 18, 2020)

Consumes as much power as my Vega 56, while nearly tripling the performance.

Wow.  This is great stuff.


----------



## Jinxed (Nov 18, 2020)

W1zzard said:


> Your "Most" argument is of course accurate, but nearly all major titles are designed for consoles first. Just play them, you'll immediately notice things in most of them that scream "uh didn't any beta tester on PC notice this?"


I would disagree with that. What about the most played games of today? World of Warcraft? Minecraft? PUBG? Fortnite? Battlefield? Call of Duty? Soon Cyberpunk 2077 (which is designed for PC and then for consoles as well, not a backport)? Almost all of the most played games are designed for PC first.


----------



## W1zzard (Nov 18, 2020)

Jinxed said:


> I would disagree with that. What about the most played games of today? World of Warcraft? Minecraft? PUBG? Fortnite? Battlefield? Call of Duty? Soon Cyberpunk 2077 (which is designed for PC and then for consoles as well, not a backport)? Almost all of the most played games are designed for PC first.


Nice list, I would have mentioned those, too. Maybe CS:GO, too. COD is definitely console first.


----------



## Chomiq (Nov 18, 2020)

Hey @W1zzard have you looked at GN thermal results? Something seems off to me with them reporting 41 dB and "mediocre" thermal performance compared to your 31 dB with "best reference design". Hardware Unboxed has results similar to yours, about 1500 rpm at 75 C max.


----------



## Jinxed (Nov 18, 2020)

W1zzard said:


> Nice list, I would have mentioned those, too. Maybe CS:GO, too. COD is definitely console first.


I agree with you on some CODs; it depends which COD, though. There are so many that I keep losing track... I don't think the one with raytracing, sold through Battle.net first, was a console backport. Don't remember which one that was. But it was promoted by Nvidia.


----------



## W1zzard (Nov 18, 2020)

Chomiq said:


> Hey @W1zzard have you looked at GN thermal results? Something seems off to me with them reporting 41 dB and "mediocre" thermal performance compared to your 31 dB with "best reference design".


My data suggests it's in line with other good triple-slot cooler designs, which seems reasonable? AMD has a large vapor chamber over the others in this chart.

Maybe something is wrong with his sample?


----------



## windwhirl (Nov 18, 2020)

Well, damn. I never expected AMD to beat Nvidia in efficiency. Get close or even match, yes. But not the almost beatdown I see here. Very impressive.

Now, I just have to wait until FAH average PPDs are available and then I'll see what I should upgrade to next year.


----------



## steen (Nov 18, 2020)

Chomiq said:


> Hey @W1zzard have you looked at GN thermal results? Something seems off to me with them reporting 41 dB and "mediocre" thermal performance compared to your 31 dB with "best reference design". Hardware Unboxed has results similar to yours, about 1500 rpm at 75 C max.


Disassembled/reassembled HSF prior/post testing? GN should be competent with this though.


----------



## Chomiq (Nov 18, 2020)

steen said:


> Disassembled/reassembled HSF prior/post testing? GN should be competent with this though.


They usually do all their testing before disassembly. Steve has mentioned some mounting issues.


----------



## Colddecked (Nov 18, 2020)

W1zzard said:


> My data suggests it's in line with other good triple slot design coolers, which seems reasonable? AMD has a large vapochamber over the others in this chart
> 
> Maybe something wrong with his sample?



I think they got a bad sample, as he was also complaining about the overclocking ability.  Or maybe they just sent you the golden one?


----------



## Kallan (Nov 18, 2020)

Jinxed said:


> You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet, HardwareUnboxed? Oh come on. You can't trust their results at all. Their results have been significantly tilted in favor of AMD compared to the rest of the internet for their entire existence.


Well, that is your opinion, here is another review that shows that SAM does make a difference in some games;


----------



## RainingTacco (Nov 18, 2020)

It's a shame that these embargoes and shortages exist. Otherwise reviewers could just buy a GPU and test a real one, not some golden sample, and then return the card to Amazon if necessary.


----------



## xkm1948 (Nov 18, 2020)

Amazing improvement in old-school rasterization rendering.

Lags behind in hybrid ray tracing (DXR).

Good thermals / OK-ish pricing / healthy OC headroom.

No DLSS equivalent, mediocre encoder.


That is the ATi I know. Decoupling the compute design from the gaming design has finally paid off. Welcome back to the race, ATi.
I am sure many folks who don't mind the poor RT performance will buy these.


@lynx29 Did you get one today?


----------



## techguymaxc (Nov 18, 2020)

Prince Valiant said:


> RT is the antithesis of DLSS. You increase image quality only to decrease resolution, and IQ, for a performance boost.



Have you used DLSS?  I have owned a 2080 Ti since launch and having experienced it first hand, can say that I am very thankful for the option.  If you care to see for yourself, here is one of many videos on the subject:


----------



## Jinxed (Nov 18, 2020)

Kallan said:


> Well, that is your opinion, here is another review that shows that SAM does make a difference in some games;


The GamersNexus review shows only minor improvements from SAM, just like here on TPU. And in most cases, it is by far not enough to beat the RTX 3080. There are a few titles where it happens, funnily enough the AMD-sponsored games, but even Steve states there's no chance you'd notice it in game. Besides, if Nvidia achieves the same results on both AMD and Intel CPUs, it's going to be detrimental to AMD if they only offer this on their own platform, as on Intel they suddenly become even slower than they were before.


----------



## Space Lynx (Nov 18, 2020)

Jinxed said:


> The GamersNexus review shows only minor improvements from SAM, just like here on TPU. And in most cases, it is by far not enough to beat the RTX 3080. There are a few titles where this happens, funny enough the AMD sponsored games, but even Steve states that there's no chance you'd notice that in game. Besides, if Nvidia achieves the same results on both AMD and Intel CPUs, it's going to be detrimental to AMD if they only offer this on their own platform, as suddenly on Intel they become even slower than they were before.




Let's wait for the *highly tuned RAM reviews*: Zen 3 + 6800 XT + rage mode + medium OC (which the RTX 3080 really can't do at all) + Smart Access Memory... it's going to be glorious


----------



## RedelZaVedno (Nov 18, 2020)

4K performance in Microsoft Flight Simulator is extremely disappointing. I was hoping RDNA2's higher frequency and 16 GB of RAM would make a difference, but the game obviously prefers more shaders and faster memory. Well, the 3080 for me after all, I guess.


----------



## Jinxed (Nov 18, 2020)

lynx29 said:


> Let's wait for the highly tuned ram reviews zen 3 + 6800 xt + rage mode + medium OC (which rtx 3080 cant oc really at all) + smart access memory... its going to be glorious


So you want users to invest time and money in all of that just to reach the performance level of what Nvidia can give you for a mere $50 more, while also offering much faster raytracing, features like DLSS, better video encoding, and stable drivers?


----------



## RainingTacco (Nov 18, 2020)

RedelZaVedno said:


> 4K performance in Microsoft Flight Simulator is extremely disappointing. I was hoping RDNA2's higher frequency and 16 GB of RAM would make a difference, but the game obviously prefers more shaders and faster memory. Well, the 3080 for me after all, I guess.
> View attachment 176144



Lighting and shading are at the top of the priority list in modern games, unless we get voxels or vastly better physics with environments destructible down to the smallest pieces. Then people will push to actually shade and light all those elements, and it will come back to shading being on top (which is more resource-intensive than simple xyz translation).


----------



## papajo_r (Nov 18, 2020)

I think you should rerun the entire benchmark with a different Windows installation/drivers and maybe different hardware, because something is fishy in your tests.

E.g., in ACO the 6800 is FASTER than the 6800 XT @1080p, while at very low FPS in general, whereas in other benchmarks of the same game the 6800 XT was faster even compared to the 3090.

These measurements are very low and affect the "relative performance" average you give, which many people will refer to in the near future, and I find that misleading.


----------



## Foobario (Nov 18, 2020)

Jinxed said:


> @W1zzard
> 
> So exactly why is Nvidia in trouble? I don't get the title. Nvidia has the faster product in both rasterization and raytracing (much faster in raytracing, in fact). There is no AI support on the AMD card that would be similar to DLSS, and no alternative to Tensor Cores. All that on a worse 8nm process (really just an improved 10nm). I'd actually look at it the other way: even with the best available process technology and much work spent on improving power efficiency, AMD was still unable to beat Nvidia. What happens when Nvidia moves to 5nm next gen? Where will AMD find any reserves for improvement? It's not like they can ditch GCN for the second time and get the same improvements (they've finally done it now, removing the last remains of that horrible architecture).
> 
> ...


How many cards did AMD sell in the top end over the last four years? I'll help you out: the answer is zero. So Nvidia had 100% market share. Nvidia will be lucky to hold onto a 50% share of the high end this time. That could be troubling from Nvidia's point of view.

As for the performance disparity between the 6800xt and 3080, this review favors the 3080 far more than the other five reviews I've read.  Sure, if you are one of the five people in the world that play Anno 1800, Nvidia is the way to go, no doubt.  At least two of the other games in this review I've never heard of.  From the looks of it, the games are so obscure that AMD hasn't even bothered to create drivers for them.  Yet they were included in this review only to skew results in Nvidia's favor as it seems the author knows his audience here.  This message board seems to have more delusional Green kool aid drinkers than most.


----------



## steen (Nov 18, 2020)

RedelZaVedno said:


> 4K performance in Microsoft Flight Simulator is extremely disappointing. I was hoping RDNA2's higher frequency and 16 GB of RAM would make a difference, but the game obviously prefers more shaders and faster memory. Well, the 3080 for me after all, I guess.


I'm of the view that the 4K perf hit (in general) is a function of cache hit rate: ~75% @1080p, ~68% @1440p, ~58% @2160p.

This may indicate that N22 will have 96 MB of IC and N23 64 MB.
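A quick back-of-the-envelope sketch of why those hit rates would translate into a 4K penalty: effective bandwidth is roughly a hit-rate-weighted average of cache and VRAM bandwidth. The bandwidth figures in the Python below are illustrative placeholders, not AMD specs; only the hit rates come from the post above.

```python
def effective_bandwidth(hit_rate, cache_gbps, vram_gbps):
    """Hits are served from the Infinity Cache; misses go out to VRAM."""
    return hit_rate * cache_gbps + (1.0 - hit_rate) * vram_gbps

CACHE_GBPS = 1600.0  # assumed on-die cache bandwidth (illustrative placeholder)
VRAM_GBPS = 512.0    # assumed GDDR6 bandwidth (illustrative placeholder)

for res, hit in [("1080p", 0.75), ("1440p", 0.68), ("2160p", 0.58)]:
    bw = effective_bandwidth(hit, CACHE_GBPS, VRAM_GBPS)
    print(f"{res}: ~{bw:.0f} GB/s effective")
```

With these placeholder numbers, the drop from 1080p to 2160p works out to roughly 14% of effective bandwidth; the exact figure obviously depends on the real cache and memory bandwidths.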


----------



## Steevo (Nov 18, 2020)

AMD makes a card as fast as a 3080 and more energy efficient, and the Nvidiots go "Wahhh, it's slower at RTRT, and umm, it's bad, cause Nvidia good." I notice how few are ready to discuss average frametimes, where AMD looks BETTER than Nvidia; that Infinity Cache smooths out all the peaks, and still at far less power than the 7nm node is capable of handling.

Great review W1zz, sorry to see so many dog-shit comments complaining about an impressive piece of hardware. Can't wait to see how fast these can go with a bit of voltage modding or under water; if a 10% OC on air is there with no added voltage, then 1.2-1.3 V to the core and a 25% OC may be realistic.

I know what's going in my new machine as soon as they are actually available; not gonna support the scalping of hardware.


----------



## RainingTacco (Nov 18, 2020)

Foobario said:


> As for the performance disparity between the 6800xt and 3080, this review favors the 3080 far more than the other five reviews I've read.  Sure, if you are one of the five people in the world that play Anno 1800, Nvidia is the way to go, no doubt.  At least two of the other games in this review I've never heard of.  From the looks of it, the games are so obscure that AMD hasn't even bothered to create drivers for them.  Yet they were included in this review only to skew results in Nvidia's favor as it seems the author knows his audience here.  This message board seems to have more delusional Green kool aid drinkers than most.



A review should have the biggest variety of games possible: DX11, DX12, AAA games, some AA games. You don't play only a very narrow selection of games on your GPU, unless of course you do. But then again, why bother looking at average performance at all? Just look at the benchmarks of the games you play/plan to play and buy a GPU according to which one has more frames in those.


----------



## Jinxed (Nov 18, 2020)

Foobario said:


> How many cards did AMD sell in the top end over the last four years?  I'll help you out.  The answer is zero.  So Nvidia had 100% market share.  Nvidia will be lucky to hold onto 50% share of the high end this time.  That could be troubling from Nvidia's point of view.
> 
> As for the performance disparity between the 6800xt and 3080, this review favors the 3080 far more than the other five reviews I've read.  Sure, if you are one of the five people in the world that play Anno 1800, Nvidia is the way to go, no doubt.  At least two of the other games in this review I've never heard of.  From the looks of it, the games are so obscure that AMD hasn't even bothered to create drivers for them.  Yet they were included in this review only to skew results in Nvidia's favor as it seems the author knows his audience here.  This message board seems to have more delusional Green kool aid drinkers than most.


Yeah, with AMD it always seems to be an error on someone else's side, not AMD. AMD is perfect. I hear you.


----------



## Space Lynx (Nov 18, 2020)

Jinxed said:


> So you want users to invest time and money in all of that just to reach the performance level of what Nvidia can give you for mere $50 more, while offering much faster raytracing, features like DLSS and better video encoding, stable drivers?



what nvidia? ah you mean that paper launch... no idea. i got a 6800 so going to go enjoy gaming now.  cheers


----------



## xkm1948 (Nov 18, 2020)

On a different note, the RX 6000 series means the end for a lot of AMD supporters' "wait for XXXXX" meme. As far as DX12 Ultimate is concerned, these RX 6000 cards might actually age pretty well, like FineWine (as long as you keep that ray tracing off).

If ATI can keep this type of performance boosting up, RDNA3 might be them jumping back in front of Nvidia again. Well, that assumes Nvidia stumbles like Intel did, which is highly unlikely.


----------



## RainingTacco (Nov 18, 2020)

Steevo said:


> not gonna support the scalping of hardware.



AMD is scalping too, there are already shortages xD At least that's what I see in Poland.


----------



## Foobario (Nov 18, 2020)

Jinxed said:


> So you want users to invest time and money in all of that just to reach the performance level of what Nvidia can give you for mere $50 more, while offering much faster raytracing, features like DLSS and better video encoding, stable drivers?


Does your Nvidia based computer operate without a CPU or RAM?  How is buying a CPU and RAM a waste of money?  Many people do a complete system upgrade when getting a new GPU.  Probably even more so now that there has been a shift in what constitutes the best gaming CPU.


----------



## steen (Nov 18, 2020)

Steevo said:


> I know whats going in my new machine, as soon as they are actually available, not gonna support the scalping of hardware.


Next week for AIBs, much greater supply. Should be reasonably obtainium by Dec, but given demand for >$500 GPUs...


----------



## phill (Nov 18, 2020)

Great review there @W1zzard   

Now the question for me will be, when the heck can I actually buy one!?!   Such an achievement AMD, kudos and congrats on a fine product!!  Even the cooler is decent without the noise or the temps being at melting point..  Bloody marvelous!!  

@W1zzard will there be a review with the cards being tested on AMD platforms with the 5000 series CPUs?  If I recall, that's meant to help with performance, so I guess this is kind of a worst case?


----------



## Steevo (Nov 18, 2020)

RainingTacco said:


> AMD is scalping too, there are already shortages xD At least that's what I see in Poland.




AMD doesn't control retailers once they fill the retailers' orders. And unless AMD and Nvidia want to ruin relationships with retailers and AIB partners, they are going to sell their hardware to each at a set price point, and the retailers will mark up hardware as they see fit.


----------



## W1zzard (Nov 18, 2020)

phill said:


> @W1zzard will there be a review with the cards being tested on AMD platforms with the 5000 series CPUs?  If I recall, that's meant to help with performance, so I guess this is kind of a worst case?


Yup working on that article right now


----------



## Jinxed (Nov 18, 2020)

Foobario said:


> Does your Nvidia based computer operate without a CPU or RAM?  How is buying a CPU and RAM a waste of money?  Many people do a complete system upgrade when getting a new GPU.  Probably even more so now that there has been a shift in what constitutes the best gaming CPU.


What shift? Why without CPU or RAM? My 2-year-old 9900K is paired with 64GB of RAM, standard speed for the platform. And I still get the same 4K framerate as I would with a brand new top-of-the-line Ryzen with expensive high-speed memory and a lot of time invested in fine-tuning it. That's according to W1zzard's own review here on TPU.

Core i9-9900K 100.7%, Ryzen 9 5950X 100%.













AMD Ryzen 9 5950X Review - www.techpowerup.com

Ryzen 9 5950X is AMD's flagship 16-core, 32-thread monster. It offers outstanding application performance; your productivity tasks will complete faster than before. Thanks to the Zen 3 IPC advantage, it also excels in gaming, even winning against Intel's Core i9-10900K.
So what exactly are you talking about?


----------



## NDown (Nov 18, 2020)

Damn, I'm actually tempted to let go of my VII (was planning to use it at least 2 years XD) if I ever get the chance to buy the 6800 XT.

Hoping stocks won't be as bad as the 3080, but that hope has a really thin chance of coming true, I guess.


----------



## RainingTacco (Nov 18, 2020)

Steevo said:


> AMD doesn't control retailers once they fill the retailers' orders. And unless AMD and Nvidia want to ruin relationships with retailers and AIB partners, they are going to sell their hardware to each at a set price point, and the retailers will mark up hardware as they see fit.



This will still end with people buying up the whole stock and reselling, if AMD doesn't provide enough supply to meet demand.


----------



## phill (Nov 18, 2020)

W1zzard said:


> Yup working on that article right now


You are amazing!!  

I really wish this scalping was not happening..  Those prices


----------



## Steevo (Nov 18, 2020)

RainingTacco said:


> This will still end with people buying up the whole stock and reselling, if AMD doesn't provide enough supply to meet demand.



True, but they are at the mercy of TSMC and its output; between CPUs, GPUs, and APUs for consoles they are utilizing every die. 

How long before we see 6700 and 6600 models built for 1440p and 1080p gaming? That is where the money is made. I think AMD is doing this as an FU to Nvidia. I'm surprised there hasn't been more news about them, but I'm also going to guess they are trying to help retailers clear out inventory of older architecture before releasing anything below the $350 price point.


----------



## xkm1948 (Nov 18, 2020)

NDown said:


> Damn, im actually tempted to let go of my VII (was planning to use it at least 2 years XD) if i ever get the chance to buy the 6800XT.
> 
> Hoping stocks wont be as bad as the 3080 but that hope has really thin chance of happening i guess.




The Radeon VII fetches good money among miners. Use that to offset some of the 6800 XT scalping price.


----------



## medi01 (Nov 18, 2020)

W1zzard said:


> AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble
> 
> 
> The Radeon RX 6800 XT is a worthy opponent to the NVIDIA GeForce RTX 3080. It offers similar performance at better power efficiency and lower noise levels. In our Radeon RX 6800 XT review, we also take a closer look at raytracing and "Rage Mode", a new 1-click method for overclocking.
> ...


Almost perfect, only highlighted card is a tad confusing, but that's OK.


----------



## MikeGR7 (Nov 18, 2020)

Confirming their inability to catch up with nVidia was a huge relief for me.

They clearly demonstrated with their CPU launch that they don't give a crap about the end user in terms of value for money, and all they wanted was a chance to pass ahead so they can overcharge for their products.

Thank you AMD, but we didn't need a new "Intel" in place of Intel and we didn't need a new "nVidia" to replace nVidia.

This company has clearly lost its path...


----------



## W1zzard (Nov 18, 2020)

medi01 said:


> only highlighted card is a tad confusing


please elaborate


----------



## xkm1948 (Nov 18, 2020)

phill said:


> You are amazing!!
> 
> I really wish this scalping was not happening..  Those prices




It is the new norm. 

I remember back in the ATI HD 3870 days, the MSRP only lasted like a few days. Then on Newegg every single 3870 went up in price big time.


----------



## Space Lynx (Nov 18, 2020)

Jinxed said:


> So you want users to invest time and money in all of that just to reach the performance level of what Nvidia can give you for mere $50 more, while offering much faster raytracing, features like DLSS and better video encoding, stable drivers?



what do you mean it doesn't beat nvidia in anything? from ars technica review.


----------



## Steevo (Nov 18, 2020)

MikeGR7 said:


> Confirming their inability to catch up with nVidia was a huge relief for me.
> 
> They clearly demonstrated with their CPU launch that they don't give a crap about the end user in terms of value for money, and all they wanted was a chance to pass ahead so they can overcharge for their products.
> 
> ...



Their inability to catch up? I wish I could see the world through your special eyes. At stock it's about 2-3% slower than a card that costs $50 more, and it uses 100W less, has 10% overclocking headroom out of the box compared to 3% from team green, and with other tweaks it ends up as fast or slightly faster, plus much better frame times. But I guess you can't see that; cognitive dissonance is real.

No one is forcing anyone to buy it, and everything is worth what someone is willing to pay. Do you think AMD gets all the money from the sale of these cards? If you do perhaps try reading up on supply and demand economics.


----------



## xkm1948 (Nov 18, 2020)

MikeGR7 said:


> Confirming their inability to catch up with nVidia was a huge relief for me.
> 
> They clearly demonstrated with their CPU launch that they don't give a crap about the end user in terms of value for money, and all they wanted was a chance to pass ahead so they can overcharge for their products.
> 
> ...




Companies exist to make $$$$$, not to become friends with folks.


----------



## Foobario (Nov 18, 2020)

Jinxed said:


> Yeah, that is a complete lie. The actual data are completely different.
> 
> Steam Catalog size 23000+
> 
> ...


He was most likely referring to games most people play. Not indie crap.


Jinxed said:


> What shift? Why without CPU or RAM? My 2-year-old 9900K is paired with 64GB of RAM, standard speed for the platform. And I still get the same 4K framerate as I would with a brand new top-of-the-line Ryzen with expensive high-speed memory and a lot of time invested in fine-tuning it. That's according to W1zzard's own review here on TPU.
> 
> Core i9-9900K 100.7%, Ryzen 9 5950X 100%.
> 
> ...






Can you imagine a world where there are thousands upon thousands of people that have CPUs that came out 2, 3, 4 or 5 years ago?  They might want to upgrade.  Some might want to optimize to the max.  Perhaps even buying perfectly tuned RAM with the new "in thing", a Ryzen 5000 series CPU.

Granted, the difference for you would be minimal, but I'm going to go out on a limb and guess that those that have a 9900K are a minority of the entire PC gaming space.


----------



## TheoneandonlyMrK (Nov 18, 2020)

birdie said:


> TLDR:
> 
> Just buy ... NVIDIA. Even 3-5 years from now you'll be able to play the most demanding titles thanks to DLSS just by lowering resolution.


There, fixed that for you.


Moving on from infinity cache trashers that just got schooled by reality.


Great release, I'm calling it a draw @Earthdog.


----------



## RedelZaVedno (Nov 18, 2020)

Foobario said:


> As for the performance disparity between the 6800xt and 3080, this review favors the 3080 far more than the other five reviews I've read. Sure, if you are one of the five people in the world that play Anno 1800, Nvidia is the way to go, no doubt. At least two of the other games in this review I've never heard of. From the looks of it, the games are so obscure that AMD hasn't even bothered to create drivers for them. Yet they were included in this review only to skew results in Nvidia's favor as it seems the author knows his audience here. This message board seems to have more delusional Green kool aid drinkers than most.


Hardware Unboxed and Gamers Nexus came to similar conclusions as TechPowerUp (1080p being the only real difference):
- RDNA2 has better performance per watt
- Offers more VRAM
- 6800 XT is on par with the 3080 in standard rasterization (better at 1080p, roughly on par at 1440p, worse at 4K)
- worse in RT (roughly on par with the 2080 Ti)
- lacks AI supersampling ("DLSS")

It all comes down to what features you want. Price/performance ratio is about the same. The 6800 is faster with more VRAM than the 3070, but costs 16% more. The 6800 XT is on par with the 3080, has more VRAM, BUT has worse RT and no AI SS, and costs 7% less. All AMD brings to the table is more options to choose from, but it isn't necessarily better value. Now, a 6800 costing $499 and a 6800 XT $599, those would be true Ampere killers. AMD has clearly chosen profit margins over market share gain. That's their decision to make. I personally still hate Ngreedia because of Turing, but I've fallen out of love with Team Red too, since they've decided to raise profit margins. I'll just buy what suits my needs best for as cheap as I can get it.
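For anyone checking my math, the percentages fall out of the launch MSRPs (assuming $499/$579 for the 3070/6800 and $699/$649 for the 3080/6800 XT):

```python
# Relative price premium of one card over another, in percent.
# MSRPs below are the assumed launch prices, not street prices.
def premium(price, vs):
    return (price / vs - 1) * 100

print(f"6800 vs 3070:    {premium(579, 499):+.0f}%")   # ~ +16%
print(f"6800 XT vs 3080: {premium(649, 699):+.0f}%")   # ~ -7%
```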


----------



## Jinxed (Nov 18, 2020)

lynx29 said:


> what do you mean it doesn't beat nvidia in anything? from ars technica review.
> View attachment 176147


So you post an AMD-sponsored game here, not even in 4K resolution, but in 1440p instead (LOL). Why don't you post this instead, from the same source?


----------



## Space Lynx (Nov 18, 2020)

RedelZaVedno said:


> Hardware Unboxed and Gamers Nexus came to similar conclusions as TechPowerUp (1080p being the only real difference):
> - RDNA2 has better performance per watt
> - Offers more VRAM
> - 6800 XT is on par with the 3080 in standard rasterization (better at 1080p, roughly on par at 1440p, worse at 4K)
> ...




its important to remember the types of people who buy these cards OC them. nvidia next gen cards dont really OC at all regardless of what you do to them, and these get close to 10% gains on OC... so really my 6800 is already matching a 3080 when you add that in with my tuned ram/rage mode/smart access memory. probably surpassing a 3080 actually at this point, not sure. don't care. but it's a damn good deal for $579



Jinxed said:


> So you post an AMD-sponsored game here, not even in 4K resolution, but in 1440p instead (LOL). Why don't you post this instead, from the same source?




this game was heavily optimized for next gen consoles as a launch title. it's a sign of future AAA games as well, console comes first. AMD will continue to show improved numbers in future next gen titles versus nvidia


----------



## Makaveli (Nov 18, 2020)

birdie said:


> TLDR:
> 
> Truly stellar rasterization performance per watt thanks to the 7nm node.
> Finally solved multi-monitor idle power consumption!!
> ...



please keep this joke of a post here on TPU and don't bring it over to the anandtech forum please.


----------



## RainingTacco (Nov 18, 2020)

lynx29 said:


> its important to remember the types of people who buy these cards OC



Lol, for most of AMD's history since Polaris, Vega and Fury, AMD users were UNDERVOLTING GPUs and not OCing them xD. Because they ran like shit: noisy and hot.


----------



## xkm1948 (Nov 18, 2020)

Makaveli said:


> please keep this joke of a post here on TPU and don't bring it over to the anandtech forum please.




Tons of shit posting on anandtech. Don't shame opinions.


----------



## Vya Domus (Nov 18, 2020)

Jinxed said:


> Actuall you don't. AMD in fact HAD to go for a feature like Infinity Cache. It was not an option, it was a requirement. It's because of their raytracing solution. If you look at the RDNA2 presentation slides, there is a clear line that says:
> "4 texture OR ray ops/clock per CU"
> 
> 
> ...



A simple way to completely disprove what you claim is to look at the SoCs used in consoles: the last level cache that you desperately try to prove AMD added to make RT viable is nowhere to be found on those, which means that it clearly wasn't its purpose. Your speculation is plain and simple wrong.

You must be out of your mind to believe for a second that AMD dedicated 1/5 of the die just to improve RT performance, which ended up being inferior to Nvidia's anyway. There are other flaws with their implementation that need to be addressed long before bandwidth becomes an issue.

Also, because of the way BVH traversal works, some pointer chasing is required, meaning caches don't help much.

No, that cache is an inevitable evolution of GPUs, given that the ratio of DRAM bandwidth per thread has plummeted over the years, and it's obvious that there will be a point in time after which no more performance can be extracted. It's there to aid all-around performance, and I bet Nvidia will be forced to implement something similar at some point. Anyway, the point is that it has nothing to do with RT.
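The pointer-chasing point in a toy model (every number below is a made-up assumption, just to show the shape of the problem):

```python
import math

# Toy model: BVH traversal is a chain of dependent memory fetches (pointer
# chasing), so its cost scales with per-access latency; unrelated fetches
# can be overlapped and are limited by parallelism/bandwidth instead.
LATENCY_NS = 100   # assumed per-access DRAM latency
DEPTH = 20         # assumed BVH depth (~log2 of triangle count)

def dependent_traversal_ns(depth, latency_ns=LATENCY_NS):
    """Each hop must finish before the next node's address is known."""
    return depth * latency_ns

def independent_fetches_ns(count, latency_ns=LATENCY_NS, in_flight=32):
    """Unrelated fetches overlap, amortizing latency across requests."""
    return math.ceil(count / in_flight) * latency_ns

print(dependent_traversal_ns(DEPTH))    # serial chain: 2000 ns
print(independent_fetches_ns(DEPTH))    # overlapped: 100 ns
```

The serial chain pays full latency per hop no matter how much bandwidth you throw at it, which is why a bigger cache helps a lot less here than for texture traffic.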


----------



## Jinxed (Nov 18, 2020)

Foobario said:


> He was most likely referring to games most people play.  Not indy crao
> 
> 
> Can you imagine a world where there are thousands upon thousands of people that have CPUs that came out 2, 3, 4 or 5 years ago?  They might want to upgrade.  Some might want to optimize to the max.  Perhaps even buying perfectly tuned RAM with the new "in thing", a Ryzen 5000 series CPU.
> ...


That is an incorrect statement. The correct one would be: you can get the same gaming performance from buying a standard Intel CPU today, with standard RAM, as you would from the top-of-the-line newest Ryzen CPUs with expensive, highly clocked and tuned RAM. Simply because you are GPU limited. Even if you manage to scrape 1% of a difference somewhere, you won't notice the difference in game. You will, however, notice the difference in the amount of time and money invested.


----------



## Steevo (Nov 18, 2020)

Jinxed said:


> So you post an AMD-sponsored game here, not even in 4K resolution, but in 1440p instead (LOL). Why don't you post this instead, from the same source?




Some of us play games other than Minecraft. 

If your specialized RTRT version of Minecraft brickwork needs a green card to make the bricks ray traced, do it, but why thread-crap about a specific title with a specific add-on in specific scenarios when 90%+ of others don't care? I am not saying don't comment, but why trash talk about such a specific scenario? How fast does your 3080 render this webpage? Fast enough to ignore the other overwhelming evidence that AMD has made cards that put Nvidia to shame for power consumption and frame times? Fast enough to overlook that the competition is only going to help consumers?


----------



## mechtech (Nov 18, 2020)

If this on avg is double the performance of the 5700XT and the 5700XT is now last gen, I wonder if the 5700XT prices will drop to $300US or less??


----------



## Foobario (Nov 18, 2020)

lynx29 said:


> what do you mean it doesn't beat nvidia in anything? from ars technica review.
> View attachment 176147


Sure, if you pull out some obscure game like that you can make AMD look good./s


----------



## Jinxed (Nov 18, 2020)

lynx29 said:


> this game was heavily optrimized for next gen consoles as a launch title. its a sign of future AAA games as well, console comes first.    AMD will continue to show improved numbers in future next gen titles versus nvidia


The same argument has been appearing ever since the first AMD-powered consoles, for what, close to a decade now? And it never happened.


----------



## Steevo (Nov 18, 2020)

Foobario said:


> Sure, if you pull out some obscure game like that you can make AMD look good./s




But that game is obscure, and no one wants 60FPS on Ultra settings, they want 20 FPS at 4K with ultraish settings!!!


----------



## bencrutz (Nov 18, 2020)

Jinxed said:


> Actuall you don't. AMD in fact HAD to go for a feature like Infinity Cache. It was not an option, it was a requirement. It's because of their raytracing solution. If you look at the RDNA2 presentation slides, there is a clear line that says:
> "4 texture OR ray ops/clock per CU"
> 
> 
> ...



it's *4 Box or 1 Triangle Intersection per cycle*


----------



## Lindatje (Nov 18, 2020)

Gamers need this GPU; here are the reviews.









AMD Radeon RX 6800 XT Review - www.techspot.com

The Radeon RX 6800 XT is AMD's new high-end gaming graphics card targeting the GeForce RTX 3080. We've had the card in our labs for a while,...
				




Better than the RTX 3080, with lower power consumption, and it's cheaper.
At 1080p it's even better than the RTX *3090*.

Good job AMD for *beating* NVIDIA. 

(Some NVIDIA fans are not happy, lol.)


----------



## Jinxed (Nov 18, 2020)

Steevo said:


> Some of us play games other than minecraft.
> 
> If your specialized RTRT version of minecraft brickwork needs a green card to make the bricks ray traced, do it, buy why thread crap about the specific title with a specific add on in specific scenarios when 90%+ of others don't care? I am not saying don't comment, but why trash talk about such a specific scenario? How fast does your 3080 render this webpage? Fast enough to ignore other overwhelming evidence that AMD has made cards that put Nvidia to shame for power consumption and frame times? Fast enough to overlook that the competition is only going to help consumers?


Then why do it for Assassin's Creed Valhalla? Do you think more people are playing that than Minecraft?


----------



## Space Lynx (Nov 18, 2020)

RainingTacco said:


> Lol for most of AMD history since polaris, vega and fury AMD users were UNDERVOLTING gpus and not OCing them xD. Because they ran like shit -noisy and hot.




times change I guess?  neat



Jinxed said:


> Then why do it for Assassin's Creed Valhalla? Do you think more people are playing that than Minecraft?




next gen console optimization is a real thing and it will continue to favor AMD in AAA ports in the future.


----------



## W1zzard (Nov 18, 2020)

papajo_r said:


> I think you should rerun the entire benchmark with a different Windows installation/drivers and maybe hardware because something is fishy in your tests..
> 
> E.g. in ACO the 6800 is FASTER than the 6800 XT @1080p, while at very low FPS in general, while in other benchmarks of the same game the 6800 XT was faster even compared to the 3090...
> 
> and these measurements are very low and affect the "relative performance" average you give and many people will refer to in the near future, and I find this misleading.


They are equal. The differences you are seeing are random variance between test runs. AMD's driver is more CPU limited than the NVIDIA driver. Note how the FPS are identical (within random margin) at 1080p and 1440p.


----------



## RainingTacco (Nov 18, 2020)

AMD has always had higher CPU API overhead, nothing new. It isn't a problem in DX12 titles though.


----------



## Jinxed (Nov 18, 2020)

Vya Domus said:


> A simple way to completely disprove what you claim is to look at the SoCs used in consoles: the last level cache that you desperately try to prove AMD added to make RT viable is nowhere to be found on those, which means that it clearly wasn't its purpose. Your speculation is plain and simple wrong.
> 
> You must be out of your mind to believe for a second that AMD just dedicated 1/5 of the die to improve RT performance which ended up being inferior to Nvidia's anyway. There are other flaws with their implementation that need to be addressed long before bandwidth became an issue.
> 
> ...


Oh really? Microsoft clearly stated that the Xbox Series X has all the RDNA2 features, unlike the PS5. The fact that they did not mention details like Infinity Cache and others does not mean they are not there. Some of the features AMD obviously intended to keep for their RX 6000 presentation. And even if there were no Infinity Cache, how do you know it was not harmful to raytracing performance, since we do not have raytracing performance results for consoles? Given that there are pretty much no raytracing titles revealed for consoles, I'd assume it is going to perform terribly.


----------



## Foobario (Nov 18, 2020)

Jinxed said:


> The same argument was appearing ever since first AMD-powered consoles appeared, for what - close to a decade now? And it never happened.


Do you think the fact that the prior gen consoles were using a three-year-old (now ten-year-old) AMD architecture designed for mobile apps might have held back the synergies?

Now PC is running Zen plus RDNA2 just like the consoles do.  We should see how things work out over the next couple of years.


----------



## Steevo (Nov 18, 2020)

Jinxed said:


> Then why do it for Assasin's Creed Valhalla? Do you think more people are playing that then Minecraft?




I think there are more people playing it than care to play a raytraced version of Minecraft..... Thus my comment that 90%+ of users don't care about that specific scenario; instead they want to know how it handles 1080p or 4K popular games, and of those, a decent number of people with this hardware will care about how Infinity Cache handles texture upgrade packs for games like Skyrim, GTA, RDR2, and Valhalla. 

It is good to know that if I wanted to have raytracing in my Minecraft a green card would be the way to go, but I don't think most will care, or if they are that tech savvy perhaps they will figure out a way to decrease the number of rays calculated in a brick world.


----------



## Jinxed (Nov 18, 2020)

bencrutz said:


> it's *4 Box or 1 Triangle Intersection per cycle*


How does that contradict my statement? It simply means the "ray ops" are either 4 box intersections or 1 triangle intersection. It does not change the fact that RDNA2 cannot do both texture and raytracing ops on the same CU in the same cycle.


----------



## Lindatje (Nov 18, 2020)

@Jinxed 
And you post a heavy NVIDIA sponsored title .....


----------



## Vya Domus (Nov 18, 2020)

Jinxed said:


> Oh really? Microsoft clearly stated that Xbox Series X has all the RDNA2 features, unlike PS5. The fact that they did not mention details like Infinity Cache and others does not mean they are not there. Some of the features AMD obviously intended to keep for their RX6000 presentation. And even if there was no Infinity Cache, how do you know it was not harmful to raytracing performance, since we do not have raytracing performance results for consoles? Given that there's pretty much no raytracing titles revealed for consoles, I'd assume it is going to perform terribly



That cache is not an RDNA2 "feature" it's just a cache. 

You should get into the habit of reading upon the available information before you come up with these bewildering speculations.

Here's a die shot of the SoC inside the new Xbox, from those very same slides:





Where's that last level cache that Navi 21 has? Just point to it.


----------



## TheLostSwede (Nov 18, 2020)

Jinxed said:


> Techspot is part of HardwareUnboxed (or the other way around). Of course AMD fanboys would try to post results from the biggest AMD fanboy site on the internet if all the other results do not match their expectations. I for one come to techpowerup to get some decent reviews, not to sites like techspot, where the reviewer gets blood in their eyes just from mentioning Nvidia.


You actually seem to mostly come here to argue with people from what I see.

Also, it's more than, not more then.


----------



## Jinxed (Nov 18, 2020)

Foobario said:


> Do you think the fact that the prior gen consoles were using a three year old (now ten year old) AMD architecture designed for mobile apps might have held back the synergies?
> 
> Now PC is running Zen plus RDNA2 just like the consoles do.  We should see how things work out over the next couple of years.


And before that it was GCN and Jaguar cores. Same story a few years forward. Nothing really changed.


----------



## z1n0x (Nov 18, 2020)

Some are a bit too emotional, like they have something at stake.
I'm not even sure what you are arguing about anymore; fact of the matter is, both GPU vendors will sell anything they make for months because demand outstrips supply.


----------



## turbogear (Nov 18, 2020)

Thanks for the review TPU.
Interesting card that I would like to buy. 

Unfortunately, the supplies were almost non-existent.
In Germany they were gone in 10 minutes.
I placed an order 2 minutes after launch at 15:02 and then called Mindfactory to check if I would get one.
They told me unfortunately not.
While I was paying with PayPal the cards were gone. 

The guy on the phone told me that they had very low quantities, and he said he does not expect to get more very soon. So he said most probably delivery will be at least one month later.
He said it is a similar situation to the RTX 3080.
So I asked him to cancel my order, and I will wait until some time in the future when enough cards are available. 

I checked a number of online stores. They were gone in a few minutes everywhere.


----------



## Jinxed (Nov 18, 2020)

Vya Domus said:


> That cache is not an RDNA2 "feature" it's just a cache.
> 
> You should get into the habit of reading upon the available information before you come up with these bewildering speculations.
> 
> ...


It's an SoC chip schematic. It does not detail the features of the GPU in the picture, rather the parts of the whole SoC. You can see the CPU cores, the memory controllers, IO, interconnects. You are trying to pass off the fact that the picture does not show the GPU's internal parts as proof that those internal parts don't exist. Nice try, honestly, but no.


----------



## Mastakony (Nov 18, 2020)

If you only play the 10 games with RTX (10 good games if possible) OR ONLY PLAY 4K, you can grab a 3080 (if you also have DLSS LOL)
But seriously, it's a complete WIN for 95% of games and majority of resolutions (1080 1440)!!!!
Better perf
Less noise
Less power
Cheaper
AND A REAL OC available!!!!

CONGRATS AMD

PS : If you already have a ryzen 3000............., you know the deal


----------



## TheoneandonlyMrK (Nov 18, 2020)

Jinxed said:


> And before that it was GCN and Jaguar cores. Same story a few years forward. Nothing really changed.


Except what you are saying is rubbish; the fact the old consoles had 8 weak cores kept my FX-8350 in the game instead of below minimum specs.
And Polaris still plays ANY game, as does Tahiti, at playable 1080p levels.
I had Polaris for ages.
And you can do work on the shader array while the ray accelerator does its work, which can include work for rays; since the CPU can have access to GPU memory, it's possible it could also put in work to assist. God knows how/if that could work out.


----------



## Fouquin (Nov 18, 2020)

Jinxed said:


> Steam Catalog size 23000+



Oh no, the tens of thousands of anime/furry adult visual novels puked up in RPG maker aren't optimized for AMD! What ever will they do?! They've only got the AAA release games built for them, how terrible!


----------



## ratirt (Nov 18, 2020)

@W1zzard Awesome review  Thanks .

I find this so damn unbelievable. People look at the same graphs, the same results, and have totally different opinions about the RX 6000 series. How is it possible that you get "awesome" vs. "total crap"? That's even more amazing than the RX 6000 series cards themselves, and these are really good. Just think what AMD has done in the one year since the 5000 series GPUs' release. One year, and you have this.
I can't wait for next year when RDNA3 will be released or teased or whatever. If AMD keeps up this attitude, we are in for a treat.
RT lacks a bit, that's for sure. I think we need to give it more time to get there. NV has had 2 years to get it done. Let's see how much time AMD will need to get where NV is now.

BTW. I've watched reviews on YT from different reviewers and the scores for the 6000 series vary quite a bit across all resolutions even with the same games tested. That's a bit weird.


----------



## turbogear (Nov 18, 2020)

Just for fun I was checking on eBay; there were a few people selling A4-size photos of 6800 XT cards.
There were enough idiots who actually fell for those and were bidding on them without fully reading the title, which says only "Photo".

There were some bids standing between 500€ and 600€.
These guys will have a lot of fun when they receive a photo of a 6800 XT in the post. Just incredible what people do without reading.


----------



## Vya Domus (Nov 18, 2020)

Jinxed said:


> It's a SOC chip schema.



Stop denying reality, it's an actual die representation, not a schema.

Last-level caches are very obvious and stand out, as they usually sit somewhere around or inside a cluster and are symmetrical and very large.

Like it's highlighted in this render:


If it was there you'd see it from a mile away; accept the factual data and stop trolling. And Microsoft would have talked about it in their presentation; the idea that they were somehow hiding it is complete fiction. They described the RT implementation down to the smallest details, which was arguably much more important, but they'd leave this out? Yeah, right.


----------



## Charcharo (Nov 18, 2020)

regs said:


> 10 GB does affect DOOM Eternal performance a bit, but in 8K. In 4K 8 GB is more than sufficient for upcoming years.
> 
> 
> 
> ...












I literally have 8 GB cards and this game. Your source is wrong or incompetent. 8GB does indeed have issues in the game at 4K.



birdie said:


> More VRAM is always more expensive. Specially for NVIDIA which uses exclusive to it GDDR6X.
> 
> FTFY.
> 
> And these two games are outliers and I presume could have been fixed if they had been released recently. Lastly good luck finding any visual differences between Uber and Ultra textures on your 4K monitor.



DOOM Eternal was released this year and is the most optimized game ever made, or so it seems. Wolfenstein 2 was praised for its optimization too.

Nothing to fix. They just need more VRAM.


----------



## Jinxed (Nov 18, 2020)

TheLostSwede said:


> You actually seem to mostly come here to argue with people from what I see.
> 
> Also, it's more than, not more then.


Well, if you call debunking arguments like "most games are made for consoles" with actual numbers "arguing", then there's really no way to react to that. And for the record, I do believe Big Navi is a decent achievement. But unlike some other people here, I do not lose sight of the fact that a big part of that comes simply from the process advantage AMD has, and there are a lot of misses: lack of reasonable raytracing performance, lack of anything similar to DLSS, lack of any hardware that would support DLSS, meaning that current RX 6000 series cards will not get anything producing similar quality to DLSS (maybe RDNA3 will), and so on.


----------



## birdie (Nov 18, 2020)

Charcharo said:


> DOOM Eternal was released this year and is the most optimized game ever made, or so it seems. Wolfenstein 2 was praised for its optimization too.
> 
> Nothing to fix. They just need more VRAM.



Please educate yourself about game development a tiny bit. It's perfectly possible to make both games require a _lot_ less VRAM - it's just that their developers went overboard when they had access to the 2080 Ti.


----------



## RainingTacco (Nov 18, 2020)

Anyone who is playing at 8K doesn't give a damn about some minute price difference between GPUs. They just buy the best of the best, i.e. RTX 3090 SLI, right now.


----------



## MikeGR7 (Nov 18, 2020)

Steevo said:


> Their inability to catch up? I wish I could see the world through your special eyes. At stock its about 2-3% slower than a card $50 more, and it uses 100W less, has 10% overclocking ability out of the box compares to 3% from team green, and with other tweaks it ends up as fast or slightly faster, plus much better frame times. But I guess you can't see that, cognitive dissonance is real .
> 
> No one is forcing anyone to buy it, and everything is worth what someone is willing to pay. Do you think AMD gets all the money from the sale of these cards? If you do perhaps try reading up on supply and demand economics.



Well, with all due respect, it is you who seems to have the "special" eyes, because the numbers you mention are "modified" in AMD's favor.

It's mostly a 5% difference, and the power consumption difference is closer to 50 watts.

Also, the "other tweaks" part is a lot better for the green team nowadays... BIOS flashing on the 3000 series is easy as pie and brings great performance jumps with it, especially combined with watercooling.

Not to mention the software side of things with DLSS, raytracing, etc.

If you think 5% is insignificant, go tell that to high-end NVMe drives lol


----------



## Charcharo (Nov 18, 2020)

W1zzard said:


> Yeah no doubt, you can always make games run bad by changing settings or replacing textures. These cases are very very edge case, maybe 1000 gamers out of 100 million? (making up random numbers).
> 
> More = good, but more = $, so more != good



The problem with being part of the actual PCMR elite is that... well, most other people are quite a bit under that level. It is what it is. But we exist and are loud. You have tests with games no one plays, which is fair; it isn't your place to argue against the games at all, the logic is sound. This is a similar situation, except I am 100% certain there are more modders than Shadow of the Tomb Raider gamers. A minority, but still bigger than that minority.

I strongly hold that your comment there is wrong. And we even see the 1060 3GB lose hard at times at 1080p due to its VRAM buffer.


----------



## Colddecked (Nov 18, 2020)

techguymaxc said:


> Have you used DLSS?  I have owned a 2080 Ti since launch and having experienced it first hand, can say that I am very thankful for the option.  If you care to see for yourself, here is one of many videos on the subject:



I don't think anyone says anything bad about the actual implementation of DLSS, which is why it makes a lot of sense for AMD/MS to develop their own DLSS-ish option. It's just that it takes some work by devs to implement, so it's not a feature you can count on yet, especially if you play competitive multiplayer type games.


----------



## Charcharo (Nov 18, 2020)

birdie said:


> Please educate yourself about game development a tiny bit. It's perfectly possible to make both games require a _lot_ less VRAM - it's just their developers went overboard when they had access to the 2080 Ti.



Of course you can make something use less VRAM. You just use inferior (smaller) textures.

You can also make the geometry bottleneck smaller if you use lower-quality models and lesser tessellation settings, and you can lessen the impact on the L2 caches (for Ampere) if the game's very design compresses easily using earlier DCC parameters from Pascal.

But why lower the settings? Isn't Ultra supposed to be decadent? If anything, we need games to have more demanding Ultra settings like we had in 2004 or 2007. A return to games being completely unplayable on Ultra settings like in the past would be so welcome to real tech enthusiasts. For the rest there is the "High" preset.


----------



## Colddecked (Nov 18, 2020)

RedelZaVedno said:


> 4K performance in Microsoft Flight Simulator is extremely disappointing  I was hoping RDNA2's higher frequency and 16gb or ram to make a difference, but game obviously prefers more shaders and faster memory. Well, 3080 for me after all I guess.
> View attachment 176144


WAIT FOR TI if you can!


----------



## Charcharo (Nov 18, 2020)

Colddecked said:


> WAIT FOR TI if you can!


The 3080 Ti will for sure be slightly slower than the 3090. Less bandwidth. So the dude, if he really cares for MSFS so much... well that is his choice.


----------



## birdie (Nov 18, 2020)

Charcharo said:


> Of course you can make something less VRAM. You just use inferior (smaller) textures.



And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.
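The texture-streaming idea mentioned here can be sketched as a toy streamer: under a fixed VRAM budget, a request falls back to coarser mip levels until it fits, and least-recently-used textures are evicted to make room. Everything below (class and texture names, the budget) is illustrative, not from any real engine:

```python
from collections import OrderedDict

def mip_size_mb(base_mb: float, mip: int) -> float:
    """Each successive mip level is a quarter the size of the previous one."""
    return base_mb / (4 ** mip)

class TextureStreamer:
    """Toy LRU streamer: keeps resident textures within a VRAM budget."""

    def __init__(self, budget_mb: float):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # name -> size_mb, in LRU order

    def request(self, name: str, base_mb: float, mip: int) -> int:
        # Drop to coarser mips until the request alone fits the budget.
        while mip_size_mb(base_mb, mip) > self.budget_mb:
            mip += 1
        size = mip_size_mb(base_mb, mip)
        # Evict least-recently-used textures until the new one fits.
        while sum(self.resident.values()) + size > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[name] = size
        self.resident.move_to_end(name)
        return mip  # the mip level actually streamed in

streamer = TextureStreamer(budget_mb=512)
print(streamer.request("rock_albedo", base_mb=256, mip=0))   # fits at full res -> 0
print(streamer.request("wall_albedo", base_mb=4096, mip=0))  # forced down to mip 2 -> 2
```

The point of the sketch: a smaller VRAM pool doesn't force smaller source textures, it just forces the streamer to hold coarser mips resident more often.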

Speaking of the RT performance in Watch dogs: Legion for AMD cards:







I'm not sure they are comparable yet. Something is definitely missing.


----------



## MxPhenom 216 (Nov 18, 2020)

Lindatje said:


> Gamers need this GPU, here the reviews.
> 
> 
> 
> ...



Imagine buying any of these cards for 1080p....


----------



## Charcharo (Nov 18, 2020)

birdie said:


> And this is patently false. At the very minimum you can use textures streaming or make levels smaller. There are other methods as well.



id Tech 7 already does that. The only optimization it is missing (with regard to textures) is sampler feedback, or to be precise, a Vulkan equivalent of it.


----------



## Space Lynx (Nov 18, 2020)

MxPhenom 216 said:


> Imagine buying any of these cards for 1080p....




So what if some people do? If they have the intention of upgrading their monitor later on, and most do, most can't buy everything at once.

That being said, some people do enjoy 240 Hz 1080p maxed out in AAA games, and even the RTX 3090 can't manage that yet for games like Valhalla, etc. I don't care about frames that high personally... 144-165 is the sweet spot... and 60 at 4K.


----------



## birdie (Nov 18, 2020)

MxPhenom 216 said:


> Imagine buying any of these cards for 1080p....



But but but ... 800fps in CSGO!


----------



## Makaveli (Nov 18, 2020)

R0H1T said:


> You think RDNA2 based cards will not improve their performance over time? I know the future potential uplift dank memes about AMD, GCN on consoles but you really think the massive gains for *games* we saw on *zen3* chips was just a coincidence?



When it comes to longevity of cards, history shows AMD has NV beat. There are numerous examples of this in the past. NV drops driver support for previous-gen cards much faster than AMD does.


----------



## MxPhenom 216 (Nov 18, 2020)

birdie said:


> But but but ... 800fps in CSGO!



And a fat CPU bottleneck for all these cards at 1080p


----------



## Charcharo (Nov 18, 2020)

lynx29 said:


> So what if some people do? If they have the intention of upgrading later on their monitor, most do, most can't buy everything at once.
> 
> That being said, some people do enjoy 240hz 1080p maxed out in AAA games, and even the rtx 3090 can't manage to do that yet for games like Valhalla, etc.  i don't care about that high of frames personally... 144-165 is the sweet spot... and 60 at 4k.



Once you witness 4K 100-120 FPS on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI 2.1) a 5700 XT with such a TV/screen at medium settings is a noticeably superior experience to an RTX 3080/6800 XT at Ultra settings on an inferior display.


----------



## Makaveli (Nov 18, 2020)

Jinxed said:


> Then you really have to wonder what AMD is gonna do when Nvidia moves to 5nm TSMC. I don't think AMD is looking forward to that.



You mean the 5nm TSMC node that Apple is using for the M1 processor? Guess who is going to buy up all the fab capacity for that. Both AMD and NV will be fighting for scraps there. And why do you assume AMD is going to sit on 7nm on the GPU side while NV goes to 5nm?

You do know there is an RDNA 3 already on the roadmap? And guess what node it's going to be using?


----------



## Colddecked (Nov 18, 2020)

RedelZaVedno said:


> Hardware Unboxed and Gamers Nexus came to similar conclusion as TechPowerUp (1080p being the only real difference):
> - RNDA2 has better performance per watt
> - Offers more Vram
> - 6800XT is on pair with 3080 in standard rasterization (better at 1080p, roughly on pair at 1440p, worse at 4K),
> ...



It's sad, but they just spent 30 billion acquiring a company; they quite literally can't afford to give too much value, especially considering supply. Let's hope they'll throw something special at the mainstream segment when they reveal Navi 22...


----------



## Space Lynx (Nov 18, 2020)

Charcharo said:


> Once you witness 4K 100-120 fps on an OLED HDR TV, you will never again want to go back to LCDs. Ever.
> This is a bigger upgrade than upgrading a CPU or GPU. A far bigger one. I would honestly say that (were it to have HDMI2.1) a 5700 XT with such a TV/screen and medium settings is a noticeably superior experience  than a RTX 3080/6800XT at Ultra setting on an inferior display.



I will never be that rich, so more power to you lol


----------



## RainingTacco (Nov 18, 2020)

lynx29 said:


> So what if some people do? If they have the intention of upgrading later on their monitor, most do, most can't buy everything at once.
> 
> That being said, some people do enjoy 240hz 1080p maxed out in AAA games, and even the rtx 3090 can't manage to do that yet for games like Valhalla, etc.  i don't care about that high of frames personally... 144-165 is the sweet spot... and 60 at 4k.



I find pushing AAA games to higher frame rates like 144 Hz futile, because you often get into frame pacing (stutter) issues on these mostly unoptimized, console-port type games. They are first and foremost optimized for 30 FPS, then for 60 FPS, and anything higher is just luxury. But you are right, some people enjoy higher frames at lower resolutions.


----------



## Jinxed (Nov 18, 2020)

birdie said:


> Speaking of the RT performance in Watch dogs: Legion for AMD cards:
> 
> 
> 
> ...


This is exactly why I asked for a review of the actual raytracing output on both Nvidia and AMD cards.


----------



## Jism (Nov 18, 2020)

The 6800 XT's average gaming power consumption is around 218 W, vs. 300 W and above for the 3080. GG AMD!


----------



## RainingTacco (Nov 18, 2020)

Jinxed said:


> This is exactly why I asked for a review of the actual raytracing output on both Nvidia and AMD cards.



Ha, polish website purepc also noticed this.


----------



## Steevo (Nov 18, 2020)

MikeGR7 said:


> Well with all due respect, it is you who seem to have the "special" eyes because the numbers you mention are "modified" towards AMDs favor.
> 
> It's mostly 5% difference and the power consumption difference is closer to 50 watts.
> 
> ...



210 W vs. 303 W is a 93 W difference.



https://tpucdn.com/review/amd-radeon-rx-6800-xt/images/power-gaming-average.png

The 3080 is 6% faster, but the 6800 XT can overclock 10% out of the box vs. 4% for the 3080. So a net equal.

Raytracing is 10 games currently. If those 10 games are a make-or-break deal, go for the green, but for 90% of gamers we know raytracing will be like tessellation: it will take a few generations to implement and match up performance and quality, and by that time these cards will be obsolete and the difference will be 15 FPS at 4K in those new games vs. 21 FPS.

DLSS I'm not sold on; it takes special profiles from Nvidia. Do they have support for every game? Does it matter if AMD does the same?

And about overclocking and flashing BIOS: if 10% out of the box at a mere 1 V is any indication for AMD on the node they are on, watercooling and 1.2-1.3 V should give a 6800 XT 25% more clock speed, meaning it would be 25% faster than the 3080 while still being cheaper. So still the better buy for 90% of gamers, games, and those who want to play the silicon lottery and tweak. Samsung's node is crap and a poor choice by Nvidia just to save a few $$$.
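The "net equal" claim above is just compounded percentages; a quick sketch using the post's own figures (6% stock gap, 10% vs. 4% overclock headroom; these are the thread's numbers, not measurements):

```python
# Relative performance, 6800 XT stock = 1.00
xt_stock, n3080_stock = 1.00, 1.06   # 3080 ~6% faster at stock
xt_oc = xt_stock * 1.10              # 10% overclock headroom claimed for 6800 XT
n3080_oc = n3080_stock * 1.04        # 4% overclock headroom claimed for 3080

gap = (n3080_oc / xt_oc - 1) * 100   # remaining gap after both overclock
print(f"overclocked gap: {gap:+.1f}%")  # -> overclocked gap: +0.2%
```

With those inputs the gap shrinks from 6% to about 0.2%, which is within run-to-run noise, hence "net equal".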


----------



## birdie (Nov 18, 2020)

Makaveli said:


> When it comes to longevity of cards history shows AMD has NV beat. There are numerous example of this in the past. NV drops driver support in previous gen cards much faster than AMD does.



This myth has been debunked by many reputable websites, including TPU. NVIDIA does *not* drop driver support for previous-gen cards [faster]. If anything, they support their cards a lot longer than AMD. For instance, NVIDIA still fully supports Kepler-generation cards, which were released 8 years ago.

NVIDIA however *stops tweaking drivers* for previous-generation cards because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to the level where their performance is enough to run new heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.


----------



## Jinxed (Nov 18, 2020)

Steevo said:


> DLSS, I'm not sold on, it takes special profiles from nvidia, do they have support for every game? Does it matter if AMD does the same?


That is no longer the case (it was in DLSS 1.0). Since DLSS 2.0 there are no per-game profiles anymore. No game-specific training is required. The experience is now streamlined meaning that game engines like Unreal Engine and Unity can support it out of the box. That in turn means a lot more adoption for DLSS going forward.


----------



## RainingTacco (Nov 18, 2020)

birdie said:


> This myth has been debunked by many reputable websites, including TPU. NVIDIA does *not* drop driver support for previous-gen cards [faster]. If anything, they support their cards a lot longer than AMD. For instance, NVIDIA still fully supports Kepler-generation cards, which were released 8 years ago.
> 
> NVIDIA however *stops tweaking drivers* for previous-generation cards because it's just not worth it from a financial standpoint - the performance is not there anyway: any extracted gains would not bring older cards to the level where their performance is enough to run new heavy games. Imagine optimizing drivers for the GTX 680. Why would you do that? The card is absolutely insufficient for modern games.



Adding to that, NVIDIA usually offers almost full performance upfront, while AMD hones their drivers over a longer time, thus making it appear as if AMD's performance improves as their GPUs age.


----------



## Charcharo (Nov 18, 2020)

RainingTacco said:


> Adding to that, NVIDIA usually offer almost full performance upfront, and AMD just hone their drivers over longer time, thus making it appear as AMD performance improves as their gpu age.



This isn't fully fair either. Nvidia do tweak performance over time too. They are engineers, not posthuman genetically engineered entities with future-seeing capabilities. They still need time.

As for AMD - they have a smaller team, so things like long-term performance do need more work, for sure. This is where the Fine Wine thing came from.
It is good though. As long as AMD prices products on their performance at release, it is A-OK to have drivers improve them further. It means you paid a fair price once and get better performance long term. That is my thought process, and I used it for Turing too (which really did improve nicely over time).


----------



## Tomgang (Nov 18, 2020)

I'm not sure where I stand right now, frankly. AMD's cards really have been a good thing, for sure. Nvidia now has serious competition.

But here comes my concern: while I have made the choice to move to Zen 3, I'm not so sure about Big Navi yet. First of all, I have only had Nvidia since, like, ever. I have never owned an AMD card in my entire life.

Big Navi offers more VRAM and is a little bit cheaper. That is great. But will the drivers be the Radeon 5000 series all over again, with bugs and problems that were still an issue long after launch? The last thing I want is a driver with issues and problems. So far, AMD will have to convince me away from Nvidia; they will have to show that not only the hardware is good, but also the software side. Also, it can clearly be seen that ray tracing on Big Navi is in its infancy and not on par with Nvidia just yet.

So I guess my next GPU choice will be between the 6900 XT and the rumored RTX 3080 Ti, depending on performance, driver experience, and optimization. Nvidia also has some other features like streaming and AI software. In short, I am not totally convinced to go Big Navi yet. Drivers first of all have to be a good experience; bugs and errors will only piss me off. Then ray tracing will have to mature as well. I might end up going Nvidia again, so AMD, convince me to go Radeon.


----------



## Vya Domus (Nov 18, 2020)

RainingTacco said:


> Adding to that, NVIDIA usually offer almost full performance upfront, and AMD just hone their drivers over longer time, thus making it appear as AMD performance improves as their gpu age.



The dilemma of the Nvidia fanboy.

It used to be that they denied the existence of "Fine Wine", now apparently their drivers do improve performance over time. Man this is so strange.


----------



## jmeistr (Nov 18, 2020)

It's refreshing to see amd in the high end segment once again.


----------



## Makaveli (Nov 18, 2020)

Jinxed said:


> You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet - Hardware Unboxed? Oh come on. You can't trust their results at all. Their results have been constantly, significantly tilted in favor of AMD compared to the rest of the internet for their entire existence.



Trust their results more than your post.

You are in an RX 6000 review thread, pro-Nvidia, and don't intend on buying the hardware.

That applies to the other NV fanboys in this thread: why are you here, other than thread crapping?



Jinxed said:


> AMD-sponsored games are a small percentage of all the games coming out. So recommending Nvidia is actually very reliable for GPU longevity.



Both consoles are RDNA 2, which means you will see more games supporting it from the ground up.


----------



## RainingTacco (Nov 18, 2020)

Charcharo said:


> This isnt fully fair either. Nvidia do tweak performance over time too. They are engineers, not posthuman genetically engineered entities with future-seeing capabilities. They still need time.
> 
> As for AMD - they have a smaller team so things like their performance long term... it does need more work for sure. This is where the Fine Wine thing came from.
> It is good though. As long as AMD prices products on performance during the product's release, it is A-OK to have drivers improve it further. It means you paid fair for it once and get better performance long term. That is my thought process and I used it for Turing too (which really did improve over time nice).



The AMD driver team also has a different philosophy - there are different teams working on different subsequent drivers, or even on different matters. I don't know how good the communication and management is there, but I find it weird. One team can squash a bug in one driver revision while the other did not, and it resurfaces. Why use alternating teams for driver releases? Why not just one team with different departments - one going after these bugs, one after those?


----------



## illli (Nov 18, 2020)

Anyone else find it ironically hypocritical that a person shitposting all over AMD is calling other people fanboys, while at the same time being a stalwart defender of everything Intel and Nvidia?


----------



## RainingTacco (Nov 18, 2020)

Vya Domus said:


> The dilemma of the Nvidia fanboy.
> 
> It used to be that they denied the existence of "Fine Wine", now apparently their drivers do improve performance over time. Man this is so strange.



No, AMD just uncovers performance that is there. They don't magically create additional performance. That's also why AMD usually had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars, both with 120 HP, but two different drivers. One is quick to learn, so he basically uses the car's full potential from the start; the other is a slow learner and delivers the same performance only after a longer time.


----------



## TheoneandonlyMrK (Nov 18, 2020)

RainingTacco said:


> No AMD just uncover performance that is there. They don't magically create additional performance. That's why also usually AMD had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars both have 120HP, but two different drivers. One is quick to learn so he basically use car full potential from the start, the other is slow learner and deliver the same performance after longer time.


Comedy double-down; I wouldn't mention power draw. Some never learn!


----------



## squallheart (Nov 18, 2020)

Vya Domus said:


> I'll tell you why: because some insist a 50% performance hit is so much better than a 60% hit.
> 
> No, they're both crap. RT is still not ready for prime time.



I don't understand why people like you misrepresent facts to prove a point.









				




In the worst-case scenario, the 3080 drops 43%. Once you account for the improvements from DLSS, the penalty is nowhere near as bad as you suggest.
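The two positions are easy to reconcile numerically: an RT penalty and an upscaling speedup compound multiplicatively. The DLSS gain below is a hypothetical placeholder, not a measured figure:

```python
def effective_fps(base_fps: float, rt_hit: float, upscale_gain: float = 0.0) -> float:
    """FPS after an RT penalty and an optional upscaling speedup (both as fractions)."""
    return base_fps * (1 - rt_hit) * (1 + upscale_gain)

base = 100.0
print(effective_fps(base, rt_hit=0.43))                    # RT alone: ~57 fps
print(effective_fps(base, rt_hit=0.43, upscale_gain=0.6))  # with a hypothetical +60% from DLSS: ~91 fps
```

So whether a 43% hit is "crap" or acceptable depends almost entirely on how much of it the upscaler claws back.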


----------



## Vya Domus (Nov 18, 2020)

squallheart said:


> I don't understand why people like you who misrepresent facts to prove a point.
> 
> 
> 
> ...



You are right sir, excuse me.

A 43% drop, now that sounds absolutely amazing compared to 50% or 60%. Losing almost half the performance is pretty good; what can I say, you have completely changed my mind, I'm sold.


----------



## squallheart (Nov 18, 2020)

Jinxed said:


> That's interesting  Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing their own raytracing implementation (even though inferior to Nvidia's)? Why do the consoles now support raytracing?



It's an ignorant take. Console launch titles are already implementing raytracing, and somehow it is not "ready for prime time"?

Most PC games will either be console ports or co-developed. I think raytracing performance is going to matter a lot in next-gen games.

There's also this little title called Cyberpunk which will utilize raytracing..


----------



## RainingTacco (Nov 18, 2020)

theoneandonlymrk said:


> Comedy double down, I wouldn't mention power draw, some never learn!?.



Did you literally take my "raw power" comment as higher wattage? Are you smoothbrained? What I wanted to say is that AMD GPUs often had better specs on paper, like more ROPs, SMs, higher core clocks, etc., but somehow were slower than or on par with Nvidia. Now do you understand?


----------



## Casecutter (Nov 18, 2020)

I see this as RTG's "Zen" moment. Is this XT "above" the Nvidia RTX 3080? ... No. That said, this is much more than just "nipping" at the heels; this is stride for stride. This is "competition", and what we always looked for! AMD/RTG can make a marked play on the number of sales and begin to rival Nvidia. At this point, RTG has found the momentum and only needs to focus hard and run their race.

Looking at the "supposed issues" with Nvidia and their bigger GA102 on Samsung 8nm, and whether those are behind their yield/supply problems: we can't say for certain, but if it continues, it doesn't bode well for Nvidia. While sure, RTG has their own struggles, I don't see this initial release as a long-term problem. AMD/RTG juggling their demands at TSMC has its own challenges, but direct supplier yields are probably not one of them. AMD/RTG has probably got the "Zen 3" CPU channel loaded, and for the last couple of weeks has been loading "Navi 21" parts for reference boards and for AIBs. I say RTG is set up well to grab the after-Christmas funds.

Either side... either card, if you can find one in your cart... lucky you!


----------



## RainingTacco (Nov 18, 2020)

I really hope the Samsung Nvidia node isn't as crappy as people make it out to be, because otherwise, if Nvidia moves to a smaller TSMC process, AMD will be spanked again. I hope that's not the case and that with RDNA3 they will be on par with Nvidia both performance- and feature-wise in all fields. People think I'm an Nvidia fanboy - I currently own both an AMD CPU and an AMD GPU. I'm just jaded after the 5700 gave me headaches for months. I don't approach AMD with rose-tinted glasses, and I had Nvidia GPUs before almost exclusively, bar an HD series GPU. I know both sides of the equation.


----------



## papajo_r (Nov 18, 2020)

W1zzard said:


> Fixed, thanks! GPU-Z has the wrong value, too



I seriously believe something went wrong with the benchmarks here, which don't seem to agree with the majority of the major YouTube channels that benched the GPU...

In some games even the 6800 takes the lead over the 6800 XT, while the general difference seems to be 2 FPS between the two, and on top of that the 3070 (a GPU that trades blows with the 2080 Ti) comes out on top of even the RX 6800 XT...

You should seriously check whether something is wrong with your test bench hardware-wise (e.g. is dual channel enabled? is the RAM more than 16GB when testing the 6800 XT? is the CPU a flagship one?) or with the drivers on Windows and on the card itself...


----------



## hurakura (Nov 18, 2020)

So this card is good, yes?


----------



## regs (Nov 18, 2020)

Charcharo said:


> I literally have 8 GB cards and this game. Your source is wrong or incompetent. 8GB does indeed have issues in the game at 4K.


Sure, you are so very competent. It scales pretty much proportionally. So your source is grossly incompetent.


----------



## TheoneandonlyMrK (Nov 18, 2020)

RainingTacco said:


> No AMD just uncover performance that is there. They don't magically create additional performance. That's why also usually AMD had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars both have 120HP, but two different drivers. One is quick to learn so he basically use car full potential from the start, the other is slow learner and deliver the same performance after longer time.


.=




RainingTacco said:


> Have you literally took my "raw power" comment as higher wattage? Are you smoothbrained? What i wanted to say that AMD GPUs had often better spec on paper like more ROPS, SM, higher core clock etc. but somehow were slower or on par with nvidia. Now you understand?


No, you're so wrong it's not right. In the right application AMD showed their performance, like Folding@home, mining, and Doom.
Nvidia usually had higher-sounding core counts, more ROPs, and higher boost clocks. Were you on another planet this last decade?


----------



## r9 (Nov 18, 2020)

RT-off performance is pretty good: with the higher OC headroom (8% vs. 4%) and the speed deficit being around 5% while $50 cheaper, it's right where it should be. But RT performance is terrible, and with no DLSS 2.0 equivalent, that's a big disadvantage. However, all those games have been optimized for Nvidia, so we'll have to see how future titles and future game updates do. Personally, I'm not convinced that 8 GB is a big disadvantage, as I haven't seen any proof; VRAM utilization doesn't prove anything, I'm talking FPS proof. That said, a 6800 with 8 GB VRAM for $480 or a 6800 XT for $550 would be a great value/performance product. Unfortunately it will never happen.
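As a rough sanity check of that value argument (launch MSRPs assumed: $649 for the 6800 XT vs. $699 for the RTX 3080; the ~5% raster deficit is the post's own figure, not a verified number), the performance-per-dollar math does land slightly in AMD's favor:

```python
# Back-of-the-envelope value comparison; all inputs are assumptions
# from the post above, not measured data.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance points delivered per dollar spent."""
    return relative_perf / price_usd

# RTX 3080 as the 100% baseline at its $699 MSRP;
# 6800 XT taken as ~5% slower at its $649 MSRP.
rtx3080 = perf_per_dollar(100.0, 699.0)
rx6800xt = perf_per_dollar(95.0, 649.0)

# Works out to a ~2% edge on raster value alone, RT and DLSS excluded.
advantage = (rx6800xt / rtx3080 - 1) * 100
print(f"6800 XT perf/$ advantage: {advantage:.1f}%")
```

Of course this ignores RT, DLSS, and street pricing, which is the whole debate in this thread.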


----------



## RainingTacco (Nov 18, 2020)

theoneandonlymrk said:


> .=
> 
> 
> 
> ...



Remember GTX 1060 vs. RX 480, where at launch the GTX 1060 was winning despite lower TFLOPS performance, and then, as GCN/Polaris matured, the performance improved and pulled level with the GTX 1060. That's what I'm talking about.


----------



## Charcharo (Nov 18, 2020)

regs said:


> Sure you are well competent. Scales pretty much proportional. So your source is grossly incompetent.




*I literally have the game.* IDK where W1zzard is testing, but any and all late-game levels perform worse on 8GB cards.

I personally test on Urdak since it's a heavy and awesome map (perhaps third best in the game).
My 2080 and 5700 XT both choke here. The otherwise inferior (at lower resolutions) 1080 Ti outperforms them. It loses if you use lower texture settings. It's therefore VRAM.

I'd love it if people actually owned the hardware and games before talking bull.


----------



## Deleted member 24505 (Nov 18, 2020)

Nice card. Scalpers 1, enthusiasts 0.


----------



## Akkedie (Nov 18, 2020)

Jinxed said:


> The experience is now streamlined meaning that game engines like Unreal Engine and Unity can support it out of the box. That in turn means a lot more adoption for DLSS going forward.



This isn't quite accurate. Yes, you can enable it as you develop and try it out but if you want to ship a game with it then you need Nvidia's explicit blessing. That's why even though it's easy to implement you don't see it get widespread adoption. It's the same for every other RTX/Gameworks feature you see out there.


----------



## Chomiq (Nov 18, 2020)

birdie said:


> And this is patently false. At the very minimum you can use textures streaming or make levels smaller. There are other methods as well.
> 
> Speaking of the RT performance in Watch dogs: Legion for AMD cards:
> 
> ...











AMD Radeon RX 6800 vs. GeForce RTX 3070 graphics card test | PurePC.pl — www.purepc.pl
(Page 18: which card is faster in PC games? Price, specification, and performance, all in one place.)

Better comparison.


----------



## RainingTacco (Nov 18, 2020)

Yeah, either something fishy is going on in Watch Dogs: Legion, or the difference between AMD RT and Nvidia RT is night and day. AMD probably decreased details to avoid completely hammering the frame rate. I wonder whether these quality differences affect other RT games. Have reviewers compared screenshots from both the AMD RT and Nvidia RT implementations under close scrutiny? Maybe there are differences like the ones in Watch Dogs: Legion in other games too?


----------



## nangu (Nov 18, 2020)

z1n0x said:


> Some are a bit too emotional, like they have something at stake.
> I'm not even sure what you are arguing about anymore; the fact of the matter is both GPU vendors will sell anything they make for months, because demand outstrips supply.



Defending their scalper-priced 3080 purchase, maybe?  

Saying DLSS made your nVidia purchase future proof is, I don't know, dumb? And I'm an RTX 2080 user. If I have to rely on DLSS/lower quality to play future games, I think it's time to upgrade. DLSS future proof my ####


----------



## crubino (Nov 18, 2020)

Woww... nice! Great job! Kudos to AMD! 

I hope that after reading this article, many people who pre-ordered an RTX 3000 card will change their minds, cancel their pre-orders, and turn to an AMD card. 
Since I'm 41st in the queue now, I hope that means I'll get my new RTX 3080 before this Christmas  ... LOL!!!



ps. _just kidding!_


----------



## RainingTacco (Nov 18, 2020)

nangu said:


> Defending their scalpers price 3080 purchase may be?
> 
> Saying DLSS made your nVidia purchase future proof is like, I don't know, Dumb? And I'm a RTX 2080 user. If I have to rely on DLSS/lower quality to play future games, I think it's time to upgrade. DLSS future proof my ####



The first RTX cards were early adopters' fool's gold. You basically bankrolled future development of these techs by overspending on quite mediocre implementations at the time. This is what marketing and naivety do to people.


----------



## Steevo (Nov 18, 2020)

birdie said:


> And this is patently false. At the very minimum you can use textures streaming or make levels smaller. There are other methods as well.
> 
> Speaking of the RT performance in Watch dogs: Legion for AMD cards:
> 
> ...


Note the loss of detail, not on the jacket or pants, but on a garbage bag? And the too-shiny canisters? Also, it looks like they need to adjust the color gamut on the Radeon card.


----------



## Casecutter (Nov 18, 2020)

RainingTacco said:


> I really hope that the samsung nvidia node isnt that crappy as people make it so, because otherwise if nvidia goes to lower tsmc process AMD will be again spanked. I hope that's not the case and with RDNA3 they will be on par with nvidia both performance and feature wise in all fields. People think im nvidia fanboy - i currently own both amd cpu and amd gpu. Im just jaded after 5700 got me headaches for months. I don't approach AMD with rose tinted glasses, and i had nvidia gpus before almost exclusively bar a HD series gpu. I know both sides of equation.


I don't believe Samsung's node is crappy, but seeing this today, it might turn out to be not as power efficient. It appears there isn't any huge improvement/difference between the 8 nm and 7 nm, other than, I'd guess, Nvidia seeing a better cost per chip, particularly if they receive too many partially defective dies. The rumored RTX 3070 Ti (Super) might be filling out more bins.


----------



## TheoneandonlyMrK (Nov 18, 2020)

RainingTacco said:


> First RTX cards were early adopters fools gold. You basically bankrolled future development of these techs by overspending on quite mediocre techniques at the time  This is what marketing and naivety do to people.


You sure generation two (3080) is better?! Can't say I am.


----------



## RainingTacco (Nov 18, 2020)

theoneandonlymrk said:


> You sure generation two (3080) is better?! Can't say I am.



The RTX 3000 series made quite a big improvement in RT performance despite using the same number of RT cores. Compare the 2080 Ti's count and the RTX 3080's (it's the same), then compare performance. Then you have half as many tensor cores with the same DLSS performance as the 2080 Ti. That also means there's a huge improvement.
The best comparison is RTX 3070 vs. RTX 2080 Ti: 46 vs. 68 RT cores and the same performance in RT games. Both GPUs have the same floating-point/rasterization performance without RT, so that doesn't influence the RT frame rates. That's roughly a 47% improvement in per-RT-core performance.
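The per-core figure above is straightforward to check (core counts are from the post; the equal-FPS premise is the poster's claim, not something verified here). The division actually gives closer to 48%:

```python
# Per-RT-core throughput ratio, assuming (as the post does) that the
# RTX 3070 and RTX 2080 Ti deliver the same FPS in RT-enabled games.
rt_cores_3070 = 46    # Ampere
rt_cores_2080ti = 68  # Turing

# Same total RT work from fewer cores means each Ampere RT core does
# (68 / 46)x the work of a Turing one.
per_core_gain = (rt_cores_2080ti / rt_cores_3070 - 1) * 100
print(f"Ampere per-RT-core gain: ~{per_core_gain:.0f}%")
```

Note this attributes the entire FPS parity to the RT cores, which ignores clock-speed and scheduling differences between the two architectures.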


----------



## Xuper (Nov 18, 2020)

The amount of pure hate toward the 6800 is quite insane. Just let the owners of NV cards cool off. Wow, this is crazy.


----------



## Steevo (Nov 18, 2020)

Xuper said:


> The amount of pure hate toward the 6800 is quite insane. Just let the owners of NV cards cool off. Wow, this is crazy.




It's like they are afraid the leather jacket isn't cool enough to save their hero, and/or epeen shrinkage... But good news: they can use 100 W more to keep it warm.


----------



## Durvelle27 (Nov 18, 2020)

I know now what my next GPU will be


----------



## Jinxed (Nov 18, 2020)

Akkedie said:


> This isn't quite accurate. Yes, you can enable it as you develop and try it out but if you want to ship a game with it then you need Nvidia's explicit blessing. That's why even though it's easy to implement you don't see it get widespread adoption. It's the same for every other RTX/Gameworks feature you see out there.


As in stick a GameWorks logo on it, as in registering in the Nvidia Developer program? Yes. Why would that be an issue? I'd argue that most game devs are registered for both Nvidia's and AMD's developer support programs anyway.



Chomiq said:


> Test kart graficznych AMD Radeon RX 6800 vs GeForce RTX 3070  | PurePC.pl
> 
> 
> Test kart graficznych AMD Radeon RX 6800 vs GeForce RTX 3070  (strona 18) Test kart graficznych AMD Radeon RX 6800 vs NVIDIA GeForce RTX 3070. Która jest szybsza w grach komputerowych? Cena, specyfikacja i wydajność - wszystko w jednym miejscu.
> ...


That's not good for the Radeons, at least in Watch Dogs. Thanks for that link. Wow. But that is exactly why I was asking for the RT quality comparison a few days back in another thread.

@W1zzard I know you are quite busy at the moment, but is there a chance for at least a few screenshot comparisons similar to this Polish site's?



nangu said:


> Saying DLSS made your nVidia purchase future proof is like, I don't know, Dumb? And I'm a RTX 2080 user. If I have to rely on DLSS/lower quality to play future games, I think it's time to upgrade. DLSS future proof my ####


Tbh I upgrade almost every gen as well, like you do. But on the other hand, the people I sell or give the older cards to are usually my friends, and they tend to keep their GPUs for a much longer period of time. I bought a GTX 750 Ti for one of them as a birthday gift and he still uses it every day. That's almost 7 years now. For people who do not upgrade every gen, but maybe every other gen or even more slowly, features like DLSS are a big deal, believe me.


----------



## Chomiq (Nov 18, 2020)

Jinxed said:


> As in stick a GameWorks logo on it, as in registering in Nvidia Developer program? Yes. Why would that be an issue? I'd argue that most game devs are registered for both Nvidia and AMD developer support programs anyway.
> 
> 
> That's not good for the Radeons, at least in Watch Dogs. Thanks for that link. Wow. But that is exactly why I was asking for the RT quality comparison a few days back in another thread.
> ...


It is worth pointing out that these issues were mentioned to reviewers by people from AMD, so it's not as if AMD isn't aware of it or is trying to ignore it. The people from PurePC were like, "OK, let's see what exactly they mean," and here we are.
My initial assumption is that the game is using the RT preset for PS5/XSX, with a much lower cutoff point for RT. Either that, or some elements are defaulting to cube maps instead.


----------



## chispy (Nov 18, 2020)

The 3DMark Fire Strike world record for a single graphics card has been broken on one of these AMD RX 6800 XTs at 2650/2000 MHz. Holy performance in benchmarks, Batman, lol!!!! Beating the RTX 3090 and RTX 3080.

https://hwbot.org/submission/4606724_lucky_n00b_3dmark___fire_strike_radeon_rx_6800_xt_47932_marks


----------



## Steevo (Nov 18, 2020)

chispy said:


> 3dmark firestrike world record for single graphics card has been broken on one of this amd rx 6800x at 2650/2000Mhz , Holy performance in benchmarks Batman lol !!!! beating rtx 3090 and rtx 3080.
> 
> https://hwbot.org/submission/4606724_lucky_n00b_3dmark___fire_strike_radeon_rx_6800_xt_47932_marks



I am really excited that AMD has gone back to giving us a chip that we can tweak and squeeze performance out of with cooling and more volts. We may also see the return of BIOS unlocks, depending on how they are handling shader counts.


----------



## spnidel (Nov 18, 2020)

chispy said:


> 3dmark firestrike world record for single graphics card has been broken on one of this amd rx 6800x at 2650/2000Mhz , Holy performance in benchmarks Batman lol !!!! beating rtx 3090 and rtx 3080.
> 
> https://hwbot.org/submission/4606724_lucky_n00b_3dmark___fire_strike_radeon_rx_6800_xt_47932_marks


holy shit lol


----------



## milewski1015 (Nov 18, 2020)

I don't understand all the hate. As a few people mentioned, I don't think many people expected AMD to be able to churn out a card this competitive with Nvidia's offerings at the high end. The fact that they're now able to offer a card in the same ballpark as Nvidia speaks volumes about their progress. You can nitpick all you want about DLSS and raytracing and this and that and the other thing, but the bottom line is that Nvidia isn't miles ahead anymore, which means competition is heating up, which is good for us consumers.

Did anybody actually expect AMD's first try at ray tracing to match Nvidia's performance after they had over a year to refine and improve it? Similar story in terms of raw rasterization performance: did anybody expect AMD to suddenly leapfrog Nvidia and take the performance crown after having been behind for years? Each team seemed to focus on different things. For Nvidia, that appears to have been overall performance as well as ray tracing and DLSS. AMD seemed to try to make the best of their superior process node and maximize performance per watt, thermals, power draw, etc. The fact that AMD has gotten as close to Nvidia as they have is a commendable achievement.

As with any other consumer product, which one you decide upon comes down to preference and that depends a lot on what you consider to be important to you. For those that don't give a shit about ray-tracing, Big Navi might look really appealing with its comparable performance, better thermals and lower noise at a slightly lower cost. For those that really want to crank the eye candy, Nvidia's refined RTX implementation and all the other fancy graphics tricks like DLSS and whatnot probably put Ampere ahead. Each comes with a tradeoff that can only be evaluated on a personal level, so spouting left and right that "ABC is better than XYZ" when that opinion is clearly influenced by personal preference (on what is important to have in a video card) and the desire to not feel like you've made (or will make) a bad purchasing decision is just adding fuel to the fire. Let people be happy they bought what they did or are excited about what they're excited about.

For a bunch of enthusiasts that are part of the "PC Master Race", everybody really does a good job of stooping down to the classic "Xbox vs. Playstation" console war levels. Just be happy that innovation will be driven by competition, be sad that availability is currently dogshit, be kind to each other, and be safe from COVID. 

PS: And I guess be jealous of @lynx29 for somehow being lucky enough to score both a new CPU and GPU this generation. Congrats dude!


----------



## Space Lynx (Nov 18, 2020)

milewski1015 said:


> I don't understand all the hate. As a few people mentioned, I don't think many people expected AMD to be able to churn out a card this competitive with Nvidia's offerings at the high-end. The fact that they're now able to offer a card in the same ballpark as Nvidia says leaps and bounds about their progress. You can nitpick all you want about DLSS and raytracing and this and that and the other thing, but the bottom line is that Nvidia isn't miles ahead anymore, which means competition is heating up, which is good for us consumers.
> 
> Did anybody actually expect AMD's first try at ray tracing performance to match Nvidia's after they had over a year to refine and improve it? Similar story in terms of raw rasterization performance: did anybody expect AMD to suddenly leapfrog Nvidia and take the performance crown after having been behind for years? Each team seemed to focus on different things. For Nvidia, that appears to have been overall performance as well as ray-tracing and DLSS. AMD seemed like they tried to make their best of their superior process node and maximize performance per watt, thermals, power draw, etc. The fact that AMD has gotten as close to Nvidia as they have is an applaudable achievement.
> 
> ...




Thanks for the shout-out, I still can't believe it myself. I got my Master's in December. I knew it would be somewhat hard finding a job, but I was confident; then COVID hit... I've really been down in the dumps lately, as the jobs I was applying to just vanished. This win has really helped motivate me again. 

As Kevin says in The Office, "It's just nice to win one for a change."    

Also, the Ryzen 5600X wasn't really luck: Amazon/Newegg had a lot of stock, and it took a long time to sell out from what I recall. I even made a topic here on TPU saying, "Yo, they are still in stock," and they eventually sold out like 10 minutes after that topic was made. Still beats the 30-second sellouts of Nvidia GPUs, though.

I think part of the fanboy wars is that sometimes we enjoy being silly, but then someone takes it too far or too seriously, because the internet is just a bad place for discourse due to the lack of context (facial expressions, tone of voice, etc.).


----------



## Makaveli (Nov 18, 2020)

Grab a coffee or tea and watch this; very informative.


----------



## Xuper (Nov 18, 2020)

chispy said:


> 3dmark firestrike world record for single graphics card has been broken on one of this amd rx 6800x at 2650/2000Mhz , Holy performance in benchmarks Batman lol !!!! beating rtx 3090 and rtx 3080.
> 
> https://hwbot.org/submission/4606724_lucky_n00b_3dmark___fire_strike_radeon_rx_6800_xt_47932_marks


2650mhz ?! with Watercooler?

edit : ah my bad.LN2


----------



## Space Lynx (Nov 18, 2020)

Makaveli said:


> Grab a coffee or tea and watch this very informative.



Yeah, in that video they mention the fabs at TSMC being overloaded because of the next-gen consoles. I think it was already announced that the Xbox Series X won't be in regular stock until April 2021... I expect the same for the PS5 and Big Navi cards... So yeah, a crazy storm of releases all at once.


----------



## Bruno_O (Nov 18, 2020)

birdie said:


> Quite poor RTRT performance (I expected something close to RTX 2080 Ti, nope, far from it).



Have you read the results properly? https://tpucdn.com/review/amd-radeon-rx-6800-xt/images/metro-exodus-rt-2560-1440.png
It's around 10% faster than a 2080 Ti in RT.


----------



## dir_d (Nov 19, 2020)

Xuper said:


> 2650mhz ?! with Watercooler?
> 
> edit : ah my bad.LN2


I might be reading that wrong but the 6800 XT was on Air


----------



## Upgrayedd (Nov 19, 2020)

Were the review embargoes lifted on the day of release? Seems pretty shady of AMD


----------



## Steevo (Nov 19, 2020)

Upgrayedd said:


> Were the review embargoes lifted on the day of release? Seems pretty shady of AMD



How so? If they had released a product that was available for preorder, promised something they couldn't deliver, and held reviews back to keep people from backing out, then maybe. But it's a great product; the only issue is card availability.


----------



## Space Lynx (Nov 19, 2020)

Upgrayedd said:


> Were the review embargoes lifted on the day of release? Seems pretty shady of AMD




All PC building companies do this as far as I'm aware; Nvidia did as well.


----------



## Berfs1 (Nov 19, 2020)

On the power consumption: I do not believe the average power consumption of 210 W for a 6800 XT. Sorry, I don't believe that, especially when under the same testing the 3080 draws ~300 W.


----------



## Space Lynx (Nov 19, 2020)

Berfs1 said:


> On the power consumption, I do not believe the average power consumption of 210W for a 6800XT. Sorry, I don't believe that. Especially when under the same testing the 3080 does ~300W.


It's really easy to figure out what the average power draw of the 6800 XT is: just go look at the GamersNexus review, skip ahead to the power draw section, do the same for a few other YouTubers, and you get an overall idea. Of course it will vary some, but it might give you a better idea overall.


----------



## dir_d (Nov 19, 2020)

Berfs1 said:


> On the power consumption, I do not believe the average power consumption of 210W for a 6800XT. Sorry, I don't believe that. Especially when under the same testing the 3080 does ~300W.


why?


----------



## Berfs1 (Nov 19, 2020)

lynx29 said:


> its really easy to figure out what the avg power draw of 6800 xt is, just go look at gamersnexus review, skip ahead to power draw section, do same for a few other youtubers, and you get an overall idea. of course it will vary some, but it might give you a better idea overall.


So you are telling me that a GPU on THE SAME 7NM PROCESS, with DOUBLE THE CORES, MORE FREQUENCY, MORE VRAM, and MORE THAN DOUBLE THE TRANSISTORS will draw LESS power than the original GPU? (6800 XT vs. 5700 XT, 210 W vs. 219 W)

EDIT: As for the GamersNexus reviews, I saw a difference of about 20 W, not 100 W.


----------



## Space Lynx (Nov 19, 2020)

Berfs1 said:


> So you are telling me that, a GPU on THE SAME 7NM PROCESS, DOUBLE THE CORES, MORE FREQUENCY, MORE VRAM, MORE THAN DOUBLE THE TRANSISTORS will take LESS power than the original GPU? (6800XT vs 5700XT, 210W vs 219W)
> 
> EDIT: As for the Gamers Nexus reviews, I saw a difference of about 20W, not 100W.



I didn't tell you anything; I said to go look at a wide variety of reviewers and gather your average power draw from that. I'm RX 6800, so I already know my power draw will be quite nice; I never looked at the XT.


----------



## Berfs1 (Nov 19, 2020)

lynx29 said:


> I didn't tell you anything, I said go look at a wide variety of reviewers, and gather your average power draw from that. I'm rx 6800, so I already know my power draw will be quite nice, I never looked at XT.


Yeah, it's also really easy to find the power consumption of the card by looking at TPU's own charts, and that is precisely why I am questioning it, because it really seems off. The media playback power consumption on the 6800 XT is much higher than on the RTX 3080. Perhaps it was a typo, and it was 310 W, not 210? I straight up do not believe that 210 W number for a card on the same 7 nm process as the 5700 XT with more than double the FP32 performance but, uh, less power. Sorry, I don't believe that. That is too hard to believe.



birdie said:


> Samsung 8nm vs TSMC 7nm - nah, not the same.











AMD Radeon RX 5700 XT Specs — AMD Navi 10, 1905 MHz, 2560 Cores, 160 TMUs, 64 ROPs, 8192 MB GDDR6, 1750 MHz, 256 bit — www.techpowerup.com

AMD Radeon RX 6800 XT Specs — AMD Navi 21, 2250 MHz, 4608 Cores, 288 TMUs, 128 ROPs, 16384 MB GDDR6, 2000 MHz, 256 bit — www.techpowerup.com
				




both 5700XT and 6800XT are on TSMC 7nm..?


----------



## birdie (Nov 19, 2020)

Bruno_O said:


> Have you read the results properly? https://tpucdn.com/review/amd-radeon-rx-6800-xt/images/metro-exodus-rt-2560-1440.png
> It's around 10% faster than a 2080 TI on RT



I should have mentioned I was referring to Control, which implements a lot more RTRT features and where it's slower than both the RTX 3070 and the 2080 Ti.



Berfs1 said:


> both 5700XT and 6800XT are on TSMC 7nm..?



I thought you were comparing NVIDIA to AMD. Sorry.


----------



## Totally (Nov 19, 2020)

birdie said:


> I should have mentioned I was referring to Control which implements a lot more RTRT features and where it's slower than borth RTX 3070 and 2080 Ti.



Reviewers are showing that Control is somewhat bugged ATM, and it's one game; to just dismiss it like that is a bit biased.


----------



## F-man4 (Nov 19, 2020)

From this we can learn that 3070 < 6800 XT = 3070 Ti < 3080.
If more 2-slot 6800 XTs are released, they could replace the 3080 for ITX builds.


----------



## saikamaldoss (Nov 19, 2020)

For $50 less, you will have to sacrifice RT performance. Not sure why Nvidia should be worried; a 3080 Ti with slightly more cores and 20 GB of memory will walk all over the 6800 XT.


----------



## birdie (Nov 19, 2020)

Totally said:


> Reviewers are showing that Control is somewhat bugged ATM, and it's one game; to just dismiss it like that is a bit biased.


----------



## Space Lynx (Nov 19, 2020)

Ya, if you want RT, pretty much everyone agrees: go Nvidia. But I don't, so I am good to go.


----------



## SLK (Nov 19, 2020)

Somehow, I get the feeling that AMD fanboys are Trump supporters.


----------



## nikoya (Nov 19, 2020)

I'm wondering whether liquid cooling (AIO or custom) is going to bring anything compared to the vapor chamber, given that the temps and noise are already great.


----------



## SLK (Nov 19, 2020)

lynx29 said:


> ya if you want RT pretty much everyone agrees go nvidia.  but i don't so i am good to go



Lol, $50 does not matter much at this price point. You might as well pay $50 more for superior RT performance.


----------



## okbuddy (Nov 19, 2020)

4K with a 256-bit bus is a really bad idea.


----------



## Space Lynx (Nov 19, 2020)

SLK said:


> Lol, $50 does not matter much at this price point. You might as well pay $50 more for a superior RT performance.




Incorrect; you buy whatever is in stock at this point, because we are looking at shortages well into 2021, possibly even summer 2021.


----------



## saikamaldoss (Nov 19, 2020)

lynx29 said:


> ya if you want RT pretty much everyone agrees go nvidia.  but i don't so i am good to go



If you have the money to buy a 6800 XT for $650, why get a card that can't match the RT of a $699 3080? What's the logic? Just trying to understand your logic, coz I am an AMD fan, but even I am not able to convince myself to buy a 6800 XT after looking at the RT performance, and your response may help me see what I am missing.. 

In January a 3080 20 GB with more cores will come out, and the 6800 XT will be really underpowered in 2 months. I am also concerned that the 6800 XT can only match the 3070 when RT is enabled.. What's your takeaway from that?


----------



## Space Lynx (Nov 19, 2020)

saikamaldoss said:


> If you have the money to buy a 6800XT for $650 and get a card that can’t do RT of a $699 3080... what’s the logic ? Just trying to understand your logic. Coz I am a AMD fan but even I am not able to convince my self to buy 6800XT after looking at the RT performance and your response may help me see what I am missing..
> 
> In Jan 3080 20 GB with more cores will come out and 6800XT will be really underpowered in 2 months.  Also I am concerned that 6800XT can only match 3070 when RT is enabled..  what’s your take away from it ?




Good luck buying one.   I'll be gaming maxed out next week. Take care.


----------



## SLK (Nov 19, 2020)

lynx29 said:


> incorrect, you buy whatever is in stock at this point cause we are looking at shortages well into 2021, possible even summer 2021.


You are right about the stock, but I still see no reason to buy 6800XT over 3080. 

For just $50 more, you will get :

1) Better RT
2) DLSS
3) More stable drivers


----------



## Berfs1 (Nov 19, 2020)

SLK said:


> You are right about the stock, but I still see no reason to buy 6800XT over 3080.
> 
> For just $50 more, you will get :
> 
> ...


On top of that, a _much_ better encoder (Ampere NVENC vs. AMD's VCE) and proper monitor overclocking support.


----------



## saikamaldoss (Nov 19, 2020)

lynx29 said:


> good luck buying one   ill be gaming maxed out next week. take care



Lol, so no logic, haan? Lol, ya ya, I can always wait. I have already finished AC Valhalla and Legion... Far Cry 6 is in Feb, and the only major title remaining is Cyberpunk, and it's an Nvidia-sponsored game.. Good luck playing at 4K with RT ultra settings on a 6800 XT...


----------



## Bruno_O (Nov 19, 2020)

If you ever remember, could you please test whether the hardware bug present on the 5700 series, where PCM sound over HDMI drops out for 1 second every ~minute, is still present in the 6000 series? This is a well-known issue in the community, never addressed by AMD (probably due to a hardware fault).


----------



## SLK (Nov 19, 2020)

Berfs1 said:


> On top of that, a _much_ better encoder (Ampere NVENC vs AMD's VCE), and proper monitor overclocking support



I didn't mention the encoder thing coz most are just pure gamers. Content creators, without a doubt, will pick Nvidia over AMD.


----------



## papajo_r (Nov 19, 2020)

Foobario said:


> Sure, if you pull out some obscure game like that you can make AMD look good./s


It is not just one obscure game... in some titles it even beats the RTX 3090!!

I seriously have doubts about the benchmarks presented on TechPowerUp. Something must have gone wrong; in some instances even the 6800 non-XT beats the 6800 XT... Either drivers or a hardware issue, something is going very off compared to what other reputable YouTube channels have benched.


----------



## Space Lynx (Nov 19, 2020)

papajo_r said:


> It is not just one obscure game... in some titles it even beats the rtx 3090!!
> 
> I seriously have doubts about the benchmarks presented in techpowerup something must have gone wrong in some instances even the 6800 non xt beats the 6800 xt..... either drivers or hardware issue something going very off compared to what other reputable youtubechannels have benched




The Ars Technica review is also a good one; it shows it beating the 3090 in 5 titles. Meh, I'm gonna have fun gaming, enough of this nonsense debating.


----------



## SLK (Nov 19, 2020)

saikamaldoss said:


> lol so no logic haan ? Lol ya ya I can always wait. I have already finished AC Valhalla and Legion... far cry6 in feb and only major title remaining is Cyberpunk and it’s a Nvidia sponsored game.. good luck playing at 4K with RT ultra setting with 6800XT...



Cyberpunk 2077 looks awesome in trailers. I can't wait to see it in full ray tracing glory.


----------



## kiddagoat (Nov 19, 2020)

I am happy to see AMD coming back with their GPU division.  I am seriously considering the 6800XT, my first AMD card since my Fury X. 

I enjoy raytracing when I am playing single player games, but when I play online, I just want high, stable FPS.  I don't have time to look at all the shiny reflections, shadows, or other flare that comes along.  Honestly, playing Call of Duty Modern Warfare and now Cold War, I really can't tell the difference with raytracing on vs off.  I will say that DLSS has been a nice addition in the handful of games I own that support it.  Most don't though.  

I used to be one for jumping all in on Nvidia's new tech: 3D Vision, PhysX (I owned the BFG Tech standalone card before Nvidia folded PhysX into CUDA with their acquisition), G-Sync, SLI, Pixel Shader 3.0 vs. 2.0 (the original Far Cry), DirectX 10 on the 8 series, etc. It is all just fluff, and you used to pay through the nose for it. There are always the few games that are used to demonstrate the new shiny features, but really you do not see them catch on until years later. In the case of 3D Vision, it sort of died on its own and the support was awful; SLI got the axe; with G-Sync you don't need the proprietary module anymore to utilize it. I have owned several G-Sync panels and now have ASUS TUF panels that support both G-Sync and FreeSync; they work wonderfully and don't carry the $300-$400 markup. Even those PhysX-exclusive features... looking at you, Alice, the Batman Arkham series, Ghost Recon Advanced Warfighter, Mafia, and Metro... are mostly fluff and didn't add all that much to the immersion of the games. At the time it was nice, but not for the premium you had to pay to get it. All of these proprietary features fall by the wayside... and are dropped. Raytracing is nice when I take the time to notice it in single-player games, but I learned a long time ago, and with quite a bit of money, that the new shiny just isn't worth it initially. Kudos to both companies for picking it up and running with it, but it just isn't there yet.

I have never really had issues with either company.  I have had Radeons whose fans quit working, and XFX sent me just the cooler (my 4890 and the R9 290X come to mind).  I can't speak to the HDMI issues on the newer cards, but I never had a problem using them with DisplayPort.  I have had Nvidia GPUs and memory die on me... MSI 9600GT, MSI and Zotac 570... those wonderful purple and green speckles scattered across the screen.  

I know I am preaching to the choir... but it really is a shame that people are so polarized by brand names... companies who don't care about you and just want to turn a profit.  Some of whom don't even buy enthusiast-level hardware, just keyboard cowboys who have to thump their chest and wave the banner of their fanboyism... it's really sad.  

Just look at what features are important and matter to you, do your research, and buy that.  Everyone's needs change over time.  I know mine have.  I used to be all AMD/ATi, then AMD/Nvidia, Intel/AMD, Intel/Nvidia, and now back to AMD/Nvidia currently.  For 4 years I used my laptop as a desktop with the graphics amplifier, and people told me I was nuts.  I liked the flexibility in gaming at home and then unhooking the graphics amplifier when I went to a LAN.  I still use my Alienware for LANs.   

I know it is hard to keep up with, but I really wish moderators would just clean up some of these threads, and not just here on TPU, but 3DGuru, YouTube, etc.... this is really asinine to see.

I don't mind objective discussion and helping people understand things before making a purchase (I used to work at Microcenter), but the immediate flaming back and forth sucks... I'd say almost half the comments on a given thread about GPUs or CPUs could be deleted.  

I feel old, but I have been into this hobby for over 20 years now.... maybe I am just being senile in my old age.


----------



## xkm1948 (Nov 19, 2020)

kiddagoat said:


> I am happy to see AMD coming back with their GPU division.  I am seriously considering the 6800XT, my first AMD card since my Fury X.
> 
> I enjoy raytracing when I am playing single player games, but when I play online, I just want high, stable FPS.  I don't have time to look at all the shiny reflections, shadows, or other flare that comes along.  Honestly, playing Call of Duty Modern Warfare and now Cold War, I really can't tell the difference with raytracing on vs off.  I will say that DLSS has been a nice addition in the handful of games I own that support it.  Most don't though.
> 
> ...



Oh boy, another former Fury X owner. You should know better than to get back on that red hype train.


----------



## xorbe (Nov 19, 2020)

Thank you AMD for not following nVidia off the deep end into 350W TDPs.  I totally expected this Radeon to be 350W gaming.


----------



## Steevo (Nov 19, 2020)

Exactly, if the government says you should pay a tax on your tea, you should, and be grateful, for its the people in power we should all trust.

But back on topic, AMD has a product for less that does as well or better and this is the best you can muster? Political BS? Tribalism? No defensible position, just attempting to smear and troll? Give me a break.

Can you point me to a site to buy a 3080Ti? Or a 3080? (They are $1400 each on Amazon for 3080s and the Ti isn't available for months)


----------



## r9 (Nov 19, 2020)

SLK said:


> Somehow, I get the feeling that AMD fanboys are Trump supporters.


Attention everybody, we have an Antifa member in the house.


----------



## SIGSEGV (Nov 19, 2020)

This thread has clearly been bombarded by trolls.
Obviously DLSS is an exclusive NV trademark; comparing RT performance (20 games? hello...) while somehow ignoring performance/power consumption/heat.
hello...

oh yes. winter is coming.


----------



## r9 (Nov 19, 2020)

Steevo said:


> Exactly, if the government says you should pay a tax on your tea, you should, and be grateful, for its the people in power we should all trust, after all Bernie I mean Biden is going to cure cancer.....
> 
> 
> 
> ...



Cure cancer right after he defunds police, take our rights to defend ourselves and reduce the age of consent.


----------



## Haserath (Nov 19, 2020)

Vya Domus said:


> Almost 100W less on average with that insane chunk of power hungry cache, confirmation that Samsung's node is utter crap, my God how the tables have turned. And all of that just so that Nvidia could save probably a couple of bucks per chip.


Why do you think cache is power hungry? One of the slides shows a cache hit is like 1/6 the power use of going out to memory and it has lower latency and higher bandwidth.

Samsung's 8nm node is a decent bit worse than TSMC's 7 but it's not that bad. Still better than TSMC's 12nm. Ampere seems to just be worse for current rasterized games. Those FP units aren't utilized fully. And Nvidia is pushing the clocks to their max, maybe they caught wind of AMD's performance.


----------



## SLK (Nov 19, 2020)

SIGSEGV said:


> This thread confirmed has been bombarded by trolls.
> Clearly, DLSS is exclusive NV trademark, comparing RT performance (20 games? hello...)  and somehow ignoring their pride about performance/power consumption/heat.
> hello...
> 
> oh yes. winter is coming.



Performance (17 review sites average):

Radeon RX 6800/XT launch reviews: overview of the UltraHD/4K performance results | 3DCenter.org (www.3dcenter.org)

Power consumption/heat? Go buy a GTX 1650 for reduced power consumption and heat.


----------



## mechtech (Nov 19, 2020)

Doesn't look like Nvidia is in trouble

"NVIDIA (NASDAQ: NVDA) today reported record revenue for the third quarter ended October 25, 2020, of $4.73 billion, up 57 percent from $3.01 billion a year earlier, and up 22 percent from $3.87 billion in the previous quarter. "


----------



## Mastakony (Nov 19, 2020)

SLK said:


> You are right about the stock, but I still see no reason to buy 6800XT over 3080.
> 
> For just $50 more, you will get :
> 
> ...


For 10 games, and if you're lucky you play 3 or 4 of them...
What an argument...
If RTX, DLSS, and driver stability are all Nvidia fans have, that means they have... nothing.

The big question: why do NV fans take their precious time to come talk in an... AMD thread????

PRICELESS


----------



## Agentbb007 (Nov 19, 2020)

I purchased an asus tuf 3080 but I’m super happy to see how competitive amd is. Now I see why nvidia had to drop their prices so much this generation.  Competition is great and was sorely needed.


----------



## SIGSEGV (Nov 19, 2020)

SLK said:


> Performance (17 REVIEW SITES AVERAGE)
> 
> 
> 
> ...



sorry, I'm lost. I don't understand why you gave me that link.
care to elaborate?

 


> *Power consumption/Heat? Go buy a GTX1650 for reduced power consumption and heat*.



????


----------



## Khonjel (Nov 19, 2020)

Mastakony said:


> The big question : Why NV fans take their precious time to come to talk on a .... AMD thread????
> 
> PRICELESS


Idle mind is the devil's workshop.


Agentbb007 said:


> I purchased an asus tuf 3080 but I’m super happy to see how competitive amd is. Now I see why nvidia had to drop their prices so much this generation.  Competition is great and was sorely needed.


WHAT?!! Nvidia increased price during 2000 series and kept the same price for 3000 series. And before they increased the pricing, they obfuscated the market with 1000 series. How?


			
				Nvidia said:
			
		

> Well the msrp of our 1000 series card is X dollars. But that's just reference. We'll sell you our premium FE card that's actually X+50 dollars. If you want msrp card, check out our AIB partners, who will almost not release any card at msrp. Or actually sell at our FE pricing which is not msrp



Seriously, Nvidia are masters at consumer perception manipulation. And objectively speaking, that's fucking impressive. Look at what people parrot now: "$500 3070 is bringing $1200 2080 Ti performance". Forgetting the fact that if the 2000-series price hike hadn't happened, the 2080 Ti would've cost $800-900.


----------



## z1n0x (Nov 19, 2020)

This thread has gone to shit thanks to a couple of posters. Emotionally charged fanboying is one thing, but I'm seeing outright shilling. You guys are way too aggressive and obvious.
People will be buying these GPUs whether you like it or not, so chill out.


----------



## N3M3515 (Nov 19, 2020)

Khonjel said:


> Forgetting the fact that if the 2000-series price hike hadn't happened, the 2080 Ti would've cost $800-900.



$750 MAX



Khonjel said:


> Seriously Nvidia are masters at consumer perception manipulation. And objectively speaking, that's fucking impressive. Look at what people parrot now, "$500 3070 is bringing $1200 2080 Ti performance".



That made me laugh so hard......people talking about $500 3070 like it's the deal of the century, when the gtx 1070 started at $380 and was equal to the $700 GTX 980Ti.


----------



## bogmali (Nov 19, 2020)

I don't know how politics somehow got inserted into the discussion and don't see how it has anything to do with it. If you can't keep your political bias out, please do not post otherwise you will be thread banned and assessed points.


----------



## W1zzard (Nov 19, 2020)

Berfs1 said:


> power consumption


Check the discussion in the forum comments of the non-xt review, you’ll understand



papajo_r said:


> something must have gone wrong in some instances even the 6800 non xt beats the 6800 xt..


I thought I replied to you already? What you are seeing is a CPU limit in very specific games due to driver overhead, which bunches up results against an invisible wall, and randomly one will be better than the other. They really are equal in that test; look at the numbers, not at the placement of the bar.


----------



## yukinin97 (Nov 19, 2020)

SLK said:


> Cyberpunk 2077 looks awesome in trailers. I can't wait to see it in full ray tracing glory.


Same here brother.
Cyberpunk 2077 is the only thing that has pushed me to upgrade this year. Sure, my trusty 1080 still does fine for most games I play at my resolution, but I've been waiting almost 8 whole years for this game, and you betcha I wanna be able to play it at its highest visual fidelity when it comes out... if I can grab a 3070/80 by then, which looks unlikely 

On the topic of the new RX cards, I gotta say it's much better than I was expecting from AMD, due to... well, past disappointments with their high-end GPUs relative to NVIDIA's offerings. What they've brought forth this year looks a lot better, even though the ray tracing performance shown here and in other reviews leaves something to be desired. But I feel that, just like with Zen over the years, the 2nd generation of their RT accelerators is when they'll really start bringing the heat to NVIDIA's overall performance lead in both rasterized and ray-traced gaming, just like they did to Intel with their CPUs. I'm just glad we're seeing so much competition nowadays


----------



## papajo_r (Nov 19, 2020)

W1zzard said:


> Check the discussion in the forum comments of the non-xt review, you’ll understand
> 
> 
> I thought i replied to you already? What you are seeing is cpu Limited in very specific games due to driver overhead, which bunches up results against an invisible wall, and randomly one will be better than the other, they really are equal in that test, look at the numbers not at the placement of the bar


That makes no sense. If it is CPU limited, then again the 6800 XT should be faster than or as fast as the 6800 non-XT: same driver, same "CPU limitation" or bottleneck, same game, same resolution, just more CUs and higher frequency...

On top of that, other reputable channels that benchmarked the GPU have different results. E.g. in Assassin's Creed Odyssey at 1080p you depict the 6800 being faster than the 6800 XT, and both of them slower than even the RTX 3070 if memory serves me right, yet in that particular game Hardware Unboxed has the RX 6800 XT beating even the RTX 3090!!!

And in general, as I said before, I think it's clear that the RTX 3070 trades blows with the 2080 Ti; that's not a biased opinion, most big channels showcase that (and I can tell from personal experience).

And you have the RTX 3070 being faster than even the 6800 XT... yeah sure, there is something wrong. I am not sure what that something is (methodology, specific bugs with specific games, your setup, or maybe you simply made some typos when writing down the results, dunno, but there is something off here).

If I had to guess, I would say that due to whatever issue, your metrics in some games are off, which leads to the "relative performance" score being negatively affected for the 6800 XT.

Case in point: all these reputable channels showcase the RX 6800 XT trading blows even with the RTX 3090, while you show it as slower than even the RTX 3070, with a *2 FPS* difference from the RX 6800 non-XT.

Nothing personal, mr admin, but something IS off... maybe you got a defective sample?


----------



## Deleted member 190774 (Nov 19, 2020)

saikamaldoss said:


> In Jan 3080 20 GB with more cores will come out and 6800XT will be really underpowered in 2 months.  Also I am concerned that 6800XT can only match 3070 when RT is enabled..  what’s your take away from it ?


My take away from it is this; is it really a problem? Whether you buy anything from a 3060Ti to a 3090, or a 6800 to 6900XT, they're ALL pretty damn fast.

I won't be losing any sleep over $50 here, or 20% FPS difference there... Next year, then the year after there will be something faster again and I'll buy that.

If you think you're future-proofing at the moment by buying a 3080, then you're probably mistaken; there are going to be some big leaps over the next few years and you can't keep worrying about that. If you need an upgrade now, buy one. If you don't, then maybe you just want to splash some cash, and if so, just buy whatever you want.


----------



## Agentbb007 (Nov 19, 2020)

papajo_r said:


> That makes no sense if it is CPU limited then again the 6800 xt should be faster or as fast as the 6800 non xt... same driver same "cpu limitation" or bottleneck same game same resolution just more cu units and higher frequency...


I don’t think you quite understand what CPU limited means.  It means the CPU is holding the cards back, so they’ll get to a certain FPS and can’t go any higher.  1080p will be CPU bottlenecked with most modern GPUs, so you need to look at 1440p and higher to see how the cards perform.
I thought of a good analogy: it’s like lining up a 700 horsepower Ferrari against a 150 horsepower Honda but limiting both cars to 50 horsepower.  Both are going to perform about the same, with some variation with the driver (pun intended). 
Hardware Unboxed doesn’t even test 1080p because all these modern GPUs are CPU bottlenecked at 1080p.  Look at 1440p+ if you want to see a GPU's true performance.
And don’t even bother comparing different sites' benchmarks; there are so many variables influencing performance that it’s pointless.


----------



## Chomiq (Nov 19, 2020)

SLK said:


> Lol, $50 does not matter much at this price point. You might as well pay $50 more for a superior RT performance.


Yeah sorry but that $50 changes to $200 anywhere outside of US.


----------



## nguyen (Nov 19, 2020)

Chomiq said:


> Yeah sorry but that $50 changes to $200 anywhere outside of US.



You can't actually buy a 6800XT right now, so comparing regional prices is a moot point.
I don't see anything wrong with buying either the 6800XT or the 3080 if they are available where you live; these two are so close and represent the best GPUs from Nvidia and AMD.

AMD has done the impossible here. I was thinking the 128MB Infinity Cache might have some downside in the frametime department, but it is rock solid there.

Now the question is whether AMD can produce enough cards to satiate the starving gaming crowds. Seems like AMD and Nvidia are both laughing all the way to the bank this season.


----------



## W1zzard (Nov 19, 2020)

papajo_r said:


> That makes no sense if it is CPU limited then again the 6800 xt should be faster or as fast as the 6800 non xt... same driver same "cpu limitation" or bottleneck same game same resolution just more cu units and higher frequency...


Yes, what you say is true. There is a little bit of randomness in all measurements. It's impossible to reproduce any measurement 100%. My AC test does have a relatively high variability, if I do 10 runs back to back, on the same card same system, there will be a few FPS difference every time. This is normal and just how things work. Of course I can roll the dice as often as I need to get the desired placement. Is that what you want?

I recommend you play a game and look at the FPS counter. Now move away, and move back to the same place. Do you see the exact same number?
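That run-to-run scatter can be illustrated with a quick sketch (the FPS numbers used in the test below are hypothetical, not measured data):

```cpp
#include <cmath>
#include <vector>

// Repeated runs of the same card on the same system scatter around a
// mean by a few FPS; that spread is the normal measurement noise
// described above.
double Mean(const std::vector<double>& runs) {
    double sum = 0.0;
    for (double r : runs) sum += r;
    return sum / static_cast<double>(runs.size());
}

double StdDev(const std::vector<double>& runs) {
    double m = Mean(runs);
    double var = 0.0;
    for (double r : runs) var += (r - m) * (r - m);
    return std::sqrt(var / static_cast<double>(runs.size()));
}
```

With a spread of a couple of FPS between runs, two cards whose true averages are near-identical can trade places from one benchmark pass to the next, which is exactly the "roll the dice" point being made.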


----------



## medi01 (Nov 19, 2020)

W1zzard said:


> please elaborate


The 6800 bar uses a different color, and taking only color into account it's not possible to say whether it is with or without RT.
But it's just one line that needs reading, so it's still quite easy to perceive.

Compare it to say this color scheme, where you see, hey, more saturated vs less saturated bars:


----------



## W1zzard (Nov 19, 2020)

medi01 said:


> 6800 bar is using different color and just taking color into account it's not possible to say if it is with or without RT.
> But it's just one line that needs reading, so it's quite easy to perceive still.


Oh I see. Our standard graph colors are green for primary tested, light blue for secondary tested, so I wanted to keep that highlighting scheme to guide the eye

I think this is worse, especially if you take it in the context of the other charts in the review


----------



## Khonjel (Nov 19, 2020)

Well looks like FidelityFX Super Resolution just like DLSS is game-by-game basis

https://www.reddit.com/r/Amd/comments/jwji3n/_/gcqpqra

At least I hope that they invest all those Ryzen profits for game development.


----------



## Vayra86 (Nov 19, 2020)

birdie said:


> TLDR:
> 
> Truly stellar rasterization performance per watt thanks to the 7nm node.
> Finally solved multi-monitor idle power consumption!!
> ...



The 4 titles they want you to play, yes. You enjoy that ray-traced, DLSSed, Nvidia-chooses-what-games-you-play GPU, with a power consumption that is worse than anything AMD ever managed to produce.

I'll take one that's just going to give the best perf/dollar. They can stick that RT nonsense where the sun doesn't shine for now. There's still barely any content and what's there is completed in a few hours anyway. And even so - the card still runs it. wrt RT, little has changed since Turing, and the console ports will do RT just fine on either GPU regardless, because they do on the console as well.

I think AMD picked a fine balance in that regard. Low on RT perf, stellar on where the performance really matters. No tensor cores ... they only serve for vastly underused DLSS. Stop fooling yourself with things that really are irrelevant to the segment the product's for. The feature set difference is just not there anymore unless you really want to bitch about it. Its a bit sad, even. The hardware decoder, okay. But the rest? Please...

Nah... after a long time in the green zone, it's switching time for me... I was never a fan of hot/power-hungry parts... this hasn't changed.



Khonjel said:


> Well looks like FidelityFX Super Resolution just like DLSS is game-by-game basis
> 
> __
> https://www.reddit.com/r/Amd/comments/jwji3n/_/gcqpqra
> ...



It's the usual 'AMD has to have it too' movement, and this alone signifies it'll die off eventually. Perhaps we should view it as an additional tool to make high-profile games look better or to counteract their shitty optimization / RT perf hit.

Neither company is going to happily crunch down everything forever.



SLK said:


> You are right about the stock, but I still see no reason to buy 6800XT over 3080.
> 
> For just $50 more, you will get :
> 
> ...



And +30% power consumption. So it's quite a bit more than 50 bucks, even if your power is cheap. You can safely double that over the lifetime of the GPU.

But let's face it, if you're a value king you just go for the 6800 or further down the stack. The 3070 is also much better value.


----------



## lexluthermiester (Nov 19, 2020)

HOLY CRAP! AMD, welcome back to the top-tier GPU performance party!
(yes, I know I'm late to the party, been busy..)


----------



## medi01 (Nov 19, 2020)

W1zzard said:


> Oh I see. Our standard graph colors are green for primary tested, light blue for secondary tested, so I wanted to keep that highlighting scheme to guide the eye
> 
> I think this is worse, especially if you take it in the context of the other charts in the review


Agreed, that is worse.


----------



## Overclocker_2001 (Nov 19, 2020)

@W1zzard
The frametime graphs are a little bit confusing at best (sorry, I have to say that).
At least try to match color with brand: AMD = orange/red, Nvidia = green (or gray).
Then it would be useful to put all the graphs of a comparison near each other, varying only the game titles. Actually you have:
BF5 6800xt vs 3080
Borderland 3 6800xt vs 6800
Borderland 3 6800xt vs 3070
Borderland 3 6800xt vs 3080
Civ 6 6800xt vs 3080
DMC 5 6800xt vs 3080
Doom E 6800xt vs 3080
FC 5 6800xt vs 3080
G5 6800xt vs 6800
G5 6800xt vs 3080
G5 6800xt vs 2080S
G5 6800xt vs 2080Ti
ME 6800xt vs 3080
SB 6800xt vs 3080
TR 6800xt vs 3080
W3 6800xt vs 6800
W3 6800xt vs 3070
W3 6800xt vs 3080
W3 6800xt vs 2080Ti

What's the logic of this order? I think it would be best to show 6800 XT vs 3080 (all games), then 6800 XT vs 2080 Ti (all games?), and then 6800 XT vs 6800 (all games?).
The 3070/2080S is better suited against the 6800.


----------



## medi01 (Nov 19, 2020)

SLK said:


> 1) Better RT *in a handful of nVidia sponsored games, but likely not in newer games like Dirt 5 that use DXR 1.1*
> 2) DLSS *a TAA derivative upscaling that blurs stuff, but will be countered by AMD anyhow*
> 3) More stable drivers. *a FUD, pro-green post would not feel complete without it*



FTFY


----------



## laszlo (Nov 19, 2020)

"Not great not terrible" 

So AMD is back in business at the high end, which is good for the end consumer.

NV is a bit cornered; they knew before launching their new cards that competition existed, which is why we saw those "low" prices that killed their previous pricing. I don't think they will cut prices on the 3070 & 3080, as they are positioned well perf/price-wise (not to mention these are quite expensive hardware-wise); only the 6900 will be an issue for them, as it seems it will be really close to the 3090 considering current results...

Overall good news, and when cards are plentiful, older ones will get cheaper... I'll wait for a cheap 5700 XT or 2080 when they go down to 300-350 €... I'm patient..


----------



## Khonjel (Nov 19, 2020)

medi01 said:


> FTFY


Dirt 5 is interesting to me. While I don't know whether it's the DXR 1.1 implementation or the relatively light ray tracing effects that lead to better performance for the 6000-series cards, I would like reviewers to do an in-depth test on it.

Now I don't usually get between pointless banter among emotionally charged fans, but I gotta stop you there, bud. DLSS (or at least the 2.0 revision) is awesome. I don't give a rat's arse about ray tracing, but DLSS is mind-blowing. And what converted me?
This video:


----------



## medi01 (Nov 19, 2020)

Khonjel said:


> I don't usually get between pointless banter among emotionally charged fans


You are so classy.



Khonjel said:


> mind-blowing


It is in the eye of the beholder, nothing to argue about.
It adds blur, it wipes fine details, and it is mostly TAA, not NN (DLSS 1.0 was pure NN), but like all TAA derivatives, it has its uses.

Referencing it by upscaled resolution is spreading FUD.



lynx29 said:


> the ars technica review is also a good one, shows it beating 3090 in 5 titles, meh im gonna have fun gaming, enough of this nonsense debating



In the ComputerBase review they had a separate section for newer games, and it looks pretty sunny for AMD:


----------



## TheUn4seen (Nov 19, 2020)

Well, slower than the 3080 while similarly unobtainable, and with fewer features. I don't care about ray tracing, but I have a 3840x2160 120Hz screen to feed with frames, and the 6800XT just doesn't cut it. Also, DLSS will probably future-proof the 3080 much better as far as achieving playable framerates. I have to say this launch is heavily overhyped. It's a good GPU, which couldn't be said about AMD products for years, but not better than the competition. So I won't cancel my 3080 order.


----------



## Khonjel (Nov 19, 2020)

medi01 said:


> You are so classy.
> 
> 
> It is in the eye of the beholder, nothing to argue about.
> ...


Well, I won't argue about whether DLSS is TAA or NN since I'm not knowledgeable about it, but like 2kliksphilip said, I think it's awesome as a pure anti-aliasing option. Since MSAA was killed off, modern AA has never been the same. DLSS as an AA replacement changes that for me.


----------



## medi01 (Nov 19, 2020)

Khonjel said:


> I won't argue if DLSS is TAA or NN since I'm not knowledgeable about it


Let's not pretend it is a secret, shall we:
_
So for their second stab at AI upscaling, NVIDIA is taking a different tack. Instead of relying on individual, per-game neural networks, NVIDIA has built a single generic neural network that they are optimizing the hell out of. And to make up for the lack of information that comes from per-game networks, the company is making up for it by integrating real-time motion vector information from the game itself, a fundamental aspect of temporal anti-aliasing (TAA) and similar techniques. The net result is that DLSS 2.0 behaves a lot more like a temporal upscaling solution, which makes it dumber in some ways, but also smarter in others. _








NVIDIA Intros DLSS 2.0: Ditches Per-Game Training, Adds Motion Vectors for Better Quality (www.anandtech.com)







Khonjel said:


> I think it's awesome as a pure anti-aliasing option.


And there is nothing wrong with that, especially if you judge it first hand, and not "some youtuber said so"


----------



## THU31 (Nov 19, 2020)

Amazing performance per watt, and overall performance itself. It is great to see AMD near the top.

But raytracing performance is pretty bad, and right now there is no "DLSS" to help with that. A good first step, though.

I hope to see some price wars with NVIDIA and Intel next year.


----------



## Pixrazor (Nov 19, 2020)

@W1zzard I have a question about ROPs.
In the article it is stated that there are 128 of them, yet GPU-Z states only half that.
And I can't see AMD stating anywhere that it has 128.
Thanks for your time.


----------



## W1zzard (Nov 19, 2020)

Pixrazor said:


> @W1zzard I have a question about ROPs.
> In the article it is stated that there is 128 of them, yet GPU-Z only state half of it.
> And I can't see anywhere AMD stating it having 128.
> Thanks for your time.


GPU-Z is wrong, this will be fixed in next version. 

Reason is that AMD's driver reports the wrong number to GPU-Z. AMD has doubled the pixel output per ROP unit with RDNA2, but the AMD driver doesn't take this doubling into account yet. They'll certainly fix it at some point, for now I added code to GPU-Z: "If Navi21 & (RopCount == 48 or RopCount == 64) , Then RopCount = RopCount * 2".
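A minimal sketch of that workaround, with illustrative names since the actual GPU-Z source isn't public:

```cpp
#include <cstdint>

// 0x73BF is the PCI device ID used by the Navi 21 reference boards
// (RX 6800/6800 XT/6900 XT); the function and constant names here are
// hypothetical, not GPU-Z's own.
constexpr uint32_t kNavi21DeviceId = 0x73BF;

// AMD's RDNA2 driver reports the ROP count from before the per-ROP
// pixel output doubling, so Navi 21 values of 48 or 64 are corrected
// to the true 96 or 128. Any other value passes through unchanged.
uint32_t CorrectedRopCount(uint32_t deviceId, uint32_t reportedRops) {
    if (deviceId == kNavi21DeviceId &&
        (reportedRops == 48 || reportedRops == 64))
        return reportedRops * 2;
    return reportedRops;
}
```

Gating on both the device ID and the two known-wrong values keeps the fix from misfiring once AMD's driver starts reporting the doubled count itself.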


----------



## Chomiq (Nov 19, 2020)

nguyen said:


> Can't actually buy a 6800XT right now so comparing regional prices is moot point.
> I don't see any wrong buying either 6800XT or 3080 if they are available where you live, these 2 are so close and represent the best GPUs from Nvidia and AMD.
> 
> AMD has done the impossible here, I was thinking that the 128MB Infinity Cache might have some downfall in the frametimes department it is rock solid there.
> ...


It isn't when a card listed at $700 MSRP in the States (out of stock at online retailers) becomes a $1000 MSRP card in the EU (also listed as out of stock). It's still a valid point, and we'll be able to verify it once the 6800's AIB designs are listed at retailers next week.

The argument about a $50 price difference is only valid in the US right now. FE vs. reference it's 1:1 in the EU, with the odds swayed toward AMD because they're pushing reference 6800s through regular retail channels instead of a broken website that only shows "Coming soon". At least with AMD you can place an order and get in the queue.


----------



## Upgrayedd (Nov 19, 2020)

lynx29 said:


> all pc building companies do this as far as im aware, nvidia did as well.


Its usually lifted a day or so before release.


Steevo said:


> How so, if they released a product that was available for preorder and promised something they couldn't deliver, and held reviewers to keep people from backing out maybe. But it's a great product, the only issue is available cards.


So people can make an informed buying decision, but that's not how scams work. My Microcenter has been flooded with the horde. RTX 3000, Ryzen 5000, PS5/XBSX, and now RDNA2 shortages have caused the stores to still hand out vouchers every morning.


----------



## Pixrazor (Nov 19, 2020)

W1zzard said:


> GPU-Z is wrong, this will be fixed in next version.
> 
> Reason is that AMD's driver reports the wrong number to GPU-Z. AMD has doubled the pixel output per ROP unit with RDNA2, but the AMD driver doesn't take this doubling into account yet. They'll certainly fix it at some point, for now I added code to GPU-Z: "If Navi21 & (RopCount == 48 or RopCount == 64) , Then RopCount = RopCount * 2".


Ah ok. I see now. Thanks again boss.


----------



## Deleted member 190774 (Nov 19, 2020)

Irrespective of what the benchmarks are saying (about RT), given that both Nvidia and AMD will see benefits from future driver updates (I assume) or game advancements, I'd like to find some theoretical numbers for the respective hardware implementations. Forget TFlops etc.

This might turn out to be hard to determine and I don't have time to figure it out right now, but there are numbers out there that suggest Big Navi can do 1 ray/triangle intersection per CU per clock, so 80 CUs at 2200 MHz would be 176,000. Is this good? How would this compare to Ampere? Is it even comparable given their respective implementations?

It's got me curious...
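For reference, the arithmetic behind that 176,000 figure, under the post's own one-intersection-per-CU-per-clock assumption (which is the post's premise, not a confirmed spec):

```cpp
#include <cstdint>

// With the clock expressed in MHz, the product comes out in millions of
// intersections per second: 80 CUs * 2200 MHz = 176,000 M/s, i.e.
// 176 giga-intersections per second.
uint64_t IntersectionsPerSecondM(uint64_t cus, uint64_t clockMHz) {
    return cus * clockMHz;
}
```

So the quoted "176,000" is in millions per second; whether Ampere's RT cores can be compared unit-for-unit is a separate question, since the two architectures traverse BVHs differently.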


----------



## laszlo (Nov 19, 2020)

beedoo said:


> Irrespective of what the benchmarks are saying (about RT), given that both Nvidia and AMD will see benefits from future driver updates (I assume) or game advancements, I'd like to find some theoretical numbers for the respective hardware implementations. Forget TFlops etc.
> 
> This might turn out to be hard to determine and I don't have time to figure it out right now, but there are numbers out there that suggest Big Navi can do 1 ray/triangle intersection per CU per clock - so 80 CUs at 2200mhz would be 176,000. Is this good? How would this compare to Ampere - is it even comparable given their relative implementations?
> 
> It's got me curious...




be careful ... curiosity killed the cat    

Back to business: it is an interesting question, but I'm not sure it can be compared 1:1 due to the different architectures...


----------



## Makaveli (Nov 19, 2020)

SIGSEGV said:


> This thread confirmed has been bombarded by trolls.
> Clearly, DLSS is exclusive NV trademark, comparing RT performance (20 games? hello...)  and somehow ignoring their pride about performance/power consumption/heat.
> hello...
> 
> oh yes. winter is coming.



Yup, the amount of NV shilling going on in this thread is ridiculous. It's making the thread almost unreadable!

Why are they so angry? Competition is good for the market.


----------



## Spencer LeBlanc (Nov 19, 2020)

Why didn't you guys use the world's fastest CPU for gaming?
That would also allow you to test SAM + Rage Mode?


----------



## Vayra86 (Nov 19, 2020)

Vya Domus said:


> Almost 100W less on average with that insane chunk of power hungry cache, confirmation that Samsung's node is utter crap, my God how the tables have turned. And all of that just so that Nvidia could save probably a couple of bucks per chip.



Nah, you misunderstand, Nvidia did it for all that production capacity that Samsung offers! And there is STILL a demand problem, go figure!

But yeah, this is leaps and bounds ahead. Apparently Nvidia's Ampere happened at AMD, or something. It pales in comparison - on a technical level that is, because the stack is competitive now in both camps. AMD has some pricing to fix if you ask me for the 6800. The 6800XT is in an okay place but still brings up the dilemma wrt ray tracing for some. A bit of a lower price point would have eliminated that outright - or at least more so.


----------



## hero1 (Nov 19, 2020)

Great work W1z

AMD delivered what they promised. The performance compared to the RTX 3080 is impressive despite the noisy cooler. The price is also better in Canadian pesos. It makes my decision a bit harder since I don't need RT but do like the idea and what it does for the games that support it.

What I don't get is all the people complaining about RT performance. If I remember correctly, the same people were the first to dismiss RT as a gimmicky feature when it came out. Now these same people are dismissing AMD for not having a strong performance with RT enabled. Appreciate the competition and improvement that have been made because it's more likely that RDNA3 will have better RT performance and that will help the competition in terms of pricing.


----------



## W1zzard (Nov 19, 2020)

Spencer LeBlanc said:


> Why didn't you guys use the world's fastest CPU for gaming?
> That would also allow you to test SAM + Rage Mode?


Did you see the SAM review with the 5900X? There's a big link in the conclusion.
Rage Mode has nothing to do with the CPU or platform; it's explained in the review, too.

Seriously


----------



## r9 (Nov 19, 2020)

W1zzard said:


> Did you see the SAM review with the 5900X? There's a big link in the conclusion.
> Rage Mode has nothing to do with the CPU or platform; it's explained in the review, too.
> 
> Seriously



Still your fault ... darn link is not big enough ... make it flash.


----------



## MxPhenom 216 (Nov 19, 2020)

Even though the ray tracing performance seems pretty bad (these cards seem to lose a bit more performance with ray tracing than Nvidia's Turing did), I may still end up getting one early next year. This 1070 is piss slow, and I plan to put my next card under water; I feel the 6800/6800 XT will be a lot more fun to have on water than a 3080, given how much overclocking headroom these things have.

That is, unless Nvidia releases the 3080 Ti on TSMC 7nm, but that remains to be seen, as does whether a different node really helps Nvidia's arch all that much.



Vayra86 said:


> Nah, you misunderstand, Nvidia did it for all that production capacity that Samsung offers! And there is STILL a demand problem, go figure!
> 
> But yeah, this is leaps and bounds ahead. Apparently Nvidia's Ampere happened at AMD, or something. It pales in comparison - on a technical level that is, because the stack is competitive now in both camps. AMD has some pricing to fix if you ask me for the 6800. The 6800XT is in an okay place but still brings up the dilemma wrt ray tracing for some. A bit of a lower price point would have eliminated that outright - or at least more so.



Yeah, my dilemma depends on how things shake out starting early next year and whether Nvidia releases a 3080 Ti that isn't using Samsung anymore (the rumor mill says it could be on TSMC 7nm). I wanted my next card to be fairly good at both ray tracing and rasterization. Ampere is right in that ballpark imo, but then its power consumption and heat are something else. Then we have the 6800 XT, which has the rasterization, efficiency, and overclocking potential to be really fun to have on water cooling, but is pretty booty for ray tracing, with no DLSS equivalent as of yet...

Ampere = adequate ray tracing performance for all the games I'd actually want to run ray tracing on (non-competitive multiplayer titles), but power and heat, very little tweaking headroom, and bad scaling for overclocking

Big Navi = really good in everything but ray tracing as it stands right now... and no DLSS

@W1zzard What are your initial thoughts on the drivers for these cards? Do they seem okay, or do they need work? Notice anything outright?


----------



## W1zzard (Nov 19, 2020)

MxPhenom 216 said:


> What are your initial thoughts on the drivers for these cards? Do they seem okay, or do they need work? Notice anything outright?


Some cosmetic issues in the control panel, nothing worth mentioning. No stability issues, no crashes


----------



## dir_d (Nov 19, 2020)

What I would like to see is performance in DXR 1.0 titles vs. performance in DXR 1.1 titles. I wonder if the new style of DXR, which the RX 6800 XT was made for, performs significantly better. Overall, whatever card I can actually get my hands on will replace my 1080 Ti.


----------



## W1zzard (Nov 19, 2020)

dir_d said:


> What I would like to see is performance in DXR 1.0 titles vs. performance in DXR 1.1 titles. I wonder if the new style of DXR, which the RX 6800 XT was made for, performs significantly better. Overall, whatever card I can actually get my hands on will replace my 1080 Ti.


I don't think microbenchmarks are useful to the vast majority of readers. If you want to code some test and send it to me, I'd be happy to run it.
I need the source for security reasons, but I'm happy to sign an NDA. This offer is open to anyone reading this post.


----------



## HD64G (Nov 19, 2020)

And another aspect of Big Navi's value is that the drivers seem to be great. No reviewer mentioned any serious problems. Let's hope this is the start of consistent stability there.


----------



## W1zzard (Nov 19, 2020)

This review has been updated with new gaming power consumption numbers for RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800 and RX 6800 XT. For these cards I'm now running Metro at 1440p and not 1080p, to ensure proper loading.

The perf/W charts have been updated, too, and relevant texts as well.


----------



## Vayra86 (Nov 19, 2020)

hero1 said:


> Great work W1z
> 
> AMD delivered what they promised. The performance compared to the RTX 3080 is impressive despite the noisy cooler. The price is also better in Canadian pesos. It makes my decision a bit harder since I don't need RT but do like the idea and what it does for the games that support it.
> 
> What I don't get is all the people complaining about RT performance. If I remember correctly, the same people were the first to dismiss RT as a gimmicky feature when it came out. Now these same people are dismissing AMD for not having a strong performance with RT enabled. Appreciate the competition and improvement that have been made because it's more likely that RDNA3 will have better RT performance and that will help the competition in terms of pricing.



It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to that because in their mind it brings an advantage, even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.

They bank on the fact that going forward, many more games will be getting that support.

With 8-10GB cards to carry them forward.

Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.

Knowing the technical details and actual performance, heat, and TDP now, it's crystal clear AMD has a much stronger offering. Even if they don't top the 3080 in all situations, they have a much better piece of silicon on offer, and more VRAM to keep it going. These cards will hold their value, whereas the 3080 is just the 2080 Ti all over again: eclipsed within a single gen.

A few weeks back one could think 'hm, 3080 at 700, that's a great deal!'. Today, not so much. It's a new norm, and the 3080 is really on the worst end of it. Especially if you consider that Nvidia is deploying a largely enabled GA102 for it. It means that without a shrink, Ampere is a dead end, and even on 7nm it's worth no more than a refresh. This was the major redesign? Back to the drawing board, then.

It's going to be interesting to see how important people think RT performance really is, because it truly is the one differentiator Nvidia can hold on to.


----------



## TechLurker (Nov 19, 2020)

r9 said:


> Still your fault ... darn link is not big enough ... make it flash.



Like the internet of the 90s with giant flashing "CLICK HERE!" links?



Vayra86 said:


> It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to that because in their mind it brings an advantage, even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.
> 
> They bank on the fact that going forward, many more games will be getting that support.
> 
> Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.



It'll be interesting to see what happens moving forward. Now that AMD has their foot in most gaming companies' doors simply due to optimizing for consoles, it's more than likely we'll see AMD-optimized features first and Nvidia optimizations following, just due to the nature of multi-platform releases and the shared core hardware between the consoles and AMD CPUs/GPUs. It's just a matter of AMD finalizing their version of DLSS, getting developers to utilize it, and optimizing RT.


----------



## anachron (Nov 19, 2020)

Vayra86 said:


> It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to that because in their mind it brings an advantage, even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.
> 
> They bank on the fact that going forward, many more games will be getting that support.
> 
> ...


I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important thing is that there are now choices for people depending on what they think is important to them, be it RT performance, consumption, price, or whatever floats their boat when it comes to playing on their computer.


----------



## Berfs1 (Nov 20, 2020)

W1zzard said:


> Check the discussion in the forum comments of the non-xt review, you’ll understand


So what you are essentially telling me is that a 6800 draws more power than a 6800 XT. That doesn't seem realistic...


----------



## Bruno_O (Nov 20, 2020)

TheUn4seen said:


> Well, slower than the 3080 while similarly unobtainable, and with less features. I don't care about ray tracing, but I have a 3840x2160 120Hz screen to feed with frames and the 6800XT just doesn't cut it. Also, DLSS will probably future proof the 3080 much better as far as achieving playable framerates. I have to say this launch is heavily overhyped. It's a good GPU, which couldn't be said about AMD products for years, but not better than the competition. So, I won't cancel my 3080 order.


Yeah, no. If you want 4K at ultra, 10GB won't be enough. DLSS, much like PhysX, will probably only be available on a handful of titles. And it isn't slower; it's about the same while overclocking much more and using much less energy. I also have a Sony X900H 4K@120Hz to feed, and a 16GB, highly overclockable 6800 XT is a much better deal, now and even more so in the future.



MxPhenom 216 said:


> Even while the ray tracing performance seems pretty bad. These cards seem to lose more performance with ray tracing then Nvidia Turing did by a bit, I may still end up getting one early next year. This 1070 is piss slow and I plan to put my next card under water, and feel the 6800/6800XT will be a lot more fun to have on water than a 3080 with how much overclocking headroom these things have.
> 
> That is unless Nvidia release the 3080Ti on TSMC 7nm, but that remains to be seen and if a different node really helps Nvidia's arch all that much.
> 
> ...



Most new games, being based on the consoles, will have very light RT effects and will all use the "AMD standard" as used on the consoles. I wouldn't be worried about the 6800's RT performance; it's good enough for some effects here and there, same as the consoles. Both Ampere and the new 6000 series are too weak for full-blown RT effects anyway, and that's coming only in the PS6 era imho.



anachron said:


> I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important thing is that there are now choices for people depending on what they think is important to them, be it RT performance, consumption, price, or whatever floats their boat when it comes to playing on their computer.


8-10GB is a hard fact / limit people experience today at 1440p / 4K at ultra.
RT / DLSS games are just a handful, with arguable quality uplift (many people say they can't see a difference). People are more hyped about these features due to Nvidia's marketing / reviewers than real usage scenarios (like PhysX was). Now, if you tell me you value CUDA or NVENC, then yes, those are valid scenarios where Radeons don't have a say.


----------



## Khonjel (Nov 20, 2020)

HD64G said:


> And another aspect of Big Navi's value is that the drivers seem to be great. No reviewer mentioned any serious problems. Let's hope this is the start of consistent stability there.


I don't remember any reviewers having driver issues at the RX 5000 series release either. But a few weeks later, the general public started posting online about various issues. Ultimately, time will tell.


----------



## JohnSuperXD (Nov 20, 2020)

I think the RX 5000's drivers were good to begin with; there are just some functions in the software stack that simply don't work.


----------



## gxv (Nov 20, 2020)

'With SAM enabled, we see the averages change "dramatically" (in the context of competition), with the RX 6800 XT now being 2% faster across all three resolutions. This helps the RX 6800 XT match the RTX 3080 at 1080p, while beating it by 1% at 1440p and being just 4% slower at 4K UHD—imagine these gains without even touching other features, such as Radeon Boost or Rage Mode! '

Now imagine TechPowerUp doing a serious review including Rage Mode + SAM together.


----------



## Vayra86 (Nov 20, 2020)

anachron said:


> I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important things is that there is now choices for people depending on what they think is important to them, be it RT performances, consumption, price or whatever float their boat when it come to playing on their computer.



You're absolutely right. I'm not dismissing them entirely, though; it's just that a choice for 'more RT performance and DLSS' is a choice for a variable, completely abstract advantage. You just don't know how it will develop going forward, and the baseline performance on both cards is decent enough for 'playable', except perhaps at 4K, where Nvidia might be able to keep over 45-50 FPS more readily with RT on. That is the extent of the validity of those arguments, really. FWIW, apparently AMD is also launching a DLSS equivalent. It'll likely not be as useful, but more readily available to a broad range of games. But in much the same way, I wouldn't count on any of it.

In much the same vein, we can't really predict how VRAM requirements will develop going forward, but there is a similar sort of risk of lacking performance there on Nvidia's end, and it kind of extends to the 3070 too with its rather low 8GB. And then you're talking not just about RT perf, but about everything that takes a hit. At the same time, there is one guarantee: none of these cards can really do more RT than a few fancy effects here or there, and they still lose a lot of frames doing it. And since the GPUs in the consoles won't be getting faster this gen, that's what you've got for the next three to five years going forward.

There is indeed something to choose now, and that's great.


----------



## W1zzard (Nov 20, 2020)

Berfs1 said:


> So what you are essentially telling me is that, a 6800 takes more power than a 6800 XT. That doesn't seem realistic..


Ah, you misunderstood. Look at the charts from the thread and note how the 6800 XT runs closer to 300 W at the higher resolution. That's why I switched all cards 2080 Ti and up to power testing at 1440p instead of 1080p. Both reviews have been updated accordingly; check whether the numbers listed in the review make more sense now.


----------



## anachron (Nov 20, 2020)

Bruno_O said:


> 8-10GB is a hard fact / limit people experience today at 1440p / 4K at ultra.
> RT / DLSS games are just a handful, with arguable quality uplift (many people say they can't see a difference). People are more hyped about these features due to Nvidia's marketing / reviewers than real usage scenarios (like PhysX was). Now, if you tell me you value CUDA or NVENC, then yes, those are valid scenarios where Radeons don't have a say.


I already posted about it, but here it is again.

I personally liked RT in Control and more recently in WD:L, and I did find the difference noticeable when I had to deactivate it for some tests after some hours of playing. I also look forward to playing Minecraft RT with some friends, as the results are really amazing.

Now, I have a 2070 Super OC; all the games I play usually run above 60fps at 1440p, with very few exceptions. The most important one is WD:L, which hits the 8GB VRAM limit _when RT is on_ with the HD texture pack and the way I'm playing. So I'm aware of the low-VRAM issue you mentioned.

The reason I have been following this review is that I was hoping for a card with 16GB of VRAM that could justify replacing my current GPU. But even with the better efficiency and rasterization performance, why would I buy a 6800 XT to get worse performance in the very games that would make me want to upgrade my GPU in the first place?

So now you can tell me to drop RT in Legion and enjoy my 85fps with my 6800 XT (according to guru3d's bench). But then I could also just deactivate it on my 2070 Super while keeping DLSS, which both solves the memory issue and gives me 80FPS, with a small drop in image quality, for 0€ and even less power consumption than the 6800 XT.
On the other hand, if I choose to buy the 3080, I will benefit from the FPS boost in every game like the 6800 XT, it will solve my issue with Legion, and it will allow me to play Minecraft RT in comfortable conditions. I _may_ run into issues with the 10GB of VRAM at some point, and it may cost me a bit more in electricity, but it still seems like the better deal for me right now.

@W1zzard: I'm curious about something regarding media playback. You mentioned that it puts the memory frequency at the max; does it affect gaming performance in a meaningful way if you have a video playing while running a demanding game? Asking because I usually listen to YouTube videos while playing.


----------



## W1zzard (Nov 20, 2020)

anachron said:


> I'm curious about something regarding media playback. You mentioned that it puts the memory frequency at the max; does it affect gaming performance in a meaningful way if you have a video playing while running a demanding game? Asking because I usually listen to YouTube videos while playing.


I've never tested that. I doubt it; decode happens on separate hardware in the GPU, and clocks are at max due to gaming already.


----------



## Xuper (Nov 20, 2020)

I think the reason multi-monitor consumes less power is the Infinity Cache, not the memory.


----------



## W1zzard (Nov 20, 2020)

Xuper said:


> I think the reason why multi-monitor consumes less power is because of Infinity Cache not memory.


No, the reason is that the memory clock now runs very low compared to previous AMD cards; check my reviews, the data is there.

It is possible that they leverage the L3 cache to reduce the number of memory accesses so that the low memory clock is OK, but then why is this not a problem for NVIDIA, who don't have an L3 cache?


----------



## Xuper (Nov 20, 2020)

I noticed the card doesn't expose a memory temperature sensor, right?


----------



## R0H1T (Nov 20, 2020)

Is it just me, or does the Infinity Cache(?) seem to help with a more consistent frame rate & a lot fewer spikes vs. the 3080?








AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble
www.techpowerup.com

----------



## W1zzard (Nov 20, 2020)

R0H1T said:


> Is it just me, or does the Infinity Cache(?) seem to help with a more consistent frame rate & a lot fewer spikes vs. the 3080?
> 
> 
> 
> ...


Yeah, IQR values seem to be a bit lower in most games. Not sure if it's the L3 cache, but it could be, because it reduces latency for many memory fetches.
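(For readers unfamiliar with the metric: IQR here means the interquartile range of frame times, i.e. the spread of the middle 50% of frames, which ignores the best and worst outliers. A generic sketch of such a metric, not TechPowerUp's exact implementation:)

```python
import statistics

def frametime_iqr(frametimes_ms):
    """Interquartile range of frame times in ms.

    Lower IQR means the middle 50% of frames cluster tightly, i.e. more
    consistent pacing and fewer spikes. Illustration only, not the
    review's exact method.
    """
    q1, _q2, q3 = statistics.quantiles(frametimes_ms, n=4, method="inclusive")
    return q3 - q1
```

A perfectly paced run (every frame at the same time) gives an IQR of 0; spiky runs give larger values.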


----------



## TheUn4seen (Nov 20, 2020)

Bruno_O said:


> yeah no, if you want 4k at ultra, 10GB won't be enough. DLSS, much like PhysX, will prob be only available on a handful of titles. And it isn't slower, it's about the same while overclocking much more and using much less energy. I also have a Sony X900H 4k@120hz to feed, and a 16GB highly overclockable 6800XT is a much better deal, now and even more in the future.


Well, I have the 2080 Ti, and a 1080 Ti in my wife's computer. I never saw GPU memory usage above 8GB, and it doesn't often go above 5GB, aside from Horizon: Zero Dawn, which seems to reserve the whole GPU memory pool immediately after launching the game. I also don't have the "ultra" fetish; I care more about the framerate, and it seems the 3080's faster memory works better at high resolutions. Even if it's just 5%, slower is slower, and there is no point in spending money on an inferior product when the difference in price is equivalent to a good meal in a restaurant.


----------



## Shatun_Bear (Nov 20, 2020)

SIGSEGV said:


> This thread confirmed has been bombarded by trolls.
> Clearly, DLSS is exclusive NV trademark, comparing RT performance (20 games? hello...)  and somehow ignoring their pride about performance/power consumption/heat.
> hello...
> 
> oh yes. winter is coming.



Testament to how worried the Nvidia shills are about their cards... the power efficiency of Ampere looks awful compared to RDNA2; how the tables have turned 

Also, RT performance on the level of the 2080 Ti is absolutely fantastic, even if RT for me is almost a total irrelevance because of how few games use it. Nice to see, though.


----------



## mlambert890 (Nov 21, 2020)

Shatun_Bear said:


> Testament to how worried the Nvidia shills are about their cards...power efficiency of Ampere looks awful compared to RDNA2, how the tables have turned
> 
> Also, RT performance on the level of 2080 Ti is absolutely fantastic, even if RT for me is almost a total irrelevance because of how few games use it. Nice to see though.



Shills complaining about shills is ridiculous. Performance in the most important new technology on par with the competitor's *last gen* isn't impressive. Come on! And once you're arguing "fps/watt" on *enthusiast* forums, it's a lost battle. Nvidia is in no way "in trouble". At all. The 6800 is decent and competitive, but objectively slower *where it matters*, is even *more* unbuyable, and is nearly as expensive. And getting the most out of it depends on proprietary tricks that require an unbuyable CPU and a 500-series chipset. Only a true blind fanboy can call this some kind of huge win.

This gen is a mess on *both* sides. Too expensive. Still not significant enough gains. Paper launches.  "Team anything" = idiocy



Bruno_O said:


> most new games,  based on consoles, will have very light RT effects, and will all be using the "AMD standard" as used on consoles. I wouldn't be worried



That's not how this works at all. There is no "AMD standard on consoles". The Xbox uses DirectX 12 DXR just like PCs. The PS5 uses a version of the Vulkan API they licensed. AMD supports both by running RT on the card's compute units. RTX is dedicated hardware which also supports DXR (obviously). Why people think this is "all the same because AMD" is a really fundamental misunderstanding of platform architecture, APIs, and hardware integration. PC will get *Xbox* ports, therefore DXR, therefore compatible with Nvidia with little to no effort. And given that Nvidia is far stronger, where there *are* delays, I will bet you money that when the "Nvidia optimized" version follows, it will be noticeably superior. Because the RT hardware is better. And the effects come via the API. And tuning them isn't expensive. And lots of RT enthusiasts *already own Nvidia*.


----------



## MonteCristo (Nov 21, 2020)

Woe to You Oh Earth (NVidia) and Sea(Intel),

for the Devil (PowerColor Radeon RX Red Devil) sends the beast with wrath,

because he knows the time is short (Cyberpunk 2077 December Launch).

Let him who hath understanding (AMD Fanboys like me..) 

reckon the number of the beast,

for it is a human number..

its number is 6900XT!!!!!!!!!!


----------



## NoJuan999 (Nov 21, 2020)

MonteCristo said:


> Woe to You Oh Earth (NVidia) and Sea(Intel),
> 
> for the Devil (PowerColor Radeon RX Red Devil) sends the beast with wrath,
> 
> ...


LMAO !!!!


----------



## saikamaldoss (Nov 22, 2020)

beedoo said:


> My take away from it is this; is it really a problem? Whether you buy anything from a 3060Ti to a 3090, or a 6800 to 6900XT, they're ALL pretty damn fast.
> 
> I won't be losing any sleep over $50 here, or 20% FPS difference there... Next year, then the year after there will be something faster again and I'll buy that.
> 
> If you think you're future-proofing at the moment by buying a 3080 - then you're probably mistaken as there are probably going to be some big leaps over the next few years and you can't be worrying about that. If you need an upgrade now, buy one. If you don't then do you just want to splash some cash - and if so, just buy whatever you want.



I am losing my sleep over ray tracing, not $50... if AMD had good RT performance, it would have been easy to choose... or I have to wait till Dec 8th to see what the 6900 XT can do, and what the 3080 Ti can do in Jan. I have waited 1 year to replace my Vega 64. 2 more months I can surely wait, instead of buying the wrong card and regretting it again...


----------



## John Naylor (Nov 22, 2020)

1.  Regarding the conclusions, I don't really see the 6800 XT ($650) as delivering a 100% generational improvement ... nor was it AMD's fastest card, a designation that better fit the Radeon VII.

2.  AMD has achieved as close to parity here as we have seen in a long time .... Figuring that a system w/ the 6800 XT might cost say $1,650 at the point the snipers are no longer dictating prices, that equates to a 3080 system costing $1,700. Out of the box, that's a 3% increase in performance at 1440p compared to a 3% increase in price. That's pretty much a wash.

3.  I don't quite understand the testing comparisons here; the article could use clarification. It's stated that the testing was done at the max power setting; I wasn't sure if this was Rage Mode or some other setting. Rage Mode was identified by a blue color, but for overclocking, Max Power mode was purple. Both are shown in the fps overclocking graph, but do not appear in the temp, sound, power consumption, or performance graphs. "At first I was surprised that I saw no performance gains from overclocking, it seems the power limit is to blame here. Even when overclocked, the power limit will cap the maximum frequencies, so I've set it to the maximum for a second round of OC results." So, for me anyway, I was not sure which performance and other numbers were associated with which of the three modes. Perhaps that could be addressed in an update.

4.  Comparing OC performance, the 6800 XT "outta the box" did poorly when overclocked, but in "Max Power" mode the 6800 XT did better percentage-wise than the reference 3080, though the 3080 FE still hit the highest OC in fps.

5.  The shocker here is that the AMD card hit 78°C under load and OC (and I assume *not* Max Power), 2°C better than the 3080 FE .... and in an even bigger surprise, AMD hit 31 dBA versus 35 dBA .... This is a significant win for AMD. However, again, it might not be a fair comparison since I'm not sure which operational mode is represented in each graph.

6.  Until we see some of the AIB cards reviewed, there's no way to really know which offers the best overclocked performance / power-sound-temps ratio.

7.  At this point, if you could purchase either at MSRP, sound arguments could be made for either card .... I always look at price as a secondary factor because any conclusion drawn based upon "value" is out the window when price adjustments are made. Nvidia would have been foolish to price the card below $700 when theirs was the only card available. But I wouldn't expect them to match AMD's MSRP until supply catches up with demand. I don't expect to be in a position to make a logical recommendation until after the holidays.

Still, most purchases will be decided by "brand" rather than "the numbers". Features or advantages that one side has will be deemed "don't matter" by the other, and vice versa. To my eyes, I prefer to play games using motion-blur reduction (ULMB) over adaptive sync, and 17 of the 23 games in the test suite can use ULMB at 120 Hz on Nvidia cards / G-Sync monitors. It also makes ray tracing easily obtainable at 60+ fps.

In short, H U G E kudos to AMD here. After not really having a horse in the race other than the 5600 XT (and lower price segments) last generation, they have achieved parity in the top "consumer gaming" segment, something they have not done since 2012. Hopefully they can continue up and down the line. Anxious to see the reviews of the AIB cards that our users will actually buy next year.


----------



## W1zzard (Nov 22, 2020)

John Naylor said:


> 3. I don't quite understand the testing comparisons here, article could use clarification. It's stated that the testing was done at the max power setting , wasn't sure if this was rage mode or some other setting. Rage Mode was identified by a blue color, but on overclocking, Max power mode was purple. Both are shown in the fps overclocking graph .... but do not appear in the Temp, sound , power consumption table or performance graphs . "At first I was surprised that I saw no performance gains from overclocking, it seems the power limit is to blame here. Even when overclocked, the power limit will cap the maximum frequencies, so I've set it to the maximum for a second round of OC results. " So , for me anyway, I was not sure what performance and other numbers were association with which of the 3 modes. Perhaps that could be addressed in an update.


Only the purple bar in the OC gains chart was done at maximum power limit and manual OC, just like it says in the title.
In that same chart, Rage Mode was not used at all; the green bar is the manual OC with the power limit at stock (so the purple bar's OC with the stock power limit).
The Rage Mode results in the rest of the review were at everything stock, with just Rage Mode activated.

Under the hood, Rage Mode is a profile that's stored in the BIOS. It has 4 vendor-defined values that Rage Mode overrides: power limit, temperature target, fan RPM target and acoustic limit. Note: no change in clocks or voltage.
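A rough way to picture that profile, as a sketch: all numbers below are made-up placeholders, and the field names are my own; only the *set* of four overridden values (and the two untouched ones) follows the description above.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class TuningProfile:
    power_limit_pct: int     # % offset from reference board power
    temp_target_c: int       # temperature target
    fan_rpm_target: int      # fan RPM target
    acoustic_limit_rpm: int  # acoustic (max fan speed) limit
    gpu_clock_mhz: int       # NOT touched by Rage Mode
    gpu_voltage_mv: int      # NOT touched by Rage Mode

# Hypothetical stock values
stock = TuningProfile(power_limit_pct=0, temp_target_c=90,
                      fan_rpm_target=1650, acoustic_limit_rpm=2050,
                      gpu_clock_mhz=2250, gpu_voltage_mv=1150)

# "Rage Mode" = swap in the four vendor-defined overrides stored in the BIOS
rage = replace(stock, power_limit_pct=8, temp_target_c=95,
               fan_rpm_target=1900, acoustic_limit_rpm=2400)

# Clocks and voltage stay exactly as they were
assert (rage.gpu_clock_mhz, rage.gpu_voltage_mv) == \
       (stock.gpu_clock_mhz, stock.gpu_voltage_mv)
```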


----------



## Super XP (Nov 22, 2020)

No, Nvidia is not in trouble; AMD is just where they should be, or should have been years ago. Thanks to every single ZEN generation being successful, and with future ZEN designs all lined up and ready to go, AMD has given its Radeon Technology Group new life with the success of RDNA2. RDNA3 is looking very good, and speculators are already talking about RDNA4. It seems AMD is following its ZEN style of releases: a complete design overhaul for each generation.


----------



## ZoneDymo (Nov 22, 2020)

Bit late to the party, but does anyone have some info about possible RT performance improvements in the future?
See, my thinking is that games today have been made using Nvidia's RT implementation because... well, it's the only option...

And now AMD is out with their take, and performance is a bit lackluster. This might be due to just the hardware, but could it be that in the future, when devs build games with RT based on AMD's take on it (also for the consoles), performance in future games will be better?


----------



## Steevo (Nov 22, 2020)

ZoneDymo said:


> Bit late to the party, but does anyone have some info about possible RT performance improvements in the future?
> See, my thinking is that games today have been made using Nvidia's RT implementation because... well, it's the only option...
> 
> And now AMD is out with their take, and performance is a bit lackluster. This might be due to just the hardware, but could it be that in the future, when devs build games with RT based on AMD's take on it (also for the consoles), performance in future games will be better?



Much like tessellation, I'm sure we will see ray tracing become adjustable, and future hardware implementations become faster with less overhead; more shared resources between geometry and ray tracing will make it almost penalty-free as game engines and hardware both improve.

But for a few years, either implementation has a performance cost, and neither is truly pure ray tracing.


----------



## lexluthermiester (Nov 23, 2020)

ZoneDymo said:


> Bit late to the party, but does anyone have some info about possible RT performance improvements in the future?


This is AMD's first go at RTRT, and while it's respectable, they need to and will improve. Despite the nay-saying and whining by some, RTRT is the future of lighting FX, and AMD will improve as they refine.


----------



## Vayra86 (Nov 23, 2020)

lexluthermiester said:


> This is AMD's first go in RTRT, and while it's respectable, they need to and will improve. Despite the nay-saying and whining by some, RTRT is the future of lighting FX and AMD will improve as they refine.



If you ask me, they should be improving on the software side, not the hardware side. They balanced their raster and RT performance per shader quite well for this moment in time; if they can just scale it up in future generations along with shader count/die size, there will be sufficient RT perf on tap. I mean, how much are we prepared to lose over those stupid rays? There is a point of diminishing returns, and it's not like Nvidia is winning the efficiency crown with their larger reserved die space for RT/Tensor as it is. Part of that is the node, but not all of it. Nvidia has a larger die but is still less efficient (520 vs 628 mm², 6800 XT vs 3080).

In the end it's really a balancing act: how much raster perf will you sacrifice per shader to enable ray calculations? It's one or the other; the ideal situation would be a new type of shader that could happily switch between operations. A step closer to a CPU...


----------



## lexluthermiester (Nov 23, 2020)

Vayra86 said:


> In the end its really a balancing act, how much raster perf will you sacrifice per shader to enable ray calculations.


IMHO, 100%. RT is far better than raster lighting where quality and realism are concerned. I would personally love to see non-RT lighting disappear.


----------



## Vayra86 (Nov 23, 2020)

lexluthermiester said:


> IMHO, 100%. RT is far better than raster lighting where quality and realism is concerned. I would personally love to see non-rt lighting disappear.



But that's just lighting, and it's not exactly expensive to do that with raster. So you're going to trade something somewhere for high-cost raycasting, and may end up with great lighting over shitty environments, low draw distance, heavy LOD, etc.

This is why I'm advocating a slow approach over a fast one. We've seen it already with Turing: a big part of the die reserved, high cost for those GPUs, barely a perf/dollar advancement from that gen, and barely a handful of RT titles to use it for.



John Naylor said:


> 1.   Regarding the conclusions,  I don't really see the 6800 XT ($650) as delivering a 100% generational improvement ... nor was it AMDs fastest card, a designation that better fit the Radeon VII.
> 
> 2.  AMD has achieved as close to parity here as we have seen in a long time ....  Figuring that a system w/ the 6800 XT might cost say $1650 at the point the snipers are no longer dictating prices, that equates to a 3080 system costing $1700.    Outta the box, that's a 3% increase at 1440p as compared to  a 3% increase in price.  That's pretty much a wash.
> 
> ...



Well spoken, sir!


----------



## lexluthermiester (Nov 23, 2020)

Vayra86 said:


> So you're going to trade something somewhere for high cost raycasting, and may end up with great lighting over shitty environments, low draw distance, heavy LOD, etc.


No, what I'm saying is that as hardware performance improvements and software optimizations are made, the hit to performance will become a non-issue, to the point where RT lighting is very much preferred. Also, there are varying degrees to which RT lighting can be applied currently, and to great effect. Improvements will only continue.


----------



## dragontamer5788 (Nov 23, 2020)

Hmmm, precomputing lighting clearly works and works well for static lights.

Where RT lights come in are:

1. Less work from the developer
2. Less precomputation
3. Dynamic and/or moving lights can become possible (including reflected lights off of moving surfaces).

Those are the things that were demonstrated with the extra-shiny Stormtrooper demo.

It certainly adds an element of realism. But at the same time, there's an element of over-realism. Because we've never seen dynamic lighting in video games before, such demos overemphasize it... kinda like how "The Wizard of Oz" overemphasized the cartoony colors of color TV back when color TV became a thing.

We will need a few generations of video games before we see "realism". For now, we'll see overly shiny cars and overly shiny helmets that don't really have a realistic atmosphere, until the artists figure out how to use raytracing correctly anyway... I really find a lot of the recent demos to be ridiculous.

--------

The best lighting is lighting that you *don't notice*.  Lighting that sets the mood, provides contrast, and draws the eye towards the important elements of the screen. Lighting doesn't necessarily have to be "realistic". Lighting just has to set the mood correctly.

See Pulp Fiction:






The light is *behind* Samuel L. Jackson's afro, a very unrealistic position when you consider what is going on in the room. But the lighting draws your attention to the scene (the guns, the faces, etc.) while drawing your eye away from the background. That's cinematography right there: not necessarily being "realistic", but using lights to accomplish a goal... a way for the director to communicate with their audience.

A "realistic" light setup for that room would be dimly lit from only the window in the background. It's clear that the room doesn't have any lights in it, so why can we clearly see their faces? Well, that's cinematography done right. It doesn't worry about the details; its #1 goal is communication with the audience. Realism be damned.

--------

Video game lighting gets better and better, and more realistic. But video game directors still don't know how to use lighting to communicate well. At some point, we have to recognize that it's the video game art direction that's the problem, as opposed to the technology.


----------



## lexluthermiester (Nov 23, 2020)

dragontamer5788 said:


> We will need a few generations of video games before we see "realism".


Nah, we're already there. Control is a perfect example of RTRT done well.








Forgiving his broken English, the reviewer shows very clearly that RTRT is great on RDNA2.


----------



## dragontamer5788 (Nov 23, 2020)

I can't say I've played Control. But it seems like a good discussion point.





Let's look at this screen: why is the floor shiny? Well, it's a good demonstration of an RTRT effect, but... does the shininess of the floor really represent anything? (Aside from "My GPU can play this game and your GPU can't".)





Why is there a giant shiny mirror in this room?

-----------

Technology demos are cool (and best demonstrated with something like Minecraft RT or Quake RT). But I feel like the next generation of video games needs to start thinking about the "interpretation" of the language of lighting effects. I still feel like this generation of video games errs on the side of 'Cool Graphics Demo' instead of actual cinematography.

With that being said: I'm looking at some screenshots of Control, and it seems like someone is thinking of cinematography.



			https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/control-ray-tracing-launch-article/control-ray-traced-geforce-rtx-screenshot-004.jpg
		


This screenshot is pretty good: the reflection on the computer monitor draws your eyes towards it. It's a good use of reflective technology, and it sets the mood very well.


----------



## lexluthermiester (Nov 23, 2020)

dragontamer5788 said:


> Let's look at this screen: why is the floor shiny? Well, it's a good demonstration of an RTRT effect, but... does the shininess of the floor really represent anything? (Aside from "My GPU can play this game and your GPU can't".)


Ah, but you're missing a slight point. Environmental lighting FX can be used as movement cues, i.e. enemies or objects in the environment approaching, so you see their approach before you actually see them. With RT lighting, this effect happens naturally, as it would IRL. But with non-RT lighting, creating that effect would be a serious task and would create a great deal of system resource overhead, as has been shown in a multitude of past games that attempted it (to various levels of success).

And for the record, there are many business buildings in the world with very shiny, reflective floors like the one shown in the picture above. It's not unusual.


----------



## dragontamer5788 (Nov 23, 2020)

lexluthermiester said:


> And for the record, there are many business buildings in the world that have very shiny and reflective floors like what is shown in the picture above. It's not unusual.



I think where I'm going is that super-clean, shiny floors are unusual in the real world (as well as in cinema). Even if we go to a high-end business building scene, such as the "Lobby Shootout" from The Matrix, the floors are marble: matte and non-shiny.

Honestly, the only shiny floor in cinema I can remember is the "A Little Princess" scene where she's mopping a floor. You can see the children's footsteps in the floor she just mopped: the shininess explicitly calls out how much the other children don't care about the main character anymore.





I'm sure there are other ones. But... yeah, even looking back at The Matrix lobby shootout: that was a matte floor. Shininess is actually pretty rare in cinema and in the real world. It's overrepresented in modern RTRT games.

I do admit that the lobby shootout in The Matrix had shiny columns (granite?) in the background. So shininess can be used to accent special scenes like that, but it shouldn't be used as willy-nilly as today's RTRT demos do.


----------



## lexluthermiester (Nov 23, 2020)

Once again, we're at an impasse. However, the point was that AMD's first go at RTRT is solid, if less efficient than NVIDIA's latest offerings. The landscape is rosy for RTRT going forward.


----------



## WhiteNoise (Nov 23, 2020)

At $650 I may as well just buy another NVIDIA card for $49.99 more. I had really hoped they would come in better priced to tempt me.


----------



## N3M3515 (Nov 24, 2020)

WhiteNoise said:


> At $650 I may as well just buy another NVIDIA card for $49.99 more. I had really hoped they would come in better priced to tempt me.



You were never going to buy AMD. You expect an equal-performance GPU that consumes less power, has 6GB more VRAM and overclocks better for what, $100 less? wtf are you smoking, dude. I'm waiting for the best $200 - $250 GPU from whatever company.
One problem I saw with the low-mid range last gen was that AMD did not have any GPU at $220 - $240 while Nvidia had the 1660, 1660 Super and 1660 Ti; the next AMD card was the 5600 XT at $280.
It would be awesome if a GPU with 5700/2060 Super performance were released at $220.


----------



## Bobica (Nov 24, 2020)

Can someone provide me or upload here a genuine BIOS from a Sapphire RX 6800 XT?
Igor's Lab tested it and apparently flashed an RX 6800 with the RX 6800 XT BIOS to increase performance.
Check this link for how Igor's Lab overclocked the card to 2.55 GHz using a script and a 6800-to-6800 XT BIOS flash:









AMD’s Radeon RX 6800 stable with continuous 2.55 GHz and RX 6800 XT overclocked up to 2.5 GHz - Thanks to MorePowerTool and board partner BIOS | igor'sLAB

"Yeah, we did it again, and better this time. Especially since our forum member Gurdi probably won one of the main prizes in the silicon lottery, as his RX 6800 in reference design shows very…"

www.igorslab.de


----------



## WhiteNoise (Nov 25, 2020)

N3M3515 said:


> You were never going to buy AMD. You expect an equal performance gpu that consumes less power, has 6GB vram more and overclocks better for what? $100 less? wtf are you smoking dude. I'm waiting for the best $200 - $250 gpu from whatever company.
> One problem i saw with low mid range last gen was AMD did not have any gpu at $220 - $240 while nvidia had 1660, 1660 super and 1660 ti, the next amd was 5600xt at $280.
> It would be awesome if a gpu with 5700/2600 super performance is released at $220.



Don't tell me what I will or will not do. I wasn't expecting a better-performing card; I was expecting a card that performs well, maybe close to the 3080, at a fairer price. I have owned just as many ATI/AMD video cards as NVIDIA ones. I currently own and use two AMD cards. I am all for buying AMD, but for the price they are asking for the 6800 XT, I'd rather get the RTX 3080. It's the better card for that much money. $550 would have been a great price, even $599. Saving $100 compared to the RTX 3080 would have been enough to sway me. But a savings of $49.99? No thanks.

In the past AMD has been able to provide good performance at a fair price. Their prices have swayed me more than a few times over the years.


----------



## N3M3515 (Nov 25, 2020)

WhiteNoise said:


> Don't tell me what i will or will not do. I wasn't expecting a better performing card, I was expecting a card that performs well, maybe close to the 3080 and does it at a more fair price. I have owned just as many ATI/AMD video cards as nvidia. I currently own and use two AMD cards. I am all for buying AMD but for the price they are asking for the 6800 XT I'd rather get the RTX 3080. Its the better card for that much money. $550 would have been a great price, even $599. Saving $100 compared to the RTX 3080 would have been enough to sway me. But a savings of $49.99? No thanks.
> 
> In the past AMD has been able to provide good performance at a fair price. Their prices have swayed me more than a few times over the years.




$550? $600? lol... that's dreaming. Equal performance, lower power, higher OC, +6GB VRAM, and you want to pay $100 - $150 less? I insist: wtf are you smoking? (I also have bought from both brands.)


----------



## WhiteNoise (Nov 25, 2020)

N3M3515 said:


> $550? $600? lol.....that's dreaming. - Equal performance - Lower power - Higher OC - +6GB VRAM - and you want to pay $100 - $150 less, i insist, wtf are you smoking? (I also have bought from both brands)



I suppose it is dreaming, considering the actual price. I was just hoping it would be cheaper than it is. Is it no longer OK to express my opinion on the forum without being attacked? I run my games on a 4K monitor, and after reading reviews it looks to me like they both perform well and on par with each other. I like the idea of ray tracing performance being better on the 3080. I care nothing about power savings. I care nothing about overclocking a video card these days. The 16GB of VRAM is very nice though!

Anyway, I wanted an RTX 3080 and could not get my hands on one; the RX 6800 released and I had hoped AMD would undercut NVIDIA enough to sway me towards them. That is all. No need to be rude about it.


----------



## N3M3515 (Nov 25, 2020)

WhiteNoise said:


> I suppose it is dreaming considering the actual price. I was just hoping it would be cheaper than it is. Is it not OK any longer to express my opinion on the forum without being attacked? I run my games on a 4k monitor and it looks to me like they both perform well and on par with each other after reading reviews. I like the idea of Ray Tracing performance being better on the 3080.  I care nothing about power savings. I care nothing about overclocking a video card these days. The 16Gb of ram s very nice though!
> 
> Anyway, I wanted an RTX 3080 and could not get my hands on one; the RX 6800 released and I had hoped AMD would undercut NVIDIA enough to sway me towards them. That is all. No need to be rude about it.



I'm sorry, I wasn't attacking you. What I meant is that AMD is just a business, much like NVIDIA, and when AMD can't match NVIDIA performance-wise, they tend to price their products well below NVIDIA (or Intel). But that is not the case now.


----------



## Caring1 (Nov 28, 2020)

Yeston are going to release their 6800 XT in December.
Its looks are the kind you either love or hate.


----------



## turbogear (Dec 5, 2020)

I have owned the 6800 XT reference card for a few days now.

I have been working on finding a nice undervolt setting for it.
I now have the limits set at 2530 MHz @ 1000 mV with the power limit slider at +15%.

No overclock of memory.

I don't have the hardware to measure GPU power consumption alone, but I made some approximations based on readings from my Corsair HX1200i power supply, subtracting the consumption of the other components while observing my system's draw at idle and during a GPU stress test.
I come up with ~294 W for the default 6800 XT setting; with my above-mentioned undervolt (and the raised power limit) I get ~320 W, which is around 26 W more power consumption.
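The subtraction method above can be sketched like this. The input readings below are hypothetical placeholders chosen only to reproduce the ~294 W figure; only the method itself mirrors the post.

```python
def approx_gpu_power(psu_load_w, psu_idle_w, gpu_idle_w):
    """Estimate GPU-only draw from whole-system PSU readings.

    The rest of the system is approximated as the idle reading minus
    the small amount the GPU itself draws at idle; that constant is
    then subtracted from the reading taken during a GPU stress test.
    """
    rest_of_system_w = psu_idle_w - gpu_idle_w
    return psu_load_w - rest_of_system_w

# Hypothetical readings (not from the post): 420 W at the wall under
# GPU load, 130 W system idle, ~4 W GPU idle draw.
print(approx_gpu_power(psu_load_w=420, psu_idle_w=130, gpu_idle_w=4))  # 294
```

Note this ignores PSU efficiency changing with load, so it is only a rough approximation, as the post says.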


I am still on the reference cooler.

I have ordered an EK water block; hopefully it will ship next week.

This is the one I ordered:
EK Water block for RX 6800 and RX 6900 RDNA2 GPUs – EK Webshop (ekwb.com)


Here are the results from Time Spy, with SAM enabled. My score at the default 6800 XT settings with SAM enabled is 17402 points; the result with my settings above is about 7% higher.
In the next days I will test different games to check whether these settings are as stable there as they are in the Firestrike Extreme stress test.








Stability test with Firestrike Extreme:






Here are the settings that I am using:





The Excel graph below shows the frequency distribution over the entire Firestrike Extreme stress test run:


----------



## TUFOM (Dec 5, 2020)

I'll just share my undervolt result for comparison:








I scored 15 901 in Time Spy

AMD Ryzen 7 1800X, AMD Radeon RX 6800 XT x 1, 32768 MB, 64-bit Windows 10

www.3dmark.com


----------



## Deleted member 24505 (Dec 5, 2020)

Caring1 said:


> Yeston are going to release their 6800XT in December.
> It's looks are one of those you either love or hate.
> 
> View attachment 177275
> ...



disgusting unless you like slightly dodgy Anime cartoons


----------



## TUFOM (Dec 5, 2020)

tigger said:


> disgusting unless you like slightly dodgy Anime cartoons


I really dislike designs that block fan airflow, but otherwise it's a cool-looking graphics card.


----------



## turbogear (Dec 5, 2020)

TUFOM said:


> I'll just share my undervolt result for comparison:
> 
> 
> 
> ...


Thanks for sharing.
I have not yet tried a memory overclock. I will add that to my settings above and see how it turns out.

I am also going to try MorePowerTool and play around with the power limits.

I see that you also increased the minimum frequency.
Doesn't the GPU then always run at 2450 MHz, even at idle?


----------



## TUFOM (Dec 5, 2020)

turbogear said:


> Thanks for sharing.
> I have not yet tried memory overclock. I will add that to my above setting and see how that results.
> 
> I am also going to try MorePowerTool and play around power limits.
> ...


Nah, it's just a target minimum; the GPU tries to avoid dipping below it under load, but it can if limits are reached. Basically it doesn't do much. Also, memory overclocking is almost useless; Infinity Cache is doing its job.


----------

