# AMD Radeon RX 6900 XT



## W1zzard (Dec 8, 2020)

AMD's Radeon RX 6900 XT offers convincing 4K gaming performance, yet stays below 300 W. Thanks to this impressive efficiency, the card is almost whisper-quiet, quieter than any RTX 3090 we've ever tested. In our review, we not only benchmark the RX 6900 XT on Intel, but also on Zen 3, with fast memory and Smart Access Memory enabled.



----------



## pjl321 (Dec 8, 2020)

Wake me up in March when stock and prices have calmed down!


----------



## metalslaw (Dec 8, 2020)

Underwhelming.


----------



## SystemMechanic (Dec 8, 2020)

all sold out lols


----------



## xkm1948 (Dec 8, 2020)

For a top-of-the-line product, I would expect top-tier DXR performance along with all the accessories and features. Probably good for folks who only play rasterization-based games, but I am not convinced this is a good enough offering for the absolute best-of-the-best flagship halo product.


----------



## R0H1T (Dec 8, 2020)

Perhaps the first time in a long long time we've seen AMD beat Nvidia at any resolution, even with a slight handicap!


metalslaw said:


> Underwhelming.


Oh really?


----------



## Shatun_Bear (Dec 8, 2020)

More or less equal to the 3090 at 1440p and faster at 1080p for $500 less. Impressive.

For the 3090 you're paying an extra $500 for better RT and DLSS 2.0 (at least until FidelityFX Super Resolution is released; once that's out, this incentive is mostly gone).


----------



## metalslaw (Dec 8, 2020)

R0H1T said:


> Oh really?



5-7% faster than the 6800 XT on the same system, for $350 extra. Yes. It should be priced at $700-$750; then it's a good buy.


----------



## HD64G (Dec 8, 2020)

Irrelevant card for 99% of PC customers. Good review as usual @W1zzard! It's also good that the SAM feature, which is coming to more CPU-GPU combos soon, does so much to reduce the CPU bottleneck below 4K in the games that need it (even a 35-40% FPS increase in some games at 1080p). Kudos to AMD for bringing the feature to Windows, and once again for fighting at the top tier of the GPU and CPU markets at once. Let's hope availability and stock improve soon so gamers can buy new hardware at proper prices.


----------



## R0H1T (Dec 8, 2020)

metalslaw said:


> 5-7% faster than the 6800 XT on the same system, for $350 extra. Yes. It should be priced at $700-$750; then it's a good buy.


Well that makes more sense but I guess the full (enabled) chip always carries a premium. The last 1% or 0.1% extra performance can cost anywhere between 10-100% more depending on what you're buying.


----------



## cueman (Dec 8, 2020)

So now it's clear and final: Nvidia still has the fastest GPU.

Even the RX 6900 XT, with SAM on and fast memory, can't beat Nvidia's RTX 3090 Founders Edition.

And as we saw with the RX 6800 XT, overclocked AIB versions don't help.

Now waiting for an RTX 3080 Ti 20GB GPU,

and, in Q1/2021, RTX 3000-series Super versions with 20 GB of memory and TSMC 7 nm tech...
and also Nvidia's own resizable BAR option, same as AMD's SAM.


----------



## R0H1T (Dec 8, 2020)

DP!


----------



## the54thvoid (Dec 8, 2020)

I'd call this parity for the first time in years. The eye-candy of RT is an additional feature that's a bonus but not graphically a deal-breaker. TBH, in RT games (Control), I spend time looking at how the environment 'works'. It doesn't add a huge amount to a game. In fact, you'd be hard pressed to tell if it was running unless you look for those effects. RT is very cool, but it's still in its infancy.

So, more efficient, and quieter for it. I'd be 'sold' on it if it weren't for my G-Sync monitor. Haters will still hate, but considering where AMD were in the graphics arena, they've brought the goods as far as I'm concerned.


----------



## RedelZaVedno (Dec 8, 2020)

Really nice GPU; it makes the 3090's pricing look kind of silly (but neither of them is worth the price premium over the 6800XT/3080 imho). With that being said, only *35* 6900XTs for one of the wealthiest countries in the world? What is AMD thinking? They could sell thousands of 6900s in Switzerland. Maybe they actually don't want to sell them at all? 80-CU 6900 GPUs rebranded as Instinct prosumer cards could be sold to the prosumer market for triple the price. Maybe that's the catch: the 6900XT is here just for marketing purposes, to disturb the 3090's victory lap and make Nvidia look greedy (well, Nvidia is greedy, we don't need additional proof; all corporations are fueled by shareholders' greed. What we need is decent availability and MSRP pricing honored by AIBs).


----------



## R0H1T (Dec 8, 2020)

RedelZaVedno said:


> only *35* 6900XTs for one of the wealthiest countries in the world?


What does that have to do with anything? They're still probably getting more stock per capita than most of Asia. Now, I know disposable income isn't as high over here, but there are some ~*4 billion* people and at least 1000x as many potential buyers.


----------



## tomc100 (Dec 8, 2020)

The first thing I check for now is power consumption on multi-monitor and it does not disappoint.  Also, it's almost as fast as the rtx 3090 and as drivers mature it'll probably match it as well.  Probably will have to wait until July 2021 to buy a water cooled version though.


----------



## Atnevon (Dec 8, 2020)

> one USB type-C with DisplayPort passthrough



@W1zzard Will this let me hook up the LG 5K screen that is geared to the macOS crowd and get native 5K from this card?

There are ways to get 5K, but straightforward plug-and-play is not one of them. Will this card change that?

EDIT: Tagged W1zzard as it's been a while since I've posted regularly; forgot I can do that! I know I'm in a small minority on this, but I do dream of having those juicy pixels for both my computers if it can be had.

EDIT 2: I found an article showing the mad method I hope a simple Thunderbolt 3 cord from this 6900 XT will help me avoid: https://www.unixtutorial.org/project-connect-lg-5k-display-to-pc/


----------



## TomTomTom (Dec 8, 2020)

Parity, but the AMD card draws less power.


----------



## metalslaw (Dec 8, 2020)

R0H1T said:


> Well that makes more sense but I guess the full (enabled) chip always carries a premium. The last 1% or 0.1% extra performance can cost anywhere between 10-100% more depending on what you're buying.



Yeah. Halo-product prices for video cards these days are ridiculous. A little extra is OK, but 50%+ extra dollars for a 5-10% uplift in performance is pretty egregious.


----------



## phill (Dec 8, 2020)

Oh sign me up Scotty!!     I've gotta get one...  Maybe two for a few rigs and well, you know, science


----------



## Xuper (Dec 8, 2020)

First review of the 5700 XT (2019/07/07): 5700XT = 100%, 2060S = 108%, 2070 = 112% (1080p/1440p)
Right now, this 6900XT review shows: 5700XT = 65%, 2070 = 62%, 2060S = 60% (1080p)
After 13 months: 5700XT = 104.8%, 2070 = 103.3%, 2060S = 100%
So I expect that after another 12 months, the 6900XT is going to surpass the 3080/3080 Ti or Super.
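As a rough sketch of the arithmetic behind that extrapolation (the percentages are the ones quoted above; the `rebase` helper is purely illustrative, not anything from the reviews):

```python
def rebase(scores: dict[str, float], baseline: str) -> dict[str, float]:
    """Re-express review 'relative performance' percentages against a chosen baseline card."""
    base = scores[baseline]
    return {card: round(score / base * 100, 1) for card, score in scores.items()}

# 1080p relative-performance numbers quoted in the post above
launch_2019 = {"5700XT": 100, "2060S": 108, "2070": 112}  # from the 5700 XT launch review
today_2020 = {"5700XT": 65, "2060S": 60, "2070": 62}      # from this 6900 XT review

print(rebase(launch_2019, "2060S"))  # 5700XT: 92.6 -- slower than a 2060S at launch
print(rebase(today_2020, "2060S"))   # 5700XT: 108.3 -- faster than a 2060S today
```

The point being made: relative standings drift as drivers and game suites change, so the 6900XT's position against Ampere today isn't necessarily where it will sit in a year.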


----------



## newtekie1 (Dec 8, 2020)

Same performance as a 3080 for $300 more, sounds awesome.


----------



## dicktracy (Dec 8, 2020)

Embarrassing RT performance, as expected. If you’re willing to throw $1000 on a GPU, wait for the 3080ti.


----------



## RedelZaVedno (Dec 8, 2020)

newtekie1 said:


> Same performance as a 3080 for $300 more, sounds awesome.


Not really... BUT yeah I agree, 3090/6900XT are intended for ppl with more money than sense


----------



## Fleurious (Dec 8, 2020)

Impressive performance at a disappointing price.


----------



## Divide Overflow (Dec 8, 2020)

Very impressive.  It's great to see competition at the high end of the GPU market once again!


----------



## newtekie1 (Dec 8, 2020)

RedelZaVedno said:


> Not really... BUT yeah I agree, 3090/6900XT are intended for ppl with more money than sense



Yes, really.

Equal to 3080: *(chart)*

Equal to 3080 again: *(chart)*

Equal to 3080 again: *(chart)*
Don't post random graphs from an unnamed review in a review for the darn card.  We trust the results of TPU, not random graphs you made up.


----------



## yeeeeman (Dec 8, 2020)

Only 5% better than the 6800 XT?


----------



## Vya Domus (Dec 8, 2020)

newtekie1 said:


> Equal to 3080:



Which is about as "equal" to a 3090 using the same logic.  

Don't know what you're on about.


----------



## RedelZaVedno (Dec 8, 2020)

newtekie1 said:


> Yes, really.
> Don't post random graphs from an unnamed review in a review for the darn card.  We trust the results of TPU, not random graphs you made up.


These numbers are based on an 18-game average from Steve's Hardware Unboxed video, one of the most trusted reviewers. The same conclusion was reached by Gamers Nexus: 1080p and 1440p wins for the 6900XT, 4K and RT/DLSS wins for the 3090. Both said you should not buy either.

Here are full reviews:
Radeon RX 6900 XT Review, AMD's Fight For the Top - YouTube
AMD Radeon RX 6900 XT GPU Review & Benchmarks: Undervolting, Gaming, Power, Noise - YouTube


----------



## newtekie1 (Dec 8, 2020)

Vya Domus said:


> Which is about as "equal" to a 3090 using the same logic.
> 
> Don't know what you're on about.



A 5-8% gap is a difference; a 1-2% gap is not. It's a pretty simple concept for most people to grasp.



RedelZaVedno said:


> These numbers are based on 18 games from Steve's Hardware Unboxed video, one of the most trusted reviewers. Here's the full review. The same conclusion was reached by Gamers Nexus: 1080p and 1440p wins for the 6900XT, 4K and RT/DLSS wins for the 3090.



Steve and Gamers Nexus are both hacks compared to TPU when it comes to performance testing.


----------



## Vya Domus (Dec 8, 2020)

newtekie1 said:


> A 5-8% is a difference, a 1-2% is not.



Says who, you? How did you come up with that one?


----------



## R0H1T (Dec 8, 2020)

newtekie1 said:


> A 5-8% is a difference, a 1-2% is not.  It's a pretty simple concept to grasp for most people.


5-8% for 110~150% more price? Great logic there!


----------



## Fluffmeister (Dec 8, 2020)

I expected a bit more too frankly, I think the 3080 or the 6800 XT are better buys.


----------



## RedelZaVedno (Dec 8, 2020)

newtekie1 said:


> Steve and GamersNexis are both hacks compared to TPU when it comes to performance testing.


Whatever you say...


----------



## Chrispy_ (Dec 8, 2020)

Ignoring the fact that you can't buy one anyway, this is the most pointless card in the world. It's the card that the 6800XT should have been, and thanks to poor optimisation or drivers doesn't even match the 3080 that the 6800XT is supposed to rival.

IMO, the 6900XT should have been a larger chip - it doesn't have enough shaders to compete with the 3090 at sensible power or cooling limits. At $999 MSRP it's a joke, because it's only even a match for the $699 3080FE in traditional raster-based loads anyway. The minute you turn on DXR it's fighting for survival between the $399 and $499 cards from Nvidia.

Once all the dust settles from the scalping and stock problems, I'd expect to see the 6900XT sell at $699 since it's still down on DXR, DLSS, and NVENC compared to the 3080 to name just a few shortcomings. The 6800XT belongs at the $579 price point the 6800 launched at - it's quite clearly fighting the 3070 in many of the tests and falls short of it with any mention of DXR or DLSS, something that's definitely gaining traction in 2020 with both new consoles including it.

As for the vanilla 6800, it's a runt using massively defective silicon. 25% of the card disabled? Jesus! Hopefully that model will vanish altogether when yields improve and the 6800XT comes down to its $579 price point. If you're after raytracing performance, just get a $349 RTX 2060 6GB from 2018. That's honestly how bad it is. In fact, the $299 2060 KO is about equal :\


----------



## xkm1948 (Dec 8, 2020)

Fluffmeister said:


> I expected a bit more too frankly, I think the 3080 or the 6800 XT are better buys.




Not with the AIB/AIC prices nowadays, and the lackluster RT and VR performance.

I wonder how many of TPU's AMD fans will actually buy one, now that at least the rasterization performance is there.


----------



## IbaChiba (Dec 8, 2020)

You know, I really hoped this increased competition would have led both brands to open up everything in terms of overclocking to get the advantage. Disappointed, honestly.


----------



## Vya Domus (Dec 8, 2020)

IbaChiba said:


> you know I really hoped this increased competition would have led both brands to open up everything in terms of overclocking to get the advantage, disappointed honestly.



What would have "increased competition"? Seeing AMD at the top of the chart with 1% more performance or something like that?

Nothing else cuts it? Is that really what the average consumer is looking for?


----------



## IbaChiba (Dec 8, 2020)

Vya Domus said:


> What would have "increased competition", seeing AMD at the top of chart with 1% more performance or something ?
> 
> Nothing else cuts it ?


I was just thinking of removing the limits imposed by the companies, is all. Is someone who spends $1,000 on a GPU the average consumer?


----------



## bug (Dec 8, 2020)

You know what? I don't care if this doesn't beat the 3090 across the board or if its RTRT performance kind of sucks. I hope this is enough to curb Nvidia's trend of putting an arbitrary price tag on cards offering minute performance improvements.
I understood why Nvidia did it and I didn't mind much, because I'm not buying into that price bracket anyway. But I'd love to see that practice put to an end.


----------



## Vya Domus (Dec 8, 2020)

IbaChiba said:


> Is someone who spends $1,000 on a GPU the average consumer?



I don't know, but I imagine he doesn't seriously expect the $1,000 product to handily outperform the one that costs $1,500. Is 8% less performance for a third less money not competitive?


----------



## newtekie1 (Dec 8, 2020)

R0H1T said:


> 5-8% for 110~150% more price? Great logic there!


I didn't say the 3090 was a good buy either.  But, if I'm buying something that is actually better than the 3080, the only option is the 3090 right now.  And paying $1,000 for a 6900XT just to get identical performance to a 3080 for $300 cheaper is an even worse buy than the 3090.



bug said:


> I hope this is enough to curb Nvidia's trend of putting an arbitrary price tag on cards offering minute performance improvements.



And yet this is a perfect example of AMD putting an arbitrary price tag on a card offering minute performance improvements.


----------



## Xaled (Dec 8, 2020)

newtekie1 said:


> Same performance as a 3080 for $300 more, sounds awesome.


Slightly better performance for the same price. The 3080's $700 price is just fake.


----------



## Vya Domus (Dec 8, 2020)

Xaled said:


> Slightly better performance for the same price. The 3080's 700$ price is just fake.



They're all fake, arguing that one is better than the other is pretty bewildering.


----------



## btk2k2 (Dec 8, 2020)

newtekie1 said:


> Yes, really.
> 
> Equal to 3080:
> 
> ...



There are 5 games where the AMD DX11 driver causes a lower fps ceiling than the NV DX11 driver: AC:O, B3, D:OS2, FC5, and PC3. In all of them, the fps drop for the 6900XT and the 3080/3090 from 1080p to 1440p is essentially zero, yet the 3080/3090 are 10% to 30% faster in these titles. Doing a simple average of those 5 games alone at 1080p, the 6900XT gets 113 fps and the 3080/3090 get 131 fps, a 16% advantage for the 3080/3090. For the 3080 to be only 1% faster on average in the summary despite this 16% advantage in 5 of the 23 games tested shows that in the other 18 games the 6900 is actually faster at 1080p, as the HUB numbers show.

In all cases the 6900XT is getting a 95+ fps average, so it is not exactly performing badly, and if any of those games is a person's primary game, that's an entirely valid reason to choose an NV card over an AMD one. But it does skew the summary a bit, and it's not really representative of future titles considering the Series X is DX12-only.

EDIT: Give it a few years and when these DX11 titles drop off of the review suite some will claim AMD fine wine.
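The averaging above can be checked in a couple of lines (only the two quoted averages are used; the individual per-game fps values aren't reproduced here):

```python
# Simple means quoted for the five DX11-capped titles (AC:O, B3, D:OS2, FC5, PC3) at 1080p
avg_6900xt = 113.0     # fps
avg_3080_3090 = 131.0  # fps

# Relative advantage of the 3080/3090 over the 6900XT in that subset
advantage_pct = (avg_3080_3090 / avg_6900xt - 1) * 100
print(f"3080/3090 advantage in the DX11-limited set: {advantage_pct:.0f}%")  # -> 16%
```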


----------



## Xaled (Dec 8, 2020)

Vya Domus said:


> They're all fake, arguing that one is better than the other is pretty bewildering.


Not that I'm defending AMD, but Nvidia's real prices are much more "fake", and Nvidia has been doing this systematically since the 2000 series. AMD at least claims that prices will settle in early 2021. Nvidia's 2000-series prices, on the other hand, never settled, and the cards were never sold at the announced prices.


----------



## Blueberries (Dec 8, 2020)

When your 3090 killer offers 40% less Perf/$ than a 3080.

Ouch


----------



## StafuBi (Dec 8, 2020)

Can't wait to see 3 GHz overclock attempts on AIB cards


----------



## RainingTacco (Dec 8, 2020)

We are once again comparing GPUs not by their pure rasterization power (which will diminish in importance over time because of die-size/price ratios, or become very expensive), but, like in the old times, by features. In the early days, DirectX versions changed quite often and brought many features, and people bought GPUs that supported the new tech. As it is currently, why buy a $1k GPU that doesn't have useful features?
5nm GPUs will be very expensive, hence the importance of VRS and DLSS if RT becomes mainstream.


----------



## 15th Warlock (Dec 8, 2020)

Very impressive comeback for AMD, they’re firing on all cylinders no doubt, if you told me a year ago AMD would surpass Intel and compete directly with nvidia’s best, I would’ve asked you to give me a hit of whatever you were smoking, and yet, here we are.

I would like to request a follow up article once Nvidia enables PCIe BAR on their drivers, to see how it compares to SAM mode.

Thanks for the review.


----------



## FeelinFroggy (Dec 8, 2020)

Great review as always. It's a good card, just overpriced. In the real world you won't see any difference in-game between this card and the 6800XT or the 3080; only in benchmarks would you see the handful of percentage points difference in scoring. And don't even bring up 1080p, as anyone using this card at 1080p is an idiot.

But it's nice to see AMD back at the big-boy table again. This should bring back enough competition for the 2021 refreshes, and we will see prices drop (I'm talking about you, Nvidia; the 3090 price is a joke).


----------



## Beertintedgoggles (Dec 8, 2020)

Blueberries said:


> When your 3090 killer offers 40% less Perf/$ than a 3080.
> 
> Ouch



Wow, what an incredibly stupid observation... You do realize that the 3090 (even at MSRP, not inflated prices) offers 68% less perf/$ than the 3080?

Edit: That's assuming you're talking about the 4K resolution; it gets even worse at lower resolutions.
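The perf-per-dollar back-and-forth in this thread reduces to a one-line ratio. A sketch, using launch MSRPs and rough 4K relative-performance placeholders (assumptions for illustration, not TPU's exact summary figures):

```python
# Relative 4K performance (3080 = 1.00) and launch MSRPs; both are rough assumptions.
cards = {
    "RTX 3080": (1.00, 699),
    "RTX 3090": (1.10, 1499),
    "RX 6900 XT": (1.02, 999),
}

# Performance per dollar, normalized to the 3080
ppd = {name: perf / price for name, (perf, price) in cards.items()}
baseline = ppd["RTX 3080"]
for name, value in ppd.items():
    print(f"{name}: {value / baseline:.0%} of the 3080's perf/$")
```

With these placeholder numbers the 3090 lands around half the 3080's perf/$ and the 6900 XT around 70%, which is why both halo cards fare poorly on this metric regardless of which one you pick on.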


----------



## kruk (Dec 8, 2020)

Well, Lisa Su wasn't kidding when she said that high-end GPUs are back on the menu. The proper reaction from Nvidia would be to drop the 3090's price to $999, but instead they will put another SKU between the 3080 and 3090 so they can keep milking the fanboys. And judging from the reactions to this card, they will happily take it...


----------



## WeeRab (Dec 8, 2020)

dicktracy said:


> Embarrassing RT performance, as expected. If you’re willing to throw $1000 on a GPU, wait for the 3080ti.


Or indeed the  2080ti


----------



## W1zzard (Dec 8, 2020)

btk2k2 said:


> EDIT: Give it a few years and when these DX11 titles drop off of the review suite some will claim AMD fine wine.


Won't be that long; AC:V is out already, so it replaces AC:O. PC3 is junk, FC6 soon



15th Warlock said:


> I would like to request a follow up article once Nvidia enables PCIe BAR on their drivers, to see how it compares to SAM mode.


definitely


----------



## WeeRab (Dec 8, 2020)

Weird how everyone wants AMD to lower their prices - But no-one wants Nvidia to lower theirs....


----------



## LFaWolf (Dec 8, 2020)

I am disappointed with the performance per dollar from both companies. Even the rumored 3080 Ti will not do it for me. Guess I am skipping this gen from both companies and sticking with my 1080 Ti a bit longer.


----------



## Super XP (Dec 8, 2020)

metalslaw said:


> Underwhelming.


So are the RTX 3080 and the 3090, then, if anyone claims the 6900XT is underwhelming



LFaWolf said:


> I am disappointed with the performance per dollar from both companies. Even if the rumored 3080 ti will not do it for me. Guess I am skipping over this gen from both companies and stick with my 1080 ti a bit longer.


Exactly. Both are price gouging, unfortunately. I blame Nvidia for the price hiking that started with the RTX 2000 series, and AMD for not being able to properly compete back then; now that they do have competitive products, they are merely matching Nvidia's ALREADY overpriced GPUs. Best to just sit this one out until prices come down to reality.


----------



## damric (Dec 8, 2020)

All these awesome cards this year seem so close, but so far away.


----------



## mahoney (Dec 8, 2020)

Reminds me of the old Hawaii days: the 6900XT is the X model and the 6800XT the non-X, yet now you pay $200 more for a similar performance gap. The 3090 is at least amazing in Blender, while this is worse than a 3060 Ti, not to mention Nvidia has DLSS and RTX.
No idea what to think about this. I was expecting it to be a bit faster.


----------



## Mastakony (Dec 8, 2020)

Useless card like RTX 3090 BUT a useless thing 500$ cheaper LOL
RTX 3090 saved by 10 RTX games ROFL

Funny day


----------



## Bjorn_Of_Iceland (Dec 8, 2020)

It is nice if you want to bank on playing last gen games :'D


----------



## RedelZaVedno (Dec 8, 2020)

All the current GPU 'releases' look totally stupid at today's inflated prices. Just to name the cheapest Ampere GPUs (+2080ti) in stock at Mindfactory (Germany):

8GB Palit GeForce RTX 3060 Ti Dual = €549 ($664)
8GB MSI GeForce RTX 3070 GAMING X TRIO = €739 ($895)
10GB MSI GeForce RTX 3080 SUPRIM X = €1,089 ($1,320)
11GB Palit GeForce RTX 2080 Ti = €1,198 ($1,450)
24GB KFA2 GeForce RTX 3090 SG = €1,789 ($2,166)
... and ZERO of AMD's Navi 21 GPUs; only the RX 5600 XT (Navi 10) is left in stock, for €320 ($387) and up, inflated from €260 not long ago.

A TOTAL SHITSHOW. What does it matter how good or bad these GPUs are if you have to sell your wife to buy one (well, mine certainly isn't worth a 3090 at current pricing)?


----------



## Blueberries (Dec 8, 2020)

Beertintedgoggles said:


> Wow, what an incredibly stupid observation....  You do realize that the 3090 (even at MSRP price and not inflated) offers 68% less Perf/$ than the 3080.......?
> 
> Edit:  Assuming you're talking about the 4K resolution, it gets even worse at the lower resolutions



I'm comparing the two because they're equivalent in performance.

But thank you for the stupid observation


----------



## R0H1T (Dec 8, 2020)

RedelZaVedno said:


> What does it matter how good/bad these GPUs are if you have to sell your wife to buy them (well mine certainly isn't worth 3090 at current pricing  )?


You know where your priorities lie when you have to sell your wife to get a GPU instead. Not talking about you per se, but really, is this what the world has come to?


----------



## z1n0x (Dec 8, 2020)

WeeRab said:


> Weird how everyone wants AMD to lower their prices - But no-one wants Nvidia to lower theirs....


The same people that complained about AMD not being competitive in the past, and blamed Intel/Nvidia's high prices on AMD.
For them, AMD's sole reason for existence is to lower Intel/Nvidia prices for them.


----------



## laszlo (Dec 8, 2020)

W1zzard said:


> definitely



@W1zzard re: Nvidia & PCIe BAR, I was under the impression that without the big cache in the GPU this is not feasible on current generations...


----------



## W1zzard (Dec 8, 2020)

laszlo said:


> @W1zzard re: Nvidia & PCIe BAR, I was under the impression that without the big cache in the GPU this is not feasible on current generations...


Why would you think so? BAR addresses how data travels from the CPU to the GPU's DRAM over PCIe. The L3 cache affects data traveling between the GPU cores and DRAM over the GPU's internal memory bus.

The L3 is really, really small relative to the transfer amounts we're talking about here, just 128 MB, while PCIe transfers multiple GB in a second.
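To put that scale difference in numbers (the 32 GB/s figure is the theoretical PCIe 4.0 x16 rate, an assumption on my part, not from the review):

```python
cache_gb = 128 / 1024  # Infinity Cache: 128 MB expressed in GB
pcie_gbps = 32.0       # theoretical PCIe 4.0 x16 bandwidth, GB/s (assumed)

# Time for the PCIe link to move one full cache's worth of data
ms_to_stream_cache = cache_gb / pcie_gbps * 1000
print(f"PCIe 4.0 x16 could stream the entire cache in ~{ms_to_stream_cache:.1f} ms")  # ~3.9 ms
```

In other words, the link refills the whole cache many times per frame's worth of traffic, so the cache can't be what gates resizable BAR.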


----------



## RedelZaVedno (Dec 8, 2020)

R0H1T said:


> You know where your priorities lie when have to sell your wife to get a GPU instead, not talking about you per se but really is this what the world has come to?


I was joking, she's totally worth a 3090 at MSRP  With that being said, I'm a value seeker searching for something fast enough to use an Oculus Quest 2 with. I'll probably opt for a 3060 Ti or 6700XT in the end when prices normalize, whichever offers the better price-to-performance ratio.


----------



## FeelinFroggy (Dec 8, 2020)

WeeRab said:


> Weird how everyone wants AMD to lower their prices - But no-one wants Nvidia to lower theirs....



That goes without saying.  We have been preaching for Nvidia to lower their prices for years.


----------



## Beertintedgoggles (Dec 8, 2020)

Blueberries said:


> I'm comparing the two because they're equivalent in performance.
> 
> But thank you for the stupid observation



Then why bring up the 3090 if you are supposedly comparing it to a 3080? A stupid observation is still stupid.

Now if you want to compare the 3080 and the 6800XT (or even the 6800), neither of which is a halo product that pushes performance at the cost of efficiency and sensible pricing, you'll see that the Nvidia and AMD cards are essentially identical in performance/$. Comparing the two halo products (3090 and 6900XT), the charts clearly show that although AMD's performance isn't as good as Nvidia's, AMD's cost-to-performance ratio is much better (again, regarding halo products).

From there you can bring up the performance numbers disregarding everything else, and yeah, Nvidia seems to come out on top, but you weren't doing that... You only mentioned performance/$ on AMD's halo product while completely putting on blinders about how poor that metric is for Nvidia's halo product.


----------



## z1n0x (Dec 8, 2020)

The rBAR thing is so puzzling to me. Sometimes it does nothing; in other cases it provides a decent boost.


----------



## Blueberries (Dec 8, 2020)

Beertintedgoggles said:


> Then why bring up the 3090 if you are supposedly now comparing it to a 3080?  Stupid observation is still stupid.  Now if you want to compare the 3080 and 6800XT (or even the 6800), both of which are not the halo products which push performance at the cost of efficiency and sensible pricing, you'll see that the Nvidia and AMD cards are essentially identical in performance / $.  Comparing the two halo products (3090 and 6900XT), the charts clearly show that although the performance of AMD isn't as good as Nvidia, AMD's cost to performance ratio is much better (again, regarding halo products).  From here you can bring up the performance numbers disregarding anything else and yeah, Nvidia seems to come out on top but you weren't doing that....  You only mentioned performance / $ on a halo product from AMD while completely putting on your blinders on how poor that metric is for Nvidia's halo product



Yes, why would I compare two equally performing cards and their price points?

It's better to be assumed a fool...


----------



## SIGSEGV (Dec 8, 2020)

nice review. 
AMD is back. AMD is clearly the winner here. 
Congrats AMD


----------



## newtekie1 (Dec 8, 2020)

Xaled said:


> Not that I am defending AMD, but Nvidia's real prices are much more "fake" and Nvidia has been systematically doing this since 2xxx series. While AMD claims that prices will settle in early 2021. On the other hand Nvidia's 2xxx series never settled and were never sold for the announced prices.



Every nVidia card I've ever owned, including several in the 2000 series, has been bought at MSRP or below, so I have to disagree with you. Hell, you could buy 2080 Tis off eVGA's site for $1,199 on launch day; they just sold out quickly, and a couple months after launch they had 2080 Tis for $999.


----------



## B-Real (Dec 8, 2020)

metalslaw said:


> Underwhelming.


If the 6900XT is underwhelming, the 3090 is just as underwhelming too.


----------



## mechtech (Dec 8, 2020)

pjl321 said:


> Wake me up in March when stock and prices have calmed down!



March 2022 or 2023??


----------



## Xaled (Dec 8, 2020)

newtekie1 said:


> Every nVidia card I've ever owned, including several in the 2000 series, have all been bought at MSRP or below. So I have to disagree with you.  Hell, you could buy 2080Ti's off eVGA's site for $1,999 on launch day, they just sold out quickly, and a couple months after launch they had 2080Ti's for $999.


Even now, after more than a year, 2080 Tis can't be found for that price


----------



## Hattu (Dec 8, 2020)

Nice review as always, nice card, bad price. Really good that there's competition.


----------



## LFaWolf (Dec 8, 2020)

Xaled said:


> even now, after more than a year, 2080 tis cant be found for that price



Those are either refurbished products or sold by 3rd-party sellers. Stock of the 2080 Ti is mostly gone.


----------



## mechtech (Dec 8, 2020)

I’m a bit surprised they didn’t use a 512-bit memory bus, or GDDR6X, or something more to distinguish it from the 6800XT.

I really hope once the $240-$280 (US$) cards come out that there is good supply at MSRP.


----------



## Beertintedgoggles (Dec 8, 2020)

Blueberries said:


> Yes, why would I compare two equally performing cards and their price points?
> 
> It's better to be assumed a fool...



Awwwww, don't go changing your argument now.



Blueberries said:


> When your 3090 killer offers 40% less Perf/$ than a 3080.
> 
> Ouch





Beertintedgoggles said:


> Then why bring up the 3090 if you are supposedly now comparing it to a 3080?  Stupid observation is still stupid.  Now if you want to compare the 3080 and 6800XT (or even the 6800), both of which are not the halo products which push performance at the cost of efficiency and sensible pricing, you'll see that the Nvidia and AMD cards are essentially identical in performance / $.  Comparing the two halo products (3090 and 6900XT), the charts clearly show that although the performance of AMD isn't as good as Nvidia, AMD's cost to performance ratio is much better (again, regarding halo products).  From here you can bring up the performance numbers disregarding anything else and yeah, Nvidia seems to come out on top but you weren't doing that....  *You only mentioned performance / $ on a halo product from AMD while completely putting on your blinders on how poor that metric is for Nvidia's halo product*



Oh look, I can add pictures too!


----------



## newtekie1 (Dec 8, 2020)

LFaWolf said:


> Those are either refurbished products or sold by 3rd party sellers. Stock of 2080 ti are mostly gone.



Exactly.

*(Amazon price-history chart)*

That's the Amazon price history, not 3rd-party sellers, for the 2080 Ti XC Ultra, which had an MSRP of $1,200. It was pretty consistently at or below $1,200 until the card was discontinued. And the cards that actually had competition from AMD settled in price pretty quickly.


----------



## r9 (Dec 8, 2020)

WeeRab said:


> Weird how everyone wants AMD to lower their prices - But no-one wants Nvidia to lower theirs....


Both 3090 and 6900xt are intended for people with more money than common sense.


----------



## Xaled (Dec 8, 2020)

LFaWolf said:


> Those are either refurbished products or sold by 3rd party sellers. Stock of 2080 ti are mostly gone.


I've been checking prices for months and they were always the same; that's why I skipped the whole 2000-series generation.


newtekie1 said:


> Exactly.
> 
> 
> 
> ...


So when did you buy it for $999? It only got close to that price for maybe ONE day. Can't you see it was sold for $1,200, which is $200 higher than the MSRP, nine months after release?


----------



## Blueberries (Dec 8, 2020)

Beertintedgoggles said:


> Awwwww, don't go changing your argument now.




My "argument" was that the 3080 costs 40% less for the same performance as this card. That's the only logical conclusion any consumer with a fraction of intelligence would draw from these charts. 

You're going off on some tantrum about comparing flagships, like people compare Corvettes to Toyota Avalons or something. It makes no sense. You compare products of similar performance, not because they're "flagships"; nobody does that. 

In AMD's marketing slides the 6900XT outperformed the 3090; instead it's doing 3080 numbers at 40% more cost. There's no market for this card. 

I have exhausted my patience trying to figure out what planet you live on or how your brain cells fire, but if comparing the cost of two equally performing cards is a "stupid observation" to you then god have mercy on your soul.

Have a blessed day.


----------



## RainingTacco (Dec 8, 2020)

newtekie1 said:


> Every nVidia card I've ever owned, including several in the 2000 series, have all been bought at MSRP or below. So I have to disagree with you.  Hell, you could buy 2080Ti's off eVGA's site for $1,999 on launch day, they just sold out quickly, and a couple months after launch they had 2080Ti's for $999.



Also remember that Nvidia GPUs hold their value over time better than AMD equivalents.


----------



## Ferrum Master (Dec 8, 2020)

It overclocks like a turd because of the power limit. No point getting a water block on these.

The question remains: even if you crank up the power limit, does it allow higher frequencies? The 3090 remains a poor clocker no matter how much power you feed the hog.

What's the VRM noise on these? Is there any reserve?


----------



## kruk (Dec 8, 2020)

Blueberries said:


> My "argument" was that the 3080 costs 40% less for the same performance of this card. Which is the only logical conclusion that any consumer with a fraction of intelligence would gather from these charts.
> 
> You're going off on some tantrum about comparing flagships like people compare Corvettes to Toyota Avalons or something. It makes no sense. You compare products of similar performance, not because they're "flagships," nobody does that.
> 
> ...



You are a 3090 user who paid *more than twice the price* of a 3080 for *10% more performance*, and you're now complaining about the 6900XT's performance per price. Come on ...


----------



## 0x4452 (Dec 8, 2020)

It should be priced the same as a 3080. It is 1-2% faster, but without the full feature set of the GeForce.

The 3090's price is stupid, but its lead over the 6900 XT (5%) is much bigger than the 6900 XT's 1-2% lead over the 3080, so it gets the halo price pump.


----------



## Fluffmeister (Dec 8, 2020)

Yeah, the RTX 3080 already has this beat for less money; no doubt an upcoming 3080 Ti with 20GB of VRAM for the same $999 will be the final nail in its virtual coffin.


----------



## Blueberries (Dec 8, 2020)

kruk said:


> You are a 3090 user who paid *more than twice the price* of a 3080 for *10% more performance*, and you're now complaining about the 6900XT's performance per price. Come on ...



NVIDIA can charge whatever they want for the 3090 because presently it has no competition. 

The 6900XT has a direct competitor... that's 40% cheaper. 

This is literally 9th grade economics.


----------



## AvrageGamr (Dec 8, 2020)

All this card does is make the 3080 a better deal. Why spend $300 more for slightly better raster and significantly worse ray tracing than the 3080?


----------



## kruk (Dec 8, 2020)

Blueberries said:


> NVIDIA can charge whatever they want for the 3090 because presently it has no competition.
> 
> The 6900XT has a direct competitor... that's 40% cheaper.
> 
> This is literally 9th grade economics.



Well, I don't think I need any economics advice from someone who thinks 10% additional performance is worth $800+ more ...


----------



## Aquinus (Dec 8, 2020)

Blueberries said:


> NVIDIA can charge whatever they want for the 3090 because presently it has no competition.
> 
> The 6900XT has a direct competitor... that's 40% cheaper.
> 
> This is literally 9th grade economics.


Are we looking at the same review? It's close enough to compete with both the 3090 and the 3080, considering it sits right between the two: performance is between the two, and the price is between the two. What's the problem?


----------



## Blueberries (Dec 8, 2020)

kruk said:


> Well, I don't think I need any economics advice from someone who thinks 10% additional performance is worth $800+ more ...



I haven't made any arguments or given anyone advice. I've just stated a couple of facts that are apparently controversial for unknown reasons.


----------



## altermere (Dec 8, 2020)

little typo: it's vulkan, not vulcan


----------



## Aquinus (Dec 8, 2020)

Blueberries said:


> I haven't boasted any arguments or given anyone advice. I've just stated a couple facts that are apparently controversial for unknown reasons.


It's probably because you're the last person to be talking about perf per dollar.   


kruk said:


> You are a 3090 user who paid *more than twice the price* of a 3080 for *10% more performance*, and you're now complaining about the 6900XT's performance per price. Come on ...


----------



## Sithaer (Dec 8, 2020)

newtekie1 said:


> Every nVidia card I've ever owned, including several in the 2000 series, have all been bought at MSRP or below. So I have to disagree with you.  Hell, you could buy 2080Ti's off eVGA's site for $1,999 on launch day, they just sold out quickly, and a couple months after launch they had 2080Ti's for $999.



Yeah, if you are in America, that is; in my country MSRP is basically nonexistent and you can still say hi to a $400 2060.


----------



## Blueberries (Dec 8, 2020)

Aquinus said:


> It's probably because you're the last person to be talking about perf per dollar.



So once again, performance/cost makes a lot of sense when comparing two products of either equal performance or equal cost. The 3090 is not equivalent in performance or cost to a 3080 or any other product on the market. 

If I had purchased a 3090 when there was another card with the same performance for 60% of the price then yes, that would be nonsensical, unfortunate, and I would be a hypocrite.


----------



## milewski1015 (Dec 8, 2020)

Great review as always. Unfortunately from a consumer standpoint, the 6900XT is pointless. Let's play devil's advocate for a second and take AMD's word for it that the 6900XT is a 3090 killer. Looking at the performance summary page, the 3090 is only 5%/6%/8% faster (1080p/1440p/4K) but for a 33% price increase over the 6900XT. If you're dead set on getting the best AMD GPU you can and don't care about raytracing, then between only the top offerings from each company, the 6900XT seems like a pretty good deal. But unfortunately, the top offerings from each company don't exist in a vacuum. The 6900XT is 1% slower at 1080p and 1%/2% faster at 1440p/4K compared to the 3080, for a 30% price hike. It's 5%/5%/7% faster than the 6800XT, but for an even larger 35% price hike. There's no reason that I can fathom why you'd spend $300 more than the cost of a 3080 for a 2% performance improvement at best. 

It's great to see that AMD can make high end GPUs that perform in the same ballpark as Nvidia's high-end offerings. Nobody expected them to be as competitive as they are this generation, and they deserve the credit for that. The lackluster RT performance I can understand given it's their first go at it while Nvidia has had over a year to further optimize the tech. The biggest issue with this card is the fact that AMD tries to market it head to head with the 3090. Again, sure, at most it's 8% worse for 33% cheaper, but because the 6800XT/3080 are so close to it in performance, there's absolutely no reason to buy one. Now Nvidia will pop in with a 3080Ti that performs better and be able to price it at $999 because AMD set the bar there. Had the 6900XT been $750, a compelling argument could be made to buy one. I think that would be a fair price point. It just barely edges out the 3080 in 1440p and 4K performance, so between that and the greater amount of VRAM, a $50 price hike over the 3080 could be argued if you didn't care about RT. 

Of course, all of this is a moot point anyway since it's almost impossible to buy a new silicon product these days. 

TL;DR: AMD shot themselves in the foot with 6900XT pricing
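On the percentage comparisons in posts like this: "X% cheaper" and "Y% more expensive" describe the same price gap with different bases, which is why the numbers quoted in this thread rarely match. A quick sketch using the launch MSRPs ($999 for the 6900 XT, $1,499 for the 3090); the helper function is hypothetical, just for illustration:

```python
# Hypothetical helper: express a price gap both ways, since "X% cheaper"
# and "Y% more expensive" use different bases.
def price_gap(cheap: float, expensive: float) -> tuple[int, int]:
    delta = expensive - cheap
    pct_cheaper = 100 * delta / expensive   # gap relative to the pricier card
    pct_pricier = 100 * delta / cheap       # gap relative to the cheaper card
    return round(pct_cheaper), round(pct_pricier)

# 6900 XT ($999) vs RTX 3090 ($1,499):
print(price_gap(999, 1499))  # (33, 50): ~33% cheaper, or the 3090 is ~50% pricier
# RTX 3080 ($699) vs 6900 XT ($999):
print(price_gap(699, 999))   # (30, 43)
```

Same gap, two defensible percentages, depending on which card you treat as the baseline.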


----------



## Roy2001 (Dec 8, 2020)

Beats 3080 by only 2%. Should be $799.


----------



## W1zzard (Dec 8, 2020)

Ferrum Master said:


> The question remains, even if you crank up the power limit does it allow higher frequencies?


All my OC testing was done at max power limit, no power limit increase = no gains from OC



altermere said:


> little typo: it's vulkan, not vulcan


fixed, thanks


----------



## Deleted member 193792 (Dec 8, 2020)

I miss the era of $400 flagship GPUs.

Mining will only make things worse...


----------



## tancabean (Dec 8, 2020)

Nice card, well done AMD. Nice review too!

The winner this generation will depend on how many developers AMD can convince to not use RT heavily. If Dirt 5 and Godfall are any indication they’re having some success so far.

What’s probably going to happen though is Nvidia will push RT even harder now that it’s available on baseline console hardware and from both PC vendors.


----------



## Condelio (Dec 8, 2020)

They are all nice cards from both sides. Each day I look more and more at the RTX 3080, with the intention of replacing my trusty old 1070.


----------



## newtekie1 (Dec 8, 2020)

Xaled said:


> So when did you buy it for $999? It only got close to that price for maybe ONE day. Can't you see it was sold for $1,200, which is $200 higher than the MSRP, nine months after release?



Reading comprehension isn't your strong suit, is it? That's a price tracker for AMAZON. I said eVGA directly had it for $999, and it wasn't that card, it was eVGA's basic 2080Ti.



Condelio said:


> They are all nice cards from both sides. Each day i look more and more at rtx3080 with intention of replacing my trusty old 1070



I'm waiting to see if the rumored 3080 Ti comes true. I think that might be my next card for my main computer.


----------



## Ferrum Master (Dec 8, 2020)

W1zzard said:


> All my OC testing was done at max power limit, no power limit increase = no gains from OC



Basically we need to wait for the first shunt mod to see what this card is really capable of. So far this is a show only for pedestrians arguing with each other.


----------



## W1zzard (Dec 8, 2020)

Ferrum Master said:


> Basically we need to wait for the first shunt mod to see what this card is really capable of. So far this is a show only for pedestrians arguing with each other.


AMD doesn't use shunts, the power draw is estimated internally in the GPU afaik


----------



## murr (Dec 8, 2020)

Good luck finding one for $1K; the bots will buy them out and have them up on eBay for $2K. So the real price will be two thousand for a while.


----------



## evernessince (Dec 8, 2020)

metalslaw said:


> 5-7% faster than the 6800xt _for same system_, for $350 extra. Yes. It should be priced at $700-$750, then it's a good buy.



It's also $500 cheaper than the RTX 3090, and I don't see you in the 3090 reviews giving them stick for much worse value. Double standard IMO. Neither card is good value.


----------



## QUANTUMPHYSICS (Dec 8, 2020)

_" The Radeon RX 6900 XT is kind of stuck in this strange place where it’s really bad value compared to the 6800 XT, good value compared to the RTX 3090, but people looking to invest in this class of GPU won’t care about value, they just want the best of the best and in our opinion that isn’t the 6900 XT. Who is going to buy the 6900 XT? Die-hard AMD fans and people who desperately want a 6800 XT but can’t find one and aren’t willing to wait. We're looking at you. "_


*When I started reading, I came here focused on just 3 things:

#1 Price
#2 How well it compared to my 3090 in Microsoft Flight Simulator (since y'all don't use DCS World)
#3 Your conclusion.*

I got 2 of 3 answers: it's $500 less than my 3090, but it performs almost as well at 4K.

Because of the GPU scalping, it's almost impossible to buy anything around here. Microcenter's shelves are empty.

Hopefully, people will be able to actually get one now despite all the scalping going on.


----------



## turbogear (Dec 8, 2020)

I am not disappointed that I went for the 6800XT.
With an undervolt on mine, I am getting around 5%-7% higher performance for around 30W-40W higher power consumption.

I played Borderlands 3 for many hours at DX12 Ultra settings; the boost clock stayed in the 2500MHz range and FPS was always in the 120-130 range.

I suppose 6900XT can also be tuned by undervolting to get higher performance.

@W1zzard
Did you try by undervolting and increasing the Frequency slider?
On my 6800XT I have better overclock by undervolting.
The performance is worse if I set, for example, 2550MHz@1050mV than 2550MHz@1015mV.

The above-mentioned Borderlands 3 results are with 2550MHz@1015mV and the power limit slider at 15%.
My card does not boost very well at default settings in Wattman.


----------



## Darmok N Jalad (Dec 8, 2020)

Finally AMD has its "Zen moment" with GPUs. Doesn't win outright, but they went from not challenging nvidia to offering valid alternatives for all but the "put me in the poorhouse" GPU edition. The energy efficiency is a complete 180 from Polaris, and even a good jump from RDNA/Navi10. Taking the crown would have been nice, but this finally looks like an architecture with some answers. It almost seemed like this day would never come.

Edit:
I should add that this GPU is a big bastard. 26.8B transistors at 7nm, which puts it at 2.6x the size of Navi10. I do wonder if the prices on top GPUs will ever be reasonable again. Probably a lot of investment to recover from both companies, so a fully-enabled chip is going to be rare and expensive. The 3080 and 6800XT are the realistic "top" GPUs. The 3090 and 6900XT are for bragging rights. Basically the Titan-tier products that just get to occupy the top end of the bar graphs.


----------



## dont whant to set it"' (Dec 8, 2020)

If only it had beaten the 3090 outright, forcing Nvidia into a desperate move of "inventing" a "Super Ti" moniker with price cuts across the board.

Great review.

Underwhelming OC potential from this card, though.


----------



## Rob94hawk (Dec 8, 2020)

Fleurious said:


> Impressive performance at a disappointing price.



The real question is: what will the scalpers sell it for?


----------



## AnarchoPrimitiv (Dec 8, 2020)

In negatives it lists:

"Overclocking requires power limit increase" 

Is there a piece of computer hardware that, when overclocked, DOESN'T require a power limit increase?



dont whant to set it"' said:


> If only it had beaten the 3090 outright, forcing Nvidia into a desperate move of "inventing" a "Super Ti" moniker with price cuts across the board.
> 
> Great review.
> 
> Underwhelming OC potential from this card, though.


Underwhelming compared to what?


----------



## dont whant to set it"' (Dec 8, 2020)

To the 6800XT. I expected it to do way better than what the review sample achieved.


----------



## Aquinus (Dec 8, 2020)

AnarchoPrimitiv said:


> Is there a piece of computer hardware that, when overclocked, DOESN'T require a power limit increase?


Undervolting a Vega 64 at the stock power limit will allow the GPU to boost more often resulting in better performance.
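The intuition behind that: to a first order, dynamic power scales with V²·f, so at a fixed power limit, lowering voltage frees headroom for more frequency (or more time spent at peak boost). A rough illustrative model, not AMD's actual boost algorithm:

```python
# First-order dynamic power model: P ≈ k * V^2 * f.
# At a fixed power budget, the sustainable frequency scales as 1/V^2.
def sustainable_freq_ratio(v_stock: float, v_undervolt: float) -> float:
    """Relative frequency headroom after undervolting, same power limit."""
    return (v_stock / v_undervolt) ** 2

# e.g. dropping from 1.05 V to 1.00 V at the same power limit:
ratio = sustainable_freq_ratio(1.05, 1.00)
print(f"~{(ratio - 1) * 100:.0f}% more frequency headroom")  # ~10%
```

Real silicon isn't this clean (leakage, stability margins), but it shows why undervolting can behave like an overclock under a power cap.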


----------



## Solid State Soul ( SSS ) (Dec 8, 2020)

I'm more impressed that AMD's RDNA 2 beat Nvidia's Ampere in power efficiency for more or less the same performance!


----------



## Totally (Dec 8, 2020)

Mission failed. We'll get 'em next time.

Impressive, but it means jack when the target has been missed by such a large margin. I'm not seeing $300 more value over the 3080/6800XT.


----------



## MxPhenom 216 (Dec 8, 2020)

Solid State Soul ( SSS ) said:


> I'm more impressed that AMD's RDNA 2 beat Nvidia's Ampere in power efficiency for more or less the same performance!



Hard to compare on different nodes; Samsung 8nm is basically their 10nm with density improvements, gate pitch reduction, and a high-drive-strength cell in the library. Higher drive strength cells have more leakage and thus more power consumption, but they are faster transistors.


----------



## wolf (Dec 8, 2020)

The only sane reason to buy this over a 6800XT is availability and impatience. It's nice and all, gotta have a halo product, but the MSRP should be more like $750. No wonder they waited to release.

Nvidia had the luxury of releasing the 3080/3090 to no competition, and _yes_ even then the 3090 doesn't make much sense unless you really were going to use the extra VRAM. But at least it had more cores _and_ more VRAM.

Diehard fans with deep pockets will still buy this card, not a shred of doubt it will fly off shelves until we're all spoiled for choice from both camps with MSRP (or close to it) products.

The 'sane' high end belongs to the 6800XT and 3080 for now, if you could get either at a good price, they're clearly the ones to go for.

Fairly compelling either way, too. Pure rasterization, a (theoretically) lower price, extra VRAM, and a perf/watt advantage: 6800XT. Better RT performance, DLSS, GameStream, RTX Voice, CUDA, etc.: RTX 3080. Or, you know, whichever you could walk into a store and buy first.


----------



## Mussels (Dec 9, 2020)

There's no stock, but here in AU the 6900XT is listed at $1400 and the 3090s are going for $2800-$3000.
That sheer dollar value down under makes this a clear winner if you don't want ray tracing (or a 3080, if you do).


This is very close to Nvidia's 3090 performance, a lot cheaper, as well as cooler and quieter.
There's no clear winner this gen; what we have is even better... COMPETITION.

When stock levels increase, I bet prices will start to drop fast.


----------



## Minus Infinity (Dec 9, 2020)

So how does TechSpot show the 6900XT (sans SAM) being the fastest card of all at 1080p and 1440p, and only a bit slower than the 3090 at 4K, while this review shows it trailing at all resolutions?

This is a huge win for the 6900XT at the price. I guess we now get a 3080 Ti 20GB on TSMC 7nm very shortly as a panic response.


----------



## R0H1T (Dec 9, 2020)

Different setups, they're also probably using the 3950x *IIRC*.


----------



## MxPhenom 216 (Dec 9, 2020)

Minus Infinity said:


> So how does TechSpot show the 6900XT (sans SAM) being the fastest card of all at 1080p and 1440p, and only a bit slower than the 3090 at 4K, while this review shows it trailing at all resolutions?
> 
> This is a huge win for the 6900XT at the price. I guess we now get a 3080 Ti 20GB on TSMC 7nm very shortly as a panic response.



A panic response to something that isn't really widely available yet? And it's probably because TechSpot has fewer DX11 games in their review suite, plus different setups.


----------



## R0H1T (Dec 9, 2020)

Also, Steve's using at least the High quality preset at 1080p, coupled with a smaller suite (*18 games*) for *benchmarks*.


----------



## wolf (Dec 9, 2020)

Mussels said:


> There's no stock, but here in AU the 6900XT is listed at $1400 and the 3090s are going for $2800-$3000.
> That sheer dollar value down under makes this a clear winner if you don't want ray tracing (or a 3080, if you do).


Cheapest I can see is $1599, where have you seen 1400? And I can see 3090's for a hair over $2600, not that it really matters.

To my eyes the *clear* choices are the 6800XT or 3080. The price hike for a 6*9*00XT over a 6*8*00XT makes exceptionally little sense; it basically exists to sell people the 6800XT, which appears to be, and is, much better value.


Mussels said:


> There's no clear winner this gen; what we have is even better... COMPETITION.


I'll cheers to that!


----------



## lexluthermiester (Dec 9, 2020)

Mussels said:


> There's no clear winner this gen; what we have is even better... COMPETITION.


Yes, this exactly! And this is most excellent!

Once again late to the party in this thread, but I have to say it's refreshing to see AMD standing on more or less equal ground with Nvidia! The 6900XT trades blows with the 3090, and it's interesting to see the results. Nvidia still has the advantage in RTRT performance and VRAM (which will only really matter in 8K gaming and professional applications), but Radeon is standing side by side with the best GeForce and not flinching.

AMD, Welcome back to the premium GPU space! Well done indeed!


----------



## afw (Dec 9, 2020)

The pricing is the problem here, I guess... but what's the point? No stock from either Nvidia or AMD, and whatever is out there is sold at almost double the price... thank you, COVID.


----------



## r.h.p (Dec 9, 2020)

Way overpriced; I will be looking at the 6800 myself. 1440p at 144 Hz is my monitor's max.


----------



## DemonicRyzen666 (Dec 9, 2020)

*@ W1zzard*

How come there isn't a comparison in the ray tracing charts when overclocked?
I'd like to know if overclocking has any effect on RT performance at all.
There is only stock, with ray tracing enabled compared to disabled.

These cards are hard to compare because you can't really compare them to Nvidia's second generation of RTX cards. The rasterization is much higher than Nvidia's first-generation RTX cards; it's a tough thing to compare with anything, really.


----------



## R-T-B (Dec 9, 2020)

I don't know about you guys, but as someone who plays a lot of indie games that are and will be predominantly DX11 for some time... that DX11 overhead is a dealbreaker for me.

It is good to see AMD have a competitive product for most situations, though.


----------



## R0H1T (Dec 9, 2020)

DemonicRyzen666 said:


> These cards are hard to compare because you can't really compare them to Nvidia's second generation of RTX cards. The rasterization is much higher than Nvidia's first-generation RTX cards; *it's a tough thing to compare* with anything, really.


If you prefer RT Nvidia is a better choice, at least till RDNA3 cards debut. For traditional rasterization based games AMD is the way to go *IMO*.


----------



## MikeSnow (Dec 9, 2020)

From the article conclusion:



> Zen 3 is sold out everywhere



That's really not the case. Here in Romania, there is no shortage of the 5800X. I bought one yesterday at MSRP + 2%, and in 30 minutes it will be delivered, which is why I can't sleep right now. I could have also bought a 5900X at MSRP + 10%, but I wanted a CPU with only one CCD. I had much more trouble finding a good motherboard for it, since I'm switching from Intel. I didn't find the motherboard I wanted from my usual retailer, so I had to order one from a more obscure shop, which means it will only be delivered towards the end of the week, or even next week.

But the 5600X is indeed out of stock, and so is the 5950X. Anyway, if that statement were changed to "Zen 3 is sold out _almost_ everywhere", I wouldn't necessarily disagree. Maybe Romania is special. Although we seem to be affected by GPU shortages just as much as the rest of the world, so I suspect that even worldwide the Zen 3 shortages are not as bad as the GPU shortages.

Now back to the topic, as others, I'm a bit underwhelmed by the 6900 XT performance. I expected more. I'm only interested in 4K performance, and this is how it looks in the TPU 4K benchmarks, even with SAM and DDR4-3800:


```
in  6 games - 6900 XT is slower than the 3080 at 4K
in 13 games - 6900 XT is somewhere between the 3080 and 3090 at 4K
in  4 games - 6900 XT is faster than the 3090 at 4K
```

So, even with SAM and faster memory, it's closer to a 3080 than a 3090 at 4K. Taking raytracing into account, it's even worse. And I'm interested in doing some machine learning on my GPU, and the AMDs are not ideal for that, to say the least, for various reasons.

The only saving grace is the increased efficiency of the 6900 XT. I received a Gigabyte 3080 Vision OC as a birthday present from my colleagues this weekend, and the damn thing made me open my window to cool my room. In the winter. While idle at the desktop. I shudder thinking what it will be like in the summer if I can't find a solution. I'm still troubleshooting why it's so power hungry even when idle.

So, maybe there are some use cases where the 6900 XT makes sense over the NVidia cards after all. But personally, I think the 6900 XT is at least $200 more expensive than it should be.


----------



## R0H1T (Dec 9, 2020)

AMD will probably sell at least 10x as many Zen 3 chiplets (or 5x as many *chips*?) by the end of the year as Ampere & RDNA2 cards combined. The margins are much higher, yields much better & capacity isn't as constrained.


----------



## wolf (Dec 9, 2020)

MikeSnow said:


> I received a Gigabyte 3080 Vision OC as a birthday present from my colleagues this weekend, and the damn thing made me open my window to cool my room. In the winter. While idle at the desktop.


What on earth are you on about, at idle you'll be lucky if it draws 50w. I have a 3080, summer has just started in Australia and mine doesn't appreciably warm my room when gaming. 

Most of the rest of what you're saying makes some sense but I just can't get behind that quote, if the 3080 did that to you there's a high likelihood that in an apples to apples scenario so did the last card/rest of the system.


----------



## InVasMani (Dec 9, 2020)

Now I wonder what happens going forward with RDNA3.

I'd be keen to see if this 4-to-1 relationship gets revised to something like a 6-to-2 relationship. That might allow for more brute force, as well as alternate-frame, half-resolution temporal ray-traced acceleration effects.

I'm not sure what they'll do with Infinity Cache. I could see a minor bump in size, especially after a node shrink, or potentially on 7nm EUV as well. The other aspect is that it's split between two 64MB slabs, similar to CCXs, so I wonder if a monolithic 128MB slab with shared access is bound to happen eventually.

As for the CU scalars and schedulers, where is this going? I think maybe they'll increase the overall design granularity and scheduling relationship by another 50%. With that in mind, if they bump up the Infinity Cache by another 64MB and make it all shared, a 50% increase in this area makes a lot more sense.

I want to know more about Radeon Boost: how configurable is it? Can you pick a custom downscale resolution target to adhere to? It seems like it would work well in practice; I'm just curious how adjustable it is. There are definitely people who might prefer downscaling from 4K to 1440p rather than to 1080p, or even more custom targets in between, like granular LOD mipmap scaling, to tune how much image fidelity is traded for performance while in motion. I really like the idea a lot; I've just only seen that one slide on it, which isn't very detailed, unfortunately.

I really think W1zzard should consider a chart for 4K with Radeon Boost enabled and SAM on and off. The two features play into each other well, with Radeon Boost making SAM more useful at high resolutions: you get a 7% SAM advantage at 1080p and 2% at 4K, so with Radeon Boost plus SAM you should land somewhere in the 2% to 7% ballpark, maybe 5% on average, leaning towards 7% or 2% depending on how much scene activity there is; when it matters most, it should be closer to the 7% mark. If for no other reason, it would be interesting to see how Radeon Boost and SAM interact with the mixed rendering.
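To put rough numbers on that ballpark, you could interpolate the review's SAM uplift (7% at 1080p, 2% at 4K) by pixel count. Purely illustrative, not a measured figure:

```python
# Illustrative linear interpolation of SAM uplift by pixel count,
# anchored on the review's 1080p (+7%) and 4K (+2%) figures.
PIXELS_1080P = 1920 * 1080
PIXELS_4K = 3840 * 2160

def sam_uplift_estimate(width: int, height: int) -> float:
    """Guess the SAM % uplift for a resolution between the two anchors."""
    pixels = width * height
    t = (pixels - PIXELS_1080P) / (PIXELS_4K - PIXELS_1080P)
    return 7.0 + t * (2.0 - 7.0)

# e.g. 1440p, where Radeon Boost would be rendering when the camera moves:
print(round(sam_uplift_estimate(2560, 1440), 1))  # ~5.7
```

The real uplift is per-game and nowhere near linear, but it frames why Boost's dynamic resolution drop could pull SAM's benefit toward the 1080p end.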


----------



## MikeSnow (Dec 9, 2020)

wolf said:


> What on earth are you on about, at idle you'll be lucky if it draws 50w. I have a 3080, summer has just started in Australia and mine doesn't appreciably warm my room when gaming.
> 
> Most of the rest of what you're saying makes some sense but I just can't get behind that quote, if the 3080 did that to you there's a high likelihood that in an apples to apples scenario so did the last card/rest of the system.



That's why I said I'm troubleshooting the issue; I don't think this is normal, and I'm trying to determine if it's a problem with my system or the board. Apparently, the RAM remains at full speed even when idle, and it uses 21% of its power target when idle as a result. It's a 320W card, so 21% would be something like 70 W. Which it blows towards me constantly, if I keep the side of my case open. The fan is almost never idle.

My previous card was a 2080, and it never did this, on the same system.
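For what it's worth, the ~70 W figure follows directly from the power-target reading; a back-of-the-envelope check, using the board power and percentage reported above:

```python
# Back-of-the-envelope: idle draw implied by a power-target percentage.
BOARD_POWER_W = 320      # RTX 3080 total board power limit
IDLE_TARGET_PCT = 21     # idle power-target reading reported above

idle_watts = BOARD_POWER_W * IDLE_TARGET_PCT / 100
print(f"{idle_watts:.0f} W at idle")  # 67 W, well above a normal single-monitor idle
```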


----------



## InVasMani (Dec 9, 2020)

Has BAR size been benchmarked at different aperture settings for power consumption yet? I wonder what kind of impact it has; surely TDP goes up a bit, but maybe not too badly, and likely mostly in line with the GPU uplift in any case. Still, it's something to look at, and it makes me wonder whether that played any role in why, until now, the BAR size was set at 256MB and forgotten.


----------



## Ferrum Master (Dec 9, 2020)

W1zzard said:


> AMD doesn't use shunts, the power draw is estimated internally in the GPU afaik



Indeed. They have spoiled any fun of doing hard OC.

Hoping the AIB versions will have a decently limited bios.


----------



## InVasMani (Dec 9, 2020)

AMD should probably consider a special form of Radeon Boost that applies just to the RTRT elements, adjustable between 480p/720p/1080p for the time being and scaled upward in later revisions of RDNA2. It might not be a gigantic reduction in RTRT image quality relative to the performance gains, since most of the scene is still ultimately rasterized. If they could add that as a software option to RDNA 2, it would change the RTRT battlefield quite a bit, at least until Nvidia follows suit. Though, is there even a way for the end user to check what resolution the RTRT adheres to? I know you can adjust the quality, but does that specify the resolution, or simply the quality, which could be determined by several factors like the number of rays and bounces?


----------



## MikeSnow (Dec 9, 2020)

wolf said:


> What on earth are you on about, at idle you'll be lucky if it draws 50w.



I think I got to the bottom of it. Using multiple monitors triggers the problem with my Gigabyte RTX 3080 Vision OC.  I have 2 or 3 displays connected at all times: a 4K @ 60 Hz monitor over DisplayPort, a 3440x1440 monitor at 100Hz, also over DisplayPort, and a 4K TV at 60Hz HDR, over HDMI, which I usually keep turned off.

After closing all applications, it still refused to reduce GPU memory speed. But I noticed when Windows turns off my displays the GPU memory frequency and power usage finally goes down. So, I disconnected my 4K monitor. The power usage went down to 7%, and the memory frequency dropped to 51MHz from 1188MHz. I turned on the 4K TV instead, the power usage and memory frequency remained low. I turned off the 4K TV again and reconnected the 4K monitor. The power usage and memory frequency went up again. I disconnected the 3440x1440 display, the frequency and power usage dropped. I turned on the 4K TV, the power usage and memory frequency remained low.

So, in short, if I connect both my monitors, over DisplayPort, the memory frequency never goes down. As a final experiment, I connected the 3440x1440 display over HDMI, at 50Hz. There were some oscillations, depending on which apps were open, but the GPU power usage and memory frequency remained low, for the most part.

So, I'm guessing it really doesn't like having multiple monitors at high refresh rates and resolutions connected, especially over DisplayPort. This is how the power and frequency usage looked while I was disconnecting/connecting various monitors:





The thing is, I looked at all the 3080 TPU reviews, and none of them mentioned the GPU memory frequency being higher when idle and using multiple monitors, unless I missed something.

@W1zzard have you seen anything like this on any of the 3080s in your tests: GPU memory frequency never going down while using multiple monitors? You have a table with clock profiles in each GPU review, and for all your 3080 reviews you listed the multi-monitor GPU memory frequency as 51MHz. How exactly did you test that? How many monitors, at which resolutions/refresh rates, and how were they connected? DisplayPort, or HDMI? If there were just a couple of monitors at low resolutions, then that might explain the difference from my experience with the Gigabyte RTX 3080 Vision OC.
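For anyone who wants to reproduce this, a quick way to watch idle behavior is to poll `nvidia-smi --query-gpu=clocks.mem,power.draw --format=csv` while plugging displays in and out. A minimal sketch of parsing that CSV output (the sample values below are illustrative, not my measurements):

```python
import csv
import io

def parse_nvidia_smi_csv(output):
    """Parse the CSV produced by:
       nvidia-smi --query-gpu=clocks.mem,power.draw --format=csv
    Returns a list of (memory_clock_mhz, power_draw_w) tuples."""
    rows = list(csv.reader(io.StringIO(output.strip())))
    results = []
    for row in rows[1:]:  # skip the header row
        mem = float(row[0].strip().split()[0])    # e.g. "1188 MHz" -> 1188.0
        power = float(row[1].strip().split()[0])  # e.g. "98.53 W"  -> 98.53
        results.append((mem, power))
    return results

sample = """clocks.mem [MHz], power.draw [W]
1188 MHz, 98.53 W"""
print(parse_nvidia_smi_csv(sample))
```

Run it in a loop while connecting/disconnecting monitors and you can log exactly when the memory clock refuses to drop.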


----------



## W1zzard (Dec 9, 2020)

AnarchoPrimitiv said:


> In negatives it lists:
> 
> "Overclocking requires power limit increase"
> 
> Is there a piece of computer hardware that when overclock Ed DOESN'T require a power limit increase?


All the custom design RX 6800 XT cards overclock just fine without power limit increase. "power limit increase" = you must increase the power limit slider in radeon settings or OC will not do anything.

Obviously overclocking always increases power consumption, that's not what I meant



DemonicRyzen666 said:


> How come there isn't a comparison in the ray tracing charts when overclocked?


RT is simply not important enough at this time. I test SO many things, reviews need to be finished in a reasonable timeframe, so I have to make compromises.



MikeSnow said:


> That's really not the case. Here in Romania, there is no shortage of 5800X


Congrats on your new processor. The supply situation is definitely not back to normal, i.e. a state where anyone can get any CPU at reasonable prices



InVasMani said:


> Has BAR size been benchmarks at different aperture size settings for power consumption yet


You can't adjust the BAR size; the size == VRAM size, that's the whole point of mapping all GPU memory into the CPU address space. Obviously it does not "use" the whole VRAM. I also suspect some secret sauce here, i.e. per-game optimizations in how data is transferred; AMD hinted at that in the press briefings



MikeSnow said:


> @W1zzard have you seen anything like this on any of the 3080s in your tests: GPU memory frequency never going down while using multiple monitors? You have a table with clock profiles in each GPU review, and for all your 3080 reviews you listed the multi-monitor GPU memory frequency as 51MHz. How exactly did you test that? How many monitors, at which resolutions/refresh rates, and how were they connected? DisplayPort, or HDMI? If there were just a couple of monitors at low resolutions, then that might explain the difference from my experience with the Gigabyte RTX 3080 Vision OC.


It's detailed on the power page in the expandable spoiler. Two monitors: 1920x1080 and 1280x1024 (intentionally mismatched), one DVI and one HDMI (also intentionally mismatched).

I think you are seeing increased clocks due to the refresh rate? Try going to 75 Hz or even 60 Hz.

Would love to hear more about this, could be good input so I can adjust my testing. In a separate thread, please


----------



## ratirt (Dec 9, 2020)

Not a bad result for the 6900xt considering the price difference with 3090. But yet again, considering the price of the 6900xt that is definitely not my card.


----------



## Mussels (Dec 9, 2020)

W1zzard said:


> All the custom design RX 6800 XT cards overclock just fine without power limit increase. "power limit increase" = you must increase the power limit slider in radeon settings or OC will not do anything.
> 
> Obviously overclocking always increases power consumption, that's not what I meant
> 
> ...



Power consumption tests may need higher-bandwidth monitors to be relevant. I've done some quick testing here, and there does seem to be a threshold at which the GPUs ramp up their multi-monitor consumption... gah, it'd be a shitty expense to add a high refresh display (or two) to a benchmarking system


----------



## W1zzard (Dec 9, 2020)

Mussels said:


> gah it'd be a shitty expense to add a high refresh display (or two) to a benchmarking system


indeed


----------



## Mussels (Dec 9, 2020)

wait, you can get those fancy little dongles for fake monitors - they'd be perfect for simulating extra screens without actually needing them
(random amazon image for example)


----------



## thepath (Dec 9, 2020)

It is barely faster than the RTX 3080 at 4K but costs $400 more.

Not worth buying over the RTX 3080, especially when it lacks DLSS and has poor RT performance.

Both the RTX 3090 and RX 6900 are bad value for gamers.


Also, Nvidia will get something similar to SAM in the future. It has been confirmed


----------



## Hawkster222 (Dec 9, 2020)

In my country I saw that the RTX 3080 is more expensive than the 6900XT.


----------



## BorgOvermind (Dec 9, 2020)

Now all they need is to name the cards with an RXT prefix.


----------



## lexluthermiester (Dec 9, 2020)

thepath said:


> It is barely slower than *RTX 3090* at 4K but cost *500 dollars less*


Fixed that for you.

You need to look at those numbers a little closer.



thepath said:


> Both RTX 3090 and RX6900 are bad value for gamers


That is an opinion not everyone will agree with.


----------



## MikeSnow (Dec 9, 2020)

lexluthermiester said:


> You need to look at those numbers a little closer.



Are you sure?


----------



## InVasMani (Dec 9, 2020)

lexluthermiester said:


> Fixed that for you.
> 
> You need to look at those numbers a little closer.
> 
> ...


Indeed, they can't be too bad a value if they're sold out, for starters. That said, quantities are low, but plenty of people are complaining about that at the same time, so the demand is there.



MikeSnow said:


> Are you sure?
> 
> View attachment 178842


Alright, but does that take into account "Radeon Boost" being actively enabled? SAM works better at reduced resolutions as well, so "Radeon Boost" pairs very well with it. That is a more significant perk than DLSS, I'd argue, since it isn't some cherry-picked, developer-enabled feature in a handful of AAA games. If I'm not mistaken, "Radeon Boost" will work across all games, which is a significant difference between the two. I'd say regardless it's priced fairly appropriately in line with performance, from the looks of things, given it's got higher average frame rates than the RTX 3080.

Sure, you can argue the RTX 3080 is stronger at RTRT, but average frame rates aren't RTRT in the first place; given how game development stands right now, RTRT titles are more like the 0.1% case. If you took the entire Steam catalog, the RX 6900 XT should end up ahead, assuming the performance averages more or less hold across more titles. The fact is RTRT won't skew results much in the big picture right now, because there are so few titles with it at this point in time, and it will take years for that to even begin to change substantially.


----------



## Ravenas (Dec 9, 2020)

Thank you for the review W1zzard. Great graphics card at the price point versus the 3090. Though I still think 4K hardware is not *quite* there.

I don't agree with the recent review downgrades caused by "Not for the average gamer." I would rather you review the hardware itself, not what percentage of the market will want it. I wish we could do away with this in future reviews.


----------



## lexluthermiester (Dec 9, 2020)

MikeSnow said:


> Are you sure?
> 
> View attachment 178842


Are you blind or do you not understand your own example?


----------



## Super XP (Dec 9, 2020)

MikeSnow said:


> Are you sure?
> 
> View attachment 178842


Something is wrong with the RTX 3090. It's not meant for PC gaming, seeing how badly it performs there relative to the price Nvidia is asking. Had it not been released, AMD's RX 6900XT would have been the fastest GPU on the planet, at least for a short period until Nvidia musters up the RTX 3080Ti. So now we all know why Nvidia needed to launch an overpriced, power-sucking server GPU called the RTX 3090: to keep AMD's RDNA2 from claiming the fastest-GPU label, at least for a short while.


----------



## MikeSnow (Dec 9, 2020)

lexluthermiester said:


> Are you blind or do you not understand your own example?



Your initial modified quote said it's "barely *faster* than RTX 3090 ", instead of the 3080 from the actual quote. Which was factually wrong, and my post demonstrated that.

Then apparently you realized your mistake, probably after this reply to me, and edited the quote again in your old post, to say it is "barely *slower* than RTX 3090". Which is debatable, but OK, at least it's not completely wrong.

The point is the 6900 XT is closer to a 3080 than a 3090 at 4K, so the claim of the original poster of that quote, that it is barely *faster* than the *3080* at 4K, is still more appropriate than your second edit to it, in my opinion.



InVasMani said:


> Alright, but does that take into account "Radeon Boost" being actively enabled? SAM works better at reduced resolutions as well, so "Radeon Boost" pairs very well with it. That is a more significant perk than DLSS, I'd argue, since it isn't some cherry-picked, developer-enabled feature in a handful of AAA games. If I'm not mistaken, "Radeon Boost" will work across all games, which is a significant difference between the two. I'd say regardless it's priced fairly appropriately in line with performance, from the looks of things, given it's got higher average frame rates than the RTX 3080.



Honestly, at the moment I don't care about either DLSS or Radeon Boost. I don't intend to use DLSS, but I may change my mind in the future. And Radeon Boost just reduces the actual rendering resolution when you make fast movements, if I understand correctly. It's a nice feature, but in a title like Microsoft Flight Simulator, which is what I've played most recently, you rarely make fast movements. It depends on the game and your play style, so it might not help much, if at all, even if in theory it could work with any game.
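To illustrate what I mean, Radeon Boost amounts to roughly this kind of heuristic as I understand it (a toy sketch with made-up numbers, not AMD's actual algorithm):

```python
def boost_render_scale(motion_speed, threshold=300.0, min_scale=0.5):
    """Toy model of a Radeon Boost-style heuristic: when camera/mouse
    motion is fast, the render resolution is scaled down, since detail
    loss is less visible during rapid movement. All constants here are
    invented for illustration."""
    if motion_speed <= threshold:
        return 1.0  # full resolution while the view is stable
    # scale down progressively with speed, clamped at min_scale
    return max(min_scale, threshold / motion_speed)

print(boost_render_scale(100))  # slow motion: full resolution
print(boost_render_scale(600))  # fast motion: clamped minimum scale
```

In a slow-paced sim the speed rarely crosses the threshold, so the scale stays at 1.0 and you get no benefit, which is exactly my point about play style.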


----------



## lexluthermiester (Dec 9, 2020)

Super XP said:


> So now we all know why Nvidia needed to launch an overpriced, power-sucking server GPU called the RTX 3090: to keep AMD's RDNA2 from claiming the fastest-GPU label, at least for a short while.


Wrong. NVidia launched the 3090 as a premium product, and it shines in that capacity. It is currently the only card shown so far to do 8K gaming. The 6900XT is likely able to do it as well, but no one has shown those results yet.



MikeSnow said:


> Then apparently you realized your mistake, probably after this reply to me, and edited the quote again in your old post, to say it is "barely *slower* than RTX 3090".


Yup, true. I did see my error and corrected it. Still doesn't matter. Their original statement was blatantly and deliberately incorrect and that is what I was pointing out. Perhaps you failed to understand that context.


MikeSnow said:


> Which is debatable, but OK, at least it's not completely wrong.


Not debatable. The data is clearly displayed, and while the average shows the 6900XT leaning slightly toward the 3080, there are many instances where the 6900XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, the cards are within 2% or 3% of each other.

Yes, NVidia technically has the performance crown with the 3090, but only by the slimmest of margins, and *only* through collating averages over a limited number of gaming titles. As I said earlier, the 3090's advantages are its RTRT performance and extra 8GB of VRAM. Otherwise AMD has matched NVidia this round of GPUs. 3090-like performance for 2/3 the price? 6900XT. 3080-like performance for $110 less? 6800XT.

Trying to minimize AMD's progress and offerings the way many users have been doing in this thread is the same kind of narrow-minded nonsense people were spewing about RTRT at the release of the RTX 2000 series cards a few years ago. It's pathetic.


----------



## anachron (Dec 9, 2020)

MikeSnow said:


> I think I got to the bottom of it. Using multiple monitors triggers the problem with my Gigabyte RTX 3080 Vision OC.  I have 2 or 3 displays connected at all times: a 4K @ 60 Hz monitor over DisplayPort, a 3440x1440 monitor at 100Hz, also over DisplayPort, and a 4K TV at 60Hz HDR, over HDMI, which I usually keep turned off.
> 
> After closing all applications, it still refused to reduce GPU memory speed. But I noticed when Windows turns off my displays the GPU memory frequency and power usage finally goes down. So, I disconnected my 4K monitor. The power usage went down to 7%, and the memory frequency dropped to 51MHz from 1188MHz. I turned on the 4K TV instead, the power usage and memory frequency remained low. I turned off the 4K TV again and reconnected the 4K monitor. The power usage and memory frequency went up again. I disconnected the 3440x1440 display, the frequency and power usage dropped. I turned on the 4K TV, the power usage and memory frequency remained low.
> 
> ...



It's a bit off topic, but anyway: I had this issue for a few months with my 2070 Super (1440p 144Hz monitor on DP + 1080p 60Hz monitor on HDMI), but it solved itself at some point. It seems to be a quite common issue with Nvidia cards. If a clean driver reinstall doesn't solve it, you can use the Multi Display Power Saver module from Nvidia Inspector to force reduced frequencies when idling.


----------



## MikeSnow (Dec 9, 2020)

anachron said:


> It's a bit off topic, but anyway: I had this issue for a few months with my 2070 Super (1440p 144Hz monitor on DP + 1080p 60Hz monitor on HDMI), but it solved itself at some point. It seems to be a quite common issue with Nvidia cards. If a clean driver reinstall doesn't solve it, you can use the Multi Display Power Saver module from Nvidia Inspector to force reduced frequencies when idling.



Thanks for trying to help, but I already found the problem: it was having HDR enabled on one of the 4K displays at 60Hz. If I disable it, the power draw comes down significantly. My old card didn't support 4K HDR at 60Hz with the HDMI-to-DisplayPort adapter I was using, which is probably why I didn't have this problem before. So, in a way, it's not a bug, it's a feature

Still, according to the TPU reviews the new Radeons are much more efficient in multi-monitor setups, so it's something to consider if you don't use your PC just for gaming.

Anyway, as @W1zzard requested, I added more details about my troubleshooting in a dedicated thread:









RTX 3080 high power usage with high resolution / high refresh multi-monitor setups
www.techpowerup.com


----------



## Mussels (Dec 10, 2020)

MikeSnow said:


> Thanks for trying to help, but I already found the problem: it was having HDR enabled on one of the 4K displays at 60Hz. If I disable it, the power draw comes down significantly. My old card didn't support 4K HDR at 60Hz with the HDMI-to-DisplayPort adapter I was using, which is probably why I didn't have this problem before. So, in a way, it's not a bug, it's a feature
> 
> Still, according to the TPU reviews the new Radeons are much more efficient in multi-monitor setups, so it's something to consider if you don't use your PC just for gaming.
> 
> ...




Aha! Fantastic catch. I'd just been fiddling with HDR on my new screen as well and could have made the same mistake (honestly, HDR looks so bad on monitors)


----------



## MxPhenom 216 (Dec 10, 2020)

lexluthermiester said:


> Wrong. NVidia launched the 3090 as a premium product and it shines in that capacity. It is currently the only card shown so far to do 8k gaming. The 6900XT is likely to be able to do it as well, but no one has shown those results yet.
> 
> 
> Yup, true. I did see my error and corrected it. Still doesn't matter. Their original statement was blatantly and deliberately incorrect and that is what I was pointing out. Perhaps you failed to understand that context.
> ...



A 6800XT is $110 less than a 3080? News to me...


----------



## lexluthermiester (Dec 10, 2020)

MxPhenom 216 said:


> A 6800XT is $110 less than a 3080? News to me...


It's as easy as looking up the prices:

rx 6800 xt | Newegg.com (www.newegg.com)
geforce 3080 | Newegg.com (www.newegg.com)

For example:

ASUS TUF Gaming NVIDIA GeForce RTX 3080 (TUF-RTX3080-10G-GAMING): $699
SAPPHIRE Radeon RX 6800 XT 16GB (21304-01-20G): $649
SAPPHIRE Radeon RX 6800 16GB (21305-01-20G): $589

Unless my math is off, that's a $50 difference in favor of the 6900XT. The $110 difference is for the 6800. Seems I looked at a 6800 when I looked up prices earlier. Even still, AMD has the value add.


----------



## LFaWolf (Dec 10, 2020)

lexluthermiester said:


> It's as easy as looking up the the prices.
> 
> 
> 
> ...



That is a 6800XT, not a 6900XT. MSRP of the 6900XT is $999


----------



## lexluthermiester (Dec 10, 2020)

LFaWolf said:


> That is an 6800xt, not 6900xt. Msrp of 6900xt is $999


You need to go re-read. You're missing context.


----------



## anachron (Dec 10, 2020)

The price differences may depend on the country, but here in France there is almost no price difference between a 6800XT (from ~770€) and an RTX 3080 (starting at ~800€). The 6900XT is listed starting at 1250€, which seems quite high given the difference in performance versus a 3080, especially if you don't have a Ryzen 5000.


----------



## MikeSnow (Dec 10, 2020)

lexluthermiester said:


> Not debatable. The data is clearly displayed and while the average shows the 6900XT leaning slightly toward the 3080, there are many instances where the performance of 6900XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, each card is within 2% or 3% of each other.



OK, it's not debatable. I'm just going to post numbers then, without debating. This is how the 6900XT looks at 4K compared to the 3090, even if you give it every possible advantage, including enabling SAM:

- 17% slower in Jedi: Fallen Order
- 15% slower in Control
- 15% slower in Anno 1800
- 15% slower in The Witcher 3
- 14% slower in Civilization VI
- 14% slower in Metro Exodus
- 12% slower in Devil May Cry 5
- 8% slower in Divinity Original Sin II
- 8% slower in Borderlands 3
- 7% slower in DOOM Eternal
- 6% slower in Red Dead Redemption 2
- 4% slower in F1 2020
- 4% slower in Gears 5
- 3% slower in Assassin's Creed Odyssey
- 3% slower in Death Stranding
- 3% slower in Sekiro: Shadows Die Twice
- 3% slower in Shadow of the Tomb Raider
- 2% slower in Project Cars 3
- 1% slower in Strange Brigade

- 3% faster in Far Cry 5
- 6% faster in Battlefield V
- 6% faster in Detroit Become Human
- 8% faster in Hitman 2
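For what it's worth, averaging those per-game ratios (a rough geometric mean, a back-of-the-envelope check rather than TPU's exact methodology) puts the 6900 XT around 6% behind the 3090 overall at 4K:

```python
import math

# Per-game deltas of the 6900 XT vs. the 3090 at 4K, taken from the list
# above (negative = slower, positive = faster, in percent)
deltas = [-17, -15, -15, -15, -14, -14, -12, -8, -8, -7, -6, -4, -4,
          -3, -3, -3, -3, -2, -1, 3, 6, 6, 8]

# Geometric mean of the per-game performance ratios
geo_mean = math.prod(1 + d / 100 for d in deltas) ** (1 / len(deltas))
print(f"{(geo_mean - 1) * 100:.1f}%")  # comes out close to -6%
```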

Look, don't get me wrong, the 6900XT is a nice card, and I have no problem buying AMD products when they are better than the competing ones and the price makes sense. For example I have a 5800X in a box on my desk right now, and I'm waiting for the motherboard to be delivered. 

AMD has to be congratulated for closing the gap to NVidia, and I can't wait to see the next generation of AMD GPUs. All I'm saying is that at this price point the 6900XT is not something that makes me go "wow". Neither is the 3090, considering its huge price. I'm not saying not to buy them. If you need that additional performance and are willing to pay the price, go for it. I don't.


----------



## Mussels (Dec 10, 2020)

Now do the same comparison for 1440p - not everyone is focused on 4k


----------



## MikeSnow (Dec 10, 2020)

OK, but the question is, do you really need either of the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.


----------



## medi01 (Dec 10, 2020)

Great to see Zen3 + SAM combo tested.



cueman said:


> even rx 6900 xt with SAM on and picked mem can't beat nvidia 3090 Founders Edition model.



It depends on which games you test.
Pick the newest, hottest titles and, uh oh, doh.





The games in question:







MikeSnow said:


> OK, but the question is, do you really need either of the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.


There are a number of people with decent 1440p monitors looking for high framerates.



MikeSnow said:


> This is how the 6900XT looks at 4K


Most games (understandably) are old crap.
And this is likely why AMD has better results at 1440p and below: you get into CPU limited scenarios with that old crap like Civ4.


----------



## lexluthermiester (Dec 10, 2020)

MikeSnow said:


> All I'm saying is that at this price point the 6900XT is not something that makes me go "wow". Neither is the 3090, considering its huge price. I'm not saying not to buy them.


You're forgetting the prices of Vega, the Radeon VII, and the RTX 2000 series cards. The 3090 is effectively the RTX Titan replacement, offering approximately 50% greater performance at $1,000 less. The RX 5000 and RX 6000 series are likewise less expensive than previous-gen GPUs and offer amazing performance jumps. Maybe I find everything exciting and amazing because I don't have a short memory and can keep perspective and context clearly in view. That wasn't a jab at you personally; it just seems a lot of people are forgetting the recent past.

The reality is this: GPU offerings from both companies are exceptional this generation and a serious value compared to past generations of GPUs. Logic is lost on anyone who does not see and understand that context.


----------



## InVasMani (Dec 10, 2020)

Mussels said:


> Now do the same comparison for 1440p - not everyone is focused on 4k


AMD just needs to make a single-card dual-GPU solution with two RX 6900 XTs, problem solved: $500 more expensive and, best case, 17% more performance than an RTX 3090... no need to worry about TDP, noise, or heat output, since those figures apparently aren't considerations for Nvidia RTX 30-series users at that end of the spectrum.



MikeSnow said:


> OK, but the question is, do you really need either of the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.


Alright, but 4K isn't a minority? DLSS/RTRT games aren't a minority compared to the number of games without those features!? People who can just burn $500 for, best case, 17% more performance and overlook heat, noise, and power usage aren't minorities!? Who are you really, Tom Cruise!?


----------



## medi01 (Dec 10, 2020)

InVasMani said:


> AMD just needs to make single card dual GPU solution








AMD Radeon RX 6800 XT in mGPU: 2 x Big Navi GPUs = Insane Performance
www.tweaktown.com


----------



## MikeSnow (Dec 10, 2020)

InVasMani said:


> Alright, but 4K isn't a minority? DLSS/RTRT games aren't a minority in contrast to the amount of games w/o those features!? People that can just burn $500's for best case 17% more performance and overlook heat, noise, and power usage aren't minorities!? Who are you really Tom Cruise!?



Yes, 4K is a minority. But my guess is 1440p high refresh rate is a minority within a minority, and might be considerably smaller than the 4K minority.

Personally I'm perfectly happy at around 90 FPS, and apparently I'm a minority, as most people seem to be happy even at 60 FPS or less. Of course, this depends on the game and the play style, these numbers are not set in stone. And of course there are minorities for which even 90 FPS is not enough.

Now, if I take my 90 FPS as the target, which as I said I consider to be quite high, and we look at the average FPS charts, we see that even the 6800XT and 3080 give you over 90 FPS at 4K on average. So for me to benefit from the 6900 XT or 3090 I would need to play at a resolution higher than 4K at 90Hz.

If we go to 1440p, even a RTX 2070 Super or a Radeon VII is enough to give you over 90 FPS, on average. That's why the performance of the 3090 and 6900XT at 1440p doesn't seem very important to me. I believe the percentage of people that want over 150 FPS on average at 1440p is extremely small, even compared to the 4K minority.


----------



## LFaWolf (Dec 10, 2020)

lexluthermiester said:


> It's as easy as looking up the prices.
> 
> 
> 
> ...



First, you misstated that the 6800XT is $110 cheaper than the 3080. Then, you stated it is now a $50 difference in favor of the 6900XT (I bolded it above for you). What context am I missing?



LFaWolf said:


> That is an 6800xt, not 6900xt. Msrp of 6900xt is $999





lexluthermiester said:


> You need to go re-read. You're missing context.


----------



## LFaWolf (Dec 10, 2020)

lexluthermiester said:


> Yup, you caught me, I made a mistake. But you know, if you look closely, I also corrected myself. You want to stay up on your high horse?



You are really acting like an immature child. I was merely stating the facts because what you wrote just didn’t make sense, but now you are getting all defensive about it.


----------



## LFaWolf (Dec 10, 2020)

lexluthermiester said:


> No, you were ignoring information that was clearly displayed, attempting to belittle in the process. If you don't want responses like the one above, don't be condescending. Let it go.



Read all my responses to this thread - when did I become condescending at all? Find one.


----------



## LFaWolf (Dec 10, 2020)

lexluthermiester said:


> This came after I had already corrected myself. You were saying?
> 
> Yup.



No, that came after you told me to go read the context, because clearly you didn't see that you misstated the 6900XT for the 6800XT again. And nothing in that is condescending.


----------



## Super XP (Dec 11, 2020)

Beertintedgoggles said:


> Then why bring up the 3090 if you are supposedly now comparing it to a 3080? Stupid observation is still stupid. Now if you want to compare the 3080 and 6800XT (or even the 6800), neither of which are halo products that push performance at the cost of efficiency and sensible pricing, you'll see that the Nvidia and AMD cards are essentially identical in performance/$. Comparing the two halo products (3090 and 6900XT), the charts clearly show that although AMD's performance isn't as good as Nvidia's, AMD's cost-to-performance ratio is much better (again, regarding halo products). From there you can bring up the performance numbers disregarding anything else, and yeah, Nvidia seems to come out on top, but you weren't doing that... You only mentioned performance/$ on a halo product from AMD while completely putting on your blinders about how poor that metric is for Nvidia's halo product


That's interesting; the 3090 is a data centre GPU masquerading as a gaming GPU, hence the higher-than-high price tag. The 6900XT, just like the 6800XT and the 6800, is a gaming GPU. I think AMD messed up the pricing of the 6900XT though; it's far too expensive for a slight boost in performance over its uncle, the 6800XT, and the kid brother, the 6800. Can't wait to see the two toddlers in action though, the 6700 & 6700XT.


----------



## Maximuspop (Dec 11, 2020)

Meh, RT sucks; the 3070 is killing it.


----------



## 95Viper (Dec 18, 2020)

Stay on topic.
Stop with your arguing/bickering back and forth... take that to PMs.


----------



## HD64G (Jan 2, 2021)

medi01 said:


> AMD Radeon RX 6800 XT in mGPU: 2 x Big Navi GPUs = Insane Performance
> 
> 
> What's better than one of AMD's new Big Navi-powered Radeon RX 6800 XT graphics cards? Two of them in mGPU mode and benchmarked.
> ...


So, *true* 8K is here already for anyone who gets 2x 6800s.


----------



## lexluthermiester (Jan 3, 2021)

HD64G said:


> So,* true* 8K is here already for anyone who gets 2x 6800s.


Are they supporting CrossFire?


----------



## HD64G (Jan 3, 2021)

lexluthermiester said:


> Are they supporting CrossFire?


CF is an obsolete marketing name and nothing else now. The post's link above has great results for the games tested that supported multi-GPU.


----------



## Super XP (Jan 3, 2021)

HD64G said:


> CF is an obsolete marketing name and nothing else now. The post's link above has great results for the games tested that supported multi-GPU.


Isn't any multi-GPU option from either company a marketing name? I've read that Nvidia does a little better with SLI support, but whichever option you choose, the extra performance, despite its inconsistency, isn't worth the money of requiring 2 GPUs.
By the way, Καλά Χριστούγεννα και Πρωτοχρονιά (Merry Christmas and Happy New Year)


----------



## HD64G (Jan 3, 2021)

Super XP said:


> Isn't any multi GPU options from either company a marketing name? I've read that Nvidia does a little better with SLI support, but either option you choose, the extra performance, despite its inconsistency, isn't worth the money for requiring 2 GPUs.
> By the way, Καλά Χριστούγεννα και Πρωτοχρονιά


Agreed! It is just that on big resolutions twin GPUs work well in general.

Να είσαι καλά φίλε. Ανταποδίδω τις ευχές σου!

(Translation from greek for the other members) Be well friend. Same wishes to you also!


----------



## Super XP (Jan 3, 2021)

HD64G said:


> Agreed! It is just that on big resolutions twin GPUs work well in general.
> 
> Να είσαι καλά φίλε. Ανταποδίδω τις ευχές σου!
> 
> (Translation from greek for the other members) Be well friend. Same wishes to you also!


Thank You, 
Ευχαριστώ,


----------



## lexluthermiester (Jan 4, 2021)

Let's keep it in English gentlemen.


----------



## evolucion8 (Jan 14, 2021)

Can't complain about the massive gains going from my Radeon VII to the 6900XT in Horizon Zero Dawn at 4K, it's just ridiculous lol


----------

