# AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X



## btarunr (Nov 2, 2020)

AMD sent ripples through the market at its late-October event launching the Radeon RX 6000 series RDNA2 "Big Navi" graphics cards, when it claimed that the top RX 6000 series parts compete with the very fastest GeForce "Ampere" RTX 30-series graphics cards, marking the company's return to the high-end graphics market. In its announcement press-deck, AMD had shown the $579 RX 6800 beating the RTX 2080 Ti (essentially the RTX 3070), the $649 RX 6800 XT trading blows with the $699 RTX 3080, and the top $999 RX 6900 XT performing in the same league as the $1,499 RTX 3090. Over the weekend, the company released even more benchmarks, with the RX 6000 series GPUs and their competition from NVIDIA tested by AMD on a platform powered by the Ryzen 9 5900X "Zen 3" 12-core processor.

AMD released its benchmark numbers as interactive bar graphs on its website. You can select from ten real-world games, two resolutions (1440p and 4K UHD), and even game settings presets and the 3D API for certain tests. Among the games are Battlefield V, Call of Duty Modern Warfare (2019), Tom Clancy's The Division 2, Borderlands 3, DOOM Eternal, Forza Horizon 4, Gears 5, Resident Evil 3, Shadow of the Tomb Raider, and Wolfenstein Youngblood. In several of these tests, the RX 6800 XT and RX 6900 XT are shown taking the fight to NVIDIA's high-end RTX 3080 and RTX 3090, while the RX 6800 is shown to be significantly faster than the RTX 2080 Ti (roughly RTX 3070 scores). The Ryzen 9 5900X itself is claimed to be a faster gaming processor than Intel's Core i9-10900K, and features a PCI-Express 4.0 interface for these next-gen GPUs. Find more results and the interactive graphs in the source link below.


*View at TechPowerUp Main Site*


----------



## nguyen (Nov 2, 2020)

How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR, then? That would make it easier to gauge RX 6000 RT capability.


----------



## NeuralNexus (Nov 2, 2020)

nguyen said:


> How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR, then? That would make it easier to gauge RX 6000 RT capability.



NO ONE REALLY CARES ABOUT RAYTRACING...All future games will be optimized for AMD's raytracing solution anyway. The games being developed for this generation will be built specifically around the Zen 2 and RDNA 2 architectures as the base specs.


----------



## medi01 (Nov 2, 2020)

NeuralNexus said:


> NO ONE REALLY CARES ABOUT RAYTRACING...All future games will be optimized for AMD's raytracing solution anyway.


Which doesn't seem to need brute-force path tracing anyway.

Besides, I wouldn't be surprised if the Zen-derived "Infinity Cache" inside RDNA2 lets it spank Ampere even on that (rather useless, for now and in the foreseeable future) front.


----------



## 1d10t (Nov 2, 2020)

nguyen said:


> How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR, then? That would make it easier to gauge RX 6000 RT capability.



It's funny because last year people ignored RTRT and referred to it as "unnecessary", but now it's a major selling point along with DLSS.


----------



## ratirt (Nov 2, 2020)

The 6000 series AMD cards are prepared for ray tracing: each CU has ray tracing capability, so I'm not worried. Besides, I'm more of a 4K gaming kinda guy, and ray tracing will have to wait a bit till the games I play support it, which isn't happening this year anyway. To be fair, the 6800 XT's performance is awesome. I'm leaning more towards this card as my upgrade path.


----------



## Chomiq (Nov 2, 2020)

nguyen said:


> How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR, then? That would make it easier to gauge RX 6000 RT capability.


My guess is this comes down to games being written for RTX, and it's a mess when trying to switch over to pure DXR. AMD would not benefit in this scenario, so they're simply not showing it.


----------



## Searing (Nov 2, 2020)

nguyen said:


> How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR, then? That would make it easier to gauge RX 6000 RT capability.



I don't know a single gamer personally that cares about RT capability. I'm excited to play the next Battlefield 6 at 144fps, I don't want RT dragging me down. HAVE YOU SEEN THE RAY TRACING BENCHMARKS FOR WATCH DOGS? No thanks.


----------



## protain (Nov 2, 2020)

What I'd like to see are the same benchmarks on an Intel CPU, like the Core i9-10900K, to remove any of the new features that benefit from running on a 5000 series Ryzen CPU... as someone running an Intel CPU, it would be nice to see. I'm in no way an NVIDIA fanboi (long-time user of ATI/AMD cards in the past); I just want to get the best card for my system once silly season is over and availability is at a normal level for both.


----------



## nguyen (Nov 2, 2020)

Searing said:


> I don't know a single gamer personally that cares about RT capability. I'm excited to play the next Battlefield 6 at 144fps, I don't want RT dragging me down. HAVE YOU SEEN THE RAY TRACING BENCHMARKS FOR WATCH DOGS? No thanks.



I'm playing Watch Dogs Legion right now; the FPS is already low without RT anyway. Enabling RT reflections really makes WD Legion next-gen graphically. 60 FPS is already enough for a single-player game; idk why you need 144 fps for a single-player game. For online multiplayer, then sure, 200 fps all the way.


----------



## Ferrum Master (Nov 2, 2020)

What's with the low RAM speed in the test setup?


----------



## Vya Domus (Nov 2, 2020)

nguyen said:


> That would make it easier to gauge RX 6000 RT capability.



Poor, the RT capability is poor just as it is with any other Nvidia GPU. RT is not yet a truly usable technology across the board no matter how hard Nvidia fanboys, or Nvidia themselves, try to make it seem like it is. The performance is still crap, most games run well under 60 FPS on high resolutions, that's why Nvidia keeps pushing for DLSS by the way, not because it's amazing but because it is the only scenario which generates playable conditions. The truth is that paying over a thousand dollars to not be able to play a game maxed out is pathetic, it means that the feature was implemented way too early. I can't blame them for not wanting to show any numbers for DXR games.



nguyen said:


> 60 FPS is already enough for a single-player game; idk why you need 144 fps for a single-player game.



Sounds dangerously close to "your eyes can't see more than 30fps". Funny how excuses like these pop out no matter the context.


----------



## Xzibit (Nov 2, 2020)

Ferrum Master said:


> What's with the low RAM speed in the test setup?



It's the supported mem speed, just like last gen.


----------



## Space Lynx (Nov 2, 2020)

Ferrum Master said:


> What's with the low RAM speed in the test setup?



Yep, and once overclocked to 4000 1:1, these benchmarks will show it beating the 3090 by a lot more... a RAM overclock gives Ryzen major gains, from my testing and what I have read. Surprised AMD didn't try to OC the RAM just to flex.


----------



## Chrispy_ (Nov 2, 2020)

The vanilla 6800 is actually looking really strong in the first few of those benchmarks.

It's great that yesterday's $1200 performance is now half price, but what the overwhelming majority have needed for two years is yesterday's $600 performance for $300.


----------



## Luminescent (Nov 2, 2020)

nguyen said:


> I'm playing Watch Dogs Legion right now; the FPS is already low without RT anyway. Enabling RT reflections really makes WD Legion next-gen graphically. 60 FPS is already enough for a single-player game; idk why you need 144 fps for a single-player game. For online multiplayer, then sure, 200 fps all the way.


I looked at the gameplay and I am not blown away by the graphics. What surprised me the most this year was PS5's Horizon Forbidden West; that is next-gen graphics that truly takes advantage of what 7 nm brought.
Nvidia is doing the usual crap to sway you from buying AMD and will eventually kill it, like PhysX, SLI and whatever gimmick they made in the past.
Ray tracing is here to stay, but DLSS is a strange one; if AMD makes something similar it will probably die off, and we are back to what really counts: architecture and, most important of all, a smaller node from TSMC or whoever is in the silicon manufacturing business.
I'm looking forward to seeing what they release in the $200-300 price range.


----------



## Dristun (Nov 2, 2020)

Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye-candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn videocard. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it - man, it looks amazing!
Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?


----------



## ratirt (Nov 2, 2020)

Luminescent said:


> I looked at the gameplay and I am not blown away by the graphics. What surprised me the most this year was PS5's Horizon Forbidden West; that is next-gen graphics that truly takes advantage of what 7 nm brought.
> Nvidia is doing the usual crap to sway you from buying AMD and will eventually kill it, like PhysX, SLI and whatever gimmick they made in the past.
> Ray tracing is here to stay, but DLSS is a strange one; if AMD makes something similar it will probably die off, and we are back to what really counts: architecture and, most important of all, a smaller node from TSMC or whoever is in the silicon manufacturing business.
> I'm looking forward to seeing what they release in the $200-300 price range.


AMD has an equivalent feature to DLSS, called Super Resolution, but it's hard to say how it works and how much it boosts performance. We will have to wait for reviews anyway.


----------



## nguyen (Nov 2, 2020)

Vya Domus said:


> Poor, the RT capability is poor just as it is with any other Nvidia GPU. RT is not yet a truly usable technology across the board no matter how hard Nvidia fanboys, or Nvidia themselves, try to make it seem like it is. The performance is still crap, most games run well under 60 FPS on high resolutions, that's why Nvidia keeps pushing for DLSS by the way, not because it's amazing but because it is the only scenario which generates playable conditions. The truth is that paying over a thousand dollars to not be able to play a game maxed out is pathetic, it means that the feature was implemented way too early. I can't blame them for not wanting to show any numbers for DXR games.
> Sounds dangerously close to "your eyes can't see more than 30fps". Funny how excuses like these pop out no matter the context.



Seems like the people who yearn for high FPS are usually the people who can't afford high FPS.

FPS is like money: once you have a sufficient amount, having more doesn't generally bring you more happiness. 60 FPS is really enough for a single-player game; when you have extra performance, just add more visual candy like RT.

And well, if you can't afford RT, then don't say something like RT is useless; that's just living in denial...


----------



## Vya Domus (Nov 2, 2020)

nguyen said:


> And well, if you can't afford RT, then don't say something like RT is useless; that's just living in denial...



Giga cringe. Maybe you read someone else's comment? When did I say it's useless?



nguyen said:


> 60 FPS is really enough for a single-player game; when you have extra performance, just add more visual candy like RT.



Says who, you? Non-RT visuals are also enough. See how dumb this sounds?


----------



## spnidel (Nov 2, 2020)

nguyen said:


> 60 FPS is really enough for a single-player game; when you have extra performance, just add more visual candy like RT.



lolno, 60 fps feels like 30 fps after playing on a 144hz display, things are just so much snappier on 144hz


----------



## nguyen (Nov 2, 2020)

Vya Domus said:


> Giga cringe. Maybe you read someone else's comment? When did I say it's useless?
> 
> Says who, you? Non-RT visuals are also enough. See how dumb this sounds?



Might as well play with low settings to get your high FPS there, enjoy your 100+ fps.
RT will always apply a performance hit; it isn't a free visual upgrade.



spnidel said:


> lolno, 60 fps feels like 30 fps after playing on a 144hz display, things are just so much snappier on 144hz



Really? Tell me what single-player games you can play at 144 Hz at 4K Ultra settings with your 5700 XT? CSGO?


----------



## ratirt (Nov 2, 2020)

spnidel said:


> lolno, 60 fps feels like 30 fps after playing on a 144hz display, things are just so much snappier on 144hz


60 FPS feels like 60 FPS. 100 FPS feels like 100 FPS, and so on.
This 144 Hz thing applies mostly to fast-paced games.


----------



## Vya Domus (Nov 2, 2020)

nguyen said:


> Might as well play with low settings to get your high FPS there, enjoy your 100+ fps.



What the hell are you on about?


----------



## nguyen (Nov 2, 2020)

Vya Domus said:


> What the hell are you on about?



It means that to get >60 fps in a single-player game with your GPU, you are mostly playing with low settings at 4K.
Or you actually never play any games and just decided that everyone needs high FPS.


----------



## Space Lynx (Nov 2, 2020)

Vya Domus said:


> What the hell are you on about?



I don't think he understands it's easy to get high refresh these days. I am playing Dragon's Dogma maxed out at 144 fps 1440p with a GTX 1070... lol


----------



## oldtimenoob (Nov 2, 2020)

The Frostbite engine always seems to favour AMD GPUs, even with the RX 5700s.


----------



## Vya Domus (Nov 2, 2020)

nguyen said:


> It means that to get >60 fps in a single-player game with your GPU, you are mostly playing with low settings at 4K.
> Or you actually never play any games and just decided that everyone needs high FPS.



I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single-player games? There is absolutely no logic to that. You think everyone who has a higher-refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single-player one suddenly they no longer need more than precisely 60? How the hell does that work? Truly mind-boggling.


----------



## nguyen (Nov 2, 2020)

Vya Domus said:


> I am going to have to gather a team of researchers to try and figure out what it is that you are saying.
> 
> Why say that 60 FPS is enough for single-player games? There is absolutely no logic to that. You think everyone who has a higher-refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single-player one suddenly they no longer need more than precisely 60? How the hell does that work? Truly mind-boggling.



60 FPS is enough to enjoy a single-player game at the best visual fidelity you can get.
I bought the very first 144 Hz 1440p display, but I played The Witcher 3 at 60 fps with the best visuals; no point in downgrading visuals just to get >60 FPS.
Same with Metro Exodus and Control; why would I sacrifice visuals when I have ~60 FPS already?

Now, switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS. Why wouldn't I?


----------



## Vya Domus (Nov 2, 2020)

nguyen said:


> no point in downgrading visuals just to get >60 FPS.



Still can't figure out that this is a purely subjective conclusion?


----------



## nguyen (Nov 2, 2020)

Vya Domus said:


> Still can't figure out that this is a purely subjective conclusion?



Tell me which editorial would recommend downgrading visuals to get >60 fps? You?
Because I can give you many editorials that target 60 FPS gaming.


----------



## HaKN ! (Nov 2, 2020)

What is it with people and ray tracing suddenly? Since it was announced, and until AMD showed the new cards, NOBODY was talking about ray tracing, and now?

Just be glad for the fricking competition; it's a win for all us customers. Nvidia and AMD really don't care about us; they just wanna make as much money as possible.


----------



## mystera (Nov 2, 2020)

nguyen said:


> How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR, then? That would make it easier to gauge RX 6000 RT capability.



That's like asking: _How do AMD cards perform in games that make heavy use of HairWorks???_

The RTX you refer to is Nvidia's proprietary implementation of Microsoft's DirectX Raytracing. How (and why) do you expect AMD hardware to perform better than Nvidia hardware in RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of raytracing, and only "some" of them will ship with RTX support alongside the console version of raytracing.

And based on how well AMD's raytracing looks and performs, Nvidia's RTX may (in time) become a niche feature for select sponsored games... To keep RTX relevant, Nvidia will have to invest more than it's worth in hardware and software (game dev partner) development, and given Nvidia's new focus on enterprise, it may be a hard sell (to investors).


----------



## Vya Domus (Nov 2, 2020)

nguyen said:


> Tell me which editorial would recommend downgrading visuals to get >60 fps? You?



I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high-refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high-refresh monitor, to then play at 60 Hz, right?


----------



## yoyo2004 (Nov 2, 2020)

nguyen said:


> Now, switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS. Why wouldn't I?





nguyen said:


> Tell me which editorial would recommend downgrading visuals to get >60 fps? You?



I am honestly very confused as to what it is you want or are trying to imply.


Different players have different preferences when gaming.


----------



## TumbleGeorge (Nov 2, 2020)

There is real RT (MS DXR) and there is Nvidia RT. Real RT will be in 100% of games; Nvidia RT in no more than 10% of that 100%. I think there will be no games exclusively supporting Nvidia RT.


----------



## arbiter (Nov 2, 2020)

mystera said:


> That's like asking: _How do AMD cards perform in games that make heavy use of HairWorks???_
> 
> The RTX you refer to is Nvidia's proprietary implementation of Microsoft's DirectX Raytracing. How (and why) do you expect AMD hardware to perform better than Nvidia hardware in RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of raytracing, and only "some" of them will ship with RTX support alongside the console version of raytracing.
> 
> And based on how well AMD's raytracing looks and performs, Nvidia's RTX may (in time) become a niche feature for select sponsored games... To keep RTX relevant, Nvidia will have to invest more than it's worth in hardware and software (game dev partner) development, and given Nvidia's new focus on enterprise, it may be a hard sell (to investors).


This sounds like the typical AMD excuse for why their card sucks with what was/is a standard.


----------



## ZoneDymo (Nov 2, 2020)

Dristun said:


> Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye-candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn videocard. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it - man, it looks amazing!
> Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?



I don't agree; I think the reflections in Watch Dogs look pretty dang impressive. Sad it's all so heavy, so a true ray-traced future is still several gens out for sure, but look at Digital Foundry's latest vid on it.


----------



## nguyen (Nov 2, 2020)

Vya Domus said:


> I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.
> 
> However, buying a high-refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high-refresh monitor, to then play at 60 Hz, right?



Sure, just tell me which games you play, exactly? CSGO? YouTube videos?
Almost everyone recommends turning down visuals in AAA games to hit 144 Hz at 1440p? Yeah, I really need some confirmation on that. No one would want to play AAA games at low settings just to hit 144 Hz, that I'm sure of.

I didn't say anyone should play at 60 FPS; if you have already maxed out all the graphical settings and can still get >60 FPS, then play at >60 FPS, although capping the framerate really helps with input latency with Nvidia Low Latency and AMD Anti-Lag in certain games.


----------



## ZoneDymo (Nov 2, 2020)

arbiter said:


> This sounds like the typical AMD excuse for why their card sucks with what was/is a standard.



I think you do not know what "standard" means or how it's applied.


----------



## EarthDog (Nov 2, 2020)

lynx29 said:


> Dragon's Dogma


This doesn't explain it to you? The title? Monkeys with crayons can draw the scenes fast enough, lol.

From 2016: "Given its old-gen nature, Dragon’s Dogma: Dark Arisen is not really a demanding title."






*Dragon's Dogma: Dark Arisen Gaming Graphics Performance Tweak Guide - www.tweaktown.com*
				




Just saying. 




mystera said:


> That's like asking: _How do AMD cards perform in games that make heavy use of HairWorks???_


It's nothing like it, really. AMD, like NV, uses DXR. They're both using the same API for RT.







mystera said:


> And based on how well AMD's raytracing looks and performs


Was anything official released on AMD RT performance?

RTX is hardware on the card. NV cards use the DXR API for RT, just as AMD will.


----------



## Calmmo (Nov 2, 2020)

Dristun said:


> Good day everyone, I'm the person geniunely excited about raytracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2 it's legitimately the only eye candy tech that looks fresh to my eyes and makes me want to spend 700$ on a goddamn videocard. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference but when it's all-out or close to it - man it looks amazing!
> Hey, I'm also excited about 6800XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?



I am similarly excited about RT.
It's just fanboys shouting at fanboys at this point. These same people would have been the ones mocking the GeForce 256 back in the day over HW T&L; pay them no heed.

We're just in that awkward phase now where DXR is still an unknown for most people, and we still don't know for sure if this year's or maybe the next cycle is the one that will bring mainstream acceptance/performance to RT. I personally am not aware of any non-DXR games, though I do believe the Nvidia-developed ones like Quake II RTX are probably going to be Nvidia HW only. I doubt games like Control aren't going to work on AMD; I suspect it's just AMD's software side of things not being ready enough. I would expect a lot of growing pains for the first half of 2021 and AMD DXR. Hopefully I'm wrong, but they are going into this with a two-year handicap.

New games will have cross-brand HW to work with soon, and as someone with a 2070 Super all I can say is the DXR game library is veeery small still, and it's only going to really grow now with the new consoles, since more or less all cross-platform AAA titles will be coming with some form of RT once this first cross-platform year of releases is over (and already some of those cross-plats are coming with RT anyway). So it bodes well overall for us mid to long term, regardless of hardware brand choices or... god forbid, loyalties.


----------



## R0H1T (Nov 2, 2020)

arbiter said:


> AMD excuse for why their card sucks with what was/is a *standard*.


You mean the *DXR* standard as opposed to Nvidia's proprietary RT, or, going further back, *FreeSync* and *Mantle*-based *Vulkan*, just to name a few?


----------



## EarthDog (Nov 2, 2020)

R0H1T said:


> You mean the *DXR* standard as opposed to Nvidia's proprietary RT


Whose proprietary RT? Nvidia uses DXR as well...



> When DXR is enabled by a Game Ready Driver, targeted for April (2019), the supported GeForce GTX graphics cards will work without game updates because ray-traced games are built on DirectX 12’s DirectX Raytracing API, DXR. This industry standard API uses compute-like ray tracing workloads that are compatible with both dedicated hardware units, such as the RT Cores, and the GPU’s general purpose shader cores.


----------



## RedelZaVedno (Nov 2, 2020)

I must admit the 3080 and 3090 look impressive performance- and price-wise, but the 6800 disappoints big time. Not because of its performance, but because of the price: 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.


----------



## kapone32 (Nov 2, 2020)

nguyen said:


> 60 FPS is enough to enjoy a single-player game at the best visual fidelity you can get.
> I bought the very first 144 Hz 1440p display, but I played The Witcher 3 at 60 fps with the best visuals; no point in downgrading visuals just to get >60 FPS.
> Same with Metro Exodus and Control; why would I sacrifice visuals when I have ~60 FPS already?
> 
> Now, switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS. Why wouldn't I?



I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it allows you to aim more easily. To keep it simple: in The Division 2, with an automatic rifle (900+ rounds) you can get headshot kills vs having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, with the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted; after about five years you could not find a Beta version of anything in popular culture. Which brings me to my last point: you don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.


----------



## nguyen (Nov 2, 2020)

kapone32 said:


> I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it allows you to aim more easily. To keep it simple: in The Division 2, with an automatic rifle (900+ rounds) you can get headshot kills vs having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, with the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted; after about five years you could not find a Beta version of anything in popular culture. Which brings me to my last point: you don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.



Do you play the single-player or the multiplayer side of The Division 2?
Like I said, for competitive play, like the multiplayer side of Div 2, I would use low settings to get the highest FPS I can.

Now tell me which you prefer with your current GPU:
RDR2 high settings at ~60 fps, or low settings at 144 fps?
AC O high settings at ~60 fps, or low settings at 144 fps?
Horizon Zero Dawn high settings at ~60 fps, or low settings at 144 fps?

Well, to be clear, when I said 60 FPS, I meant the minimum FPS.

Yeah, sure, if you count auto-overclocking and a proprietary feature (SAM) as making the 6900 XT equal to the 3090; see the hypocrisy there? Also, I can find higher benchmark numbers for the 3080/3090 online, so take AMD's numbers with a grain of salt.


----------



## WeeRab (Nov 2, 2020)

RedelZaVedno said:


> I must admit the 3080 and 3090 look impressive performance- and price-wise, but the 6800 disappoints big time. Not because of its performance, but because of the price: 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.


The 3070 is, at the moment, unicorn breath, like the rest of the Ampere lineup. What you call "impressive" regarding the 3090 becomes idiotic when a $1,500 card only beats an $800 card by 10%.
Oh, and CUDA is no good for gaming, whilst ray tracing kills performance without resorting to DLSS.
Ray tracing is today's equivalent of HairWorks or PhysX.
The leather jacket openly lied to Nvidia's consumer base, claiming the 3090 was "Titan-like" when it clearly isn't, and promising plenty of stock for buyers. The reality is that abysmal yields are the reason the Ampere series is almost impossible to come by.


----------



## mechtech (Nov 2, 2020)

Chrispy_ said:


> The vanilla 6800 is actually looking really strong in the first few of those benchmarks.
> 
> It's great that yesterday's $1200 performance is now half price, but what the overwhelming majority have needed for two years is yesterday's $600 performance for $300.



I wonder if they will release a cheaper 8GB version of the 6800?


----------



## Xaled (Nov 2, 2020)

RedelZaVedno said:


> I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.


Except the 3070's real price is not $499. Since the RTX 2000 series, Nvidia has been selling its cards at much higher prices than the announced ones.
This is fraud, and reviewer sites and channels should warn people and condemn Nvidia for it, but very few do.


----------



## mysterfix (Nov 2, 2020)

RedelZaVedno said:


> I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.


You are wrong; extra VRAM costs money. A lot of people will gladly pay for the extra performance plus double the VRAM. Funny how many people were bitching about the amount of VRAM on Nvidia cards just before AMD released their new cards. That extra $80 isn't going to be a deal breaker for anyone who wasn't already set on buying Nvidia.


----------



## B-Real (Nov 2, 2020)

nguyen said:


> How about AMD release COD Modern Warfare, BF5, SOTR with DXR benchmark number then ? make it easier to gauge RX6000 RT capability.



XDDDD Cry cry cry.


----------



## nguyen (Nov 2, 2020)

B-Real said:


> XDDDD Cry cry cry.



Oh wow, who could have thought that the RX 6000 can run RT, right?

Benchmark Results Radeon RX 6800 XT Show Good RT scores and Excellent Time Spy GPU score (www.guru3d.com):
We've seen AMD post some numbers already showing that we can expect raytracing performance at a GeForce RTX 3070 level with the new 6800 XT. However, the first user-based benchmarks are now surfacing...


----------



## EarthDog (Nov 2, 2020)

Xaled said:


> This is fraud, and reviewer sites and channels should warn people and condemn Nvidia for it, but very few do.


lol, wat? How delusional are you? This is not a crime.


----------



## kapone32 (Nov 2, 2020)

nguyen said:


> Do you play the single-player or the multiplayer side of The Division 2?
> Like I said, for competitive games like the multiplayer side of Div 2, I would use low settings to get the highest FPS I can.
> 
> Now tell me which do you prefer with your current GPU:
> ...


The point of getting a next-generation GPU has always been high FPS and great visual quality. With my GPU I always prefer high settings, but my AMD driver is pretty good at tuning settings for each game.

It's a 1440p FreeSync 2 panel, 43-165 Hz high refresh, so there is nothing to lament.

You see, you are still in denial. There is no bias because of SAM; it's the same thing Nvidia does with CUDA. Just because SAM mitigates NVMe overhead doesn't mean it is somehow cheating. As far as trusting AMD, you can look at my history and know that I have always had confidence in the leadership and honesty of Lisa Su. No one at AMD has confirmed or leaked anything substantive (except her) since Ryzen was announced. I know that in a world amplified by social media it is sometimes hard to accept that AMD GPUs are (apparently) faster than Nvidia GPUs this generation, period. Announcing/leaking a 3080 Ti card while in the midst (weeks) of launch issues with the current 3000 cards is actually laughable in its desperation. Just imagine if AMD announces that 3000 series CPUs will support SAM... period, point, blank. The future is indeed bright, and X570/B550 has proven to be a great investment.


----------



## Xex360 (Nov 2, 2020)

nguyen said:


> How about AMD release COD Modern Warfare, BF5, SOTR with DXR benchmark number then ? make it easier to gauge RX6000 RT capability.


I think no hardware is good enough for ray tracing. And it doesn't bring anything special; games need more polygons and way better textures.

Control PC: a vision for the future of real-time rendering? (www.eurogamer.net):
Last week, we posed the question: has ray tracing finally found its killer app? While Minecraft RTX and Quake 2 RTX hav…


----------



## mtcn77 (Nov 2, 2020)

You know, this is like the mission 'The Amerigo,' where Kerrigan finds out about the ghost project to unlock her psionic abilities. SAM is the next HSA target: shared address space. The next one is a common memory interface.


----------



## EarthDog (Nov 2, 2020)

kapone32 said:


> You see, you are still in denial. There is no bias because of SAM; it's the same thing Nvidia does with CUDA. Just because SAM mitigates NVMe overhead doesn't mean it is somehow cheating.


The difference is in games. This thread is about gaming benchmarks.

In order to get the full performance of these cards, you need to be balls deep in the AMD ecosystem. This means the latest mobos and processors. Most people aren't there, and it will take time to get there. That said, the real curiosity to me is how this works on Intel and non-5000-series/B550/X570 setups. From the looks of the charts, that knocks things down a peg.


kapone32 said:


> Just imagine if AMD announces that 3000 series CPUs will support SAM.


Just imagine... as that's all it will ever be...


----------



## Blueberries (Nov 2, 2020)

So serious question: If a 6800 will get you 80-100 FPS at 4k, is there any incentive other than "future-proofing" to purchase anything higher specced? I know some people have 120hz+ 4k panels but for the 144hz/1440p and 60hz/4k crowd (i.e., the vast majority) it doesn't seem to make a lot of sense to dump extra heat into your chassis.


----------



## mtcn77 (Nov 2, 2020)

Blueberries said:


> So serious question: If a 6800 will get you 80-100 FPS at 4k, is there any incentive other than "future-proofing" to purchase anything higher specced? I know some people have 120hz+ 4k panels but for the 144hz/1440p and 60hz/4k crowd (i.e., the vast majority) it doesn't seem to make a lot of sense to dump extra heat into your chassis.


The 6800 is a different hardware class. RDNA2 has fewer code bubbles. If anything it will lessen your heat impact compared to the runner-up GPU.

I don't want to be a team loyalist about it, but this is how the green team thinks. These GPUs aren't mobile parts; they aren't for the same kinds of workloads.

Beating yesteryear's competition to the punch is something only Intel's GPU goals could imagine, so set your goals high so as not to become a laughing stock, imo.


----------



## RedelZaVedno (Nov 2, 2020)

mysterfix said:


> You are wrong, extra vram costs money. A lot of people will gladly pay for the extra performance + double the vram. Funny how many people were just bitching about the amount of vram on Nvidia cards before AMD released their new cards. That extra $80 isn't going to be a deal breaker for anyone who wasn't already set on buying Nvidia.


1 GByte of GDDR6, 3,500 MHz, 15 Gbps (MT61K256M32JE-14: A TR) costs $7.01 at the moment if you order up to a million pieces; you can negotiate much lower prices if you order more. That's $56 for 8 GB in the worst-case scenario (around $40 is more realistic). AMD is likely making a nice profit selling you the additional VRAM for 80 bucks.
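As a sanity check on that arithmetic, a quick sketch; the $7.01-per-module volume price is the figure quoted above, not confirmed BOM data:

```python
# Rough bill-of-materials estimate for 8 GB of GDDR6, using the quoted
# volume price of ~$7.01 per 1 GB (MT61K256M32-class) module.
price_per_module_usd = 7.01   # quoted volume price per 1 GB module
modules = 8                   # 8 GB worth of 1 GB modules
bom = price_per_module_usd * modules
print(f"Worst-case memory cost: ${bom:.2f}")  # $56.08
```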


----------



## TheTechGuy1337 (Nov 2, 2020)

nguyen said:


> Do you play the single-player or the multiplayer side of The Division 2?
> Like I said, for competitive games like the multiplayer side of Div 2, I would use low settings to get the highest FPS I can.
> 
> Now tell me which do you prefer with your current GPU:
> ...





I completely agree with this guy. Refresh rate does not matter as much after 60 FPS. I own a 60 Hz, a 120 Hz, and a 144 Hz monitor; one of them is a laptop with a horrid gray-to-gray response time of 45 ms. All three of them perform similarly, and they feel the same as the 60 Hz monitor with vsync on. If you turn vsync off, then the difference shows, but that is it: only when the game is producing more frames than a monitor can handle do these higher refresh rates come into play, and with implementations like vsync this becomes less of a deal. In my opinion, response time, gray-to-gray performance, brightness, and color accuracy are hands down the most important aspects of monitors. I want to be lost in a new world and not reminded that I have to work in the morning.


----------



## xman2007 (Nov 2, 2020)

How dare AMD price their cards in line with Nvidia's on performance; how dare they turn a profit, say all the people who were praising Nvidia two weeks ago for selling a 2080 Ti-performance GPU for $500, cards which are also vapourware.


----------



## nguyen (Nov 2, 2020)

kapone32 said:


> The point of getting a next-generation GPU has always been high FPS and great visual quality. With my GPU I always prefer high settings, but my AMD driver is pretty good at tuning settings for each game.
> 
> It's a 1440p FreeSync 2 panel, 43-165 Hz high refresh, so there is nothing to lament.
> 
> You see, you are still in denial. There is no bias because of SAM; it's the same thing Nvidia does with CUDA. Just because SAM mitigates NVMe overhead doesn't mean it is somehow cheating. As far as trusting AMD, you can look at my history and know that I have always had confidence in the leadership and honesty of Lisa Su. No one at AMD has confirmed or leaked anything substantive (except her) since Ryzen was announced. I know that in a world amplified by social media it is sometimes hard to accept that AMD GPUs are (apparently) faster than Nvidia GPUs this generation, period. Announcing/leaking a 3080 Ti card while in the midst (weeks) of launch issues with the current 3000 cards is actually laughable in its desperation. Just imagine if AMD announces that 3000 series CPUs will support SAM... period, point, blank. The future is indeed bright, and X570/B550 has proven to be a great investment.



So you agree that you are not using low settings just to get 144 fps, then?
New games are designed to push next-gen GPUs to their knees all the same; it's all a scheme to sell you GPUs, you know. Or you could just play CSGO for eternity and not care about brand new GPUs.
Well, I am just as happy as you are when AMD is competitive, because now the retailers can't charge cutthroat prices for Ampere as before, though at this point I'm just gonna wait for the 3080 Ti.


----------



## RedelZaVedno (Nov 2, 2020)

xman2007 said:


> How dare AMD price their cards in line with nvidia cards /performance, how dare they turn a profit, says all the ones who were praising nvidia 2 weeks ago for selling a 2080ti performance gpu for 500, which are also vapourware


An xx70-class GPU should never cost more than 400 bucks. I blame FOMO buyers, fanboys, and AMD not being competitive in the high-end retail GPU market segment for the last 7 years, enabling Ngreedia to hike prices as they please. Let's face it, the 3090 is nothing more than a "3080 Ti" with additional VRAM for 500 bucks more than the 2080 Ti, which was 300 more ($500 in real life) than the 1080 Ti. The praised 3070 is the most expensive xx70-class GPU ever released besides the 2070(S), yet the 2070 was the best-selling RTX card in the Turing line; go figure.
What really pisses me off now is AMD deciding to go along with Ngreedia's price hikes, not even bothering to compete with them on bettering the price-to-performance ratio anymore. The way things stand today, GPU prices will only go up, as AMD has obviously chosen higher profit margins over trying to increase GPU market share (by offering consumers substantially more for less). DIY PC builders are getting milked by both companies, and they seem not to care. That's why I decided to get out of the market: I'm keeping my 1080 Ti and will wait till RDNA3 to buy a 3080XT on the 2nd-hand market for 300 bucks.


----------



## Zach_01 (Nov 2, 2020)

Personally I do not consider the 6900 XT a 3090 equal; it's between the 3080 and 3090, and that is without any feature enabled.
On the other hand, the 6800 XT is a direct 3080 competitor, even with all the extra features off.

At AMD's presentation, the "Rage Mode" OC and SAM together gained around ~6.5% FPS on average (+2-13% depending on the game). We really don't know how much of it was from SAM alone.
Just remove the white tiles above the red lines.


----------



## EarthDog (Nov 2, 2020)

Zach_01 said:


> At AMD's presentation, the "Rage Mode" OC and SAM together gained around ~6.5% FPS on average (+2-13% depending on the game). We really don't know how much of it was from SAM alone.
> Just remove the white tiles above the red lines.


I've heard Rage Mode isn't much, and that it is mostly SAM doing this(?). I recall Linus mentioning that it isn't much more than a power limit increase and fan speed increase to get more boost.

This is great for the few who are balls deep in their ecosystem... but what about the majority of users? How many, in the current landscape, are using non-B550/X570 systems (an overwhelming majority, surely)? You need to upgrade your CPU and mobo to support this feature. For the majority, we need to see how these perform without it. From the chart above, it looks like that takes a win back to a loss and a couple of wins back to virtual ties. I'd love to see this compared to overclocked 3080s instead of whatever they have... oh, and on the same API.


----------



## RedelZaVedno (Nov 2, 2020)

Zach_01 said:


> Personally I do not consider the 6900 XT a 3090 equal; it's between the 3080 and 3090, and that is without any feature enabled.
> On the other hand, the 6800 XT is a direct 3080 competitor, even with all the extra features off.
> 
> At AMD's presentation, the "Rage Mode" OC and SAM together gained around ~6.5% FPS on average (+2-13% depending on the game). We really don't know how much of it was from SAM alone.
> ...


Now, let's be honest: RDNA2 rasterization performance is very good if AMD is not lying; the pricing, not so much. 1,000 bucks is still A LOT of money to pay for a gaming GPU.


----------



## kapone32 (Nov 2, 2020)

EarthDog said:


> The difference is in games. This thread is about gaming benchmarks.
> 
> In order to get the full performance of these cards, you need to be balls deep in the AMD ecosystem. This means the latest mobos and processors. Most people aren't there, and it will take time to get there. That said, the real curiosity to me is how this works on Intel and non-5000-series/B550/X570 setups. From the looks of the charts, that knocks things down a peg.
> Just imagine... as that's all it will ever be...


Isn't the purpose of gaming benchmarks to gauge gaming performance, period? Yet people also included Adobe Premiere benchmarks in AMD reviews. I do not agree that AMD has not penetrated the market, and it is not like X570 or (some) B550 boards are expensive; it is not like you have to get an X670 or B650 board. X570 and now B550 are both mature platforms. If these first CPUs all support SAM, then the non-X parts should do the same, so the gap could be wider. I can understand, as I too am intrigued to see how Intel and Nvidia cards work with either. I am going to be selfish in this, though, as I bought an X570 months ago specifically for this, as soon as I saw the PS5 technical brief.


----------



## Zach_01 (Nov 2, 2020)

RedelZaVedno said:


> Now, let's be honest RDNA2 rasterization performance is very good if AMD is not lying, pricing not so much. 1000 bucks is still A LOT of $ to pay for a gaming GPU.
> View attachment 174176


I'm not going to argue... but
AMD pricing +54% more for a +5% perf GPU has somehow followed the utter stupidity of...
nVidia pricing +114% more for a +10% perf GPU.

It is what it is...!

Nonetheless, AMD's cards have more perf/$ value.
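The premiums quoted above check out against the launch MSRPs; a minimal sketch, assuming 6800 XT $649, 6900 XT $999, 3080 $699, and 3090 $1,499:

```python
# Price premium of each vendor's halo card over the tier below it,
# computed from the assumed launch MSRPs.
def premium_pct(lower: float, halo: float) -> float:
    """Percentage price increase from the lower tier to the halo card."""
    return (halo / lower - 1) * 100

print(f"AMD 6800 XT -> 6900 XT: +{premium_pct(649, 999):.0f}%")   # +54%
print(f"NVIDIA 3080 -> 3090:    +{premium_pct(699, 1499):.0f}%")  # +114%
```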


----------



## ODOGG26 (Nov 2, 2020)

mysterfix said:


> You are wrong; extra VRAM costs money. A lot of people will gladly pay for the extra performance plus double the VRAM. Funny how many people were bitching about the amount of VRAM on Nvidia cards just before AMD released their new cards. That extra $80 isn't going to be a deal breaker for anyone who wasn't already set on buying Nvidia.


100 percent agree. People are only talking this way because it's AMD. I think the 6800 non-XT is really good: double the VRAM plus 15-20% faster than the 3070. lmao, and people want that for no additional cost. Delusional.


----------



## EarthDog (Nov 2, 2020)

kapone32 said:


> Isn't the purpose of gaming benchmarks to gauge gaming performance, period? Yet people also included Adobe Premiere benchmarks in AMD reviews. I do not agree that AMD has not penetrated the market, and it is not like X570 or (some) B550 boards are expensive; it is not like you have to get an X670 or B650 board. X570 and now B550 are both mature platforms. If these first CPUs all support SAM, then the non-X parts should do the same, so the gap could be wider. I can understand, as I too am intrigued to see how Intel and Nvidia cards work with either. I am going to be selfish in this, though, as I bought an X570 months ago specifically for this, as soon as I saw the PS5 technical brief.


A few sites cover that, sure. But this is in reference to gaming, following the thread here.

They've penetrated the market... but of those, who owns a B550/X570? I'd guess more own X470 and lower than X570/B550. Remember, there was a large contingent pissed about 5000 series support on those 400 series boards; with that, it feels like many were just holding out. Also, not a soul has these CPUs yet, so there is literally zero penetration on that front. At minimum, you need to buy a new CPU; at worst, you're buying a CPU and a motherboard for this. Again, not something a majority has at this time. So IMO, it would be more than prudent to include the majority here and see how these perform on those systems until one gets a new 5000 series CPU and motherboard. Obviously seeing BOTH would be ideal.


----------



## Chomiq (Nov 2, 2020)

It's funny


nguyen said:


> Oh wow, who could have thought that the RX 6000 can run RT, right?
> 
> 
> 
> ...





> That *is about as fast as the RTX 3070 with ray tracing, however, with DLSS on*. So if you filter out the performance benefit from DLSS, that's not bad, really.



Next time read thoroughly.


----------



## Punkenjoy (Nov 2, 2020)

We can't really compare T&L on the first GeForce with ray tracing. Ray tracing adds a new feature to graphics rendering, whereas T&L used a fixed function on the GPU to speed up something that every game already had to do, and was doing on the CPU.

So right now we really are adding new tricks to game engines, not offloading or accelerating an already existing process.

The thing is, we are at a point where we have to use so many tricks to emulate lights and reflections that it comes really close to requiring as much power as real ray tracing. Using ray tracing for these things requires a lot of computational power, but far less complexity than all the tricks we currently use to emulate them.

It's one of the ways we can increase graphics fidelity. But there are still other areas where we need to improve, like more polygons, better physics and object deformation, and better textures and materials.

Ray tracing in games is clearly the future, and it will clearly improve with every generation. It's like when the first shaders were added to GPUs: they were used for just a few effects and the performance hit was huge, and by the time they were used widely, the first GPUs supporting them were totally outclassed anyway. It will be the same with the current generation (both Nvidia and AMD GPUs).

So, yes, ray tracing is the future and is here to stay. But no, it shouldn't be a buying decision. No one should say, "I will get the 3080 instead of a 6800 XT to future-proof for ray tracing."

People should just buy the best graphics card for the games released right now, and maybe within the next year at most.


----------



## nguyen (Nov 2, 2020)

Chomiq said:


> It's funny
> Next time read thoroughly.



It's funny:
I just showed that the RX 6000 can indeed run RT, yet AMD did not share the numbers.

Next time read more thoroughly, please.
Also, I get 86 fps with a 2080 Ti at 1440p with RTX on and DLSS off; I have no clue where the guy who wrote the article got his numbers from.


----------



## lexluthermiester (Nov 2, 2020)

NeuralNexus said:


> NO ONE REALLY CARES ABOUT RAYTRACING


If you really think that, I have a bridge in Brooklyn NY I want to sell you...


NeuralNexus said:


> All future games will be optimized for AMD's raytracing solution anyway.


That's an assumption on your part, and not a very logical one, especially considering that Nvidia has already had two years to gain a lead in both deployment and development of RTRT.


----------



## mysterfix (Nov 2, 2020)

Zach_01 said:


> I'm not going to argue... but
> AMD pricing +54% more for a +5% perf GPU has somehow followed the utter stupidity of...
> nVidia pricing +114% more for a +10% perf GPU
> 
> ...


The simple fact is both parts are very low-yield cards, and both companies know there are silly people with more money than common sense who will pay to have "the fastest, bestest graphics card money can buy." If no one bought them, they would lower prices.


----------



## OGoc (Nov 2, 2020)

The average FPS figures using the 5900X seem lower than TPU's using a 9900K @ 5.0 GHz.


----------



## ODOGG26 (Nov 2, 2020)

EarthDog said:


> I've heard Rage Mode isn't much, and that it is mostly SAM doing this(?). I recall Linus mentioning that it isn't much more than a power limit increase and fan speed increase to get more boost.
> 
> This is great for the few who are balls deep in their ecosystem... but what about the majority of users? How many, in the current landscape, are using non-B550/X570 systems (an overwhelming majority, surely)? You need to upgrade your CPU and mobo to support this feature. For the majority, we need to see how these perform without it. From the chart above, it looks like that takes a win back to a loss and a couple of wins back to virtual ties. I'd love to see this compared to overclocked 3080s instead of whatever they have... oh, and on the same API.


I don't see what you are seeing in that chart. I see, from left to right: win, tie, win, win (you can chalk it up to variance, but the bar is higher), win (same as the previous win), loss, loss. This is if you ignore the white added performance. Technically more wins than losses, but all in all we can call them equal. As far as overclocking goes, I don't think the 3080 has a chance OC vs. OC. These new Ampere cards don't overclock well; not much left on the table, imo. Not proven yet, but there are already reports of great OC headroom for RDNA2.


----------



## InVasMani (Nov 2, 2020)

I still want to see 1080p results; the RX 6800 will probably beat the RTX 3080 in quite a few of those cases, which is somewhat hilarious; the e-league Intel crowd will probably love those results. I don't understand Nvidia's thinking with the Ampere design: they are pushing 4K performance while also pushing RTRT, which has no chance in hell of being practical at 4K with good results. There is something very wrong with that picture to me. It would be really interesting to see how RDNA2 does at RTRT at 1080p compared to RTX, and even at 720p for that matter. The mClassic would be rather interesting with 720p RTRT, upscaling it to 1080p 120 Hz.


----------



## Cheeseball (Nov 2, 2020)

Modern Warfare and BFV have always been a bit Radeon-biased: CODMW is up to 10% better on the RX 5700 XT compared to the 2070 Super with the same graphical settings (RT off, NVIDIA Reflex off), and BFV has always shown to be faster on AMD cards (the 5700 XT can beat the 2080 Super at 1080p and 1440p).

SOTR and Doom Eternal seem to be the more even benchmarks of them all, with the Foundation engine in SOTR being on DX12 (although it supports DLSS and RTX) and id Tech 7 on Vulkan.

I'd like to see how well these do in the Quantic Dream engine or in the latest revision of Unreal Engine 4. My prediction is that the 6800 XT is match-for-match with the RTX 3080, and that the RTX 3090 will be beaten on value due to the waaaaay more affordable price.



ODOGG26 said:


> I don't see what you are seeing in that chart. I see, from left to right: win, tie, win, win (you can chalk it up to variance, but the bar is higher), win (same as the previous win), loss, loss. This is if you ignore the white added performance. Technically more wins than losses, but all in all we can call them equal. As far as overclocking goes, I don't think the 3080 has a chance OC vs. OC. These new Ampere cards don't overclock well; not much left on the table, imo. Not proven yet, but there are already reports of great OC headroom for RDNA2.



The RX 5700 XT didn't really overclock great either (2,000+ MHz only yielded at most ~10 extra FPS with most models), but we'll see how the 6800 XT works out.


----------



## EarthDog (Nov 2, 2020)

ODOGG26 said:


> I don't see what you are seeing in that chart. I see, from left to right: win, tie, win, win (you can chalk it up to variance, but the bar is higher), win (same as the previous win), loss, loss. This is if you ignore the white added performance. Technically more wins than losses, but all in all we can call them equal. As far as overclocking goes, I don't think the 3080 has a chance OC vs. OC. These new Ampere cards don't overclock well; not much left on the table, imo. Not proven yet, but there are already reports of great OC headroom for RDNA2.


We're looking at the same chart, right? The one AMD put up there, not the DIY bar chart below my post? I stand by what I said 100%; nothing was wrong in what I said.

I see (without all the boosts) AMD going wins/ties/wins/wins/wins/loses/loses. I'd also like to note the scale: 20% between tiers, so most of these wins (Doom, GOW5, Hitman) are really negligible. I'd call the cards about equal as well (unless you're overclocking with Rage Mode and using an X570/B550/5000 series setup). But yeah, W/T/W/W/W/L/L, with negligible differences between two of those wins.

RE: overclocking, indeed, nobody has a lot of headroom these days. That said, if you look at TPU reviews, we are seeing a couple-few % depending on the model. With how close some of these AMD benchmarks are, that makes those titles a tie or flips them the other way negligibly, just like what we see in red only.


----------



## Cheeseball (Nov 2, 2020)

EarthDog said:


> RE: Overclocking, indeed, nobody has a lot of headroom these days. That said, if you look at TPU reviews, we are seeing a couple-few % depending on the model.



Indeed, these new GPUs (speaking from my experience with the RX 5700 XT and RTX 3080) are just like Ryzen 3rd gen: keep them cool and they will boost higher. No need to increase clocks (at least for me with a 144 Hz monitor); reducing power draw while keeping stock performance (or better, if it stays cool) is what it's about now.


----------



## Shatun_Bear (Nov 2, 2020)

Nvidia cheaping out by using Samsung's poor 8nm node is really biting them in the backside...

I can't believe that at 1440p the middle RDNA2 card (6800 XT) is faster than the 3090! And this is without Rage Mode enabled:

AMD discloses more Radeon RX 6900XT, RX 6800XT and RX 6800 gaming benchmarks - VideoCardz.com (videocardz.com):
AMD's marketing machine is certainly not slowing down after the big announcement. Today the manufacturer published gaming benchmarks on an easy-to-use website with direct comparison to its competitor. AMD publishes further Radeon RX 6900/6800 series gaming performance data. AMD is comparing Radeon...


----------



## ODOGG26 (Nov 2, 2020)

Cheeseball said:


> Modern Warfare and BFV have always been a bit Radeon-biased: CODMW is up to 10% better on the RX 5700 XT compared to the 2070 Super with the same graphical settings (RT off, NVIDIA Reflex off), and BFV has always shown to be faster on AMD cards (the 5700 XT can beat the 2080 Super at 1080p and 1440p).
> 
> SOTR and Doom Eternal seem to be the more even benchmarks of them all, with the Foundation engine in SOTR being on DX12 (although it supports DLSS and RTX) and id Tech 7 on Vulkan.
> 
> ...


Agreed. Should be better one would think with being full RDNA now. But we shall see.


----------



## Makaveli (Nov 2, 2020)

There are so many clueless posts in this thread, it's hilarious. Thanks for the laughs.


----------



## Batailleuse (Nov 2, 2020)

RedelZaVedno said:


> 1 GByte of GDDR6, 3,500 MHz, 15 Gbps (MT61K256M32JE-14: A TR) costs $7.01 at the moment if you order up to a million pieces; you can negotiate much lower prices if you order more. That's $56 for 8 GB in the worst-case scenario (around $40 is more realistic). AMD is likely making a nice profit selling you the additional VRAM for 80 bucks.



You forget a few things in your pricing:

1. You quote the raw price; there is also manufacturing to add.
2. Those are B2B prices (no VAT); you have to add VAT for consumers.
3. You forget about retail margins.

When you add, say, 20% VAT, your $56 becomes about $67, and with a retail margin on top you're near $80.

Honestly... big fanboyism towards Nvidia right here. AMD gives slightly better perf and double the RAM for slightly more money, not much more than what the extra 8 GB costs.
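To make the stack-up explicit, a minimal sketch; the 20% VAT and the 20% retail margin are illustrative assumptions, not actual figures:

```python
# How a ~$56 memory bill-of-materials can approach an $80 retail delta
# once VAT and retail margin are layered on (both rates assumed at 20%).
bom = 56.0                      # raw GDDR6 cost for 8 GB at ~$7/GB
with_vat = bom * 1.20           # add 20% VAT for the consumer price
with_margin = with_vat * 1.20   # add a hypothetical 20% retail margin
print(f"With VAT:    ${with_vat:.2f}")     # $67.20
print(f"With margin: ${with_margin:.2f}")  # $80.64
```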


----------



## Deleted member 24505 (Nov 2, 2020)

Makaveli said:


> There are so many clueless posts in this thread, it's hilarious. Thanks for the laughs.



yeah man


----------



## Batailleuse (Nov 2, 2020)

TheTechGuy1337 said:


> I completely agree with this guy. Refresh rate does not matter as much after 60 FPS. I own a 60 Hz, a 120 Hz, and a 144 Hz monitor; one of them is a laptop with a horrid gray-to-gray response time of 45 ms. All three of them perform similarly, and they feel the same as the 60 Hz monitor with vsync on. If you turn vsync off, then the difference shows, but that is it: only when the game is producing more frames than a monitor can handle do these higher refresh rates come into play, and with implementations like vsync this becomes less of a deal. In my opinion, response time, gray-to-gray performance, brightness, and color accuracy are hands down the most important aspects of monitors. I want to be lost in a new world and not reminded that I have to work in the morning.



Each person has a different sensitivity to refresh rates.

60 Hz for me is low, and I clearly feel the difference between my 144 Hz monitor and a 60 Hz one.

However, having done a blind test across 100/120/144/240 Hz, I can't see the difference past 120 Hz any more.

So for now I'm keeping my 144 Hz monitor. I know I could eventually handle a decent 120 Hz 4K smart TV, but I couldn't go back to 60 Hz.


----------



## spnidel (Nov 2, 2020)

nguyen said:


> Really, tell me what single player games can you play at 144hz at 4K Ultra setting with your 5700XT ? CSGO ?


lmao who said anything about 4k 144hz wtf


----------



## Ashtr1x (Nov 2, 2020)

NeuralNexus said:


> NO ONE REALLY CARES ABOUT RAYTRACING...All future games will be optimized for AMD's raytracing solution anyway. Given that the gaming being developed for this generation will be built to purposefully use Zen 2 and RDNA 2 architecture as the base specs.



Same BS every time. What happened to the AMD-based Jaguar trash and the pathetic GPUs in the 8th-gen conslow boxes? Nothing but corners being cut and trash tradeoffs.


----------



## TheoneandonlyMrK (Nov 2, 2020)

Ashtr1x said:


> Same BS everytime. What happened to the AMD based Jaguar trash and the pathetic GPUs in the 8th gen conslow boxes ? Nothing but corners being cut and trash tradeoffs.


Well, like Atari and many others, you could crack on and build your own; I would be surprised if you could beat the performance of the Xbox Series X for its price.
There are a few reasons for Jaguar and its failure, but arguably the main one is simply that the competition was better, and that was a while ago anyway.


----------



## InVasMani (Nov 2, 2020)

I really want to see AMD add 1080p results to these benchmarks so I can see how the RX 6800 performs relative to the RTX 3080 at that resolution. The RTX 3080's average edge over the RX 6800 narrows quite a bit at 1440p, so if that extends even further at 1080p it's very interesting, given that the RX 6800 is really aiming to compete against the RTX 3070 price-wise. The RTRT performance will be interesting as well, especially if it's relatively close and the RDNA2 architecture keeps performing more competitively against Ampere at 1440p and below. If the Infinity Cache is playing a role in these results, that's also quite fascinating, and could be a big perk for those going with lower resolutions and higher refresh rates rather than the opposite.

https://www.amd.com/en/gaming/graphics-gaming-benchmarks



Shatun_Bear said:


> Nvidia cheaping out by using Samsung's poor 8nm node is really biting them in the backside...
> 
> I can't believe in 1440p the middle RDNA2 card (6800XT) is faster than the 3090!! And this is without RAGE mode enabled:
> 
> ...


So the RX 6800 is 10.8% slower than the RTX 3080 at 4K, but only 5.2% slower at 1440p. I suppose that means at 1080p the RX 6800 should be about even with the RTX 3080, at least in these kinds of benchmarks. With RTRT it could be different, of course; still impressive for a card that's actually competing with the much cheaper RTX 3070. That resolution-dependent scaling also looks more pronounced on the RX 6800 than on the RX 6800 XT and RX 6900 XT; the relative gain against the RTX 3080 when dropping resolution is smaller for those higher-end models. The fact that the gap drops to 5.2% rather than just half of 10.8% (5.4%) is telling as well. From the looks of things, if the trend continues at 1080p, the RX 6800 will likely compete quite well against the RTX 3080 there, all while being much cheaper.
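The relative-gap arithmetic above is just (card ÷ reference − 1). A quick sketch with placeholder FPS numbers (illustrative values, not AMD's actual data):

```python
# Helper for the relative-gap comparison above. FPS inputs are
# placeholders; plug in real review averages to reproduce the deltas.
def gap_pct(card_fps, reference_fps):
    """How much slower (negative) or faster (positive) card is vs. reference, in %."""
    return (card_fps / reference_fps - 1) * 100

# e.g. a hypothetical 4K average of 89.2 FPS vs. the 3080's 100.0:
print(round(gap_pct(89.2, 100.0), 1))   # -> -10.8
```

Comparing the same ratio at 4K, 1440p, and 1080p is what shows whether the gap keeps narrowing as resolution drops.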


----------



## B-Real (Nov 2, 2020)

1d10t said:


> It's funny because last year people ignore these RTRT and refer them as to "unnecessary" but now is major selling point along DLSS.


Exactly what I wanted to answer him with... Great catch.


----------



## mtcn77 (Nov 2, 2020)

B-Real said:


> Absolutely what I wanted to answer to him...  Great catch.


No catch. Nvidia hasn't bucked the trend of beating old cards in the same benchmarks yet. This is a reflection of the mobile mindset: you don't have to move the competition forward there.
I wish I could say ray tracing is a success, but let's be real: unless consoles bring it to the mainstream, it won't be.
Exclusivity is good and all, but it doesn't pay the bills. I have anecdotal reference for just how pitiful the situation is right now. I'm sure you have watched the webcasters repeating the same market-insider pitches about how little, just a drop in the bucket, sponsorships are next to the big bucks. Unless sales move forward, ray tracing consumers are just hunting for Nvidia-sponsored titles. Have they had their big break yet, or are there really that many of them?


----------



## Initialised (Nov 2, 2020)

Back on October 9th you released this: https://www.techpowerup.com/273150/...tpus-own-benchmark-numbers-of-comparable-gpus






1: Can we please have an update of this chart for the three SKUs since it appears that this is based on 6800 (little big Navi)?
2: Do you think this indicates that the lower TGP/CU units in the consoles do indeed compare to 2080Ti?
3: Do the console numbers give an indication of the lower SKUs in the RX 6000 product stack?
4: Do the PS and Xbox SoCs pave the way for the 6000G APUs?
5: Do the PS5 and XBox get a 5nm refresh in a couple of years?

| CUs | Frequency | Platform |
| --- | --- | --- |
| 20 | 1.565 GHz | Xbox Series S (6500? equivalent to 5700) |
| 36 | 2.23 GHz | PlayStation 5 (6600?) |
| 52 | 1.825 GHz | Xbox Series X (6700?) |
| 60 | 1.815/2.105 GHz | 6800 |
| 72 | 2.015/2.25 GHz | 6800 XT |
| 80 | 2.015/2.25 GHz | 6900 |
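For rough context, those CU counts and clocks imply peak FP32 throughput via the standard RDNA2 formula (64 shaders per CU, 2 FLOPs per clock per shader). These are theoretical ceilings at the listed peak clocks, not benchmarks:

```python
# Peak FP32 throughput implied by the CU/clock table above.
# Each RDNA2 CU has 64 shaders doing 2 FLOPs (one FMA) per clock.
def rdna2_tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000

for name, cus, ghz in [("Series S", 20, 1.565),
                       ("PS5",      36, 2.230),
                       ("Series X", 52, 1.825),
                       ("6800",     60, 2.105),
                       ("6800 XT",  72, 2.250),
                       ("6900 XT",  80, 2.250)]:
    print(f"{name:9s} {rdna2_tflops(cus, ghz):5.2f} TFLOPS")
```

As a sanity check, Series X comes out at ~12.15 TFLOPS, which matches Microsoft's advertised figure, so the formula lines up with the console specs.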


----------



## mtcn77 (Nov 3, 2020)

I knew I read this somewhere. Now I know what mesh shaders do, I guess. It is great: each thread doing a coverage test on a pixel had to be tied to a workgroup. Well, not any more... This frees a big chunk of the pipeline from fixed-unit Z testing. Coupled with the benefit of being cacheable and not ROP-bound, the hardware is now free from small-triangle penalties and graphics work can be ported over to compute. Pretty neat.


----------



## moproblems99 (Nov 3, 2020)

Presuming the benchmarks are accurate, what impresses me most is the DX11 performance. Granted, most of the DX11 games were Frostbite, which has always favored AMD, but they haven't been competitive in DX11 for quite a while.


----------



## Zach_01 (Nov 3, 2020)

Initialised said:


> Back on October 9th you released this: https://www.techpowerup.com/273150/...tpus-own-benchmark-numbers-of-comparable-gpus
> 
> 
> 
> ...


The TPU estimate was based on the AMD FPS numbers shown at the Zen 3 event. Lisa Su said it was a 6800 XT, possibly with unfinished tuning and clock settings.
And you can't compare console SoCs to PC CPU/GPU components. A console's combined CPU+GPU package is completely custom, designed for a console and its constraints on power draw, heat output, and price.


----------



## Minus Infinity (Nov 3, 2020)

6800XT + 5900X will be a nice Xmas present assuming I can get a hold of them.


----------



## InVasMani (Nov 3, 2020)

I think the Zen 3 5600X is good value, as is the RX 6800. That said, the Zen 3 5800X could be the optimal gaming CPU: a bit higher base/peak clock than the 5600X plus two additional cores, though a tougher pill to swallow price-wise on a limited budget. These days I'd probably opt for that rather than stepping up to an RX 6800 XT; personally, I'm finding CPU core count and base frequency more and more appealing from an overall system standpoint. I think I'd get more mileage in the long run, plus the RX 6800 is great value for RDNA2, judging from what I've seen thus far.


----------



## lexluthermiester (Nov 3, 2020)

Minus Infinity said:


> 6800XT + 5900X will be a nice Xmas present assuming I can get a hold of them.


That would be a very nice combo.


----------



## r.h.p (Nov 3, 2020)

medi01 said:


> Which doesn't seem to be to need brute-force path tracing anyway:
> 
> 
> 
> ...


----------



## turbogear (Nov 3, 2020)

@btarunr

Thanks a lot for sharing the results.
Looks very promising.

Waiting for review from TPU especially for 6800XT to decide if that is my next GPU.  
I would be interested to see how the performance is with Zen 2 without SAM.

When Nvidia launched their new RTX generation, I thought it would be really tough for AMD to match, but looking at the benchmarks so far, it is really impressive to see AMD catching up to Nvidia, at least in non-DXR workloads.

Let's hope these will be available in larger quantities on release day, and not sold out within minutes, so one actually has the option to read reviews before buying instead of watching the cards disappear from the online stores while doing so.


----------



## TumbleGeorge (Nov 3, 2020)

Xex360 said:


> games need more polygons and way better textures.


According to the GPU database on TechPowerUp, the RX 6900 XT has more pixel performance and more texel performance than the RTX 3090:
Pixel rate: 280 vs. 190 GPixel/s
Texture rate: 720 vs. 556 GTexel/s.
LoL. AMD is more future-proof for long-term use!
PS. The RX 6800 XT also beats the RTX 3090 if we rely only on a comparison of these numbers.
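Those database figures are just ROP/TMU counts multiplied by boost clock. A sketch, with unit counts and clocks taken from TPU's listings at the time (which may differ from final shipping clocks):

```python
# Fillrate figures like the ones quoted above follow directly from
# ROP/TMU counts times the boost clock (specs assumed from TPU's database).
def fillrates(rops, tmus, boost_ghz):
    """Returns (pixel rate in GPixel/s, texture rate in GTexel/s)."""
    return rops * boost_ghz, tmus * boost_ghz

print(fillrates(112, 328, 1.695))   # RTX 3090: 112 ROPs, 328 TMUs
print(fillrates(128, 320, 2.250))   # RX 6900 XT: 128 ROPs, 320 TMUs
```

This reproduces roughly 190/556 for the 3090 and 288/720 for the 6900 XT at a 2.25 GHz boost; the 280 quoted above presumably corresponds to a slightly lower listed clock.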


----------



## ratirt (Nov 3, 2020)

TumbleGeorge said:


> According to GPU database in Techpowerus RX 6900 XT has more pixel performance and more texel performance than RTX 3090.
> PP 280 vs 190
> TP 720 vs 556.
> LoL. AMD is more future proof for long-term use!
> PS. RX 6800 XT also is better than RTX 3090 if we rely only on a comparison of these numbers.


I know what you are trying to say here, but these cards are different; they should not be compared 1:1 on raw hardware numbers.


----------



## TumbleGeorge (Nov 3, 2020)

ratirt said:


> I know what you are trying to say here but these cards are different. These should not be compared 1 to 1 considering the hardware.


Those numbers will only show how the cards compare at first; they say nothing about how long the cards will stay relevant. I think that even if AMD's cards don't show a big advantage in the first reviews, in the future they will perform even better compared to the competing models from Nvidia's 30-series.


----------



## EarthDog (Nov 3, 2020)

TumbleGeorge said:


> This will show results only for time when it's will be compared for first, not related for term of how long in time cards will be relevant in the future. I think that AMD cards even if they don't show a big advantage in first reviews, in the future they will perform even better compared to the competing models from Nvidia's 30* series.


??? 

Fine wine? A couple percent uptick overall in a title or two? I wouldn't hold my breath for that. And the numbers you quoted don't support your conclusion.


----------



## TumbleGeorge (Nov 3, 2020)

EarthDog said:


> ???
> 
> Fine wine? A couple % uptick overall more in a title or two? I wouldn't hold my breath for that. And those numbers you quoted don't add up to your conclusion.


All will be clear in the future. At the moment we can only guess, based on the characteristics we know now, how things will develop. It is not possible to present facts that have not yet happened.


----------



## EarthDog (Nov 3, 2020)

TumbleGeorge said:


> All be clear in future. Ат the moment we can only guess, based on the characteristics we know at the moment, how things will develop in the future. It is not possible to present facts that have not yet happened.


I'm glad you understand that concept... apply it.


----------



## Zach_01 (Nov 3, 2020)

TumbleGeorge said:


> According to GPU database in Techpowerus RX 6900 XT has more pixel performance and more texel performance than RTX 3090.
> PP 280 vs 190
> TP 720 vs 556.
> LoL. AMD is more future proof for long-term use!
> PS. RX 6800 XT also is better than RTX 3090 if we rely only on a comparison of these numbers.


As you might have figured out already, those numbers tell absolutely nothing about the actual performance of a card. Same with TFLOPS; they're just for reference. Raw fillrates, compute power, and VRAM bandwidth cannot be directly compared between GPUs of different architectures, not even GPUs from the same brand.
And you can't predict the future performance gains or losses of a GPU against another product either; the factors involved are far too many.


----------



## TumbleGeorge (Nov 3, 2020)

EarthDog said:


> Im glad you understand that concept... apply it.


Hmm, next factor: Nvidia's shortcomings with VRAM size (partially excluding only the RTX 3090; it applies to all the other models: 3080 10 GB, 3070 8 GB, 3060 Ti?):


First:


> AMD will support all ray tracing titles using industry-based standards, including the Microsoft DXR API and the upcoming Vulkan raytracing API. Games making use of proprietary raytracing APIs and extensions will not be supported.
> — AMD Marketing
> .....
> AMD has made a commitment to stick to industry standards, such as Microsoft DXR or Vulcan ray tracing APIs. Both should slowly become more popular, as the focus goes away from NVIDIA’s implementation. After all, Intel will support DirectX DXR as well, so developers will have even less reason to focus on NVIDIA’s


Second:



> Interestingly, Keith Lee revealed that in order to support 4X x 4X UltraHD textures a 12GB VRAM is required. This means that Radeon RX 6000 series, which all feature 16GB GDDR6 memory along with 128MB Infinity Cache should have no issues delivering such high-resolution textures. It may also mean that the NVIDIA GeForce RTX 3080 graphics card, which only has 10GB of VRAM, will not be enough


Links are below "First & Second"!


----------



## EarthDog (Nov 3, 2020)

TumbleGeorge said:


> Hmm,


NV uses DXR, same as AMD.....

10GB may fall short at 4K in a few years... but by then, you'll want another GPU anyway. Even DOOM on nightmare doesn't eclipse 10GB @ 4K.



Zach_01 said:


> As you might figured out already, those numbers are telling absolutely nothing about actual performance of a card. Same with TFLOPS. Its just for reference. Raw fillrates, computing power and VRAM bandwidth cannot be directly comparable between different architecture GPUs. Not even if GPUs are made under the same brand.
> And you cant predict either the future performance gains or losses of a GPU against another product as the factors related are far too many.


I'm giving up.


----------



## BoboOOZ (Nov 3, 2020)

Cheeseball said:


> The RX 5700 XT didn't really overclock great (2,000+ MHz only yielded at most 10 FPS with most models) as well, but we'll see how the 6800 XT works out.


The 5700 XT OC'd pretty well (it went up in frequency), but the gains were small because it was already memory-bandwidth starved. Here, the memory architecture has been completely overhauled, and at least the Infinity Cache should scale in speed with the core while overclocking, so it should be quite interesting to see...


----------



## medi01 (Nov 3, 2020)

lexluthermiester said:


> That's an assumption on your part and not a very logical one, especially considering that NVidia has already had 2 years to gain a lead in both deployment and development of RTRT.


It's a very logical assumption, given who commands console market (and situation in the upcoming GPUs too).

The more likely scenario, though, is that in this form (brute-force path tracing) it will never take off.



turbogear said:


> is catching up



Smaller chips, lower power consumption, slower (and cheaper) VRAM and more of it, at a lower price and better perf/$ than the competition.
Catching up, eh?


----------



## lexluthermiester (Nov 3, 2020)

medi01 said:


> It's a very logical assumption, given who commands console market (and situation in the upcoming GPUs too).


Oh, do help us all understand your point in more detail...


----------



## Punkenjoy (Nov 3, 2020)

One of the reasons for AMD's "fine wine" is simply that AMD took more time to polish their drivers, because they have far fewer resources than Nvidia.

Another is that GCN's balance between fillrate/texture rate and compute leaned a bit more toward compute, while Nvidia focused a bit more on fillrate.

Each generation of games shifted load from fillrate to compute by using more and more compute power, leaving AMD GPUs in a better position, but not really by enough to make a card last much longer. Also, low-end cards were outclassed anyway, while high-end cards were bought by people with money who would probably replace them as soon as it made sense.

It looks like AMD with Navi went to a more balanced setup, while Nvidia is going down the heavy-compute path. We will see in the future which balance is better, but right now it's too early to tell.

So in the end, it doesn't really matter. A good strategy is to buy a PC at a price that lets you afford another one at the same price in 3-4 years, and you will always be in good shape. If paying $500 for a card every 3-4 years is too much, buy something cheaper and that's it.

There's a good chance that in 4 years that $500 card will be beaten by a $250 card anyway, even more so once GPUs move to chiplet designs, which will drive a good increase in performance.


----------



## medi01 (Nov 3, 2020)

lexluthermiester said:


> Oh, do help us all understand your point in more detail...


Strange, talking about yourself in the plural. Are you seriously asking why anyone would optimize games for the lion's share of the market?


----------



## lexluthermiester (Nov 3, 2020)

medi01 said:


> Stranger talking about self in plural, you are seriously asking why anyone would optimize games for the LION's share of the market?


Then why aren't you? Hmm? Perhaps because you know both that there is a counter argument and that such an argument is perfectly valid. It's as valid now as it has been since the Console VS PC debate began.


----------



## moproblems99 (Nov 3, 2020)

medi01 said:


> It's a very logical assumption, given who commands console market (and situation in the upcoming GPUs too).



Considering they had the consoles last generation as well, how did that whole "optimizing for AMD architecture" go?


----------



## TumbleGeorge (Nov 3, 2020)

moproblems99 said:


> Considering they had the consoles last generation as well, how did that whole "optimizing for AMD architecture" go?


The explanation is extremely easy. In the past, AMD wasn't ready to take advantage of the fact that the old consoles' hardware used components it had developed. Now they can, and they do!


----------



## mtcn77 (Nov 3, 2020)

TumbleGeorge said:


> The explanation is extremely easy. In the past, AMDs were not ready to take advantage of the fact that the hardware of the old consoles had components developed by them. However, now they can and do!


It is the opposite, imo. After they wrote the Radeon profiler, they found out about the intrinsic limits of the hardware.
Yes, the scheduler was flexible, as announced at launch, but instruction reordering does not necessarily deliver the full extent of its performance. IPC was 0.25, and now that it is 1, that's a lot in comparison. They now have all these baked-in instructions doing the intrinsic tuning for them in hardware. The ISA has moved a great deal away from where GCN was. Plus, there is the mesh shader, which hands the triangle-pixel-size vs. wavefront-thread cost off to the hardware. Performance really suffered with triangles under 64 pixels in area. Not any more.


----------



## medi01 (Nov 3, 2020)

moproblems99 said:


> Considering they had the consoles last generation as well, how did that whole "optimizing for AMD architecture" go?



Oh, that is easy, my friend.
*EPIC on UE4 "it was optimized for NVidia GPUs".
EPIC today, demoes UE5 on RDNA2 chip running on the weaker of the two next gen consoles, spits on Huang's RT altogether, even though it is supported even in UE4.*










There is more fun to come.

A recent demo of the XSX vs. the 3080 was dismissed by a greenboi as "merely 2080 Ti levels".
That is where next-gen consoles are: above 98-99% of the PC GPU market.



lexluthermiester said:


> Then why aren't you?


It was a rhetorical question.


----------



## moproblems99 (Nov 3, 2020)

@medi01, no idea what you just said.


----------



## mtcn77 (Nov 3, 2020)

moproblems99 said:


> @medi01 , no idea what you just said.


Yeah, me neither. An overview would be so nice. RDNA2 ftw, you were saying?


----------



## TheoneandonlyMrK (Nov 3, 2020)

moproblems99 said:


> Considering they had the consoles last generation as well, how did that whole "optimizing for AMD architecture" go?


So consider: is GPU PhysX big in games? What about CUDA, is that big in games? Because DirectCompute is, as is tessellation, and an RX 580 will meet at least the minimum specs of any game released since its birth.
Did Nvidia bring more performance at times? Yes, of course, but that doesn't preclude AMD having good support for their features.
And GCN looked pretty effing capable until a few years after the last-gen consoles came out, around the Maxwell era, no?


----------



## mtcn77 (Nov 3, 2020)

theoneandonlymrk said:


> And GCN looked pretty effing capable until a few years after the last-gen consoles came out, around the Maxwell era, no?


To be honest, not really. I can recall when Radeon technology presentations used to end with "you can access me personally for the last 10% of custom hardware tuning, so I get to keep my job as well as you do yours", back when it was still ATi.
They never worked perfectly out of the box.


----------



## TheoneandonlyMrK (Nov 3, 2020)

mtcn77 said:


> To be honest, not really. I can recall the times radeon technology presentations used to end with, "you can access me personally for the last 10% custom hardware tuning, so I get to keep my job as well as yourself" when it still was ATi.
> They never worked perfect out of the box.


I disagree, I think. And wtaf does ATi have to do with anything in this debate or the last console generation? It's fine to disagree, but try to stay relevant.

What the actual ffff, so I did something special over the last few AMD generations without even knowing it? Tune what?!

They didn't end like that in my country. Source, please.


----------



## moproblems99 (Nov 3, 2020)

theoneandonlymrk said:


> So consider, is GPU physx big in game's ,what about Cuda ,is that big in game's because direct compute is, as is tesselation, an Rx580 will meet the minimum specs at least of any game released since it's birth.
> Did Nvidia bring more performance at times, yes of course but that doesn't preclude AMD having good support for their features.
> And GCN looked pretty effing capable until afew years after last gen consoles came out, about the Maxwell era no?.



My point was that AMD had the last-gen consoles too, and games are no more optimized for them now than they were before.

However, AMD now appears to have a winner so most of this is moot.


----------



## TheoneandonlyMrK (Nov 3, 2020)

moproblems99 said:


> My point was that AMD had last gen consoles and games are no more optimized for them now then they were before.
> 
> However, AMD now appears to have a winner so most of this is moot.


So stop adding irrelevant bits. I got your main point and still disagree.
Every new game released worked here, and performance got optimised over a few months via drivers; I can prove it via AMD driver release notes.
Can you provide any proof your opinion is right?


----------



## mtcn77 (Nov 3, 2020)

theoneandonlymrk said:


> They didn't end like that in my country, source please.


 such distrust over something so little...


----------



## TheoneandonlyMrK (Nov 3, 2020)

mtcn77 said:


> such distrust over something so little...


Distrust? I get fact-checked often; why shouldn't you?
And I have watched every GPU release since GPUs were a thing, and I don't recall AMD saying what you accuse them of saying every time, or, more importantly, even once.


----------



## mtcn77 (Nov 3, 2020)

theoneandonlymrk said:


> Distrust, I am fact checked often why not you.
> And I watched every GPU release done since GPU were a thing and don't recall AMD saying what you accuse them of saying every time?! Or more importantly even once.


It was a developer session. You are literally throwing me into the haystack with what you did there. I'm an astroturfer for the sake of the argument. It is your duty to prove anything...


----------



## TheoneandonlyMrK (Nov 3, 2020)

mtcn77 said:


> It was a developer session. You are literally throwing me into the hay sack with what you did there. I'm an astroturfer for sake of the argument. It is your duty to prove anything...





mtcn77 said:


> To be honest, not really. I can recall the times radeon technology presentations used to end with, "you can access me personally for the last 10% custom hardware tuning, so I get to keep my job as well as yourself" when it still was ATi.
> They never worked perfect out of the box.


A developer session told you that you had to custom-tune your hardware to get 10%?
I'm laughing; I'll leave it at that.


----------



## moproblems99 (Nov 3, 2020)

theoneandonlymrk said:


> Can you provide any proof your opinion is right?



Yes, let me quote it:



theoneandonlymrk said:


> performance got optimised over a few months via driver, and I can prove it via AMD driver release notes



There you go. AMD did the optimizing via drivers. Devs did the same ol' shit they used to do and continue to do. I mean shit, everybody is acting like I work for Nvidia when I have an all-AMD rig.


----------



## lexluthermiester (Nov 3, 2020)

moproblems99 said:


> @medi01 , no idea what you just said.


That's because it was nonsense.


----------



## mtcn77 (Nov 3, 2020)

theoneandonlymrk said:


> A developer session told you you had to custom tune your hardware to get 10%.
> I'm laughing I'll leave it at that.


To get from 90% to 100. It was a general session.


----------



## TheoneandonlyMrK (Nov 3, 2020)

moproblems99 said:


> Yes, let me quote it:
> 
> 
> 
> There you go.  AMD did the optimizing via drivers.  Devs did the same ol shit they used to do and continue to do.  I mean shit, everybody is acting like I work for nvidia when I have an all AMD rig.


It was you saying they didn't optimise; you seem to have swapped arguments.



lexluthermiester said:


> That's because it was nonsense.


A lot of this talk is.


----------



## lexluthermiester (Nov 3, 2020)

theoneandonlymrk said:


> A lot of this talk is.


Yeah, pretty much. Why can't people just get along? I find it refreshing that NVidia pulled out all the stops and yet AMD has caught up in just two GPU generations. It's a great time to be in the tech/PC industry!


----------



## mtcn77 (Nov 3, 2020)

I like it when exclusive brands pull out the openness card at the first sign of a setback.
I cannot shed any tears, sorry. This is a very hard-fought achievement. Every last bit of AMD went into making Radeon right after the RTG situation. Good thing it was stopped before more damage could be incurred on the firm; RTG was losing money just about then.


----------



## moproblems99 (Nov 3, 2020)

theoneandonlymrk said:


> It was you saying they didn't optimise, you seem to have swapped argument.



What are you on about?  The discussion was clearly about *developers* not optimizing games for AMD's architecture.  Did you read any of the other posts, or just assume that I am pissing on AMD?

Sometimes I wonder why I bother with threads that have anything to do with Intel, AMD, or nVidia.


----------



## TheoneandonlyMrK (Nov 3, 2020)

moproblems99 said:


> What are you on about?  The discussion was clearly about *developers* not optimizing games for AMD's architecture.  Did you read any of the other posts or just assume that I am pissing on
> AMD?
> 
> Sometimes I wonder why I bother with threads that have anything to do with Intel, AMD, or nVidia.


I am aware, and AMD too has people they can put in place to help game and engine devs optimise their software.
I'm arguing they do.

Oh, and your insinuations: stick them you know where, yeah. I read your posts, and I still chose not to call you names or insinuate anything.


----------



## mtcn77 (Nov 3, 2020)

moproblems99 said:


> Sometimes I wonder why I bother with threads that have anything to do with Intel, AMD, or nVidia.


In the past, there were shills that would attack you on the moderators' watch at OCN. It was a learning experience; I developed so many false-premise engagement vectors there. Now I stick to astroturfing. It is so elegant, people don't know what hit them. It is so good to confuse people from outside their perspective; they don't know whether it is real, or whether to care. It literally leaves them searching for things.

Genuine or not, I cannot help but feel sympathy for the people sharing their concern and good wishes for the well being of the red team. I'm impressed.


----------



## medi01 (Nov 4, 2020)

moproblems99 said:


> @medi01 , no idea what you just said.


You realize it is cruel towards team green, right?
Let me expand it:

EPIC, the company behind the major game development framework Unreal Engine, when asked why Unreal Engine 4 games ran so poorly on AMD, bluntly said "it was optimized for NVidia GPUs". The UE4 demo happened, as you would guess, on the fastest NVidia GPU, because that was all that mattered.

In 2020, EPIC demoed Unreal Engine 5.
The demo took place on... the PS5, a next-gen console with an RDNA2+Zen APU inside.
Because NV is no longer as important as it was; in fact, it's AMD who is important now.
Note that this was even before the RX 6000 series was rolled out into green faces.

And to make it even more embarrassing for team green, in a very impressive demo loaded with lighting effects, no Nvidia-style RT was used.










More to it: the RDNA2-based APUs in the upcoming consoles are mighty, beating nearly the entire PC market, with only 2080/2080 Ti-level GPUs being faster than them.


----------



## BorgOvermind (Nov 5, 2020)

This perfectly explains it:









The battle will not be about the highest benchmark bar but about everything else.
The quality of the product, and the extra features that exist (or not) on one side or the other, will make each side's graphics cards good for one thing or another.

Overall, I expect a general "equalization" of brute computing power now that both sides have had a lot of time to study what the other did and come up with something like it.

I'd like to see some mining scores in the meantime; they can be relevant for quite a few things.

@*medi01*

Most PC games I played between the start of 2018 and mid-2019 were UE-based.
I like that engine a lot and I hope it keeps up.


----------



## BoboOOZ (Nov 5, 2020)

BorgOvermind said:


> The battle will not be about the highest benchmark bar but about everything else.
> Quality of the product and the extra features that exist or not on one side or the other will make the use of the GFx cards of one side or the other good for one thing or the other.


That's highly debatable, I personally will still buy based on price/perf. Of course, if Nvidia loses the performance crown, it's expected that they will try to move the fight to some other zone where they can still win.

But I would argue that the most important feature in the next 6 months will be AVAILABILITY


----------



## EarthDog (Nov 5, 2020)

BoboOOZ said:


> But I would argue that the most important feature in the next 6 months will be AVAILABILITY


GL getting an AIB AMD card before the end of the year.


----------



## cueman (Nov 29, 2020)

It looks clear that the RTX 3090, and later the 3080 Ti, will likely remain the fastest GPUs from 2020 until the next generation arrives in 2022, and I mean for 4K gaming, for sure.

It's odd that the performance difference between the RX 6800 and 6800 XT is so small, arguably even smaller than between the RTX 3080 and RTX 3090, though that difference isn't big either. And even in AMD's own tests, the RX 6900 XT doesn't seem to help much in that battle; not enough, it looks.

Well, we'll see on December 8.

Anyway, those five GPUs (RX 6800 XT, RX 6900 XT, RTX 3080, RTX 3090 and RTX 3080 Ti) are made for, and really only meant for, 4K gaming, because at lower resolutions there are far better-value GPUs at much lower prices.

Below 4K, the battle is RTX 3070 Ti vs. RX 6800? Hmm, I expect the prices will line up the same way.

I think by January 2021 we'll see the complete podium for the fastest GPU, once the RTX 3080 Ti arrives too, so all the GPUs will be on the table, including the RTX 3070 Ti.

Let's see!

Hmm, I can only imagine what kinds of monsters the next GPU generation will be, and I mean RDNA3 and Nvidia Hopper. 35-50% more? Where do we need it? 95% of players use sub-4K monitors.

Also, where is Intel's Xe GPU? We want a three-way GPU battle; even more competition!

Also: March 2021 is the month we'll finally see what the best CPU for running games is, AMD Vermeer or Intel Rocket Lake. Hmm, from what I've seen and heard, Rocket Lake is the one; Rocket Lake is the answer to AMD's Vermeer CPUs.

I'm also very much waiting for Intel Alder Lake in June 2021: 10 nm and a hybrid CPU, a new age of CPU performance and efficiency... well, everyone keeps saying that.
Let's see.

Yes, interesting!!


----------

