
AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

What the hell are you on about?

I don't think he understands it's easy to get high refresh rates these days. I am playing Dragon's Dogma maxed out at 144 FPS at 1440p with a GTX 1070... lol
 
It means that to get >60 FPS in a single-player game with your GPU, you are mostly playing with low settings at 4K.
Or you never actually play any games and just decided that everyone needs high FPS.

I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single-player games? There is absolutely no logic to that. You think everyone who has a higher refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single-player one suddenly they no longer need more than precisely 60? How the hell does that work, truly mind-boggling.
 
I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single-player games? There is absolutely no logic to that. You think someone who has a higher refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single-player one suddenly they no longer need more than precisely 60? How the hell does that work, truly mind-boggling.

60 FPS is enough to enjoy a single-player game at the best visual fidelity you can get.
I bought the very first 144 Hz 1440p display, but I played The Witcher 3 at 60 FPS with the best visuals; there's no point in downgrading visuals just to get >60 FPS.
Same with Metro Exodus and Control: why would I sacrifice visuals when I already have ~60 FPS?

Now, switching to competitive games like PUBG, Modern Warfare, and Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?
 
no point in downgrading visuals just to get >60 FPS.

Still can't figure out that this is a purely subjective conclusion?
 
Still can't figure out that this is a purely subjective conclusion?

Tell me which editorial would recommend downgrading visuals to get >60 FPS? You?
Because I can give you many editorials that target 60 FPS gaming.
 
What is it with people and ray tracing suddenly? Since it was announced and until AMD showed the new cards, NOBODY was talking about ray tracing, and now?

Just be glad for the fricking competition, it's a win for all of us customers. Nvidia and AMD really don't care about us, they just wanna make as much money as possible.
 
How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR then? That would make it easier to gauge RX 6000 RT capability.

That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???

The RTX you refer to is nVidia's proprietary implementation of Microsoft's DirectX raytracing. How (and why) do you expect AMD hardware to perform better than nvidia hardware in ®RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of raytracing, and only "some" of them will ship with ®RTX support alongside the console version of raytracing.

And based on how well AMD's raytracing looks and performs, nvidia's ®RTX may (in time) become a niche feature for select sponsored games... To keep ®RTX relevant, nvidia will have to invest more than it's worth on hardware and software (game dev partners) development, and given nvidia's new focus on enterprise, it may be a hard sell (to investors).
 
Tell me which editorial would recommend downgrading visuals to get >60 FPS? You?

I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high refresh monitor, to then play at 60 Hz, right?
 
Now, switching to competitive games like PUBG, Modern Warfare, and Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?

Tell me which editorial would recommend downgrading visuals to get >60 FPS? You?

I am honestly very confused as to what it is you want or are trying to imply.


Different players have different preferences when gaming.
 
There is real RT (MS DXR) and Nvidia RT. Real RT will be in 100% of games; Nvidia RT in no more than 10% of that 100%. I think there will be no games that exclusively support Nvidia RT.
 
That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???

The RTX you refer to is nVidia's proprietary implementation of Microsoft's DirectX raytracing. How (and why) do you expect AMD hardware to perform better than nvidia hardware in ®RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of raytracing, and only "some" of them will ship with ®RTX support alongside the console version of raytracing.

And based on how well AMD's raytracing looks and performs, nvidia's ®RTX may (in time) become a niche feature for select sponsored games... To keep ®RTX relevant, nvidia will have to invest more than it's worth on hardware and software (game dev partners) development, and given nvidia's new focus on enterprise, it may be a hard sell (to investors).
This sounds like a typical AMD excuse for why their card sucks at what was/is a standard.
 
Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye-candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn video card. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man, it looks amazing!
Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?

I don't agree, I think the reflections in Watch Dogs look pretty dang impressive. Sad it's all so heavy, so a truly ray-traced future is still several gens out for sure, but look at Digital Foundry's latest vid on it.
 
I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high refresh monitor, to then play at 60 Hz, right?

Sure, just tell me which games you play exactly? CSGO? YouTube videos?
Almost everyone recommends turning down visuals in AAA games to hit 144 Hz at 1440p? Yeah, I really need some confirmation on that. No one would want to play AAA games with low settings just to hit 144 Hz, that I'm sure of.

I didn't say anyone should play at 60 FPS. If you have already maxed out all the graphical settings and are still getting >60 FPS, then play at >60 FPS, although capping the framerate really helps with input latency with Nvidia Low Latency and AMD Anti-Lag in certain games.
 
This sounds like a typical AMD excuse for why their card sucks at what was/is a standard.

I think you do not know what "standard" means or how it's applied.
 
Dragon's Dogma
This doesn't explain it to you? The title? Monkeys with crayons can draw the scenes fast enough, lol.

From 2016: "Given its old-gen nature, Dragon’s Dogma: Dark Arisen is not really a demanding title."


Just saying. ;)


That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???
It's nothing like it, really. AMD, like NV, uses DXR. They're both using the same API for RT.


And based on how well AMD's raytracing looks and performs
Was anything official released on AMD RT performance?

RTX is hardware on the card. NV cards use the DXR API for RT, just as AMD's will.
 
Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye-candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn video card. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man, it looks amazing!
Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?

I am similarly excited about RT.
It's just fanboys shouting at fanboys at this point. These same people would have been the ones mocking the GeForce 256 back in the day about HW T&L; pay them no heed.

We're just in that awkward phase now where DXR is still an unknown for most people, and we still don't know for sure whether this year's or maybe the next cycle is the one that will bring mainstream acceptance/performance to RT. I personally am not aware of any non-DXR games, though I do believe those Nvidia-developed ones like Quake II RTX are probably going to be Nvidia hardware only. I doubt games like Control aren't going to work on AMD; I suspect it's just AMD's software side of things not being ready enough yet. I would expect a lot of growing pains for the first half of 2021 and AMD DXR. Hopefully I'm wrong, but they are going into this dealing with a two-year handicap.

New games will have cross-brand hardware to work with soon, and as someone with a 2070 Super, all I can say is the DXR game library is still veeery small, and it's only really going to grow now with the new consoles, since more or less all cross-platform AAA titles will be coming with some form of RT once this first cross-platform year of releases is over (and some of those cross-platform titles are already coming with RT anyway). So it bodes well overall for us mid to long term, regardless of hardware brand choices or, god forbid, loyalties.
 
You mean the DXR standard as opposed to Nvidia's proprietary RT
Whose proprietary RT? Nvidia uses DXR as well...

When DXR is enabled by a Game Ready Driver, targeted for April (2019), the supported GeForce GTX graphics cards will work without game updates because ray-traced games are built on DirectX 12’s DirectX Raytracing API, DXR. This industry standard API uses compute-like ray tracing workloads that are compatible with both dedicated hardware units, such as the RT Cores, and the GPU’s general purpose shader cores.
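To make the "industry standard API" point concrete, here's a minimal sketch of how an engine asks the DirectX 12 runtime whether ray tracing is available. This is my own illustration, not from NVIDIA's statement above, and it assumes Windows 10 with the D3D12 SDK headers and linking against d3d12.lib. The exact same query works on GeForce and Radeon alike; whether the rays end up on dedicated RT cores or on general-purpose shader cores is a driver/hardware detail hidden below this API.

#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main()
{
    // Create a device on the default adapter at feature level 12.0.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No DirectX 12 capable adapter found.\n");
        return 1;
    }

    // Ask the runtime which ray tracing tier the driver/hardware exposes.
    // D3D12_RAYTRACING_TIER_1_0 is the baseline DXR feature set.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts, sizeof(opts)))
        && opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("DXR (tier 1.0 or higher) is supported on this adapter.\n");
    }
    else
    {
        std::printf("DXR is not supported by this adapter/driver combination.\n");
    }

    device->Release();
    return 0;
}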
 
I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
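To put actual numbers on that, using the MSRPs and the 10-15% figure above: $579 / $499 ≈ 1.16, so the 6800 costs about 16% more while delivering roughly 1.10-1.15x the performance, and 1.10 / 1.16 ≈ 0.95 to 1.15 / 1.16 ≈ 0.99, which works out to about 95-99% of the 3070's performance per dollar at MSRP.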
 
60 FPS is enough to enjoy a single-player game at the best visual fidelity you can get.
I bought the very first 144 Hz 1440p display, but I played The Witcher 3 at 60 FPS with the best visuals; there's no point in downgrading visuals just to get >60 FPS.
Same with Metro Exodus and Control: why would I sacrifice visuals when I already have ~60 FPS?

Now, switching to competitive games like PUBG, Modern Warfare, and Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?

I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it lets you aim more easily. To keep it simple, in The Division 2 with an automatic rifle (900+ rounds) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, given the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Betamax was better than VHS, but VHS is what the consumer market adopted, and after about five years you could not find Beta anywhere in popular culture. Which brings me to my last point... You don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs though. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.
 
I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it lets you aim more easily. To keep it simple, in The Division 2 with an automatic rifle (900+ rounds) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, given the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Betamax was better than VHS, but VHS is what the consumer market adopted, and after about five years you could not find Beta anywhere in popular culture. Which brings me to my last point... You don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs though. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.

Do you play the single-player or multiplayer version of The Division 2?
Like I said, for competitive games, like the multiplayer side of Div 2, I would use low settings to get the highest FPS I can.

Now tell me which you prefer with your current GPU:
RDR2: high settings at ~60 FPS, or 144 FPS with low settings
AC O: high settings at ~60 FPS, or 144 FPS with low settings
Horizon Zero Dawn: high settings at ~60 FPS, or 144 FPS with low settings

Well, to be clear, when I said 60 FPS, I meant the minimum FPS.

Yeah, sure, if you count auto-overclocking and a proprietary feature (SAM) that make the 6900 XT look equal to the 3090; see the hypocrisy there? Also, I can find higher benchmark numbers for the 3080/3090 online, so take AMD's numbers with a grain of salt.
 
I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
The 3070 is, at the moment, unicorn breath, like the rest of the Ampere lineup. What you call "impressive" regarding the 3090 becomes idiotic when a $1,500 card only beats an $800 card by 10%.
Oh, and CUDA is no good for gaming, whilst ray tracing kills performance without resorting to DLSS.
Raytracing is today's equivalent of HairWorks or PhysX.
The leather jacket openly lied to Nvidia's consumer base, claiming the 3090 was "Titan-like" when it clearly isn't, and promising plenty of stock for buyers. The reality is that abysmal yields are the reason the Ampere series is almost impossible to come by.
 
The vanilla 6800 is actually looking really strong in the first few of those benchmarks.

It's great that yesterday's $1200 performance is now half price, but what the overwhelming majority have needed for two years is yesterday's $600 performance for $300.

I wonder if they will release a cheaper 8GB version of the 6800?
 
I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
Except the 3070's real price is not $499; since the RTX 2000 series, Nvidia has been selling their cards at much higher prices than the announced ones.
This is fraud, and reviewer sites and channels should warn people and condemn Nvidia for this, but very few do.
 