
AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
I think the Zen 3 5600X is good value, as is the RX 6800. That said, the Zen 3 5800X could be the optimal CPU for gaming: a bit higher base/peak clock over the 5600X plus two additional cores, though a tougher pill to swallow on price if you're on a limited budget. I suppose these days I'd probably opt for that rather than stepping up to an RX 6800 XT; personally I'm finding CPU core count and base frequency more and more appealing from an overall system standpoint. I think I'd get more mileage in the long run, plus the RX 6800 is great value for RDNA2 judging from what I've seen thus far.
 
Joined
Jun 18, 2015
Messages
341 (0.10/day)
Location
Perth , West Australia
System Name schweinestalle
Processor AMD Ryzen 7 3700 X
Motherboard Asus Prime - Pro X 570 + Asus PCI -E AC68 Dual Band Wi-Fi Adapter
Cooling Standard Air
Memory Kingston HyperX 2 x 16 gb DDR 4 3200mhz
Video Card(s) AMD Radeon 5700 XT 8 GB Strix
Storage Intel SSD 240 gb Speed Demon & WD 240 SSD Blue & WD 250 SSD & WD Green 500gb SSD & Seagate 1 TB Sata
Display(s) Asus XG 32 V ROG
Case Corsair AIR ATX
Audio Device(s) Realtech standard
Power Supply Corsair 850 Modular
Mouse CM Havoc
Keyboard Corsair Cherry Mechanical
Software Win 10
Benchmark Scores Unigine_Superposition 4K ultra 7582
Which doesn't seem to need brute-force path tracing anyway:



Besides, I wouldn't be surprised if the "Infinity Cache" inside RDNA2 lets it spank Ampere even on that (rather useless, for now and the foreseeable future) front.
:rockout:
 
Joined
Jan 6, 2014
Messages
599 (0.15/day)
Location
Germany
System Name Main Machine
Processor Intel i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex
Cooling Water cooling, 2x EK-DDC 3.2 PWM, 1x360mm+1x240mm+1x120mm EK, Mora 360 Pro, EK-Quantum Velocity 2
Memory G.SKILL 32GB DDR5-7200, 7200J3445G16GX2-TZ5RS
Video Card(s) ASRock RX 7900 XTX Aqua
Storage 2x WD_BLACK SN850X 1TB und 2TB, 2x8TB Seagate Ironwolf
Display(s) ASUS ROG Strix XG27WQ 27inch 165Hz FreeSync Premium Pro
Case Cooler Master COSMOS C700P
Audio Device(s) Turtle Beach Elite Pro Tournament + Elite Pro TAC
Power Supply Corsair AX1600i 1600W Titanium
Mouse Logitech G903 LIGHTSPEED Wireless
Keyboard ROCCAT Ryos MK Pro
Software Win 11
@btarunr

Thanks a lot for sharing the results.
Looks very promising.

Waiting for review from TPU especially for 6800XT to decide if that is my next GPU. :rolleyes:
I would be interested to see how the performance is with Zen 2 without SAM.

When Nvidia launched their new RTX generation I thought it would be really tough for AMD to match, but looking at the benchmarks so far it is really impressive to see AMD catching up to Nvidia, at least in non-DXR titles. :)

Let's hope these will be available in larger quantities on release day and not sold out within minutes, giving one the option to buy after reading reviews instead of watching them disappear from the online stores while still reading. :roll:
 
Joined
Sep 1, 2020
Messages
2,353 (1.52/day)
Location
Bulgaria
games need more polygons and way better textures.
According to the GPU database on TechPowerUp, the RX 6900 XT has more pixel performance and more texel performance than the RTX 3090:
Pixel fillrate: 280 vs 190 GPixel/s
Texel fillrate: 720 vs 556 GTexel/s
LoL. AMD is more future-proof for long-term use!
PS. The RX 6800 XT is also better than the RTX 3090 if we rely only on a comparison of these numbers.
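Those database numbers are just functional units × clock, by the way, so anyone can sanity-check them. A quick sketch (the unit counts and boost clocks below are what I'm assuming from the spec sheets; real cards boost differently):

```python
# Theoretical fillrate is simply functional units x clock speed.
# ROP/TMU counts and boost clocks below are assumed spec-sheet values.

def gpixels(rops, clock_mhz):
    """Peak pixel fillrate in GPixel/s."""
    return rops * clock_mhz / 1000

def gtexels(tmus, clock_mhz):
    """Peak texture fillrate in GTexel/s."""
    return tmus * clock_mhz / 1000

cards = {
    "RX 6900 XT": {"rops": 128, "tmus": 320, "clock_mhz": 2250},
    "RTX 3090":   {"rops": 112, "tmus": 328, "clock_mhz": 1695},
}

for name, c in cards.items():
    print(f'{name}: {gpixels(c["rops"], c["clock_mhz"]):.0f} GPixel/s, '
          f'{gtexels(c["tmus"], c["clock_mhz"]):.0f} GTexel/s')
# RX 6900 XT: 288 GPixel/s, 720 GTexel/s
# RTX 3090: 190 GPixel/s, 556 GTexel/s
```

Which lands on the same figures as the database, but they're theoretical peaks, not measured performance.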
 
Joined
May 31, 2016
Messages
4,437 (1.43/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
According to the GPU database on TechPowerUp, the RX 6900 XT has more pixel performance and more texel performance than the RTX 3090:
Pixel fillrate: 280 vs 190 GPixel/s
Texel fillrate: 720 vs 556 GTexel/s
LoL. AMD is more future-proof for long-term use!
PS. The RX 6800 XT is also better than the RTX 3090 if we rely only on a comparison of these numbers.
I know what you are trying to say here but these cards are different. These should not be compared 1 to 1 considering the hardware.
 
Joined
Sep 1, 2020
Messages
2,353 (1.52/day)
Location
Bulgaria
I know what you are trying to say here but these cards are different. These should not be compared 1 to 1 considering the hardware.
Those numbers only reflect a first comparison at launch; they say nothing about how long the cards will stay relevant. I think that even if AMD's cards don't show a big advantage in the first reviews, in the future they will perform even better compared to the competing models from Nvidia's 30-series.
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Those numbers only reflect a first comparison at launch; they say nothing about how long the cards will stay relevant. I think that even if AMD's cards don't show a big advantage in the first reviews, in the future they will perform even better compared to the competing models from Nvidia's 30-series.
???

Fine wine? A couple % uptick overall, more in a title or two? I wouldn't hold my breath for that. And those numbers you quoted don't add up to your conclusion.
 
Joined
Sep 1, 2020
Messages
2,353 (1.52/day)
Location
Bulgaria
???

Fine wine? A couple % uptick overall, more in a title or two? I wouldn't hold my breath for that. And those numbers you quoted don't add up to your conclusion.
All will be clear in the future. At the moment we can only guess, based on the characteristics we know right now, how things will develop. It is not possible to present facts that have not yet happened.
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
All will be clear in the future. At the moment we can only guess, based on the characteristics we know right now, how things will develop. It is not possible to present facts that have not yet happened.
I'm glad you understand that concept... apply it. :p
 
Joined
Sep 3, 2019
Messages
3,515 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
According to the GPU database on TechPowerUp, the RX 6900 XT has more pixel performance and more texel performance than the RTX 3090:
Pixel fillrate: 280 vs 190 GPixel/s
Texel fillrate: 720 vs 556 GTexel/s
LoL. AMD is more future-proof for long-term use!
PS. The RX 6800 XT is also better than the RTX 3090 if we rely only on a comparison of these numbers.
As you might have figured out already, those numbers tell you absolutely nothing about the actual performance of a card. Same with TFLOPS; they're just for reference. Raw fillrates, compute throughput and VRAM bandwidth cannot be directly compared between GPUs of different architectures, not even between GPUs made under the same brand.
And you can't predict the future performance gains or losses of a GPU against another product either, as far too many factors are involved.
 
Joined
Sep 1, 2020
Messages
2,353 (1.52/day)
Location
Bulgaria
I'm glad you understand that concept... apply it. :p
Hmm, add to that Nvidia's shortcomings with VRAM size (partially excluding only the RTX 3090; it applies to all the other models: 3080 10GB, 3070 8GB, 3060 Ti 8GB?):


First:
AMD will support all ray tracing titles using industry-based standards, including the Microsoft DXR API and the upcoming Vulkan raytracing API. Games making use of proprietary raytracing APIs and extensions will not be supported.
— AMD Marketing
.....
AMD has made a commitment to stick to industry standards, such as the Microsoft DXR or Vulkan ray tracing APIs. Both should slowly become more popular as the focus moves away from NVIDIA's implementation. After all, Intel will support DirectX DXR as well, so developers will have even less reason to focus on NVIDIA's
Second:

Interestingly, Keith Lee revealed that in order to support 4X x 4X UltraHD textures, 12GB of VRAM is required. This means that the Radeon RX 6000 series cards, which all feature 16GB of GDDR6 memory along with 128MB of Infinity Cache, should have no issues delivering such high-resolution textures. It may also mean that the NVIDIA GeForce RTX 3080 graphics card, which only has 10GB of VRAM, will not be enough
Links are below "First & Second"!
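For what it's worth, the VRAM claim is easy to ball-park. A back-of-envelope sketch (the compression and mip-chain assumptions are my own, not anything from those articles):

```python
# Back-of-envelope VRAM cost of 4K (4096x4096) textures.
# Assumptions: BC7 block compression at 1 byte/texel, uncompressed
# RGBA8 at 4 bytes/texel, and a full mip chain adding roughly 1/3
# on top of the base level.

def texture_mib(size_px, bytes_per_texel, mips=True):
    """Approximate size of one square texture in MiB."""
    base = size_px * size_px * bytes_per_texel
    return base * (4 / 3 if mips else 1) / 2**20

BC7 = 1.0    # bytes per texel, block-compressed
RGBA8 = 4.0  # bytes per texel, uncompressed

print(round(texture_mib(4096, BC7), 1), "MiB per BC7 4K texture")
print(round(texture_mib(4096, RGBA8), 1), "MiB uncompressed")

# A scene streaming ~500 unique BC7 4K textures:
print(round(500 * texture_mib(4096, BC7) / 1024, 1), "GiB")
```

At ~21 MiB per compressed 4K texture, a few hundred unique textures plus framebuffers and geometry is exactly how a 10GB card starts to feel tight at 4K, and why 12-16GB is a comfortable margin.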
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
NV uses DXR, same as AMD.....

10GB may fall short at 4K in a few years... but by then, you'll want another GPU anyway. Even DOOM on nightmare doesn't eclipse 10GB @ 4K.

As you might have figured out already, those numbers tell you absolutely nothing about the actual performance of a card. Same with TFLOPS; they're just for reference. Raw fillrates, compute throughput and VRAM bandwidth cannot be directly compared between GPUs of different architectures, not even between GPUs made under the same brand.
And you can't predict the future performance gains or losses of a GPU against another product either, as far too many factors are involved.
I'm giving up. ;)
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
The RX 5700 XT didn't really overclock great either (2,000+ MHz only yielded at most 10 FPS with most models), but we'll see how the 6800 XT works out.
The 5700 XT OC'd pretty well (clocks went up), but the gains were small because it was already memory-bandwidth starved. Here the memory architecture has been completely overhauled, and the Infinity Cache at least should scale in speed with the core while overclocking, so it should be quite interesting to see...
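For reference, raw GDDR6 bandwidth is just bus width × data rate, and a big last-level cache multiplies what the DRAM can effectively feed. A toy sketch (the 58% hit rate is my assumption, roughly the ballpark AMD claimed at 4K):

```python
# Raw GDDR6 bandwidth: bus width (bits) / 8 * data rate per pin (Gbps).

def gddr6_bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

rx_5700_xt = gddr6_bandwidth_gbs(256, 14)  # 448.0 GB/s
rx_6800_xt = gddr6_bandwidth_gbs(256, 16)  # 512.0 GB/s

def effective_bandwidth(vram_gbs, hit_rate):
    # Toy model: requests served by the cache never reach VRAM, so
    # the same DRAM can feed 1 / (1 - hit_rate) times the traffic.
    return vram_gbs / (1 - hit_rate)

print(rx_5700_xt, rx_6800_xt)
print(round(effective_bandwidth(rx_6800_xt, 0.58)))  # ~1219 GB/s
```

So on paper the 6800 XT has barely more raw bandwidth than the 5700 XT on the same 256-bit bus; the Infinity Cache is what keeps it from being starved the same way.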
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
That's an assumption on your part and not a very logical one, especially considering that NVidia has already had 2 years to gain a lead in both deployment and development of RTRT.
It's a very logical assumption, given who commands the console market (and the situation with the upcoming GPUs too).

More likely scenario, though, is that in that form (brute force path tracing) it will never take off.

is catching up

Smaller chips, lower power consumption, slower (and cheaper) VRAM, more of it, for lower price than competition and better perf/$ than competition.
Catching up, eh? :)
 
Joined
Oct 12, 2005
Messages
708 (0.10/day)
One of the reasons for AMD's "fine wine" is simply that AMD took more time to polish their drivers, because they have far fewer resources than Nvidia to throw at it.

Another is that GCN's balance between fillrate/texture rate and compute leaned a bit more to the compute side, while Nvidia focused a bit more on fillrate.

Each generation of games shifted the load from fillrate to compute, which put AMD GPUs in a gradually better position. But not really enough to make a card last much longer. Also, low-end cards were outclassed anyway, while high-end cards were bought by people with money who would probably replace them as soon as it made sense.

It looks like AMD with Navi went to a more balanced setup, while Nvidia is going down the heavy-compute path. We will see in the future which is the better balance, but right now it's too early to tell.

So in the end, it does not really matter. A good strategy is to buy a PC at a price where you can afford another one at the same price in 3-4 years, and you will always be in good shape. If paying $500 for a card every 3-4 years is too much, buy something cheaper and that's it.

There is a good chance that in 4 years that $500 card will be beaten by a $250 card anyway. Even more so once GPUs move to chiplet designs, which should drive a good increase in performance.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Oh, do help us all understand your point in more detail...
Stranger who talks about himself in the plural, are you seriously asking why anyone would optimize games for the lion's share of the market?
 
Joined
Jul 5, 2013
Messages
27,839 (6.68/day)
Stranger who talks about himself in the plural, are you seriously asking why anyone would optimize games for the lion's share of the market?
Then why aren't you? Hmm? Perhaps because you know both that there is a counter-argument and that such an argument is perfectly valid. It's as valid now as it has been since the console-vs-PC debate began.
 
Joined
Mar 10, 2015
Messages
3,984 (1.12/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
It's a very logical assumption, given who commands console market (and situation in the upcoming GPUs too).

Considering they had the consoles last generation as well, how did that whole optimizing-for-AMD-architecture thing go?
 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
The explanation is extremely easy. In the past, AMD was not ready to take advantage of the fact that the old consoles' hardware had components developed by them. However, now they can, and they do!
It is the opposite, imo. After they wrote the Radeon profiler, they found out about the intrinsic limits of the hardware.
Yes, the scheduler was flexible, as announced at launch, but instruction reordering alone does not unlock the full extent of its performance. IPC was still 0.25, and now that it is 1, that is a lot in comparison. They have all these baked-in instructions doing the intrinsic tuning for them in hardware, and the ISA has moved away from where GCN was by a great deal. Plus, they have mesh shaders, which move the triangle-pixel-size vs. wavefront-thread cost into hardware. Performance really suffered with triangles under 64 pixels in area. Not so any more.
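The 0.25 vs 1 IPC point can be made concrete with a toy model (my numbers, just illustrating the issue cadence, not a simulation of either chip): a GCN SIMD16 retires one wave64 vector instruction every 4 cycles, while an RDNA SIMD32 retires one wave32 instruction every cycle.

```python
# Toy model of per-SIMD instruction issue cadence.
# GCN: a SIMD16 executes a 64-wide wavefront over 4 clocks -> one
# instruction per 4 cycles (IPC 0.25 per wave).
# RDNA: a SIMD32 executes a 32-wide wavefront in 1 clock -> IPC 1.

def instructions_retired(cycles, issue_interval):
    """Instructions one SIMD retires for a single resident wavefront."""
    return cycles // issue_interval

CYCLES = 1000
gcn = instructions_retired(CYCLES, 4)   # 250  -> IPC 0.25
rdna = instructions_retired(CYCLES, 1)  # 1000 -> IPC 1.0
print(gcn, rdna)
```

In practice GCN hid that cadence with multiple wavefronts in flight, but dependent instruction chains in a single wave still paid the 4-cycle latency, which is exactly the gap RDNA closed.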
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Considering they had the consoles last generation as well, how did that whole optimizing-for-AMD-architecture thing go?

Oh, that is easy, my friend.
Epic on UE4: "it was optimized for NVidia GPUs."
Epic today demoes UE5 on an RDNA2 chip, running on the weaker of the two next-gen consoles, and spits on Huang's RT altogether, even though it is supported even in UE4.



There is more fun to come.

A recent demo of the XSeX vs the 3080 was commended by a greenboi as "merely 2080 Ti levels."
And that is where next-gen consoles are: above 98-99% of the PC GPU market.

Then why aren't you?
It was a rhetorical question.
 
Joined
Mar 10, 2015
Messages
3,984 (1.12/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
@medi01 , no idea what you just said.
 