Wednesday, May 4th 2022

AMD Radeon RX 6950XT Beats GeForce RTX 3090 Ti in 3DMark TimeSpy

We are nearing the arrival of AMD's Radeon RX 6x50 XT graphics card refresh series, and benchmarks are starting to appear. Today, we received a 3DMark TimeSpy benchmark result for the AMD Radeon RX 6950 XT GPU and compared it to existing solutions. More notably, we compared it to NVIDIA's GeForce RTX 3090 Ti and were in for a surprise. Looking at the graphics score, the Radeon RX 6950 XT scored 22,209 points in the 3DMark TimeSpy test, while the GeForce RTX 3090 Ti scored 20,855 points in the same test. Of course, we have to account for the fact that 3DMark TimeSpy is a synthetic benchmark and tends to perform very well on AMD RDNA2 hardware, so we will have to wait for official independent testing, such as TechPowerUp's reviews.
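As a quick sanity check on those numbers, here is a minimal sketch (the two scores are taken straight from the leak above; the percentage is ours):

```python
# Rough arithmetic check of the leaked TimeSpy graphics scores quoted above.
rx_6950_xt = 22209    # leaked Radeon RX 6950 XT graphics score
rtx_3090_ti = 20855   # GeForce RTX 3090 Ti graphics score in the same comparison

lead_pct = (rx_6950_xt / rtx_3090_ti - 1) * 100
print(f"RX 6950 XT lead over RTX 3090 Ti: {lead_pct:.1f}%")  # roughly 6.5%
```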

The AMD Radeon RX 6950 XT card was tested with a Ryzen 7 5800X3D CPU paired with DDR4-3600 memory and pre-release 22.10-220411n drivers on Windows 10. We could see higher graphics scores with final drivers, and better performance from the upcoming refreshed SKUs.
Source: via WCCFTech

25 Comments on AMD Radeon RX 6950XT Beats GeForce RTX 3090 Ti in 3DMark TimeSpy

#1
RedelZaVedno
The RX 6950 will probably cost north of a grand. Why bother when you'll be able to get similar performance with a much lower TDP for half the price with Navi 33 in Q4?
Posted on Reply
#2
Oberon
Of course, we have to account for the fact that 3DMark TimeSpy is a synthetic benchmark and tends to perform very well on AMD RDNA2 hardware
TimeSpy definitely favors Ampere over RDNA 2. Firestrike, OTOH, loves RDNA 2.
Posted on Reply
#3
DeeJay1001
RedelZaVedno: The RX 6950 will probably cost north of a grand. Why bother when you'll be able to get similar performance with a much lower TDP for half the price with Navi 33 in Q4?
Because anyone who believes next gen will bring that much of a performance jump is dreaming.
Posted on Reply
#4
Chrispy_
Wasn't the 6900XT basically tied with a 3090 on an SAM-enabled platform already?

So the slightly faster 6950XT is a bit better than the slightly faster 3090Ti? Quelle surprise!
Posted on Reply
#7
progste
A battle of irrelevance!
Posted on Reply
#8
Chrispy_
medi01: Yep. Slightly ahead at lower res, slightly behind at higher.
You are right, though I meant specifically the TimeSpy benchmark scores.
@W1zzard doesn't test 3DMark, and rightly so, because it's a stupid synthetic test that both AMD and Nvidia cheat optimise for. Nonetheless, the 6900XT is a hair faster than the 3090:

[chart: 6900XT vs. 3090 TimeSpy graphics scores from launch-day reviews, courtesy of Guru3D]

As cheat proficiency driver updates have improved performance a lot since the 6900XT launch-day review scores shown above, the median* 3090 and 6900XT scores today are both a hair over 20,000, so the new 6950 XT is an incredible 8.7% faster than its predecessor, based on today's leak of a 6950 scoring 22,209.

* - People buying flagship cards are rarely content with the base model. They'll buy the snake-oil super mega ultra deluxe edition nine times out of ten, so the median is more likely to be the STRIX OC +++ Championship Turbo Edition.
Posted on Reply
#9
ghazi
Chrispy_: You are right, though I meant specifically the TimeSpy benchmark scores.
@W1zzard doesn't test 3DMark, and rightly so, because it's a stupid synthetic test that both AMD and Nvidia cheat optimise for. Nonetheless, the 6900XT is a hair faster than the 3090:

[chart: 6900XT vs. 3090 TimeSpy graphics scores from launch-day reviews, courtesy of Guru3D]

As cheat proficiency driver updates have improved performance a lot since the 6900XT launch-day review scores shown above, the median* 3090 and 6900XT scores today are both a hair over 20,000, so the new 6950 XT is an incredible 8.7% faster than its predecessor, based on today's leak of a 6950 scoring 22,209.

* - People buying flagship cards are rarely content with the base model. They'll buy the snake-oil super mega ultra deluxe edition nine times out of ten, so the median is more likely to be the STRIX OC +++ Championship Turbo Edition.
The comparison to the 6900 XT is all that matters... no point comparing it to the competition in one synthetic benchmark. Look at the same arch and extrapolate.
Posted on Reply
#11
Kickus_assius
I got 21,253 with my 3090ti on timespy. That score seems a little low. I'm just running all stock settings on a 12900k.
Posted on Reply
#12
DeathtoGnomes
The title here screams, loudly, "FANBOI THREAD OPEN!" :p:D:laugh:
Kickus_assius: I got 21,253 with my 3090ti on timespy. That score seems a little low. I'm just running all stock settings on a 12900k.
Scores will vary with system specs; it's unlikely yours is identical to the test system.
Posted on Reply
#13
kapone32
If only GPU prices were better, because competition is sweet. I just bought a 6500XT and cannot believe the clock and memory speeds. If they can get anywhere near those speeds (2903 MHz GPU, 2293 MHz memory) with the bigger chips, they will wipe the floor with the 3090Ti in gaming. Oh sorry, I meant rasterization, because DLSS and ray tracing (which is in so many games).
Posted on Reply
#14
DoLlyBirD
Remember when buying a top-tier GPU, aka the R9 290X/GTX 680, you could get change from $500... now you're lucky to get into the middle tier, aka the 6600 XT/RTX 3060... progress :kookoo: yet the top companies have had record profits from 2020 onwards, chip shortages/covid FTW o_O
Posted on Reply
#15
wolf
ghazi: The comparison to the 6900 XT is all that matters... no point comparing it to the competition in one synthetic benchmark. Look at the same arch and extrapolate.
This would be much more helpful for sure.

Or just W1zzard's full review, as always.
Posted on Reply
#16
Chrispy_
Kickus_assius: I got 21,253 with my 3090ti on timespy. That score seems a little low. I'm just running all stock settings on a 12900k.
Do you have an FE running at reference clocks, or a third party model?

Even cards like the TUF that don't focus on overclocking have significantly higher clocks than the reference card (1950MHz vs 1860MHz). That's an extra 1000 points, easy.
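As a rough, editorial back-of-the-envelope check of that claim (it assumes the graphics score scales about linearly with GPU clock, which is only an approximation, and reuses the 20,855 reference score from the news post):

```python
# Hypothetical linear clock-scaling estimate; real scaling is rarely perfectly linear.
reference_clock_mhz = 1860   # reference RTX 3090 Ti boost clock cited above
tuf_clock_mhz = 1950         # ASUS TUF boost clock cited above
baseline_score = 20855       # TimeSpy graphics score from the news post

scaling = tuf_clock_mhz / reference_clock_mhz   # ~1.05
extra_points = baseline_score * (scaling - 1)   # ~1,000 points
print(f"Estimated extra points from clocks alone: {extra_points:.0f}")
```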
Posted on Reply
#17
Kickus_assius
Chrispy_: Do you have an FE running at reference clocks, or a third party model?

Even cards like the TUF that don't focus on overclocking have significantly higher clocks than the reference card (1950MHz vs 1860MHz). That's an extra 1000 points, easy.
I have the Zotac one, which I believe runs stock clocks, but it's possible they have played with the power limit and boost profiles. I also have a 6900XT in another system with a 5800X, so I'll see what that one gets on the newest drivers too.
Posted on Reply
#18
Kawaz
Just OC a 6900XT and save the money. This is my golden chip, though; 24k is doable on most 6900XTs.
Posted on Reply
#19
watzupken
I won't take benchmark results seriously. Benchmarks are short and easily "optimized" in drivers such that they show the best of the hardware. 3DMark, in my opinion, is one of the worst benchmarks, since things like the CPU contribute significantly to the results even when we are trying to measure performance at 1440p or 4K, which is unlikely to be CPU bound. In this case, the use of the 5800X3D will certainly give it quite a bump in the results.
Posted on Reply
#21
MachineLearning
Tch. Both of these uArchs fail miserably when compared to Intel Arc in leaks per second...
Posted on Reply
#22
ghazi
Kawaz: Just OC a 6900XT and save the money. This is my golden chip, though; 24k is doable on most 6900XTs.
Very nice score; is that the XTXH version? The problem for most of us is that we got ripped off in a very petty way. I don't know why it doesn't get any attention from the usual screechers.

Those of us with normal 6900 XTs, of course, have the ridiculously low slider limits which the XTXH cards lifted. This would be at least begrudgingly excusable, but AMD went so far out of their way to stop us from doing anything about it: despite the 6800 XT and 6900 XT having the same PCIe Device ID, the 6900 XT "XTXH" version got a unique device ID, which some code they added to the drivers references; if the device ID is not XTXH, but the clocks or power exceed the slider limits for the XTX card, the drivers automatically throttle the hell out of your card. This means you cannot even unlock the OC potential with a BIOS flash.

So on our "flagship" GPU, we are gimped and can't OC to the max, because AMD wanted to sell this secret limited edition bin later on for a higher price. This is the consumer-friendly laissez-faire company we shilled for the past 15 years because Nvidia kept lasering off backend partitions and Intel wanted to sell K CPUs. "Just OC a 6900xt and save the money" is a good value proposition for someone buying right now, but for those of us who bought before, it's all messed up.
Posted on Reply
#23
Bomby569
ghazi: Very nice score; is that the XTXH version? The problem for most of us is that we got ripped off in a very petty way. I don't know why it doesn't get any attention from the usual screechers.

Those of us with normal 6900 XTs, of course, have the ridiculously low slider limits which the XTXH cards lifted. This would be at least begrudgingly excusable, but AMD went so far out of their way to stop us from doing anything about it: despite the 6800 XT and 6900 XT having the same PCIe Device ID, the 6900 XT "XTXH" version got a unique device ID, which some code they added to the drivers references; if the device ID is not XTXH, but the clocks or power exceed the slider limits for the XTX card, the drivers automatically throttle the hell out of your card. This means you cannot even unlock the OC potential with a BIOS flash.

So on our "flagship" GPU, we are gimped and can't OC to the max, because AMD wanted to sell this secret limited edition bin later on for a higher price. This is the consumer-friendly laissez-faire company we shilled for the past 15 years because Nvidia kept lasering off backend partitions and Intel wanted to sell K CPUs. "Just OC a 6900xt and save the money" is a good value proposition for someone buying right now, but for those of us who bought before, it's all messed up.
Remember the time when you could simply flash another BIOS on gimped AMD cards and get an insane boost in performance for free, like the RX 5700? Good times.
Posted on Reply
#24
ghazi
Bomby569: Remember the time when you could simply flash another BIOS on gimped AMD cards and get an insane boost in performance for free, like the RX 5700? Good times.
There were even better times long ago when instead of just unlocking sliders, you unlocked the shaders too! I think the last one was the HD 6950.
Posted on Reply