
RX 6950 XT or RX 6700 XT?

I really wonder what the GPU util would be on thát CPU. Lol. 15%?
Stay chooned, dude. My engineering sample is as underclockable as it gets.

@Vayra86, this is my RKL ES downsized from 8 to 2 cores. At 3.9 GHz it's probably even worse than an FX-8350.

The GPU is an RX 6700 XT, which is at most half as fast as a 7900 XT.

GPU usage ranged from 27 to 75 percent, with all frame limiters disabled. 1280x1024, maximum settings, no FSR, no RT.

I guess 15% isn't as bold an assumption as you might've thought.
 

Seems that my dream to play at 4K is dead, then.
That really depends on what game you play at 4K. For example, I've been playing War Thunder at 4K 60 Hz / 60 FPS without any problem on my RX 6700 XT, all settings on max. Even an RX 5700 could do the same, except I needed to tweak some settings down to high/medium. Also, many games now offer FSR/DLSS, so you can stay at 4K, turn on FSR, and gain a lot of FPS without losing much graphical fidelity.
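For reference on how much FSR actually saves: its quality modes render at a fixed fraction of the output resolution and upscale from there. A minimal sketch (the scale factors are AMD's published FSR 2 values; the helper function is just for illustration):

```python
# FSR 2 per-axis scale factors, as published by AMD.
FSR_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal resolution the GPU actually renders before upscaling."""
    s = FSR_MODES[mode]
    return round(out_w / s), round(out_h / s)

# 4K output with FSR Quality renders at 2560x1440 internally,
# i.e. only about 44% of the pixels of native 4K.
print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```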
 
A 6950 XT at 2.7 GHz is pretty awesome :D Still running mine in my daily system.
 
I only decided to upgrade again from the 6700 XT because I moved from 1440p to 3440x1440, so I basically lost all the performance I had gained from upgrading from a 5700 XT. I really only wanted a 6800 XT Nitro, but a buddy found the Red Devil 6950 XT for like $50 more, so I couldn't refuse.
I play at 4K and would like more horsepower...
 
The difference between the 6700 XT and 6950 XT is huge, but I really don't think it's worth buying a 6950 XT at this point. If you have the money for a 6950 XT, maybe you can go for a 7900 XT instead? They're usually not that far apart in price, maybe $100 at most.

It's never really worth buying the highest-end card of a previous generation; the value is terrible. A 6800 XT would be much better if you really care about perf/dollar.
 
Hi, I'm trying to build a new PC, just for gaming and maybe a little bit of streaming (€1400-1500). Problem is, I know nothing about this.

While searching for a GPU, I find the 6950 XT attractive, but the card seems huge and power hungry. On the other hand, some friends advise me to pick the 6700 XT instead and buy a really good CPU with it (Ryzen 7 7800X3D).
Like I said, it's only for gaming (mostly 1440p, 4K if possible). So do you think the 6950 XT is good and I should buy it with a CPU that fits my budget? Or buy the less expensive 6700 XT and use the saved money on the really good CPU?

Thx.



Hi,

I made a configuration for you; you can take a look: https://pcpartpicker.com/list/rm8s6r
 
The difference between the 6700 XT and 6950 XT is huge, but I really don't think it's worth buying a 6950 XT at this point. If you have the money for a 6950 XT, maybe you can go for a 7900 XT instead? They're usually not that far apart in price, maybe $100 at most.

It's never really worth buying the highest-end card of a previous generation; the value is terrible. A 6800 XT would be much better if you really care about perf/dollar.
I was lucky, got mine for $500 + tax.
 
Neither; go for a 4070 or 4070 Ti, or, if you insist on AMD, a 7900 XT.

Last gen purchases are almost always a bad decision in the long term, unless you're getting a very good deal on a used card from a friend you trust.

Driver support is already fragmented between RDNA2 and 3, and that won't improve as time goes on.
 
These are limited to only 12 GB; for their prices, they're hard no-goes.
No one cares except fanboys.

Testing reveals zero performance difference between the VRAM-"limited" cards and those with larger buffers, even in games such as TLOU, which are notoriously, hilariously bad console ports. The performance scaling is exactly the same as with lower resolutions/older games, dictated entirely by the GPU machinery and generation, not VRAM.

Even 8 GB cards don't have any issues at 1080p or 1440p.

12 GB cards are fine for everything, including 4K in all but a handful of titles.

Your "future proof" 20/24 GB card is going to run out of rastering/RT performance well before you run into any kind of VRAM bottleneck, lmao.

The 16 GB 4060 Ti, which is otherwise identical to the 8 GB 4060 Ti, will prove this.

"...not a single game saw a meaningful performance hit with 12 GB, not even at 4K" From the 4070 reviews.
 
At least the 4090 with its 24 GB is actually useful outside gaming, as a workstation card, thanks to CUDA and the NVIDIA ecosystem and partner support; can't say the same for the Radeon side of things.

Do you post false information?

This YouTube video is proof that 8 GB is garbage and should be avoided.

Sure bud, the YouTuber looking for clickbait is right.

Now try reading every conclusion of W1z's GPU reviews, or just look at the game testing data yourself. It's irrelevant.
 
"Leading up to this launch, I've noticed a lot of discussion around the 12 GB VRAM size of RTX 4070. While I agree that 16 GB would be better, I disagree with people who say that 12 GB is already too small, or obsolete. There are a few (badly coded) games out there that use a ton of VRAM, especially at 4K, but the vast majority of titles won't even get close to such VRAM usage numbers. In our whole test suite not a single game saw a meaningful performance hit with 12 GB, not even at 4K—and RTX 4070 is fundamentally a 1440p card. You'll also have to consider that making a 16 GB card isn't just "let's add another 4 GB memory chip," but you also need to put additional signal traces on the PCB, and widen the memory controller inside the GPU, so that it can talk to all these chips in parallel. I don't think anyone would be willing to pay $700 for a 16 GB RTX 4070, would you? On the other hand, AMD does offer 16 GB VRAM on the Radeon RX 6800 XT and 6900 XT, which could make them an option for those who want to focus on VRAM future-proofing."

From the 4070 reviews.
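The bus-width point in that quote is simple GDDR6 arithmetic: each memory chip sits on a 32-bit channel, so capacity follows directly from bus width and chip density. A rough sketch (assumes the standard 2 GB GDDR6 chips; the function is illustrative):

```python
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int = 2) -> int:
    """GDDR6/GDDR6X chips each occupy a 32-bit channel, so the chip count
    in a normal (non-clamshell) layout is bus_width / 32."""
    return (bus_width_bits // 32) * chip_density_gb

print(vram_capacity_gb(192))  # 12 -> the 4070's 192-bit bus caps it at 12 GB
print(vram_capacity_gb(256))  # 16 -> a 16 GB card needs a 256-bit bus
                              #       (or a clamshell layout doubling chips per channel)
```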

Case closed.
 
Future proofing is about a year or two ahead. Not now. No one cares about now.

I said look at their prices: 12 GB for $800 and 12 GB for $600 is daylight robbery.

If you are serious, at least save some money and buy the 12 GB RX 6700 XT for $330.

Case closed.
 
And before you start quoting "future-proof", let's see how the 4070 does at the resolution it isn't even designed for, 4K (worst-case scenario).

Wow, still better than most of those 16 GB cards that have so much "future proofing".

[chart: 4K performance results]


Now let's take RT into account (the standard that all new engines are using moving forward, if you want to talk about "future proofing"). Oooof.

[chart: 4K ray tracing performance results]


But go ahead, seriously recommend the 6700 XT, in supposedly sincere good faith: the card that gets half the performance of the 4070.

Honestly.
 
At least the 4090 with its 24 GB is actually useful outside gaming, as a workstation card, thanks to CUDA and the NVIDIA ecosystem and partner support; can't say the same for the Radeon side of things.


Sure bud, the YouTuber looking for clickbait is right.

Now try reading every conclusion of W1z's GPU reviews, or just look at the game testing data yourself. It's irrelevant.
It's not; we have several examples of games heavily capped by VRAM.

But dream on. This YouTuber isn't the only one seeing it.
 
It's not; we have several examples of games heavily capped by VRAM.

But dream on. This YouTuber isn't the only one seeing it.
Yeah, well not in TPU testing.

YouTubers hype up "problems" to get clicks, I'm not the only one seeing that either.

The fact that so many people are laser focused on the "VRAM problem" when the "RT problem" seems to be much more relevant moving forwards, is amazing to me. But then one of those "problems" allows people to favour the underdog, so...
 
And before you start quoting "future-proof", let's see how the 4070 does at the resolution it isn't even designed for, 4K (worst-case scenario).

Wow, still better than most of those 16 GB cards that have so much "future proofing".

[chart: 4K performance results]

Now let's take RT into account (the standard that all new engines are using moving forward, if you want to talk about "future proofing"). Oooof.

[chart: 4K ray tracing performance results]

But go ahead, seriously recommend the 6700 XT, in supposedly sincere good faith: the card that gets half the performance of the 4070.

Honestly.
There is more to life than your short-sighted usage, upgrade path, and/or spending habits. There is no comparison for the sweet spot RDNA2 is in right now in raw perf/dollar, and its 12 GB is much better matched to its core power. The 4070 has way more core oomph than its 12 GB can carry; it's a waste, and a shame. The 4070 Ti is worse. Both cards suffer from a major bandwidth deficit too. This will make itself known, just not within the coming two years.

We have seen and had this discussion many times before. If your cards last longer than a couple of years, you will see them fall short with Nvidia's new approach to VRAM capacities, especially in the segments at and above x70. For the lower ranges it doesn't matter much, because you're already compromising straight out of the box, but with an x70 you really shouldn't have to. Pascal's x70 and x80 were perfectly balanced like that: they just ran out of everything all at once. For example, 8 GB was insufficient for TW Warhammer 3 on my 1080 at 1440p; the game stutters, but at the same time the GPU also just didn't want to push more than 40 FPS, which is edge-case playable. That's where the balance should be.

Ada, except for the 4080 and 4090, is nowhere near that balance; not even ballpark.

Perf/dollar:

[chart: performance per dollar]
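The metric itself is trivial to run with your own numbers; a toy sketch with placeholder FPS and prices (illustrative values, not TPU's measured data):

```python
# Toy perf/dollar comparison. FPS and prices are illustrative placeholders.
cards = {
    "RX 6700 XT": {"avg_fps": 100, "price_usd": 330},
    "RX 6950 XT": {"avg_fps": 150, "price_usd": 600},
    "RTX 4070":   {"avg_fps": 140, "price_usd": 600},
}

for name, c in sorted(cards.items(),
                      key=lambda kv: kv[1]["avg_fps"] / kv[1]["price_usd"],
                      reverse=True):
    print(f"{name}: {c['avg_fps'] / c['price_usd']:.3f} FPS per dollar")
```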


The simple gist of it all is that Ada is just a shitty stack priced too high, even if an individual GPU scores nicely today.

Yeah, well not in TPU testing.

YouTubers hype up "problems" to get clicks, I'm not the only one seeing that either.

The fact that so many people are laser focused on the "VRAM problem" when the "RT problem" seems to be much more relevant moving forwards, is amazing to me. But then one of those "problems" allows people to favour the underdog, so...
We are in full agreement on the clicks issue. But that doesn't invalidate everything that comes out of there.

The RT problem is ALSO a VRAM problem ;) The real million-dollar question is why Nvidia is so tight on VRAM when they are the ones pulling RT forward. We know the real answer: they want you to buy their Ada successor 2.5 years from now. The entire strategy is gearing up for gen-to-gen, minimal-perf-win upgrades. But Nvidia is there to help you, with MORE proprietary technologies like DLSS4, to reduce your VRAM usage. And at that point, you're stuck like a heroin addict on free frames.

It's not for me, that commercial clusterfuck, but you do you.
 
That's nice; now compare the RT perf/dollar. They're identical. So what would you buy as someone interested in "future proofing": a card that's competitive in everything except the major tech being pushed in every new game engine moving forward, where the last-gen $1000 flagship does worse than a $600 midranger, and the current-gen AMD $1000 flagship is the same speed as the $800 midranger in modern rendering (RT)?

Should VRAM or actual measured performance dictate sales? Honest question. Arf was arguing about the supposed performance difference; when I demonstrated there isn't one, he switched goalposts to the performance difference "in a couple of years". Does the further increased RT adoption by that point enter those calculations?

Future proofing is about a year or two ahead. Not now. No one cares about now.
[chart: GPU performance data]
 
Honestly, with the price of AM5 you would pay about $30 more vs going with AM4. You could get a board like the MSI X670 Pro for just over $330 Canadian and combine that with a 7600 and a 6800 (check the AMD website). DDR5 does make a difference, if only a minor one, and you would be on the first generation of a brand-new platform, with motherboard and CPU support for the next 3 generations.
 
I find the 6950 XT attractive, but the card seems huge and power hungry
The 6950 XT has amazing value at its current price and will let you play most games at 4K60 on ultra. However, it does draw quite a lot of power. I would strongly recommend the 7900 XT in its place, especially if you intend to stream on YouTube. The 7900 XT has a dedicated AV1 video encoder, is about 15% faster than the 6950 XT at 4K, and consumes less energy.

The 4070Ti may be a good alternative, particularly if you prioritize RT. It is a bit slower than the 7900XT, but also uses less power.
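If the AV1 encoder ends up being the deciding factor for streaming, this is roughly what using it looks like; a hedged sketch assuming a recent FFmpeg build with AMF AV1 support and AMD drivers installed (file names are placeholders):

```python
import subprocess

# Hardware AV1 encode through FFmpeg's AMF backend (RDNA3 cards only;
# RDNA2 parts like the 6700 XT/6950 XT top out at H.264/HEVC encoding).
subprocess.run(
    [
        "ffmpeg",
        "-i", "gameplay.mkv",   # captured footage (placeholder name)
        "-c:v", "av1_amf",      # AMD hardware AV1 encoder
        "-b:v", "8M",           # a typical streaming-class bitrate
        "-c:a", "copy",
        "stream_av1.mkv",
    ],
    check=True,
)
```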

PC GAMER | AMD Ryzen 7 5800X 8x3.80GHz | 16GB DDR4 | RX 7900 XT 20GB | 1TB M.2 SSD: what do you think of this
I would not pick this particular system. RAM latency isn't listed (read: slow) and I can't find the specs for the CPU cooler (probably a budget one).
 
PC GAMER | AMD Ryzen 7 5800X 8x3.80GHz | 16GB DDR4 | RX 7900 XT 20GB | 1TB M.2 SSD: what do you think of this for less than €1600, guys? Should I buy it?
No.

16 GB of (slow) RAM.

Non-X3D CPU.

Probably cheapo cooling, motherboard, power supply, and case.


Top-tier PSU, good RAM, good cooling, good SSD, good case, great CPU and GPU. SFF as a bonus (you can swap the case for a smaller one if you like).

Of course, this is in dollars, but euros should be similar: things are slightly more expensive in Europe, but the euro is worth slightly more than the USD, and we don't know where you live to look for parts.

[screenshot: suggested parts list]
 
That's nice; now compare the RT perf/dollar. They're identical. So what would you buy as someone interested in "future proofing": a card that's competitive in everything except the major tech being pushed in every new game engine moving forward, where the last-gen $1000 flagship does worse than a $600 midranger, and the current-gen AMD $1000 flagship is the same speed as the $800 midranger in modern rendering (RT)?

Should VRAM or actual measured performance dictate sales? Honest question. Arf was arguing about the supposed performance difference; when I demonstrated there isn't one, he switched goalposts to the performance difference "in a couple of years". Does the further increased RT adoption by that point enter those calculations?


[chart: GPU performance data]
I think it makes sense: if your budget is around $600, the 4070 should be on your shortlist of considerations, as 12 GB of VRAM will be just fine.
However, under $600 you may be better served by a card that has more than 8 GB of VRAM.
Perhaps I'm missing the point of the argument?
 
I think it makes sense: if your budget is around $600, the 4070 should be on your shortlist of considerations, as 12 GB of VRAM will be just fine.
However, under $600 you may be better served by a card that has more than 8 GB of VRAM.
Perhaps I'm missing the point of the argument?
The point I was making is that everyone loves to talk about future-proofing and points to VRAM, but ignores RT performance, which is becoming more and more relevant, to the point that new game engines use it as the default lighting implementation. So you can't separate "future-proofed last-gen RDNA2" from "RDNA2 is not future-proof because it has poor RT", even when you compare a 6950 XT to a 4070.

You're right about 12 GB being just fine though.

From W1z's recent reviews:
"...not a single game saw a meaningful performance hit with 12 GB, not even at 4K"
 