
AMD Radeon RX 6700 XT

Actually the 6700 XT is only about 1.5% better in price/performance than the RX 5700 XT, not 10%. However, I'm still waiting for at least this kind of splutter aimed at NV, as the 3070 was an even worse upgrade: about 1% better price/performance than the 2070.
Where are you getting that from? It's 31% faster than the 5700 XT in this review and costs 20% more, so that works out to roughly 9% better price/performance, not 1%.
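For what it's worth, a quick sketch of where a figure like that comes from, taking this review's +31% average performance and the $479 vs $399 MSRPs as the inputs (performance per dollar is a ratio, so the two percentages can't simply be subtracted):

# rough price/performance comparison, 6700 XT vs RX 5700 XT (review figures assumed)
perf_ratio  = 1.31        # 6700 XT average performance relative to the 5700 XT
price_ratio = 479 / 399   # MSRP ratio, about 1.20

value_gain = perf_ratio / price_ratio - 1
print(f"price/performance improvement: {value_gain:.1%}")   # ~9%, not 31 - 20 = 11%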
 
Why would anyone buy 6700XT over 3070 in normal market conditions (at MSRP)?

1. standard rasterization: -4% at 1080p/1440p & -10% at 4K
2. No DLSS
3. RT takes around a 20% higher performance hit
4. No CUDA GPU acceleration in Adobe apps
5. Worse encoder
6. Abysmal power spikes (+50% for ~20 ms) that make the 6700 XT risky on gold-rated PSUs below 600 W
7. No meaningful OC headroom
8. Slower driver updates due to the smaller software team behind it

9. The only plus is 4 GB more GDDR6

This GPU should have an MSRP of $399 max, and $349 to call it a great value proposition. Just be patient: you'll be able to buy this GPU for 200 to 250 bucks on the 2nd-hand market once the mining craze ends.
Who knows.
- not normal market conditions
- and most of the things you listed aren't really much of an issue for straight gaming, which is all most people do
I would guess, for gamers:
1. 4% is meh
2. if you use it or need it
3. if you have games that have it or like it or even use it/personal preference?
4. if you use it/need it
5. encoder? video?
6. interesting - got some reviews/studies?
7. if you OC
8. fair - but who wants to be updating drivers on a weekly basis?

The problem with today's MSRPs is that they will stick even after the mining craze ends. Expect the 7700 XT to cost north of 500 bucks. The only thing that can save the DIY PC market from becoming a niche for the rich and for PC nerds willing to sell their kidneys for new GPUs is, paradoxically, Raja Koduri at Intel pulling another "Polaris-like GPU" out of a hat, or we're F...ed. Ngreedia & AMD are clearly OK with maximizing profits in the short term, no matter what.
I think Intel will do the same on price.

GPUs are like Apple products now, $1000 for anything ;)
 
Why would anyone buy 6700XT over 3070 in normal market conditions (at MSRP)?

1. standard rasterization: -4% at 1080p/1440p & -10% at 4K
2. No DLSS
3. RT takes around a 20% higher performance hit
4. No CUDA GPU acceleration in Adobe apps
5. Worse encoder
6. Abysmal power spikes (+50% for ~20 ms) that make the 6700 XT risky on gold-rated PSUs below 600 W
7. No meaningful OC headroom
8. Slower driver updates due to the smaller software team behind it

9. The only plus is 4 GB more GDDR6

This GPU should have an MSRP of $399 max, and $349 to call it a great value proposition. Just be patient: you'll be able to buy this GPU for 200 to 250 bucks on the 2nd-hand market once the mining craze ends.
There was an interesting Hardware Unboxed video recently showing AMD cards performing better on lower-end CPUs.
 
Nobody is buying a 3070 for RT performance. Not to mention the number of games that truly support RT.

The extra 4 GB of VRAM over the 3060 Ti and the 3070 is enough for the 6700 XT to run games with very high VRAM needs more smoothly than those cards, even at 1440p. So if you are interested in raw rasterization performance, that -4% difference matters less than having 4 GB of extra VRAM.
12 over 8 making a difference in 1440p today?!? :roll: I commend you for outing yourself as a hardcore AMD fanboy, so that those of us that don't subscribe to that cult can ignore you in the future...:rolleyes:
 
There was an interesting Hardware Unboxed video recently showing AMD cards performing better on lower-end CPUs.
Nvidia = software scheduler = faster scheduling because it runs on a high-clocked CPU, but worse frame latency from transferring data from the CPU and back (I'm guessing)
AMD = hardware scheduler = slower scheduling because of the limited GPU clock, but better frame times since there's no data transfer (again, guessing)
 
The problem with today's MSRPs is that they will stick even after the mining craze ends. Expect the 7700 XT to cost north of 500 bucks. The only thing that can save the DIY PC market from becoming a niche for the rich and for PC nerds willing to sell their kidneys for new GPUs is, paradoxically, Raja Koduri at Intel pulling another "Polaris-like GPU" out of a hat, or we're F...ed. Ngreedia & AMD are clearly OK with maximizing profits in the short term, no matter what.

Raja didn't work on Polaris. AMD had two driver teams, Raja worked on Vega, the other on Polaris and Navi.
 
I'm taking the liberty of rearranging and simplifying product names when it makes sense to me ;)


Technically you are right of course, but apparently there's no "not Black" card, so why even include that part?
No, I mean on the conclusion page you mentioned the launch-day reviews of several cards, including the XFX RX 6700 XT Merc 319 card. My point was that the review you made was for the XFX RX 6700 XT Merc 319 Black card, and the current link (https://www.techpowerup.com/review/xfx-radeon-rx-6700-xt-merc-319/) actually leads to a 404, i.e. a non-existing page. Notice the link doesn't have "-black" in it.

I don't know if you just want to use "XFX RX 6700 XT Merc 319" instead of "XFX RX 6700 XT Merc 319 Black", but the correct link for your XFX RX 6700 XT Merc 319 Black review is https://www.techpowerup.com/review/xfx-radeon-rx-6700-xt-merc-319-black/ (notice the "-black" part at the end of the link).

Sorry if I confused you here. I'm a native Japanese speaker, not a native English speaker.
 
Seriously, how can you even put stuff like this into pros/cons?
  • Very limited supply
  • Actual market price will end up much higher
Man, that's stupid as f***. That has no place in a review.
This says absolutely NOTHING about the product you're reviewing.
 
Why would anyone buy 6700XT over 3070 in normal market conditions (at MSRP)?

1. standard rasterization: -4% at 1080p/1440p & -10% at 4K
2. No DLSS
3. RT takes around a 20% higher performance hit
4. No CUDA GPU acceleration in Adobe apps
5. Worse encoder
6. Abysmal power spikes (+50% for ~20 ms) that make the 6700 XT risky on gold-rated PSUs below 600 W
7. No meaningful OC headroom
8. Slower driver updates due to the smaller software team behind it

9. The only plus is 4 GB more GDDR6

This GPU should have an MSRP of $399 max, and $349 to call it a great value proposition. Just be patient: you'll be able to buy this GPU for 200 to 250 bucks on the 2nd-hand market once the mining craze ends.
So, nobody should buy a card with 4 GB more VRAM, which is only slightly behind as is and closes the gap with SAM enabled, because:

1) Ah, performance, ok
2) No glorified TAA upscaling in a handful of games that support it
3) It merely wins in 3 games out of about 11 (WoW, Fortnite, Dirt 5) and ties in one (Godfall), as if, you know, RT performance in new, non-NV-tailored games were better on it for some reason.
4) If you use Adobe
5) If you believe FUD about encoders
6) Some FUD about power consumption
7) OK, if we're talking about the reference card. At least one of the 8 points is somewhat valid. :D
8) Outright nonsense from a batshit crazy world

I think with Lisa, AMD has quit the "undercut the competitor's price" business, and people who want to buy a slower card with 8 GB of VRAM, or a weird card with the same performance but 4 GB less VRAM for more money, should feel free to do so.

Especially people with CPUs as "ancient" as a 2600X:



I think atm the AMD 6000-series GPUs also have some issues with the open-source Linux drivers
You just made that shit up.
 
Honestly, I did not even know this card was being released. There is not even a point in having an opinion on a card that you won't find anywhere close to MSRP anytime soon.

The way things look right now, the 1080 Ti may be the last card I buy, as consoles are looking more attractive every day given current GPU prices.
 
Wait, what? 6900 XT peak power of 619 W during a spike??????

power-spikes.png
Nah... Wizz probably had his hair dryer plugged in.

Arguing Nvidia vs AMD right now is like debating superheroes; it's all science fiction.
 
No power consumption of 3060 in the graphs? Why?
 
So, nobody should buy a card with 4 GB more VRAM, which is only slightly behind as is and closes the gap with SAM enabled, because:

1) Ah, performance, ok
2) No glorified TAA upscaling in a handful of games that support it
3) It merely wins in 3 games out of about 11 (WoW, Fortnite, Dirt 5) and ties in one (Godfall), as if, you know, RT performance in new, non-NV-tailored games were better on it for some reason.
4) If you use Adobe
5) If you believe FUD about encoders
6) Some FUD about power consumption
7) OK, if we're talking about the reference card. At least one of the 8 points is somewhat valid. :D
8) Outright nonsense from a batshit crazy world

I think with Lisa, AMD has quit the "undercut the competitor's price" business, and people who want to buy a slower card with 8 GB of VRAM, or a weird card with the same performance but 4 GB less VRAM for more money, should feel free to do so.

Especially people with CPUs as "ancient" as a 2600X:




You just made that shit up.

W1zzard tested with Resizable BAR enabled by default (mentioned in the GPU test system update March 2021 article), so no, there won't be any more performance improvement coming from the Red Team. Nvidia, meanwhile, is rolling out their Resizable BAR support by the end of March, so expect some performance improvement from Ampere.
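If you want to check on your own system that Resizable BAR/SAM is actually active (and not just enabled in the BIOS), GPU-Z reports it directly on Windows, and a rough way on Linux is to look at the GPU's BAR sizes that lspci reports: with ReBAR active, the prefetchable BAR typically matches the VRAM size instead of the usual 256M. A minimal sketch, assuming lspci is available; the exact output format varies by pciutils version, so adjust the parsing as needed:

# rough Resizable BAR check on Linux: print the prefetchable BAR sizes of any VGA device
import re
import subprocess

out = subprocess.run(["lspci", "-vvv"], capture_output=True, text=True).stdout
for block in out.split("\n\n"):
    if "VGA compatible controller" not in block:
        continue
    print(block.splitlines()[0])  # device name line
    sizes = re.findall(r"Region \d+: Memory at .*prefetchable\) \[size=(\S+)\]", block)
    print("  prefetchable BAR sizes:", sizes or "none found")  # e.g. ['12G', '2M'] with ReBAR on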
 
I feel AMD gimped Navi 22 too much in terms of CUs and pushed too hard on the clock speed. A high clock speed is nice, but it's burning through too much power for a mid-range card. And while we are unlikely to see this card at MSRP anyway, I feel the MSRP is also too high. I think people may be better off sticking to an RX 6800 if they want to stay with AMD, or considering an RTX 3070/3060 instead.
 
I feel AMD gimped Navi 22 too much in terms of CUs and pushed too hard on the clock speed. A high clock speed is nice, but it's burning through too much power for a mid-range card. And while we are unlikely to see this card at MSRP anyway, I feel the MSRP is also too high. I think people may be better off sticking to an RX 5700 XT or 6800 if they want to stay with AMD, or considering an RTX 3070/3060 instead.

Yeah, the power consumption numbers are indeed high, as is the MSRP. AMD is likely making a killing on these cards given how much they are cut down.
 
W1zzard tested with Resizable BAR enabled by default (mentioned in the GPU test system update March 2021 article), so no, there won't be any more performance improvement coming from the Red Team. Nvidia, meanwhile, is rolling out their Resizable BAR support by the end of March, so expect some performance improvement from Ampere.
BAR has been shown to bring barely anything to NV cards.
TPU's figures curiously contrast with other reviews in some games (that's why I assumed SAM was off):

[benchmark screenshots attached]
UPDATE: nvm, it's 1% min vs average.

but it's burning through too much power for a mid-range card.
Huh, 215 W is "too high" for a mid-range card....
 
Browsed a lot of reviews. To me it is meh, more of a 3060 Ti-level card than a 3070-level one. And that's before factoring in DLSS 2.0 and the better DXR performance on the green team.
Wait for AMD's secret sauce.
 
What's the deal with encoders?
Live streaming as well as game capture and various other uses. Personally I use it with an Nvidia SHIELD, so I can play games rendered on my PC but with the video stream playing on my TV, with an exceptionally small latency penalty and little IQ loss.

NVENC is well ahead in maturity for these features. Some won't ever use it, so it might not be a selling point, but it will factor quite heavily into choices this generation: if Nvidia's software stack and things like more mature/performant/optimised RTX, DLSS, NVENC and CUDA support in applications are important to you, they have a noticeably stronger value proposition here.

Comparatively, AMD's stronger value perhaps lies in larger VRAM buffers across various products, and stronger relative performance in extremely CPU-limited gaming scenarios.

So when choosing, one must weigh up all sorts of things: the features you might want to use once you have the product, architectural strengths and weaknesses, the games/settings/resolutions you'll want to play at, how long you might keep the card, the specs of the system it's going into... etc.
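For anyone wondering what "worse encoder" actually means day to day: both vendors expose their hardware encoders through ffmpeg (NVENC as h264_nvenc/hevc_nvenc, AMD's block via AMF as h264_amf/hevc_amf), and the difference shows up as image quality and latency at a given bitrate. A minimal, hypothetical capture sketch, assuming an ffmpeg build with the relevant encoder compiled in and Windows desktop capture; flags will need tweaking for your setup:

# record the desktop for 60 seconds using a hardware video encoder via ffmpeg
# (run "ffmpeg -encoders" first to see which hardware encoders your build supports)
import subprocess

def record_desktop(encoder: str, outfile: str, seconds: int = 60) -> None:
    cmd = [
        "ffmpeg", "-y",
        "-f", "gdigrab", "-framerate", "60", "-i", "desktop",  # Windows screen capture
        "-t", str(seconds),
        "-c:v", encoder,   # "h264_nvenc" on GeForce, "h264_amf" on Radeon
        "-b:v", "20M",     # bitrate typical for local in-home streaming
        outfile,
    ]
    subprocess.run(cmd, check=True)

record_desktop("h264_nvenc", "capture_nvenc.mp4")  # or record_desktop("h264_amf", "capture_amf.mp4")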

Wait for AMD's secret sauce.
I am keen as mustard to see what they can manage, and there is a tonne of unknowns, like how much it could boost performance, how much IQ is lost/retained... but for the time being, said feature is conclusively absent.
 
No power consumption of 3060 in the graphs? Why?
Because I have no reference card, just a near-reference card flashed with the original reference BIOS

TPU's figures curiously contrast with other reviews in some games (that's why I assumed SAM was off)
SAM is indeed enabled. Updated the specs table in all reviews. Maybe it's because I'm using actual gameplay in AC:V, and not the integrated benchmark, which they might have optimized SAM for?

Wait, what? 6900 XT peak power of 619 W during a spike??????
That reading is accurate, I've verified it several times. I also tested with a cheapish 700 W PSU, and it shuts off every 3rd time I run the test
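To put that in perspective, here's a rough back-of-the-envelope estimate of why a nominally sufficient PSU can trip on a millisecond-scale spike; only the GPU figure is from the review chart, the CPU and rest-of-system numbers are assumptions for illustration, and how much transient overload a PSU tolerates before its protection kicks in varies by model:

# rough transient-load estimate (illustrative numbers, only the GPU spike is from the review)
gpu_spike_w   = 619   # measured 6900 XT 20 ms power spike
cpu_w         = 140   # assumed CPU package power under gaming load
rest_of_sys_w = 50    # assumed fans, drives, RAM, motherboard

transient_w = gpu_spike_w + cpu_w + rest_of_sys_w
print(f"worst-case transient draw: {transient_w} W")                        # ~809 W
print(f"overload vs a 700 W PSU: {transient_w / 700 - 1:.0%} over rating")  # ~16% over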

View attachment 192862
On which PCIe port do you stick the red part of the config?
I felt this was an adequate spot to put it, or people would keep asking "wut? why no 5900X?"
 
I am keen as mustard to see what they can manage, and there is a tonne of unknowns, like how much it could boost performance, how much IQ is lost/retained... but for the time being, said feature is conclusively absent.
I note they are working on FidelityFX, which is their equivalent to DLSS, to improve "visual quality", to cite their own words.
 
I feel AMD gimped too much on the Navi 22 in terms of CUs, and pushed too hard on the clockspeed. While a high clockspeed is nice, but its burning through too much power for a mid range card. While we are unlikely to see this at MSRP, still I feel the MSRP is also too high. I think people may be better off sticking to a RX 6800 if they want to stay with AMD, or consider a RTX 3070/ 3060 instead.
It just seems like RDNA2 is RDNA1 with mediocre DXR support, tweaked for higher clocks. The Infinity Cache seems to have no effect at lower resolutions and only makes a difference at 4K, which is a bit silly because the 6700 XT is already at unplayable framerates in several of the tested games at 4K. Who cares if the cache improves performance by 30% when you're still only getting 24.1 fps?

If you look at some of the heavily factory-overclocked 5700 XT models with 2150 MHz clock speeds, they aren't doing a whole lot worse than the ~2450 MHz 6700 XT reference. I get the impression that a 1900 MHz 6700 XT would be close enough to a 5700 XT that you'd struggle to see the difference in a side-by-side comparison; you'd question whether there was actually an improvement or whether it was just within margin of error.

People are arguing that RDNA2 is a huge architectural leap forward, but to me the results sure look like most of the performance gained over the 5700 XT is proportional to the clock speed, meaning that IPC gains are close to zero and power efficiency takes a massive hit from running at those higher clocks.
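A rough sanity check of that argument, using this review's ~31% average gain and the approximate clocks above as assumed inputs (scaling is never perfectly linear with clock, so treat the result as ballpark only):

# ballpark per-clock ("IPC") gain estimate, 6700 XT vs 5700 XT (assumed figures)
perf_ratio  = 1.31          # ~31% faster on average in this review
clock_ratio = 2450 / 1900   # ~2450 MHz 6700 XT reference vs ~1900 MHz 5700 XT boost

per_clock_gain = perf_ratio / clock_ratio - 1
print(f"implied per-clock gain: {per_clock_gain:.1%}")   # only a couple of percent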
 