# AMD Radeon RX 6700 XT: All You Need to Know



## btarunr (Mar 3, 2021)

AMD today announced the Radeon RX 6700 XT, its fourth RX 6000-series graphics card based on the RDNA2 graphics architecture. The card debuts the new 7 nm "Navi 22" silicon, which is physically smaller than the "Navi 21" powering the RX 6800/RX 6900 series. The RX 6700 XT maxes out "Navi 22," featuring 40 RDNA2 compute units, amounting to 2,560 stream processors. These run at a maximum Game Clock frequency of 2424 MHz, a significant clock-speed uplift over the previous generation. The card comes with 12 GB of GDDR6 memory across a 192-bit wide memory interface. It uses 16 Gbps GDDR6 memory chips, so the memory bandwidth works out to 384 GB/s. The chip packs 96 MB of Infinity Cache on-die memory, which works to accelerate the memory sub-system. AMD is targeting a typical board power of 230 W. The power input configuration for the reference-design RX 6700 XT board is 8-pin + 6-pin.
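For readers who want to check the math, the shader count and memory bandwidth follow directly from the quoted specs (a quick sketch; the 64-shaders-per-CU figure is the standard RDNA2 layout):

```python
# Quick sanity check of the RX 6700 XT headline specs quoted above.

CUS = 40                # RDNA2 compute units on a fully enabled "Navi 22"
SHADERS_PER_CU = 64     # standard RDNA2 layout
MEM_SPEED_GBPS = 16     # per-pin GDDR6 data rate
BUS_WIDTH_BITS = 192    # memory interface width

stream_processors = CUS * SHADERS_PER_CU
bandwidth_gb_s = MEM_SPEED_GBPS * BUS_WIDTH_BITS / 8  # bits -> bytes

print(stream_processors)   # 2560
print(bandwidth_gb_s)      # 384.0
```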

AMD is marketing the RX 6700 XT as a predominantly 1440p gaming card, positioned a notch below the RX 6800. The company makes some staggering performance claims. Compared to the previous generation, the RX 6700 XT is shown beating the GeForce RTX 2080 Super. NVIDIA marketed the current-gen RTX 3060 Ti as having the same performance outlook. Things get interesting where AMD shows that in select games, the RX 6700 XT can even beat the RTX 3070, a card NVIDIA marketed as matching its previous-gen flagship, the RTX 2080 Ti. AMD is pricing the Radeon RX 6700 XT at USD $479 (MSRP), which is very likely to be bovine defecation, given the prevailing market situation. The company announced a simultaneous launch of its reference-design and AIB custom-design boards, starting March 18, 2021.




AMD's performance claims follow.





 





----------



## KoRnMaN (Mar 3, 2021)

will you be able to actually buy one?


----------



## IbaChiba (Mar 3, 2021)

my guess is these will cost about 700+ CAD and may not reach their MSRP until the launch of the 7000 series


----------



## dirtyferret (Mar 3, 2021)

_"The company makes some staggering performance claims. Compared to the previous generation, the RX 6700 XT is shown beating the GeForce RTX 2080 Super."_

The card doesn't exist; why not have a Billy Mays type (RIP) come out and say "it slices and dices, safe on color fabrics, and if you order one now we won't send you a second one free even if you pay shipping and handling. Operators are not standing by!"


----------



## kajson (Mar 3, 2021)

KoRnMaN said:


> will you be able to actually buy one?


Our Dutch hardware info website claims the yields of the 6700 XT are extremely good, so good that they will not launch a 6700 soon because they barely have any sub-optimal dies. They also claim these cards will be available in significantly higher volumes than other cards. But with the current market and mining craze, you don't know if it will be enough to overcome the scalpers and really make a dent in the current shortage of latest-gen cards.


----------



## Chrispy_ (Mar 3, 2021)

IMO 230W is too much for my tastes on a 335 mm2 die.

I'm interested in seeing how well this performs at 150W. If it can run north of 2GHz whilst sipping power, I'll buy one - assuming I _can_ actually buy one.


----------



## silentbogo (Mar 3, 2021)

All the puzzle pieces are finally coming together. Got a 3800X in a trade, got Smart Access Memory support for my board, now I can sell my RTX 2060S for double profit and chill till next Christmas on my backup 1070. Would be nice to have Gigabyte re-enable PCIe 4.0 support on my X470 Aorus Ultra...


----------



## sepheronx (Mar 3, 2021)

IbaChiba said:


> my guess is these will cost about 700+ CAD and may not reach their MSRP until the launch of the 7000 series



Probably more.  RX 6800 was supposed to be about the same price, not much more, than 3070.  Ended up costing almost $200 average more than a 3070 here in Canada.


----------



## SamuelL (Mar 3, 2021)

Assuming I could get one of these close(ish) to MSRP, would this be any kind of upgrade to a 1080ti? If I could get one around MSRP, then I could likely sell the 1080ti for about the same price. Just don’t know if it would be worth all the effort... I’m debating if I should just wait for the next round of future GPUs in 6-12 months given the current pricing disaster.

Opinions?


----------



## sepheronx (Mar 3, 2021)

SamuelL said:


> Assuming I could get one of these close(ish) to MSRP, would this be any kind of upgrade to a 1080ti? If I could get one around MSRP, then I could likely sell the 1080ti for about the same price. Just don’t know if it would be worth all the effort... I’m debating if I should just wait for the next round of future GPUs in 6-12 months given the current pricing disaster.
> 
> Opinions?


If you can get one, and at the mentioned MSRP, then it would be good.  But yeah, it would be better than 1080ti.  If it matches around 3070 performance, then yeah, good jump.


----------



## Durvelle27 (Mar 3, 2021)

The fact that AMD's claims pit this against the RTX 3070 is astonishing. That basically means in raw power everything AMD has to offer beats out NVIDIA at better prices up until the RTX 3080.

So once the RX 6700 drops, it stands to reason it would be faster than the RTX 3060 Ti.

This is a very interesting Generation for PC Gamers.


----------



## RandAlThor (Mar 3, 2021)

> AMD is pricing the Radeon RX 6700 XT at USD $479 (MSRP), which is very likely to be bovine defecation, given the prevailing market situation.


Haha, you got that right, and you made me lol more than if you'd used the shorter, more vulgar word that means the same thing. Good job.


----------



## Aquinus (Mar 3, 2021)

Durvelle27 said:


> This is a very interesting Generation for PC Gamers.


Yeah, assuming:


KoRnMaN said:


> will you be able to actually buy one?





Chrispy_ said:


> assuming I _can_ actually buy one.





sepheronx said:


> If you can get one


----------



## catulitechup (Mar 3, 2021)

Durvelle27 said:


> This is a very interesting Generation for PC Gamers.



This is a very interesting Generation for Miners


----------



## londiste (Mar 3, 2021)

I bet the performance figures are with SAM. Marketing stuff.
AMD already announced SAM coming to Ryzen 3000.
We'll see if NVIDIA pushes its schedule forward; they initially said late March.


----------



## birdie (Mar 3, 2021)




AMD has surely lost any remnants of conscience. A 192bit wide GPU for a nice price of $500.

OMFG.

This world has gone crazy. Competition? Customers first? Modest margins? Never heard of it!

AMD is the new Apple. It became obvious when they announced the pricing for the Ryzen 5000 series.



Edit: to be honest, the entire consumer graphics market has gone crazy. If you want to game, you'd be better off buying a console. Performance/price for AMD has barely moved vs the RDNA1 generation, which is just bonkers. We used to pay the same to get a lot more performance, but not in today's market.


----------



## Legacy-ZA (Mar 3, 2021)

Well, yes, I suppose we will see the RTX 3070 Ti announced soon. Not that I am optimistic that one would actually be able to buy one, never mind the RX 6700.


----------



## windwhirl (Mar 3, 2021)

catulitechup said:


> This is a very interesting Generation for Miners


Not really on AMD's side. The Radeon VII is still more profitable than any of the RDNA2 cards, apparently.


----------



## HD64G (Mar 3, 2021)

birdie said:


> AMD has surely lost any remnants of conscience. A 192bit wide GPU for a nice price of $500.
> 
> OMFG.
> 
> ...


Before Apple, there was Intel for CPUs and NVIDIA for GPUs, which have milked PC owners for a decade now. AMD hasn't surpassed the $1,000 mark in the desktop CPU or GPU segment yet, while NVIDIA did so with the 2080 Ti and 3090. And Intel had kept 6 cores on workstation CPUs that also cost over $1K. So, until AMD becomes as greedy as those two companies, hold your horses...


----------



## catulitechup (Mar 3, 2021)

windwhirl said:


> Not really on AMD's side. The Radeon VII is still more profitable than any of the RDNA2 cards, apparently.











GPU mining ⛏️ | minerstat (minerstat.com): Most profitable GPUs and their hashrates. Compare GPUs by power consumption, efficiency, profitability, and coins.




Apparently this card can have the same or more mining performance as the RX 5700 XT, and the RX 5700 XT stays very close to the RTX 3060 Ti.

In summary, RX 6700 XT: when the $500 mining begins


----------



## RedelZaVedno (Mar 3, 2021)

40 CU/2560 SP at up to 2424 MHz on a 251 mm² die for $479??? That's a Polaris-sized GPU, for God's sake!
It'll be 12.4 TFLOPS at best... so 25% faster than the 5700 XT for a 20% higher MSRP and 50% higher AIB prices. 5% added value in the best-case scenario and a 45% price-to-performance regression in reality. AMD, go eat Sh/t, you've become worse than Ngreedia!


----------



## dirtyferret (Mar 3, 2021)

birdie said:


> AMD has surely lost any remnants of conscience. A 192bit wide GPU for a nice price of $500.
> 
> OMFG.
> 
> ...



To paraphrase the infamous Greek philosopher Kayne Westodoulopoulos: AMD & NVIDIA don't care about PC gamers, they only care about their stock price.


----------



## Chris34 (Mar 3, 2021)

Thanks, I have enough paper (launch) to wipe my behind for a century now.


----------



## Tom Yum (Mar 3, 2021)

RedelZaVedno said:


> 40CU/2560SP up to 2424Mhz on 251 mm2 die for $479??? That's Polaris sized GPU for God sake!
> It'll be 12,4 TFlops at best... so 25% faster than 5700XT for 20% higher MSRP and 50% higher AIB prices. 5% added value in best case scenario and 45% price to performance regression in reality. AMD go eat Sh/t you've become worse than Ngreedia!


Where did you get Navi 22 being 251 mm² from? That is the size of the 5700 XT (Navi 10). Navi 22 is apparently 334 mm² (Navi 21, aka the 6800/6900, is 520 mm²). It is also on a much more expensive process than Polaris (7 nm versus 12 nm), so the die cost is likely double that of Polaris.


----------



## dirtyferret (Mar 3, 2021)

catulitechup said:


> This is a very interesting Generation for Miners


PC gaming and fishing ain't what it used to be


----------



## windwhirl (Mar 3, 2021)

RedelZaVedno said:


> 40CU/2560SP up to 2424Mhz on 251 mm2 die for $479??? That's Polaris sized GPU for God sake!
> It'll be 12,4 TFlops at best... so 25% faster than 5700XT for 20% higher MSRP and 50% higher AIB prices. 5% added value in best case scenario and 45% price to performance regression in reality. AMD go eat Sh/t!


TSMC stopped giving volume discounts, for starters. And newer nodes are more expensive than older ones.



Tom Yum said:


> Where did you get Navi22 being 251mm2 from? That is the size of the 5700XT (Navi 12). Navi 22 is apparently 334mm2 (Navi 21 aka 6800/6900 is 520mm2). It is also on a much more expensive process than Polaris (7nm versus 12nm) so the die cost is likely double that of Polaris.


On that, some analysts believe the wafer cost for 7nm is more than double the cost for 16/12 nm.








Alleged Prices of TSMC Silicon Wafers Appear (www.techpowerup.com): TSMC, one of the biggest silicon manufacturers in the world, usually doesn't disclose company pricing of the silicon it manufactures and only shares that with its customers. It appears that RetiredEngineer (@chiakokhua on Twitter) got a hold of the pricing of TSMC's wafers on every manufacturing...

----------



## Relixo (Mar 3, 2021)

Title: "AMD Radeon RX 6700 XT: All You Need to Know"
_In the reality this is what we all have to know:_ *Miners will buy ALL the cards!!!*
So, go to hell miners aaaaaand


----------



## RedelZaVedno (Mar 3, 2021)

Tom Yum said:


> Where did you get Navi22 being 251mm2 from? That is the size of the 5700XT (Navi 12). Navi 22 is apparently 334mm2 (Navi 21 aka 6800/6900 is 520mm2). It is also on a much more expensive process than Polaris (7nm versus 12nm) so the die cost is likely double that of Polaris.


The 6900 XT has a 520 mm² die... Halve it and you get to 260; subtract the smaller Infinity Cache and you get to RDNA1's die size. The other numbers are from AMD, given to Steve at Gamers Nexus before the presentation. It's overpriced any way you spin it. TSMC's 7 nm process is very mature by now and we're talking about a really small die here, so the number of non-functional dies per wafer must be really low, meaning it's cheap to produce. The 6700 XT could cost 350 bucks and AMD would still make a higher profit margin on it than it did on Vega and even Polaris. It's clearly a rip-off not worth paying, like everything else on today's dGPU market. I hate people who defend these two greedy corporations without even owning their stocks. Profit margins of both have been skyrocketing this decade and they still want to milk us more. We're voting with our wallets here. There will be no back-to-normal once the mining craze ends IF gamers buy GPUs at these MSRPs or higher. I've been buying GPUs since the early '90s and the dGPU market has never been worse than it is today.


----------



## sepheronx (Mar 3, 2021)

Relixo said:


> Title: "AMD Radeon RX 6700 XT: All You Need to Know"
> _In the reality this is what we all have to know:_ *Miners will buy ALL the cards!!!*
> So, go to hell miners aaaaaand
> View attachment 190789


The RX 6800 isn't really good price/performance for miners. A 6800 XT nets you what a 3070 does, and it seems to be roughly $300+ more expensive than a 3070 here in Canada.


----------



## Tom Yum (Mar 3, 2021)

RedelZaVedno said:


> 6900XT has 520 mm² die... Half it and you get to 260 minus smaller infinity cache and you get to RDNA1's die size. Other numbers are from AMD given to Steve at Gamers Nexus before presentation. It's overpriced every way you spin it. TSMC's 7nm process is very mature by now and we're talking about really small die size here, so number of non-functional dies per wafer must be really low, meaning it's cheap to produce. 6700XT could cost 350 bucks and AMD would still make higher profit margin on it as it did with Vegas and even Polaris. It's clearly a rip off not worth paying like everything else on today's dGPU market. I hate people who defend these 2 greedy corporations and not even owning their stocks. We're voting with our wallets. There will be no back to normal once mining craze ends IF gamers buy GPUs at these MSRPs or higher.


It doesn't work that way, there is much more to a GPU than just shaders, and even the Infinity cache is more than half Navi 21 (96MB vs 128MB). So Navi 22 is definitely more than 260mm2. And the maturity of 7nm is irrelevant, the shortage is not because of low yield, it's because TSMC can only make ~140,000 7nm wafers a month, and that has to satisfy the world's demand for 7nm (PS5, XBSX, 3000/5000 series Ryzen, Epyc, 4000 series Ryzen mobile, various Qualcomm, MediaTek, Broadcom, etc).

The days of yearly die shrinks on similarly priced fabrication processes are over. $/transistor is going up with each node, that is just a reality brought on by physics. Could AMD sell a bit cheaper? Sure, and it wouldn't make a lick of difference to the prices we pay. The only thing that will drive prices down is competition and supply commensurate with demand. We finally have competition, we just need supply from TSMC/Samsung to increase which can take years unfortunately.


----------



## kruk (Mar 3, 2021)

RedelZaVedno said:


> TSMC's 7nm process is very mature by now and we're talking about really small die size here, so number of non-functional dies per water must be really low, meaning it's cheap to produce. 6700XT could cost 350 bucks and AMD would still make higher profit margin on it as it did on Vegas and even Polaris. It's clearly a rip off not worth paying like everything else on today's dGPU market.


Yeah, but NVIDIA dictates MSRP prices and they just follow. Since wafers are limited, it wouldn't make sense for them to go into a price war, especially since they can get many more CPUs from a wafer than GPUs (Zen 3 dies are only ~80 mm²) and sell them at bigger margins.


----------



## TheinsanegamerN (Mar 3, 2021)

birdie said:


> AMD has surely lost any remnants of conscience. A 192bit wide GPU for a nice price of $500.
> 
> OMFG.
> 
> ...


Ummm... quite frankly, WGAF? It could be 32-bit for all I care; if it performs, it performs. I thought that would have been pretty obvious when the GeForce 980 was trading blows with the Radeon 290 despite the 980 having only a 256-bit bus compared to Hawaii's 512-bit.

This is a card that, if it is faster than a 3070, will be twice the speed of a Vega 64 with more VRAM for $479. Vega was $650 in 2016, as was the 1080. Doubling performance three tiers down from the high end in two generations is pretty good considering how much trouble AMD has had lately with GPUs, and it is a surprising jump over the 5700 XT, which has the same core count.


----------



## RedelZaVedno (Mar 3, 2021)

Tom Yum said:


> It doesn't work that way, there is much more to a GPU than just shaders, and even the Infinity cache is more than half Navi 21 (96MB vs 128MB). So Navi 22 is definitely more than 260mm2. And the maturity of 7nm is irrelevant, the shortage is not because of low yield, it's because TSMC can only make ~140,000 7nm wafers a month, and that has to satisfy the world's demand for 7nm (PS5, XBSX, 3000/5000 series Ryzen, Epyc, 4000 series Ryzen mobile, various Qualcomm, MediaTek, Broadcom, etc).
> 
> The days of yearly die shrinks on similarly priced fabrication processes are over. $/transistor is going up with each node, that is just a reality brought on by physics. Could AMD sell a bit cheaper? Sure, and it wouldn't make a lick of difference to the prices we pay. The only thing that will drive prices down is competition and supply commensurate with demand. We finally have competition, we just need supply from TSMC/Samsung to increase which can take years unfortunately.


I'm not buying this argument. Just look at NVIDIA's profit margin graph: they went from 29% (2009) to 63% (2020). Similar is true for AMD after its Zen moment: from 22% to 45% in 5 years. TSMC jacking up prices and COVID supply chain problems are only partially to blame. The majority of price hikes over the last 6 years can be explained by AMD's & Ngreedia's skyrocketing profit margins, and people still defend them. I just don't get why. It's us who are getting milked. It's a duopoly bordering on monopoly, with the market maker setting prices and the weaker player going along with it in a cartel-like arrangement, not trying to compete for market share, so they both make more profits and consumers lose.


----------



## Taz100420 (Mar 3, 2021)

If this pulls the same hash rates as a 5700/XT and pulls about the same wattage, no miner will dump their 5700s and go to these. Unless the 6700s stay cheaper than what they bought their 5700s for; then they'll probably sell their 5700s and buy 6700s lol

I bought my 5700 for $299 in December 2019 myself. *If* the performance of the 6700 XT is on par with or greater than my 5700, I'd sell it, make twice the profit and get a 6700 XT if they don't sell out in the first few seconds lol. I don't see that happening though XD


----------



## Mistral (Mar 3, 2021)

IbaChiba said:


> my guess is these will cost about 700+ CAD and may not reach their MSRP until the launch of the 7000 series


It's not like that wouldn't be 600CAD before taxes even at MSRP...


----------



## Chrispy_ (Mar 3, 2021)

RedelZaVedno said:


> 40CU/2560SP up to 2424Mhz on 251 mm2 die for $479??? That's Polaris sized GPU for God sake!
> It'll be 12,4 TFlops at best... so 25% faster than 5700XT for 20% higher MSRP and 50% higher AIB prices. 5% added value in best case scenario and 45% price to performance regression in reality. AMD go eat Sh/t you've become worse than Ngreedia!


It's a bigger die than that. VideoCardz and Igor's Lab have both had confirmation that the die area is 335 mm².
Don't forget, there's a bunch of additional cache that the 5700 XT never had, and there's also all of the hardware ray tracing units.

Polaris was only 232 mm², and on top of that, GloFo 14 nm was an unpopular node at a time when wafer supply was good and GloFo had no other major commitments to other customers; AMD managed to get chips out of GloFo at a good price. TSMC's 7FF is a constrained, sought-after, premium node with a lot of bidders pushing the price up and TSMC basically getting whatever they ask for it.

*Polaris:*
- small die
- cheap node with no competition
- plenty of supply
- cheap PCB with simple VRMs
- cheap aluminium-extrusion cooler for 150-180 W cards

*Navi 22:*
- 44% larger die, despite a higher-density node
- the most expensive node currently available
- bidding against many competitors with much deeper pockets than AMD
- supply constraints everywhere pushing up prices
- manufacturing difficulties due to COVID and water shortages
- higher-end PCB with more layers and more complex power delivery
- more expensive vapor-chamber and multi-heatpipe soldered coolers for 230-350 W cards

This is a much closer comparison in terms of transistor count, die area, and board/cooler design to the $699 Radeon VII (7 nm Vega 64). If you want to make a Polaris comparison, wait for Navi 23, which should be much closer to the die size and transistor count of Polaris. TPU seems to have some leaked/rumoured specs here:








AMD Radeon RX 6600 XT Specs (www.techpowerup.com): AMD Navi 23, 2589 MHz, 2048 Cores, 128 TMUs, 64 ROPs, 8192 MB GDDR6, 2000 MHz, 128 bit
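The "44% larger die" figure in the list above checks out against the die areas quoted in this post (335 mm² for Navi 22, 232 mm² for Polaris 10):

```python
# Die-area comparison from the figures quoted in the post above.
navi22_mm2 = 335      # Navi 22 (RX 6700 XT)
polaris10_mm2 = 232   # Polaris 10 (RX 480/580)

print(round(navi22_mm2 / polaris10_mm2 - 1, 2))  # 0.44 -> ~44% larger
```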


----------



## B-Real (Mar 3, 2021)

Faster than the 3070 (I know they chose many AMD titles, but seeing the Borderlands 3, Gears 5, or WD: Legion results, overall I think it will be a bit better), having 4 GB more VRAM, and being cheaper? Well, this won't last long even if miners didn't buy it.



RedelZaVedno said:


> 40CU/2560SP up to 2424Mhz on 251 mm2 die for $479??? That's Polaris sized GPU for God sake!
> It'll be 12,4 TFlops at best... so 25% faster than 5700XT for 20% higher MSRP and 50% higher AIB prices. 5% added value in best case scenario and 45% price to performance regression in reality. AMD go eat Sh/t you've become worse than Ngreedia!




What are you talking about? This will be a bit faster than the 3070 (around +35% compared to the 5700 XT) given the benchmarks, plus 4 GB more VRAM, plus a lower MSRP.


----------



## TheinsanegamerN (Mar 3, 2021)

B-Real said:


> Faster than the 3070 (I know they chose many AMD titles, but seeing the Borderlands 3, Gears 5 or WD: Legion results, overall I think it will be a bit better), having +4GB VRAM and being cheaper? Well, this won't last for long even if miners wouldn't buy it.
> 
> 
> 
> ...


Here's the issue: the 5700 XT was $399 at launch. This is "promised" at $479.

So you get 35% more performance, but it costs 20% more. The real-world price will likely be closer to 40-50% more than a 5700 XT, so at IRL pricing this will likely end up another Turing-level generation where price/perf barely moves.


----------



## MxPhenom 216 (Mar 3, 2021)

birdie said:


> AMD has surely lost any remnants of conscience. A 192bit wide GPU for a nice price of $500.
> 
> OMFG.
> 
> ...



Holy f**** the exaggeration. Chill, go take a xanax or something


----------



## RedelZaVedno (Mar 3, 2021)

Chrispy_ said:


> It's a bigger die than that. VideoCardz and Igor's Lab have both had confirmation that the die area is 335 mm^2
> Don't forget, there's a bunch of additional cache that the 5700XT never had, and there's also all of the hardware raytracing units.
> 
> Polaris was only 232mm^2 and on top of that GloFo 14nm was an unpopular node at a time when wafer supply was good and GloFo had no other major committments to other customers; AMD managed to get chips out of GloFo at a good price. TMSC's 7FF is a constrained, sought-after, premium node with a lot of bidders pushing the price up and TSMC basically getting whatever they ask for it.
> ...


Then the RDNA2 arch has to have a very large design. The 3060 Ti has 38 SMs and is 'only' 392 mm² on a de facto 10 nm die. I'm still standing behind my estimate of 250 to 275 mm². We'll see who's right soon enough.



B-Real said:


> What are you talking about? This will be a bit faster than the 3070 (around +35% compared to the 5700 XT) given the benchmarks + having +4GB VRAM + have lower MSRP.


I don't know where you got the 35% number, but from my calculations, based on AMD's own figures (40 CU/2560 SP at up to 2424 MHz), it will have 12.4 TFLOPS performance at best... that's around 25% faster than the 5700 XT... The benchmarks AMD showed were with SAM enabled on the 6700 XT. We'll have to wait for third-party independent benchmarks first, but again, the math tells me it'll probably be on par with the 3060 Ti (or within 5%) with near-zero room for OC.

I'm far from being Ngreedia's fanboy (I bought a 2900 GT, 4870, 6870, R9 290, and RX 480 in the past), but I have to say it: the $479 MSRP is a joke. No DLSS, worse RT, a worse encoder, no supported Adobe Premiere/After Effects GPU acceleration... The only plus over the 3060 Ti I can see is the extra 4 gigs of VRAM. This GPU should be priced at $349-379 imho.
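The arithmetic behind that 12.4 TFLOPS figure is straightforward (a sketch assuming the usual convention of 2 FP32 ops per shader per clock, counting an FMA as two operations, and the 5700 XT's 1905 MHz boost clock for comparison; as others note in the thread, TFLOPS don't map directly to frame rates across architectures):

```python
# Peak FP32 throughput: shaders * 2 ops/clock (FMA counted as two) * clock.

def tflops(shaders: int, clock_mhz: float) -> float:
    """Peak single-precision TFLOPS, counting an FMA as two operations."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

rx6700xt = tflops(2560, 2424)   # quoted maximum Game Clock
rx5700xt = tflops(2560, 1905)   # 5700 XT boost clock, for comparison

print(round(rx6700xt, 2))                 # 12.41
print(round(rx5700xt, 2))                 # 9.75
print(round(rx6700xt / rx5700xt - 1, 3))  # 0.272 -> ~27% on paper
```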


----------



## Manoa (Mar 3, 2021)

AMD drivers suckx: All You Need to Know


----------



## Tom Yum (Mar 3, 2021)

RedelZaVedno said:


> Then RDNA2 arch has to have a very large design. 3060TI has 38CU and is 'only' 392 mm2 in size on de facto 10nm die. I'm still standing behind my estimation of 250 to 275 mm2. We'll see who's right soon enough.
> 
> 
> I don't know where you got 35% number, but from my calculations, based on AMDs own figures (40CU/2560SP up to 2424Mhz) it will have 12,4 TFlops performance at best... that's around 25% faster than 5700XT... Benchmarks AMD showed were with SAM enabled on 6700XT. We'll have to wait for 3rd party independent benchmarks first, but again math tells me it'll probably be on pair with 3060TI with near zero room for OC.


We'll have to wait until reviews, but you also seem to be forgetting the 5700 XT is RDNA1 and the 6700 XT is RDNA2, which brought in significant changes that increase the performance of each CU. You can't just scale the 5700 XT by the 6700 XT's clock speed to estimate performance. You also can't translate TFLOPS figures to framerates across different architectures.


----------



## Camm (Mar 3, 2021)

birdie said:


> AMD has surely lost any remnants of conscience. A 192bit wide GPU for a nice price of $500.



Plus 96 MB of Infinity Cache, which, as AMD detailed for the 256-bit + 128 MB 6800 series, effectively extends bandwidth capacity, reducing the need for such wide buses. That 96 MB of cache isn't exactly friendly to put on-die (it chews up a fair bit of space), so your assertions are pretty baseless.

The bigger issue IMO is that the card is rated 10 W higher than the 3070 without quite matching its performance, likely pointing to these cards being pretty heavily clocked. I need a card for the HTPC, so I might grab one before the prices skyrocket like every other card launch, but I have serious questions about how good this card actually is.


----------



## Chrispy_ (Mar 3, 2021)

RedelZaVedno said:


> Then RDNA2 arch has to have a very large design. 3060TI has 38CU and is 'only' 392 mm2 in size on de facto 10nm die. I'm still standing behind my estimation of 250 to 275 mm2. We'll see who's right soon enough.
> 
> 
> I don't know where you got 35% number, but from my calculations, based on AMDs own figures (40CU/2560SP up to 2424Mhz) it will have 12,4 TFlops performance at best... that's around 25% faster than 5700XT... Benchmarks AMD showed were with SAM enabled on 6700XT. We'll have to wait for 3rd party independent benchmarks first, but again math tells me it'll probably be on pair with 3060TI with zero room left to OC it.


Ampere doesn't have an additional 96 MB of cache on die, so yes, it's a smaller design "per compute unit" than Ampere. All Ampere cards are Samsung 8 nm, btw; the only 10 nm GPUs I'm aware of are Intel's Xe dGPUs.

You don't need to see who's right: I wasn't _guessing_ the die size. Navy Flounder (the codename for Navi 22) was confirmed as a 335 mm² die when information intended for regulators was leaked last year. Since then we've had additional confirmation from manufacturing leaks and photos. Sapphire, who make the 6700 XT reference cards, have had these things out in the wild for over a month already. For it to be available to buy in the US in three weeks' time, the finished product had to leave the factory a month ago.


----------



## RedelZaVedno (Mar 3, 2021)

Tom Yum said:


> We'll have to wait until reviews, but you are also seemingly forgetting 5700XT is RDNA1, 6700XT is RDNA2 which brought in significant changes that increases the performance of each CU. You can't such scale 5700XT by the 6700XT clock speed to make an estimate of performance. You also can't translate Tflop figures to framerate on different architectures.


True, but I can extrapolate from the 6800(XT)/6900(XT) numbers. I know most games prefer higher frequency over a larger number of shading units for now, but I still can't come to +35% over the 5700 XT. All I come up with is +30%, with lots of math and mental gymnastics. That's why I still stand behind my within-5%-of-3060 Ti performance number.



Chrispy_ said:


> Ampere doesn't have an additional 96MB of cache on die, so yes, it's a smaller design "per compute unit" than RDNA2. All Ampere cards are Samsung 8nm, btw.
> 
> You don't need to see who's right - I wasn't _guessing_ the die size, Navy flounder (codename for Navi22) was confirmed a 335mm die when information intended for regulators was leaked last year. Since then we've had additional confirmation from manufacturing leaks and photos. Sapphire, who make the 6700XT reference cards, have had these things out in the wild for over a month already. For it to be available to buy in the US in three weeks time, the finished product had to leave the factory a month ago


Samsung's '8 nm' roughly equals TSMC's 10 nm in density (Coreteks had quite an in-depth analysis of both nodes). Yeah, I see it now, it's 334.54 mm², but still not a whole lot bigger than the full RDNA1 die.


----------



## windwhirl (Mar 3, 2021)

Manoa said:


> AMD drivers suckx: All You Need to Know​


Not gonna lie, everyone always complained about a million problems while I was always just fine here lol


----------



## Chrispy_ (Mar 3, 2021)

RedelZaVedno said:


> Samsung's '8 nm' roughly equals TSMC's 10 nm in density (Corteks had quite in deep analysis of both nodes). Yeah I see it now, it's 334.54 mm2, but still not a whole lot bigger than full RDNA1 die.


Oh right, that's where you were going with the 10nm comparison....
I wouldn't get too worried about die-size comparisons that aren't on the same process. Transistor count is a more accurate gauge of how "high-end" any given product is, though you do have to account for major shifts in architecture, such as Pascal vs Turing, which added an enormous number of extra transistors for RT cores and Tensor cores.


----------



## medi01 (Mar 3, 2021)

Chrispy_ said:


> IMO 230W is too much for my tastes on a 335 mm2 die.
> 
> I'm interested in seeing how well this performs at 150W. If it can run north of 2GHz whilst sipping power, I'll buy one - assuming I _can_ actually buy one.


I wonder what you say about 3070.



londiste said:


> I bet performance figures are with SAM


When performance figures are with SAM, AMD explicitly states so.


----------



## Bruno_O (Mar 3, 2021)

Chrispy_ said:


> IMO 230W is too much for my tastes on a 335 mm2 die.
> 
> I'm interested in seeing how well this performs at 150W. If it can run north of 2GHz whilst sipping power, I'll buy one - assuming I _can_ actually buy one.


Agreed.
Happy I went with the PowerColor 6800 Fighter (non-XT); with some undervolting I'm running at <200 W total on my HTPC (great card btw, awesome value in AU/NZ).


----------



## Chrispy_ (Mar 3, 2021)

medi01 said:


> I wonder what you say about 3070.


Same thing. The 3070 actually has a reasonable performance/Watt but I'm simply not interested in >200W cards.
I like my PCs to be small and inaudible, so low power consumption is a pretty high priority for me.

I have plenty of super hot and noisy boxes at work with threadrippers or multiple 2080Ti cards in them but I game and watch movies with speakers, not headphones. I like maximum dynamic range and to hear all of the subtlest, quietest details over the sound of fans whirring.


----------



## pjl321 (Mar 3, 2021)

AMD didn't set the price of this card at $479, buyers set the price of this card at around ~$600.

I believe what AMD is doing (and don't really blame them I guess) is they know AIBs will sell this card for ~$600 no matter what they set the MSRP at. The market right now means no matter what the price is they will sell out instantly, so why bother pricing these cards really low? 

If AMD set the MSRP at $349 to be ultra competitive then AIBs would just make loads of money as the street price would still be ~$600, if AMD prices them at $479 then AMD & AIBs make loads of money. And you just have to hope some of that money goes into RDNA 3, 4 and 5 development. 

Manufacturers don't set the price of products; buyers set the price! There isn't a single card on the market right now that I would buy, so my money is staying in my pocket. Once the mining bubble bursts or production catches up to demand, I will look at cards and prices again. What everyone else does is up to them.

And who knows, maybe Intel will be the saviour of the PC gaming market!!!


----------



## Chrispy_ (Mar 3, 2021)

pjl321 said:


> AMD didn't set the price of this card at $479, buyers set the price of this card at around ~$600.
> 
> I believe what AMD is doing (and don't really blame them I guess) is they know AIBs will sell this card for ~$600 no matter what they set the MSRP at. The market right now means no matter what the price is they will sell out instantly, so why bother pricing these cards really low?
> 
> ...


Watch the LTT video that came out earlier today.
AMD's plan is to sell these DIRECT to consumers via AMD.COM to cut out the retail-chain queue-jumping.
So, the MSRP matters: Gamers should be able to *try* and buy these direct from AMD at AMD's MSRP.


----------



## vctr (Mar 3, 2021)

medi01 said:


> I wonder what you say about 3070.
> 
> 
> When performance figures are with SAM, AMD explicitly states so.


In fact, the benchmark numbers were with SAM on; GN asked AMD and they confirmed it. With SAM off I'd expect it to be a bit slower, or the same performance. The 3060 Ti costs 20% less and should have about 10% less performance, but I think the 6700 XT will be on par with or a bit slower than the 3070.


----------



## mechtech (Mar 3, 2021)

"AMD Radeon RX 6700 XT: All You Need to Know"

Unavailable and expensive.

Also food for thought...........basically the same shader count (2,560) as a 5700 XT, but with 4 GB more RAM and a narrower 192-bit bus. The 5700 XT was $399 at launch, so I guess two years of inflation and 4 GB of extra RAM is $80 US?

edit 2 - found something, albeit 2 yrs old








GDDR6 significantly more expensive than GDDR5 (www.guru3d.com)
Electronic Components Dealers lists various Micron GDDR5 and GDDR6 chips, pricing for 2,000 units. From these it can be seen that GDDR6 is currently much more expensive than GDDR5 at the moment. Goi...


----------



## Minus Infinity (Mar 3, 2021)

Let's see: the 3070's RRP of $499 USD should be about $650-700 AUD, but instead it's $1100-1400, so I'm guessing the 6700 XT will be around $1000-1300 AUD. Total equine defecation. Screw them both; I gave up and got a 2080 Super, as new, at half the new price, perfectly fine for my 1440p gaming, and it should be at least as fast as the 6700 XT at ray tracing. I'm skipping both current-gen lineups and waiting for Lovelace or RDNA 3 for my upgrade.


----------



## Hachi_Roku256563 (Mar 4, 2021)

I agree.
I love using old, outdated drivers with missing features spread across two apps.
I love it when I change a setting and the app stops responding for a good minute,
or when it thrashes my hard drives and randomly pegs the CPU.
I love it when my drivers crash.
Oh wait, that's my 1060's drivers, not my 580's. My bad.


Manoa said:


> AMD drivers suckx: All You Need to Know​


----------



## defaultluser (Mar 4, 2021)

Durvelle27 said:


> The fact that AMD claims pit this against the RTX 3070 is astonishing. That basically means in raw power everything AMD has to offer beats out Nvidia at better prices up until the RTX 3080
> 
> So once the RX 6700 drops voice to reason it would be faster than the RTX 3060 Ti
> 
> This is a very interesting Generation for PC Gamers.


Not really. AMD's RDNA2 cards get stomped by anything Nvidia has when you turn on a little thing called RTX.
The 3070 matched the 6900 XT at 4K, and the 6700 XT is likely to trade blows with either the 3060 Ti or the 3060.

By the time you can actually buy these things, there will be so many RT-supported games that it's going to be pointless.


----------



## Bruno_O (Mar 4, 2021)

defaultluser said:


> there will be so many RT-supported games, it's going to be pointless.


haha good one


----------



## Hachi_Roku256563 (Mar 4, 2021)

defaultluser said:


> Not really.  AMD's RDNA2cards gets stomped by anything Nvidia when you turn on a little thing called RTX.
> The 3070 matched the 6900XT at 4k, and the 6700 XT is likely to trade-blows with either the 3060 Ti, or 3060.
> 
> By the time you can actually buy these things, there will be so many  RT-supported games, it's going to be pointless.


Ummm, no.
Until RTX-capable cards are over 50% of the market, basically no dev is probably going to integrate it.
It's expensive and a waste of time.


----------



## iO (Mar 4, 2021)

mechtech said:


> "AMD Radeon RX 6700 XT: All You Need to Know"
> 
> Unavailable and expensive.
> 
> ...


It's just more expensive to make, as the die is ~330 mm², about 33% larger than Navi 10, and it uses 16 Gbps memory chips instead of 14 Gbps. But prices creeping up constantly definitely sucks...


----------



## Hachi_Roku256563 (Mar 4, 2021)

ohhhh rgb dies


----------



## mechtech (Mar 4, 2021)

iO said:


> It's just more expensive to make as the die is ~330mm², so 33% larger than Navi10. Also 16Gbps memory chips instead of 14Gbps. But prices creeping up constantly definitely sucks...
> View attachment 190842


That much bigger?! Sigh... thanks, ray tracing.


----------



## wolf (Mar 4, 2021)

Durvelle27 said:


> The fact that AMD claims pit this against the RTX 3070 is astonishing. That basically means in raw power everything AMD has to offer beats out Nvidia at better prices up until the RTX 3080


Yes and no. In rasterization (which arguably matters *by far* the most), yes, absolutely. RT/AI, though? Not even close. The problem AMD has is that Nvidia will still outsell them significantly anyway, even if AMD's equivalently priced product has a slight edge; that ship takes a bloody long time to turn. If Nvidia even sniffed a significant threat, a price adjustment or Super refresh would be all it takes to keep shareholders and fans happy.



Durvelle27 said:


> This is a very interesting Generation for PC Gamers.


Well ain't that the understatement of the Generation   



RedelZaVedno said:


> AMD go eat Sh/t you've become worse than Ngreedia!


Yep the Avid Money-grubbing Division can certainly be just as bad if they can get away with it, like ANY company that wants to make money.



medi01 said:


> I wonder what you say about 3070.


I'm not sure of the point? The die is 396mm2 and TDP is 220w...



Chrispy_ said:


> Same thing. The 3070 actually has a reasonable performance/Watt but I'm simply not interested in >200W cards.


Considered undervolting for this use case? Given a TDP of 220 W for the 3070, you could foreseeably retain 95%+ (often 100%) of stock performance with a significant reduction in wattage, and thus heat/noise output; Ampere is a beast at UV. Having said that, I can see how starting from under 200 W is even better, and you could still layer UV on top of that. I genuinely don't know how well RDNA2 cards undervolt.


----------



## hardcore_gamer (Mar 4, 2021)

Doing some quick math: (52 CU / 40 CU) × (1.825 GHz / 2.424 GHz) = 0.98. The performance is similar to an Xbox Series X, which (the entire system) costs almost the same. What a time to be a PC gamer /s
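That back-of-the-envelope scaling can be sanity-checked in a few lines. This is only a sketch: it assumes performance scales linearly with both CU count and clock, which real games only roughly follow.

```python
# Naive throughput scaling: compute units x clock (GHz).
# Assumes linear scaling with both, a rough first-order model.
xbox_cu, xbox_ghz = 52, 1.825            # Xbox Series X GPU
rx6700xt_cu, rx6700xt_ghz = 40, 2.424    # RX 6700 XT Game Clock

ratio = (xbox_cu * xbox_ghz) / (rx6700xt_cu * rx6700xt_ghz)
print(f"Series X / 6700 XT raw-throughput ratio: {ratio:.2f}")  # -> 0.98
```

So on paper the two GPUs land within about 2% of each other in raw shader throughput, which is the point being made.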


----------



## 1d10t (Mar 4, 2021)

Yeah no, I'll stick with my card ATM.


----------



## GamerGuy (Mar 4, 2021)

If AMD sells the reference-design cards at MSRP (it seems to be a good design, as with all newer AMD and Nvidia cards these days), they'd be the first to sell out. The AIB cards will be sold at higher prices; that's a given, as no AIB sells at MSRP (that goes for Nvidia cards as well), and I expect prices around 600-700 USD given the present GPU shortage and those pesky miners. Given that some scalpers here are selling an RTX 3060 for a whopping 800 USD in my neck of the woods, I'd not be at all surprised if scalpers sold the 6700 XT at that price point or higher.

The local distributors of the various AIB cards will probably get a small supply, so I expect availability to be bad. It doesn't help that these cards will be sold by the various local brick-and-mortar shops in the local tech mall as part of bundled system builds. Based on my previous experience with the RX 6800/6900, many of these shops will sell these cards at scalpers' pricing. I mean, I almost fell off my chair when I saw the RTX 3070/RX 6800 being sold for 900 USD or higher.


----------



## laszlo (Mar 4, 2021)

"bovine defecation"   is a 1st on tpu as i know;  @btarunr you may use " alien defecation" to reflect also availability...bovine one we can found a lot....


----------



## medi01 (Mar 4, 2021)

wolf said:


> The die is


Who cares what the die size is?



wolf said:


> RT/AI though, not even close.


AMD beats NV in red-sponsored RT games and loses in green-sponsored games (in all two dozen or so of them).
"Not even close", chuckle.

"But if I drop the resolution, apply TAA-style anti-aliasing and buzzwords, and call it upscaled resolution"...


----------



## Shatun_Bear (Mar 4, 2021)

mechtech said:


> "AMD Radeon RX 6700 XT: All You Need to Know"
> 
> Unavailable and expensive.
> 
> ...



Yeah, at least. The 6700 XT has Infinity Cache, so its effective bandwidth far outstrips the 5700 XT's, but you chose to ignore that.

This is also a new architecture, so silicon size is totally meaningless.



hardcore_gamer said:


> Doing some quick math: (52 CU / 40 CU) × (1.825 GHz / 2.424 GHz) = 0.98. The performance is similar to an Xbox Series X, which (the entire system) costs almost the same. What a time to be a PC gamer /s



CU count is not really relevant here, as the Xbox Series X GPU is clocked so low (1.8 GHz). There's a reason the PS5 performs better in nearly every multiplatform game comparison despite having only 36 CUs.


----------



## mechtech (Mar 4, 2021)

Shatun_Bear said:


> Yeah, at least. As the 6700XT has Infinity Cache, so effective bandwidth far outstrips the 5700XT, *but you chose to ignore that*.
> 
> This is also a new architecture, so silicon size is totally meaningless.


And COVID and a bunch of other things, too. I agree there is a price to everything, performance included.

My point was that, historically, there was usually a decent increase in performance for about the same price between generations, like the old Radeon HD 3850 to HD 4850 to HD 5850. The performance gains are there, but the price increases haven't been incremental, historically speaking. (COVID, mining, supplies, tariffs, bigger ray-tracing dies, etc. have seen to that, unfortunately.)


----------



## Chrispy_ (Mar 4, 2021)

wolf said:


> Considered undervolting for this use case? given a TDP of 220w for the 3070, you could foreseeably retain 95%+ (often 100%) of the stock performance with a significant reduction in wattage and thus heat/noise output, Ampere are beasts at UV. Having said that I can see how starting from under 200w is even better, and then you could still layer UV on top of that. Genuinely don't really know how well RDNA2 cards undervolt.


I'll undoubtedly undervolt it if I get one, but I'm not actively trying to buy an Ampere card for myself and I doubt one will fall into my lap like they used to, because of the shortages this year.


----------



## Deleted member 205776 (Mar 4, 2021)

Chrispy_ said:


> I'll undoubtedly undervolt it if I get one, but I'm not actively trying to buy an Ampere card for myself and I doubt one will fall into my lap like they used to, because of the shortages this year.


Can vouch for undervolting. My 3070 Gaming X Trio drew 230-250W on stock at 1075mv, 1965-1980 MHz, 63-65C.

Undervolted to 2010 MHz @ 900mv stable. Draws 140-180W and temps remain under 60C which is insane (case fans don't even need to ramp up past 1000 rpm so very quiet system while gaming, which is a first for me). Stable in all 3DMark tests, steady 2010 MHz frequency and even 2025 MHz sometimes.

I was very surprised to see how well these Ampere cards undervolt. Or maybe I just got lucky... or MSI did some black magic.
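The power savings reported above line up with a first-order CMOS dynamic-power estimate, P ∝ f·V². This is only a ballpark sketch: it ignores leakage, VRAM, and board power, and uses the stock/UV figures from the post.

```python
# First-order CMOS dynamic-power estimate: P ~ f * V^2.
# Rough sketch only; ignores static leakage and memory/board power.
def relative_power(f_mhz, mv, f0_mhz, mv0):
    """Core power relative to a (f0, V0) baseline."""
    return (f_mhz / f0_mhz) * (mv / mv0) ** 2

# Figures from the post above: stock ~1972 MHz @ 1075 mV,
# undervolted 2010 MHz @ 900 mV.
r = relative_power(2010, 900, 1972, 1075)
print(f"UV draws roughly {r:.0%} of stock core power")  # ~71%
```

A ~29% cut in core power is in the same neighborhood as the reported drop from 230-250 W to 140-180 W, allowing for the non-core power the model leaves out.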










Stock: (screenshot)

UV: (screenshot)


----------



## MxPhenom 216 (Mar 4, 2021)

Alexa said:


> Can vouch for undervolting. My 3070 Gaming X Trio drew 230-250W on stock at 1075mv, 1965-1980 MHz, 63-65C.
> 
> Undervolted to 2010 MHz @ 900mv stable. Draws 140-180W and temps remain under 60C which is insane (case fans don't even need to ramp up past 1000 rpm so very quiet system while gaming, which is a first for me). Stable in all 3DMark tests, steady 2010 MHz frequency and even 2025 MHz sometimes.
> 
> ...


What does it mean for the curve parts that are in the blue portion? Just constant?


----------



## wolf (Mar 5, 2021)

Chrispy_ said:


> IMO 230W is too much for my tastes on a 335 mm2 die.





medi01 said:


> I wonder what you say about 3070.





medi01 said:


> Who cares what the die size is?


Perhaps I've misunderstood the sequence of quotes that led to you saying "I wonder what you say about 3070", so again I'll ask: I'm not sure of your point; are you just genuinely curious what they think of a 3070?


medi01 said:


> AMD beats NV in red sponsored RT games, loses in green sponsored games. (in all two dozen or so of them)
> "Not even close", chuckle.
> 
> "But if I drop resolution, use TAA anti-aliasing, buzzwords and call it by upscaled resolution"...


RDNA2 silicon is simply less powerful at RT operations, and yeah, in raw terms it's not even close; it looks to be roughly half as fast. AMD-sponsored titles perform better on AMD hardware overall, both rasterization and potentially RT, because they were extensively optimized to use AMD's bespoke hardware in the best way possible. Is this news? Have a look at the 3DMark DirectX Raytracing feature test (full path tracing); it gives a pretty solid idea of raw RT capability. In a vendor/optimization-agnostic test, it's not even close, chuckle.

And yeah, again, Nvidia also adds DLSS into the mix to claw back the performance that DXR takes. I'm waiting to see what AMD can field as a decent competitor here; I genuinely want them to succeed. No upscaling from a lower resolution will be without some compromise.


> > Hilbert Hagedoorn - Guru3d
> 
> 
> Looking at raytracing, we have to admit that AMD is performing reasonable at best .... We had hoped for a bit more as RT wise ..... so overall AMD is offering a fun first experience in Hybrid raytracing but we expected more. If we look at full path Raytracing, then AMD lags behind significantly as the competition is showing numbers nearly twice as fast.





> > W1zzard - Techpowerup!
> 
> 
> Raytracing performance loss bigger than on NVIDIA





> > Steve - Techspot/Hardware unboxed
> 
> 
> In fact, if you care about ray tracing, the RTX 3080 is a much better option.





> > Jarred Walton - Tom's Hardware
> 
> 
> Overall, AMD's ray tracing performance looks more like Nvidia's RTX 20-series GPUs than the new Ampere GPUs, which was sort of what we expected...Well. So much for AMD's comparable performance. AMD's RX 6800 series can definitely hold its own against Nvidia's RTX 30-series GPUs in traditional rasterization modes. Turn on ray tracing, even without DLSS, and things can get ugly.


This isn't in debate within the community, you're only trying to convince yourself.


MxPhenom 216 said:


> What does it mean for the curve parts that are in the blue portion? Just constant?


Everything to the right of the highest point that is a flat line just means the GPU won't try and boost beyond that speed/voltage point on the curve.
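The flattening described here can be sketched in a few lines. This is a schematic of what Afterburner-style curve editors do, with made-up point values for illustration, not any real tool's API.

```python
# Sketch of "flattening" a voltage/frequency curve at a chosen cap,
# as undervolting tools do: every point at or above the cap voltage
# is clamped to the cap frequency, so the GPU never boosts past it.
# Point values below are hypothetical, for illustration only.
def flatten(curve, cap_mv, cap_mhz):
    """curve: list of (millivolts, MHz) points, sorted by mV."""
    return [(mv, mhz if mv < cap_mv else cap_mhz) for mv, mhz in curve]

stock = [(800, 1800), (850, 1900), (900, 1950), (1000, 2000), (1075, 1980)]
uv = flatten(stock, cap_mv=900, cap_mhz=2025)
# Every point from 900 mV upward now sits at 2025 MHz: the flat line.
```

The flat tail is why the card holds a steady clock at a fixed voltage instead of chasing its stock boost behavior.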


----------



## Deleted member 205776 (Mar 5, 2021)

MxPhenom 216 said:


> What does it mean for the curve parts that are in the blue portion? Just constant?


"Everything to the right of the highest point that is a flat line just means the GPU won't try and boost beyond that speed/voltage point on the curve."
This. I set it to run at 2025 MHz max constantly, with a constant 900mv. Don't need more than that.

On stock, it would fluctuate between 1965-1980 at higher temps and more power draw.

This way, it remains at a stable 2010-2025 MHz at 900 mV, while drawing less power and running at lower temps.


----------



## HenrySomeone (Mar 5, 2021)

Durvelle27 said:


> The fact that AMD claims pit this against the RTX 3070 is astonishing. That basically means in raw power everything AMD has to offer beats out Nvidia at better prices up until the RTX 3080
> 
> So once the RX 6700 drops voice to reason it would be faster than the RTX 3060 Ti
> 
> This is a very interesting Generation for PC Gamers.


They claim a lot of things; the reality, on the other hand, is usually (actually almost always) a "bit" different:





The RX 6800 is barely faster than the RTX 3070 (yes, that's at 1080p, but given the extremely demanding nature of recent new titles, that will be the resolution best suited to these cards in the longer run), so it stands to reason that the 6700 XT will struggle to compete with the 3060 Ti. In normal times, this card (considering its lack of features vs. the 3000 series) would be worth $350 at most...


----------



## ARF (Mar 5, 2021)

^^^ The graph that you present is misleading - it shows the 6800 severely bottlenecked by something.
In real conditions, the 6800 is around 10-15% faster than RTX 3070.


----------



## HenrySomeone (Mar 5, 2021)

You say 10-15% and you show 9%?    And that's at 4K, which will certainly be out of reach for the 6700 XT (in newer titles at decent settings).


----------



## ARF (Mar 5, 2021)

HenrySomeone said:


> You say 10-15% and you show 9?



Yes, older drivers, lack of SAM support, Core i-something instead of a Ryzen platform, bugs in Nvidia's control panel - lower settings, etc.


----------



## HenrySomeone (Mar 5, 2021)

Spoken like a true team red fanboy indeed!


----------



## ARF (Mar 5, 2021)

HenrySomeone said:


> Spoken like a true team red fanboy indeed!



Oukey..... 












GeForce RTX 3070 vs. Radeon RX 6800 (www.techspot.com)
It's time for a GPU shootout to see how the GeForce RTX 3070 and Radeon RX 6800 compare by benchmarking them head to head in 41 games....


----------



## Deleted member 205776 (Mar 5, 2021)

HenrySomeone said:


> Spoken like a true team red fanboy indeed!


Imagine being a fanboy of either company. Neither company cares about you, only about your wallet. Just stop this childish mindset. If AMD cards ever have the feature set I need, I'm definitely switching to try them out.


----------



## ARF (Mar 5, 2021)

Alexa said:


> Imagine being a fanboy of either company. Neither company cares about you, only about your wallet. Just stop this childish mindset. If AMD cards ever have the feature set I need, I'm definitely switching to try them out.



Sometimes they don't even care about your wallet, because they think money grows on trees.

What features do you want from AMD? Radeon is the more feature-rich product line, in general and historically.


----------



## Chrispy_ (Mar 5, 2021)

Alexa said:


> Can vouch for undervolting. My 3070 Gaming X Trio drew 230-250W on stock at 1075mv, 1965-1980 MHz, 63-65C.
> 
> Undervolted to 2010 MHz @ 900mv stable. Draws 140-180W and temps remain under 60C which is insane (case fans don't even need to ramp up past 1000 rpm so very quiet system while gaming, which is a first for me). Stable in all 3DMark tests, steady 2010 MHz frequency and even 2025 MHz sometimes.
> 
> ...


Looks solid.

In my experience, Navi 10 undervolts better than Turing, but that's to be expected really, as TSMC's 7FF is better than the older 12 nm process Turing used.

Samsung 8 nm Ampere looks comparable to Navi 10 based on your single data point, and I'm assuming Navi 22 will undervolt much like Navi 10, being on the same process and all.

The idea of a 6700XT or 3060 running at sub-100W is very appealing to me, and looking at the ebay prices of a 5700XT I can likely make a reasonable profit by selling my 5700XT on if I can find a 6700XT or 3060 to play with.



Alexa said:


> "Everything to the right of the highest point that is a flat line just means the GPU won't try and boost beyond that speed/voltage point on the curve."
> This. I set it to run at 2025 MHz max constantly, with a constant 900mv. Don't need more than that.
> 
> On stock, it would fluctuate between 1965-1980 at higher temps and more power draw.
> ...


See, I'd be running a battery of OCCT tests to work out the minimum stable voltage for each clock and then trying to work out where the beginning of diminishing returns kicks in for voltage/clocks.

It's not an idea that appeals to a lot of people but I suspect somewhere between 1500-1800MHz is the sweet spot with the highest performance/Watt. So yes, I'd happily slow down the card if it has large benefits in power draw. If I ever need more performance I'll just buy a more expensive card _contact my AMD/Nvidia account managers and try to bypass the retail chain in a desperate last-ditch effort to obtain a card _with a wider pipeline and more CUs/Cores.


----------



## medi01 (Mar 5, 2021)

wolf said:


> Perhaps I've misunderstood the sequence of quotes that lead to you saying "I wonder what you say about 3070", so again I'll ask, I'm not sure of your point, are you just genuinely curious what they think of a 3070?


It's a chip with more VRAM than the 3070, performance roughly in the ballpark, and claimed TDP roughly in the ballpark.
So if the 6700 XT is bad, I was wondering how you rate the 3070.



wolf said:


> RDNA2 silicone is simply less powerful at RT operations


That's baseless speculation.
People take stuff like Quake II RTX, don't get that 90% of its performance comes from quirks nested in quirks optimized for a single vendor's SHADERS, and draw funny conclusions.

One of the issues with ray intersection (and a reason its performance hasn't improved drastically) is that you need to randomly access large memory structures. Guess who has an edge at that...



wolf said:


> it's not even close.


Uh oh, doh.

Let me try again: there is NO such gap in ACTUAL hardware RT performance, and definitely not in NV's favor; performance is all over the place.













GitHub - GPSnoopy/RayTracingInVulkan: Implementation of Peter Shirley's Ray Tracing In One Weekend book using Vulkan and NVIDIA's RTX extension. (github.com)
				




And if you wonder "but why is it faster in GREEN-SPONSORED games, then": because only a fraction of what a game does for ray tracing is actual ray intersection.

*Make sure to check "Random Thoughts" section on github, it's quite telling.*

Random Thoughts​
I suspect the RTX 2000 series RT cores to implement ray-AABB collision detection using reduced float precision. Early in the development, when trying to get the sphere procedural rendering to work, reporting an intersection every time the rint shader is invoked allowed to visualise the AABB of each procedural instance. The rendering of the bounding volume had many artifacts around the boxes edges, typical of reduced precision.
When I upgraded the drivers to 430.86, performance significantly improved (+50%). This was around the same time Quake II RTX was released by NVIDIA. Coincidence?
When looking at the benchmark results of an RTX 2070 and an RTX 2080 Ti, the performance differences mostly in line with the number of CUDA cores and RT cores rather than being influences by other metrics. Although I do not know at this point whether the CUDA cores or the RT cores are the main bottleneck.
UPDATE 2020-01-07: the RTX 30xx results seem to imply that performance is mostly dictated by the number of RT cores. Compared to Turing, Ampere achieves 2x RT performance *only when using ray-triangle intersection* (as expected as per NVIDIA Ampere whitepaper), otherwise performance per RT core is the same. *This leads to situations such as an RTX 2080 Ti being faster than an RTX 3080 *when using procedural geometry.
UPDATE 2020-01-31: the 6900 XT results show the *RDNA 2 architecture performing surprisingly well in procedural geometry scenes*. Is it because the RDNA2 BVH-ray intersections are done using the generic computing units (and there are plenty of those), whereas Ampere is bottlenecked by its small number of RT cores in these simple scenes? Or is RDNA2 Infinity Cache really shining here? The triangle-based geometry scenes highlight how efficient Ampere RT cores are in handling triangle-ray intersections; unsurprisingly as these scenes are more representative of what video games would do in practice.



wolf said:


> DLSS into the mix


Sorry, I cannot seriously entertain "*but if I downscale and slap on TAA anti-aliasing, can I pretend I did not downscale?*".
No, you can't. Or wait, you can. Whatever you fancy.
It's just that I won't.


----------



## Deleted member 205776 (Mar 5, 2021)

ARF said:


> Sometimes they don't even care about your wallet. Because they think God grows money on the trees.
> 
> What features do you request from AMD? The Radeon is a more feature-rich product line, in general and historically.


Idk, actual OpenGL support so my MC shaders don't run at 2 FPS, an encoder as good as NVENC, good drivers. Main things.


----------



## MxPhenom 216 (Mar 5, 2021)

Alexa said:


> "Everything to the right of the highest point that is a flat line just means the GPU won't try and boost beyond that speed/voltage point on the curve."
> This. I set it to run at 2025 MHz max constantly, with a constant 900mv. Don't need more than that.
> 
> On stock, it would fluctuate between 1965-1980 at higher temps and more power draw.
> ...


I'm going to try this once I get a 3080. It'll have a waterblock on it too.

Did you remove some of the points from the curve? My 1070 has a ton, and I'd hate to have to set each one to the same frequency, hah.


----------



## Deleted member 205776 (Mar 5, 2021)

MxPhenom 216 said:


> I'm going to try this once I get a 3080. It'll have a waterblock on it too.
> 
> Did you remove some of the points from the curve? My 1070 has a ton, and I'd hate to have to set each one to the same frequency, hah.


Nope, just adjusted them. You can shift click and move a ton of squares at once, that's how I did it.

Here's an update.






2040-2055 MHz stable @ 925 mV (compared to stock 1965-1980 @ 1075 mV). Max power draw 190 W. Max temp 61 C on air. The 66 C max temp reported in the pic is from periodically going back to stock settings, so yes, there is a 5-degree temp decrease along with a sizeable MHz increase.

Fully stable.

Undervolt your Ampere cards people.

Also, we are getting a "bit" off topic, we should end this convo here or make a new thread lol.


----------



## N3M3515 (Mar 5, 2021)

HenrySomeone said:


> They claim a lot of things, the reality on the other hand is usually (actually almost always) a "bit" different:
> 
> 
> 
> ...



Go back and look at the 2560x1440 table and you'll see a better representation.


----------



## hardcore_gamer (Mar 6, 2021)

Shatun_Bear said:


> CU count is not really relevent here as Xbox Series X GPU is clocked so low (1.8Ghz). There's a reason the PS5 performs better in nearly every multiplatform game comparison despite 36 CUs.



I've included the clock speeds in my calculation:


hardcore_gamer said:


> Doing some quick math: (52 CU / 40 CU) × (1.825 GHz / 2.424 GHz) = 0.98. The performance is similar to an Xbox Series X, which (the entire system) costs almost the same. What a time to be a PC gamer /s


----------



## wolf (Mar 7, 2021)

medi01 said:


> Uh oh, doh.


Doh indeed! 


> The triangle-based geometry scenes highlight how efficient Ampere RT cores are in handling triangle-ray intersections; *unsurprisingly as these scenes are more representative of what video games would do in practice.*


I don't wonder why it's faster in green-sponsored games; I wonder why it's more often faster in vendor-agnostic tests and even in AMD-sponsored games, where the AMD card pays a higher millisecond render-time penalty for the same output image.


medi01 said:


> Whatever you fancy.


I have no such reservations about how the magic pixels are rendered when the output image is virtually indistinguishable in motion and it comes with a healthy FPS boost. Quoting your own head-in-the-sand opinion in bold was a nice touch, though. It almost made me reconsider.

I'd say it was an interesting experience, but I've looked through the rose-coloured glasses before and I prefer to see the entire spectrum.

And with that, the ignore button strikes again!


----------



## medi01 (Mar 8, 2021)

wolf said:


> I wonder why it's more often faster in vendor-agnostic tests


You were presented with results of vendor-agnostic tests, along with source code and curious comments on major performance bumps.



wolf said:


> even in AMD sponsored games


1) Dirt 5 is so far the only RT game of that kind, and AMD is comfortably ahead in it
2) DF is an embarrassment



wolf said:


> when the output image is virtually indistinguishable *in motion*


Ah. In motion that is. And from sufficient distance, I bet.
That's ok then. As I recall DLSS took this:





and turned it into this:





all while reviewer kept saying that "better than native" mantra.

But one had to see that in motion, I'll remember that. Thanks!


----------



## londiste (Mar 8, 2021)

medi01 said:


> 1) Dirt 5 is so far the only RT game of that kind, and AMD is comfortably ahead in it
> 2) DF is an embarrassment


That DigitalFoundry video is probably the best analysis out there of the performance hit of ray-tracing effects today, across both manufacturers.
I will just link to the video again.


----------



## wolf (Mar 9, 2021)

londiste said:


> That DigitalFoundry video is probably the best analysis out there of the performance hit of ray-tracing effects today, across both manufacturers.
> I will just link to the video again.


Indeed, and it clearly demonstrates the penalty: the AMD GPU pays a higher price to enable the RT effect, in an AMD-sponsored title.

Fantastic channel, too. They do a great job on virtually all their content: they do the lengthy investigation, present the findings in full (the good, the bad, and the nuance), and then, on balance of it all, make informed conclusions and recommendations.


----------



## medi01 (Mar 9, 2021)

londiste said:


> That DigitalFoundry video is probably the best analysis out there of the performance hit of ray-tracing effects today, across both manufacturers.
> I will just link to the video again.



That's the sad bit.
It's the best analysis of the RT subject that I've seen on YouTube.
And it's still filled with pathetic shilling.

Yet even without reading between the lines, you should have figured this out:

Apples to apples, eh:

Typically, in any RT scenario, there are four steps. 
1) To begin with, the scene is prepared on the GPU, filled with all of the objects that can potentially affect ray tracing. 
2) In the second step, rays are shot out into that scene, traversing it and being tested to see whether they hit objects. 
3) In the third step, the results from step two are shaded - like the colour of a reflection, or whether a pixel is in or out of shadow. 
4) The final step is denoising. You see, the GPU can't send out unlimited amounts of rays to be traced - only a finite amount - so the end result looks quite noisy. Denoising smooths out the image, producing the final effect.
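Those four steps can be sketched as a toy pipeline. This is purely illustrative Python, not how any real GPU pipeline works: a real implementation builds acceleration structures and dispatches shaders on the GPU, while this just traces one row of rays against a hard-coded list of spheres.

```python
import math
import random

random.seed(0)

# Step 1: scene preparation - collect the objects that can affect ray
# tracing. Here a "scene" is just a list of spheres: (center, radius).
scene = [((0.0, 0.0, 5.0), 1.0), ((2.0, 0.0, 6.0), 0.5)]

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def hit(ray_dir, sphere):
    """Step 2: ray/sphere intersection test for a ray from the origin."""
    (cx, cy, cz), r = sphere
    # Distance along the ray to the closest approach to the centre.
    t = cx * ray_dir[0] + cy * ray_dir[1] + cz * ray_dir[2]
    if t < 0:
        return False  # sphere is behind the ray
    px, py, pz = t * ray_dir[0], t * ray_dir[1], t * ray_dir[2]
    d2 = (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2
    return d2 <= r * r

def shade(ray_dir):
    """Step 3: shade the result - white on hit, black on miss, with
    random jitter standing in for the noise of sparse sampling."""
    base = 1.0 if any(hit(ray_dir, s) for s in scene) else 0.0
    return min(1.0, max(0.0, base + random.uniform(-0.1, 0.1)))

def denoise(row):
    """Step 4: denoising - a 3-tap average smooths out the noise left
    by the finite number of rays."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

# Shoot one row of rays across the image plane and run all four steps.
width = 16
noisy = [shade(normalize(((x - width / 2) / width, 0.0, 1.0)))
         for x in range(width)]
image = denoise(noisy)
```

Note that in this sketch, as in the quote, only the work inside `hit` corresponds to the part that gets dedicated hardware; the prep, shading, and denoising all run as ordinary code.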

So, there are numerous factors at play in dealing with RT performance. *Of the four steps, only the second one is hardware accelerated - and the actual implementation between AMD and Nvidia is different...*

...Meanwhile, PlayStation 5's Spider-Man: Miles Morales *demonstrates that Radeon ray tracing can produce some impressive results on more challenging effects - and that's using a GPU that's significantly less powerful than the 6800 XT*....









PC ray tracing deep dive: Radeon RX 6800 XT vs GeForce RTX 3080 (www.eurogamer.net)
AMD's brand new RDNA 2 architecture has arrived for desktop PCs via the RX 6000 line of graphics cards - and it's an im…




So, uh, oh, doh, you were saying?


----------



## londiste (Mar 9, 2021)

Why are you leaning so heavily on the implementations being different? DXR is a standard thing; if the implementations differ, I would expect each manufacturer to know what they are doing and what they are aiming for.

But yes, the second step is the hardware-accelerated one, and their measurements give a pretty good indication that Nvidia's RT hardware is more powerful at this point (probably simply by having more units). This is evidenced by where performance falls off as the number of rays increases: both cards fall off, but at different points.
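As a back-of-the-envelope illustration (all numbers invented, nothing taken from the video): model frame time as a fixed non-accelerated cost plus rays divided by RT throughput, give one hypothetical card twice the ray throughput of the other, and the slower card's relative penalty grows with the ray count. Flip it around and you get the observation that the simpler the ray tracing, the more similar the rendering times.

```python
def frame_time_ms(rays, fixed_ms, rays_per_ms):
    # Toy model: fixed (non-hardware-accelerated) work such as scene
    # prep, shading and denoising, plus time spent traversing rays.
    return fixed_ms + rays / rays_per_ms

# Hypothetical cards: same fixed cost, "card A" traces rays twice as fast.
for rays in (1_000, 10_000, 100_000):
    a = frame_time_ms(rays, fixed_ms=5.0, rays_per_ms=2_000)
    b = frame_time_ms(rays, fixed_ms=5.0, rays_per_ms=1_000)
    print(f"{rays:>7} rays: B is {b / a:.2f}x slower")
```

With these made-up numbers, B is only 1.09x slower at 1,000 rays but 1.91x slower at 100,000, so where the curves diverge tells you something about the hardware ratio even when the fixed costs are identical.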

Miles Morales on PS5 is heavily optimized, using the same methods for performance improvement that RT effects use on PC, mostly to a higher degree, plus clever design. The same Digital Foundry has a pretty good article/video on how that is achieved: https://www.eurogamer.net/articles/digitalfoundry-2020-console-ray-tracing-in-marvels-spider-man


----------



## medi01 (Mar 9, 2021)

londiste said:


> Why are you leaning so heavily on the implementations being different? DXR is a standard thing; if the implementations differ


You have missed the point. Of the several things that need to happen for RT to end up as an image, only one bit is hardware accelerated.



londiste said:


> Why are you leaning so heavily on the implementations being different?


There is another side to the implementation:
_For example, Quake 2 RTX and Watch Dogs Legion use a denoiser built by Nvidia, and while it won't have been designed to run poorly on AMD hardware (which Nvidia would not have had access to when they coded it), it's certainly designed to run as well as possible on RTX cards._

Comparisons of actual hardware RT performance benchmarks were linked in #85 here. There is no need to run around "guessing" things; they are right there on the surface.

This:

_the RTX 3080 could render the effect in nearly half the time in Metro Exodus, or even a third of the time in Quake 2 RTX, yet *increasing the amount of rays after this saw the RTX 3080 having less of an advantage.*_

could mean many things. This part is hilarious:

_In general, from these tests it looks like *the simpler the ray tracing is, the more similar the rendering times for the effect are* between the competing architectures. The Nvidia card is undoubtedly more capable across the entire RT pipeline_

Remember which part of ray tracing is hardware accelerated? Which "RT pipeline", cough? Vendor-optimized shader code?


----------



## wolf (Mar 10, 2021)

Against my better judgment, I've viewed the ignored content, here we go again...



medi01 said:


> Ah. In motion that is. And from sufficient distance, I bet.
> That's ok then.


I never said from a distance; those are your words. And no, it looks fantastic close-up, too.

Yeah, in motion, of *course* in motion. I tend to play games at something in the order of 60-144 fps, not sitting around nitpicking stills, but for argument's sake I'll do that too. If we're going to cherry-pick native vs DLSS shots, I can easily do the same and show the side of the coin that you conveniently haven't.

Native left, DLSS Quality right











And the real kicker after viewing what is, at worst, comparable quality where each rendering has strengths and weaknesses, and at best, higher overall quality...





But you appear to have made up your mind: you don't like it, and you won't "settle" for it. Fine, suit yourself; nobody will make you buy an RTX card, play a supported game and turn it on. Cherry-picking examples to try and show how 'bad' it is doesn't make you come across as smart, and it certainly doesn't make you right; you could at least have chosen a game with a notoriously 'meh' implementation. Not to mention the attitude, yikes.

I can't convince you, and you can't convince me, so where to from here? Ignore each other?


----------



## Caring1 (Mar 10, 2021)

That person's hair actually looks better in native than with DLSS, as it appears softer and cleaner as opposed to coarse and oily.


----------



## medi01 (Mar 10, 2021)

wolf said:


> Yeah, in motion, of *course* in motion.


I won't bite this lie, I'm sorry.

It's not about "in motion" at all. What you present is the best case for any *anti-aliasing method that adds blur*, TAA in particular.
There is barely any crisp texture (a face, eh?) on which the added blur would be noticeable.
It is heavily loaded with stuff that benefits a lot from anti-aliasing (hair, long grass, eyebrows).

But if you dare bring in actual, real stuff from the very pics on your list, Death Stranding, DLSS takes this:






and turns it into this:






No shockers here; all TAA derivatives exhibit it.

NV's TAA(u) derivative adds blur to the entire screen if you move your mouse quickly, among other things.

It's a shame Ars Technica was the only site to dare point it out.








Why this month's PC port of Death Stranding is the definitive version [Updated] (arstechnica.com)
A major embargo is up, so we've added comparison images for anti-aliasing methods.


----------



## Adam Krazispeed (Mar 10, 2021)

SamuelL said:


> Assuming I could get one of these close(ish) to MSRP, would this be any kind of upgrade to a 1080ti? If I could get one around MSRP, then I could likely sell the 1080ti for about the same price. Just don’t know if it would be worth all the effort... I’m debating if I should just wait for the next round of future GPUs in 6-12 months given the current pricing disaster.
> 
> Opinions?


Pricing get better? Lol, it's gonna get WORSE if mining and scalping AREN'T STOPPED.

My 6800 XT that I paid $1100 USD for on eBay 4 months ago just died. It's ALREADY DEAD, and I CAN'T EVEN RETURN IT. Contacted Sapphire for an RMA, but it doesn't look like it's gonna be honored.

I have no choice, I NEED SOMETHING. I'm gonna kill someone over a 6700 XT; I can't get another 6800 XT for less than 2 grand USD, so I guess I'm done with PC gaming if I can't get one of these. Yes, it should be a great upgrade from a 1080/1080 Ti for sure.


----------



## medi01 (Mar 10, 2021)

SamuelL said:


> Assuming I could get one of these close(ish) to MSRP, would this be any kind of upgrade to a 1080ti?


A bit more VRAM and a bit more performance. Not much.


----------



## wolf (Aug 3, 2021)

londiste said:


> That Digital Foundry video is probably the best analysis out there of the performance hit of ray tracing effects today, across both manufacturers.
> I will just link to the video again.


After revisiting the subject with AMD-sponsored titles, I thought I'd had this very discussion before.

Something something, baseless speculation, vendor optimized code, no such gap...


----------



## Hachi_Roku256563 (Aug 3, 2021)

Adam Krazispeed said:


> ebay 4 months ago


When you use eBay, it's kinda your fault.
I would not expect an RMA to work on an eBay GPU.


----------



## sepheronx (Aug 3, 2021)

Thank god I got my 6800 XT for the same price as a 6700 XT. And at least I got my 3-year warranty with Gigabyte. I know they're terrible with RMAs, but at least I have the chance.


----------

