# AMD Radeon RX 6800



## W1zzard (Nov 18, 2020)

The Radeon RX 6800 is the world's most power-efficient graphics card, better than anything NVIDIA Ampere has to offer. The smaller brother of the RX 6800 XT comes with a smaller heatsink and more affordable pricing, but retains the lavish 16 GB VRAM buffer.

*Show full review*


----------



## kruk (Nov 18, 2020)

Wow, look at those efficiency gains. No wonder nVidia launched so early, with cards OCed to the moon and barely any stock. It's like nVidia and AMD switched places: RDNA2 is AMD's Pascal and Ampere is nVidia's Vega... Can't wait to see the rest of the lineup.


----------



## wolf (Nov 18, 2020)

they've frikken done it! hats off!


----------



## Khonjel (Nov 18, 2020)

Like I feared. Bad value compared to the RTX 3070 across the board. But I think AMD's gonna drop the price when the RTX 3070 Ti launches, because unlike Nvidia, AMD can't squeeze in another SKU.


----------



## madness777 (Nov 18, 2020)

It's three times faster than the RX 580 and consumes less power on average... I just can't


----------



## bug (Nov 18, 2020)

wolf said:


> they've frikken done it! hats off!


Well, the engineering is there, but the pricing is off.


----------



## B-Real (Nov 18, 2020)

Khonjel said:


> Like I feared. Bad value compared to the RTX 3070 across the board. But I think AMD's gonna drop the price when the RTX 3070 Ti launches, because unlike Nvidia, AMD can't squeeze in another SKU.



Bad value compared to the RTX 3070? *Sorry, what?*

The RX 6800 is 16% more expensive than the 3070 while being 6-9% faster. I also suggest you check the newest game benchmarks: of the 4 games Godfall, WD: Legion, AC: Valhalla and Dirt 5, AMD cards are crushing their NV counterparts in 3, so overall the difference is even bigger. You can expect a 10-12% difference on average. It offers TWICE THE VRAM, which is really important at 4K, as there are already games like Doom Eternal which run out of 8 GB, and the 3070 takes a performance hit there if you don't lower settings. The 2080 Ti was considered a 4K GPU. The 3070 is on the same performance level, so it is also a 4K GPU. However, the flagship Turing card had 11 GB of VRAM while the 3070 only has 8. That will be a problem at 4K in the long run. With the 6800, you won't have that problem. You also get significantly lower power consumption, so the 6800 crushes the 3070 in efficiency. The only difference is RT performance, but TBH, how many games support it? How many games have proper RT support that is really visible? Also, the 6800 can be OC'd better than the 3070... I can't even remember when an AMD card could OC better than its NV rival.



bug said:


> Well, the engineering is there, but the pricing is off.


Same to you: 16% more expensive, 10-12% faster (taking the newest benchmarks into account), double the VRAM, significantly lower power consumption, lower temperatures, better OC. Don't the things mentioned besides performance justify the remaining 4-6% price difference?
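For what it's worth, the arithmetic behind those percentages is easy to check. The sketch below uses the launch MSRPs ($579 for the RX 6800, $499 for the RTX 3070) and assumes the ~10% average performance lead argued above; the helper names are made up for illustration.

```python
def price_premium(price_a: float, price_b: float) -> float:
    """How much more expensive card A is than card B, as a percentage."""
    return (price_a / price_b - 1) * 100

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Normalized performance divided by price."""
    return relative_perf / price

rx6800_price, rtx3070_price = 579, 499
print(f"RX 6800 price premium: {price_premium(rx6800_price, rtx3070_price):.0f}%")  # 16%

# Normalize the 3070 to 100 and assume a 10% lead for the 6800 (an assumption,
# not a measurement):
ratio = perf_per_dollar(110, rx6800_price) / perf_per_dollar(100, rtx3070_price)
print(f"Relative perf/$ (6800 vs 3070): {ratio:.3f}")  # 0.948
```

Under those assumptions the 3070 still retains a roughly 5% price/performance edge, which is exactly the margin both sides here are arguing about.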


----------



## EzioAs (Nov 18, 2020)

This thing is just terribly priced. It should match or be at most $20 higher than the MSRP of the RTX 3070.


----------



## Space Lynx (Nov 18, 2020)

I wasn't able to get an XT, but I got my order in for the 6800 reference. I'm stoked.



EzioAs said:


> This thing is just terribly priced. It should match or be at most $20 higher than the MSRP of the RTX 3070.



I disagree. I'm happy I have 16 GB of VRAM, and Smart Access Memory will be enabled for me.


----------



## EzioAs (Nov 18, 2020)

lynx29 said:


> I disagree. I'm happy I have 16 GB of VRAM, and Smart Access Memory will be enabled for me.



Yeah, more VRAM is nice, but it shouldn't be $70 more. As for Smart Access Memory (or Resizable BAR), Nvidia has already assured people that they will support the feature soon.


----------



## Space Lynx (Nov 18, 2020)

EzioAs said:


> Yeah, more VRAM is nice, but it shouldn't be $70 more. As for Smart Access Memory (or Resizable BAR), Nvidia has already assured people that they will support the feature soon.



Well, I don't have to wait until April for Nvidia to be in stock, so I'm set! Time to game!!! Yeehaw! That alone is worth $79 to me.


----------



## BigBonedCartman (Nov 18, 2020)

What a coincidence: NGreedia inflates prices on the RTX 2000 series, but now the RTX 3000 series prices are cut in half? Trump couldn't think of a better conspiracy theory.


----------



## mechtech (Nov 18, 2020)

Hmmm, I wonder how much that VRAM costs per GB??

Kind of excited for the 6700 and 6600 series cards now!!


----------



## bug (Nov 18, 2020)

B-Real said:


> Same to you: 16% more expensive, 10-12% faster (taking the newest benchmarks into account), double the VRAM, significantly lower power consumption, lower temperatures, better OC. Don't the things mentioned besides performance justify the remaining 4-6% price difference?


No, they don't.


----------



## W1zzard (Nov 18, 2020)

mechtech said:


> Hmmm, I wonder how much that VRAM costs per GB??


Educated guess, at least $5 per GB
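Taking that estimate at face value, a quick back-of-the-envelope calculation (the $5/GB figure is an educated guess above, not a confirmed BOM number):

```python
# VRAM bill-of-materials sketch, assuming the estimated $5 per GB.
vram_cost_per_gb = 5                     # USD per GB, educated guess
rx6800_vram_gb, rtx3070_vram_gb = 16, 8  # GB on each card
extra_vram_cost = (rx6800_vram_gb - rtx3070_vram_gb) * vram_cost_per_gb
print(f"Extra VRAM cost: ${extra_vram_cost}")  # Extra VRAM cost: $40
```

So roughly half of the $80 MSRP gap between the two cards would be memory cost alone, under that assumption.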


----------



## Turmania (Nov 18, 2020)

It should be around the same price as the RTX 3070, plus or minus $20. Lovely card. 165 W average power consumption is just awesome. But they need to price it right.


----------



## InVasMani (Nov 18, 2020)

Would be interesting to see how much more can be squeezed out of the design with BIOS modding. The memory could surely be set about 1 GHz higher, and I'm sure the core frequency could be refined further with an improved fan profile; honestly, I'd bump the fan up to more like 2500 RPM. Is the Radeon Boost scaling configurable? It showed 4K stationary but 1080p in motion; can the in-motion scaling be adjusted to custom resolution points like 1440p? I like the idea of the tech itself, and making it more dynamic makes a good deal of sense.


----------



## eRacer (Nov 18, 2020)

EzioAs said:


> This thing is just terribly priced. It should match or be at most $20 higher than the MSRP of the RTX 3070.


Yes, AMD just can't compete with the plethora of 16+GB high performance NVIDIA cards currently priced at or below $580. Get one of those instead.


----------



## B-Real (Nov 18, 2020)

Turmania said:


> It should be around the same price as the RTX 3070, plus or minus $20. Lovely card. 165 W average power consumption is just awesome. But they need to price it right.


The card has DOUBLE the VRAM, which is not strictly necessary (16 GB will not be needed for 4K; maybe the architecture just requires doubling), but it's 100% certain that you'll need more than 8 GB at 4K in the long run (there are already games running out of 8 GB). Why should a ~10% faster card (when we know that going up a tier gets you worse price/performance) with double the VRAM and lower power consumption be priced the same as its rival? Of course it WOULD be great, but why SHOULD it be the same price?


----------



## Space Lynx (Nov 18, 2020)

B-Real said:


> The card has DOUBLE the VRAM, which is not strictly necessary (16 GB will not be needed for 4K; maybe the architecture just requires doubling), but it's 100% certain that you'll need more than 8 GB at 4K in the long run (there are already games running out of 8 GB). Why should a ~10% faster card (when we know that going up a tier gets you worse price/performance) with double the VRAM and lower power consumption be priced the same as its rival? Of course it WOULD be great, but why SHOULD it be the same price?



I know Shadow of War can use close to 9 GB of VRAM if you enable Insane textures in the settings, and that is kind of an old game. I expect next-gen console games will require more VRAM as well.

Dude, it beats a 2080 Ti and is $580. It's a great deal IMO. I am very happy AMD is competing again, and you should be too. The future of PC gaming is bright.


----------



## bug (Nov 18, 2020)

eRacer said:


> Yes, AMD just can't compete with the plethora of 16+GB high performance NVIDIA cards currently priced at or below $580. Get one of those instead.


3070 gives you virtually the same performance level (and better RTRT) for $80 less. That's a lot of money for many people.
Of course there's the matter of stock, but that will sort itself out eventually.

Fwiw, I'm one of those that always advocates getting the more efficient card for a few $ more, but this is too steep for me.


----------



## Space Lynx (Nov 18, 2020)

bug said:


> 3070 gives you virtually the same performance level (and better RTRT) for $80 less. That's a lot of money for many people.
> Of course there's the matter of stock, but that will sort itself out eventually.
> 
> Fwiw, I'm one of those that always advocates getting the more efficient card for a few $ more, but this is too steep for me.



That's fair of you. I wouldn't be able to afford it if I didn't live rent-free with my parents at the moment. That's part of the reason I'm doing it now, actually... I expect life will change very soon. I had job interviews with a few universities last week... so yeah, something will eventually snag, or that's what I keep telling myself anyway, lol.


----------



## B-Real (Nov 18, 2020)

bug said:


> 3070 gives you virtually the same performance level (and better RTRT) for $80 less. That's a lot of money for many people.
> Of course there's the matter of stock, but that will sort itself out eventually.
> 
> Fwiw, I'm one of those that always advocates getting the more efficient card for a few $ more, but this is too steep for me.


Well, if you say that a 10% difference in performance is "the same," OK. However, I can't understand how you can't say a word about the 3070 having half the VRAM, which will definitely not be enough for 4K in the long run.


----------



## Akkedie (Nov 18, 2020)

Funny how the 3070 has better price/performance on paper. Meanwhile, in the real world, retailers everywhere are selling it for way more than MSRP, if it's ever even in stock. Let's stop these foolish comparisons when the product doesn't exist in stock! How long has Ampere been out, and they still can't produce cards?? Let's compare prices when they are real, for both teams!


----------



## Mindweaver (Nov 18, 2020)

Turmania said:


> It should be around the same price as the RTX 3070, plus or minus $20. Lovely card. 165 W average power consumption is just awesome. But they need to price it right.


The 6800 is their high-end card; it's going up against the 3080, not the 3070. If you don't want to spend the extra 80 bucks, then wait for the 6700. I hate to say this in a sentence, but $580 isn't bad for a high-end card with that amount of VRAM.


----------



## Khonjel (Nov 18, 2020)

B-Real said:


> Bad value compared to the RTX 3070? *Sorry, what?*
> 
> The RX 6800 is 16% more expensive than the 3070 while being 6-9% faster. I also suggest you check the newest game benchmarks: of the 4 games Godfall, WD: Legion, AC: Valhalla and Dirt 5, AMD cards are crushing their NV counterparts in 3, so overall the difference is even bigger. You can expect a 10-12% difference on average. It offers TWICE THE VRAM, which is really important at 4K, as there are already games like Doom Eternal which run out of 8 GB, and the 3070 takes a performance hit there if you don't lower settings. The 2080 Ti was considered a 4K GPU. The 3070 is on the same performance level, so it is also a 4K GPU. However, the flagship Turing card had 11 GB of VRAM while the 3070 only has 8. That will be a problem at 4K in the long run. With the 6800, you won't have that problem. You also get significantly lower power consumption, so the 6800 crushes the 3070 in efficiency. The only difference is RT performance, but TBH, how many games support it? How many games have proper RT support that is really visible? Also, the 6800 can be OC'd better than the 3070... I can't even remember when an AMD card could OC better than its NV rival.
> 
> ...


Only if raster gaming performance is all you're looking for.

Even setting aside productivity performance and RT+DLSS, which Nvidia dominates anyway, there are Nvidia's extra features like RTX Voice, RTX Broadcast, whatever else they come up with for their tensor cores in the future, and hell, even an in-house game streaming solution (idk how to articulate this feature; just think of it as PC game streaming on mobile/laptop, separate from GeForce Now). AMD is pretty cocky asking 80 dollars more for fewer features. I'm not saying they should price it $50 less than the 3070 like the 6800 XT is doing to the 3080. But still, looking at the performance, I'd happily pay $20 more (at $520) or, if I'm fanboy enough, $50 more (at $550). But $80 just sounds too close to $100, and I'd rather buy a 3070 and Cyberpunk 2077.


----------



## bug (Nov 18, 2020)

B-Real said:


> Well, if you say that a 10% difference in performance is "the same," OK. However, I can't understand how you can't say a word about the 3070 having half the VRAM, which will definitely not be enough for 4K in the long run.


It's actually 8% in the best-case scenario (4K, RTRT disabled). That's how much that 8 GB of VRAM hinders the 3070.
Unfortunately, my crystal ball is in for repairs; I can't predict the future like you do.


----------



## Raendor (Nov 18, 2020)

The 6800 seems more worth it than the 3070 for the price. But the problem is they're nowhere to be found at MSRP. However, arguing that $80 is a lot for the jump from 3070 to 6800 because it's a lot for some people: that's a non-point. If you're struggling with money, you're not buying a f*king $500 GPU in the first place. You sit happy with a budget build, or a gaming console as your only system at best. Period.


----------



## Cheeseball (Nov 18, 2020)

B-Real said:


> Well, if you say that a 10% difference in performance is "the same," OK. However, I can't understand how you can't say a word about the 3070 having half the VRAM, which will definitely not be enough for 4K in the long run.



I wouldn't recommend an RTX 3070 or RX 6800 (even though the latter has 16 GB of VRAM) for 4K usage. Maybe 4K for slightly older games (or games modded with high-res texture packs) on the RX 6800. But for 1080p and 1440p? Definitely.


----------



## Khonjel (Nov 18, 2020)

Mindweaver said:


> The 6800 is their highend card that is going up against the 3080 not the 3070. If you don't want to spend the 80 bucks then wait for the 6700. I hate to say this in a sentence but $580 isn't bad for a highend card with that amount of vram.


Oh yeah, forgot about the RX 6700 series. According to rumors, the 40 CU RDNA 2 cards are gonna clock even higher than the 6800 series ones. Guess AMD's gonna cradle the 3070 from the low end and the high end but not directly compete against it (pricing-wise).


----------



## Space Lynx (Nov 18, 2020)

Cheeseball said:


> I wouldn't recommend an RTX 3070 or RX 6800 (even though the latter has 16 GB of VRAM) for 4K usage. Maybe 4K for slightly older games (or games modded with high-res texture packs) on the RX 6800. But for 1080p and 1440p? Definitely.



Why not? Most 4K gamers only want to hit 60 FPS, and once OC'd I think it will hit that in every game across the board; it's only 5 FPS short in most games I see at 4K. You might have to lower settings just a smidge in some games, but that won't be noticeable while maintaining 4K 60 FPS. Not bad IMO. AMD is kicking major butt lately; it's awesome.


----------



## bobmeix (Nov 18, 2020)

I've been a long-time follower, and I really appreciate the quality of the GPU reviews here!
This time I am a bit confused about the power consumption values.
Average gaming power consumption for both AMD cards seems quite low compared to other reviews.
I suspect that this time the cards may be underutilized in Metro: Last Light at 1920x1080.


----------



## Cheeseball (Nov 18, 2020)

lynx29 said:


> Why not? Most 4K gamers only want to hit 60 FPS, and once OC'd I think it will hit that in every game across the board; it's only 5 FPS short in most games I see at 4K. You might have to lower settings just a smidge in some games, but that won't be noticeable while maintaining 4K 60 FPS. Not bad IMO. AMD is kicking major butt lately; it's awesome.



If you're aiming for 4K at 60 FPS (without heavily dropping in-game graphics settings), then both should be fine, although the RTX 3070 would likely suffer sooner once all the VRAM is used up. I can see this setup being good for HTPCs.

But for 2160p75 or higher? Best to get a 6800 XT or RTX 3080.


----------



## bug (Nov 18, 2020)

Raendor said:


> 6800 seems more worth it than 3070 for the price. But the problem is there're nowhere to be found for MSRP. However is arguing that $80 is a lot for jump from 3070 to 6800, because it's a lot for some - that's a no point. If you're struggling with money - you're not byoing a fking $500 gpu in a first place. You sit happy with a budget build or a gaming console as your only system at best. Period.


Of course it's a point. You don't have to be struggling with money to see that you can put $80 to better use. For example, if you were building a whole system, that $80 could buy you a high-end CPU cooler or almost one TB of SSD storage.


----------



## W1zzard (Nov 18, 2020)

bobmeix said:


> Average gaming power consumption for both AMD cards seems to be quite low compared to other reviews.
> I suspect that this time the cards may be underutilized in Metro: Last Light at 1920x1080.


Because Metro: Last Light isn't a permanent 100% load. It is very dynamic, with scenes that don't always use the GPU at 100%. This gives clever GPU algorithms a chance to make a difference. I found this a much more realistic test than just Furmark or standing still in Battlefield V at highest settings.

Edit: added this on the power page:
Our Metro: Last Light power measurements seem to be extremely well-suited to the Radeon RX 6800 series. It looks like AMD has mastered dynamically reacting to situations where GPU load isn't at 100% all the time, with sub-millisecond accuracy. This will benefit your normal gameplay sessions, too. That's the reason I picked Metro LL in the first place; it runs a more realistic test scenario than just power measurement with Furmark or standing still in Battlefield.


----------



## bobmeix (Nov 18, 2020)

W1zzard said:


> Because Metro Last Light isn't a permanent 100% load. It is very dynamic with scenes that don't always use the GPU at 100%. This gives clever GPU algorithms a chance to make a difference. I found this a much more realistic test than just Furmark or stand still in Battlefield V at highest settings


I do not contest this, but there could still be a CPU bottleneck at play here. If you attain the same values at higher resolutions, then your point is valid.
Power consumption tests at 1440p or 4k would be nice anyway, as an added nuance!


----------



## W1zzard (Nov 18, 2020)

bobmeix said:


> I do not contest this, but there could still be a CPU bottleneck at play here. If you attain the same values at higher resolutions, then your point is valid.
> Power consumption tests at 1440p or 4k would be nice anyway, as an added nuance!


Already running  Just for you


----------



## bobmeix (Nov 18, 2020)

W1zzard said:


> Already running  Just for you


You are too kind!


----------



## W1zzard (Nov 18, 2020)

bobmeix said:


> You are too kind!


bah .. time to retest at 1440p... will first check what's happening with nvidia in the same test

i always use "warm" results btw


----------



## kruk (Nov 18, 2020)

bug said:


> 3070 gives you virtually the same performance level (and better RTRT) for $80 less. That's a lot of money for many people.
> Of course there's the matter of stock, but that will sort itself out eventually.
> 
> Fwiw, I'm one of those that always advocates getting the more efficient card for a few $ more, but this is too steep for me.



The frametime variance seems much lower on the RX 6800 series vs. the nVidia lineup, which should result in much smoother gameplay. If this holds for most of the tested games, then the 6800 will give you a much better experience over the 3070 than the percentage difference suggests.
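As a side note, frametime variance is easy to quantify from a frame-time log. A minimal sketch (the numbers are invented to illustrate the point, not TPU's data):

```python
import statistics

def frametime_stats(frametimes_ms):
    """Mean, population standard deviation, and ~99th-percentile frametime (ms)."""
    mean = statistics.mean(frametimes_ms)
    stdev = statistics.pstdev(frametimes_ms)
    p99 = sorted(frametimes_ms)[int(0.99 * (len(frametimes_ms) - 1))]
    return mean, stdev, p99

# Two hypothetical 8-frame logs with the same average frametime but very
# different variance:
smooth = [16.6, 16.7, 16.5, 16.8, 16.6, 16.7, 16.5, 16.6]
spiky  = [14.0, 14.0, 25.0, 14.0, 14.0, 25.0, 14.0, 13.0]
for name, log in (("smooth", smooth), ("spiky", spiky)):
    mean, stdev, p99 = frametime_stats(log)
    print(f"{name}: mean={mean:.2f} ms, stdev={stdev:.2f} ms, p99={p99:.1f} ms")
```

Both logs average out to the same FPS, but the spiky one feels far worse in play; that's why variance matters beyond what the average-FPS percentage difference shows.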


----------



## bobmeix (Nov 18, 2020)

W1zzard said:


> bah .. time to retest at 1440p... will first check what's happening with nvidia in the same test


I really have to tip my hat to you for this reaction!


----------



## ab3e (Nov 18, 2020)

Every retailer on the planet sells the RTX 3070 at a much higher price than MSRP, yet most of you keep quoting it. The prices will not go down even when the market is saturated, as Gamers Nexus presented in their RTX 3000 MSRP video. Nvidia's profit margin is over 9000!!! ... I just managed to buy an RX 6800 at launch for £530.00, and I'm happy.


----------



## W1zzard (Nov 18, 2020)

bobmeix said:


> I really have to tip my hat to you for this reaction!


that's the 3080

not sure what to do now? if i had to start from scratch i would probably retest at 1440p nowadays, but testing the 6800 XT at 1440p and the 3080 at 1080p would be unfair

edit: now that I'm thinking about this, doesn't it look like AMD simply deals better with situations where the load isn't 100% at 1080p?


----------



## bobmeix (Nov 18, 2020)

W1zzard said:


> that's the 3080
>
> ...


That's a tough one, but I suppose, for the sake of completeness, an added graph with average gaming power consumption values at 1440p (or even 4K) would clarify things a bit.
It seems that this new generation of graphics cards cannot be fully saturated by current CPUs at 1080p.
It is your call, of course, whether you are willing to add these retroactively to the Nvidia reviews. It seems like a lot of work...


----------



## W1zzard (Nov 18, 2020)

bobmeix said:


> That's a tough one, but I suppose, for the sake of completeness, an added graph with average gaming power consumption values at 1440p (or even 4k) would clarify things a bit.
> It seems that this new generation of graphics cards cannot be fully saturated by the current CPUs at 1080p.
> It is your call, of course, if you are willing to add these retroactively to the nvidia reviews. It seems like a lot of work...


i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?


----------



## eRacer (Nov 18, 2020)

bug said:


> 3070 gives you virtually the same performance level (and better RTRT) for $80 less. That's a lot of money for many people.
> Of course there's the matter of stock, but that will sort itself out eventually.
> 
> Fwiw, I'm one of those that always advocates getting the more efficient card for a few $ more, but this is too steep for me.


If an extra $80 is a lot of money for someone willing to spend $500 on a video card, then they really can't afford a $500 video card either.

AMD/Radeon AIBs should be able to sell every 6800 series 16 GB card available for the next several weeks, at least. There is no pressure to price it lower, or to offer a cheaper 8 GB version, at this time.

According to this review, it appears the 6800 has better RT performance than a 2080 Super at resolutions with playable framerates. So it appears that AMD's slowest 6800 offering delivers better RT performance than ~99% of NVIDIA cards with dedicated RT hardware sold to date. Could the 6800 series' RT performance have been better? Sure. Is the 6800 series a terrible value because it only outperforms ~99% of all NVIDIA RT cards sold to date? Probably not.


----------



## bobmeix (Nov 18, 2020)

W1zzard said:


> i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?


I think nobody would be bothered by 2 extra graphs at 1440p and 4k for gaming power consumption, as 1080p, 1440p and 4k are all valid use case scenarios. I would find this approach (3 graphs) quite useful, as I nowadays mostly game at 1440p high refresh rate, with the occasional 4k/60. No more 1080p for me. 
It would present a more nuanced picture of gaming power consumption.
In summary: average power for warm cards at 1080p, 1440p, 4k + max gaming power at 4k.


----------



## W1zzard (Nov 18, 2020)

bobmeix said:


> In summary: average power for warm cards at 1080p, 1440p, 4k + max gaming power at 4k.


i think that would be too much data? because we also have idle, multi-monitor, media playback and furmark


----------



## bobmeix (Nov 18, 2020)

W1zzard said:


> i think that would be too much data? because we also have idle, multi-monitor, media playback and furmark


If I had to choose one, I would choose the 4K data, as it is the most GPU-bound result and frankly still valid gaming power consumption, even if it pushes the power load.



W1zzard said:


> i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?


The 6800 XT is still more power efficient at higher resolutions, even if not by that much.



W1zzard said:


> i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?
> 
> 
> 
> ...


Looking at the overlapped graphs, it does seem as if that is the case (AMD dealing better with lower loads), but averages would be helpful here.


----------



## Footman (Nov 18, 2020)

Damn, the 6800 looks like a sweet upgrade for me, even though I think the card is priced $50 higher than it should be.

Anyone want to buy my 5700 XT with a Bykski water block???


----------



## FinneousPJ (Nov 18, 2020)

Looks like both cards are out of stock. Hopefully it will be better next week when board partners release.


----------



## InVasMani (Nov 18, 2020)

I'm curious how AMD's take on DLSS will end up. Will it be developer-dependent on a game-by-game basis, or game-agnostic and working on all games retroactively, similar to how mCable works but obviously much more sophisticated and advanced? I'd rather have the latter, even if the performance were half as good as DLSS with similar image quality. Having it just work on all titles is a hell of a lot more useful in the grand scheme. I'd say both are useful, but in the big picture, the way DLSS works is less optimal in some ways than the way mCable handles upscaling, since the latter just works, ahem, like Jensen ironically touted RTX. Let's hope AMD's approach is closer to mCable's, as that would be more ideal. Where I think DLSS has its strength and merit is closer to AMD's Mantle: squeezing out that extra bit of close-to-the-metal hardware performance, though it requires developer attention at the same time, which is a big drawback.

I wonder if AMD's upscaling can be combined with Radeon Boost. It could be used either before or after. Two options seem possible: upscale the native resolution to a target, which by extension gets applied to Radeon Boost as well, or downsample with Radeon Boost and then strictly upscale the in-motion resolution. The latter option would give a cleaner, purer stationary image, but less of a combined performance boost. Additionally, either option could have custom frame rate targets, giving more optimal results when and where needed for image quality in relation to motion and input lag. Combine that with variable rate shading able to use similar frame-rate-triggered scopes, and it could have a really big impact on general performance, to optimize the hardware.


----------



## Durvelle27 (Nov 18, 2020)

Really excited as this offering is great


----------



## Raendor (Nov 18, 2020)

Durvelle27 said:


> Really excited as this offering is great


And hard to come by.


----------



## JonCo (Nov 18, 2020)

W1zzard said:


> i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?



I came here with very similar questions to bobmeix's; I'd been thinking the power consumption figures looked suspiciously good. I'll definitely reiterate his comments about how great your reviews and work are; best GPU reviews on the web in my view.

I think it's worth noting that performance per watt is also impacted by this, and that is calculated separately for 1080p, 1440p and 4K. Taking power consumption figures at 1080p and using them for the 1440p and 4K performance-per-watt charts seems a bit arbitrary now that we know there are significant differences in power consumption at different resolutions. Depending on the amount of work you are willing to do, the options look to me to be:

- Most work, most accurate: run power consumption at all three resolutions, publish all three, and use resolution-specific figures in performance per watt.
- Middle ground: add either 1440p or 4K to the power consumption test and take an average of the two resolutions. 4K seems the obvious choice, as you get both ends of the spectrum that way.
- Least work, least accurate: move from 1080p to 1440p for the core benchmark, on the assumption that 1440p is likely to be in the middle of any performance gradients like we're seeing here.

I think any of these would be an improvement. How many people are realistically going to buy a 240-360 Hz monitor to justify a 3080 or 6800 XT at 1080p? 1440p and 4K are the more obvious use cases...
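The first option amounts to pairing each resolution's FPS with that resolution's own power draw. A small sketch of why it matters, with entirely hypothetical numbers:

```python
# Illustrative only: (avg_fps, avg_power_w) per resolution for one card.
# These are made-up figures, not TPU's measurements.
results = {
    "1080p": (180.0, 165.0),
    "1440p": (130.0, 205.0),
    "4k":    ( 75.0, 225.0),
}

power_1080p = results["1080p"][1]
for res, (fps, watts) in results.items():
    naive = fps / power_1080p   # reusing the 1080p power figure everywhere
    accurate = fps / watts      # resolution-specific power
    print(f"{res}: naive={naive:.2f} fps/W, accurate={accurate:.2f} fps/W")
```

If power draw really rises with resolution, the naive method overstates efficiency most at 4K, which is exactly where these cards are marketed.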


----------



## owen10578 (Nov 18, 2020)

Damn, look at that efficiency! That's an insane performance/watt showing from AMD. I thought I was reading it wrong at first. Also, the overclocking gains are not bad at all at 9%+. Seems like a better buy than the 6800 XT IMO, especially since with RT on they perform sort of similarly...


----------



## Searing (Nov 18, 2020)

EzioAs said:


> Yeah, more VRAM is nice, but it shouldn't be $70 more. As for Smart Access Memory (or Resizable BAR), Nvidia has already assured people that they will support the feature soon.



Ridiculous. Before AMD did it, people were paying $70 more to go from 4 GB to 8 GB; now you get 16 GB total and you are complaining. I can't even...


----------



## Durvelle27 (Nov 18, 2020)

Raendor said:


> And hard to come by.


I had no problem getting one


----------



## Solid State Soul ( SSS ) (Nov 18, 2020)

Most impressive indeed, hats off to Lisa Su for enabling the true talent at AMD to show what they can do


----------



## bug (Nov 18, 2020)

kruk said:


> The frametime variance seems much lower on the RX 6800 series vs. the nVidia lineup, which should result in much smoother gameplay. If this holds for most of the tested games, then the 6800 will give you a much better experience over the 3070 than the percentage difference suggests.


Only in some titles, according to this: https://www.techpowerup.com/review/amd-radeon-rx-6800/39.html
And in some titles Nvidia still gets ahead.
Not worth an extra $80 to me.


----------



## F-man4 (Nov 19, 2020)

The 6800 is meh.
In most games it's inferior to the 3070.
If no 6800 Nano releases, then a modded ZOTAC 3070 will currently be the best one for ITX builds.

But there's good news about the 6800: it has low spikes, only 423 W within 130 μs.
That means we can use a Corsair SF450 on it with no risk!
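Whether such a transient is safe depends on the PSU's over-power-protection trip point and the spike's duration. A rough sanity check; the 1.3x OPP margin below is an assumption for illustration, not Corsair's published spec:

```python
def spike_ok(spike_w: float, psu_rating_w: float, opp_margin: float = 1.3) -> bool:
    """True if a brief transient stays under an assumed over-power-protection
    trip point of rating * opp_margin. Real margins vary by model (often
    somewhere around 1.2-1.5x); check the actual PSU specification."""
    return spike_w <= psu_rating_w * opp_margin

# The RX 6800's measured 423 W / 130 us transient vs. a Corsair SF450:
print(spike_ok(423, 450))  # True
```

Here the 423 W spike even sits below the SF450's 450 W continuous rating, so OPP headroom isn't the limiting factor at all; sustained load plus the rest of the system is.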


----------



## Khonjel (Nov 19, 2020)

Durvelle27 said:


> I had no problem getting one


Look at this guy^ He's so happy it's unbearable. I lost count of how many threads he visited and replied he's managed to snag a card.


----------



## nguyen (Nov 19, 2020)

The performance-per-watt calculation is really off when you base power consumption on a much older game at 1080p and divide it into the average FPS of all games tested.
Does the 6800 use 210 W in newer games like Metro Exodus, AC Valhalla, etc.? If it doesn't, then the power consumption figure is wrong, and so is the performance per watt.

Maybe we could use a watts-per-frame figure instead: lock a game to 120 FPS and measure the average power consumption. That's still a valid comparison, since many people play with a locked FPS.
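The two metrics being compared here can be sketched in a few lines of Python; all FPS and wattage figures below are made-up placeholders, not measured data from the review:

```python
# Two ways of computing GPU efficiency, as discussed above.
# All FPS and wattage figures are placeholders, not measured data.
fps_by_game = {"Game A": 140, "Game B": 95, "Game C": 110}
power_in_one_game = 210  # W, measured in a single (older) title

# Method 1: average FPS across all games divided by power from one game.
avg_fps = sum(fps_by_game.values()) / len(fps_by_game)
perf_per_watt = avg_fps / power_in_one_game  # FPS per watt

# Method 2: lock every card to the same frame rate and compare power draw.
locked_fps = 120
power_at_locked_fps = {"Card X": 180, "Card Y": 205}  # W, placeholders
watts_per_frame = {card: w / locked_fps for card, w in power_at_locked_fps.items()}

print(f"perf/W (mixed sources): {perf_per_watt:.2f} FPS/W")
for card, wpf in watts_per_frame.items():
    print(f"{card}: {wpf:.2f} W per frame at {locked_fps} FPS")
```

Method 1 mixes power from one workload with FPS from many, which is the objection raised; Method 2 keeps the workload fixed for every card.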


----------



## Lomskij (Nov 19, 2020)

W1zzard said:


> i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?



Purely IMHO: power consumption at the resolution that makes the most sense for that particular card, e.g. at 4K for a 4K card and at 1080p for a cheap 1080p card. Or would that make the comparison unfair?


----------



## Taraquin (Nov 19, 2020)

I like the 6800, but it should have cost $499, not $579. The value is not good vs. the 6800 XT and 3070 unless the exceptional power consumption is important to you.

The 6800 should be a great notebook chip: make a slightly downclocked 110-120 W variant. It would beat any existing notebook GPU by far.


----------



## W1zzard (Nov 19, 2020)

Lomskij said:


> Purely IMHO: power consumption at the resolution that makes the most sense for that particular card, e.g. at 4K for a 4K card and at 1080p for a cheap 1080p card. Or would that make the comparison unfair?


That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?


----------



## Lomskij (Nov 19, 2020)

W1zzard said:


> That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?



This one is easy: Nvidia markets it as a 4k card :-D
But I get your point.


----------



## Vayra86 (Nov 19, 2020)

Khonjel said:


> Like I feared. Bad value compared to the RTX 3070 across the board. But I think AMD is gonna drop the price when the RTX 3070 Ti launches, because unlike Nvidia, AMD can't squeeze in another SKU.



They are going to have to move on price, but a big factor this time around is also availability, and it also impacts price.

Still, 8~10 GB is not a great thing for this performance level, we're already seeing those numbers allocated today.



W1zzard said:


> That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?



The answer is people need to be educated on what numbers they're really looking at.

CPU performance is not going up as fast as GPU performance. Resolutions have made a quadruple jump from 1080p > 4K over the course of a few generations. Yes, this is going to shift the balance around in strange ways... it's like reading the news. Some people just read headlines and misinterpret the better half of them; others read the article to figure out what's really happening.

If you don't draw the line there, any news outlet is on a race to the bottom. You can't cater to stupid - we have social media for that.


----------



## Vya Domus (Nov 19, 2020)

Over 2.5 GHz overclocked, almost 25% higher clocks than the previous generation. That's crazy.



Cheeseball said:


> I wouldn't be recommending a RTX 3070 or RX 6800 (even if this has 16 GB of VRAM) for 4K usage. Maybe 4K for slightly older games (or modifying games with high-res texture packs) for the RX 6800. But for 1080p and 1440p? Definitely.



The 6800 is totally capable of 4K.



nguyen said:


> The performance-per-watt calculation is really off when you base power consumption on a much older game at 1080p and divide it into the average FPS of all games tested.
> Does the 6800 use 210 W in newer games like Metro Exodus, AC Valhalla, etc.? If it doesn't, then the power consumption figure is wrong, and so is the performance per watt.
> 
> Maybe we could use a watts-per-frame figure instead: lock a game to 120 FPS and measure the average power consumption. That's still a valid comparison, since many people play with a locked FPS.



What a bizarre and contrived way to try and invalidate AMD's far superior performance/watt, you gave me a good laugh.


----------



## Khonjel (Nov 19, 2020)

Vya Domus said:


> The 6800 is totally capable of 4K.


I kinda agree with him. Looking at the graphs, the 6000 series seems bandwidth-starved at 4K, while the 3000 series is no better with its cheapskate VRAM sizes.


----------



## lexluthermiester (Nov 19, 2020)

Mindweaver said:


> The 6800 is their highend card that is going up against the 3080 not the 3070.


The benchmarks don't show that. The 6800 seems to be firmly a 3070 competitor, as it's wedged in between the 3070 and 3080 performance-wise. AMD would be wise to adjust their prices quickly. And let's face reality: AMD has to be planning the 6900 & 6900 XT. Those will be the models that compete with (or even beat) the 3080, 3080 Ti and 3090.


Mindweaver said:


> I hate to say this in a sentence but $580 isn't bad for a highend card with that amount of vram.


Why would you hate to say that? It's an excellent point! The 6800 & 6800 XT are going to be a godsend for anyone wanting to do rendering and other tasks that require high amounts of VRAM. In that context, the price is excellent and the 6800 is a very well-balanced card for the money.

I said this over in the 6800 XT review, but it bears repeating: welcome back to the GPU performance party, AMD! Very nice, this!


----------



## Vya Domus (Nov 19, 2020)

Khonjel said:


> I kinda agree with him. Looking at the graphs 6000 series is bandwidth starved at 4k it seems.



I see no real evidence for that. Even in Red Dead Redemption 2, one of the worst games to run at 4K, performance scales the same way it does for something like a 3090, which has a lot more DRAM bandwidth. Had that been the case, 4K performance would be much worse.


----------



## bobmeix (Nov 19, 2020)

W1zzard said:


> That is the big question, and where to draw the line. Random example: 2080 Ti, 4K or 1440p?


I still believe that 3 graphs for average gaming power would clarify that; it's better to have more info than less.
This would also make the performance/watt graphs more relevant!


----------



## W1zzard (Nov 19, 2020)

This is all my games at all resolutions on 6800 XT. I think for next round of retesting I'll revise my power testing to be a bit more demanding


----------



## Khonjel (Nov 19, 2020)

Vya Domus said:


> I see no real evidence for that, even in Red Dead Redemption 2 one of the worst games to run in 4K the performance scales in the same way it does for something like a 3090 which has a lot more DRAM bandwidth. Had that being the case performance at 4K should be much worse.


I'm speaking from the overall benchmarks I've seen so far. Let's treat the 3080 and 6800 XT as being in the same ballpark performance-wise, with 1440p as our control environment.
1) In games where the 6800 XT leads the 3080 at 1440p, at 4K the 3080 leads or the difference is minuscule.
2) In games where the 6800 XT is a hair slower than the 3080 at 1440p, at 4K the 3080 leads by more.

The 6800, OTOH, is faster than the 3070 across the board, but at 4K the gap narrows.


----------



## Vya Domus (Nov 19, 2020)

Khonjel said:


> I'm speaking from the overall benchmarks I've seen so far. Let's treat the 3080 and 6800 XT as being in the same ballpark performance-wise, with 1440p as our control environment.
> 1) In games where the 6800 XT leads the 3080 at 1440p, at 4K the 3080 leads or the difference is minuscule.
> 2) In games where the 6800 XT is a hair slower than the 3080 at 1440p, at 4K the 3080 leads by more.
> 
> The 6800, OTOH, is faster than the 3070 across the board, but at 4K the gap narrows.



That's not the proper way to figure out whether a GPU runs out of memory bandwidth as you go up in resolution.

The 6800 and 6800 XT both have exactly the same memory bandwidth; at 4K the XT is 15% faster, at 1440p it's 12% faster. The XT scales better, not worse, which is the opposite of what you'd expect if it were indeed lacking memory bandwidth. That's also partly because at 1440p games are more CPU-bound, but it's fairly obvious there is nothing out of the ordinary which would suggest otherwise.
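The scaling argument above boils down to a ratio check. A tiny sketch, with placeholder FPS values chosen only to mirror the 12%/15% figures quoted (not the review's actual numbers):

```python
# If the XT's lead over the vanilla 6800 grows going from 1440p to 4K
# despite identical memory bandwidth, bandwidth is unlikely to be the limiter.
# FPS values are placeholders picked to mirror the 12%/15% figures.
fps = {
    "RX 6800":    {"1440p": 100.0, "4K": 60.0},
    "RX 6800 XT": {"1440p": 112.0, "4K": 69.0},
}

for res in ("1440p", "4K"):
    lead = fps["RX 6800 XT"][res] / fps["RX 6800"][res] - 1
    print(f"{res}: XT leads by {lead:.0%}")
```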


----------



## Khonjel (Nov 19, 2020)

Vya Domus said:


> That's not the proper way to figure out whether a GPU runs out of memory bandwidth as you go up in resolution.
> 
> The 6800 and 6800 XT both have exactly the same memory bandwidth; at 4K the XT is 15% faster, at 1440p it's 12% faster. The XT scales better, not worse, which is the opposite of what you'd expect if it were indeed lacking memory bandwidth. That's also partly because at 1440p games are more CPU-bound, but it's fairly obvious there is nothing out of the ordinary which would suggest otherwise.


I'm not saying the RX 6800 is bandwidth-starved while the 6800 XT is not. I'm saying they are both bandwidth-starved at 4K, compared to the 3000 series, that is.


----------



## W1zzard (Nov 19, 2020)

Khonjel said:


> I'm not saying the RX 6800 is bandwidth-starved while the 6800 XT is not. I'm saying they are both bandwidth-starved at 4K, compared to the 3000 series, that is.


While I have no data to support this claim, have you considered that L3 cache might simply see lower hit rates at 4K?


----------



## bug (Nov 19, 2020)

Vya Domus said:


> The 6800 is totally capable of 4K.



I think that's highly dependent on expectations.
I was considering getting a 3070 that would have to do some 4k gaming, but I know I'll have to lower settings here and there. With future titles, you'll probably need to scale back more. Whether that defines a 4k card for you or not, only you can decide.

Edit: It gets even more complicated when you factor in RTRT.


----------



## Khonjel (Nov 19, 2020)

W1zzard said:


> While I have no data to support this claim, have you considered that L3 cache might simply see lower hit rates at 4K?


While I don't know for sure whether bandwidth starvation is happening, looking at multiple reviews it seems both the RX 6800 and RX 6800 XT shine brightest at 1080p, while their lead over their Nvidia counterparts shrinks the higher the resolution goes.
As for the L3 cache hit rate, I'll admit I'm not knowledgeable about the subject. But according to AMD, it does indeed see a lower hit rate at 4K.

While I can't find the slide image, here's a post discussing the AMD slide: 54% hit rate at 4K, 67-68% at 1440p and 75% at 1080p. So you're saying the reduced hit rate means Infinity Cache is less effective at higher resolutions?
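Those per-resolution hit rates can be plugged into a back-of-the-envelope two-level bandwidth model. The hit rates are the ones quoted from AMD's slide; both bandwidth constants below are illustrative assumptions, not AMD specifications:

```python
# Rough effective-bandwidth model for a cache + DRAM memory system.
# Bandwidth figures are illustrative assumptions; the hit rates are the
# per-resolution numbers quoted from AMD's Infinity Cache slide.
DRAM_BW = 512    # GB/s, roughly 256-bit GDDR6 at 16 Gbps (assumed)
CACHE_BW = 1987  # GB/s, assumed on-die cache peak bandwidth

hit_rates = {"1080p": 0.75, "1440p": 0.67, "4K": 0.54}

def effective_bw(hit_rate: float) -> float:
    # Hits are served at cache speed; misses fall through to DRAM.
    return hit_rate * CACHE_BW + (1 - hit_rate) * DRAM_BW

for res, hr in hit_rates.items():
    print(f"{res}: ~{effective_bw(hr):.0f} GB/s effective")
```

Whatever the exact constants, the model shows why a lower hit rate at 4K translates directly into lower effective bandwidth, which is the point being debated.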


----------



## W1zzard (Nov 19, 2020)

Khonjel said:


> So you're saying the reduced hit rate means Infinity Cache is less effective at higher resolutions?


That's exactly the definition of "reduced hit rate" in caching


----------



## Khonjel (Nov 19, 2020)

W1zzard said:


> That's exactly the definition of "reduced hit rate" in caching


So isn't it a roundabout way of saying RDNA 2 is bandwidth starved at higher resolution?
Infinity Cache was supposed to help alleviate the bandwidth limitation of 256-bit GDDR6 wasn't it?


----------



## bug (Nov 19, 2020)

Khonjel said:


> So isn't it a roundabout way of saying RDNA 2 is bandwidth starved at higher resolution?
> Infinity Cache was supposed to help alleviate the bandwidth limitation of 256-bit GDDR6 wasn't it?


It's complicated. If the GPU is frequently waiting for data to process, then it's starved. But that's not easily measured.
Plus, memory is always slower than the GPU (or CPU for that matter) so in that sense you could say all GPUs are bandwidth starved (otherwise we wouldn't need cache in the first place).


----------



## Mindweaver (Nov 19, 2020)

lexluthermiester said:


> The benchmarks don't show that. The 6800 seems to be firmly a 3070 competitor, as it's wedged in between the 3070 and 3080 performance-wise. AMD would be wise to adjust their prices quickly. And let's face reality: AMD has to be planning the 6900 & 6900 XT. Those will be the models that compete with (or even beat) the 3080, 3080 Ti and 3090.


The 6800 is faster in every 1440p and 2160p benchmark, as it should be given the amount of VRAM it has compared to the 3070, and in most cases it's 10-20 FPS faster at those resolutions. It will get even better with mature drivers. Again, I'm guessing the 6700 will be a little slower than a 3070 and around $50-100 cheaper, and it will probably overclock to reach the 3070, or close enough to it.

I really think if they wanted it to go up against the 3070, we would have gotten a 6850 moniker. I bet they are banking on the 6700 to do well, maybe even cannibalize the 6800 series, given the number of dies that didn't make the 6800 cut and will just be sold as 6700s. Just my guess; I haven't put any time into researching the 6700 other than knowing it's coming out, but with these 6800 benchmarks I will be looking out for it for sure.



lexluthermiester said:


> Why would you hate to say that? It's an excellent point! The 6800 & 6800 XT are going to be a godsend for anyone wanting to do rendering and other tasks that require high amounts of VRAM. In that context, the price is excellent and the 6800 is a very well-balanced card for the money.
> 
> I said this over in the 6800 XT review, but it bears repeating: welcome back to the GPU performance party, AMD! Very nice, this!


Because I don't want there to be a $580 card... lol. I've bought plenty of high-end cards new for around 400 bucks after they'd been out for a few months to a year (_obviously that has nothing to do with the MSRP, just me being thrifty... haha_). So you are right, $580 is great and will be a godsend for anyone wanting to do rendering, and how few we will actually see is probably why they priced it higher than the 3070.


----------



## lexluthermiester (Nov 19, 2020)

Mindweaver said:


> The 6800 is faster in every 1440p and 2160p benchmark, as it should be given the amount of VRAM it has compared to the 3070, and in most cases it's 10-20 FPS faster at those resolutions. It will get even better with mature drivers. Again, I'm guessing the 6700 will be a little slower than a 3070 and around $50-100 cheaper, and it will probably overclock to reach the 3070, or close enough to it.


Are we looking at the same review? Because the averages clearly show what I was referring to:








AMD Radeon RX 6800 Review - www.techpowerup.com




Please understand that I'm not in any way dissing or downplaying the new Radeon hotness here. This is one hell of a showing, and these cards put AMD back in the top-tier GPU arena!


----------



## bug (Nov 19, 2020)

lexluthermiester said:


> Are we looking at the same review?


You are, but your glasses are tinted differently, that's all


----------



## lexluthermiester (Nov 19, 2020)

bug said:


> You are, but your glasses are tinted differently, that's all


No, no glasses. Just reading it as stated by W1zzard...


----------



## bug (Nov 19, 2020)

lexluthermiester said:


> No, no glasses. Just reading it as stated by W1zzard...


Take a look at your avatar


----------



## lexluthermiester (Nov 19, 2020)

bug said:


> Take a look at your avatar


Gotcha!


----------



## EsaT (Nov 19, 2020)

W1zzard said:


> This is all my games at all resolutions on 6800 XT. I think for next round of retesting I'll revise my power testing to be a bit more demanding


That's more in line with what's mostly been measured for power consumption.

There's certainly a very valid point in using the same game as a reference for all cards. But the measured power consumption coming out lower than in many games at typical high-end settings isn't exactly good for the average user who doesn't understand that.

Though I wonder if there's still a Power Save profile like on Vega 64, which dropped power draw dramatically with only a minor performance hit:
https://www.techpowerup.com/review/amd-radeon-rx-vega-64/29.html


----------



## bug (Nov 19, 2020)

lexluthermiester said:


> Gotcha!


Fwiw, I've also been flamed for saying you can't put a piece of paper between the two.


----------



## Mindweaver (Nov 19, 2020)

lexluthermiester said:


> Are we looking at the same review? Because the averages clearly show what I was referring to;
> 
> 
> 
> ...


What were we talking about? lol. You don't agree that the 6800 is faster than the 3070 in most of the benchmarks? I would agree with you if the 6800 only had 8 GB, but it has 16 GB. We will have to disagree. I don't even know why we are still talking about this, because I don't plan to buy either card. I don't disagree with you entirely, but if they drop the 6800 to the 3070's price, that doesn't leave a lot of room for the 6700. I think the 6700 will be priced lower than the 3070, and the 6700 XT will be priced a little higher or the same as the 3070, with VRAM closer to the 3070's.


----------



## lexluthermiester (Nov 19, 2020)

Mindweaver said:


> What were we talking about? lol You don't agree that the 6800 is faster than the 3070 in most of the benchmarks?


No, I'm saying the 6800 is direct competition for the 3070, not the 3080, because its performance falls closer to the 3070's mark. I drew that conclusion from the numbers shown here by W1zzard and elsewhere. The 6800 XT is competition for the 3080 and a solid alternative to the 3090. AMD is very likely planning a 6900/6900 XT for early next year, which will likely be the direct competition for the 3090.


----------



## Mindweaver (Nov 19, 2020)

lexluthermiester said:


> No, I'm saying the 6800 is direct competition for the 3070, not the 3080, because its performance falls closer to the 3070's mark. I drew that conclusion from the numbers shown here by W1zzard and elsewhere. The 6800 XT is competition for the 3080 and a solid alternative to the 3090. AMD is very likely planning a 6900/6900 XT for early next year, which will likely be the direct competition for the 3090.


Yeah, it looks good for AMD this round.


----------



## lexluthermiester (Nov 19, 2020)

Mindweaver said:


> Yeah, it looks good for AMD this round.


Totally agree! This is an exceptional showing from AMD. They're back in the ring and have come out swinging!


----------



## W1zzard (Nov 19, 2020)

This review has been updated with new gaming power consumption numbers for RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800 and RX 6800 XT. For these cards I'm now running Metro at 1440p and not 1080p, to ensure proper loading.

The perf/W charts have been updated, too, and relevant texts as well.


----------



## bobmeix (Nov 19, 2020)

W1zzard said:


> This review has been updated with new gaming power consumption numbers for RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800 and RX 6800 XT. For these cards I'm now running Metro at 1440p and not 1080p, to ensure proper loading.
> 
> The perf/W charts have been updated, too, and relevant texts as well.


Greatly appreciated!


----------



## Cheeseball (Nov 19, 2020)

Vya Domus said:


> The 6800 is totally capable of 4K.



Did I say that it's _not_ capable of 4K in general? If one has a 2160p120 or 2160p144 monitor and is aiming at high graphical settings in today's AAA games, I would recommend going with the XT instead.


----------



## InVasMani (Nov 20, 2020)

lexluthermiester said:


> No, I'm saying the 6800 is direct competition for the 3070, not the 3080, because its performance falls closer to the 3070's mark. I drew that conclusion from the numbers shown here by W1zzard and elsewhere. The 6800 XT is competition for the 3080 and a solid alternative to the 3090. AMD is very likely planning a 6900/6900 XT for early next year, which will likely be the direct competition for the 3090.


I wonder what the odds are that AMD reduces the bus width and utilizes HBM2/HBM2E. Infinity Cache means that with HBM memory they could use a lower bus width and still have better effective bandwidth than GDDR6/GDDR6X, even more so than before. Cost is still a consideration, though the power consumption and heat benefits offset that in turn. Here are some hypothetical comparisons factoring in Infinity Cache with different memory types and bus widths. The fact that Infinity Cache combined with 128-bit HBM2E can have more than 3 times the bandwidth of 384-bit GDDR6X is a rather staggering difference.

256-bit GDDR6 624 GB/s _(w/o Infinity Cache)_
*128-bit HBM2 832 GB/s (with Infinity Cache), 8 GB capacity*
384-bit GDDR6X 936 GB/s _(w/o Infinity Cache)_
*128-bit HBM2E 997.75 GB/s (with Infinity Cache), 24 GB capacity*

I think one reason AMD could even consider this is the mobile segment, where heat output and power consumption matter much more given battery life and the cooling constraints of the form factor. AMD has an efficiency winner on its hands with RDNA2, and such a chip could dominate the mobile gaming segment in a really big way. Still, I'm wondering how bandwidth-dependent AMD's RDNA2 architecture is compared to Nvidia's Ampere; that's obviously another big factor in the bandwidth and corresponding bus-width options that Infinity Cache adds a neat twist on.


----------



## Berfs1 (Nov 20, 2020)

lynx29 said:


> dude it beats a 2080 ti and is $580. its a great deal imo.  i am very happy AMD is competing again and so should you. future of PC gaming is bright.


I mean, a 3070 also does that at $500, but OK.


----------



## Dyatlov A (Nov 20, 2020)

Ohh, the performance/watt numbers are different now. I'll have to re-read the test and rethink everything.


----------



## W1zzard (Nov 20, 2020)

Dyatlov A said:


> Ohh, the performance/watt numbers are different now. I'll have to re-read the test and rethink everything.


Indeed  What's the outcome of your reconsideration? Personally I don't think it changes much


----------



## bobmeix (Nov 20, 2020)

W1zzard said:


> This review has been updated with new gaming power consumption numbers for RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800 and RX 6800 XT. For these cards I'm now running Metro at 1440p and not 1080p, to ensure proper loading.
> 
> The perf/W charts have been updated, too, and relevant texts as well.


Now, looking at the new numbers, it's even clearer to me what my problem with Ampere is.
At 1440p, the 3080 is 1% more efficient than the 2080 Ti, and at 4K it is 7-8% more efficient (with the 1440p gaming power numbers used; 4K power numbers wouldn't have changed the dynamic much, I suppose).
Jumping from 1080p to 1440p for the gaming power measurement increased the average consumption of the 2080 Ti by only 2 W (273 W -> 275 W), but the 3080 jumped 36 W (303 W -> 339 W). This is more realistic in my opinion, because at 1080p the 3080 was CPU-limited to a greater extent than the 2080 Ti.
This speaks volumes about how efficient Turing was (and Pascal indirectly), even if overpriced and not that popular (in my opinion).
Don't get me wrong, the 3080 delivers great performance (at 4K at least) and is a real upgrade, but at what cost?! At 1440p its performance scales directly with power; not a great feat for a new(ish) node pushed to the limit.
AMD did a much better job with RDNA2 this time. It reminds me of the efficiency jump from Kepler to Maxwell. Not even a node shrink, in either case (Kepler -> Maxwell, RDNA -> RDNA2)!
We need 5 nm to hopefully see high-performance sub-250 W cards again, if sub-250 W will even be a thing in the future, with these increasing power consumption numbers.
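The efficiency comparisons above are just FPS-per-watt ratios. A quick sketch with placeholder FPS and wattage values (not the review's data) shows how a card can be much faster yet barely more efficient:

```python
# Relative efficiency (perf/W) between two cards at one resolution.
# FPS and wattage are placeholder values, not the review's data.
cards = {
    "Card A": {"fps": 100.0, "watts": 275.0},
    "Card B": {"fps": 125.0, "watts": 339.0},
}

eff = {name: d["fps"] / d["watts"] for name, d in cards.items()}
delta = (eff["Card B"] / eff["Card A"] - 1) * 100
print(f"Card B is {delta:+.1f}% more efficient per watt than Card A")
```

With these placeholders, Card B is 25% faster but draws proportionally more power, so its perf/W lead is only about 1%, which is the shape of the Turing-vs-Ampere comparison described above.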


----------



## basco (Nov 20, 2020)

i would not have thought that the 6800 is so close to 6800xt and now waiting on bios to flash it to xt version or is this lasercuts or whatever so not flashable?


----------



## Dyatlov A (Nov 20, 2020)

W1zzard said:


> Indeed  What's the outcome of your reconsideration? Personally I don't think it changes much



I prefer to buy the most efficient GPU; based on your tests I previously had a GeForce 1660 Ti, now a GeForce 3070. But I almost went and bought an RX 6800 after seeing the much better performance/watt number. With the newest figures (a small difference), I think I'm gonna keep my 3070. With DLSS it has even better efficiency, and it will also receive Smart Access Memory.


----------



## Turmania (Nov 20, 2020)

W1zzard said:


> Indeed  What's the outcome of your reconsideration? Personally I don't think it changes much



I think your initial test results put the average gaming power consumption around 160 W; now, updated, it's just a shade over the RTX 3070's. It is really about choices in the end, but we do know that the Ampere cards are great undervolters if someone wants to go down that road.


----------



## Feyd (Nov 20, 2020)

I don't know if this is a driver issue or something else entirely, but in this review, the image quality with ray tracing in Watch Dogs Legion on AMD cards is clearly different from that of NVIDIA cards. I am really curious whether some owner of an RX 6800 or RX 6800 XT could please test if it's true, and if so, whether it applies to more titles. Thanks for any information on this.


----------



## W1zzard (Nov 20, 2020)

Feyd said:


> I don't know if this is a driver issue or something else entirely, but in this review, the image quality with ray tracing in Watch Dogs Legion on AMD cards is clearly different from that of NVIDIA cards. I am really curious whether some owner of an RX 6800 or RX 6800 XT could please test if it's true, and if so, whether it applies to more titles. Thanks for any information on this.


AMD lists this in their "known issues" document; some reviewers chose to repro the issue and report on it. I'm sure there are many driver issues in RT, because all RT so far has been developed for and tested on NV hardware only. I'm actually impressed you can run existing games at all.


----------



## RandallFlagg (Nov 21, 2020)

Is there any reason that every single one of these cards seems to be full-length 3 fan designs?


----------



## Scircura (Nov 22, 2020)

W1zzard said:


> i guess this thing is evolving to "what do we want to test for power?". i would definitely throw out "cold card". but the other 3 results are all valid in their own way. opinions?


This is super interesting, and reveals something that no other reviewer has noticed. 

Most reviews only measure power consumption during an intense, fully GPU constrained task, using FurMark or whatever game seems to draw the most power when the framerate is uncapped. But that's not how most people use the card, and it may not represent the relative efficiency of these cards in normal use.

If the RX 6000 series can throttle power this well in moments of light load, then it's likely to benefit more than the RTX 3000 series when running vsynced or framecapped.

I would love to know: if you run, for example, the Doom Eternal 4K benchmark with a 120Hz or 100Hz frame cap or dynamic vsync (which are the RX 6800's 50%/95% uncapped frametimes), does that measurably lower the RX 6800's average power consumption? Does it do the same for the RTX 3070? 

The answer would be the deciding factor for which card I buy, since I want the smoothest performance I can get while staying under about 200W average thermal load.


----------



## John Naylor (Nov 22, 2020)

Interesting product placement, though I don't know that it will find a home priced as it is... If I was looking at the 3070, I don't see anything that would make it worthwhile to move up to the $580 price point, and if I did move up, I'd be more inclined to go to $650 or $700.


As to the extra VRAM, it's the proverbial "teats on a bull". In every case where multiple VRAM versions of a card have been issued (yes, every single instance of comparison testing on a reliable website, from the 6xx, 7xx, 9xx and 10-series where this has occurred), it has never been shown that the extra VRAM brought anything to the table. By the time you jack up the settings to a point where the VRAM difference is significant, you have outstripped the capabilities of the GPU; a 33% increase in FPS is meaningless when that increase is from 15 to 20 FPS. While a small number of games, primarily poor console ports, have been exceptions, the exception, as the saying goes, proves the rule.









GTX 770 4GB vs 2GB Showdown - AlienBabelTech (alienbabeltech.com): EVGA's GTX 770 4GB tested against the 2GB version at 1920x1080, 2560x1600 and 5760x1080.




_"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600.  We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference.  If we start to add even more AA, in most cases, the frame rates will drop to unplayable on both cards.   This leaves five games out of 30 where a 4GB GTX 770 gives more than a 1 frame per second difference over the 2GB-equipped GTX 770.  And one of them, Metro: Last Light still isn’t even quite a single frame difference. 

There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card, as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s. "_









Is 4GB of VRAM enough? AMD's Fury X faces off with Nvidia's GTX 980 Ti, Titan X - ExtremeTech (www.extremetech.com): 15 titles tested to find out.




_"Some games won’t use much VRAM, no matter how much you offer them, while others are more opportunistic. This is critically important for our purposes, because there’s not an automatic link between the amount of VRAM a game is using and the amount of VRAM it actually requires to run.  Our first article on the Fury X showed how  Shadow of Mordor actually used dramatically more VRAM on the GTX Titan X as compared with the GTX 980 Ti, without offering a higher frame rate.  

While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on  any current GPU."_

W1zzard obviously agrees here as he again echoes what others have said before when he writes:

_"Much has been talked about VRAM size during NVIDIA's Ampere launch. The GeForce RTX 3080 has 10 GB VRAM, which I still think is plenty for today and the near future. Planning many years ahead with hardware purchases makes no sense, especially when it comes to VRAM—you'll run out of shading power long before memory becomes a problem. "_

This kinda reminds me of the 960/380 era... the 380X was $230-240ish... the $60-70 jump to the 970 was just too small to ignore. There's nothing here to complain about; it's a solid card. However, priced between the 3070 and the 6800 XT/3080, AMD's primary competition here is itself. At $450, it would make a lot more sense.


----------



## TheoneandonlyMrK (Nov 22, 2020)

John Naylor said:


> Interesting product placement, though I don't know that it will find a home priced as it is... If I was looking at the 3070, I don't see anything that would make it worthwhile to move up to the $580 price point, and if I did move up, I'd be more inclined to go to $650 or $700.
> 
> 
> As to the extra VRAM, it's the proverbial "teats on a bull". In every case where multiple VRAM versions of a card have been issued (yes, every single instance of comparison testing on a reliable website, from the 6xx, 7xx, 9xx and 10-series where this has occurred), it has never been shown that the extra VRAM brought anything to the table. By the time you jack up the settings to a point where the VRAM difference is significant, you have outstripped the capabilities of the GPU; a 33% increase in FPS is meaningless when that increase is from 15 to 20 FPS. While a small number of games, primarily poor console ports, have been exceptions, the exception, as the saying goes, proves the rule.
> ...


That's your opinion, parts of which I disagree with, but technically ANY card is more attractive at £130 cheaper, is it not?


----------



## HD64G (Jan 4, 2021)

The most efficient GPU can get almost 20% more efficient through undervolting: https://www.igorslab.de/en/the-grea...h-big-navi-and-the-morepower-tool-practice/8/


----------

