
AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

AMD have never delivered strong driver performance increases

Are you on drugs? The 680, a card that beat the 7950 at launch, now generally gets trounced by the 7950 by 50%+.


Or the 5700 XT, a card that was conclusively beaten by the 2070S when the 2070S launched, but now gets within spitting distance of it and beats it in some games.


AMD doesn't deliver driver performance gains, my ass. There's a reason the AMD fanbois cling to FineWine(tm): AMD cards do tend to get stronger over time.
 
According to DF, they didn't really do an apples-to-apples comparison.

They took a guess.
It's better than the outright claim that the RX 6900 XT is at RTX 2060 level.

Since RT is memory-bandwidth intensive with a small memory footprint (the BVH is essentially a search index over the geometry data), a very fast 128 MB Infinity Cache is a good match for the RT workload; that cache is missing from the RDNA 2 based game consoles.

Without a high-speed 128 MB Infinity Cache, the consoles' RDNA 2 RT cores act more like AMD's gen 1 HW RT cores.
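To sketch why a big cache suits that workload: BVH traversal is lots of small, dependent reads into a compact tree, so the working set is tiny but the access count is high. A toy 1D version (purely illustrative; nothing here reflects AMD's actual hardware or data layout):

```python
# Toy 1D BVH: nodes hold an interval instead of a 3D AABB. The point is
# the access pattern: many small pointer-chasing reads into a structure
# far smaller than the geometry it indexes - exactly what a large
# on-die cache accelerates.
from dataclasses import dataclass

@dataclass
class Node:
    lo: float          # interval min (1D stand-in for a bounding box)
    hi: float          # interval max
    left: int = -1     # child indices into the node array (-1 = leaf)
    right: int = -1

def count_node_visits(nodes, point):
    """Walk the tree for one query; count how many nodes get touched."""
    visits, stack, hits = 0, [0], []
    while stack:
        i = stack.pop()
        n = nodes[i]
        visits += 1
        if not (n.lo <= point <= n.hi):
            continue                 # prune: whole subtree skipped
        if n.left == -1:             # leaf reached
            hits.append(i)
        else:
            stack.extend((n.left, n.right))
    return visits, hits

# 7-node tree over [0, 8): root, two internal nodes, four leaves
nodes = [
    Node(0, 8, 1, 2),                                # 0: root
    Node(0, 4, 3, 4),                                # 1
    Node(4, 8, 5, 6),                                # 2
    Node(0, 2), Node(2, 4), Node(4, 6), Node(6, 8),  # 3-6: leaves
]
visits, hits = count_node_visits(nodes, 2.5)
print(visits, hits)   # few nodes visited, one leaf hit
```

Scale the node count up by millions of rays per frame and the bandwidth pressure (and the value of keeping the tree in a fast cache) becomes obvious.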
 
Lmao .. no one has ever argued who has the most competent driver development team and for good reason .. but you seem to be the exception .. good luck with convincing the entire gaming community .. you have my support in your quest.
 
Lmao .. no one has ever argued who has the most competent driver development team and for good reason .. but you seem to be the exception .. good luck with convincing the entire gaming community .. you have my support in your quest.

Why is it on every product launch we get newly joined members whose only job seems to be to astroturf for the competition. Does one's head in.
 
Why is it on every product launch we get newly joined members whose only job seems to be to astroturf for the competition. Does one's head in.
.... And yet another snowflake who resorts to shaming and safe spaces to aid their argument .. lmao
 
Why is it on every product launch we get newly joined members whose only job seems to be to astroturf for the competition. Does one's head in.
Posters like Basilix are what sustain NVIDIA's near monopoly in discrete GPUs.
 
Speculation about AMD's reason for the higher price of the 6800 vs the 3070 has been interesting to read over the past couple of hours.

Some speculated that the ~$80 difference was a "feature tax": SAM + RAGE Mode allowing it to eke ahead of a 3070, assuming an all-AMD 5000/500/6000 ecosystem. GamersNexus' pinned YT post seemed to think similarly; that the ability to use RAGE Mode + SAM was the reasoning behind the increase (not that they agreed or disagreed with it).

But others speculated that it was really to push people towards the 6800 XT ("well, if I'm going to fork out $580, I may as well fork out a bit more and get the 6800 XT for an extra $70"); the same reason the 6900 XT is also priced a lot higher despite there being no likely way to squeeze a "6850 XT" or "6900" (non-XT) between the 6800 XT and 6900 XT.

And still others speculate it was simply so they could make some extra $$$ before dropping the price when a replacement 3070 Ti/S comes out, suddenly wiping out any reason to buy an upcoming 3060 Ti/S or plain 3070 when you can have 2080 Ti/3070 performance for cheaper.

And the last one I've seen echoed a few times: the price was simply a placeholder, since they didn't yet know how much the 3070 would sell for when filming the show, and they could match or lower the price closer to release.
 
Lmao .. no one has ever argued who has the most competent driver development team and for good reason .. but you seem to be the exception .. good luck with convincing the entire gaming community .. you have my support in your quest.
Higher clock speed reduces the workload on driver teams, i.e. it shifts performance towards serial speed and away from very wide parallelism. It's the same idea as G8X's high shader clock.
 
They can announce whatever they want; until hard data proves it's capable of competing, I take it that it's not.

In the GPU space, AMD has lately been all talk and no results.
 
People accept nvidia's claims so uncritically that they don't even remember the results after reviews are published. How much faster is the 3080? 19.2 percent: the 3080 is 19.2 percent faster on average than the 2080 Ti at 1440p. And the 2080 Ti was a poor uplift itself, only 14 percent faster than the 2080 Super. Here is the proof (Ryzen 3950X results):


I don't care about 4K or all the other ways nVidia misleads people; I want actual in-game fps to increase at 1440p. If that is what AMD delivers (and their 1440p numbers look incredible), it's a slam dunk. Even the base 6800 might be VERY close to the 3080, yet $120 cheaper (really $220 cheaper, since the FE was never actually on sale), and with 6 GB more VRAM. At 1440p there might only be a 5 percent difference. We'll see.
I didn't even bother to comment on nVidia's claims. I'm only referring to TPU results from 23 games; those are available for everyone who wants to read them. BTW the 3090, 6900 XT, 3080 and probably 6800 XT make more sense for 4K (or business). I also don't care about 4K; I'm mostly waiting for the mid-range cards, below even the 3070 and 6800.
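For what it's worth, the two TPU percentages quoted above chain together, since relative performance multiplies:

```python
# Quick arithmetic on the figures quoted above:
# 3080 is +19.2% over the 2080 Ti, 2080 Ti is +14% over the 2080 Super.
r_3080_vs_2080ti = 1.192
r_2080ti_vs_2080s = 1.14

r_3080_vs_2080s = r_3080_vs_2080ti * r_2080ti_vs_2080s
print(f"3080 vs 2080 Super: +{(r_3080_vs_2080s - 1) * 100:.1f}%")
```

So roughly a +36% gen-on-gen step at the same tier (1.192 x 1.14 ≈ 1.359), not 19.2% + 14% = 33.2%; the ratios multiply rather than add.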
 
So given the bandwidth of the infinity cache...
[attached: Infinity Cache bandwidth chart]


Let's assume AMD starts using double-capacity GDDR6 chips in the future...

Option 1) 6x 2 GB chips + 2x 4 GB chips = 20 GB VRAM
Option 2) 4x 4 GB chips + 4x 2 GB chips = 24 GB VRAM
Option 3) 6x 4 GB chips + 2x 2 GB chips = 28 GB VRAM
Option 4) 8x 4 GB chips = 32 GB VRAM
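The mix-and-match math above follows from the bus: a 256-bit bus is eight 32-bit GDDR6 channels, i.e. eight chip slots to fill (4 GB-per-chip GDDR6 is hypothetical here, as in the options themselves):

```python
# Sketch of the chip-mix arithmetic: on a 256-bit bus there are eight
# 32-bit GDDR6 channels, so exactly eight chips, and total VRAM is just
# the sum of their (per-chip) capacities.

def vram_gb(chip_sizes_gb):
    assert len(chip_sizes_gb) == 8, "256-bit bus -> 8 chips"
    return sum(chip_sizes_gb)

options = {
    "Option 1": [2] * 6 + [4] * 2,
    "Option 2": [4] * 4 + [2] * 4,
    "Option 3": [4] * 6 + [2] * 2,
    "Option 4": [4] * 8,
}
for name, chips in options.items():
    print(name, vram_gb(chips), "GB")
```

Mixed densities like Options 1-3 would make part of the capacity run at reduced effective width (as on cards that mixed chip densities in the past), which is presumably why uniform configurations are the norm.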

It's really fascinating to think about the Infinity Cache implications combined with APUs and some of the brute-force CPU capabilities of Threadripper and Epyc. Then top it off with AMD picking up the Xilinx FPGAs, and it's wild; things are getting very Skynet/Matrix/Borg quickly. You can't put this genie back in the bottle.
 
I believe they indicated the 6900 would be an AMD exclusive .. which raises all sorts of concerns if true .. given that their past exclusives have been shocking in comparison to AiB partner versions .. maybe they have concerns about their top-end card .. so much so that they want to keep control of its functioning .. regardless of the reason, I can't see any valid argument as to why they wouldn't want AiB partners to extract every last drop from their products by making them available to them
It might be a yield issue. Maybe they can't produce enough of those GPUs, or maybe it is a last-minute card and the AIBs did not have enough time to produce cards... Anyhow, I am sure the AIBs' 6800 XT OC versions will be pretty damn close to the 6900 XT, for much less... I hope the 6900 XT PCB will be top level and the GPUs extremely well binned. If not, it is going to be hard to justify such a card...
 
Speculation about AMD's reason for the higher price of the 6800 vs the 3070 has been interesting to read over the past couple of hours.
...
No mention of 8gb vs 16gb?
 
When will the review NDA be lifted?
 
Yep .. aligning themselves with Apple's reality distortion field ... lmao

It's real when it exists in real hands ... everything else is vapour .. we will know soon enough what the realities are

Well, I'm old enough to remember that back in the GeForce 2 days, when Nvidia's reign was threatened, they released the Detonator drivers, which resulted in a 20-40 percent increase in performance .. the history is there for Nvidia ... sadly it's never been there for AMD ... regardless of speculation, AMD have never delivered strong driver performance increases .. you can deny reality until it bites you, and I've been bitten by AMD's claims often .. that's why I'm sceptical until it's independently tested.
What performance increase from the Detonator drivers, the kind gained by cheating in benchmarks? Nvidia has been caught doing that many times.
 
Again with the marketing fluff. Who cares about corporate PR gibberish? Let your product prove itself in the real world instead of the paper launch designed to create pointless hype.
 
Why is it on every product launch we get newly joined members whose only job seems to be to astroturf for the competition. Does one's head in.
A little update: there is a difference between astroturfing and flaming. I don't do the latter, it breaks the rules; but you get to keep your head up, since nobody can circumlocute a good astroturf.

What performance increase by detonator drivers, by cheating in benchmarks? nvidia have been caught doing that many times
Young kids... always straight to the point. Spoils the pleasure.

PS: Samsung 8nm goes brrr...
 
What uhh, what titles out there support AMD's ray tracing right now?
Tic-Tac-Toe 3d? :p

I glanced at something implying that any RT title works, but nothing saying there is a single standard for RT.

Why is it on every product launch we get newly joined members whose only job seems to be to astroturf for the competition. Does one's head in.
Paid shills?
 
Speculation about AMD's reason for the higher price of the 6800 vs the 3070 has been interesting to read over the past couple of hours.
...
I'll add my speculation: they finalized the video before the release of the 3070 and did not know it was going to be priced that low. I believe if they hear the price feedback, the MSRP may drop by Nov 18th.
 
It just occurred to me: the 6800 XT shroud design has no vents on the side with the I/O, so all that heat gets shot out the top of the card, directly onto the CPU. Nvidia has the better reference design here: the hottest air gets pushed outside the case next to the I/O, and the less hot air is exhausted into the case at the rear of the card. Hmm, it will be interesting to see CPU temps with this GPU. That seems like a real design flaw: the air is literally going to hit the CPU as it leaves the card, especially with an air cooler, which will suck that hot air right in before it has a chance to escape.

@TheLostSwede curious your thoughts on this design, am I understanding it wrong?
 