Wednesday, October 28th 2020

AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

AMD (NASDAQ: AMD) today unveiled the AMD Radeon RX 6000 Series graphics cards, delivering powerhouse performance, incredibly life-like visuals, and must-have features that set a new standard for enthusiast-class PC gaming experiences. Representing the forefront of extreme engineering and design, the highly anticipated AMD Radeon RX 6000 Series includes the AMD Radeon RX 6800 and Radeon RX 6800 XT graphics cards, as well as the new flagship Radeon RX 6900 XT - the fastest AMD gaming graphics card ever developed.

AMD Radeon RX 6000 Series graphics cards are built upon groundbreaking AMD RDNA 2 gaming architecture, a new foundation for next-generation consoles, PCs, laptops and mobile devices, designed to deliver the optimal combination of performance and power efficiency. AMD RDNA 2 gaming architecture provides up to 2X higher performance in select titles with the AMD Radeon RX 6900 XT graphics card compared to the AMD Radeon RX 5700 XT graphics card built on AMD RDNA architecture, and up to 54 percent more performance-per-watt when comparing the AMD Radeon RX 6800 XT graphics card to the AMD Radeon RX 5700 XT graphics card using the same 7 nm process technology.
AMD RDNA 2 offers a number of innovations, including applying advanced power saving techniques to high-performance compute units to improve energy efficiency by up to 30 percent per cycle per compute unit, and leveraging high-speed design methodologies to provide up to a 30 percent frequency boost at the same power level. It also includes new AMD Infinity Cache technology that offers up to 2.4X greater bandwidth-per-watt compared to GDDR6-only AMD RDNA-based architectural designs.

"Today's announcement is the culmination of years of R&D focused on bringing the best of AMD Radeon graphics to the enthusiast and ultra-enthusiast gaming markets, and represents a major evolution in PC gaming," said Scott Herkelman, corporate vice president and general manager, Graphics Business Unit at AMD. "The new AMD Radeon RX 6800, RX 6800 XT and RX 6900 XT graphics cards deliver world class 4K and 1440p performance in major AAA titles, new levels of immersion with breathtaking life-like visuals, and must-have features that provide the ultimate gaming experiences. I can't wait for gamers to get these incredible new graphics cards in their hands."

Powerhouse Performance, Vivid Visuals & Incredible Gaming Experiences
AMD Radeon RX 6000 Series graphics cards support high-bandwidth PCIe 4.0 technology and feature 16 GB of GDDR6 memory to power the most demanding 4K workloads today and in the future. Key features and capabilities include:

Powerhouse Performance
  • AMD Infinity Cache - A high-performance, last-level data cache suitable for 4K and 1440p gaming with the highest level of detail enabled. 128 MB of on-die cache dramatically reduces latency and power consumption, delivering higher overall gaming performance than traditional architectural designs.
  • AMD Smart Access Memory - An exclusive feature of systems with AMD Ryzen 5000 Series processors, AMD B550 and X570 motherboards and Radeon RX 6000 Series graphics cards. It gives AMD Ryzen processors greater access to the high-speed GDDR6 graphics memory, accelerating CPU processing and providing up to a 13-percent performance increase on an AMD Radeon RX 6800 XT graphics card in Forza Horizon 4 at 4K when combined with the new Rage Mode one-click overclocking setting (a detection sketch follows this list).9,10
  • Built for Standard Chassis - With a length of 267 mm and two standard 8-pin power connectors, and designed to operate with existing enthusiast-class 650 W-750 W power supplies, gamers can easily upgrade their existing large to small form factor PCs without additional cost.
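Smart Access Memory is AMD's branding for PCIe Resizable BAR, which lets the CPU map the card's entire GDDR6 pool rather than the legacy 256 MB aperture. As a minimal, hedged illustration, a Vulkan application can spot a resizable-BAR-style configuration by looking for a large memory heap that is both device-local and host-visible; the function name and size threshold below are assumptions for the sketch, not anything AMD documents here.

```cpp
// Minimal sketch: with Resizable BAR (Smart Access Memory), VRAM shows up as a
// large heap whose memory types are both DEVICE_LOCAL and HOST_VISIBLE.
// Without it, that CPU-writable window into VRAM is typically capped at 256 MiB.
#include <vulkan/vulkan.h>
#include <cstdint>

bool LooksLikeResizableBar(VkPhysicalDevice gpu)  // hypothetical helper name
{
    VkPhysicalDeviceMemoryProperties props{};
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);

    const VkMemoryPropertyFlags wanted =
        VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT | VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;

    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        if ((props.memoryTypes[i].propertyFlags & wanted) != wanted)
            continue;
        const VkMemoryHeap& heap = props.memoryHeaps[props.memoryTypes[i].heapIndex];
        // Bigger than the legacy 256 MiB BAR window: assume full-VRAM mapping.
        if (heap.size > 256ull * 1024 * 1024)
            return true;
    }
    return false;
}
```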
True to Life, High-Fidelity Visuals
  • DirectX 12 Ultimate Support - Provides a powerful blend of raytracing, compute, and rasterized effects, such as DirectX Raytracing (DXR) and Variable Rate Shading, to elevate games to a new level of realism.
  • DirectX Raytracing (DXR) - Adding a high performance, fixed-function Ray Accelerator engine to each compute unit, AMD RDNA 2-based graphics cards are optimized to deliver real-time lighting, shadow and reflection realism with DXR. When paired with AMD FidelityFX, which enables hybrid rendering, developers can combine rasterized and ray-traced effects to ensure an optimal combination of image quality and performance.
  • AMD FidelityFX - An open-source toolkit for game developers available on AMD GPUOpen. It features a collection of lighting, shadow and reflection effects that make it easier for developers to add high-quality post-process effects that make games look beautiful while offering the optimal balance of visual fidelity and performance.
  • Variable Rate Shading (VRS) - Dynamically reduces the shading rate for different areas of a frame that do not require a high level of visual detail, delivering higher levels of overall performance with little to no perceptible change in image quality.
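To make the VRS mechanism concrete, here is a minimal D3D12 sketch: it queries the device for variable-rate shading support, then requests coarse 2x2 shading for subsequent draws. The helper name is invented for illustration; the types and calls are Microsoft's public D3D12 API, not anything specific to this announcement.

```cpp
// Minimal sketch: check hardware VRS support, then ask for coarser shading on
// a pass where full-rate detail won't be noticed.
#include <d3d12.h>

bool EnableCoarseShading(ID3D12Device* device,              // hypothetical helper
                         ID3D12GraphicsCommandList5* cmdList)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options{};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options, sizeof(options))))
        return false;
    if (options.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
        return false;

    // Shade one pixel per 2x2 block for subsequent draws; passing nullptr for
    // the combiners uses the base rate unmodified.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    return true;
}
```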
Elevated Gaming Experience
  • Microsoft DirectStorage Support - Future support for the DirectStorage API enables lightning-fast load times and high-quality textures by eliminating storage API-related bottlenecks and limiting CPU involvement (see the sketch after this list).
  • Radeon Software Performance Tuning Presets - Simple one-click presets in Radeon Software help gamers easily extract the most from their graphics card. The presets include the new Rage Mode stable overclocking setting that takes advantage of extra available headroom to deliver higher gaming performance.
  • Radeon Anti-Lag - Significantly decreases input-to-display response times and offers a competitive edge in gameplay.
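Since the announcement only promises future DirectStorage support, any code here is necessarily speculative; the sketch below is based on the DirectStorage for Windows SDK that Microsoft later shipped (dstorage.h), and the helper name is invented for illustration. It queues a read from a file directly into a GPU buffer, which is the CPU-bypassing path the bullet above describes.

```cpp
// Hedged sketch, assuming the DirectStorage for Windows SDK (dstorage.h /
// dstorage.lib) that shipped after this announcement.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Queue one read from a file straight into a GPU buffer, bypassing the
// traditional ReadFile -> upload heap -> copy path.
HRESULT LoadAssetDirect(ID3D12Device* device, const wchar_t* path,   // hypothetical helper
                        ID3D12Resource* gpuBuffer, UINT32 sizeBytes)
{
    ComPtr<IDStorageFactory> factory;
    HRESULT hr = DStorageGetFactory(IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    ComPtr<IDStorageFile> file;
    hr = factory->OpenFile(path, IID_PPV_ARGS(&file));
    if (FAILED(hr)) return hr;

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    hr = factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));
    if (FAILED(hr)) return hr;

    DSTORAGE_REQUEST request{};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Offset      = 0;
    request.Source.File.Size        = sizeBytes;
    request.Destination.Buffer.Resource = gpuBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = sizeBytes;
    queue->EnqueueRequest(&request);
    queue->Submit();  // completion is normally tracked with an ID3D12Fence via EnqueueSignal
    return S_OK;
}
```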
AMD Radeon RX 6000 Series Product Family
Robust Gaming Ecosystem and Partnerships
In the coming weeks, AMD will release a series of videos from its ISV partners showcasing the incredible gaming experiences enabled by AMD Radeon RX 6000 Series graphics cards in some of this year's most anticipated games. These videos can be viewed on the AMD website.
  • DIRT 5 - October 29
  • Godfall - November 2
  • World of Warcraft: Shadowlands - November 10
  • The Riftbreaker - November 12
  • Far Cry 6 - November 17
Pricing and Availability
  • AMD Radeon RX 6800 and Radeon RX 6800 XT graphics cards are expected to be available from global etailers/retailers and on AMD.com beginning November 18, 2020, for $579 USD SEP and $649 USD SEP, respectively. The AMD Radeon RX 6900 XT is expected to be available December 8, 2020, for $999 USD SEP.
  • AMD Radeon RX 6800 and RX 6800 XT graphics cards are also expected to be available from AMD board partners, including ASRock, ASUS, Gigabyte, MSI, PowerColor, SAPPHIRE and XFX, beginning in November 2020.
The complete AMD slide deck follows.

394 Comments on AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

#276
Arc1t3ct
Cool! I can't wait for W1zzard's review where he demonstrates how inferior these cards are compared to NVIDIA.
Posted on Reply
#277
Valantar
medi01I happen to dislike paid shills.


You mean, when was it that they got caught?
Easy.
The Doom's demo with Ampere super-exclusive-totallynotforshilling-preview.
... so getting exclusive access to hardware for a limited preview makes them shills? They were very explicit about the limitations placed on them for that coverage and that this in no way amounted to a full review. Heck, it was repeated several times. Getting an exclusive preview of a new technology is a great scoop for any tech journalist, and they would all do it if they were asked - possibly with the exception of GN. This does tell us that they don't have extremely strict editorial standards, but it in no way qualifies them as paid shills. That you go that far in your judgement just makes it clear that you're seeing black and white where there are tons of shades of gray. That's your problem, not DF's.
medi01Stone that is at the same focal distance as the bush is sharper on below pic.
Like I said: the below pic has sharper grass (and the stone that's sitting in the grass), the top pic has more detailed leaves. I have no idea which of the two is DLSS, so I'm not arguing for or against either, just that your example is rather poor. You could say the bottom pic has slightly better sharpness overall, but it's essentially impossible to judge as it's clear that the focal points of the two shots are different. As for the stone being "at the same focal distance" - what? Do you mean the same distance from the camera? That's not the same thing ...
Posted on Reply
#278
ratirt
FranciscoCLXBoX One X works with HDMI VRR just like Freesync, even with LFC.


I guess it will be the same with the next consoles and videocards based on the same RDNA2 GPU.
It works, but it is not the same. FreeSync offers more, like Low Framerate Compensation (FreeSync Premium). VRR doesn't have that.
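For reference, LFC is essentially integer frame repetition: when the frame rate drops below the panel's minimum VRR refresh, each frame is presented multiple times so the effective refresh rate lands back inside the supported window. A toy C++ sketch of the idea (the function and numbers are illustrative, not from any spec):

```cpp
// Rough sketch of the idea behind Low Framerate Compensation: repeat each frame
// k times so the effective refresh (fps * k) falls inside the panel's VRR range.
#include <cstdio>

int LfcMultiplier(double fps, double vrrMin, double vrrMax)  // illustrative helper
{
    if (fps >= vrrMin) return 1;              // already inside the VRR window
    for (int k = 2; k <= 10; ++k)             // show each frame k times
        if (fps * k >= vrrMin && fps * k <= vrrMax)
            return k;
    return 0;                                 // can't compensate (fps too low)
}

int main()
{
    // 30 fps on a 48-120 Hz panel: show every frame twice -> 60 Hz refresh.
    std::printf("multiplier = %d\n", LfcMultiplier(30.0, 48.0, 120.0));
}
```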
Posted on Reply
#279
Valantar
Vya DomusIf the cropped-out area looks worse, isn't it a logical conclusion that the full image looks worse as well? The problem is even a tiny bit of blur gets amplified by post processing and the fact that most displays have lower effective resolution when the image is moving.
Not necessarily: most game cameras have some form of distortion towards the edges of the image (they are made to emulate real lenses to some degree, and the frame is a 2D projection of a more or less spherical captured image, so some distortion is unavoidable), meaning the centre of the image is likely to be a tad sharper. Your point about displays having lower effective resolution also goes against this mattering - if your display can't show the difference, it won't matter that the source image is sharper. It's obvious that post processing exacerbates blur, but again the question becomes whether or not it's noticeable.
Posted on Reply
#280
Razbojnik
ZoneDymoperformance is good, price not so much, sigh.

like take the RX 6800, it beats the 2080 Ti so it will be a bit faster than an RTX 3070, but it also costs 580 dollars vs 500 dollars for the RTX 3070, so not really a clear winner in the "What to buy" discussion.
*And it has 16 GB of VRAM compared to the 3070's 8 ... I see nothing but a clear winner here.
Posted on Reply
#281
Valantar
ratirtIt works but it is not the same. FreeSync offers more like Low Framerate Compensation (FreeSync premium). VRR doesn't have that.
... that video is explicitly about how, on the LG CX, you don't need FreeSync Premium activated for LFC to work. Did you even read the title, let alone watch it?
Posted on Reply
#282
Razbojnik
Unless Nvidia comes out with a Super edition that beats AMD both in RAM capacity and performance, I don't see it winning this one. 300 watts for the same or better performance than Ampere... paired with a Ryzen 5, these will be overclocking beasts. It only goes to show how much Ampere is underperforming, and how much it was actually a calculated cut from what it should have been.
Posted on Reply
#283
TumbleGeorge
LoL, I'm in trouble! When AMD released the Radeon VII, it was the only model with 16 GB, and that was surely just marketing. But when they offer entire mid and high-end classes with 12 GB and 16 GB of VRAM, that is not just marketing!
Posted on Reply
#284
ratirt
Valantar... that video is explicitly about how, on the LG CX, you don't need FreeSync Premium activated for LFC to work. Did you even read the title, let alone watch it?
Apparently I need to catch up since I've read different news saying it can't do it.
Posted on Reply
#285
BoboOOZ
ValantarWhat is with you and hating on Digital Foundry? Can you explain how they in any way whatsoever are "shills" for anyone? They do in-depth technical reviews with a much higher focus on image quality and rendering quality than pretty much anyone else. Are they enthusiastic about new features that have so far been Nvidia exclusive? Absolutely, but that proves nothing more than that they find these features interesting.
I'll bite for discussion's sake and because their paid review after the Nvidia launch has really upset me.

They are smart people who understand very well how what they are saying might influence the buying decisions of their audience.
Look at this video:
Instead of having a disclaimer that the exact settings for the comparisons to run were given by Nvidia, and that by using any other combination of settings (resolution, RTX/DLSS, detail) results may be quite different, he's simply hyping up the card most of the time, talking about feelings and stuff. That's nowhere near the moral standards of Hardware Unboxed, Gamers Nexus or TPU.
Hate is a very strong word, but I dislike the fact that they, like many other review sites and YouTubers, are becoming influencers instead of reviewers.
Posted on Reply
#286
medi01
ValantarPut simply: if you have to do that, you also need a very good screen
No.

You can "feel" things before you can easily point them out on screen, for starters.

In the Ars Technica "quickly moving mouse" example, the entire screen is blurred; no "zoom in" is required. It just helps to focus attention on a particular part of the screen.
Valantar... so getting exclusive access to hardware for a limited preview makes them shills?
That made them suspect.
The actual misleading video confirmed that they are indeed shills.
ValantarLike I said: the below pic has sharper grass
The below picture has a blurred-out mess, of the kind you can also see in the other examples.
Posted on Reply
#287
Valantar
BoboOOZI'll bite for discussion's sake and because their paid review after the Nvidia launch has really upset me.

They are smart people who understand very well how what they are saying might influence the buying decisions of their audience.
Look at this video:
Instead of having a disclaimer that the exact settings for the comparisons to run were given by Nvidia, and that by using any other combination of settings (resolution, RTX/DLSS, detail) results may be quite different, he's simply hyping up the card most of the time, talking about feelings and stuff. That's nowhere near the moral standards of Hardware Unboxed, Gamers Nexus or TPU.
Hate is a very strong word, but I dislike the fact that they, like many other review sites and YouTubers, are becoming influencers instead of reviewers.
I agree that that video was a bit dubious, but no disclaimer? Hm. I think I read their piece rather than watched the video, but their written pieces are typically identical to their on-screen scripts. The piece contains this (second) paragraph:
Full disclosure: I can bring you the results of key tests today, but there are caveats in place. Nvidia has selected the games covered, for starters, and specified 4K resolution to remove the CPU completely from the test results and in all cases, settings were maxed as much as they could be. The games in question are Doom Eternal, Control, Shadow of the Tomb Raider, Battlefield 5, Borderlands 3 and Quake 2 RTX. Secondly, frame-time and frame-rate metrics are reserved for the reviews cycle, meaning our tests were limited to comparisons with RTX 2080 (its last-gen equivalent in both naming and price) and differences had to be expressed in percentage terms, meaning some slight re-engineering of our performance visualisation tools. This work actually proved valuable and the new visualisations will be used elsewhere in our reviews - differences in GPU power do tend to be expressed as percentages, after all.
Posted on Reply
#288
Max(IT)
ValantarUh ... only if you actually care about that kind of stuff. Which the average gamer does not whatsoever. This is quite limited enthusiast knowledge. Most gamers' level of knowledge about technical features is more or less on the level of "does the game run on my hardware, y/n?".
you are heavily underestimating young people here (and to be clear, I'm 48 years old). We are speaking about computer gamers here, and many of them know about their hardware.
By the way, my entire point was that speaking about "14-year-old players" is silly...
No cherry picking here. The only reason not to trust AMD's data here is that they themselves did the testing. The games used, settings used and hardware setups used are all publicized in the slide deck, if you bothered to read. There's no indication that the games they picked are disproportionately favoring AMD GPUs - shown at least partly by the relatively broad span of results in their testing, including games where they lose. Could you imagine Nvidia showing a performance graph from a game where they weren't the fastest? Yeah, there's a difference of attitude here, and AMD marketing for the past couple of years (since Raja Koduri left RTG and Dr. Su took over control there, at least) has been impressively trustworthy for first-party benchmarks.
I didn't even write about not trusting them :rolleyes:
The level of aggressiveness in this thread by AMD supporters is staggering.
I just said we need the independent reviews to actually understand the real performance of the hardware, because the data shown ARE cherry-picked (the whole situation is cherry-picked, being a marketing presentation) and far from complete.
I didn't say they were lying. They are just showing one part of the story, and it is perfectly understandable. To know how much better the 6800 is than the 3070 on an average system (one without a Zen 3), we need the reviews.
As for Fortnite being full of 14-year-olds: that's a given. No, the majority aren't 14 - that would be really weird. But there are tons of kids playing it, and likely the majority of players are quite young. Add to that the broad popularity of the game, and it's safe to assume that the average Fortnite player has absolutely no idea what DLSS is. Heck, there are likely more people playing Fortnite on consoles and phones than people in the whole world who know what DLSS is.
being full of 14 yo doesn't mean the majority are 14 yo as claimed above.
That was my point.
Clearly we are speaking just about PC players here, because people playing Fortnite on a PlayStation, a Nintendo or a smartphone could be totally unaware of PC technologies.
Posted on Reply
#289
BoboOOZ
ValantarI agree that that video was a bit dubious, but no disclaimer? Hm. I think I read their piece rathertthan watched the video, but their written pieces are typically identical to their on-screen scripts. That contains this (second) paragraph:
Ahh, yeah, watch the first 5 minutes of the video ;) - the tone and wording are quite different imho.
Posted on Reply
#290
Max(IT)
medi01Videos, with known paid shills like DF hyping it.
And then there are eyes to see and brains to use, god forbid.
And even some articles to be found, with, god forbid, actual pics:

arstechnica.com/gaming/2020/07/why-this-months-pc-port-of-death-stranding-is-the-definitive-version/


At the end of the day, it is a cherry picking game:



DLSS 2.0 (these are shots from a picture that hypes DLSS, by the way):




But it demonstrates how delusional DLSS hypers are. Like any other image upscaling tech, it has its ups and downs.
With 2.0 it is largely the same as with TAA, which it is based on: it gets blurry, it wipes out small stuff, and it struggles with quickly moving stuff.
According to your specs list, you DON'T have any hardware that supports DLSS.
I do.
I have seen it with my own eyes in several titles. DLSS + RT are something I like to have on my GPU, not the blurry mess you are trying to demonstrate.
Posted on Reply
#291
BoboOOZ
Max(IT)being full of 14 yo doesn't mean the majority are 14 yo as claimed above.
That was my point.
Clearly we are speaking just about PC players here, because people playing Fortnite on a PlayStation, a Nintendo or a smartphone could be totally unaware of PC technologies.
I'm pretty sure most people playing FN are kids.

I play with my kid and his friends all the time, and I can see by the nicknames that most of our adversaries are very young, too. Statistics are skewed, because my kid and many of his friends play on their parents' account.

Anyway, the discussion about RTX in FN is pointless, the game is very playable on old hardware and anyone playing it remotely competitively puts shadows and post effects on "LOW", otherwise they cannot aim accurately while in the thick of the battle.
Posted on Reply
#292
SLK
BoboOOZI'm pretty sure most people playing FN are kids.

I play with my kid and his friends all the time, and I can see by the nicknames that most of our adversaries are very young, too. Statistics are skewed, because my kid and many of his friends play on their parents' account.

Anyway, the discussion about RTX in FN is pointless, the game is very playable on old hardware and anyone playing it remotely competitively puts shadows and post effects on "LOW", otherwise they cannot aim accurately while in the thick of the battle.
Yeah, FN has become my bonding time with my kids as well due to Covid. My son plays on a 1050 Ti, and when he comes over to my PC (RTX 2080), he asks why his graphics look so much worse. I just told him recently that he can have my PC soon, as I am getting a 3080.
Posted on Reply
#293
Valantar
Max(IT)you are heavily underestimating young people here (and to be clear, I'm 48 years old). We are speaking about computer gamers here, and many of them know about their hardware.
By the way, my entire point was that speaking about "14-year-old players" is silly...
Sorry, but you are very much overestimating the knowledge of the average PC gamer. As people who frequent this and other forums will no doubt agree, the average non-hardware-enthusiast neither knows nor cares about features like this, and if they know anything, it is typically a poorly informed opinion mostly based on marketing and/or a reddit-based game of telephone where what comes out the other end is rather inaccurate.
Max(IT)I didn't even write about not trusting them :rolleyes:
You said they were cherry-picked benchmarks. That means the benchmarks were picked to make them look good, not to be accurate. That makes them unreliable, which means you can't trust them. So yes, you did write about not trusting them.
Max(IT)The level of aggressiveness in this thread by AMD supporters is staggering.
I just said we need the independent reviews to actually understand the real performance of the hardware, because the data shown ARE cherry-picked (the whole situation is cherry-picked, being a marketing presentation) and far from complete.
I didn't say they were lying. They are just showing one part of the story, and it is perfectly understandable. To know how much better the 6800 is than the 3070 on an average system (one without a Zen 3), we need the reviews.
I don't mean to come off as aggressive, so sorry about that. But IMO you're using "cherry picked" wrong. It means to pick what is/looks best or most desirable, so that implies that they are leaving out (potentially a lot of) worse-looking results. Going by recent history from AMD product launches (Zen+, Zen 2, RDNA 1, Renoir), their data has been relatively reliable and in line with reviews. Their numbers also include games where they tie or lose to Nvidia, which while obviously not any kind of proof that these aren't best-case numbers, is a strong indication that they're not just picking out results that make them look good. As I said, there is one reason to not trust these numbers: the fact that they weren't produced by a reliable third party. Beyond that, recent history, the selection of games (broad, including titles where they both win and lose), the relatively detailed test setup notes, and the use of standard settings levels rather than weirdly tuned "AMD optimal" settings (see the Vega launch) are all reasons why one could reasonably expect these numbers to be more or less accurate. Of course the use of games without built-in benchmarks means that numbers aren't directly comparable to sites using the same games but different test scenarios, but that doesn't make the numbers unreliable, just not comparable. I am obviously still not for pre-ordering or even taking this at face value, but your outright dismissal is too harsh. I would be very surprised if these numbers (non-SAM, non Rage mode) were more than 5% off any third-party benchmarks.
Max(IT)being full of 14 yo doesn't mean the majority are 14 yo as claimed above.
That was my point.
Clearly we are speaking just about PC players here, because people playing Fortnite on a PlayStation, a Nintendo or a smartphone could be totally unaware of PC technologies.
Saying the majority are 14 was obviously an intentional exaggeration, and taking it that literally is a bit too much for me. Besides that, even the average PC gamer knows very little about hardware or software features. Remember, the average PC gamer plays games on a laptop. The biggest group after that uses pre-built desktops. Custom, self-built or built-to-order desktops are a distant third. And even among that group, I would be surprised if the majority knew anything detailed about what DLSS or any comparable feature is - most gamers spend more time playing games than reading about this kind of stuff.
Posted on Reply
#294
mechtech
I wonder when/if 30, 36, 40CU cards and other cards will be released??
Posted on Reply
#295
Valantar
mechtechI wonder when/if 30, 36, 40CU cards and other cards will be released??
I'm expecting at least a mention of further RDNA 2 GPUs at CES - IIRC there's a Lisa Su keynote on the books there. Anything earlier than that would be rather weird, given just how close these announced GPUs are launching to the holiday season ending.
Posted on Reply
#296
Max(IT)
BoboOOZI'm pretty sure most people playing FN are kids.

I play with my kid and his friends all the time, and I can see by the nicknames that most of our adversaries are very young, too. Statistics are skewed, because my kid and many of his friends play on their parents' account.

Anyway, the discussion about RTX in FN is pointless, the game is very playable on old hardware and anyone playing it remotely competitively puts shadows and post effects on "LOW", otherwise they cannot aim accurately while in the thick of the battle.
Oh RT on Fortnite is pointless for me too, since I'm not planning to play that game on my PC.
But it still is a popular game, so it is not pointless for others.
mechtechI wonder when/if 30, 36, 40CU cards and other cards will be released??
most probably early next year.
Posted on Reply
#298
AddSub
AMD... they have to get their drivers working properly. I had four Polaris cards (2 x 480, 2 x 580) and one major issue was always software problems. Game-breaking bugs, crashes, black screens, you name it; all of it went away by switching to the green team. Again, AMD gfx gear always looks "great!!!!" on paper, with a crapload of software issues in tow though. They have to fix their drivers, and considering how their Adrenalin suite looks and behaves these days, they are going the wrong way.


Posted on Reply
#299
mechtech
AddSubAMD... they have to get their drivers working properly. I had four Polaris cards (2 x 480, 2 x 580) and one major issue was always software problems. Game-breaking bugs, crashes, black screens, you name it; all of it went away by switching to the green team. Again, AMD gfx gear always looks "great!!!!" on paper, with a crapload of software issues in tow though. They have to fix their drivers, and considering how their Adrenalin suite looks and behaves these days, they are going the wrong way.


I have an RX 480 and never had any issues. Then again, I usually don't update the driver, and have left it for over a year. I kind of agree about the Adrenalin suite; I preferred the older version. Some of the older versions had the option to install only the driver and not the suite. Many times I did just that to avoid all the extras I never used or needed. They need to do that with Adrenalin: have a custom install with a good selection to pick from, such as driver-only. The only issue is the occasional Wattman crash, which doesn't really seem to do anything. I also found Windows 1709 and 1809 to be pretty good stability-wise, but they are EOL now.
Posted on Reply
#300
Valantar
mechtechI have an RX 480 and never had any issues. Then again, I usually don't update the driver, and have left it for over a year. I kind of agree about the Adrenalin suite; I preferred the older version. Some of the older versions had the option to install only the driver and not the suite. Many times I did just that to avoid all the extras I never used or needed. They need to do that with Adrenalin: have a custom install with a good selection to pick from, such as driver-only. The only issue is the occasional Wattman crash, which doesn't really seem to do anything. I also found Windows 1709 and 1809 to be pretty good stability-wise, but they are EOL now.
Driver issues seem to be extremely variable in who gets them. I never had any issues (that I didn't cause myself with aggressive undervolting or the like) with my Fury X and RX 570. But some people keep having issues across many different cards. As GN said in their video covering the launch: they have been able to recreate some, but it took some effort, so it's not like they're extremely common. Too common, yes, but not a deal-breaker unless you have one of those systems that just doesn't seem to like AMD's drivers.
Posted on Reply