Wednesday, October 28th 2020

AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

AMD (NASDAQ: AMD) today unveiled the AMD Radeon RX 6000 Series graphics cards, delivering powerhouse performance, incredibly life-like visuals, and must-have features that set a new standard for enthusiast-class PC gaming experiences. Representing the forefront of extreme engineering and design, the highly anticipated AMD Radeon RX 6000 Series includes the AMD Radeon RX 6800 and Radeon RX 6800 XT graphics cards, as well as the new flagship Radeon RX 6900 XT - the fastest AMD gaming graphics card ever developed.

AMD Radeon RX 6000 Series graphics cards are built upon groundbreaking AMD RDNA 2 gaming architecture, a new foundation for next-generation consoles, PCs, laptops and mobile devices, designed to deliver the optimal combination of performance and power efficiency. AMD RDNA 2 gaming architecture provides up to 2X higher performance in select titles with the AMD Radeon RX 6900 XT graphics card compared to the AMD Radeon RX 5700 XT graphics card built on AMD RDNA architecture, and up to 54 percent more performance-per-watt when comparing the AMD Radeon RX 6800 XT graphics card to the AMD Radeon RX 5700 XT graphics card using the same 7 nm process technology.
AMD RDNA 2 offers a number of innovations, including applying advanced power saving techniques to high-performance compute units to improve energy efficiency by up to 30 percent per cycle per compute unit, and leveraging high-speed design methodologies to provide up to a 30 percent frequency boost at the same power level. It also includes new AMD Infinity Cache technology that offers up to 2.4X greater bandwidth-per-watt compared to GDDR6-only AMD RDNA-based architectural designs.

"Today's announcement is the culmination of years of R&D focused on bringing the best of AMD Radeon graphics to the enthusiast and ultra-enthusiast gaming markets, and represents a major evolution in PC gaming," said Scott Herkelman, corporate vice president and general manager, Graphics Business Unit at AMD. "The new AMD Radeon RX 6800, RX 6800 XT and RX 6900 XT graphics cards deliver world class 4K and 1440p performance in major AAA titles, new levels of immersion with breathtaking life-like visuals, and must-have features that provide the ultimate gaming experiences. I can't wait for gamers to get these incredible new graphics cards in their hands."

Powerhouse Performance, Vivid Visuals & Incredible Gaming Experiences
AMD Radeon RX 6000 Series graphics cards support high-bandwidth PCIe 4.0 technology and feature 16 GB of GDDR6 memory to power the most demanding 4K workloads today and in the future. Key features and capabilities include:

Powerhouse Performance
  • AMD Infinity Cache - A high-performance, last-level data cache suitable for 4K and 1440p gaming with the highest level of detail enabled. 128 MB of on-die cache dramatically reduces latency and power consumption, delivering higher overall gaming performance than traditional architectural designs.
  • AMD Smart Access Memory - An exclusive feature of systems with AMD Ryzen 5000 Series processors, AMD B550 and X570 motherboards and Radeon RX 6000 Series graphics cards. It gives AMD Ryzen processors greater access to the high-speed GDDR6 graphics memory, accelerating CPU processing and providing up to a 13-percent performance increase on an AMD Radeon RX 6800 XT graphics card in Forza Horizon 4 at 4K when combined with the new Rage Mode one-click overclocking setting (see the detection sketch after this list).
  • Built for Standard Chassis - With a length of 267 mm and two standard 8-pin power connectors, and designed to operate with existing enthusiast-class 650 W-750 W power supplies, gamers can easily upgrade their existing large to small form factor PCs without additional cost.
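For readers curious about the mechanism behind Smart Access Memory: it builds on the PCIe resizable BAR capability, which lets the CPU map the card's entire GDDR6 pool instead of the traditional 256 MB window. Below is a minimal, hypothetical sketch (not AMD's implementation) of how an application could check for this through the Vulkan API by looking for a large device-local heap that is also host-visible; the function name and the 512 MB threshold are illustrative assumptions.

```cpp
// Minimal sketch, assuming a Vulkan-capable system: check whether the GPU
// exposes a large DEVICE_LOCAL + HOST_VISIBLE heap, i.e. whether the full
// VRAM pool (rather than the classic 256 MB BAR window) is CPU-visible.
// The function name and 512 MB threshold are illustrative, not AMD's logic.
#include <vulkan/vulkan.h>
#include <cstdint>
#include <cstdio>

bool HasLargeCpuVisibleVram(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMemoryProperties props{};
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);

    const VkMemoryPropertyFlags wanted =
        VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT | VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;

    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        if ((props.memoryTypes[i].propertyFlags & wanted) != wanted)
            continue;
        const VkMemoryHeap& heap = props.memoryHeaps[props.memoryTypes[i].heapIndex];
        if (heap.size > 512ull * 1024 * 1024) {  // well beyond the 256 MB window
            std::printf("CPU-visible VRAM heap: %llu MB\n",
                        static_cast<unsigned long long>(heap.size / (1024 * 1024)));
            return true;
        }
    }
    return false;
}
```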
True to Life, High-Fidelity Visuals
  • DirectX 12 Ultimate Support - Provides a powerful blend of raytracing, compute, and rasterized effects, such as DirectX Raytracing (DXR) and Variable Rate Shading, to elevate games to a new level of realism.
  • DirectX Raytracing (DXR) - Adding a high performance, fixed-function Ray Accelerator engine to each compute unit, AMD RDNA 2-based graphics cards are optimized to deliver real-time lighting, shadow and reflection realism with DXR. When paired with AMD FidelityFX, which enables hybrid rendering, developers can combine rasterized and ray-traced effects to ensure an optimal combination of image quality and performance.
  • AMD FidelityFX - An open-source toolkit for game developers available on AMD GPUOpen. It features a collection of lighting, shadow and reflection effects that make it easier for developers to add high-quality post-process effects that make games look beautiful while offering the optimal balance of visual fidelity and performance.
  • Variable Rate Shading (VRS) - Dynamically reduces the shading rate for different areas of a frame that do not require a high level of visual detail, delivering higher levels of overall performance with little to no perceptible change in image quality (see the API sketch after this list).
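To make the Variable Rate Shading description above a bit more concrete, here is a minimal, hypothetical DirectX 12 sketch showing how a renderer might query VRS support and request a coarser per-draw shading rate for low-detail work. The helper function names are invented for illustration; only the D3D12 types, enums and calls are actual API.

```cpp
// Minimal sketch of per-draw Variable Rate Shading with DirectX 12.
// Helper names are hypothetical; the D3D12 types and calls are the real API.
#include <windows.h>
#include <d3d12.h>

// True if the device supports at least VRS Tier 1 (per-draw shading rates).
bool SupportsVrs(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return false;
    return options6.VariableShadingRateTier != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
}

// Shade low-detail draws (e.g. distant terrain) at one sample per 2x2 block,
// and restore full 1x1 shading for draws where detail matters.
void SetCoarseShading(ID3D12GraphicsCommandList5* cmdList, bool coarse)
{
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // keep per-primitive rate as-is
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH   // keep screen-space image rate as-is
    };
    cmdList->RSSetShadingRate(coarse ? D3D12_SHADING_RATE_2X2 : D3D12_SHADING_RATE_1X1,
                              combiners);
}
```

A game might call SetCoarseShading(cmdList, true) before drawing distant or fast-moving geometry and switch back to 1x1 for the player's weapon or UI, which is the kind of selective shading-rate reduction the bullet above describes.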
Elevated Gaming Experience
  • Microsoft DirectStorage Support - Future support for the DirectStorage API enables lightning-fast load times and high-quality textures by eliminating storage API-related bottlenecks and limiting CPU involvement.
  • Radeon Software Performance Tuning Presets - Simple one-click presets in Radeon Software help gamers easily extract the most from their graphics card. The presets include the new Rage Mode stable overclocking setting that takes advantage of extra available headroom to deliver higher gaming performance.
  • Radeon Anti-Lag - Significantly decreases input-to-display response times and offers a competitive edge in gameplay.
AMD Radeon RX 6000 Series Product Family
Robust Gaming Ecosystem and Partnerships
In the coming weeks, AMD will release a series of videos from its ISV partners showcasing the incredible gaming experiences enabled by AMD Radeon RX 6000 Series graphics cards in some of this year's most anticipated games. These videos can be viewed on the AMD website.
  • DIRT 5 - October 29
  • Godfall - November 2
  • World of Warcraft: Shadowlands - November 10
  • The Riftbreaker - November 12
  • Far Cry 6 - November 17
Pricing and Availability
  • AMD Radeon RX 6800 and Radeon RX 6800 XT graphics cards are expected to be available from global etailers/retailers and on AMD.com beginning November 18, 2020, for $579 USD SEP and $649 USD SEP, respectively. The AMD Radeon RX 6900 XT is expected to be available December 8, 2020, for $999 USD SEP.
  • AMD Radeon RX 6800 and RX 6800 XT graphics cards are also expected to be available from AMD board partners, including ASRock, ASUS, Gigabyte, MSI, PowerColor, SAPPHIRE and XFX, beginning in November 2020.
The complete AMD slide deck follows.

394 Comments on AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

#251
mtcn77
medi01You should not either.
But if you are into tech and still haven't noticed that it is almost never that Jensen tells the truth and almost never that Lisa lies, there is something wrong.
There must be an impostor.
Posted on Reply
#252
ratirt
Max(IT)I never said they are cheating.
It is normal marketing and every company is doing that to create the hype and increase sales. It is their business.
I'm just saying that I will make a decision after the first review by Techspot and Guru3D.
You are mixing up hype with increased sales and marketing. The features are supposed to work, and they do in most cases. If they do support their own products, it is not hype, nor a marketing scheme.
I always wait for the reviews, but these presentations do give some level of information and comparison of how their products perform against the competition.
I can tell you now, you won't see much difference in performance between the 3070 and the 6800 in the reviews. If you do see a difference, I'm sure it will be in AMD's favor.
How it will turn out, we have yet to see. :)
Posted on Reply
#253
Vya Domus
Max(IT)still a lot of words and no evidence about "14 years old"...
There is, you are just too cringy and ignorant. By the way, still waiting on that data about your ridiculous claim that 20-25-year-old people must know about DLSS.
Posted on Reply
#254
Max(IT)
ratirtYou are mixing up hype with increased sales and marketing. The features are supposed to work, and they do in most cases. If they do support their own products, it is not hype, nor a marketing scheme.
I always wait for the reviews, but these presentations do give some level of information and comparison of how their products perform against the competition.
I can tell you now, you won't see much difference in performance between the 3070 and the 6800 in the reviews. If you do see a difference, I'm sure it will be in AMD's favor.
How it will turn out, we have yet to see. :)
The marketing business is about creating hype to increase sales.
Everything is correlated.

I will read the reviews, and if the 6800 is faster than the 3070 at a comparable price I will buy it when available. I am not going to replace my 3900X with a 5900X until later next year, so I need a review without using Smart Access Memory @ 1440p (my resolution for gaming).
Vya DomusThere is, you are just too cringy and ignorant. By the way still waiting on that data about your ridiculous claim that 20-25 year old people must know about DLSS.
you are completely turning the tables in order to hide your clearly baseless claim about Fortnite being played mostly by 14-year-old kids...

YOU started by saying that 14-year-old kids don't know or care what DLSS is (or what planet they are on), and I just pointed out that maybe Fortnite is also played by older guys who, according to your narrative, are more capable of understanding DLSS and RT.
Here to refresh your fading memory:
Your average Fortnite player is probably like 14 and hardly knows on what planet he is, hell he probably plays it from a phone or console, DLSS is the last thing he would think or care about.
I don't think knowledge is age-related. And I surely know 16-year-old kids smarter than you in this field.
Posted on Reply
#255
Vya Domus
Max(IT)I just pointed out that maybe Fortnite is also played by older
You pointed out nothing of any importance or relevance; the one study that included all age brackets shows there are many kids playing the game, which means there are also a lot of 14-year-olds playing the game.

Saying that 20-25-year-old people absolutely know about DLSS is by far the most ridiculous and baseless thing said on here, and it doesn't follow my narrative at all; I simply said the kind of people who care about this come from a very small niche. Your original claim that most people who play Fortnite would choose a GPU with DLSS is still not only ridiculous but not backed by anything at all. Again, you wanted to play this dumb game of showing definite proof, and so far all your ideas have been debunked.
Max(IT)And I surely know 16-year-old kids smarter than you in this field.
Criiiiiiiiiiiiiiiiiinge
Posted on Reply
#256
medi01
What do we know about DLSS by the way?

Posted on Reply
#257
ratirt
Max(IT)The marketing business is about creating hype to increase sales.
Everything is correlated.

I will read the reviews, and if the 6800 is faster than the 3070 at a comparable price I will buy it when available. I am not going to replace my 3900X with a 5900X until later next year, so I need a review without using Smart Access Memory @ 1440p (my resolution for gaming).
I don't think so. The marketing business is about presenting products to customers and potential clients so that they know what the products offer, what they are all about, and whether you need them.
Hype, on the other hand, is the presentation of a product without the full scope of what it offers, blown out of proportion; for instance, exaggeration of what the product can do, like its performance.
At least that's how I take it.
Posted on Reply
#258
Max(IT)
Vya DomusYou pointed out nothing of any importance or relevance; the one study that included all age brackets shows there are many kids playing the game, which means there are also a lot of 14-year-olds playing the game.

Saying that 20-25-year-old people absolutely know about DLSS is by far the most ridiculous and baseless thing said on here, and it doesn't follow my narrative at all; I simply said the kind of people who care about this come from a very small niche. Your original claim that most people who play Fortnite would choose a GPU with DLSS is still not only ridiculous but not backed by anything at all. Again, you wanted to play this dumb game of showing definite proof, and so far all your ideas have been debunked.



Criiiiiiiiiiiiiiiiiinge
not going to lose any more time with you.
English may not be my first language, but I wrote this:
At 20-25 years old you absolutely know what DLSS is and if it fits your needs or not.
That is quite different from "any 20-25-year-old knows about DLSS".
So maybe you should review your language skills, not just statistics...

Let me know when you find data about 14-year-old Fortnite players. Until then, bye bye...
Posted on Reply
#259
Vya Domus
medi01What do we know about DLSS by the way?

crisp 
Posted on Reply
#260
Max(IT)
medi01What do we know about DLSS by the way?

we are going OT here, but you cannot take a single frame screenshot to demonstrate how DLSS works.
Online there are tons of videos about DLSS 2.0 quality.
Posted on Reply
#261
mtcn77
Vya Domuscrisp 
Roast, or crisp?
Posted on Reply
#262
Vya Domus
Max(IT)not going to lose any more time with you.
Good choice, because you had nothing intelligent to say. You are still in denial about there being no data, as if ages 10-24 somehow must not include 14, right? "Learn statistics" my ass, you are illiterate in many domains, not just statistics.
Posted on Reply
#263
Nater
Skimmed the thread. Two things stand out:

1. Where are all the people congratulating AMD on 16GB of vram vs 10GB? It seems they're shitting on the AMD 6000 series and still leaning towards the RTX 3000 series. So RTX features > the extra 6GB of vram (which was a dealbreaker before)?

2. My 11-year-old asked for a new RTX card (he's on a GTX 1070 @ 1080p) when the Fortnite preview dropped on YouTube. Kids aren't as ignorant as you think, and marketing works.

He gon' be pissed when he gets my 5700XT hand-me-down. :D
Posted on Reply
#264
Valantar
R0H1TLet's not kid ourselves, AMD could well be promising the jump wrt the worst performing RDNA card out there as opposed to say the most efficient one ~ which btw is inside a Mac not PC:nutkick:
So you didn't read the end notes in the slides then? Efficiency numbers are 5700 XT vs. 6800 XT. The 5700 XT is the least efficient RDNA 1 GPU, true, but the 6800 XT is by no means a best-case scenario - that would be the heavily binned, limited availability 6900 XT at the same power (which they incidentally listed as having a 65% improvement over the 5700 XT). So no, this doesn't seem like a "best v. worst" comparison.
Max(IT)At 20-25 years old you absolutely know what DLSS is and if it fits your needs or not.
Uh ... only if you actually care about that kind of stuff. Which the average gamer does not whatsoever. This is quite limited enthusiast knowledge. Most gamers' level of knowledge about technical features is more or less on the level of "does the game run on my hardware, y/n?".
Max(IT)The point is AMD cherrypicked some benchmarks (that's absolutely normal in a marketing presentation) and you cannot say "it is 18% faster" without a proper review.
No cherry picking here. The only reason not to trust AMD's data here is that they themselves did the testing. The games used, settings used and hardware setups used are all publicized in the slide deck, if you bothered to read. There's no indication that the games they picked are disproportionately favoring AMD GPUs - shown at least partly by the relatively broad span of results in their testing, including games where they lose. Could you imagine Nvidia showing a performance graph from a game where they weren't the fastest? Yeah, there's a difference of attitude here, and AMD marketing for the past couple of years (since Raja Koduri left RTG and Dr. Su took over control there, at least) has been impressively trustworthy for first-party benchmarks.
Max(IT)That is quite different from "any 20-25-year-old knows about DLSS".
Actually, it isn't. You effectively said "at age X, you absolutely know what ABC is and if it fits your needs or not". So, that's two statements in one: that at age X you know what ABC is, and that at age X you have the breadth of knowledge and judgement skills required to know if it fits your needs or not. So whether you meant to or not, you did in effect say that "by the age of 25, everyone knows what DLSS is".

It's obvious there are young tech enthusiasts who know what DLSS is, though at age 14 I sincerely doubt there are more than a handful of tech enthusiasts worldwide with the reasoning and temperament to not just be outright screaming fanboys and fangirls - that's not their fault, they just lack the biological and experiential basis for acting like reasonable adults. But still, inferring that "above a certain age anyone knows about [obscure technical term X]" is way, way out there.

As for Fortnite being full of 14-year-olds: that's a given. No, the majority aren't 14 - that would be really weird. But there are tons of kids playing it, and likely the majority of players are quite young. Add to that the broad popularity of the game, and it's safe to assume that the average Fortnite player has absolutely no idea what DLSS is. Heck, there are likely more people playing Fortnite on consoles and phones than people in the whole world who know what DLSS is.
Posted on Reply
#265
basco
We all love each other at TPU, don't we.

I find it interesting that AMD says 650 W for the 6800, 750 W for the 6800 XT, and 850 W for the 6900 XT, while all have 2x 8-pin connectors and the latter two have the same TBP.
Posted on Reply
#266
Vya Domus
NaterWhere are all the people congratulating AMD on 16GB of vram vs 10GB?
Congratulating for what? Large VRAM capacities should have been the norm anyway.
Posted on Reply
#267
FranciscoCL
ratirtActually. The VRR is supported by these screens but it is not FreeSync equivalent. Maybe it will work but it may not necessarily work as a FreeSync monitor or a TV would work.
The Xbox One X works with HDMI VRR just like FreeSync, even with LFC.


I guess it will be the same with the next consoles and video cards based on the same RDNA 2 architecture.
Posted on Reply
#268
Valantar
Nater2. My 11-year-old asked for a new RTX card (he's on a GTX 1070 @ 1080p) when the Fortnite preview dropped on YouTube. Kids aren't as ignorant as you think, and marketing works.

He gon' be pissed when he gets my 5700XT hand-me-down. :D
... if your 11-year-old gets pissed for getting a 5700 XT as a hand-me-down, I'm sorry to say you have a seriously tech-spoiled kid. Heck, I don't even have a GPU close to the 1070 :P
Posted on Reply
#269
medi01
Max(IT)Online there are
Videos, with known paid shills like DF hyping it.
And then there are eyes to see and brains to use, god forbid.
And even some articles to be found, with, god forbid, actual pics:

arstechnica.com/gaming/2020/07/why-this-months-pc-port-of-death-stranding-is-the-definitive-version/


At the end of the day, it is a cherry picking game:



DLSS 2.0 (these are shots from a picture that hypes DLSS, by the way):




But it demonstrates how delusional DLSS hypers are. Like any other image upscaling tech, it has its ups and downs.
With 2.0 it is largely the same as with TAA, which it is based on: it gets blurry, it wipes out small stuff, and it struggles with quickly moving stuff.
Posted on Reply
#270
Valantar
medi01Videos, with known paid shills like DF hyping it.
And then there are eyes to see and brains to use, god forbid.
And even some articles to be found, with, god forbid, actual pics:

arstechnica.com/gaming/2020/07/why-this-months-pc-port-of-death-stranding-is-the-definitive-version/


At the end of the day, it is a cherry picking game:



DLSS 2.0 (these are shots from a picture that hypes DLSS, by the way):




But it demonstrates how delusional DLSS hypers are. Like any other image upscaling tech, it has its ups and downs.
With 2.0 it is largely the same as with TAA, which it is based on: it gets blurry, it wipes out small stuff, and it struggles with quickly moving stuff.
What is with you and hating on Digital Foundry? Can you explain how they in any way whatsoever are "shills" for anyone? They do in-depth technical reviews with a much higher focus on image quality and rendering quality than pretty much anyone else. Are they enthusiastic about new features that have so far been Nvidia exclusive? Absolutely, but that proves nothing more than that they find these features interesting.

Btw, those comparison shots look to have different DoF/focal planes; the one on top has sharper leaves but more blurry grass, the one on the bottom has the opposite. Makes it very difficult to make a neutral comparison.
Posted on Reply
#271
InVasMani
Vya Domuscrisp 
Even outside of the reflections, at the full image size you can notice it a bit in some areas, but the reflections remind me of how heat and smoke from a high-intensity fire can kind of distort your vision, or of an oil painting; it's pretty ugly in that area. I can see that changing in the future as real-time ray tracing and tensor cores improve and DLSS gets refined more, but it needs work before that happens. The way DLSS works, lower-quality ray tracing settings are going to look ugly like that with DLSS because it just doesn't have enough pixel density to work from, relative to the calculations it can process quickly enough to be beneficial in the first place. I think those areas will improve a lot in the next generation or two of DLSS and tensor cores, though.
Posted on Reply
#272
medi01
Valantarhating
I happen to dislike paid shills.
ValantarCan you explain how they in any way whatsoever are "shills" for anyone?
You mean, when was it that they got caught?
Easy.
The Doom demo with the Ampere super-exclusive-totallynotforshilling preview.
ValantarBtw, those comparison shots look to have different DoF/focal planes; the one on top has sharper leaves but more blurry gras
The stone that is at the same focal distance as the bush is sharper in the pic below.
Seriously, do you really want to venture into "DLSS doesn't make things look worse"?
Posted on Reply
#273
Vya Domus
ValantarWhat is with you and hating on Digital Foundry? Can you explain how they in any way whatsoever are "shills" for anyone? They do in-depth technical reviews with a much higher focus on image quality and rendering quality than pretty much anyone else. Are they enthusiastic about new features that have so far been Nvidia exclusive? Absolutely, but that proves nothing more than that they find these features interesting.

Btw, those comparison shots look to have different DoF/focal planes; the one on top has sharper leaves but more blurry grass, the one on the bottom has the opposite. Makes it very difficult to make a neutral comparison.
I can't say I hate them or think that they are shills, but they have a history of praising every single DLSS and DXR implementation even though we know very well some have been less than stellar.
Posted on Reply
#274
Valantar
I keep being surprised by how people are comparing image quality in games - especially fast-paced ones - by cropping out tiny areas, often from the corners of the screen or similar places where the player is unlikely to focus for much of the time, and often upscaling them just to shout "look, this one is a bit less sharp!"

Put simply: if you have to do that, you also need a very good screen and to be actively looking for problems to spot them while playing. There is obviously a threshold below which these things do become visible, but for now, while I don't think DLSS is the second coming of Raptorjesus like some people do, it's a damn impressive technology nonetheless. The only major drawback is how complex its implementation is and thus how limited adoption is bound to be, as well as it being proprietary.

Heck, when I get a new monitor I'll be going for something like a 32" 2160p panel (need the sharpness for work), but I'll likely play most games at 1440p render resolution and let bog-standard GPU upscaling handle the rest. I'll likely not be able to tell much of a difference in-game, and the performance increase will be massive.
Posted on Reply
#275
Vya Domus
ValantarI keep being surprised by how people are comparing image quality in games - especially fast-paced ones - by cropping out tiny areas, often from the corners of the screen or similar places where the player is unlikely to focus for much of the time, and often upscaling them just to shout "look, this one is a bit less sharp!"
If the cropped-out area looks worse, isn't it a logical conclusion that the full image looks worse as well? The problem is that even a tiny bit of blur gets amplified by post-processing, and by the fact that most displays have lower effective resolution when the image is moving.
Posted on Reply