Wednesday, October 21st 2020

AMD Radeon RX 6000 Series Specs Leak: RX 6900 XT, RX 6800 XT, RX 6700 Series

AMD's Radeon RX 6000 series graphics cards, based on the RDNA2 graphics architecture, mark the company's first DirectX 12 Ultimate graphics cards, bringing features such as real-time raytracing. A VideoCardz report sheds light on their specifications. The 7 nm "Navi 21" and "Navi 22" chips will power the top end of the lineup. The flagship part is the Radeon RX 6900 XT, followed by the RX 6800 XT and RX 6800, all based on "Navi 21." These are followed by the RX 6700 XT and RX 6700, which are based on the "Navi 22" silicon.

The "Navi 21" silicon physically features 80 RDNA2 compute units, working out to 5,120 stream processors. The RX 6900 XT maxes the chip out, enabling all 80 CUs, and is internally referred to as the "Navi 21 XTX." Besides these, the RX 6900 XT features 16 GB of GDDR6 memory across a 256-bit wide memory interface, and engine clocks boosting beyond 2.30 GHz. The next SKU in AMD's product stack is the RX 6800 XT (Navi 21 XT), featuring 72 out of 80 CUs, working out to 4,608 stream processors, the same 16 GB 256-bit GDDR6 memory configuration as the flagship, while its engine clocks go up to 2.25 GHz.
A notch below the RX 6800 XT is the RX 6800 (Navi 21 XL), which cuts "Navi 21" down further to 64 compute units or 4,096 stream processors, paired with the same 16 GB of 256-bit GDDR6 memory and engine clocks of up to 2.15 GHz. The RX 6900 XT, along with the RX 6800 series, will be announced at the October 28 event.
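The stream-processor figures above follow directly from the compute-unit counts, since an RDNA2 compute unit contains 64 stream processors. A minimal sketch of that arithmetic, using only the CU counts reported in the leak:

```python
# Stream processors per SKU, derived from the leaked compute-unit counts.
# RDNA2 packs 64 stream processors into each compute unit, which is how
# 80 CUs works out to 5,120 stream processors.
SP_PER_CU = 64

navi21_skus = {
    "RX 6900 XT (Navi 21 XTX)": 80,
    "RX 6800 XT (Navi 21 XT)": 72,
    "RX 6800 (Navi 21 XL)": 64,
}

for name, cus in navi21_skus.items():
    print(f"{name}: {cus} CUs -> {cus * SP_PER_CU:,} stream processors")
```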

The other chip powering the lineup is the 7 nm "Navi 22" silicon, which features 40 compute units. On paper this count looks similar to that of "Navi 10," and it remains to be seen whether this is a re-badge or a new chip based on RDNA2. The RX 6700 XT maxes this silicon out with 40 CUs or 2,560 stream processors, while the RX 6700 features fewer CUs (possibly 36). The interesting thing about these two is their memory configuration: 12 GB of GDDR6 on a 192-bit bus.
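Neither the leak nor AMD has confirmed memory data rates, but the bus widths alone bound the raw bandwidth on offer. A rough sketch, assuming a purely hypothetical 16 Gbps GDDR6 data rate that is not part of the leak:

```python
# Theoretical peak memory bandwidth from bus width and per-pin data rate:
#   bandwidth (GB/s) = (bus width in bits / 8) * data rate in Gbps
# The 16 Gbps figure is an assumption for illustration only; the leak does
# not mention memory speeds.
def peak_bandwidth(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

for chip, bus in (("Navi 21 (256-bit)", 256), ("Navi 22 (192-bit)", 192)):
    print(f"{chip}: {peak_bandwidth(bus, 16.0):.0f} GB/s at a hypothetical 16 Gbps")
```

At the same data rate, the 192-bit bus would deliver exactly three quarters of the 256-bit configuration's bandwidth.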
Source: VideoCardz

191 Comments on AMD Radeon RX 6000 Series Specs Leak: RX 6900 XT, RX 6800 XT, RX 6700 Series

#151
Punkenjoy
Super XP: The Infinity Cache looks quite interesting. AMD releasing a 256-bit bus shows they are confident in their innovative IF Cache. Looking forward to seeing this thing in action. I wonder if the consoles had anything to do with this type of design, keeping costs down while maintaining performance/watt.
Maybe, but I suspect it's related to the Navi 3x rumours about using chiplets like Zen 2/3. They probably want each core complex to be as self-sufficient as possible.

I also don't expect 128 MB of cache. Maybe more than 1 MB of L1 and 8 MB of L2. AMD is adding new features like hardware ray tracing and variable rate shading, and we don't know yet how much space these new features and the other architectural changes will use.

But in the end it doesn't really matter. It's just fun to speculate. Only benchmarks will matter after launch.
#152
mrjayviper
Is the 6700xt the spiritual successor to the 5700xt?

Thanks
#153
Max(IT)
mrjayviper: Is the 6700xt the spiritual successor to the 5700xt?

Thanks
I don’t think so.
6700 will probably be similar to 5700XT but with RT hardware support.

The “spiritual” successor of 5700XT will probably be the 6800XT, with the 6900XT on an upper level
#154
Zach_01
Punkenjoy: Maybe, but I suspect it's related to the Navi 3x rumours about using chiplets like Zen 2/3. They probably want each core complex to be as self-sufficient as possible.

I also don't expect 128 MB of cache. Maybe more than 1 MB of L1 and 8 MB of L2. AMD is adding new features like hardware ray tracing and variable rate shading, and we don't know yet how much space these new features and the other architectural changes will use.

But in the end it doesn't really matter. It's just fun to speculate. Only benchmarks will matter after launch.
It is said that AMD's architecture won't have dedicated units for real-time ray tracing and a DLSS equivalent, but will implement those inside the "regular" pipeline. We really don't know if, and by how much, that increases the GPU's transistor count the way Nvidia's architecture does.

Personally, I do believe it will have a large cache, more than 100 MB, to compensate for the narrow memory bus.
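As a purely illustrative toy model of how a large on-die cache could offset a narrow bus (none of these figures are AMD's, and the hit rate is made up):

```python
# Toy model (not AMD's numbers): if a fraction `hit_rate` of memory traffic
# is served from an on-die cache, the external bus only carries the misses,
# so the effective bandwidth seen by the GPU is roughly
# dram_gbs / (1 - hit_rate), capped by the cache's own bandwidth.
def effective_bandwidth(dram_gbs, hit_rate, cache_gbs):
    return min(dram_gbs / (1.0 - hit_rate), cache_gbs)

# Hypothetical example: a 512 GB/s bus with a 50% hit rate behaves roughly
# like a 1 TB/s one, assuming the cache itself is not the bottleneck.
print(effective_bandwidth(dram_gbs=512, hit_rate=0.5, cache_gbs=2000))  # 1024.0
```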
#155
RoutedScripter
Here we go again, I have to hide my eyes while typing this so I don't see all these spoilers!!!

I only learned about this now, but it's pure drama as expected. See you all later on the 29th, after the proper reveal.

I get more happiness and enjoyment out of watching a proper reveal than out of any of this, in the end.
#156
Zach_01
RoutedScripter: Here we go again, I have to hide my eyes while typing this so I don't see all these spoilers!!!

I only learned about this now, but it's pure drama as expected. See you all later on the 29th, after the proper reveal.

I get more happiness and enjoyment out of watching a proper reveal than out of any of this, in the end.
Good for you!
Enjoy your 3070 and leave us here in our misery if you don't have an actual opinion or anything constructive to say.
#157
bug
Zach_01: Good for you!
Enjoy your 3070 and leave us here in our misery if you don't have an actual opinion or anything constructive to say.
What is there to say constructively about leaked specs, really?
#158
RoutedScripter
Zach_01: Good for you!
Enjoy your 3070 and leave us here in our misery if you don't have an actual opinion or anything constructive to say.
Gee, thanks. I've never had an Nvidia GPU in my life.

Yes, of course, something constructive to say: after being spoiled by all the leaks, I could only go hide my head in the sand and lose any motivation to talk about it. When I was young I also wanted leaks and spent a ton of time on forums scouring the web for leads. It can be interesting, but it eats a lot of time, and it can get crazy if you don't slow the horses down. Now, with everything else considered (life, health, priorities), I find it spoiling. I'm honestly not going to lose that many nerves and that much time over the highly stressful phenomenon this leak-hunting sport has become. It's taking over so many people's lives: dribbling about GPU leaks all day long, getting seriously bothered, people losing it when they can't get a GPU on day one, or getting angry enough to hassle customer support over some minor argument. All that complaining spreads negativity, and nobody reading such negative news is going to pull anything inspiring out of it; that's not enjoyment. Holding companies to high standards and fairness is one thing, but it's all about the proper approach.

I'm commenting in general, not saying it's necessarily like this everywhere, but most of these discussions are more bickering than anything, and they seem to be heavily integrated into some people's lives, with a lot of precious time invested in what is, in the end, only consumption of entertainment and of force-fed information and ideas someone else developed. Playing games, and only doing that, isn't some high achievement for humanity; it's being fed down the throat, and losing so many nerves over it shows a failure to understand what's really happening here. If only it were about work and workstations, about actual practical things we need, like a proper prosumer hardware segment or a proper pro-user OS; so many things are crappy in that space, and not a single one of these gamer dudes is ranting about that. And no, Linux is not a replacement for Windows: Linux is a bunch of industry-related groups with their own agendas, mostly favouring their own goals rather than those of an optimal prosumer (pro-user). So many things in Linux are either missing entirely or convoluted where Windows needs just one or two clicks, and it isn't even about clicks; I'm talking about the most primitive, standard, simple things that should NOT take more than one or two clicks or key presses. Most Linux GUIs are, in fact, not meant for pro-users either. The most practical things that would seriously de-confuse all of this aren't being done; not much attention, thought, or programming effort is focused on them. I think it's first of all a failure to understand the issues; people aren't consciously aware of them. Yet they all praise themselves for advancing... advancing what? They aren't advancing the things that would bring back a golden age of PCs, get rid of all this drama, and let us enjoy things a lot more overall. So much of tech news and PC desktop tech is boring in so many ways; many of the advancements that do happen barely matter in practice to the average or pro user. Yes, you have faster speeds, yes this and that, but are you really enjoying the games THAT much? Or is it more like 20% enjoyment and 80% forum, driver, OS, reboot, reinstall, crash, error, bug, issue, customer-support, new-CPU, hardware-failure drama... and then people still find the time to push their already exhausted bodies into bickering about speculation and leaks, which is the last thing one should do for one's sanity.

Do you people understand that with the combined effort of all these devs we could have created a much better experience across the PC desktop space? There could be courses where beginners are taught, instead of being left to churn in this never-ending drama, because so many companies and FOSS programmers just AREN'T WORKING IN YOUR BEST, WISEST AND PROPER INTEREST. So many programmers are wasting time fiddling with some worthless FOSS apps just for the sake of programming and for the dopamine rush of feeling like they're contributing to a social group and are part of it; it's a form of sport too. They're not really trying to improve the biggest offender, the elephant in the room, the freaking Windows OS. No, they're busy developing their Python pixie-dust app that does some XYZ thing on your desktop and isn't fundamentally going to change the course at all. So much stuff out there is lovely, nice, but it's NOT ENOUGH; it isn't making a fundamental impact, because the whole culture doesn't know how to analyse, properly identify, and tackle this issue. We need to fundamentally rethink the whole ecosystem. Unfortunately it starts with finances and with whoever has the hardware manufacturing capability, but if existing customers banded together they COULD have a chance of forcing the current industry culture to radically change, if they really wanted to. But several players DO enjoy this set of circumstances: the ones who profit from it. One thing is for sure: websites that publish leak news PROFIT from all this pointless and life-sucking drama.

Many more people than I realized are overtaken and seriously bothered by all of this. If a leak doesn't meet their expectations, they practically blow up. What? It was all speculation and a lottery anyway. I don't think they even realize it's taking a toll on their health, because it's such a stressful experience and lifestyle.
On top of everything else in life (school, parents, the transition to adult life, adult-life dangers, mafias, drugs, evil corporations and predatory military-grade advertisements), the average teenager goes through a meat grinder of stress and psychological assaults, and they rarely figure it out in time. This obsession with leaks is one of those traps, and IMO they don't realize it. It's probably a form of sport, but constant excitement that goes into overdrive and into stressful arguments is unwise.

I've talked about this before, so this time I'll be shorter. A lot of it comes down simply to beginner inexperience; that's what the average gamer is, a developing human who is relatively inexperienced. If previous generations don't do a good enough job teaching the next, then this is what you get. The next generation always faces the problem of having to re-figure and rediscover what older generations already learned but never put enough thought into passing on effectively.
Certainly the previous generations can't be blamed for everything; it's a loop. They had the same problem themselves, and they aren't experienced enough either, otherwise they would know how important it is to pass on knowledge and discoveries.
New times bring new things and challenges that weren't explored before. These gamer kids have this challenge ahead of them to figure out, and unless someone warns them about it, they may be at it for a long, long time. They should ask themselves how much precious effort, time and mental capacity goes into bickering about hardware specs all day long; time and effort we all need for many other things in life. Do they have a sense of how much time they spend speculating and theorizing? With experience and perspective, all this spec speculation and leak-chasing starts to feel insignificant next to everything else in life that justifiably warrants more time and effort. This whole problem isn't really unique to this field or topic.

Look, a proper technical discussion with your buddies over a weekend visit or an afternoon at a bar is one thing, but this craziness running around the web is just ridiculous. It feels like a kindergarten circus, jittery and frantic half the time. It's almost as if everything is about leaks: leaks this, leaks that, and half the time the whole experience relies on leaks and rumours. I would even blame the manufacturers themselves, who in some ways benefit from the generated buzz, while the official information from most of today's crony corporations is confusing and bent against the customer. This is evident in the product labelling and model-numbering systems: in almost every case the model numbers are confusing on purpose, making it an extra chore for a customer to figure out exactly what fits their needs.

Then there's the other part, the "who knows what a few hours earlier than someone else" game, and the funny thing is that it's usually people who have no practical or financial gain from knowing this information earlier than they officially would. It's so silly. The ones who benefit most from leaks are the few engineers working at the competitor; the second group is probably shady investors and financial speculators, plus other employees who are up to no good. But IMO everyone else in the community is in it just for the sport. It's all about being part of a campaign, a social group, and about getting that dopamine rush of feeling like a "worthy" member of it: people get rated on who knows more details about which hardware part, and they brag to prove their worth in the group. All this drama and buzz directly benefits various tech websites to varying degrees, which abuse the phenomenon for profit, and profiting from the drama means they have no interest in doing anything about its problems.

What I'm warning about is that these gamer dudes should know they're part of a larger bowl of soup: this whole hype train is a business method. If it were fairer that would be another thing, but these gamer people don't know that, and IMO they get nothing in return except that psychological excitement. The limited dopamine rush they get from the hype is negated whenever something goes against their expectations and they get upset, so they're constantly causing themselves harm.
Excitement is good, but only in short bursts; it adds to stress, and chronic stress is one of the unhealthiest things there is. You're pushing your body day in and day out to be hyped and excited, running around the tech web like a crazed chicken with its head cut off...
I do understand the hype around CPUs and GPUs can be fun, I've been there, done that, but I'd rather be hyped mostly on the big day of the announcement and keep a healthy, balanced dose of excitement. What some people do is total overkill that morphs into its opposite, and I think it does them more harm than good overall. Some gamers probably sit for very long stretches, which is very unhealthy, just navigating the web and frantically pressing F5 to refresh; that's actually a sign of some other health issue showing itself through these symptoms, so either way it's not normal IMO. This kind of broken, unhealthy hype and excitement should not be passed on to the next generations.
#159
Max(IT)
wow... both of you derailed the thread in a big way. o_O
#161
Max(IT)
mrthanhnguyen: 6900xt ($350) = 3080 ($700)
That would be awesome.
That would be impossible :kookoo:
#162
Punkenjoy
I think some people think AMD is a non-profit organization...
#163
Max(IT)
Punkenjoy: I think some people think AMD is a non-profit organization...
They just have unrealistic expectations
#164
Zach_01
Punkenjoy: I think some people think AMD is a non-profit organization...
Max(IT): They just have unrealistic expectations
No no... just look at the $1,500 MSRP GPU he got... His comment's meaning is elsewhere.
mrthanhnguyen: 6900xt ($350) = 3080 ($700)
That would be awesome.
It would be awesome even at $1,000, and it would make your fake Titan of $1,500+ look s... s... stupid, I want to say? I'm not sure... oh well... I'm confused...
#165
Camm
The question, I think, is whether RT+DLSS will act as the next PhysX / Gameworks / Tessellation / G-Sync spoiler, like in previous generations where AMD had the faster card in rasterisation (or, in general, the better card) but the market saw some feature or strength from Nvidia as being worth the price premium / the worse product.

Against DLSS, I expect AMD to deploy a DirectML solution that will work in any DX12 title, mostly negating DLSS (with the proviso that you will need to run at a higher resolution on AMD than with DLSS for no noticeable quality loss). RT is more of a wildcard (I expect AMD to be slower); games are starting to get it more and more, but still at questionable levels of use.
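To put a rough number on why that trade-off is attractive (generic arithmetic, not tied to any confirmed AMD or Nvidia implementation): shading cost scales roughly with rendered pixel count, so a lower internal resolution plus an upscaler saves most of the per-frame work.

```python
# Rendering internally at 1440p and upscaling to 4K shades well under half
# as many pixels per frame; the upscaler (DLSS, or a hypothetical
# DirectML-based equivalent) then has to reconstruct the missing detail.
def pixel_ratio(render_res, output_res):
    rw, rh = render_res
    ow, oh = output_res
    return (rw * rh) / (ow * oh)

print(f"1440p -> 4K renders {pixel_ratio((2560, 1440), (3840, 2160)):.0%} of native pixels")
# ~44% of the native pixel count, which is where most of the speed-up comes from.
```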
#166
kapone32
ShurikN: AMD's reference PCB and board design are among the best on the market. Everything is over-engineered, and it rivals much more expensive Sapphire models, for example.
I hope they don't make it a pain to install a waterblock.
#167
bug
mrthanhnguyen: 6900xt ($350) = 3080 ($700)
That would be awesome.
Not if you were a shareholder, it wouldn't ;)
#168
Camm
bug: Not if you were a shareholder, it wouldn't ;)
I do think it would be a mistake for AMD not to price aggressively. AMD has a mindshare problem in the GPU market, and turning all the screws on Nvidia while they can't respond effectively would put AMD in the best position going forward.

That being said, too many idiots still think of AMD as the no-frills or budget brand, when the reality is they have plenty of world-beating products.
#169
N3M3515
mrthanhnguyen: 6900xt ($350) = 3080 ($700)
That would be awesome.
More like $650
#170
bug
Camm: I do think it would be a mistake for AMD not to price aggressively. AMD has a mindshare problem in the GPU market, and turning all the screws on Nvidia while they can't respond effectively would put AMD in the best position going forward.

That being said, too many idiots still think of AMD as the no-frills or budget brand, when the reality is they have plenty of world-beating products.
Pricing aggressively is one thing, but same performance at half-price is just nuts.
Pricing aggressively usually means selling at close to zero profit or subsidizing from other divisions. Only AMD knows if Ryzen+Epyc allows them to do that already.
#171
Camm
bug: Pricing aggressively is one thing, but same performance at half-price is just nuts.
Pricing aggressively usually means selling at close to zero profit or subsidizing from other divisions. Only AMD knows if Ryzen+Epyc allows them to do that already.
My apologies if it came across as though I was advocating for stupid prices (I thought I covered that when I said too many idiots treat AMD as a budget brand, but apparently not).
#172
renz496
Camm: The question, I think, is whether RT+DLSS will act as the next PhysX / Gameworks / Tessellation / G-Sync spoiler, like in previous generations where AMD had the faster card in rasterisation (or, in general, the better card) but the market saw some feature or strength from Nvidia as being worth the price premium / the worse product.

Against DLSS, I expect AMD to deploy a DirectML solution that will work in any DX12 title, mostly negating DLSS (with the proviso that you will need to run at a higher resolution on AMD than with DLSS for no noticeable quality loss). RT is more of a wildcard (I expect AMD to be slower); games are starting to get it more and more, but still at questionable levels of use.
Even so, it is not something that will work automatically, just like DX12 multi-GPU. The biggest problem with machine learning is that it needs to be trained first (it's in the name). That takes time and resources to do right, so even if it works with any DX12 title, those specific titles still need training. With DLSS, the training is mostly financed by Nvidia as part of the sponsorship. Historically, AMD has not been fond of things that require them to spend their own resources; they would rather someone else cover the expense, be it the game developers themselves or their hardware partners. That's probably why, in the past, AMD looked at DLSS's weaknesses and tried to counter it with a completely different technique (image sharpening).
#173
Camm
renz496: It is not something that will work automatically, just like DX12 multi-GPU.
There's a reason I mentioned DirectML: it can be implemented at the pipeline rather than the engine level, negating the need for explicit support. Training I could see coming as part of driver updates or covered by a generic model, especially for games that support dynamic resolution (although that would have to be exposed specifically for an AMD solution to work). I doubt AMD can match the quality and performance of DLSS, but it doesn't need to if it can get to 90% of the quality at a low enough internal resolution to make a performance difference.
#174
goodeedidid
No GDDR6X? I guess this isn't really going to be on par with the 3080, as some people were assuming.