Tuesday, August 11th 2020

AMD RDNA 2 "Big Navi" to Feature 12 GB and 16 GB VRAM Configurations

As we get closer to the launch of RDNA 2 based GPUs, which are supposedly coming in September this year, the number of rumors is starting to increase. Today, a new rumor comes our way from the Chinese forum Chiphell. A user called "wjm47196", known for posting rumors and all kinds of other information, claims that AMD's RDNA 2 based "Big Navi" GPU will come in two configurations - 12 GB and 16 GB VRAM variants. Since this is the Navi 21 chip, which represents the top-end GPU, it is logical that AMD would equip it with a larger amount of VRAM such as 12 GB or 16 GB. It is possible that AMD could separate the two variants like NVIDIA has done with the GeForce RTX 2080 Ti and Titan RTX, so the 16 GB variant would be a bit faster, possibly featuring a higher number of streaming processors.
Sources: TweakTown, via Chiphell

104 Comments on AMD RDNA 2 "Big Navi" to Feature 12 GB and 16 GB VRAM Configurations

#26
mouacyk
TheLostSwedeWith bath salts...
I'm actually swimming in the ocean.
Posted on Reply
#27
Valantar
londisteCould be 256-bit and use 2GB chips.
True, it could be, but 256-bit for a high-end or flagship card in 2020? Not happening.

Me? Still hoping for HBM2e, if nothing else for the combination of efficiency and compactness.
Posted on Reply
#28
windwhirl
ValantarTrue, it could be, but 256-bit for a high-end or flagship card in 2020? Not happening.

Me? Still hoping for HBM2e, if nothing else for the combination of efficiency and compactness.
HBM2e? Does that make sense for this kind of product, though? Not just for the price, but also because IIRC there were problems implementing it a couple years ago (uneven height complicating cooling is what comes to mind right now, but I can't really remember for sure)
Posted on Reply
#29
Legacy-ZA
Well, it is likely; didn't AMD release a statement earlier that 8GB VRAM should be the bare minimum?
Posted on Reply
#30
Valantar
windwhirlHBM2e? Does that make sense for this kind of product, though? Not just for the price, but also because IIRC there were problems implementing it a couple years ago (uneven height complicating cooling is what comes to mind right now, but I can't really remember for sure)
Given that AMD used more expensive HBM2 on the Vega cards, even accounting for higher capacity, HBM2e shouldn't be out of reach for these, and would provide all the bandwidth you could possibly want even with relatively low-end chips (they don't need the top-end 3.2Gbps/pin version; 2.5 or 2.7 would still make 2 stacks faster than a 512-bit GDDR6 bus). Of course this is pure speculation, and given the price of HBM (even 2e) and interposer packaging, GDDR6 is more likely for anything below $1000. But one can dream, no? It just depends on how much AMD wants to increase their margins compared to previous high-end cards. Of course, the VII had 16GB of HBM2 at $699, so that's something, but that was a cut-down workstation chip and can't really be counted ...
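For reference, here is the rough peak-bandwidth arithmetic behind comparisons like that one; the per-pin rates and bus widths below are assumptions for illustration, not confirmed specs for any upcoming card:

```python
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    # Peak bandwidth in GB/s: total pins * per-pin rate (Gbit/s) / 8 bits per byte
    return bus_width_bits * gbps_per_pin / 8

# HBM2e: 1024-bit interface per stack; the per-pin rates here are speculative
for stacks, rate in [(2, 2.5), (2, 2.7), (2, 3.2)]:
    print(f"{stacks} x HBM2e stacks @ {rate} Gbps/pin: "
          f"{peak_bandwidth_gbs(stacks * 1024, rate):.0f} GB/s")

# GDDR6 at common per-pin rates, for a few hypothetical bus widths
for width in (256, 384, 512):
    for rate in (14, 16):
        print(f"{width}-bit GDDR6 @ {rate} Gbps/pin: "
              f"{peak_bandwidth_gbs(width, rate):.0f} GB/s")
```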

As for the issues, there were some issues with uneven package height with Vega 56/64, but given those experiences they should really have that straightened out by now. Besides, well-made coolers still performed well on those GPUs, though sufficient mounting pressure was a necessity.
Legacy-ZAWell, it is likely; didn't AMD release a statement earlier that 8GB VRAM should be the bare minimum?
If you are thinking of what I think you are thinking of, they were just advertising that their low end cards come in 8GB options compared to Nvidia's 4GB or 6GB options.
Posted on Reply
#31
R0H1T
londiste12GB and 16GB implies different memory bus widths, which should be significant.
By significant, do you mean 256-bit wide? GDDR6 can easily go to 16GB with that bus width, given the densities we already have. I seriously hope HBM or low-cost HBM makes a comeback for mainstream GPUs; we are at an inflection point where the significant capacity investments & rewards should now "trickle down" to the consumer market!
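For anyone who wants the arithmetic behind that: GDDR6 puts each chip on a 32-bit channel, and chips currently ship in 1 GB (8 Gbit) and 2 GB (16 Gbit) densities, so total capacity is just chip count times density. A minimal sketch (the bus widths below are hypothetical):

```python
def gddr6_capacity_gb(bus_width_bits: int, gb_per_chip: int) -> int:
    # Each GDDR6 chip sits on a 32-bit channel, so chip count = bus width / 32
    chips = bus_width_bits // 32
    return chips * gb_per_chip

# Hypothetical bus widths vs the two shipping chip densities
for width in (192, 256, 384, 512):
    for chip_gb in (1, 2):
        print(f"{width}-bit bus, {chip_gb} GB chips: "
              f"{gddr6_capacity_gb(width, chip_gb)} GB")
# e.g. 256-bit with 2 GB chips gives 16 GB; 384-bit with 1 GB chips gives 12 GB
```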
Posted on Reply
#32
dicktracy
AMD: “we’re still here. Please give us attention!“

99% of gamers aren’t even remotely interested in any Big Navi leak XD
Posted on Reply
#33
ARF
JAB CreationsI'm all against dumb rumors but that doesn't negate logical deduction:
  • AMD did not release a high end card in the 5000 series.
  • A high end card is naturally going to have more video memory to handle higher resolutions.
  • AMD released a 7nm 16GB HBM2 card on February 7th, 2019.
  • AMD has had more money for R&D to justify investments in lower margin products such as Radeon video cards.
  • AMD releasing a 16GB card with GDDR6 is not an unreasonable assumption.
Sure, it's not been announced, but it's not like there are rumors suggesting that a 6700 will be the top card in the RDNA2 lineup.
Radeon VII has 16GB of VRAM because of severe technology limitations.
It's still a mid-range card, shrunk from the RX Vega 64, and it sits in the same class as the RX 5700 XT.
Posted on Reply
#34
Patriot
ChomiqYou tried so hard to prove your point and yet you missed the obvious Jonestown massacre reference. Try harder next time.

PS.
The "smaller good guy" has an estimated net worth of almost a hundred billion USD. It's value has been on rise for 4 years now. It's at no risk of getting "wiped off the map".
Maybe if you got the drink right... Flavor Aid not Kool-aid.
Also suggesting that AMD fans should commit suicide is...
Posted on Reply
#35
Chomiq
PatriotMaybe if you got the drink right... Flavor Aid not Kool-aid.
Also suggesting that AMD fans should commit suicide is...
Primo - en.wikipedia.org/wiki/Drinking_the_Kool-Aid
Secundo - Fanboys, not fans.

And yeah, if you're taking that literally good luck to you.
Posted on Reply
#36
Patriot
ChomiqPrimo - en.wikipedia.org/wiki/Drinking_the_Kool-Aid
Secundo - Fanboys, not fans.

And yeah, if you're taking that literally good luck to you.
The article you linked says it's Flavor Aid, bud.
And it's a reference to the mass suicide of a cult... so... perhaps you shouldn't reference it if you don't mean it.
Posted on Reply
#37
Chomiq
Patriotlol, even the article you linked says it's Flavor Aid, bud.
Sure, and that's why the expression is "It's time to drink the Flavor Aid", bud.

No more from me, mods will clean this up anyway.
Posted on Reply
#38
londiste
R0H1TBy significant, do you mean 256-bit wide? GDDR6 can easily go to 16GB with that bus width, given the densities we already have. I seriously hope HBM or low-cost HBM makes a comeback for mainstream GPUs; we are at an inflection point where the significant capacity investments & rewards should now "trickle down" to the consumer market!
Unless they go for mismatched RAM chip sizes, the memory bus width would be 512/384-bit (with 1 GB chips) or 256/192-bit (with 2 GB chips).
ValantarTrue, it could be, but 256-bit for a high-end or flagship card in 2020? Not happening.
RTX2080 and 5700XT say hi. Rumors so far say 3070 as well, which should count as high-end. :D
Posted on Reply
#39
Valantar
londisteUnless they go for mismatched size of RAM chips, the memory bus width would be 512/384-bit or 256/192-bit.
RTX2080 and 5700XT say hi. Rumors so far say 3070 as well, which should count as high-end. :D
Performance- and price-wise (adjusted for the current GPU market), the 5700 XT is upper midrange. Not high end, and definitely not a flagship. AMD's last flagship was the Fury X, though the VII also pretended to be one for a few months. Flagship pricing has gone bonkers since then, but even so, $400 was not high end GPU territory - the GTX 980 launched at $550. I would say high end starts at ~$500 with performance to match. Midrange (if we're working with a typical low-mid-high market overview) is the broadest range in terms of performance, going from a step above entry level at ~>$200 and up, with low end being anything below that. Flagship is whatever is the highest tier, money-is-no-object, PR move of a SKU. The 3070, if it keeps pace in price with previous xx70 SKUs, should be in the upper reaches of the midrange.
Posted on Reply
#40
medi01
Chrispy_I honestly think that Nvidia has a ~20% architectural advantage over AMD
Looking at the 250 mm² 5700/XT vs the 2070/2060 and Supers, seriously?
ChomiqAMD fanboys will be drinking Kool-Aid if rdna2 becomes a failure.
AMD managed to crush (product-line-wise) almighty Intel, which was much further ahead than NV is, even back in Pascal/Polaris times. AMD getting to a full-range competitive portfolio in the GPU market (oh, and that includes the datacenter too) is a matter of months, not years.

That won't make greenboi reasonable, but, oh well, it might ultimately be good for the industry (more money flowing into it, outrageously overpriced NV cards letting AMD up its pricing too).
Posted on Reply
#41
Super XP
TheLostSwedeWith bath salts...
Um emm how about Organic Salts. lol
Posted on Reply
#42
arbiter
gridracedriverif the full-chip RDNA2 is 16GB, it means the bus is 512-bit, an NVIDIA killer?
Um, AMD has historically had massive memory pipes on its GPUs and still struggled performance-wise vs NVIDIA. There have been many NVIDIA cards that got stomped on memory bandwidth by AMD, yet when you put the two cards head to head, it was usually the NVIDIA card that had better FPS. I wouldn't use memory bus width to make any claims, as history hasn't been so kind to it.
Posted on Reply
#43
Super XP
arbiterUm, AMD has historically had massive memory pipes on its GPUs and still struggled performance-wise vs NVIDIA. There have been many NVIDIA cards that got stomped on memory bandwidth by AMD, yet when you put the two cards head to head, it was usually the NVIDIA card that had better FPS. I wouldn't use memory bus width to make any claims, as history hasn't been so kind to it.
If games don't take advantage of all that memory, then games wouldn't be a good way to see how it would perform with heavily memory-oriented programs.
Posted on Reply
#44
lemoncarbonate
Looking to upgrade my trusty RX 480, which IMO was groundbreaking and the best cost/performance ever on a graphics card. Let's see which route is the better upgrade, RTX 3xxx or RDNA2.
Posted on Reply
#45
InVasMani
londiste12GB and 16GB implies different memory bus widths, which should be significant.
Seems like they are probably going with 192-bit and 256-bit bus widths given the GDDR6 memory. It would be really interesting if AMD went with a 288-bit bus width on the 12GB part.
Posted on Reply
#46
johnny-r
This is so cool. I still haven't upgraded my RX 580, and I'm sort of glad I waited; this will be an awesome upgrade to complement my gen-2 CPU :-)
Posted on Reply
#47
JAB Creations
ARFRadeon VII ... It's still a mid-range card...
If you think a $700 card on par with a 2080 is mid-range, then you're going to have to wait until the aliens visit to get high end.
Posted on Reply
#48
FreedomOfSpeech
This time I wanted to switch to AMD for the first time in my life. Last Christmas I bought an LG C9 because of its 4K @ 120 Hz VRR (G-Sync/FreeSync). NVIDIA's RTX generation can do VRR over HDMI 2.0 as G-Sync Compatible. Yesterday I read that the 2019 LG OLEDs won't work with Big Navi over FreeSync. This means I have to buy NVIDIA again...
Posted on Reply
#49
Turmania
Call me pessimistic, but I believe the gap between NVIDIA and AMD will grow even bigger when they both launch their new-gen products this year.
Posted on Reply
#50
medi01
FreedomOfSpeech2019 LG OLEDs won't work with Big Navi
The 2019 LG OLEDs support HDMI VRR, which Big Navi is highly unlikely NOT to support. Please don't spread FUD, user who registered today to post that strange message.
Posted on Reply