Friday, August 14th 2020

Micron Confirms Next-Gen NVIDIA Ampere Memory Specifications - 12 GB GDDR6X, 1 TB/s Bandwidth

Micron has spilled the beans on at least some of the specifications for NVIDIA's next-gen Ampere graphics cards. In a new tech brief posted by the company earlier this week, hidden away behind Micron's market outlook, strategy and positioning, lie some secrets NVIDIA might not be too keen to see divulged before their #theultimatecountdown event.

In a comparison of ultra-bandwidth solutions, under the GDDR6X column, Micron lists a next-gen NVIDIA card under the "RTX 3090" product name. According to the spec sheet, this card features a total memory capacity of 12 GB of GDDR6X, achieved through 12 memory chips on a 384-bit wide memory bus. As we saw today, only 11 of these chips appear to be populated on the RTX 3090, which, paired with the GDDR6X memory chips' rated speeds of 19-21 Gbps, puts total memory subsystem bandwidth in the 912-1008 GB/s range (using 12 chips; 11 chips yields 836 GB/s at minimum). It's possible the "RTX 3090" product name isn't an official NVIDIA product at all, but rather a Micron guess, so don't take it as a factual representation of an upcoming graphics card. One other interesting aspect of the tech brief is that Micron expects its GDDR6X technology to enable 16 Gb (2 GB) density chips running at 24 Gbps as early as 2021. You can read the tech brief - which names NVIDIA as a development partner for GDDR6X - by following the source link and clicking on "The Demand for Ultra-Bandwidth Solutions" document.
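As a quick sanity check on those figures, here is a minimal Python sketch of the underlying arithmetic (the chip counts, 32-bit per-chip interface and 19-21 Gbps per-pin data rates are taken from the numbers above; the rest is just illustration):

```python
# Aggregate memory bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def gddr6x_bandwidth_gbs(chips, gbps_per_pin, bits_per_chip=32):
    bus_width = chips * bits_per_chip  # e.g. 12 chips x 32-bit = 384-bit
    return bus_width * gbps_per_pin / 8

print(gddr6x_bandwidth_gbs(12, 19))  # 912.0 GB/s - lower end of the quoted range
print(gddr6x_bandwidth_gbs(12, 21))  # 1008.0 GB/s - upper end, just past 1 TB/s
print(gddr6x_bandwidth_gbs(11, 19))  # 836.0 GB/s - with only 11 chips populated
```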
Source: Micron

53 Comments on Micron Confirms Next-Gen NVIDIA Ampere Memory Specifications - 12 GB GDDR6X, 1 TB/s Bandwidth

#1
TheLostSwede
News Editor
Back to 1GB per chip explains why the presumed picture of the RTX 30x0 (21x0?) card has so many memory chips. Even if it's "only" 12, they're going to take up a lot of physical space until 2GB chips arrive at some point. I guess the next Super cards might have half the amount of memory chips...
Posted on Reply
#2
RedelZaVedno
Does this mean that GDDR6X is as fast as HBM2?
Posted on Reply
#3
ThrashZone
Hi,
Death rate after 6 months will say whether I'll buy anything with micron.
Posted on Reply
#5
TheLostSwede
News Editor
RedelZaVedno: Does this mean that GDDR6X is as fast as HBM2?
Not quite, but almost.
Posted on Reply
#6
EarthDog
ThrashZone: Hi,
Death rate after 6 months will say whether I'll buy anything with micron.
Samsung equipped cards died too...it wasn't just micron. ;)
Posted on Reply
#7
TheLostSwede
News Editor
ZoneDymo: Is 12 not a bit low?
Depends on how deep your pockets are...
Posted on Reply
#8
Daven
RedelZaVedno: Does this mean that GDDR6X is as fast as HBM2?
There is a table in this very article that summarizes all the memory tech from GDDR5 to HBM2e.
Posted on Reply
#9
oxrufiioxo
20+ GB always seemed a bit too much to me for a consumer-level GPU, given how much that would increase the cost of the card... I'm more interested in what the 3080 will have - will it be 8 or 10? I have a person I did a build for waiting for one.

I'm also interested in what Nvidia charges for the 3090, and whether it's the actual flagship or we'll see a 3090 Ti or Super down the line.
Posted on Reply
#10
londiste
RedelZaVedno: Does this mean that GDDR6X is as fast as HBM2?
Micron's real-world examples are GDDR6X on a 384-bit bus (12 chips) vs HBM2 on a 4096-bit bus (4 stacks). In this configuration, almost.
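A rough sketch of that comparison, assuming a typical HBM2 per-pin rate of around 2.0 Gbps (the GDDR6X figures are the 19-21 Gbps from the tech brief):

```python
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    # Aggregate bandwidth in GB/s = bus width (bits) x per-pin rate (Gbps) / 8
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_gbs(384, 19), bandwidth_gbs(384, 21))  # GDDR6X, 12 x 32-bit chips: 912.0 1008.0
print(bandwidth_gbs(4096, 2.0))  # HBM2, 4 x 1024-bit stacks at an assumed 2.0 Gbps/pin: 1024.0
```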
Posted on Reply
#11
ZoneDymo
TheLostSwede: Depends on how deep your pockets are...
Well, I'm more thinking of how an RX 480, which was a mid-range gaming card years back, shipped with 8 GB of RAM, so 12 on a brand new card seems low.
Posted on Reply
#12
EarthDog
ZoneDymo: Well, I'm more thinking of how an RX 480, which was a mid-range gaming card years back, shipped with 8 GB of RAM, so 12 on a brand new card seems low.
Or maybe 8GB on a 480 was more epeen than useful? That thing was a 1080p card, barely 1440, so 8GB on such a thing wasn't warranted IMO.
Posted on Reply
#13
RoutedScripter
Can't you guys add SPOILER warnings to news like this? I don't even know why the heck I'm reading the news section. I'm gone.
Posted on Reply
#14
Dristun
There aren't any games on the market that benefit from more than 8-9 gigs even in 4K. It's been tested to death already. I'm almost 100% sure that 12 will be plenty for the time the cards will actually be usable for comfortable 4K-gaming (next 4 years at best if we're lucky).
But of course people will fool themselves into "futureproofing" argument again, compounded by AMD once again loading cards unsuitable for 4K-60 with VRAM nobody needs. We're doing this thing every launch cycle!
Posted on Reply
#15
jesdals
Would rather like to know what's coming with HBM2E mem. Would love to see a 32GB VEGA 4.
Posted on Reply
#16
xkm1948
Huh, a 12GB Titan RTX? Never saw something like that. The Titan RTX was 24GB of VRAM. Guess the VRAM capacity is just a placeholder.
Posted on Reply
#17
RoutedScripter
EarthDog: Or maybe 8GB on a 480 was more epeen than useful? That thing was a 1080p card, barely 1440, so 8GB on such a thing wasn't warranted IMO.
Yeah, I have an RX 480 with 8GB; there's no way I'm changing it unless the new card has at least 12 or more.
Dristun: There aren't any games on the market that benefit from more than 8-9 gigs even in 4K. It's been tested to death already. I'm almost 100% sure that 12 will be plenty for the time the cards will actually be usable for comfortable 4K-gaming (next 4 years at best if we're lucky).
But of course people will fool themselves into "futureproofing" argument again, compounded by AMD once again loading cards unsuitable for 4K-60 with VRAM nobody needs. We're doing this thing every launch cycle!
That's where you're wrong. See, games do require a lot more, they just have to resort to annoying streaming, and texture streaming is very inconsistently developed, so the experience varies a lot depending on how much effort a dev put into that system. One of the simulator games I play takes a lot of RAM/VRAM and streaming isn't perfect; if we had all that in RAM it'd be no problem.

What about VR and huge resolutions like Varjo HMDs ...

Ultra-high resolution is eventually what will replace anti-aliasing as the proper way to get rid of rough edges, probably equally as or even more demanding than AA.
Posted on Reply
#18
ZoneDymo
EarthDog: Or maybe 8GB on a 480 was more epeen than useful? That thing was a 1080p card, barely 1440, so 8GB on such a thing wasn't warranted IMO.
Well, if more VRAM is available, then game developers can make use of it; no point in making a game devour 6+ GB of memory when 70% of the gamers out there have to make do with 1.5 GB of VRAM, right?

So in the sense of moving forward properly... yeah, I think 12 GB is low.
Dristun: There aren't any games on the market that benefit from more than 8-9 gigs even in 4K. It's been tested to death already. I'm almost 100% sure that 12 will be plenty for the time the cards will actually be usable for comfortable 4K-gaming (next 4 years at best if we're lucky).
But of course people will fool themselves into "futureproofing" argument again, compounded by AMD once again loading cards unsuitable for 4K-60 with VRAM nobody needs. We're doing this thing every launch cycle!
VRAM is not just about resolution; play some GTA5 and see what features ask for more VRAM, then consider that games could push those features way further: way higher resolution textures that can all be loaded in, or loaded in faster, with less pop-in.
The reason no games on the market benefit from more than 8-9 gigs is that there are currently barely any cards that have more than 8-9 gigs; it's the same reason not a single game is built from the ground up around ray tracing.

In order for games to have to use it, the hardware needs to exist, because otherwise nobody can play, and thus won't purchase, said games.

So with an eye on better graphics and giving devs more room, I think going for 12 GB now is... very disappointing.
I would honestly have hoped for double that in this day and age.

And because Big N is not giving more, it does not even matter what AMD will do, because again, all Nvidia consumers would not be able to use whatever game companies could have done with all that VRAM, so they just won't, and we will all just have to wait for the next next gen.
Posted on Reply
#19
steen
TheLostSwede: Back to 1GB per chip explains why the presumed picture of the RTX 30x0 (21x0?) card has so many memory chips. Even if it's "only" 12, they're going to take up a lot of physical space until 2GB chips arrive at some point. I guess the next Super cards might have half the amount of memory chips...
Clamshell mode. Good call on the Super/refresh for 2021. There were issues with >20 Gbps and it was dialled back to 19 Gbps. Don't know if it's sorted now.
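For anyone unfamiliar with clamshell mode: two chips share one 32-bit channel (each running x16), which doubles capacity without widening the bus. A minimal sketch of the capacity arithmetic, assuming today's 8 Gb (1 GB) parts:

```python
def vram_capacity_gb(channels, gb_per_chip=1, clamshell=False):
    # One chip per 32-bit channel normally; two chips per channel in clamshell (x16) mode
    chips = channels * (2 if clamshell else 1)
    return chips * gb_per_chip

print(vram_capacity_gb(12))                   # 12 GB: 12 chips on a 384-bit bus
print(vram_capacity_gb(12, clamshell=True))   # 24 GB: 24 chips, same 384-bit bus
print(vram_capacity_gb(12, gb_per_chip=2))    # 24 GB: 12 of the future 16 Gb (2 GB) chips
```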
Posted on Reply
#20
EarthDog
ZoneDymo: Well, if more VRAM is available, then game developers can make use of it; no point in making a game devour 6+ GB of memory when 70% of the gamers out there have to make do with 1.5 GB of VRAM, right?

So in the sense of moving forward properly... yeah, I think 12 GB is low.

VRAM is not just about resolution; play some GTA5 and see what features ask for more VRAM, then consider that games could push those features way further: way higher resolution textures that can all be loaded in, or loaded in faster, with less pop-in.
The reason no games on the market benefit from more than 8-9 gigs is that there are currently barely any cards that have more than 8-9 gigs; it's the same reason not a single game is built from the ground up around ray tracing.

In order for games to have to use it, the hardware needs to exist, because otherwise nobody can play, and thus won't purchase, said games.

So with an eye on better graphics and giving devs more room, I think going for 12 GB now is... very disappointing.
I would honestly have hoped for double that in this day and age.

And because Big N is not giving more, it does not even matter what AMD will do, because again, all Nvidia consumers would not be able to use whatever game companies could have done with all that VRAM, so they just won't, and we will all just have to wait for the next next gen.
Can't say I feel like devs are hamstrung by VRAM... You're putting the cart before the horse, IMO.

24GB is utterly useless and too pricey. By the time more than 12GB can be used at a reasonable resolution (because less than 2% are on 4K... and I can't think of a title that can use 12GB), the card might be too slow. Devs have a fair amount of headroom already, really.
Posted on Reply
#21
bonehead123
oxrufiioxo: I'm also interested in what Nvidia charges for the 3090
New = mucho dinero, at least at launch anyways, hehehe :eek:

Given how much the 11GB cards were/still are, I shudder to think how much they will ask for the new 12/16/24GB models.. :fear:..:respect:..:cry:
Posted on Reply
#22
chstamos
That's a silly question, just for fun, but those VRAM levels really make me wonder. Could the main system THEORETICALLY access an unused chunk of VRAM to use as main RAM? I know the opposite (using system RAM as an extension of GPU RAM) is hampered by the fact that system RAM is in general bog slow compared to VRAM, plus the bandwidth bottleneck of PCIe. But VRAM is fast. Could VRAM be used for system RAM, in theory? Other than reasons of common sense, it being much more expensive and the whole thing being impractical and, well, stupid, is it technically doable?
Posted on Reply
#23
ZoneDymo
EarthDog: Can't say I feel like devs are hamstrung by VRAM... You're putting the cart before the horse, IMO.

24GB is utterly useless and too pricey. By the time more than 12GB can be used at a reasonable resolution (because less than 2% are on 4K... and I can't think of a title that can use 12GB), the card might be too slow. Devs have a fair amount of headroom already, really.
Again, I want to point out that VRAM can be used for more than just dealing with higher resolutions.
A game could be made to use more than 20 GB of VRAM while playing at 1080p; it would just use the memory capacity for other things, like I said, for example higher resolution textures and little to no pop-in.
That is kinda what is being done now with the PS5, although that uses the SSD for it, which comes a bit closer to that idea.
Posted on Reply
#24
EarthDog
chstamos: That's a silly question, just for fun, but those VRAM levels really make me wonder. Could the main system THEORETICALLY access an unused chunk of VRAM to use as main RAM? I know the opposite (using system RAM as an extension of GPU RAM) is hampered by the fact that system RAM is in general bog slow compared to VRAM, plus the bandwidth bottleneck of PCIe. But VRAM is fast. Could VRAM be used for system RAM, in theory? Other than reasons of common sense, it being much more expensive and the whole thing being impractical and, well, stupid, is it technically doable?
What benefits would that have? How would that benefit card makers?
ZoneDymo: Again, I want to point out that VRAM can be used for more than just dealing with higher resolutions.
A game could be made to use more than 20 GB of VRAM while playing at 1080p; it would just use the memory capacity for other things, like I said, for example higher resolution textures and little to no pop-in.
That is kinda what is being done now with the PS5, although that uses the SSD for it, which comes a bit closer to that idea.
I know it can be used for more things, but that doesn't change my point. At all. :)

More than 12-16GB is really just wasting cash right now. It's like buying an AMD processor with 12c/24t... if gaming is only using at most 6c/12t (99.9% of the time), why more NOW?

Again, devs have headroom to play with already... they aren't even using that.
Posted on Reply
#25
PowerPC
Dristun: There aren't any games on the market that benefit from more than 8-9 gigs even in 4K. It's been tested to death already. I'm almost 100% sure that 12 will be plenty for the time the cards will actually be usable for comfortable 4K-gaming (next 4 years at best if we're lucky).
But of course people will fool themselves into "futureproofing" argument again, compounded by AMD once again loading cards unsuitable for 4K-60 with VRAM nobody needs. We're doing this thing every launch cycle!
Nice jab at AMD. Since when are we doing 4K-60 every launch cycle? Really, really weird argument.
Posted on Reply