Wednesday, September 21st 2022

ICYMI, NVIDIA GeForce RTX 4080 12GB Uses 192-bit Memory Bus

Amid the fog of rapid announcements and AIC graphics card launches, you might have missed this interesting little detail: the new GeForce RTX 4080 12 GB graphics card announced yesterday features a memory bus width of just 192-bit, half that of the RTX 3080 12 GB (384-bit). The card uses 21 Gbps GDDR6X memory, which at 192-bit bus width works out to just 504 GB/s of bandwidth. In comparison, the RTX 3080 12 GB uses 19 Gbps GDDR6X memory, which at 384-bit bus width produces 912 GB/s. In fact, even the original RTX 3080, with 10 GB of GDDR6X memory across a 320-bit bus, has 760 GB/s on tap.

The bigger RTX 4080 16 GB variant uses a 256-bit memory bus, but faster 23 Gbps GDDR6X memory, producing 736 GB/s of memory bandwidth, which, again, is less than that of the original 10 GB RTX 3080. Only the RTX 4090 holds the line on memory bandwidth over the previous generation: 1008 GB/s, identical to that of the RTX 3090 Ti and a tad higher than the 936 GB/s of the RTX 3090 (non-Ti). Of course, memory bandwidth alone is no way to compare the RTX 40-series with its predecessors; a dozen other factors weigh into performance, and what matters is that you're getting generationally larger memory amounts with the RTX 4080-series. The RTX 4080 12 GB offers 20% more memory than the RTX 3080, and the RTX 4080 16 GB offers 33% more than the RTX 3080 12 GB. NVIDIA tends to deliver significant performance gains with each new generation, and we expect that to hold here as well.
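All of the bandwidth figures above fall out of one simple formula: per-pin data rate (Gbps) multiplied by bus width (bits), divided by 8 to convert bits to bytes. A quick sketch in Python, plugging in the specs quoted in this article:

```python
# Memory bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8.
# Data rates and bus widths below are the ones quoted in this article.
cards = {
    "RTX 4080 12 GB": (21.0, 192),
    "RTX 4080 16 GB": (23.0, 256),
    "RTX 4090":       (21.0, 384),
    "RTX 3080 10 GB": (19.0, 320),
    "RTX 3080 12 GB": (19.0, 384),
}

for name, (data_rate_gbps, bus_width_bits) in cards.items():
    bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
    print(f"{name}: {bandwidth_gbs:.0f} GB/s")
# RTX 4080 12 GB: 504 GB/s ... RTX 3080 12 GB: 912 GB/s
```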

81 Comments on ICYMI, NVIDIA GeForce RTX 4080 12GB Uses 192-bit Memory Bus

#1
btarunr
Editor & Senior Moderator
"PSU Power Required" in the above table refers to the minimum PSU wattage in the system requirements given out by NVIDIA.

Example: you need at least an 850 W PSU to run RTX 4090.
#2
ARF
"4080-16" is a different chip vs "4080-12". AD103 vs AD104. :banghead::banghead:

#3
Dirt Chip
memory bus... dat shit storm, all over again...

Anyway, two kinds of 4080... that's a new low.
NV will go up in flames for that, and for good reason.
#4
wolf
Better Than Native
you might have missed this interesting little detail
Might have missed us, but a lot of us didn't. I will be very keen to see how it performs. Bus width is always controversial, but if a 4080 12GB matches a 3090Ti generally... well that speaks volumes.

Can't say I'm a fan of multiple 4080s with different configurations, though.
#5
Crackong
We all know it is just a 4070
#6
ToTTenTranz
ARF"4080-16" is a different chip vs "4080-12". AD103 vs AD104. :banghead::banghead:

It does look like they upgraded AD104's marketing name to soften the blow of its asking price.


They're clearly testing brand loyalty here. It's up to consumers to decide if NVIDIA was right or not.


I, for one, don't see any need to buy a GPU that is faster than a 3090. There's simply no software that demands it.
Game development is still sticking to cross-gen titles due to console shortages. Everything is still just PS4-era games with RT glitter on top at best, and the top-end 2020 cards run through those at 4K120 with ease (considering the widespread adoption of temporal solutions like FSR2 and DLSS2).
#7
hat
Enthusiast
I'm reminded of the 9600GSO with 192 bit and 128 bit variants...
#9
Zubasa
wolf: Might have missed us, but a lot of us didn't. I will be very keen to see how it performs. Bus width is always controversial, but if a 4080 12GB matches a 3090Ti generally... well that speaks volumes.

Can't say I'm a fan of multiple 4080s with different configurations, though.
But then, it is not like the 3090/Ti are price-performance kings, and their performance doesn't exactly blow their cheaper siblings out of the water either.
#10
ratirt
Two 4080s at two different performance levels, kinda like in the mobile market. NV has gone mad, or is doing this on purpose to confuse customers (I think the answer to why NV is doing it is obvious). The 3080 was one of the most desirable products for its performance per dollar; now NV wants to release two different 4080s. Obviously one is not equal to the other, but the prices are insane, and so is the power consumption. $1,200 for a 4080? That is insane. Companies were doing a price hike to bump up the prices for the upcoming cards, and people were opposed to that idea. Well, I think it is evident now, if these prices are true.
#11
Mussels
Freshwater Moderator
Ouch, bit of a step backwards there


I really hope the reason the marketing so far has been focused on DLSS is that DLSS reduces the demands on the memory bus, and they found a way to cut costs and only lose performance in non-NVIDIA-sponsored engines...
#12
Zubasa
Mussels: Ouch, bit of a step backwards there
TBH I kind of expected this; they did the whole Turing thing where they jack up the MSRP to clear out the old stock from a mining crash.
Also, 1060 3 GB and 5 GB cards already happened back then: one with cut-down cores and the other with a cut-down memory bus compared to the 6 GB.
All of those sold like hotcakes. The 4080 12 GB is just a combination of both, a logical step for Jensen.
#13
ratirt
Mussels: Ouch, bit of a step backwards there
With the specs, sure, but these are still gonna perform. There might be a problem with all the ray tracing eating the memory and the bandwidth; high res might choke here.
#14
lexluthermiester
Mussels: Ouch, bit of a step backwards there
Exactly. NVIDIA seems to be screwing things up...
#15
ARF
lexluthermiester: 192?!? WTH Nvidia?
hat: I'm reminded of the 9600GSO with 192 bit and 128 bit variants...
It is "GTX 970-3.5" and "is 10GB on an RTX 3080 enough?" all over again.
ToTTenTranz: It does look like they upgraded AD104's marketing name to soften the blow of its asking price.


They're clearly testing brand loyalty here. It's up to consumers to decide if NVIDIA was right or not.


I, for one, don't see any need to buy a GPU that is faster than a 3090. There's simply no software that demands it.
Game development is still sticking to cross-gen titles due to console shortages. Everything is still just PS4-era games with RT glitter on top at best, and the top-end 2020 cards run through those at 4K120 with ease (considering the widespread adoption of temporal solutions like FSR2 and DLSS2).
It looks like EVGA has already said no.
#16
ratirt
To be fair, if the 4080 is 192-bit, what will the 4070 or 4060 be? 128-bit and 64-bit?
#17
Gica
lexluthermiester: 192?!? WTH Nvidia?
It probably has content creation as its target: high performance, a reasonable amount of memory (bandwidth does not matter in this segment), lower consumption, lower price. Attractive for integrators.
#18
Mussels
Freshwater Moderator
Bandwidth aside, I am totally for 12 GB and 16 GB options vs 10 and 12.


Maybe NVIDIA assumed higher-clocked GDDR6X would exist by now, and they're stuck with a narrower bus designed on the assumption that faster VRAM would compensate?
They've probably got a 40x0 Super lineup ready and waiting for when that VRAM comes out...
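For what it's worth, the bandwidth formula from the article makes it easy to check how fast GDDR6X would have to be for a 192-bit bus to catch the previous generation; a purely illustrative back-of-the-envelope sketch:

```python
# Hypothetical: what per-pin data rate would a 192-bit bus need to
# match the bandwidth of last generation's wider buses?
bus_width_bits = 192
targets = {"RTX 3080 10 GB": 760, "RTX 3080 12 GB": 912}  # GB/s

for name, bandwidth_gbs in targets.items():
    required_gbps = bandwidth_gbs * 8 / bus_width_bits
    print(f"Matching {name} needs {required_gbps:.1f} Gbps memory")
# ~31.7 and ~38.0 Gbps respectively -- well beyond the ~21-24 Gbps
# GDDR6X of this generation, so faster VRAM alone can't close the gap.
```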
#19
wolf
Better Than Native
Zubasa: But then, it is not like the 3090/Ti are price-performance kings, and their performance doesn't exactly blow their cheaper siblings out of the water either.
Price is a whooooole different discussion, one which I think is kinda insane.

More just purely: if it has fewer shaders and less memory bandwidth and can still equal performance vs a 384-bit GDDR6X subsystem, then that's certainly something.
#20
Mussels
Freshwater Moderator
wolf: Price is a whooooole different discussion, one which I think is kinda insane.

More just purely: if it has fewer shaders and less memory bandwidth and can still equal performance vs a 384-bit GDDR6X subsystem, then that's certainly something
As long as it can; if this is just a refresh design, it won't without something like DLSS (which the 30 series would also benefit from).



Just... please don't make us pay more, for less. Please.
#21
Gica
ARF"4080-16" is a different chip vs "4080-12". AD103 vs AD104. :banghead::banghead:
Correct. Not how to use the same chip for the 192 and 256 bus.
64-128-256-512
or
96-192-384
The chip is designed from the start for one of the variants.
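For context on the families above: a GDDR bus is assembled from 32-bit memory controllers, so bus width is always a multiple of 32, and with one module per controller the capacity follows directly. A rough sketch, assuming the 2 GB (16 Gbit) GDDR6X modules these cards use:

```python
# Bus width and VRAM capacity as a function of 32-bit controller count,
# assuming one 2 GB (16 Gbit) GDDR6X module per controller.
MODULE_GB = 2

for controllers in (4, 6, 8, 12):
    bus_width_bits = controllers * 32
    vram_gb = controllers * MODULE_GB
    print(f"{controllers} controllers -> {bus_width_bits}-bit, {vram_gb} GB")
# 6 -> 192-bit / 12 GB (RTX 4080 12 GB); 8 -> 256-bit / 16 GB (RTX 4080 16 GB)
```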
#22
Dirt Chip
By that (broken) logic, I see no reason not to do a 3090 20 GB variant...
:kookoo:
#23
wolf
Better Than Native
Mussels: Just... please don't make us pay more, for less. Please
You're not wrong; to get 2x my 3080's performance, it's looking like I'll need to spend double... No thanks.

But waiting for full 4080 reviews should coincide with RDNA3, and then I can choose based on something less... insane.
#24
Bwaze
Is it possible the lower memory bandwidth will mostly show in titles with older engines, and older titles, which reviews usually don't cover?

I know older titles should pretty much run well even on older cards, but that's not always the case.

You can bog down any card with just 11-year-old Skyrim and high enough textures and mods.

And of course VR, in titles like DCS. There's no DLSS to help you; you need raw power and lots of memory to push the high resolution of an HP Reverb G2 or Valve Index in DCS and other sims that don't change their graphics engines often.
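To put the VR point in rough numbers, here is an illustrative comparison of per-frame pixel counts (panel resolutions are the headsets' published per-eye specs; actual render resolution is usually higher still, since VR compositors supersample to compensate for lens distortion):

```python
# Per-frame pixel counts: VR headsets (both eyes) vs. a 4K monitor.
displays = {
    "4K monitor":            3840 * 2160,
    "Valve Index (2 eyes)":  2 * 1440 * 1600,
    "HP Reverb G2 (2 eyes)": 2 * 2160 * 2160,
}

for name, pixels in displays.items():
    print(f"{name}: {pixels / 1e6:.2f} MP per frame")
# The Reverb G2's ~9.3 MP tops 4K's ~8.3 MP before any supersampling,
# and it all has to arrive at 90 Hz -- hence the need for raw power.
```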
#25
lexluthermiester
ARF: It is "GTX 970-3.5" and "is 10GB on an RTX 3080 enough?" all over again.
There is no excuse for a 192-bit VRAM bus on a TOP TIER CARD. It's stupid, plain and simple.
Gica: It probably has content creation as its target: high performance, a reasonable amount of memory (bandwidth does not matter in this segment), lower consumption, lower price. Attractive for integrators.
Maybe. Fair point.