Tuesday, April 19th 2022

NVIDIA RTX 3090 Ti Memory Overclocked to 24.6 Gbps, Yielding 1.18 TB/s Bandwidth

The 21 Gbps-rated GDDR6X memory chips powering the GeForce RTX 3090 Ti really like to overclock. During our testing, we've seen them reach as high as 1500 MHz (real clock), or 24 Gbps. TweakTown, in the course of its crypto-mining performance testing of the RTX 3090 Ti, has set a memory clock record, reaching 1538 MHz, or 24.6 Gbps, which over the card's 384-bit memory bus works out to a staggering 1181 GB/s of memory bandwidth. 24.6 Gbps, VideoCardz notes, is higher than even the rumored 24 Gbps memory speed of the RTX 4090, the next-generation flagship powered by the Ada Lovelace graphics architecture. Armed with blistering memory speeds and larger on-die caches, the memory sub-system is poised to once again be a major contributor to the RTX 40-series generational performance uplift. TweakTown achieved its memory OC feat with an MSI RTX 3090 Ti SUPRIM X graphics card.
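The arithmetic is straightforward: GDDR6X moves 16 bits per pin per real clock cycle (PAM4 signaling), and total bandwidth is the per-pin data rate multiplied by the bus width. A quick sketch of the math in Python, using the RTX 3090 Ti's 384-bit bus:

```python
# GDDR6X effective data rate and bandwidth from the real memory clock.
TRANSFERS_PER_CLOCK = 16   # GDDR6X: 2 bits/symbol (PAM4) x 8 symbols per real clock
BUS_WIDTH_BITS = 384       # RTX 3090 Ti memory bus

def gddr6x_stats(real_clock_mhz: float) -> tuple[float, float]:
    """Return (per-pin data rate in Gbps, total bandwidth in GB/s)."""
    gbps = real_clock_mhz * TRANSFERS_PER_CLOCK / 1000
    gbs = gbps * BUS_WIDTH_BITS / 8
    return gbps, gbs

for clock in (1313, 1500, 1538):
    gbps, gbs = gddr6x_stats(clock)
    print(f"{clock} MHz -> {gbps:.1f} Gbps, {gbs:.0f} GB/s")
# 1313 MHz -> 21.0 Gbps, 1008 GB/s   (stock)
# 1500 MHz -> 24.0 Gbps, 1152 GB/s
# 1538 MHz -> 24.6 Gbps, 1181 GB/s   (TweakTown's record)
```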
Sources: TweakTown, VideoCardz

22 Comments on NVIDIA RTX 3090 Ti Memory Overclocked to 24.6 Gbps, Yielding 1.18 TB/s Bandwidth

#1
Denver
I imagined that GDDR7 would debut in the next gen, or at least the "GDDR6X+" that Samsung had talked about...
#2
ncrs
Denver: I imagined that GDDR7 would debut in the next gen, or at least the "GDDR6X+" that Samsung had talked about...
GDDR6X is Micron (and NVIDIA) exclusive.
Samsung announced GDDR6+ in November, but I'm not aware of any products that shipped with it.
#3
DeathtoGnomes
ncrs: GDDR6X is Micron (and NVIDIA) exclusive.
Samsung announced GDDR6+ in November, but I'm not aware of any products that shipped with it.
Why buy memory at current levels when you can have next level memory at twice the price? (Contact ref)
#4
ghazi
Rather impressive. Though, aren't we expecting plain old GDDR6 to reach 24 Gbps eventually now?
#5
TheinsanegamerN
ghazi: Rather impressive. Though, aren't we expecting plain old GDDR6 to reach 24 Gbps eventually now?
Samsung only started sampling 24 Gbps GDDR6 in December of 2021. For reference, Samsung was sampling 18 Gbps GDDR6 in September of 2018, and the FIRST products to use it will be AMD's RX 6x50 XT series of cards, which are coming out in May. Of 2022. Nearly 4 years later.

At that rate, 24 Gbps GDDR6 may not be reliably available until 2026, for RDNA 5.
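For what it's worth, a back-of-the-envelope sketch of that extrapolation in Python, using the dates cited above (the straight-line projection is purely illustrative, not a roadmap):

```python
# Project 24 Gbps GDDR6 availability from the 18 Gbps sampling-to-product lag.
from datetime import date

sampled_18 = date(2018, 9, 1)   # Samsung samples 18 Gbps GDDR6
shipped_18 = date(2022, 5, 1)   # first products: RX 6x50 XT refresh
lag = shipped_18 - sampled_18
print(f"sampling-to-product lag: {lag.days / 365.25:.1f} years")  # ~3.7 years

sampled_24 = date(2021, 12, 1)  # Samsung samples 24 Gbps GDDR6
projected = sampled_24 + lag    # straight-line projection
print(f"projected first 24 Gbps products: ~{projected:%B %Y}")
# ~July 2025; "reliably available" would land later still,
# which is where the 2026 guess comes from.
```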
#6
ixi
And 99.9% of us don't care about the 3090 Ti because of its stupidly high price, hohihuhe.
#7
ghazi
TheinsanegamerN: Samsung only started sampling 24 Gbps GDDR6 in December of 2021. For reference, Samsung was sampling 18 Gbps GDDR6 in September of 2018, and the FIRST products to use it will be AMD's RX 6x50 XT series of cards, which are coming out in May. Of 2022. Nearly 4 years later.

At that rate, 24 Gbps GDDR6 may not be reliably available until 2026, for RDNA 5.
Fair point, but I doubt it will take so long. 18 Gbps has been in use on the 6900 XTXH cards for over a year now and IIRC the 6500 XT uses it as well. It probably would have been introduced earlier if not for a combination of factors: memory controllers not being able to handle it, increased PCB cost and complexity to maintain signal integrity at such high switching frequencies, NVIDIA using GDDR6X in its stead for the high-end, AMD's Infinity Cache designed to lower memory cost, etc. The fact sadly remains that GDDR6X is a tremendously underwhelming technology which barely performs better than GDDR6 while drawing far more power and being more difficult to implement, not to mention the investment that went into it -- but I digress.
#8
Vayra86
So basically Nvidia made its own 4090 half obsolete for being 'the top dog'. Imagine the laughs when your current card has better VRAM than the next best thing. Because that's really all these cards serve: epeen prowess. (oh sorry, 'creators', did I offend? :roll:) Did anyone say messy gen? Naahh
#9
GoldenX
The miracle of properly cooled memory, or how to make a decent backplate.

My trash Dell 1660S can't take any VRAM overclock with its stock fan; add some small, cheap memory heatsinks and it can suddenly do +800 MHz without issue, 24/7. VRAM temps are no joke.
#10
Unregistered
GoldenX: The miracle of properly cooled memory, or how to make a decent backplate.

My trash Dell 1660S can't take any VRAM overclock with its stock fan; add some small, cheap memory heatsinks and it can suddenly do +800 MHz without issue, 24/7. VRAM temps are no joke.
Exactly why full-block water-cooled GPUs are so good.
#11
Zuli_Muli
K, anyone else not impressed with a screen grab anymore? Whatever happened to benchmarking the OC to show the gains from the hard work put into making a stable OC?
#12
R-T-B
Vayra86: (oh sorry, 'creators', did I offend? :roll:)
*shrugs*

In order to offend you'd have to be remotely on target.

I work hard, I want a card that can game hard. I like big complete chips. So I bought one. That's really all there is to my argument. I'm neither ashamed, nor proud. It simply is what it is, does what it says on the tin, and fits my needs/wants.
#13
Fluffmeister
I just miss the days when HBM was required on consumer graphics cards.
#14
eidairaman1
The Exiled Airman
R-T-B: *shrugs*

In order to offend you'd have to be remotely on target.

I work hard, I want a card that can game hard. I like big complete chips. So I bought one. That's really all there is to my argument. I'm neither ashamed, nor proud. It simply is what it is, does what it says on the tin, and fits my needs/wants.
It's why I'm thinking of a Radeon Pro series.
#15
Vayra86
R-T-B: *shrugs*

In order to offend you'd have to be remotely on target.

I work hard, I want a card that can game hard. I like big complete chips. So I bought one. That's really all there is to my argument. I'm neither ashamed, nor proud. It simply is what it is, does what it says on the tin, and fits my needs/wants.
Wasn't targeting you ;) You never post 'look at me, got the latest and greatest at launch' - every launch - and you're not a 'creator' posting a few shitty YouTube vids either. You know that. And the comment was mostly aimed at the 'higher more better' idea of halo cards, and how this memory is likely to exceed the frequencies of what comes next. That's odd, and it kills an unknown percentage of epeen power IMHO.

I'm a gimme an uncut die guy myself tbh... Except I prefer landing at a full 104 :D
#16
GoldenX
Vayra86: I'm a gimme an uncut die guy myself tbh... Except I prefer landing at a full 104 :D
AMD: The 6500 XT is uncut!
#17
Vayra86
GoldenX: AMD: The 6500 XT is uncut!
There you go, full die for everyone.
#18
Chomiq
That fire icon should be above the temperature sensor.
#19
trsttte
Vayra86: I'm a gimme an uncut die guy myself tbh... Except I prefer landing at a full 104 :D
There's logic in this: a die that could meet spec with all its compute units and other parts intact is probably of higher quality (silicon-lottery wise) than one that didn't and needed to be cut down to meet a lower spec.
#20
Vayra86
trsttte: There's logic in this: a die that could meet spec with all its compute units and other parts intact is probably of higher quality (silicon-lottery wise) than one that didn't and needed to be cut down to meet a lower spec.
Exactly; a good example is the clocking of 1070 Tis versus 1080s. The difference isn't huge, or even 'mediocre', but there is a difference between equally cooled cards.

Similarly, one of the better cards I've owned was a GTX 770, another full 104 (a 680 with better VRAM).

OTOH, cards like the 980 Ti kind of speak against the idea that a full die is better; the 980 Ti is a cut as well :D I guess it differs per node and even per TDP budget/voltage curve/gen.

I'll happily admit a big part of this is emotional. Paying for the fullest version of a chip, sure. But a further CUT of a chip? Meh.
#21
trsttte
Vayra86: Exactly; a good example is the clocking of 1070 Tis versus 1080s. The difference isn't huge, or even 'mediocre', but there is a difference between equally cooled cards.

Similarly, one of the better cards I've owned was a GTX 770, another full 104 (a 680 with better VRAM).

OTOH, cards like the 980 Ti kind of speak against the idea that a full die is better; the 980 Ti is a cut as well :D I guess it differs per node and even per TDP budget/voltage curve/gen.

I'll happily admit a big part of this is emotional. Paying for the fullest version of a chip, sure. But a further CUT of a chip? Meh.
Not having products depend on "golden samples" (like the 980 Ti you mention) is smart, but it wastes the potential of those golden samples, which can be heavily upcharged (like the 3090 Ti). Still, a less-cut version "should" (lottery) be better per compute unit than a heavily cut one (not counting the extra units).

This is all anecdotal of course, because silicon lottery and die defects don't work neatly or linearly like that (a better hint would be max clocks and power density), but it's still a cool "theory", and my OCD prefers perfect powers of 2 (no cut-down 192-bit buses, only complete 256-bit - damn you, GA102, and your 384-bit bus :banghead:).
#22
SOAREVERSOR
R-T-B: *shrugs*

In order to offend you'd have to be remotely on target.

I work hard, I want a card that can game hard. I like big complete chips. So I bought one. That's really all there is to my argument. I'm neither ashamed, nor proud. It simply is what it is, does what it says on the tin, and fits my needs/wants.
This is my logic as well. I work in the office, but I also make money off my box at home. Cost means nothing when some of the software costs several times the hardware. Oh well? It's in my budget, so I get Xeons and a 3090; my other box has a Quadro. Oh well! It is what it is.

Prices have been skyrocketing for a WHILE now, and so has power draw. Remember when the 6800 Ultra hit with two damn power connectors as the first true dual-slot card? People shat their pants then as well. Rational people just shrugged, used the Molex adapter, and moved on.