Wednesday, December 15th 2021

NVIDIA RTX 3080 12 GB May Offer 8,960 CUDA Cores

The RTX 3080 12 GB is still a rumored card, despite mounting evidence pointing toward its eventual release. The card's increased VRAM in particular seems to be an important part of NVIDIA's competitiveness plan: not only against AMD's RX 6000 series, which sports a psychological (and sometimes practical) advantage with its 16 GB VRAM capacities, but also against Intel's upcoming ARC Alchemist launch, whose cards are expected to offer 16 GB of VRAM at competitive shading performance levels.

But it seems that NVIDIA's refresh of the RTX 3080 graphics card won't be limited to a VRAM and bus-width increase: the CUDA core count is now reported to have also increased by 2.9%, up to 8,960 from the RTX 3080 10 GB's 8,704 CUDA cores. The card is naturally still expected to make use of the GA102 chip. The expected wider 384-bit memory bus will also bring an increase in memory bandwidth, to 912 GB/s, the same as the RTX 3080 Ti (it stands at 760 GB/s for the RTX 3080).
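The bandwidth figures above follow directly from the bus width and the memory's effective data rate. A minimal sketch of that arithmetic, assuming the 19 Gbps per-pin rate commonly cited for the GDDR6X on these cards:

```python
# Peak bandwidth (GB/s) = bus width (bits) x per-pin data rate (Gbps) / 8 bits-per-byte.
# The 19 Gbps effective rate is an assumption based on commonly cited GDDR6X specs.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(320, 19))  # RTX 3080 10 GB: 760.0 GB/s
print(bandwidth_gbs(384, 19))  # RTX 3080 12 GB / RTX 3080 Ti: 912.0 GB/s
```

This reproduces both figures in the article: widening the bus from 320 to 384 bits at the same data rate lifts bandwidth from 760 to 912 GB/s.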

All of these changes do, however, conspire to raise the card's TDP as well, up to 350 W, the same as the RTX 3080 Ti and RTX 3090. The RTX 3080 12 GB is now expected to become available in mid-to-late January 2022; board partners have reportedly already received some number of the chips destined for RTX 3080 12 GB cards.
Source: Videocardz

30 Comments on NVIDIA RTX 3080 12 GB May Offer 8,960 CUDA Cores

#26
looniam
Minus Infinity: I would still buy a 3080 (non Ti) over any other card in the line-up if I could buy any of them at RRP. Moving it to 12GB will help with some AAA titles in 4K I guess, but with the explosive growth in DLSS, FSR and soon XeSS, I don't see even 10GB being much concern down the track for 4K.

The trouble is no doubt the 12GB 3080 will see a huge price jump over the 10GB version, so I'd probably lean toward that one even over a 3070 Ti with 16GB.
i haven't seen anything to make me believe DLSS (specifically) uses significantly less vram than not using it, unless you're just standing still and looking:
there are even times, albeit briefly, that DLSS uses more. yeah, very small sample size, but i just haven't seen why folks think that, so please feel free to PM me some youtube links. lately, i've been binge watching series and movies instead of spending much time there. 6 feet under was a hoot.
Posted on Reply
#27
chrcoluk
my 3080 uses 8 gig of vram in the browser if i leave the browser open for a day, and tales of berseria, an old game, uses over 6 gig if i run it long enough. vram is used more than assumed.

note berseria has very low quality textures; most games with 4096x4096 or higher textures need more.

im curious why nvidia can't do 16 gig cards without charging 1.5 grand for a card.
Posted on Reply
#28
Mussels
Freshwater Moderator
chrcoluk: my 3080 uses 8 gig of vram in the browser if i leave the browser open for a day, and tales of berseria, an old game, uses over 6 gig if i run it long enough. vram is used more than assumed.

note berseria has very low quality textures; most games with 4096x4096 or higher textures need more.

im curious why nvidia can't do 16 gig cards without charging 1.5 grand for a card.
Those are memory leaks dude, that's just bad programming
Posted on Reply
#29
Ruru
S.T.A.R.S.
chrcoluk: im curious why nvidia can't do 16 gig cards without charging 1.5 grand for a card.
It's about the memory bus. You'd need a 256-bit or 512-bit bus for 16GB (or wider with HBM), though I don't know whether mixed-density chips would work like they did on the 550 Ti (1GB/192-bit) and 660/660 Ti (2GB/192-bit) back in the day.

The 3080 has 10GB on a 320-bit bus, and this card will use a 12GB/384-bit bus, since each chip is 32 bits wide. Removing one chip from the 1080 Ti/2080 Ti likewise made those cards 11GB/352-bit.
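The relationship described here can be sketched in a few lines: each GDDR6X chip contributes 32 bits of bus width, so capacity and bus width scale together with chip count (assuming a uniform 1 GB per chip, which matches these cards):

```python
# Sketch of the capacity/bus-width relationship: each memory chip is 32 bits
# wide, so adding or removing a chip changes both capacity and bus width.
# 1 GB per chip is an assumption matching the cards discussed here.
def memory_config(chips: int, gb_per_chip: int = 1) -> tuple[int, int]:
    """Return (capacity in GB, bus width in bits) for a given chip count."""
    return chips * gb_per_chip, chips * 32

print(memory_config(10))  # RTX 3080 10 GB: (10, 320)
print(memory_config(12))  # RTX 3080 12 GB: (12, 384)
print(memory_config(11))  # 1080 Ti / 2080 Ti layout: (11, 352)
```

Dropping one chip from a 12-chip layout gives exactly the 11GB/352-bit configuration of the 1080 Ti and 2080 Ti, as the comment notes.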
Posted on Reply
#30
chrcoluk
Mussels: Those are memory leaks dude, that's just bad programming
Whatever they may be, it is what it is. As an end user I use binaries; I don't reverse engineer and recompile the software I use.

As a side note, I discovered the browser leak is triggered when using D3D9 for ANGLE. It's by far the fastest backend, but it has this leak problem, so that particular leak is resolved by switching to a different graphics API. There's not much I can do about games using lots of VRAM, though; it is what it is there.

Games that use high quality textures, however, will need a lot of VRAM, leak or not.

I do mod games that are moddable to enhance the experience, and this nearly always involves higher quality textures as well.
Posted on Reply