
NVIDIA RTX 3080 12 GB May Offer 8,960 CUDA Cores

Actually, it was pretty mild on my wallet for a few reasons.

1. I got my RTX 3080 for almost MSRP. A friend of mine works in an electronics shop and got the card for me at the price the store itself pays, so I paid $681 + 25% tax. The same card otherwise normally goes for up to twice MSRP, around $1,200 + tax. That's a really good deal, I think.
Very good deal! Sometimes it's not only what you know, it's WHO you know.
 
I would still buy a 3080 (non-Ti) over any other card in the line-up if I could buy any of them at RRP. Moving it to 12GB will help with some AAA titles in 4K, I guess, but with the explosive growth in DLSS, FSR and soon XeSS, I don't see even 10GB being much of a concern down the track for 4K.

The trouble is that the 12GB 3080 will no doubt see a huge price jump over the 10GB version, so I'd probably lean toward the 10GB one even over a 3070 Ti with 16GB.
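For context on why upscaling takes the pressure off at 4K: DLSS renders internally at a reduced resolution and reconstructs to the output. A back-of-the-envelope sketch using the published per-axis DLSS scale factors (the bytes-per-pixel figure is only an illustrative assumption; actual usage depends on the whole render pipeline):

```python
# Back-of-the-envelope: internal render resolution for DLSS presets at 4K.
# The scale factors are the published per-axis ratios for DLSS 2.x presets;
# 8 bytes/pixel is a rough, illustrative stand-in for a couple of render targets.

OUTPUT = (3840, 2160)  # 4K output resolution

presets = {
    "Quality":           0.667,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 0.333,
}

for name, scale in presets.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    approx_mb = w * h * 8 / 2**20  # ~8 bytes/pixel of render targets (assumed)
    print(f"{name:17s}: {w}x{h} internal, ~{approx_mb:.0f} MB of render targets")
```
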
I haven't seen anything to suggest that DLSS (specifically) uses significantly less VRAM than running without it, unless you're just standing still and looking around:
there are even times, albeit briefly, when DLSS uses more. Yeah, very small sample size, but I just haven't seen why folks think that, so please feel free to PM me some YouTube links. Lately I've been binge-watching series and movies rather than spending much time there. Six Feet Under was a hoot.
 
My 3080 uses 8 GB of VRAM in the browser if I leave the browser open for a day, and Tales of Berseria, an old game, uses over 6 GB if I run it long enough. VRAM gets used more than people assume.

Note that Berseria has very low-quality textures; most games with 4096x4096 or higher textures need more.

I'm curious why NVIDIA can't do 16 GB cards without charging 1.5 grand for a card.
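If anyone wants to check this on their own card, overall VRAM usage can be read with nvidia-smi or, programmatically, through the NVML bindings; a minimal sketch, assuming the nvidia-ml-py (pynvml) package is installed:

```python
# Minimal sketch: read current GPU memory usage via NVML.
# Assumes the nvidia-ml-py package (import name: pynvml) is installed;
# `nvidia-smi` reports the same counters from the command line.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```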
 
My 3080 uses 8 GB of VRAM in the browser if I leave the browser open for a day, and Tales of Berseria, an old game, uses over 6 GB if I run it long enough. VRAM gets used more than people assume.

Note that Berseria has very low-quality textures; most games with 4096x4096 or higher textures need more.

I'm curious why NVIDIA can't do 16 GB cards without charging 1.5 grand for a card.
Those are memory leaks, dude; that's just bad programming.
 
I'm curious why NVIDIA can't do 16 GB cards without charging 1.5 grand for a card.
It's about the memory bus. You'd need a 256-bit or 512-bit bus for 16GB (or wider with HBM), though I don't know whether mixed-density chips would work like they did on the 550 Ti (1GB/192-bit) and 660/660 Ti (2GB/256-bit) back in the day.

The 3080 has 10GB on a 320-bit bus, and this one will use 12GB on a 384-bit bus, since each chip is 32 bits wide. Removing one chip from the 1080 Ti/2080 Ti made those cards 11GB/352-bit.
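To make the arithmetic explicit: each GDDR chip has a 32-bit interface, so capacity is just (bus width ÷ 32) chips times the per-chip density. With the 1GB GDDR6X chips Ampere uses, the uniform-density routes to 16GB would be a 256-bit bus with 2GB chips or a 512-bit bus with 1GB chips. A quick sketch of that math (densities and bus widths are the publicly known configurations, nothing official):

```python
# GDDR capacity from bus width and per-chip density.
# Each GDDR5/GDDR6/GDDR6X chip has a 32-bit interface, so the chip count
# is bus_width / 32 (assuming uniform density, no clamshell or mixed chips).

def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32
    return chips * chip_density_gb

configs = [
    ("RTX 3080 10GB",     320, 1),  # 10 x 1 GB chips
    ("RTX 3080 12GB",     384, 1),  # 12 x 1 GB chips
    ("1080 Ti / 2080 Ti", 352, 1),  # 11 x 1 GB, one chip short of 384-bit
    ("16GB on 256-bit",   256, 2),  # 8 x 2 GB chips
    ("16GB on 512-bit",   512, 1),  # 16 x 1 GB chips
]

for name, bus, density in configs:
    print(f"{name}: {bus}-bit bus -> {vram_gb(bus, density)} GB")
```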
 
Those are memory leaks, dude; that's just bad programming.

Whatever they may be, it is what it is. As an end user I use binaries; I don't reverse engineer and recompile the software I use.

As a side note, I discovered that the browser leak happens when ANGLE uses D3D9. It's by far the fastest backend, but it has this leak problem, so that particular leak is resolved by switching ANGLE to a different graphics API. There's not much I can do about games using lots of VRAM, though; it is what it is there.
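For anyone who wants to try the same workaround: in Chromium-based browsers the ANGLE backend can be forced with the --use-angle command-line flag (or via chrome://flags). A minimal launcher sketch; the browser path is only an example and will differ per system:

```python
# Minimal sketch: launch a Chromium-based browser with a specific ANGLE backend
# to avoid the D3D9 path. The executable path below is only an example.
import subprocess

BROWSER = r"C:\Program Files\Google\Chrome\Application\chrome.exe"  # example path

# "d3d11" selects the Direct3D 11 ANGLE backend; "d3d9", "gl" and "vulkan"
# are among the other accepted values, depending on the build.
subprocess.Popen([BROWSER, "--use-angle=d3d11"])
```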

Games that use high-quality textures will need a lot of VRAM, leak or not.

I do mod games that are moddable to enhance the experience, and that nearly always involves better, higher-quality textures as well.
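To put a rough number on that: a single 4096x4096 texture in uncompressed RGBA8 is 64 MiB before mipmaps (about 85 MiB with a full mip chain), and block compression such as BC7 cuts it to roughly a quarter of that. A quick sketch of the arithmetic (formats and sizes are generic examples, not tied to any particular game or mod):

```python
# Rough texture-memory arithmetic for a square texture.
# bytes_per_texel: 4.0 for uncompressed RGBA8, 1.0 for BC7/BC5, 0.5 for BC1.
# A full mip chain adds roughly one third on top of the base level.

def texture_mib(size: int, bytes_per_texel: float, with_mips: bool = True) -> float:
    base = size * size * bytes_per_texel
    total = base * 4 / 3 if with_mips else base
    return total / 2**20

for fmt, bpt in [("RGBA8 (uncompressed)", 4.0), ("BC7 (compressed)", 1.0)]:
    print(f"4096x4096 {fmt}: {texture_mib(4096, bpt):.1f} MiB with mips")
```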
 