
NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
No surprise whatsoever. Look at some of the previous launch dates:


980: September 2014 >>>>>> 980 Ti: June 2015

1070: June 2016 >>>>>> 1070 Ti: November 2017

2070: October 2018 >>>>>> 2070 Super: July 2019


So I'd say that the 3080 Ti 20GB is likely to be unveiled at E3 on June 15th, 2021, considering that the 3060 12GB is set for late February *cough* availability. Currently no word on whether we'll ever get a 3070 Ti 16GB, though.
 
Joined
Apr 1, 2017
Messages
420 (0.15/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W

Sahutep

New Member
Joined
Jan 16, 2021
Messages
1 (0.00/day)
10GB of VRAM is sufficient today; you don't need more to play modern games at 4K resolution.

A 3080 with 10GB at $700 is much more affordable than a 3080 with 20GB at $1000.

Today, 16GB of VRAM is pure marketing for noobs. Maybe it will be necessary on the next generation of GPUs, but not now.

The 3060 with 12GB is the counterpart of the 6700 with 12GB. Neither card needs that much VRAM to play at 1080p or 2k, but marketing is marketing, and it sells these cards better.
I hate reading something like this.
Yes, maybe for NOW (the next 2 months) this amount of VRAM will be enough (not even really when you use ray tracing). But after that, the cards are going to become obsolete very soon, just because of that puny framebuffer.
I've been reading this shit for years now. People claiming that a certain GPU couldn't "benefit from more VRAM anyway because of its lack of performance". That's such a damn lie. So many great GPUs, NVIDIA's especially, are now nothing more than e-waste just because they can't perform to their maximum in newer titles demanding more RAM for textures etc. I would love to be able to max out my games in two years with my 3080, but with NVIDIA's self-crippling VRAM greed I don't think that will be possible, even if the 3080 still has enough raw horsepower under the hood. That's the reason AMD cards often tend to age better than NVIDIA's. Like SERIOUSLY. WHY WOULD THEY GIVE THE 3080 LESS VRAM THAN A 1080 TI?
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
2500 EUR for 10 days cruising the Caribbean Sea, or a new PC?
it would be difficult to find such a cruise nowadays...

thank you for the link.
But this is screaming "BS" all over the place :

without any exact figures

I hate reading something like this.
Yes, maybe for NOW (the next 2 months) this amount of VRAM will be enough (not even really when you use ray tracing). But after that, the cards are going to become obsolete very soon, just because of that puny framebuffer.
I've been reading this shit for years now. People claiming that a certain GPU couldn't "benefit from more VRAM anyway because of its lack of performance". That's such a damn lie. So many great GPUs, NVIDIA's especially, are now nothing more than e-waste just because they can't perform to their maximum in newer titles demanding more RAM for textures etc. I would love to be able to max out my games in two years with my 3080, but with NVIDIA's self-crippling VRAM greed I don't think that will be possible, even if the 3080 still has enough raw horsepower under the hood. That's the reason AMD cards often tend to age better than NVIDIA's. Like SERIOUSLY. WHY WOULD THEY GIVE THE 3080 LESS VRAM THAN A 1080 TI?
and again: do you have a link supporting your theory about 10 Gb not being enough ?

I'm asking this of EVERY user writing this kind of comment, and so far NO ONE has answered me with a meaningful reply.

Cyberpunk 2077 at 2k with cinematic rtx enabled could easily use up 10gb vram.
still waiting for a link backing up this claim....
 
Joined
Sep 26, 2012
Messages
871 (0.19/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling ASUS ROG Ryujin III 360, 13 x Lian Li P28
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 STRIX
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Acer X38S, Wacom Cintiq Pro 15
Case Lian Li O11 Dynamic EVO
Audio Device(s) Topping DX9, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply Seasonic PRIME TX-1600
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + Universal Blue
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751

Locotes_Killa

New Member
Joined
Jan 16, 2021
Messages
6 (0.00/day)
is this the best link you can provide ? :roll:
System requirements stating that 10 Gb are enough for the best quality at 4K...
You actually proved the contrary of your claims.

Ai you make a good point and I can go further with it...

This Unreal Rebirth demo is running on a 1080Ti in real-time with the most detailed textures any of us have ever seen, it would be great if they specified what resolution it is running in and how much vram it used, but does it matter?



PC games already like to use both RAM and VRAM to share out game assets, and they prefer to fill up the VRAM first with data that doesn't really need to be there. The amount of VRAM today's games actually need is not always what you see on your in-game OSD VRAM utilisation counter, unless you use the updated MSI Afterburner/RTSS, which can now show the VRAM actually used per game.
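To make the allocated-vs-actually-used distinction concrete, here's a minimal sketch of reading both the device-level "used" figure and per-process VRAM via NVIDIA's NVML bindings (the `pynvml` package). This is my own illustration, not a tool anyone in the thread used; it needs an NVIDIA GPU and driver to report real numbers, and degrades gracefully otherwise.

```python
def mib(nbytes):
    """Convert a byte count to mebibytes, rounded to one decimal."""
    return round(nbytes / 1024**2, 1)

def vram_report():
    """Return a small text report of device-level vs per-process VRAM."""
    try:
        import pynvml
    except ImportError:
        return "pynvml not installed"
    try:
        pynvml.nvmlInit()
    except pynvml.NVMLError:
        return "no NVIDIA driver/GPU available"
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        # Device-level view: everything allocated on the card, by anyone.
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        lines = [f"device used: {mib(info.used)} MiB of {mib(info.total)} MiB"]
        # Per-process view: closer to what each game is actually holding.
        for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
            used = proc.usedGpuMemory  # may be None without sufficient rights
            lines.append(f"pid {proc.pid}: {mib(used) if used else '?'} MiB")
        return "\n".join(lines)
    finally:
        pynvml.nvmlShutdown()

print(mib(3 * 1024**3))  # 3 GiB expressed in MiB -> 3072.0
print(vram_report())
```

The gap between the device-level figure and the sum of per-process figures is roughly the "spare allocation" being argued about in this thread.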


This has been a standard way of getting around storage I/O bottlenecks, by unifying RAM and VRAM, for donkey's years... now how about these new-fangled, super-fast I/O optimisations we have coming along?

The new Nvidia GPUs have NVCache, a way for data transfers to go direct from SSD to VRAM. That's very much like what the consoles have going on, and you can bet your ass AMD and Intel will get on it too with DirectX DirectStorage... The idea is that it's so damn fast that textures and other data can be swapped in real time without impacting performance. Finally, after the 30-ish years that SSDs have been around, we get to put HDDs firmly in the back seat for gaming.

So pair this with the latest game engines that are ready to take advantage of it all, and... The new consoles only have a 16GB shared pool of memory, but they are able to act like they have more than this. An ideal recommended-specs gaming PC already has 16GB of RAM and 6GB to 8GB of VRAM, and 32GB of RAM is likely to become the new sweet spot for gaming. So this, along with proper utilisation of faster storage, could very well mean that even the 3070, with the lowest amount of VRAM among the latest GPUs at 8GB, won't really be seen struggling to handle these insanely detailed textures. And besides, if all else fails, there are texture settings below ultra too, if you didn't know that already. Dropping to high textures is still going to look far better than anything that has come before, which will be great for the lower resolutions that aren't good enough to show the extra detail afforded by the ultra textures anyway?! lol.

Indeed, a lot of gamers with deep pockets are still holding off for a card with the specs of a 3080 Ti, as game developers are throwing more and more graphically demanding games into the market. Actually, many games are being purchased solely based on their 'pretty worlds' and over-the-top sandbox bling. There seems to be no end to this, and it essentially started when gamers were delighted to see water and cloud reflections making their days and nights. From what I hear, a 20GB GPU investment is seen as sort of a minimum (gaming) future-proofing, at least for a few more years. I am also told, hanging around the computer shops with the boys from Mumbai, that if the 3080 Ti does not come about for whatever reason, most enthusiasts will get the RTX 3090 instead and call it a day. Surely NVIDIA and their investors will love that! My tech buddy Harry said: "What the hell, I paid $1,480 years ago for an MSI Gaming Trio 2080 Ti, so an RTX 3090 is not out of reach at all. Now a 20GB card is my sweet spot." As for me, I am always short of cash, being a simple man on the street, but I do have a blonde, which is my sweet spot.

Wherever did you hear that 20GB of VRAM is the minimum sweet spot? I'm calling bullshit on this, and if anyone is going to make such a claim, it should be backed up with some viable evidence... Look at the demo above: that's a GPU with only 11GB of VRAM, it looks very photorealistic indeed, with textures ridiculously far beyond what we currently see in games, and it can't be using this new I/O technology, as that wasn't in play at the time it was made.
 

Locotes_Killa

New Member
Joined
Jan 16, 2021
Messages
6 (0.00/day)
Indeed, it's about double. But we've already seen the 3080 breach its VRAM budget at 4K (DOOM Eternal maxed, and Cyberpunk at 4K with RT maxed). RAM bandwidth is one of those things that is always king, until you don't have enough.

But regardless, are you seriously trying to argue that it's appropriate for cards higher in the stack to have less VRAM?

DooM Eternal... The 2080 got vram breached, not the 3080 as Hardware Unboxed found? No worries though seeing as you still get a shit ton of fps and there is a texture pool size setting to make good use of too.

Cyberpunk 2077 in native 4K with RT Ultra... it doesn't matter seeing as the game becomes unplayable.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Are you seriously suggesting that Nvidia's higher end cards should have less than its lowest?
No, I'm suggesting that 12GB on a 3060 is useless and is just marketing.
 

Locotes_Killa

New Member
Joined
Jan 16, 2021
Messages
6 (0.00/day)
No, I'm suggesting that 12GB on a 3060 is useless and is just marketing.

Well no, 'cause what about creators? Having the option of cheaper GPUs with more VRAM to handle productivity apps isn't such a bad thing, and it's about time the lower end of the market wasn't filled up with 4GB VRAM GPUs either.

There is a legitimate technical reason why the 3060 has 12GB: a 192-bit memory bus should be paired with 3GB, 6GB, 12GB, etc., and a 256-bit memory bus with 4GB, 8GB, 16GB, etc. That is, unless you want mostly full-speed VRAM plus a bit of gimped-speed VRAM.
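The bus-width arithmetic behind those pairings can be sketched in a few lines (my own aside, not from the thread): each GDDR6 chip has a 32-bit interface, so the bus width fixes the chip count, and the per-chip density (1GB or 2GB in this generation) then fixes the total capacity.

```python
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    """Valid VRAM capacities for a given bus width.

    Each memory chip contributes 32 bits of bus, so the chip count is
    bus_width / 32; multiply by the per-chip density for total capacity.
    """
    chips = bus_width_bits // 32
    return [chips * d for d in densities_gb]

print(vram_options(192))  # 3060-style bus  -> [6, 12]
print(vram_options(256))  # 3070-style bus  -> [8, 16]
print(vram_options(320))  # 3080-style bus  -> [10, 20]
```

Which is exactly why the 3060's realistic choices were 6GB or 12GB, and why a 20GB 3080 Ti rumor fits a 320-bit bus.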

Bringing the 3060 out with 6GB of VRAM would have been a huge mistake, as it wouldn't give much incentive to buy it, given that we've already got games that legitimately need more than that at 1080p, like DooM Eternal maxed out. Think about it with the facts that matter and it makes a whole lot more sense than ignorantly pinning it on marketing. Also, AMD typically putting more VRAM on their GPUs than Nvidia, especially in the midrange, is nothing new, and since when has Nvidia bothered with knee-jerk reactions to it? So I think what we're seeing really is just... a necessary progression.

I'm sorry, but you do not know what you are talking about. I play at 5120x1440 with a 6800, and just in Cyberpunk I'm hitting over 8GB of VRAM, and my res is about a million pixels short of 4K. In Battlefield V, when I turn on ray tracing and ultra settings, I'm almost at 10GB of VRAM; if I increase my resolution multiplier by just 20%, I'm over 12GB. SoTR is above 8GB as well at default ultra settings.

All the above games are either close to, at, or above the 10GB limit of the 3080. And these games aren't even using mods with texture packs; if they were, usage could rise even more.

No you don't, 'cause you've not considered spare allocated VRAM vs. what is actually needed. Use a 3070 8GB at the same resolution and settings in all of these games: will they stutter and freeze? Nope, because it has enough actually-needed VRAM, and the spare allocation goes to RAM instead. That's why it has been important to have at least 2x8GB of RAM for gaming ever since last-gen gaming started, especially with 6GB+ VRAM GPUs, at any resolution. Even if they don't show maxed-out allocated VRAM usage, your 0.1% and 1% low frame rates, as well as averages, still suffer, so it comes down to your total available shared memory pool too. Like 16GB vs. 32GB of RAM already showing a small fps lead in 4K gaming with a 2080 Ti? Yep.

I've just remembered a good example, seeing as you mentioned texture packs... Watch Dogs 2 with the ultra texture pack runs fine on 6GB+ VRAM GPUs, but stutters and freezes like crazy with 8GB of system RAM, and doesn't with 16GB...


UltraTexture Pack
For Ultra Textures, you'll need 6GB of VRAM on your graphics cards, and a minium of 12GB of System RAM (16GB Recommended). (lol, a spelling mistake.)

Some games work out differently too, even if it isn't mentioned... ROTR only suggests that 8GB of RAM at most is needed, but you get object pop-in that is gone with 16GB of RAM, even with an 8GB VRAM GPU. There's no stuttering and freezing in this case, though; I guess that's why only 8GB of RAM is recommended, as it does happen to maximise fps performance.

Not that I get why slightly lower fps shows here with 16GB of RAM; just ignore that, it must have been some kind of testing error lol. Jump to 3:40 for the part of the ROTR benchmark that shows what I'm talking about.

 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
No, I'm suggesting that 12GB on a 3060 is useless and is just marketing.
Useless? Maybe.
But not only marketing.
With that bus, the choice was between 6GB and 12GB.
6GB would have been really useless, so 12GB was the right choice.
 

Locotes_Killa

New Member
Joined
Jan 16, 2021
Messages
6 (0.00/day)
Useless? Maybe.
But not only marketing.
With that bus, the choice was between 6GB and 12GB.
6GB would have been really useless, so 12GB was the right choice.

Thanks for the confirmation :) Oh yeah... someone who is into game development told me that somewhere around 11GB of VRAM is the most that's likely to be used over the next-gen gaming lifespan on PC. I haven't been able to verify this, but it makes a good bit of sense considering everything I've covered so far.

Oh nevermind, I've just found this absolutely top source of information... it covers what I've talked about and goes into deeper explanations with all of it too.

There ya go, this is exactly what you asked to see from someone.


I'm sorry but you do not know what you are talking about. I play at 5120x1440 resolution with a 6800 and just in Cyberpunk in hitting over 8GB in vram, and my res is about a million pixels less than 4k. In Battlefield 5 when I turn on Ray tracing and ultra settings I'm almost to 10GB of vram, if I increase my resolution multiplier just 20% I'm over 12GB of vram. SoTR is above 8GB as well with default ultra settings.

All these above games are either close to, at limit, or can push above 10GB limit on the 3080. Additionally these games aren't even using mods with texture packs, if they were could rise up even more.

A nice update for you... If you want to see how much vram is actually being needed, here is how you can see what's really going on.



Again, it's only sufficient because big green enforces it to be. Why do you think games got texture packs in the past? Because newer cards with more VRAM could deal with them.
You can't make use of something if that something isn't there.

Will a game made today be entirely path-traced? No, because nothing could run it... but if those cards existed already, then sure.

If all cards had a minimum of 16GB of VRAM, then games could ship with much higher-quality textures. But they don't, so those aren't made; don't twist it around.

What is Minecraft RTX? A fully path-traced game, and the RTX 2060 can run it just fine. Also, we don't need all GPUs to have 16GB to see the next generation of photorealistic textures in games; check the Unreal Rebirth demo I posted earlier, which is a damn nice-looking bit of living proof of this.

Cyberpunk 2077 at 2k with cinematic rtx enabled could easily use up 10gb vram.

Do you realise that 2k is 1080p? You're not going to find any proper source of information suggesting that what you're proposing is true.

You're all soon about to realise what the real answers are after reading the reputable information that has been provided... and that's the snakes have legs situation in this thread well and truly put to bed... lol.



What people really need to invest in for gaming is not huge amounts of VRAM (you can leave the 16GB+ VRAM GPUs for the workstation users); it's large NVMe drives to install games on, to have the new gaming I/O technologies working at their best. I would hope that the QLC and TLC DRAM-less price/performance 1TB and 2TB NVMe drives, which cost about the same as the top-end SATA SSDs, will be sufficient, as I can't see a need for the top-tier drives; their only real benefit over the budget drives shows up past 15GB of data being transferred in one go, something games are not going to be doing. I've seen a lot of people buy the top-tier Samsung NVMe drives who are mere gamers and average-joe application users; they won't ever utilise them to the fullest, since that takes the kind of high-end productivity work needed to noticeably take advantage of them. It's a waste of money for most PC users, so you can leave those for the workstation users too.
 
Joined
Sep 10, 2019
Messages
94 (0.05/day)
The sensible thing to do would have been to make the 3080 EOL (end of life) and use the silicon to make a 3080 Ti instead, because the 3080 has an inadequate amount of VRAM and should never have been released in the first place.

My thoughts exactly! It was probably a rushed launch. The supers and Tis are the models people should be spending their money on.
 

Locotes_Killa

New Member
Joined
Jan 16, 2021
Messages
6 (0.00/day)
My thoughts exactly! It was probably a rushed launch. The supers and Tis are the models people should be spending their money on.

Copy pasta... https://www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/ 10GB is... actually plenty of VRAM for gaming. These later GPUs coming with more VRAM will better serve the creators and workstation users who can actually utilise it. Okay?

Look at the MS Flight Simulator 2020 example in the link: the real VRAM usage is... only around 2.5GB? Which is far off from the roughly 6GB reported when using the wrong tools to measure VRAM usage. The reported figure is nearly 2.5x the real one, WHAT?

Quit with this rubbish. The RTX 3080 does not have an inadequate amount of VRAM for gaming, and literally no one here so far has any valid proof to the contrary.

Nvidia is really NOT conspiring against us all FFS! :roll:

I hate reading something like this.
Yes, maybe for NOW (the next 2 months) this amount of VRAM will be enough (not even really when you use ray tracing). But after that, the cards are going to become obsolete very soon, just because of that puny framebuffer.
I've been reading this shit for years now. People claiming that a certain GPU couldn't "benefit from more VRAM anyway because of its lack of performance". That's such a damn lie. So many great GPUs, NVIDIA's especially, are now nothing more than e-waste just because they can't perform to their maximum in newer titles demanding more RAM for textures etc. I would love to be able to max out my games in two years with my 3080, but with NVIDIA's self-crippling VRAM greed I don't think that will be possible, even if the 3080 still has enough raw horsepower under the hood. That's the reason AMD cards often tend to age better than NVIDIA's. Like SERIOUSLY. WHY WOULD THEY GIVE THE 3080 LESS VRAM THAN A 1080 TI?

Seeing as you seem very convinced of this too, do you happen to have proof to back up what you're saying?

Please do try and make it to the right side of this graph guys, and then you will be able to quit conspiring against the proper scientific evidence that has been presented so far... yep I'm saying don't be a bottleneck... lol.
 

Attachments

  • Dunning Kruger.jpg
  • Pseudoscience.jpg
  • Bottleneck.png
Joined
Sep 24, 2020
Messages
227 (0.15/day)
Location
Stehekin, Washington
System Name (2008) Dell XPS 730x H2C
Processor Intel Extreme QX9770 @ 3.8GHz (No OC)
Motherboard Dell LGA 775 (Dell Proprietary)
Cooling Dell AIO Ceramic Water Cooling (Dell Proprietary)
Memory Corsair Dominator Platinum 16GB (4 x 4) DDR3
Video Card(s) EVGA GTX 980ti 6GB (2016 ebay-used)
Storage (2) WD 1TB Velociraptor & (1) WD 2TB Black
Display(s) Alienware 34" AW3420DW (Amazon Warehouse)
Case Stock Dell 730x with "X" Side Panel (65 pounds fully decked out)
Audio Device(s) Creative X-FI Titanium & Corsair SP2500 Speakers
Power Supply PSU: 1000 Watt (Dell Proprietary)
Mouse Alienware AW610M (Amazon Warehouse)
Keyboard Corsair K95 XT (Amazon Warehouse)
Software Windows 7 Ultimate & Alienware FX Lighting
Benchmark Scores No Benchmarking & Overclocking
Tis are the models people should be spending their money on.

Amazing...many people here talk about proof! What about common sense...where did this go?
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Well no, 'cause what about creators? Having the option of cheaper GPUs with more VRAM to handle productivity apps isn't such a bad thing, and it's about time the lower end of the market wasn't filled up with 4GB VRAM GPUs either.

Creators with projects that could actually use more than 6GB of VRAM are also not buying 3060s, because those projects need processing power more than VRAM. They're buying 3070s and 3080s, or even 3060 Tis with 8GB of VRAM.

Bringing the 3060 out with 6GB of VRAM would have been a huge mistake, as it wouldn't give much incentive to buy it, given that we've already got games that legitimately need more than that at 1080p, like DooM Eternal maxed out. Think about it with the facts that matter and it makes a whole lot more sense than ignorantly pinning it on marketing. Also, AMD typically putting more VRAM on their GPUs than Nvidia, especially in the midrange, is nothing new, and since when has Nvidia bothered with knee-jerk reactions to it? So I think what we're seeing really is just... a necessary progression.

No, there is no game that legitimately needs more than 6GB of VRAM at 1080p; there really isn't any that needs more than that at 1440p either. There are games that will "use" more than that, but not ones that actually need it, and that's the big difference.

Plus, you have to wonder if the card is even powerful enough for that extra RAM to matter. By the time you raise the graphics settings to the point where 6GB would be a limit, a 3060 would likely be chugging anyway.

I mean, yeah, you can use ultra-high-res textures and get VRAM usage pretty damn high (textures being the biggest VRAM hog these days). But when you can get back under 6GB by just turning the texture setting down one notch, I have a hard time saying more than 6GB is needed on this level of card.

Useless? Maybe.
But not only marketing.
With that bus, the choice was between 6GB and 12GB.
6GB would have been really useless, so 12GB was the right choice.

I think people really overestimate how much VRAM games are actually using. We're talking about a card that, in modern games, will really struggle at 1440p; this is going to be this generation's 1080p card. Having 6GB of VRAM wouldn't make the card useless; it would barely affect the card in the performance segment it is aimed at, if it affects it at all. However, the extra cost of doubling the VRAM to 12GB results in a higher price, so you end up with a card priced too closely to the 3060 Ti to be a really compelling buy.
 

chriszhxi

New Member
Joined
Nov 1, 2020
Messages
8 (0.01/day)
it would be difficult to find such a cruise nowadays...


thank you for the link.
But this is screaming "BS" all over the place :

without any exact figures


and again: do you have a link supporting your theory about 10 Gb not being enough ?

I'm asking this of EVERY user writing this kind of comment, and so far NO ONE has answered me with a meaningful reply.


still waiting for a link backing up this claim....
By 2k here I mean 1440p. Just play the game yourself, max settings with the hidden cinematic RTX mode enabled, and see the VRAM usage.

Do you realise that 2k is 1080p? You're not going to find any proper source of information that suggests what you're proposing is true.
The Digital Cinema Initiatives (DCI), a group of motion picture studios that creates standards for digital cinema, defines Standard DCI 2K resolution as 2048 x 1080 pixels. But when buying a PC monitor or choosing a laptop, you'll rarely see this resolution. More often you'll find 2K displays having a 2560 x 1440 resolution.

Occasionally, 1080p (Full HD or FHD) has been included in the 2K resolution definition. Although 1920x1080 could be considered as having a horizontal resolution of approximately 2,000 pixels, most media, including web content and books on video production, cinema references and definitions, treat 1080p and 2K as separate resolutions, not the same one.

Although 1080p has the same vertical resolution as DCI 2K (1080 pixels), it has a smaller horizontal resolution, below the range of 2K resolution formats.

According to official reference material, DCI and industry standards do not officially recognize 1080p as a 2K resolution in literature concerning 2K and 4K resolution.

2k is not a standard on PC monitors and mostly refers to 1440p, and I do not rely on media sources; I play the game myself.
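To put numbers on the "2K" argument, here is some plain arithmetic (my own aside, not from the thread). The pixel counts differ enough that which resolution "2K" refers to changes the VRAM discussion considerably:

```python
# Pixel counts for the resolutions being argued about in this thread.
resolutions = {
    "DCI 2K":      (2048, 1080),
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "4K UHD":      (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name:12s} {w}x{h} = {w * h:,} pixels")

# 1440p pushes roughly 78% more pixels per frame than 1080p.
ratio = round(2560 * 1440 / (1920 * 1080), 2)
print(ratio)  # 1.78
```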
 
Joined
Mar 28, 2020
Messages
1,761 (1.02/day)
The RTX 3080 Ti or Super was expected to be a unicorn, so not announcing it makes sense to me. I feel most AIBs are already swamped with back orders for their RTX 3080 cards, with no spare chips to manufacture an RTX 3080 Ti. Not to mention that Nvidia also cited a bottleneck with Micron's GDDR6X; by doubling the VRAM, they would be doubling their headache.

As for the RTX 3060, I agree that 6GB is pretty much sufficient considering the resolution this card is targeting; I believe the max it can support will be 1440p. It would be great to have an 8GB version, which would be the sweet spot for the amount of VRAM. 12GB is overkill, and it's likely that 4GB of it will go under- or un-utilized in almost all cases. Between the MSRPs of the RTX 3060 Ti and 3060, I would recommend the former even though it's got less VRAM. It is clearly more powerful, and the 8GB will be sufficient for all games at 1440p, at least for the foreseeable future until RTX 4xxx arrives.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
980: September 2014 >>>>>> 980 Ti: June 2015
Spoiled Fury X launch.

1070: June 2016 >>>>>> 1070 Ti: November 2017
Undermined Vega.

So I'd say that the 3080 Ti 20GB is likely to be unveiled at E3 on June 15th, 2021, considering that the 3060 12GB is set for late February *cough* availability. Currently no word on whether we'll ever get a 3070 Ti 16GB, though.
While there clearly is some cadence to it (first produce chips, then harvest, then release a bumped-up version), it's not the only reason to release such cards.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
By 2k here I mean 1440p. Just play the game yourself, max settings with the hidden cinematic RTX mode enabled, and see the VRAM usage.

And, again, you are looking at memory allocation, not usage.
CP2077 isn't using 10GB of VRAM.
 
Joined
May 8, 2009
Messages
191 (0.03/day)
System Name Ryzen Up
Processor AMD Ryzen 9 7950X3D
Motherboard Asus ROG CROSSHAIR X670E HERO
Cooling iCUE H150i ELITE CAPELLIX
Memory G.Skill Trident Z5 NEO RGB Series (AMD Expo) 64GB (2 x 32GB) DDR5 6000 CL30 F5-6000J3040G32GX2-TZ5NR
Video Card(s) MSI Suprim X GeForce RTX 4090
Storage 4TB WD Black SN850X| Crucial P3 Plus 4TB| 2TB Samsung 980 Pro
Display(s) SAMSUNG 57" Odyssey Neo G9/ASUS ROG Swift PG279Q
Case Lian Li O11 Dynamic XL
Power Supply Corsair RM 1000x Gold
Mouse Logitech G502
Keyboard Corsair K95 RGB Platinum
Software Windows 11 Professional
And, again, you are looking at memory allocation, not usage.
CP2077 isn't using 10GB of VRAM.

Call of Duty®_ Black Ops Cold War 1_17_2021 12_52_10 PM.png

Don't know why this game is using 20GB @1440p
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,983 (2.96/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
View attachment 184422

Don't know why this game is using 20GB @1440p
Allocating and using isn't the same thing. From what I've heard, having Task Manager open while playing games shows the real VRAM usage, while the OSD shows how much VRAM the game allocates.
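
For what it's worth, on NVIDIA cards you can read a similar allocated-vs-total figure from the command line with `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits`. Here's a small Python helper to parse its CSV output — the sample values below are made up, and note this still reports *allocated* memory, not what the game actively touches each frame:

```python
def parse_gpu_mem(csv_line: str) -> tuple[int, int]:
    """Parse one 'memory.used, memory.total' line (values in MiB) from
    `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits`."""
    used, total = (int(x.strip()) for x in csv_line.split(","))
    return used, total

# Example line in the shape nvidia-smi prints (values are illustrative):
used, total = parse_gpu_mem("9846, 10240")
print(f"{used} MiB allocated of {total} MiB")  # allocated, not actively used
```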
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
View attachment 184422

Don't know why this game is using 20GB @1440p
No, it is not.
That's allocated VRAM, probably running on a 24 GB RTX 3090.
On a 3080 the value would be around 8 GB, and on a 3070 probably around 6 GB, for the same game.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Allocating and using isn't the same thing. From what I've heard, having Task Manager open while playing games shows the real VRAM usage, while the OSD shows how much VRAM the game allocates.

Task Manager doesn't show real VRAM usage either. The fact is, there isn't really a way currently to see real VRAM usage, only allocated size. You'd have to trace the actual game to see how much of that allocated VRAM it is actually accessing, and, at least AFAIK, there isn't anything that does that. The way developers are doing things these days is they just cram as many textures as possible into VRAM, even if a texture isn't anywhere near the player and isn't being seen by the player. They start at where the player is, and just load outward until VRAM is full or there aren't any more textures to load.
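
To make that concrete, here's a toy Python sketch of the "load outward until the budget is full" strategy described above — the texture names, distances, and sizes are made up for illustration, not taken from any real engine:

```python
def greedy_fill(textures, budget_mib):
    """Sort textures by distance to the player and load outward
    until the VRAM budget is exhausted (modern 'fill it up' approach)."""
    loaded, used = [], 0
    for tex in sorted(textures, key=lambda t: t["dist"]):
        if used + tex["size_mib"] > budget_mib:
            break  # VRAM budget full; everything farther out stays unloaded
        loaded.append(tex["name"])
        used += tex["size_mib"]
    return loaded, used

# Hypothetical scene: textures near, mid-range, and far from the player.
texs = [{"name": "far",  "dist": 300, "size_mib": 512},
        {"name": "near", "dist": 5,   "size_mib": 256},
        {"name": "mid",  "dist": 60,  "size_mib": 384}]
print(greedy_fill(texs, 700))  # (['near', 'mid'], 640)
```

The point of the sketch: everything that fits gets resident, whether or not the player will ever see it, which is why allocated VRAM overstates what the game actually touches.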

There used to be a very good quick video that showed how games used to handle textures. They would only load into VRAM what the player was actually seeing and a small amount around the field of view, in a cone. This was back in the days when VRAM was actually limited. Now developers have become spoiled with huge VRAM sizes, so they just load that VRAM with as many textures as they can. But then there are textures sitting in VRAM that are never accessed, so the VRAM usage is drastically inflated.
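
The old cone-based approach can be sketched the same way — a hypothetical residency test that only keeps a texture in VRAM if it falls inside the player's view cone (the angles and distances are illustrative, not from any particular engine):

```python
import math

def in_view_cone(player, facing, point, half_angle_deg=60.0, max_dist=200.0):
    """Return True if `point` lies inside the player's view cone --
    the old-school test for which textures to keep resident in VRAM.
    `facing` is assumed to be a normalized 2D direction vector."""
    dx, dy = point[0] - player[0], point[1] - player[1]
    dist = math.hypot(dx, dy)
    if dist > max_dist:
        return False  # too far away to matter
    if dist == 0:
        return True   # the player's own position is always "visible"
    # Cosine of the angle between the facing direction and the point.
    dot = (dx * facing[0] + dy * facing[1]) / dist
    return dot >= math.cos(math.radians(half_angle_deg))

# Player at the origin looking along +x: stream the texture ahead, skip the one behind.
print(in_view_cone((0, 0), (1, 0), (50, 10)))   # True  -> load
print(in_view_cone((0, 0), (1, 0), (-50, 0)))   # False -> skip
```

Under this scheme only what the player can plausibly see stays resident, which is why VRAM usage stayed modest back when memory was genuinely scarce.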

Getting back to why this affects the 3060. There are going to be people saying "but what if I run a 4K texture mod on [game], that uses like 10GB of VRAM?!?" OK, so you're using a 4K texture mod on a card that is really only capable of playing at 1080p? Do you really think those stupidly high texture resolutions are actually helping you any?
 