Friday, January 15th 2021
NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes
Everyone and their mother expected NVIDIA to announce - if not a SUPER refresh of its existing graphics cards with increased memory sizes - at least the RTX 3080 Ti. That card surfaced as a planned NVIDIA counter to AMD's preemptive $999 pricing of its RX 6900 XT graphics card (which, to be fair, is itself about as abundant as unicorns this side of the galaxy). GamersNexus reported NVIDIA partners' comments on the indefinite postponement of the RTX 3080 Ti and possible SUPER derivatives of the RTX 30-series lineup. NVIDIA has reportedly decided (smartly, I would say) to ensure consistent supply of its existing lineup to sate demand, instead of dispersing its limited chip production across even more product lines.
Spreading production thinner would, I have no doubt, only result in NVIDIA having even more SKUs out of stock than it currently does. Considering the market's current state of mind regarding NVIDIA's lineup, this seems like the most sensible decision possible. TechPowerUp has in the meantime confirmed this information with NVIDIA partners themselves.
Source:
GamersNexus
85 Comments on NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes
Yes, maybe for NOW (the next two months) this amount of VRAM will be enough (and not even really once you use ray tracing). But after that, the cards are going to become obsolete very quickly just because of that puny framebuffer.
I've been reading this stuff for years now: people claiming that a certain GPU "couldn't benefit from more VRAM anyway because of its lack of performance". That's such a lie. So many great GPUs, NVIDIA's especially, are now nothing more than e-waste just because they can't perform to their maximum in newer titles demanding more memory for textures and the like. I would love to be able to max out my games in two years with my 3080, but with NVIDIA's self-crippling VRAM stinginess I don't think that will be possible, even if the 3080 still has enough raw horsepower under the hood. That's the reason AMD cards often tend to age far better than NVIDIA's. Like, SERIOUSLY, why would they give the 3080 less VRAM than a 1080 Ti?
But this is screaming "BS" all over the place:
without any exact figures, and again: do you have a link supporting your theory that 10 GB is not enough?
I ask this of EVERY user writing this kind of comment, and so far NO ONE has answered me with a meaningful reply. Still waiting for a link backing up this claim...
Cyberpunk 2077 System Requirements — Cyberpunk 2077 | Technical Support — CD PROJEKT RED
RT Ultra - 10GB VRAM
The system requirements state that 10 GB is enough for the best quality at 4K...
You've actually proven the contrary of your own claim.
This Unreal "Rebirth" demo runs in real time on a 1080 Ti with the most detailed textures any of us have ever seen. It would be great if they specified what resolution it runs at and how much VRAM it uses, but does it matter?
tech4gamers.com/photorealistic-graphics-of-rebirth-60-fps-look-amazing-like-with-geforce-gtx-1080-ti/
PC games already like to use both RAM and VRAM to share out game assets, and they prefer to fill up the VRAM first with data that doesn't really need to be there. We know that how much VRAM today's games actually need is not what your in-game OSD VRAM utilisation counter shows, unless you use the updated MSI Afterburner/RTSS, which can now show the VRAM actually used by each game process.
techbuyersguru.com/memory-and-gaming-ram-vram-connection?page=2
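For anyone who would rather script that check than read it off an overlay, NVIDIA's NVML library exposes a comparable per-process allocation figure. Here's a minimal sketch of mine using the pynvml bindings (my own illustration, not something from the thread; it needs an NVIDIA card, and the driver may decline to report per-process numbers on some Windows setups):

```python
# Minimal sketch: per-process VRAM allocations via NVML (pip install pynvml).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Graphics processes (i.e. games) as tracked by the driver
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    # usedGpuMemory is None when the driver won't report it (common under Windows WDDM)
    if proc.usedGpuMemory is not None:
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 1024**2:.0f} MiB allocated")
    else:
        print(f"pid {proc.pid}: per-process figure unavailable")

# Whole-card view for comparison: this is the number overlays usually show
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"card total: {mem.used / 1024**2:.0f} / {mem.total / 1024**2:.0f} MiB in use")

pynvml.nvmlShutdown()
```

The gap between the per-process figures and the whole-card number is exactly the allocated-versus-needed distinction being argued about here.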
Sharing assets across RAM and VRAM like that has been a standard way of getting around storage I/O bottlenecks for donkey's years... now how about these newfangled, super-fast I/O optimisations we have coming along?
The new NVIDIA GPUs have NVCache, which lets data transfers go directly from SSD to VRAM. That's very much like what the consoles have going on, and you can bet your ass AMD and Intel will get on it too with DirectX DirectStorage. The idea is that it's so damn fast that textures and other data can be swapped in real time without impacting performance, and finally, after the thirty-odd years that SSDs have been around, we get to put HDDs firmly in the back seat for gaming.
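To make that data-path difference concrete, here is a purely conceptual sketch; every helper name in it is a hypothetical placeholder of mine, not a real NVCache or DirectStorage call:

```python
# Conceptual sketch only -- the helper arguments are hypothetical placeholders,
# not real NVCache / DirectStorage APIs.
import zlib

def load_texture_traditional(path, upload_to_vram):
    """Classic path: disk -> system RAM -> CPU decompress -> PCIe copy to VRAM."""
    with open(path, "rb") as f:
        compressed = f.read()              # staged in system RAM first
    pixels = zlib.decompress(compressed)   # CPU burns time on decompression
    upload_to_vram(pixels)                 # then an extra copy over PCIe

def load_texture_direct(path, gpu_dma_read, gpu_decompress):
    """Direct path: disk -> VRAM via DMA, with the GPU doing the decompression."""
    request = gpu_dma_read(path)           # data never staged through the CPU
    gpu_decompress(request)                # decoded on the GPU, mid-frame if needed
```

The point is simply that the direct path drops the RAM staging copy and the CPU decode step, which is what makes real-time texture swapping plausible.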
So pair this with the latest game engines that are ready to take advantage of it all. The new consoles only have a 16 GB shared pool of memory, but are able to act like they have more than that. An ideal recommended-spec gaming PC already has 16 GB of RAM and 6 to 8 GB of VRAM, and 32 GB of RAM is likely to become the new sweet spot for gaming. So this, along with proper utilisation of faster storage, could very well mean that even the 3070, with the lowest amount of VRAM among the latest GPUs at 8 GB, won't really be seen struggling with these insanely detailed textures.

And besides, if all else fails, there are texture settings below ultra too, if you didn't know that already. Dropping to high textures will still look far better than anything that has come before, which will be great for the lower resolutions that aren't good enough to show the extra detail afforded by ultra textures anyway?! lol.

Wherever did you hear that 20 GB of VRAM is the minimum sweet spot? I'm calling bullshit on this, and anyone making such a claim should back it up with some viable evidence. Look at the demo above: that's a GPU with only 11 GB of VRAM, it looks very super photorealistic indeed, with textures ridiculously far beyond what we currently see in games, and it can't be using this new I/O technology, as that wasn't in play when it was made.
Cyberpunk 2077 in native 4K with RT Ultra... it doesn't matter, seeing as the game becomes unplayable.
There is a legitimate technical reason why the 3060 has 12 GB: a 192-bit memory bus should be paired with 3 GB, 6 GB, 12 GB and so on, while a 256-bit memory bus should go with 4 GB, 8 GB, 16 GB etc. That is, if you don't want a card with mostly full-speed VRAM and a bit of gimped-speed VRAM.
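The arithmetic behind those pairings is simple enough to sketch. The helper below is my own illustration, assuming the standard GDDR6 layout of one memory chip per 32-bit channel, in the 1 GB and 2 GB densities commonly available at the time:

```python
# Which VRAM capacities a memory bus naturally supports, assuming standard
# GDDR6: one chip per 32-bit channel, with 1 GB or 2 GB per chip.
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32          # number of memory chips on the bus
    return [chips * d for d in densities_gb]

print(vram_options(192))  # [6, 12]  -> a 192-bit 3060 gets 6 GB or 12 GB
print(vram_options(256))  # [8, 16]  -> 256-bit cards get 8 GB or 16 GB
print(vram_options(320))  # [10, 20] -> the 3080's 320-bit bus: 10 GB or 20 GB
```

Anything in between means mixing chip densities, which is where the mostly-full-speed-plus-gimped-speed split comes from.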
Bringing the 3060 out with 6 GB of VRAM would be a huge mistake, as it wouldn't give much incentive to buy it, given that we've already got games that legitimately need more than that at 1080p, like DOOM Eternal maxed out. Think about it with the facts that matter and it makes a whole lot more sense, instead of ignorantly pinning it on marketing. Also, AMD typically putting more VRAM on their GPUs than NVIDIA, especially in the midrange, is nothing new, and since when has NVIDIA bothered with knee-jerk reactions to that? So I think what we're seeing is really just a necessary progression.

No you don't, because you've not considered spare allocated VRAM versus what is actually needed. Use a 3070 8 GB at the same resolution and settings with all of these games and will they stutter and freeze? Nope, because it has enough actually-needed VRAM and the spare allocation goes to RAM instead. That's why it has been important to have at least 2x8 GB of RAM for gaming ever since last-gen gaming started, especially with 6 GB+ VRAM GPUs at any resolution: even when such cards don't show maxed-out allocated VRAM usage, enough RAM still maximises your 0.1% and 1% low frame rates as well as averages. So it comes down to your total available shared memory pool too. Is 16 GB versus 32 GB of RAM already showing a small fps lead in 4K gaming with a 2080 Ti? Yep.
I've just remembered a good example, seeing as you mentioned texture packs: Watch Dogs 2 with the ultra texture pack runs fine on 6 GB+ VRAM GPUs, but it stutters and freezes like crazy with 8 GB of system RAM and doesn't with 16 GB...
support.ubisoft.com/en-AU/Article/000061087/System-requirements-for-Watch-Dogs-2
UltraTexture Pack
For Ultra Textures, you'll need 6GB of VRAM on your graphics cards, and a minium [sic] of 12GB of System RAM (16GB Recommended). (lol, a spelling mistake.)
Some games work out differently too, even if it isn't mentioned. Rise of the Tomb Raider only suggests that 8 GB of RAM at most is needed, but you get object pop-in that disappears with 16 GB, even on an 8 GB VRAM GPU. There's no stuttering and freezing in that case, though; I guess that's why only 8 GB is recommended, as it's enough to maximise fps.
Not that I get why slightly lower fps shows up with 16 GB of RAM here; just ignore that, it must have been some kind of testing error, lol. Jump to 3:40 for the part of the ROTR benchmark that shows what I'm talking about.
But not only marketing.
With that bus, the choice was between 6 GB and 12 GB.
6 GB would have been really useless, so 12 GB was the right choice.
Oh, never mind, I've just found an absolutely top source of information... it covers what I've talked about and goes into deeper explanations of all of it too.
There ya go, this is exactly what you asked to see from someone.
www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/

A nice update for you... if you want to see how much VRAM is actually needed, here is how you can see what's really going on:
www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/
You're all about to realise what the real answers are after reading the reputable information that has been provided... and that's the snakes-have-legs situation in this thread well and truly put to bed... lol.
Look at the MS Flight Simulator 2020 example in the link: the real VRAM usage is... only around 2.5 GB? That's miles off the roughly 6 GB reported when using the wrong tools to measure VRAM usage, nearly 2.5x the real figure. WHAT?
Quit with this rubbish; the RTX 3080 does not have an inadequate amount of VRAM for gaming, and literally no one here so far has any valid proof that it does.
NVIDIA is really NOT conspiring against us all, FFS! :roll: Seeing as you seem very convinced of this too, do you happen to have proof that can back up what you're saying?
Please do try to make it to the right side of this graph, guys, and then you'll be able to quit conspiring against the proper scientific evidence that has been presented so far... yep, I'm saying don't be a bottleneck... lol.
Plus, you have to wonder whether the card is even powerful enough for that extra RAM to matter. By the time you raise the graphics settings to the point where 6 GB would be a limit, a 3060 would likely be chugging anyway.
I mean, yeah, you can use ultra-high-res textures in games and push VRAM usage pretty damn high (textures being the biggest VRAM hog these days). But when you can get back under 6 GB just by turning the texture setting down one notch, I have a hard time saying more than 6 GB is needed at this level of card. I think people really overestimate how much VRAM games are actually using.

We're talking about a card that, in modern games, will really struggle at 1440p; this is going to be this generation's 1080p card. Having 6 GB of VRAM doesn't make a card useless, and it would barely affect the card in the performance segment it's aimed at (if it affects it at all). However, the extra cost of doubling the VRAM to 12 GB results in a higher price, so you end up with a card priced too close to the 3060 Ti to really be a compelling buy.
As for the RTX 3060, I agree that 6 GB is pretty much sufficient considering the resolution this card is targeting; I believe 1440p is the most it can handle. An 8 GB version would be great, as that's the sweet spot for VRAM. 12 GB is overkill, and it's likely that 4 GB of it will go under- or unutilised in almost all cases. Between the MSRPs of the RTX 3060 Ti and the 3060, I would recommend the former even though it's got less VRAM. It is clearly more powerful, and 8 GB will be sufficient for all games at 1440p, at least for the foreseeable future until the RTX 4xxx series arrives.
CP2077 isn't using 10 GB of VRAM.
Don't know why this game is using 20 GB @ 1440p
That's allocated VRAM, probably from running on a 24 GB RTX 3090.
On a 3080 the value would be around 8 GB, and on a 3070 it would probably be around 6 GB, for the same game.
There used to be a very good short video showing how games once handled textures: they would load into VRAM only what the player was actually seeing, plus a small cone around the field of view. This was back in the days when VRAM was genuinely limited. Now developers have become spoiled by huge VRAM sizes, so they simply fill VRAM with as many textures as they can. The result is textures sitting in VRAM that are never accessed, so apparent VRAM usage is drastically inflated.
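That cone-of-vision approach is simple enough to sketch as a toy. Everything below is my own illustration (none of it comes from any real engine), with a plain set standing in for VRAM residency:

```python
import math

def in_view_cone(cam_pos, cam_dir, obj_pos, half_angle_deg=60.0, max_dist=200.0):
    """True if obj_pos lies inside the camera's view cone (cam_dir must be unit length)."""
    offset = [o - c for o, c in zip(obj_pos, cam_pos)]
    dist = math.sqrt(sum(d * d for d in offset))
    if dist == 0.0:
        return True
    if dist > max_dist:
        return False
    cos_to_obj = sum(d * v for d, v in zip(offset, cam_dir)) / dist
    return cos_to_obj >= math.cos(math.radians(half_angle_deg))

def update_texture_residency(cam_pos, cam_dir, objects, resident):
    """Toy streamer: load textures entering the cone, evict ones leaving it."""
    for obj in objects:
        if in_view_cone(cam_pos, cam_dir, obj["pos"]):
            resident.add(obj["texture"])      # stands in for a VRAM upload
        else:
            resident.discard(obj["texture"])  # stands in for eviction

# One object ahead of the camera, one behind it:
resident = set()
objects = [{"pos": (0, 0, 10), "texture": "wall.dds"},
           {"pos": (0, 0, -10), "texture": "floor.dds"}]
update_texture_residency((0, 0, 0), (0, 0, 1), objects, resident)
print(resident)  # {'wall.dds'} -- only what the player can see stays loaded
```

A real engine would add hysteresis and per-mip streaming rather than a binary load/evict, but the budget-driven idea is the same.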
Getting back to why this affects the 3060: there are going to be people saying "but what if I run a 4K texture mod on [game], that uses like 10 GB of VRAM?!?" OK, so you're using a 4K texture mod on a card that is really only capable of playing at 1080p. Do you really think those stupidly high texture resolutions are helping you any?
The 3090 is the ultimate graphics card, as a matter of fact, and the 6900 XT likewise. But not everyone can afford them. I, for one, can get an RX 6800 non-XT at best, which can be had for roughly 950 US dollars, but I'm waiting for NVIDIA's response first and will check the price/performance ratio before pulling the trigger. And now AMD have returned the favor by launching RDNA 2; that 12 GB 3060 is just the beginning of a plethora of NVIDIA cards changing for the better.
All hail competition, which drives us all to improve, to be the best we can be, and pushes the advancement of technology further ahead.