Friday, January 15th 2021

NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

Everyone and their mother expected NVIDIA to announce, if not a SUPER refresh of its existing graphics cards with increased memory sizes, at least the RTX 3080 Ti. That card surfaced as a planned NVIDIA counter to AMD's preemptive $999 pricing of its RX 6900 XT graphics card (which, to be fair, is itself about as abundant as unicorns this side of the galaxy). GamersNexus reported NVIDIA partners' comments on the indefinite postponement of the RTX 3080 Ti and possible SUPER derivatives of the RTX 30-series lineup. It's said that NVIDIA decided (smartly, I would say) to ensure consistent supply of its existing lineup to sate demand, instead of dispersing its limited chip production across even more product lines.

Spreading production thinner would, I have no doubt, only result in NVIDIA having even more SKUs out of stock than it currently does. Considering the market's current mood regarding NVIDIA's lineup, this seems like the most sensible decision possible. TechPowerUp has in the meantime confirmed this information with NVIDIA partners directly.
Source: GamersNexus

85 Comments on NVIDIA RTX 3080 Ti, Eventual SUPER Revisions Allegedly Postponed Indefinitely Amidst Supply Woes

#52
Sahutep
Doc-J10GB of VRAM is sufficient today; there's no need for more VRAM to play modern games at 4K resolution.

A 3080 with 10GB at $700 is much more affordable than a 3080 with 20GB at $1000.

Today, 16GB of VRAM is pure marketing for noobs; maybe it will be necessary on the next generation of GPUs, but not now.

The 3060 with 12GB is a counterpart to the 6700 with 12GB. Neither card needs this much VRAM to play at 1080p or 2K, but marketing is marketing, and it sells these cards better.
I hate reading something like this.
Yes, maybe for NOW (the next two months) this amount of VRAM will be enough (and not even really, once you use ray tracing). But after that, the cards are going to become obsolete very quickly just because of that puny framebuffer.
I've been reading this shit for years now: people claiming that a certain GPU couldn't "benefit from more VRAM anyway because of its lack of performance". That's such a damn lie. So many great GPUs, made by NVIDIA especially, are now nothing more than e-waste just because they can't perform to their maximum in newer titles demanding more VRAM for textures and so on. I would love to be able to max out my games in two years with my 3080, but with NVIDIA's self-crippling VRAM greed I don't think that will be possible, even if the 3080 still has enough raw horsepower under the hood. That's the reason AMD cards often tend to age far better than NVIDIA's. Like, SERIOUSLY: WHY WOULD THEY GIVE THE 3080 LESS VRAM THAN A 1080 Ti?
#53
Max(IT)
Anymal€2,500 for 10 days cruising the Caribbean Sea, or a new PC?
It would be difficult to find such a cruise nowadays...
spnidelwww.techpowerup.com/277257/amd-and-nvidia-address-gpu-shortage-with-situation-improvement-on-the-horizon
Thank you for the link, but it's screaming "BS" all over the place: no exact figures anywhere.
SahutepI hate reading something like this.
Yes, maybe for NOW (the next two months) this amount of VRAM will be enough (and not even really, once you use ray tracing). But after that, the cards are going to become obsolete very quickly just because of that puny framebuffer.
I've been reading this shit for years now: people claiming that a certain GPU couldn't "benefit from more VRAM anyway because of its lack of performance". That's such a damn lie. So many great GPUs, made by NVIDIA especially, are now nothing more than e-waste just because they can't perform to their maximum in newer titles demanding more VRAM for textures and so on. I would love to be able to max out my games in two years with my 3080, but with NVIDIA's self-crippling VRAM greed I don't think that will be possible, even if the 3080 still has enough raw horsepower under the hood. That's the reason AMD cards often tend to age far better than NVIDIA's. Like, SERIOUSLY: WHY WOULD THEY GIVE THE 3080 LESS VRAM THAN A 1080 Ti?
And again: do you have a link supporting your theory about 10GB not being enough?

I'm asking this of EVERY user writing these kinds of comments, and so far NO ONE has answered me with a meaningful reply.
chriszhxiCyberpunk 2077 at 2K with cinematic RTX enabled could easily use up 10GB of VRAM.
Still waiting for a link backing up this claim...
#56
Locotes_Killa
Max(IT)Is this the best link you can provide? :roll:
System requirements stating that 10GB is enough for the best quality at 4K...
You actually proved the contrary of your claim.
Aye, you make a good point, and I can go further with it...

This Unreal Rebirth demo runs on a 1080 Ti in real time with the most detailed textures any of us have ever seen. It would be great if they specified what resolution it runs at and how much VRAM it uses, but does it matter?


tech4gamers.com/photorealistic-graphics-of-rebirth-60-fps-look-amazing-like-with-geforce-gtx-1080-ti/

PC games already like to make use of both RAM and VRAM to share out game assets, and they prefer to fill up the VRAM first with data that doesn't really need to be there. We know that how much VRAM today's games actually need is not always what your in-game OSD VRAM utilisation counter shows, unless you use the updated MSI Afterburner/RTSS, which can now show the actual VRAM used per process.

techbuyersguru.com/memory-and-gaming-ram-vram-connection?page=2
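
If you want to poke at those numbers yourself, here's a minimal Python sketch using NVIDIA's NVML bindings (assuming an NVIDIA card and the nvidia-ml-py package, which imports as pynvml); note that both figures NVML reports are still allocations, not the bytes a game actually touches each frame:

```python
# Minimal sketch: device-level vs per-process VRAM figures via NVML.
# Assumes an NVIDIA GPU and `pip install nvidia-ml-py`.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-wide view: everything allocated on the card, by all processes.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total {mem.total / 2**30:.1f} GiB, allocated {mem.used / 2**30:.1f} GiB")

# Per-process view: closer to what the updated Afterburner/RTSS shows.
# Still *allocated* memory, not what the game actively reads per frame.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    gib = proc.usedGpuMemory / 2**30 if proc.usedGpuMemory else 0.0
    print(f"pid {proc.pid}: {gib:.2f} GiB allocated")

pynvml.nvmlShutdown()
```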

Unifying RAM and VRAM like this has been a standard way of getting around storage I/O bottlenecks for donkey's years... now, how about these newfangled, super-fast I/O optimisations we have coming along?

The new Nvidia GPUs have NVCache, a way for data transfers to go directly from SSD to VRAM. That's very much like what the consoles have going on, and you can bet your ass AMD and Intel will get on it too with DirectX DirectStorage... the idea is that it's so damn fast that textures and other data can be swapped in real time without impacting performance, and finally, after the 30-ish years that SSDs have been around, we get to put HDDs firmly in the back seat for gaming.

So pair this with the latest game engines that are ready to take advantage of it all... the new consoles only have a 16GB shared pool of memory but are able to act like they have more than that, and an ideal recommended-spec gaming PC already has 16GB of RAM and 6GB to 8GB of VRAM, with 32GB of RAM likely to become the new sweet spot for gaming. So this, along with proper utilisation of faster storage, could very well mean that even the 3070, with the lowest amount of VRAM among the latest GPUs at 8GB, won't really be seen struggling to handle these insanely detailed textures (see the streaming sketch below). And besides, if all else fails, there are texture settings below ultra too, if you didn't know that already; dropping to high textures is still going to look far better than anything that has come before, which will be great for the lower resolutions that aren't good enough to show the extra detail afforded by ultra textures anyway! lol.
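
To make the streaming idea concrete, here's a toy Python sketch of a fixed VRAM budget with least-recently-used eviction. The sizes and names are hypothetical, and this is nothing like the real DirectStorage/NVCache APIs; it's just the gist of why fast I/O lets a game reference far more texture data than physically fits on the card:

```python
# Toy sketch of budgeted texture streaming with LRU eviction.
# Hypothetical sizes; real engines and DirectStorage are far more involved.
from collections import OrderedDict

VRAM_BUDGET_MB = 8192           # e.g. a 3070-class card
resident = OrderedDict()        # texture id -> size in MB, in LRU order
used_mb = 0

def request_texture(tex_id: str, size_mb: int) -> None:
    """Make tex_id resident, evicting the coldest textures if needed."""
    global used_mb
    if tex_id in resident:
        resident.move_to_end(tex_id)    # already resident: just mark it hot
        return
    while used_mb + size_mb > VRAM_BUDGET_MB and resident:
        _victim, victim_mb = resident.popitem(last=False)  # evict coldest
        used_mb -= victim_mb
    resident[tex_id] = size_mb          # with fast I/O this load is cheap
    used_mb += size_mb

# A level can reference far more texture data than the card holds,
# as long as any single view fits inside the budget:
for i in range(40):                     # 40 x 512 MB = 20 GB referenced
    request_texture(f"tex_{i}", 512)
print(f"resident: {used_mb} MB of a {VRAM_BUDGET_MB} MB budget")
```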
Tom SundayIndeed, a lot of gamers with deep pockets are still waiting or holding off for a card with the specs of a 3080 Ti, as the game developers throw more and more graphically demanding games onto the market. Actually, many games are being purchased solely based on their 'pretty worlds' and over-the-top sandbox bling. There seems to be no end to this, and it all essentially started when gamers were delighted to see water and cloud reflections making their days and nights. From what I hear, a 20GB GPU investment is seen as a sort of minimum (gaming) future-proofing, at least for a few more years. I am also told, hanging around the computer shops with the boys from Mumbai, that if the 3080 Ti does not come about for whatever reason, most enthusiasts will get the RTX 3090 instead and call it a day. Surely NVIDIA and their investors will love that! My tech buddy Harry said: "What the hell, I paid $1,480 years ago for an MSI Gaming Trio 2080 Ti, so an RTX 3090 is not out of reach at all. Now a 20GB card is my sweet spot." As for me, I am always short of cash, being a simple man on the street, but I do have a blonde, who is my sweet spot.
Wherever did you hear that 20GB of VRAM is the minimum sweet spot? I'm calling bullshit on this, and if anyone is going to make such a claim, it should be backed up with some viable evidence... look at the demo above: that's a GPU with only 11GB of VRAM, it looks very photorealistic indeed, with textures ridiculously far beyond what we currently see in games, and it can't be using this new I/O technology, as that wasn't in play at the time it was made.
#57
xSneak
By the time 10GB of VRAM matters, the processing power of a 3080 Ti will be inadequate for the graphics settings pushing that kind of VRAM.
#58
Locotes_Killa
CammIndeed, it's about double. But we've already seen the 3080 breach its VRAM budget at 4K (DOOM Eternal maxed and Cyberpunk at 4K in RT max). RAM bandwidth is one of those things that is always king, until you don't have enough.

But regardless, are you seriously trying to argue that it's appropriate for cards higher in the stack to have less VRAM?
DOOM Eternal... it was the 2080 that got its VRAM breached, not the 3080, as Hardware Unboxed found. No worries though, seeing as you still get a shit-ton of FPS, and there's a texture pool size setting to make good use of too.

Cyberpunk 2077 in native 4K with RT Ultra... it doesn't matter, seeing as the game becomes unplayable at that point anyway.
#59
newtekie1
Semi-Retired Folder
CammAre you seriously suggesting that Nvidia's higher-end cards should have less than their lowest?
No, I'm suggesting that 12GB on a 3060 is useless and is just marketing.
#60
Locotes_Killa
newtekie1No, I'm suggesting that 12GB on a 3060 is useless and is just marketing.
Well, no, because what about creators? Having the option of cheaper GPUs with more VRAM to handle productivity apps isn't such a bad thing, and it's about time the lower end of the market wasn't filled up with 4GB GPUs either.

There is a legitimate technical reason why the 3060 has 12GB: a 192-bit memory bus should be paired with 3GB, 6GB, 12GB and so on, and a 256-bit bus with 4GB, 8GB, 16GB and so on... that is, if you don't want mostly full-speed VRAM plus a bit of gimped-speed VRAM (there's a quick sketch of the arithmetic below).

Bringing the 3060 out with 6GB would have been a huge mistake, as it wouldn't give much incentive to buy it, given that we've already got games that legitimately need more than that at 1080p, like DOOM Eternal maxed out... think about it with the facts that matter and it makes a whole lot more sense than ignorantly pinning it on marketing. Also, AMD typically putting more VRAM on their GPUs than Nvidia, especially in the midrange, is nothing new, and since when has Nvidia bothered with knee-jerk reactions to that? So I think what we're seeing really just comes down to... a necessary progression.
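
Here's that quick sketch of the bus arithmetic, assuming one GDDR6 chip per 32-bit channel and the 1GB/2GB chip densities shipping at the time:

```python
# Rough sketch: why bus width dictates the "natural" VRAM sizes.
# Assumes one GDDR6 chip per 32-bit channel, with 1 GB or 2 GB per chip
# (the densities shipping when these cards launched).
def vram_options(bus_width_bits: int) -> list[int]:
    chips = bus_width_bits // 32                    # one chip per channel
    return [chips * density for density in (1, 2)]  # totals in GB

for bus in (192, 256, 320, 384):
    print(f"{bus}-bit bus -> {vram_options(bus)} GB")
# 192-bit -> [6, 12]   (hence the 3060's 6 GB or 12 GB choice)
# 256-bit -> [8, 16]   (3060 Ti / 3070)
# 320-bit -> [10, 20]  (3080, or a rumoured 20 GB variant)
# 384-bit -> [12, 24]  (3090)
```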
WarsawI'm sorry, but you do not know what you are talking about. I play at 5120x1440 resolution with a 6800, and just in Cyberpunk I'm hitting over 8GB of VRAM, and my res is about a million pixels less than 4K. In Battlefield V, when I turn on ray tracing and ultra settings, I'm almost at 10GB of VRAM, and if I increase my resolution multiplier by just 20%, I'm over 12GB. SoTR is above 8GB as well with default ultra settings.

All the above games are either close to, at, or able to push above the 10GB limit of the 3080. Additionally, these games aren't even using mods with texture packs; if they were, usage could rise even further.
No, you don't, because you've not considered spare allocated VRAM vs what is actually needed. Use a 3070 8GB at the same resolution and settings in all of these games: will they stutter and freeze? Nope, because 8GB is enough actually-needed VRAM, and the spare allocation goes to system RAM instead. That's why it has been important to have at least 2x8GB of RAM for gaming ever since last-gen gaming started, especially with the 6GB+ VRAM GPUs at any resolution... even if they don't show maxed-out allocated VRAM usage, you still end up with better 0.1% and 1% low frame-rate results as well as averages... so it comes down to what your total available shared memory pool is, too. 16GB vs 32GB of RAM is already showing a small FPS lead in 4K gaming with a 2080 Ti? Yep.

I've just remembered a good example, seeing as you mentioned texture packs... Watch Dogs 2 with the ultra texture pack runs fine on 6GB+ VRAM GPUs, but it stutters and freezes like crazy with 8GB of system RAM and doesn't with 16GB...

support.ubisoft.com/en-AU/Article/000061087/System-requirements-for-Watch-Dogs-2

UltraTexture Pack
For Ultra Textures, you'll need 6GB of VRAM on your graphics cards, and a minium of 12GB of System RAM (16GB Recommended). (lol, a spelling mistake.)

Some games work things out in different ways too, even if it isn't mentioned... ROTR, for example, only suggests that 8GB of RAM at most is needed, but you get object pop-in that disappears with 16GB of RAM, even on an 8GB VRAM GPU. There's no stuttering and freezing in that case though; I guess that's why only 8GB of RAM is recommended, as it does happen to maximise FPS performance.

Not that I get why there are fewer FPS showing with 16GB of RAM here; just ignore that, it must have been some kind of testing error, lol. Jump to 3:40 for the part of the ROTR benchmark that shows what I'm talking about.

#61
Max(IT)
newtekie1No, I'm suggesting that 12GB on a 3060 is useless and is just marketing.
Useless? Maybe.
But not only marketing.
With that bus, the choice was between 6GB and 12GB.
6GB would have been really useless, so 12GB was the right choice.
#62
Locotes_Killa
Max(IT)Useless? Maybe.
But not only marketing.
With that bus, the choice was between 6GB and 12GB.
6GB would have been really useless, so 12GB was the right choice.
Thanks for the confirmation. :) Oh yeah... someone who is into game development told me that somewhere around 11GB of VRAM is the most that games are looking to use over the next-gen gaming lifespan on PC. I have not been able to verify this, but it does make a good bit of sense considering everything I've covered so far.

Oh, never mind, I've just found an absolutely top source of information... it covers what I've talked about and goes into deeper explanations of all of it too.

There ya go, this is exactly what you asked to see from someone.

www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/
WarsawI'm sorry, but you do not know what you are talking about. I play at 5120x1440 resolution with a 6800, and just in Cyberpunk I'm hitting over 8GB of VRAM, and my res is about a million pixels less than 4K. In Battlefield V, when I turn on ray tracing and ultra settings, I'm almost at 10GB of VRAM, and if I increase my resolution multiplier by just 20%, I'm over 12GB. SoTR is above 8GB as well with default ultra settings.

All the above games are either close to, at, or able to push above the 10GB limit of the 3080. Additionally, these games aren't even using mods with texture packs; if they were, usage could rise even further.
A nice update for you... if you want to see how much VRAM is actually needed, here is how to see what's really going on.

www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/

ZoneDymoAgain, it's only sufficient because big green enforces it to be. Why do you think games got texture packs in the past? Because newer cards with more VRAM could deal with them.
You can't make use of something if that something isn't there.

Will a game made today be entirely path traced? No, because nothing could run it... but if those cards already existed, then sure.

If all cards had a minimum of 16GB of VRAM, then games could ship with much higher quality textures, but they don't, so those aren't made. But don't twist it around.
What is Minecraft RTX? A fully path-traced game, and the RTX 2060 can run it just fine. Also, we don't need all GPUs to have 16GB to see the next generation of photorealistic textures being used in games; check the Unreal Rebirth demo I posted earlier, which is a damn nice-looking bit of living proof of this.
chriszhxiCyberpunk 2077 at 2K with cinematic RTX enabled could easily use up 10GB of VRAM.
Do you realise that 2K is 1080p? You're not going to find any proper source of information suggesting that what you're proposing is true.

You're all about to realise what the real answers are after reading the reputable information that has been provided... and that's the snakes-have-legs situation in this thread well and truly put to bed... lol.

What people really need to invest in for gaming is not heaps of VRAM (you can leave the 16GB+ GPUs for the workstation users); it's large NVMe drives to install games on, so the new gaming I/O technologies can work at their best. I would hope the price/performance QLC and TLC DRAM-less 1TB and 2TB NVMe drives, which cost about the same as the top-end SATA SSDs, will be sufficient, as I can't see a need for the top-tier drives; their only real benefit over the budget drives shows up past 15GB of sustained data transfer, something games are not going to be doing... I've seen a lot of people who are mere gamers and average-joe application users buy the top-tier Samsung NVMe drives; they won't ever utilise them to the fullest without the high-end productivity workloads needed to noticeably take advantage of them. It's a waste of money for most PC users, so you can leave those for the workstation users too.
#63
Arc1t3ct
RavenmasterThe sensible thing to do would have been to make the 3080 EOL (end of life) and use the silicon to make a 3080 Ti instead, because the 3080 has an inadequate amount of VRAM and should never have been released in the first place.
My thoughts exactly! It was probably a rushed launch. The SUPERs and Tis are the models people should be spending their money on.
#64
Locotes_Killa
Arc1t3ctMy thoughts exactly! It was probably a rushed launch. The SUPERs and Tis are the models people should be spending their money on.
Copy pasta... www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/ 10GB is... actually plenty of VRAM for gaming. These later GPUs coming with more VRAM will better serve the creators and workstation users who can actually utilise it. Okay?

Look at the MS Flight Simulator 2020 example in the link: the real VRAM usage is... only around 2.5GB? That is miles off from the roughly 6GB reported by the wrong tools for measuring VRAM usage! That's off by 245%, WHAT?

Quit with this rubbish: the RTX 3080 does not have an inadequate amount of VRAM for gaming, and literally no one here so far has any valid proof to the contrary.

Nvidia is really NOT conspiring against us all FFS! :roll:
SahutepI hate reading something like this.
Yes, maybe for NOW (the next two months) this amount of VRAM will be enough (and not even really, once you use ray tracing). But after that, the cards are going to become obsolete very quickly just because of that puny framebuffer.
I've been reading this shit for years now: people claiming that a certain GPU couldn't "benefit from more VRAM anyway because of its lack of performance". That's such a damn lie. So many great GPUs, made by NVIDIA especially, are now nothing more than e-waste just because they can't perform to their maximum in newer titles demanding more VRAM for textures and so on. I would love to be able to max out my games in two years with my 3080, but with NVIDIA's self-crippling VRAM greed I don't think that will be possible, even if the 3080 still has enough raw horsepower under the hood. That's the reason AMD cards often tend to age far better than NVIDIA's. Like, SERIOUSLY: WHY WOULD THEY GIVE THE 3080 LESS VRAM THAN A 1080 Ti?
Seeing as you seem very convinced of this too, do you happen to have proof to back up what you're saying?

Please do try to make it to the right side of this graph, guys, and then you'll be able to quit conspiring against the proper scientific evidence that has been presented so far... yep, I'm saying don't be a bottleneck... lol.
#65
newtekie1
Semi-Retired Folder
Locotes_KillaWell, no, because what about creators? Having the option of cheaper GPUs with more VRAM to handle productivity apps isn't such a bad thing, and it's about time the lower end of the market wasn't filled up with 4GB GPUs either.
Creators with projects that could actually use more than 6GB of VRAM are also not buying 3060s, because those projects need processing power more than VRAM. They're buying 3070s and 3080s, or even 3060 Tis with 8GB of VRAM.
Locotes_KillaBringing the 3060 out with 6GB would have been a huge mistake, as it wouldn't give much incentive to buy it, given that we've already got games that legitimately need more than that at 1080p, like DOOM Eternal maxed out... think about it with the facts that matter and it makes a whole lot more sense than ignorantly pinning it on marketing. Also, AMD typically putting more VRAM on their GPUs than Nvidia, especially in the midrange, is nothing new, and since when has Nvidia bothered with knee-jerk reactions to that? So I think what we're seeing really just comes down to... a necessary progression.
No, there is no game that legitimately needs more than 6GB of VRAM at 1080p, and there really isn't any that needs more than that at 1440p either. There are games that will "use" more than that, but not ones that actually need it, and that's the big difference.

Plus, you have to wonder if the card is even powerful enough for that extra RAM to matter. By the time you raise the graphics settings to the point where 6GB would be a limit, it's likely a 3060 would be chugging anyway.

I mean, yeah, you can use ultra-high-res textures in games and get VRAM usage pretty damn high (textures being the biggest VRAM hog these days). But when you can still get by with 6GB just by turning the texture setting down one notch, I have a hard time saying more than 6GB is needed on this level of card.
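
For a rough sense of why textures dominate, the back-of-the-envelope math is simple. This sketch assumes uncompressed RGBA8 with a full mip chain and ignores block compression (BCn), which cuts real footprints by 4-8x:

```python
# Back-of-the-envelope texture memory estimate: width x height x bytes
# per pixel, plus roughly a third extra for the full mip chain.
# Uncompressed RGBA8 assumed; BCn compression divides this by 4-8x.
def texture_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    base = width * height * bytes_per_pixel
    return base * 4 / 3 / 2**20     # mip chain adds ~33%

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mb(size, size):.0f} MB")
# 1024x1024: ~5 MB, 2048x2048: ~21 MB, 4096x4096: ~85 MB each.
# One notch down in texture resolution quarters the footprint, which is
# why dropping from ultra to high frees up so much VRAM.
```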
Max(IT)Useless? Maybe.
But not only marketing.
With that bus, the choice was between 6GB and 12GB.
6GB would have been really useless, so 12GB was the right choice.
I think people really overestimate how much VRAM games are actually using. We're talking about a card that, in modern games, will really struggle at 1440p; this is going to be this generation's 1080p card. Having 6GB of VRAM wouldn't make the card useless; it would barely affect the card in the performance segment it is aimed at (if it affected it at all). However, the extra cost of doubling the VRAM to 12GB is going to result in a high price, so you end up with a card priced too closely to the 3060 Ti to really be a compelling buy.
#66
chriszhxi
Max(IT)It would be difficult to find such a cruise nowadays...

Thank you for the link, but it's screaming "BS" all over the place: no exact figures anywhere.

And again: do you have a link supporting your theory about 10GB not being enough? I'm asking this of EVERY user writing these kinds of comments, and so far NO ONE has answered me with a meaningful reply.

Still waiting for a link backing up this claim...
By 2K here I mean 1440p. Just play the game yourself, max settings with the hidden cinematic RTX mode enabled, and see the VRAM usage.
Locotes_KillaDo you realise that 2K is 1080p? You're not going to find any proper source of information suggesting that what you're proposing is true.
The Digital Cinema Initiatives (DCI), a group of motion picture studios that creates standards for digital cinema, defines standard DCI 2K resolution as 2048 x 1080 pixels. But when buying a PC monitor or choosing a laptop, you'll rarely see this resolution; more often you'll find "2K" displays with a resolution of 2560 x 1440.

Occasionally, 1080p (Full HD or FHD) has been included in the 2K definition. Although 1920 x 1080 could be considered as having a horizontal resolution of approximately 2,000 pixels, most media, including web content and books on video production, cinema references and definitions, treat 1080p and 2K as separate definitions, not the same thing.

Although 1080p has the same vertical resolution as DCI 2K (1080 pixels), its horizontal resolution is smaller, below the range of 2K formats. According to official reference material, DCI and industry standards do not recognize 1080p as a 2K resolution.
2K is not a standard for PC monitors and mostly refers to 1440p. And I don't rely on media sources; I play the game myself.
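
Since half of this argument is about what "2K" even means, the raw pixel counts settle the size comparisons quickly; a quick sketch (DCI 2K per the standard, the rest common PC formats, including the 5120x1440 ultrawide mentioned earlier):

```python
# Pixel counts for the resolutions being argued about in this thread.
resolutions = {
    "DCI 2K":           (2048, 1080),
    "1080p (FHD)":      (1920, 1080),
    "1440p (QHD)":      (2560, 1440),
    "5120x1440 (DQHD)": (5120, 1440),
    "4K UHD":           (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name:17} {w * h / 1e6:5.2f} MP")
# 1440p pushes ~78% more pixels than 1080p, so calling 1080p "2K" and
# then reasoning about 1440p VRAM loads mixes up two different workloads.
# And 5120x1440 is ~7.37 MP vs 4K's ~8.29 MP: about 0.9 million pixels
# less, just as Warsaw said earlier.
```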
#67
watzupken
The RTX 3080 Ti or SUPER was expected to be a unicorn, so not announcing it makes sense to me. I feel most AIBs are already too swamped with back orders for their RTX 3080 cards to find spare chips to manufacture an RTX 3080 Ti. Not to mention that Nvidia also cited a bottleneck with Micron's GDDR6X, so by doubling the VRAM, they would be doubling their headache.

As for the RTX 3060, I agree that 6GB is pretty much sufficient considering the resolution this card is targeting; I believe the most it can handle will be 1440p. It would be great to have an 8GB version, which would be the sweet spot for the amount of VRAM; 12GB is overkill, and it's likely that 4GB of it will go under- or unutilized in almost all cases. Between the MSRPs of the RTX 3060 Ti and 3060, I would recommend the former even though it's got less VRAM: it is clearly more powerful, and the 8GB will be sufficient for all games at 1440p, at least for the foreseeable future until the RTX 4xxx series arrives.
#68
medi01
Adc7dTPU980: September 2014 >>>>>> 980 Ti: June 2015
Spoiled Fury X launch.
Adc7dTPU1070: June 2016 >>>>>> 1070 Ti: November 2017
Undermined Vega.
Adc7dTPUSo I'd say that 3080 Ti 20GB is likely to be unveiled at E3 on June 15th 2021 considering that 3060 12 GB is set for late February *cough* availability. Currently no word on if we ever get a 3070 Ti 16 GB though.
While there clearly is some cadence to it (first produce chips, then harvest, then release a bumped-up version), it's not the only reason to release such cards.
#69
Max(IT)
chriszhxiBy 2K here I mean 1440p. Just play the game yourself, max settings with the hidden cinematic RTX mode enabled, and see the VRAM usage.
And, again, you are looking at memory allocation, not usage.
CP2077 isn't using 10GB of VRAM.
#70
mdbrotha03
Max(IT)And, again, you are looking at memory allocation, not usage.
CP2077 isn't using 10GB of VRAM.
Don't know why this game is using 20GB @ 1440p.
#71
Ruru
S.T.A.R.S.
mdbrotha03Don't know why this game is using 20GB @ 1440p.
Allocating and using aren't the same thing. From what I've heard, having Task Manager open while playing games shows the real VRAM usage; the OSD shows how much VRAM the game allocates.
#72
Max(IT)
mdbrotha03Don't know why this game is using 20GB @ 1440p.
No, it is not.
That's allocated VRAM, probably running on a 24GB RTX 3090.
On a 3080 the value would be around 8GB, and on a 3070 it would probably be around 6GB, for the same game.
#73
newtekie1
Semi-Retired Folder
Chloe PriceAllocating and using aren't the same thing. From what I've heard, having Task Manager open while playing games shows the real VRAM usage; the OSD shows how much VRAM the game allocates.
Task Manager doesn't show real VRAM usage either. The fact is, there isn't really a way currently to see real VRAM usage, only allocated size. You'd have to trace the actual game to see how much of that allocated VRAM it is actually accessing, and, at least AFAIK, there isn't anything that does that. The way developers do things these days is to cram as many textures as possible into VRAM, even if a texture isn't anywhere near the player and isn't being seen. They start where the player is and just load outward until VRAM is full or there aren't any more textures to load.

There used to be a very good, quick video showing how games used to handle textures: they would only load into VRAM what the player was actually seeing, plus a small cone around the field of view. This was back in the days when VRAM was actually limited. Now developers have become spoiled by huge VRAM sizes, so they just fill that VRAM with as many textures as they can. But then there are textures sitting in VRAM that are never accessed, so the VRAM usage is drastically inflated.
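
A toy illustration of that fill-until-full behavior; the numbers below are made up, but they show why the reported figure tracks card size rather than what the scene actually needs:

```python
# Toy sketch: why the "VRAM used" counter tracks card size, not need.
# The loader caches the working set first, then fills any spare VRAM
# with speculative assets the player may never get near.
def reported_usage_gb(card_vram_gb: float,
                      needed_gb: float = 4.0,      # actual working set
                      total_assets_gb: float = 30.0) -> float:
    spare = card_vram_gb - needed_gb
    speculative = max(min(spare, total_assets_gb - needed_gb), 0.0)
    return needed_gb + speculative

for vram in (8, 10, 12, 24):
    print(f"{vram} GB card -> OSD reports ~{reported_usage_gb(vram):.0f} GB")
# 8 GB -> ~8, 10 GB -> ~10, 24 GB -> ~24: same scene, same actual working
# set, wildly different readings depending on the card it runs on.
```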

Getting back to why this affects the 3060: there are going to be people saying, "but what if I run a 4K texture mod on [game], that uses like 10GB of VRAM?!?" OK, so you're using a 4K texture mod on a card that is really only capable of playing at 1080p resolution? Do you really think those stupidly high texture resolutions are actually helping you any?
#74
Sunny and 75
medi01first produce chips, then harvest, then release a bumped-up version
It is an ugly practice, and I don't like it one bit, but sadly that's how it is right now. That is why I'm ignoring all these early bait cards, and I'm gonna get me one of them SUPER (Ti) ones, exactly like what they did with the Turing cards.

The 3090 is the ultimate graphics card, as a matter of fact, and the 6900 XT the same, but not everyone can afford them. I, for one, can get an RX 6800 non-XT at best, which can be had for roughly 950 US dollars, but I'm waiting for Nvidia's response first and will check the price/perf ratio before pulling the trigger.
medi01[980 Ti] Spoiled Fury X launch. [1070 Ti] Undermined Vega.
And now AMD have returned the favor by launching RDNA 2. That 12 GB 3060 is just the beginning of a plethora of Nvidia cards changing for the better.


All hail the competition, which drives us to improve even more, to be the best we can be, and pushes the advancement of technology further ahead.
#75
chriszhxi
Max(IT)And, again, you are looking at memory allocation, not usage.
CP2077 isn't using 10GB of VRAM.
No matter which is which, or how you see it: once you hit the cap, the game is unplayable.