Friday, March 26th 2021
NVIDIA GeForce RTX 3070 Ti Could be Offered in Both 8 GB and 16 GB SKUs
Uniko's Hardware, a frequent source of leaks and information on upcoming hardware, has put forward that NVIDIA could be looking to introduce two versions of its upcoming RTX 3070 Ti graphics card. The difference would be single- or dual-sided GDDR6X memory, which would put the card's available capacity at either 8 GB (the same as the RTX 3070) or 16 GB, running at 19 Gbps.
The intention with the RTX 3070 Ti is to bring the fight back to AMD, which released a pretty good offering to the market in the form of the RX 6800 and RX 6800 XT graphics cards - both featuring 16 GB of GDDR6 memory. NVIDIA is looking to improve its market position against AMD by offering both the RTX 3070 and RTX 3070 Ti. It could also be an opportunity for NVIDIA to release another cryptomining-crippled graphics card - and this time to do it right by not releasing a driver that unlocks that particular limiter. The card is rumored for a May launch, though we've already seen an unprecedented number of delays for NVIDIA's new SKUs - a sign that there are indeed problems in the upstream semiconductor supply chain.
Source:
Videocardz
79 Comments on NVIDIA GeForce RTX 3070 Ti Could be Offered in Both 8 GB and 16 GB SKUs
I've seen modern games load up the 8GB on my 2080 to around 6GB+ already. So, next gen games at 4K could well max out an 8GB 30 series card, significantly limiting its performance, forcing a drop in quality settings that the GPU could otherwise handle. That's no good when you've just spent hundreds on the latest tech only to have it compromised by something stupid like that.
The majority of people still game at 1080p and 1440p, where 4K textures are pointless unless you sit 10 centimeters away from your monitor.
Why do TPU articles sound like they were written by the NV marketing department??? Let me put this "totally unrelated" pic of a CEO who totally didn't blow up the "GA104 as 3080" plans out there somewhere, and totally didn't switch to much cheaper RAM while doing it:
That card is a bit of a dud, unless you find one that works.
Pumping up a card's price by $100+ just for some pointless memory is seen as wasteful by many.
yeeeehaaaa...
Stop the insults.
Follow the Guidelines... read them if you need to.
Thank You and Have a Wonderful Day
1. There is no memory bottlenecking as of right now with 3000 series cards (at least in regards to VRAM size)
2. Video card memory sizes on the Nvidia side have remained stagnant for 2 generations
3. There are multiple video games that exceed 8GB of VRAM usage currently.
4. Since the 2017 Creators Update, the Windows 10 Task Manager shows VRAM used, not just allocated. MSI Afterburner has this capability as well.
5. AMD is offering competing products at lower prices (MSRP of course) that include more VRAM.
6. There is historical evidence that in a situation like the one the 3070 8GB finds itself in, memory issues 2-3 years down the line are likely. The 1060 3GB is a great example of this. Zero issues at launch, even though some games at the time did use more than 3GB. Memory usage increased year over year until eventually the memory was over-provisioned enough that you get the characteristic memory stuttering and terrible frame pacing. VRAM issues don't arise from merely exceeding the VRAM installed on the card, but from over-provisioning it to the point where critical game data is being swapped between main system memory and VRAM because the GPU no longer has space even for the high-priority data. Modern video cards are pretty good at keeping high-frequency-access data where it's most needed, which is why you don't see serious issues until you are quite a bit over your actual amount - but it does get to a point where the card can't even store the data it needs from frame to frame, causing the trademark stuttering issues.

I'd also like to add that 16GB of system RAM has been the standard for gaming PCs for a long time, and this could equally erode the buffer that gamers with something like a 3070 have. If your video card is over-provisioned and you don't have enough main system memory either, you are going to be relying on virtual memory. I can tell you that with a 1080 Ti and 32GB of RAM I'm seeing 12GB of RAM usage and 8GB of VRAM usage in CP2077. 8GB is fine now, but 4GB of RAM is not much of a buffer. Heck, I'm not even running anything in the background - no Steam, no Discord, nothing.
For people who want a video card that lasts, wanting more VRAM is certainly something they should want as I've demonstrated above.
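The over-provisioning mechanism described above can be sketched as a toy LRU model. This is purely illustrative - the `ToyVram` class and the uniform 1 GB asset sizes are made up, and real drivers manage residency far more intelligently - but it shows why going even slightly over capacity can turn per-frame PCIe traffic from zero into the entire working set:

```python
from collections import OrderedDict

class ToyVram:
    """Toy LRU model of a VRAM pool: assets are (name, size_gb) pairs.
    Tracks how much data must be re-uploaded from system RAM."""
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.resident = OrderedDict()  # name -> size, in LRU order

    def touch(self, name, size_gb):
        """Access one asset; returns GB transferred over the bus."""
        if name in self.resident:
            self.resident.move_to_end(name)  # hit: refresh LRU position
            return 0.0
        self.resident[name] = size_gb        # miss: upload the asset
        # Evict least-recently-used assets until we fit again
        while sum(self.resident.values()) > self.capacity:
            self.resident.popitem(last=False)
        return size_gb

def frame_traffic(vram, assets):
    """Simulate one frame touching every asset; return GB uploaded."""
    return sum(vram.touch(n, s) for n, s in assets)

# A frame's working set: 9 assets of 1 GB each (9 GB total)
assets = [(f"tex{i}", 1.0) for i in range(9)]

fits = ToyVram(16.0)   # comfortably fits the working set
tight = ToyVram(8.0)   # 1 GB short of the working set
for _ in range(3):     # warm up, then observe steady state
    fits_gb = frame_traffic(fits, assets)
    tight_gb = frame_traffic(tight, assets)

print(fits_gb, tight_gb)  # prints: 0.0 9.0
```

Note the cliff: the 8 GB pool is only 1 GB short, yet sequential access under LRU evicts exactly the asset needed next, so all 9 GB are re-uploaded every single frame - a toy version of the stutter described above.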
DF's pathetically misleading "3080 vs 2080... preview" embarrassment was mis-using the fact of Doom's textures not fitting into the 2080's 8GB. Should we pretend this is true?
The goddamn PS5/XSeX have 10GB+ reserved for their GPUs.
As for "oh, but that has no impact" see example above.
3060 - 12GB
but
3060Ti - 8GB
3070 - 8GB
3080 - 10GB
How the heck could that make any sense to anyone? Does anyone believe that this was the plan?
8GB is enough for 3070? Then surely 6GB should have been enough for 3060, shouldn't it?
What happened is NVIDIA WAS FORCED TO DROP A TIER. 3070 is 3080 wannabe with half the VRAM, 3060Ti is 3070 wannabe with half the VRAM, 3080 is 3080Ti/Titan wannabe with half the VRAM.
Why? Because GA104 is not able to compete with surprisingly good (given how little time they've had) RDNA2 line of GPUs.
On top of NV using much more expensive VRAM.
The mining craze is the only reason we are not seeing NV's margins being hurt.
Graphics cards are shipping in normal quantities, but demand is significantly higher, partly due to a production deficit that has lasted a long while.
If you need a card, you have to find a store which accepts backorders and have some patience.
What I find more annoying is stores that have cards in stock but reserve them for prebuilt systems. For gaming, it's a waste.
People fail to grasp that increasing VRAM size while keeping everything else the same is pointless. The only way to utilize more VRAM without requiring more bandwidth (and likely more computational performance) would be to play at higher details and a lower frame rate. If you want to retain or increase frame rate, you also need more bandwidth and computational performance. VRAM size for future-proofing of graphics cards is just BS, and especially pointless at a time when there are shortages in GDDR6 supply. Allocated memory and used memory are not the same.
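The allocated-vs-used distinction can be illustrated with a toy pool allocator. This is a made-up sketch (real engines use far more elaborate suballocation schemes), but it shows why a monitoring tool that reads the reserved pool size will overstate what the game actually needs:

```python
class ToyPool:
    """Toy allocator: reserves a large block up front (what naive
    tools report as 'allocated') and hands out slices on demand
    (the actual 'used' amount)."""
    def __init__(self, reserve_gb):
        self.reserved = reserve_gb  # grabbed from the driver at startup
        self.in_use = 0.0           # what the game has actually filled

    def alloc(self, gb):
        """Carve a slice out of the pre-reserved pool."""
        if self.in_use + gb > self.reserved:
            raise MemoryError("pool exhausted")
        self.in_use += gb
        return gb

pool = ToyPool(8.0)   # engine reserves 8 GB at startup
pool.alloc(3.5)       # but only streams in 3.5 GB of assets so far
print(pool.reserved, pool.in_use)  # prints: 8.0 3.5
```

A tool reading the pool's footprint would report 8 GB "allocated" even though only 3.5 GB holds live data - which is why the used-vs-allocated counters mentioned in the comments above matter.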
Even worse, problems with third-party texture packs are probably due more to game-engine asset management than to an actual lack of VRAM. Pretty much all games today do active asset management (all kinds of LoD features), calibrated to offer the "right" balance of detail vs. performance. If you suddenly throw much larger textures into the mix, there is no telling what will happen. It might turn out okay, albeit with excessive resource utilization, but it might also result in stutter, aliasing issues, lack of detail, etc.