Friday, March 26th 2021

NVIDIA GeForce RTX 3070 Ti Could be Offered in Both 8 GB and 16 GB SKUs

Uniko's Hardware, a frequent source of leaks and information on upcoming hardware, reports that NVIDIA could be looking to introduce two versions of its upcoming RTX 3070 Ti graphics card. The difference would come down to single- or dual-sided GDDR6X memory, which would put the card's available memory capacity at either 8 GB (the same as the RTX 3070) or 16 GB, running at 19 Gbps.
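As a rough sanity check on those figures (assuming the RTX 3070's 256-bit bus carries over and 8 Gb GDDR6X chips are used - assumptions based on the rumor, not confirmed specs), the capacities and resulting bandwidth work out as follows:

```python
# Back-of-envelope numbers for the rumored RTX 3070 Ti memory configs.
# Assumes the RTX 3070's 256-bit bus carries over and 8 Gb (1 GB) GDDR6X
# chips are used; these are assumptions based on the rumor, not specs.

BUS_WIDTH_BITS = 256
CHIP_IO_BITS = 32        # each GDDR6X chip presents a 32-bit interface
CHIP_CAPACITY_GB = 1     # 8 Gb chips
DATA_RATE_GBPS = 19      # rumored 19 Gbps per pin

chips_per_side = BUS_WIDTH_BITS // CHIP_IO_BITS       # 8 chips
single_sided_gb = chips_per_side * CHIP_CAPACITY_GB   # 8 GB
dual_sided_gb = 2 * single_sided_gb                   # 16 GB (clamshell)
bandwidth_gbs = DATA_RATE_GBPS * BUS_WIDTH_BITS / 8   # 608 GB/s

print(single_sided_gb, dual_sided_gb, bandwidth_gbs)
```

Note that dual-sided ("clamshell") mounting doubles capacity without changing the bus, so both SKUs would have the same 608 GB/s of bandwidth.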

The intention with the RTX 3070 Ti is to bring the fight back to AMD, which released a pretty good offering to the market in the form of the RX 6800 and RX 6800 XT graphics cards - both featuring 16 GB of GDDR6 memory. NVIDIA is looking to improve its market position against AMD by offering both the RTX 3070 and RTX 3070 Ti. It could also be an opportunity for NVIDIA to release another cryptomining-limited graphics card - and this time to do it right by not releasing a driver that undoes that particular effort. The card is rumored to launch come May, though we've already seen an unprecedented number of delays for NVIDIA's new SKUs - a sign that there is indeed a problem upstream in the semiconductor supply chain.
Source: Videocardz

79 Comments on NVIDIA GeForce RTX 3070 Ti Could be Offered in Both 8 GB and 16 GB SKUs

#26
windwhirl
lexluthermiesterTo all of those who voted "No" in the poll, what the hell? Do people not get it? Technology marches on, why limit yourself?
Partially agree. But I feel like 16 GB of VRAM is a little too much for this product, though doing 12 GB would have forced Nvidia to change bus width or memory clocks... Then again, I don't have a crystal ball, so maybe it's not a waste?
Posted on Reply
#27
qubit
Overclocked quantum bit
Assuming the price wasn't too exorbitant, I would always go for the 16GB version to protect my investment.

I've seen modern games load up my 2080's 8 GB to around 6 GB+ already. So, next-gen games at 4K could well max out an 8GB 30-series card, significantly limiting its performance, forcing a drop in quality settings that the GPU could otherwise handle. That's no good when you've just spent hundreds on the latest tech only to have it compromised by something stupid like that.
Posted on Reply
#28
TheDeeGee
qubitAssuming the price wasn't too exorbitant, I would always go for the 16GB version to protect my investment.

I've seen modern games load up my 2080's 8 GB to around 6 GB+ already. So, next-gen games at 4K could well max out an 8GB 30-series card, significantly limiting its performance, forcing a drop in quality settings that the GPU could otherwise handle. That's no good when you've just spent hundreds on the latest tech only to have it compromised by something stupid like that.
It's time 4K textures were an optional download, not forced down people's throats.

The majority of people still use 1080p and 1440p, where 4K textures are pointless unless you sit 10 centimeters away from your monitor.
Posted on Reply
#29
medi01
RaevenlordThe intention with the RTX 3070 Ti is to bring the fight back to AMD, who released a pretty good offering to the market in the form of the RX 6800 and RX 6800 XT graphics cards - both featuring 16 GB of GDDR6 memory. NVIDIA is looking to improve its market position compared to AMD by offering both the RTX 3070 and RTX 3070 Ti on the market.
In other words, let's pretend 6800XT isn't trading blows with 3080, especially in the newer games and 6900XT does not exist, shall we...

Why do TPU articles sound like they were written by NV marketing department???
spnidelso now we've got
12gb for 3060
8gb for 3060 ti
8gb for 3070
8gb AND 16gb for 3070 ti
10gb for 3080

jesus christ this just keeps getting even more retarded
Let me put this "totally unrelated" pic of a CEO that totally didn't blow up "GA104 as 3080" plans out there somewhere and totally didn't use much cheaper RAM while doing that:

Posted on Reply
#30
Caring1
medi01In other words, let's pretend 6800XT isn't trading blows with 3080, especially in the newer games and 6900XT does not exist, shall we...

Why do TPU articles sound like they were written by NV marketing department???


Let me put this "totally unrelated" pic of a CEO that totally didn't blow up "GA104 as 3080" plans out there somewhere and totally didn't use much cheaper RAM while doing that:

Lol, Radeon VII
That card is a bit of a dud, unless you find one that works.
Posted on Reply
#31
Unregistered
Radeon VII was a major disappointment. AMD is on the right track with RX 6000.
#32
qubit
Overclocked quantum bit
TheDeeGeeIt's time 4K textures were an optional download, not forced down people's throats.

The majority of people still use 1080p and 1440p, where 4K textures are pointless unless you sit 10 centimeters away from your monitor.
Excuses. You know what I'm talking about. You wanna buy the 8GB, it doesn't matter to me.
Posted on Reply
#33
Solid State Soul ( SSS )
AlexaThis is the only time I won't feel bad for not waiting for the "Ti" variant and getting the 3070. I found it at MSRP, and it's a premium Gaming X Trio model. It was also when my previous GPU died, rendering my PC unusable due to Ryzen having no iGPU. That was 2 months ago.

Waiting for the 3070 Ti would've proven disastrous, considering I wouldn't even be able to find one.

I'm good with 8 GB, thanks.
This is Nvidia's plan now: release a card, then release an upgraded refresh of that card a year later.
Posted on Reply
#34
TheinsanegamerN
lexluthermiesterTo all of those who voted "No" in the poll, what the hell? Do people not get it? Technology marches on, why limit yourself?
Do people not get what, exactly? Because so far, all the "I NEED MUHMEMORY" aficionados cannot point to a single set of benchmarks showing memory bottlenecking on the NVIDIA 3000 series. It's pretty easy to show: frame-time variance would be atrocious, and stutter could be measured even by a rank amateur. People post pictures of VRAM usage from Afterburner and go "SEESEESEEPROOFIWIN", seemingly unable, or unwilling, to understand that VRAM allocation != VRAM usage.

Pumping up a card's price by $100+ just for some pointless memory is seen as wasteful by many.
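For what it's worth, the frame-time test described here is easy to script; a minimal Python sketch with made-up numbers (the `frametime_stats` helper is hypothetical, not from any benchmarking tool):

```python
import statistics

def frametime_stats(frametimes_ms):
    """Summarize a capture: average FPS, 1% low FPS, and frame-time stdev.
    A VRAM-starved card shows up as a large stdev and poor 1% lows,
    not just as a high 'usage' number in an overlay."""
    fts = sorted(frametimes_ms)
    avg_fps = 1000 / statistics.mean(fts)
    worst_1pct = fts[int(len(fts) * 0.99):]   # the slowest 1% of frames
    low_1pct_fps = 1000 / statistics.mean(worst_1pct)
    return avg_fps, low_1pct_fps, statistics.stdev(fts)

# Made-up capture: mostly 10 ms frames with occasional 40 ms hitches
sample = [10.0] * 97 + [40.0] * 3
avg, low, dev = frametime_stats(sample)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps, stdev {dev:.1f} ms")
```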
Posted on Reply
#35
Unregistered
Personally I've seen my 3070 have all its VRAM allocated and even go 2 GB+ into the Shared GPU memory during Warzone, and I noticed no performance difference, stutters, or texture issues, at all.
Posted on Reply
#36
SIGSEGV
give me 3070 Ti 4GB, 3070 Ti 6GB, 3070 Ti 8GB, 3070 Ti 16GB, 3070 Super 4GB, 3070 Super 6GB, 3070 Super 8GB, 3070 Super 16GB, 3070 Ti Super 4GB, 3070 Ti Super 6GB, 3070 Ti Super 8GB, 3070 Ti Super 16GB.
yeeeehaaaa...
Posted on Reply
#38
95Viper
Discuss the topic, NOT each other.
Stop the insults.
Follow the Guidelines... read them if you need to.

Thank You and Have a Wonderful Day
Posted on Reply
#39
olstyle
Why are we discussing double-sided VRAM? The 3070 uses normal GDDR6, and the 256-bit bus width sounds like they will use the smaller chips again. Which means they can get 2 GB chips off the shelf, like they have already done(!) for the notebook "3080".
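In numbers (assuming the rumored 256-bit bus split across 32-bit-wide GDDR6 chips), chip density alone decides capacity:

```python
# A 256-bit bus divided into 32-bit chip interfaces gives 8 single-sided
# chip placements; swapping 8 Gb chips for 16 Gb chips doubles capacity
# with no bus or clamshell layout change. Figures assume a 3070-style bus.
bus_bits, chip_bits = 256, 32
chips = bus_bits // chip_bits                # 8 chips, single-sided
for density_gb in (1, 2):                    # 8 Gb vs. 16 Gb GDDR6 chips
    print(f"{density_gb} GB chips -> {chips * density_gb} GB total")
```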
Posted on Reply
#40
saki630
Am I ever going to be able to buy one? I'm still using a second-hand 1080 Ti, and my upgrade has been sold out/scalped/resold for months. It looks like I'm going to have to pick up a 3070 in 2022 at this rate.
Posted on Reply
#41
Unregistered
saki630Am I ever going to be able to buy one? I'm still using a second-hand 1080 Ti, and my upgrade has been sold out/scalped/resold for months. It looks like I'm going to have to pick up a 3070 in 2022 at this rate.
leaked picture of people finally being able to buy RTX 3000 at MSRP and with abundant stock
Posted on Reply
#42
lexluthermiester
TheinsanegamerNDo people not get what, exactly? Because so far, all the "I NEED MUHMEMORY" aficionados cannot point to a single set of benchmarks showing memory bottlenecking on the NVIDIA 3000 series. It's pretty easy to show: frame-time variance would be atrocious, and stutter could be measured even by a rank amateur. People post pictures of VRAM usage from Afterburner and go "SEESEESEEPROOFIWIN", seemingly unable, or unwilling, to understand that VRAM allocation != VRAM usage.

Pumping up a card's price by $100+ just for some pointless memory is seen as wasteful by many.
Ok, YOU buy the cards with less memory. Those of us who remember the past and remember how much longer cards with more memory stayed relevant will spend the extra and get more out of our purchases in the long run.
Posted on Reply
#43
evernessince
TheinsanegamerNDo people not get what, exactly? Because so far, all the "I NEED MUHMEMORY" aficionados cannot point to a single set of benchmarks showing memory bottlenecking on the NVIDIA 3000 series. It's pretty easy to show: frame-time variance would be atrocious, and stutter could be measured even by a rank amateur. People post pictures of VRAM usage from Afterburner and go "SEESEESEEPROOFIWIN", seemingly unable, or unwilling, to understand that VRAM allocation != VRAM usage.

Pumping up a card's price by $100+ just for some pointless memory is seen as wasteful by many.
A quick list of the current facts:

1. There is no memory bottlenecking as of right now with 3000 series cards (at least in regards to VRAM size).
2. Video card memory sizes on the Nvidia side have remained stagnant for two generations.
3. There are multiple video games that currently exceed 8 GB of VRAM usage.
4. Since the 2017 Windows 10 Creators Update, Task Manager shows VRAM used, not just allocated. MSI Afterburner has the capability as well.
5. AMD is offering competing products at lower prices (MSRP, of course) that include more VRAM.
6. There is historical evidence that in a situation like the one the 3070 8GB finds itself in, memory issues 2-3 years down the line are likely. The 1060 3GB is a great example of this: zero issues at launch, but some games at the time did use more than 3 GB. Memory usage increased year over year until eventually the memory was over-provisioned enough that you got the characteristic memory stuttering and terrible frame pacing.

VRAM issues don't arise from merely exceeding the VRAM installed on the card, but from over-provisioning the VRAM to the point where critical game data is being swapped between main system memory and VRAM because the GPU no longer has space even for the high-priority data. Modern video cards are pretty good at keeping high-frequency-access data where it's most needed, which is why you don't start seeing serious issues until you are quite a bit over your actual amount, but it does get to a point where the video card can't even store the data it needs from frame to frame, which causes the trademark stuttering issues.

I'd also add that 16 GB of system RAM has been the standard for gaming PCs for a long time, and this could equally erode the buffer that gamers with something like a 3070 have: if your video card is over-provisioned and you don't have enough main system memory either, you are going to be relying on virtual memory. I can tell you that with a 1080 Ti and 32 GB of RAM, I'm seeing 12 GB of RAM usage and 8 GB of VRAM usage in CP2077. 8 GB is fine now, but 4 GB of RAM is not much of a buffer - and I'm not even running anything in the background; no Steam, no Discord, nothing.

For people who want a video card that lasts, wanting more VRAM is certainly reasonable, as I've outlined above.
Posted on Reply
#44
medi01
AlexaPersonally I've seen my 3070 have all its VRAM allocated and even go 2 GB+ into the Shared GPU memory during Warzone, and I noticed no performance difference, stutters, or texture issues, at all.
No performance difference.. compared to what? :D

DF's pathetically misleading "3080 vs 2080... preview" embarrassment was misusing the fact that Doom's textures don't fit into the 2080's 8 GB.
evernessince1. There is no memory bottlenecking as of right now with 3000 series cards (at least in regards to VRAM size)
Should we pretend this is true?
A goddamn PS5/XSeX has 10 GB+ reserved for the GPU.
As for "oh, but that has no impact" see example above.

3060 - 12GB
but
3060Ti - 8GB
3070 - 8GB
3080 - 10GB

How the heck could that make any sense to anyone? Does anyone believe that this was the plan?
8GB is enough for 3070? Then surely 6GB should have been enough for 3060, shouldn't it?

What happened is that NVIDIA WAS FORCED TO DROP A TIER. The 3070 is a 3080 wannabe with half the VRAM, the 3060 Ti is a 3070 wannabe with half the VRAM, and the 3080 is a 3080 Ti/Titan wannabe with half the VRAM.

Why? Because GA104 is not able to compete with surprisingly good (given how little time they've had) RDNA2 line of GPUs.
On top of NV using much more expensive VRAM.

Mining craze is the only reason that we are not seeing NV margins being hurt.
Posted on Reply
#45
Unregistered
medi01No performance difference.. compared to what?
Compared to me not being VRAM limited. Still rocking over 150 frames with all settings cranked up and DXR on, even when VRAM limited.
#46
efikkan
lynx29I think people are just sick of the vaporware and frustrated. Best to leave it be and let them vent mate.
It's not vaporware.
Graphics cards are shipping in normal quantities, but demand is significantly higher, partially due to a production deficit which has lasted for a long while.

If you need a card, you have to find a store that accepts backorders and have some patience.
What I find more annoying is stores that have cards in stock but reserve them for prebuilt systems.
windwhirlPartially agree. But I feel like 16 GB of VRAM is a little too much for this product, though doing 12 GB would have forced Nvidia to change bus width or memory clocks... Then again, I don't have a crystal ball, so maybe it's not a waste?
For gaming, it's a waste.
People fail to grasp that increasing VRAM size but keeping everything else the same is pointless. The only way to utilize more VRAM without requiring more bandwidth (and likely more computational performance) would be to play at higher details and lower frame rate. If you want to retain or increase frame rate, you also need more bandwidth and computational performance. VRAM size for future proofing of graphics cards is just BS, and especially pointless at a time where there are shortages on GDDR6 supply.
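That trade-off is easy to see with back-of-envelope numbers (the 448 GB/s figure and per-frame traffic amounts below are illustrative assumptions, not measurements):

```python
# Bandwidth caps how much data a GPU can touch per rendered frame.
# If a larger VRAM pool is actually *used* each frame, frame rate must
# drop unless bandwidth grows with it.

def max_fps(bandwidth_gb_s: float, traffic_per_frame_gb: float) -> float:
    """Upper bound on frame rate if every frame moves this much data."""
    return bandwidth_gb_s / traffic_per_frame_gb

# Fixed 3070-class 448 GB/s of bandwidth, growing per-frame traffic:
for traffic in (2, 4, 8):
    print(f"{traffic} GB/frame -> at most {max_fps(448, traffic):.0f} fps")
```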
qubitI've seen modern games load up my 2080's 8 GB to around 6 GB+ already. So, next-gen games at 4K could well max out an 8GB 30-series card, significantly limiting its performance, forcing a drop in quality settings that the GPU could otherwise handle. That's no good when you've just spent hundreds on the latest tech only to have it compromised by something stupid like that.
Allocated memory and used memory are not the same.
Posted on Reply
#47
windwhirl
efikkanbut reserve them for prebuilt systems.
... I found myself irritated about that but with RAM sticks.
efikkanFor gaming, it's a waste.
People fail to grasp that increasing VRAM size but keeping everything else the same is pointless. The only way to utilize more VRAM without requiring more bandwidth (and likely more computational performance) would be to play at higher details and lower frame rate. If you want to retain or increase frame rate, you also need more bandwidth and computational performance. VRAM size for future proofing of graphics cards is just BS, and especially pointless at a time where there are shortages on GDDR6 supply.
Yeah, I thought so. At least with this card, which is geared for 1080p/1440p, it doesn't seem like a good idea to have 16 GB of VRAM. Maybe 12 GB, if you just want everything to look gorgeous (supposing the games you play have some ultra-HD texture pack or something like that) and settle for 60 FPS and no more... Though I doubt this GPU even has enough power for that.
Posted on Reply
#48
efikkan
windwhirlYeah, I thought so. At least with this card, which is geared for 1080p/1440p, it doesn't seem like a good idea to have 16 GB of VRAM. Maybe 12 GB, if you just want everything to look gorgeous (supposing the games you play have some ultra-HD texture pack or something like that) and settle for 60 FPS and no more... Though I doubt this GPU even has enough power for that.
Still, increasing texture resolution requires more bandwidth too, so unless the higher capacity card also has higher bandwidth, you still need lower frame rate to enjoy your gorgeous textures.

Even worse, problems with third-party texture packs are probably due more to game engine management than to an actual lack of VRAM. Pretty much all games today do active asset management (all kinds of LoD features), which is calibrated to offer the "right" balance of detail vs. performance. If you suddenly throw much larger textures into the mix, there is no telling what will happen. It might turn out okay with merely excessive resource utilization, but it might also result in stutter, aliasing issues, lack of detail, etc.
Posted on Reply
#49
qubit
Overclocked quantum bit
efikkanAllocated memory and used memory are not the same.
That doesn't negate my point.
Posted on Reply
#50
Solaris17
Super Dainty Moderator
efikkanGraphics cards are shipping in normal quantities, ............................................. production deficit which has lasted for a long while.
So which is it?
Posted on Reply