
NVIDIA GeForce RTX 3070 Ti Could be Offered in Both 8 GB and 16 GB SKUs

It's not vaporware.

The world has changed, and it's not my problem if others have a problem understanding.

good luck buying one then lol
 
To all of those who voted "No" in the poll, what the hell? Do people not get it? Technology marches on, why limit yourself?

Partially agree. But I feel like 16 GB of VRAM is a little too much for this product, though doing 12 GB would have forced Nvidia to change bus width or memory clocks... Then again, I don't have a crystal ball, so maybe it's not a waste?
 
Assuming the price wasn't too exorbitant, I would always go for the 16GB version to protect my investment.

I've seen modern games load up the 8GB on my 2080 to around 6GB+ already. So, next gen games at 4K could well max out an 8GB 30 series card, significantly limiting its performance, forcing a drop in quality settings that the GPU could otherwise handle. That's no good when you've just spent hundreds on the latest tech only to have it compromised by something stupid like that.
 
Assuming the price wasn't too exorbitant, I would always go for the 16GB version to protect my investment.

I've seen modern games load up the 8GB on my 2080 to around 6GB+ already. So, next gen games at 4K could well max out an 8GB 30 series card, significantly limiting its performance, forcing a drop in quality settings that the GPU could otherwise handle. That's no good when you've just spent hundreds on the latest tech only to have it compromised by something stupid like that.
It's time 4K textures were an optional download, not forced down people's throats.

The majority of people still use 1080p or 1440p, where 4K textures are pointless unless you sit 10 centimeters away from your monitor.
 
The intention with the RTX 3070 Ti is to bring the fight back to AMD, who released a pretty good offering to the market in the form of the RX 6800 and RX 6800 XT graphics cards - both featuring 16 GB of GDDR6 memory. NVIDIA is looking to improve its market position compared to AMD by offering both the RTX 3070 and RTX 3070 Ti on the market.
In other words, let's pretend the 6800 XT isn't trading blows with the 3080, especially in newer games, and that the 6900 XT doesn't exist, shall we...

Why do TPU articles sound like they were written by the NV marketing department???

So now we've got:
12 GB for the 3060
8 GB for the 3060 Ti
8 GB for the 3070
8 GB AND 16 GB for the 3070 Ti
10 GB for the 3080

Jesus Christ, this just keeps getting more and more absurd.
Let me put this "totally unrelated" pic of a CEO that totally didn't blow up "GA104 as 3080" plans out there somewhere and totally didn't use much cheaper RAM while doing that:

 
In other words, let's pretend the 6800 XT isn't trading blows with the 3080, especially in newer games, and that the 6900 XT doesn't exist, shall we...

Why do TPU articles sound like they were written by the NV marketing department???


Let me put this "totally unrelated" pic of a CEO that totally didn't blow up "GA104 as 3080" plans out there somewhere and totally didn't use much cheaper RAM while doing that:

Lol, Radeon VII
That card is a bit of a dud, unless you find one that works.
 
Radeon VII was a major disappointment. AMD is on the right track with RX 6000.
 
It's time 4K textures were an optional download, not forced down people's throats.

The majority of people still use 1080p or 1440p, where 4K textures are pointless unless you sit 10 centimeters away from your monitor.
Excuses. You know what I'm talking about. If you want to buy the 8 GB version, it doesn't matter to me.
 
This is the only time I won't feel bad for not waiting for the "Ti" variant and getting the 3070. I found it at MSRP, and it's a premium Gaming X Trio model. It was also when my previous GPU died, rendering my PC unusable due to Ryzen having no iGPU. That was 2 months ago.

Waiting for the 3070 Ti would've proven disastrous, considering I wouldn't even be able to find one.

I'm good with 8 GB, thanks.
This is Nvidia's plan now: release a card, then release an upgraded refresh of it a year later.
 
To all of those who voted "No" in the poll, what the hell? Do people not get it? Technology marches on, why limit yourself?
Do people not get what, exactly? Because so far, all the "I NEED MUH MEMORY" aficionados cannot point to a single set of benchmarks showing memory bottlenecking on the Nvidia 3000 series. It's pretty easy to show: frame time variance would be atrocious, and stutter could be measured even by a rank amateur. People post pictures of VRAM usage from Afterburner and go "SEE SEE SEE PROOF I WIN", seemingly unable, or unwilling, to understand that VRAM allocation != VRAM usage.

Pumping up a card's price by $100+ just for some pointless memory is seen as wasteful by many.
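For what it's worth, a real VRAM bottleneck shows up in the frame time data rather than in an overlay's allocation number. Below is a minimal sketch, assuming you've exported per-frame times in milliseconds from a capture tool such as PresentMon or CapFrameX (the file name and column name here are placeholders), that computes the average FPS, 1% lows, and frame time variance: exactly the numbers that fall apart when a card starts thrashing its VRAM.

```python
# Minimal sketch: quantify stutter from a per-frame frametime log (milliseconds).
# Assumes a CSV with a "msBetweenPresents"-style column; adjust the column name
# to whatever your capture tool (PresentMon, CapFrameX, etc.) actually writes.
import csv
import statistics

def frametime_stats(path, column="msBetweenPresents"):
    with open(path, newline="") as f:
        times_ms = [float(row[column]) for row in csv.DictReader(f)]

    avg_fps = 1000.0 / statistics.mean(times_ms)
    variance = statistics.variance(times_ms)

    # "1% low" FPS: FPS equivalent of the slowest 1% of frame times.
    slowest = sorted(times_ms)[int(len(times_ms) * 0.99):]
    one_percent_low_fps = 1000.0 / statistics.mean(slowest)

    return {
        "avg_fps": round(avg_fps, 1),
        "1%_low_fps": round(one_percent_low_fps, 1),
        "frametime_variance_ms2": round(variance, 2),
    }

if __name__ == "__main__":
    print(frametime_stats("capture.csv"))  # placeholder file name
```

If the card is genuinely out of VRAM, the 1% lows and the variance diverge sharply from the average even when the average FPS still looks healthy.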
 
Personally I've seen my 3070 have all its VRAM allocated and even go 2 GB+ into the Shared GPU memory during Warzone, and I noticed no performance difference, stutters, or texture issues, at all.
 
Give me 3070 Ti 4GB, 3070 Ti 6GB, 3070 Ti 8GB, 3070 Ti 16GB, 3070 Super 4GB, 3070 Super 6GB, 3070 Super 8GB, 3070 Super 16GB, 3070 Ti Super 4GB, 3070 Ti Super 6GB, 3070 Ti Super 8GB, 3070 Ti Super 16GB.
yeeeehaaaa...
 
The RTX 3070 is off to a solid start; it's already overtaken cards like the RX 5700 XT on Steam.
 
Why are we discussing double-sided VRAM? The 3070 uses normal GDDR6, and the 256-bit bus width sounds like they will use the smaller chips again. Which means you can get 2 GB chips off the shelf, like they have already done(!) for the notebook "3080".
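To make the bus width point concrete: a 256-bit bus is eight 32-bit GDDR6 channels, so with single-sided boards and off-the-shelf 1 GB or 2 GB chips the only natural capacities are 8 GB or 16 GB, which is presumably why a 12 GB 3070 Ti was never on the table. A rough back-of-the-envelope sketch (the available chip densities are the assumption here):

```python
# Back-of-the-envelope: which capacities a 256-bit bus allows with single-sided
# GDDR6, assuming only 1 GB (8 Gb) and 2 GB (16 Gb) chips are available.
BUS_WIDTH_BITS = 256          # RTX 3070 / 3070 Ti class
CHANNEL_WIDTH_BITS = 32       # one GDDR6 chip per 32-bit channel
CHIP_CAPACITIES_GB = (1, 2)   # off-the-shelf densities

chips = BUS_WIDTH_BITS // CHANNEL_WIDTH_BITS       # -> 8 chips
options = [chips * c for c in CHIP_CAPACITIES_GB]  # -> [8, 16] GB
print(f"{chips} chips on a {BUS_WIDTH_BITS}-bit bus -> possible sizes: {options} GB")
```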
 
Am I ever going to be able to buy one? I'm still using a second-hand 1080 Ti, and my upgrade has been sold out/scalped/resold for months. It looks like I'm going to have to pick up a 3070 in 2022 at this rate.
 
Do people not get what, exactly? Because so far, all the "I NEED MUH MEMORY" aficionados cannot point to a single set of benchmarks showing memory bottlenecking on the Nvidia 3000 series. It's pretty easy to show: frame time variance would be atrocious, and stutter could be measured even by a rank amateur. People post pictures of VRAM usage from Afterburner and go "SEE SEE SEE PROOF I WIN", seemingly unable, or unwilling, to understand that VRAM allocation != VRAM usage.

Pumping up a card's price by $100+ just for some pointless memory is seen as wasteful by many.
Ok, YOU buy the cards with less memory. Those of us who remember the past and remember how much longer cards with more memory stayed relevant will spend the extra and get more out of our purchases in the long run.
 
Do people not get what, exactly? Because so far, all the "I NEED MUH MEMORY" aficionados cannot point to a single set of benchmarks showing memory bottlenecking on the Nvidia 3000 series. It's pretty easy to show: frame time variance would be atrocious, and stutter could be measured even by a rank amateur. People post pictures of VRAM usage from Afterburner and go "SEE SEE SEE PROOF I WIN", seemingly unable, or unwilling, to understand that VRAM allocation != VRAM usage.

Pumping up a card's price by $100+ just for some pointless memory is seen as wasteful by many.

A quick list of the current facts:

1. There is no memory bottlenecking right now with 3000 series cards (at least with regard to VRAM size).
2. Video card memory sizes on the Nvidia side have remained stagnant for two generations.
3. There are multiple video games that currently exceed 8 GB of VRAM usage.
4. Since the 2017 Creators Update, the Windows 10 Task Manager shows VRAM used, not just allocated. MSI Afterburner has the capability as well (see the sketch after this post).
5. AMD is offering competing products at lower prices (MSRP, of course) that include more VRAM.
6. There is historical evidence that in a situation like the one the 3070 8GB finds itself in, memory issues 2-3 years down the line are likely. The 1060 3GB is a great example: zero issues at launch, but some games at the time did use more than 3 GB. Memory usage increased year over year until the card was over-provisioned enough that you got the characteristic memory stuttering and terrible frame pacing.

VRAM issues don't arise the moment you exceed the VRAM installed on the card; they arise when the VRAM is over-provisioned to the point where critical game data has to be swapped between main system memory and VRAM because the GPU no longer has space even for the high-priority data. Modern video cards are pretty good at keeping frequently accessed data where it's needed most, which is why you don't see serious issues until you are quite a bit over your actual amount, but it does get to a point where the card can't even store the data it needs from frame to frame, which causes the trademark stuttering issues.

I'd also add that 16 GB of system RAM has been the standard for gaming PCs for a long time, and that could equally erode the buffer that gamers with something like a 3070 have. If your video card is over-provisioned and you don't have enough main system memory either, you will be relying on virtual memory. With a 1080 Ti and 32 GB of RAM, I'm seeing 12 GB of RAM usage and 8 GB of VRAM usage in CP2077. 8 GB of VRAM is fine now, but 4 GB of headroom (on a 16 GB system) is not much of a buffer. Heck, I'm not even running anything in the background: no Steam, no Discord, nothing.

For people who want a video card that lasts, more VRAM is certainly something worth wanting, as I've demonstrated above.
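If you want to sanity-check point 4 yourself outside of Task Manager or Afterburner, here is a hedged sketch using the pynvml / nvidia-ml-py bindings. Note that this reports the VRAM the driver currently has in use across the whole GPU, not a per-game working set, so on its own it still doesn't settle the allocated-vs-required argument:

```python
# Hedged sketch: poll GPU-wide VRAM usage via NVML (pip install nvidia-ml-py).
# This is driver-level "memory in use", not a per-game working set.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    for _ in range(10):  # sample roughly once per second for ~10 seconds
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM in use: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```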
 
Personally I've seen my 3070 have all its VRAM allocated and even go 2 GB+ into the Shared GPU memory during Warzone, and I noticed no performance difference, stutters, or texture issues, at all.
No performance difference... compared to what? :D

DF's pathetically misleading "3080 vs 2080... preview" embarrassment was misusing the fact that Doom's textures don't fit into the 2080's 8 GB.

1. There is no memory bottlenecking right now with 3000 series cards (at least with regard to VRAM size).
Should we pretend this is true?
A goddamn PS5/XSX has 10 GB+ reserved for its GPU.
As for "oh, but that has no impact", see the example above.

3060 - 12 GB
but
3060 Ti - 8 GB
3070 - 8 GB
3080 - 10 GB

How the heck could that make any sense to anyone? Does anyone believe that this was the plan?
8 GB is enough for the 3070? Then surely 6 GB should have been enough for the 3060, shouldn't it?

What happened is NVIDIA WAS FORCED TO DROP A TIER. The 3070 is a 3080 wannabe with half the VRAM, the 3060 Ti is a 3070 wannabe with half the VRAM, and the 3080 is a 3080 Ti/Titan wannabe with half the VRAM.

Why? Because GA104 is not able to compete with the surprisingly good (given how little time they've had) RDNA2 line of GPUs. On top of that, NV is using much more expensive VRAM.

The mining craze is the only reason we are not seeing NV's margins get hurt.
 
No performance difference... compared to what?
Compared to when I'm not VRAM limited. Still rocking over 150 frames with all settings cranked up and DXR on, even when VRAM limited.
 
I think people are just sick of the vaporware and frustrated. Best to leave it be and let them vent, mate.
It's not vaporware.
Graphics cards are shipping in normal quantities, but demand is significantly higher, partially due to a production deficit which has lasted for a long while.

If you need a card, you have to find a store that accepts backorders and have some patience.
What I find more annoying is stores that have cards in stock but reserve them for prebuilt systems.

Partially agree. But I feel like 16 GB of VRAM is a little too much for this product, though doing 12 GB would have forced Nvidia to change bus width or memory clocks... Then again, I don't have a crystal ball, so maybe it's not a waste?
For gaming, it's a waste.
People fail to grasp that increasing VRAM size while keeping everything else the same is pointless. The only way to utilize more VRAM without requiring more bandwidth (and likely more computational performance) would be to play at higher detail and a lower frame rate. If you want to retain or increase the frame rate, you also need more bandwidth and computational performance. VRAM size for future-proofing of graphics cards is just BS, and it is especially pointless at a time when there are shortages of GDDR6 supply.
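To put a number on the bandwidth side of that argument: peak bandwidth is set by bus width and per-pin data rate, so doubling the chips' density doesn't move it at all. A quick sketch using the 3070's published figures (the 16 GB variant on the same bus is the assumption):

```python
# Peak memory bandwidth depends on bus width and per-pin data rate, not capacity.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    return bus_width_bits * data_rate_gbps_per_pin / 8  # bits -> bytes

# RTX 3070: 256-bit bus, 14 Gbps GDDR6
print(bandwidth_gb_s(256, 14))  # 448.0 GB/s
# A hypothetical 16 GB card on the same bus and clocks: still 448 GB/s.
```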

I've seen modern games load up the 8GB on my 2080 to around 6GB+ already. So, next gen games at 4K could well max out an 8GB 30 series card, significantly limiting its performance, forcing a drop in quality settings that the GPU could otherwise handle. That's no good when you've just spent hundreds on the latest tech only to have it compromised by something stupid like that.
Allocated memory and used memory are not the same.
 
but reserve them for prebuilt systems.
... I found myself irritated by the same thing, but with RAM sticks.

For gaming, it's a waste.
People fail to grasp that increasing VRAM size while keeping everything else the same is pointless. The only way to utilize more VRAM without requiring more bandwidth (and likely more computational performance) would be to play at higher detail and a lower frame rate. If you want to retain or increase the frame rate, you also need more bandwidth and computational performance. VRAM size for future-proofing of graphics cards is just BS, and it is especially pointless at a time when there are shortages of GDDR6 supply.
Yeah, I thought so. At least with this card, which is geared for 1080p/1440p, 16 GB of VRAM doesn't seem like a good idea. Maybe 12 GB, if you just want everything to look gorgeous (supposing the games you play have some ultra HD texture pack or something like that) and you settle for 60 FPS and no more... Though I doubt this GPU even has enough power for that.
 
Yeah, I thought so. At least with this card, which is geared for 1080p/1440p, 16 GB of VRAM doesn't seem like a good idea. Maybe 12 GB, if you just want everything to look gorgeous (supposing the games you play have some ultra HD texture pack or something like that) and you settle for 60 FPS and no more... Though I doubt this GPU even has enough power for that.
Still, increasing texture resolution requires more bandwidth too, so unless the higher-capacity card also has higher bandwidth, you still need a lower frame rate to enjoy your gorgeous textures.

Even worse, problems with third-party texture packs are probably due more to game engine management than to an actual lack of VRAM. Pretty much all games today do active asset management (all kinds of LoD features), which is calibrated to offer the "right" balance of detail vs. performance. If you suddenly throw much larger textures into the mix, there is no telling what will happen. It might turn out okay but with excessive resource utilization, or it might result in stutter, aliasing issues, lack of detail, etc.
 