Friday, March 26th 2021

NVIDIA GeForce RTX 3070 Ti Could be Offered in Both 8 GB and 16 GB SKUs

Uniko's Hardware, a frequent source of leaks and information on upcoming hardware, reports that NVIDIA could be looking to introduce two versions of its upcoming RTX 3070 Ti graphics card. The difference would be single- or dual-sided GDDR6X memory, which would give the card either 8 GB (the same as the RTX 3070) or 16 GB, running at 19 Gbps.
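For reference, a rough sketch of the arithmetic behind those two figures, assuming the card keeps GA104's 256-bit memory bus and uses 32-bit-wide 1 GB GDDR6X modules (single-sided versus dual-sided "clamshell" placement):

```python
# Back-of-the-envelope VRAM capacity and bandwidth math.
# Assumed figures: 256-bit GA104 bus, 32-bit-wide 1 GB GDDR6X modules.
BUS_WIDTH_BITS = 256
MODULE_WIDTH_BITS = 32
MODULE_CAPACITY_GB = 1

modules = BUS_WIDTH_BITS // MODULE_WIDTH_BITS                   # 8 chips on one side
print(f"single-sided: {modules * MODULE_CAPACITY_GB} GB")       # 8 GB
print(f"dual-sided:   {2 * modules * MODULE_CAPACITY_GB} GB")   # 16 GB (clamshell)

DATA_RATE_GBPS = 19                                             # per-pin rate from the leak
print(f"bandwidth:    {DATA_RATE_GBPS * BUS_WIDTH_BITS / 8} GB/s")  # 608.0 GB/s
```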

The intention with the RTX 3070 Ti is to bring the fight back to AMD, which released a pretty good offering to the market in the form of the RX 6800 and RX 6800 XT graphics cards, both featuring 16 GB of GDDR6 memory. NVIDIA is looking to improve its market position against AMD by offering both the RTX 3070 and RTX 3070 Ti. It could also be a chance for NVIDIA to release another cryptomining-limited graphics card, and this time to do it right by not releasing a driver that undoes that particular effort. The card is rumored to launch come May, though we've already seen an unprecedented number of delays for NVIDIA's new SKUs, a sign that there is indeed a problem upstream in the semiconductor supply chain.
Source: Videocardz

79 Comments on NVIDIA GeForce RTX 3070 Ti Could be Offered in Both 8 GB and 16 GB SKUs

#51
efikkan
Solaris17So which is it?
Graphics cards are shipping in normal quantities; just look at Nvidia's numbers. But demand is higher than before, which creates a shortage. Those two statements are not contradictory, and that should be clear to anyone who has followed the situation.
Posted on Reply
#52
64K
If Nvidia does release an 8 GB and a 16 GB card then we will have some proof of whether the extra VRAM is needed right now in games, but people don't buy a card just for today's games. From what I've seen, people generally buy for a 4-year lifespan.
Posted on Reply
#53
Minus Infinity
64KIf Nvidia does release an 8 GB and a 16 GB card then we will have some proof of whether the extra VRAM is needed right now in games, but people don't buy a card just for today's games. From what I've seen, people generally buy for a 4-year lifespan.
The 16 GB would only be required for some 4K games; for those playing 1440p it's overkill for now. I've seen benchmarks showing the 3070 choking on a couple of 4K games where the 6800 has no problems because VRAM usage was over 12 GB. But for the vast majority of current titles 8 GB seems OK, though this won't always be the case. You have to weigh it up: do you keep your card for a long time, or do you upgrade with each new generation? I tend to skip a generation, sometimes two, so if I were getting a card now I'd prefer 12 GB+.

I have two PCs, one with a 1080 Ti and one with a 1070, and only play at 1440p. Generally the 1070 is fine, although I haven't played any latest-release AAA titles as I know the frame rates would be too low; for most games I have no trouble. I'm going to skip RDNA2 and Ampere and wait for RDNA3 and Lovelace to upgrade the 1080 Ti, but I have bought a 2080 Super for my new build, so the 1070 is going. I will now be able to play latest releases at max settings at 1440p. With my next upgrade in a few years I'll definitely move to a 4K monitor, and I won't buy any card with less than 12 GB.
Posted on Reply
#54
bug
64KIf Nvidia does release an 8 GB and a 16 GB card then we will have some proof of whether the extra VRAM is needed right now in games, but people don't buy a card just for today's games. From what I've seen, people generally buy for a 4-year lifespan.
This is all very subjective. It requires a bit of reading into the future. And even then, there are like 10,000+ games that already fit into those 8 GB vs a few dozen (at best) coming down the road that won't. God knows how many of those are something you will actually want to play.
And that 4-year lifespan... I used to upgrade like every other year (because I only bought mid-range and could get most of the cost back). And I know there are those who upgrade with each and every generation just because. But I don't have the numbers to know where most people stand. Judging by Steam numbers, you are probably right.
Posted on Reply
#55
64K
Minus InfinityThe 16 GB would only be required for some 4K games; for those playing 1440p it's overkill for now. I've seen benchmarks showing the 3070 choking on a couple of 4K games where the 6800 has no problems because VRAM usage was over 12 GB. But for the vast majority of current titles 8 GB seems OK, though this won't always be the case. You have to weigh it up: do you keep your card for a long time, or do you upgrade with each new generation? I tend to skip a generation, sometimes two, so if I were getting a card now I'd prefer 12 GB+. I have two PCs, one with a 1080 Ti and one with a 1070, and only play at 1440p. Generally the 1070 is fine, although I haven't played any latest-release AAA titles as I know the frame rates would be too low; for most games I have no trouble. I'm going to skip RDNA2 and Ampere and wait for RDNA3 and Lovelace to upgrade the 1080 Ti, but I have bought a 2080 Super for my new build, so the 1070 is going. I will now be able to play latest releases at max settings at 1440p. With my next upgrade in a few years I'll definitely move to a 4K monitor, and I won't buy any card with less than 12 GB.
You can't do a real comparison between two cards from different companies. If Nvidia makes a 3070 Ti 8 GB and a 3070 Ti 16 GB then a comparison could be made, assuming the cards are identical and only the VRAM differs. The last cards I can think of from Nvidia didn't do that: the 1060 6GB had more cores than the 1060 3GB. As a result the 6GB was around 11% faster than the 3GB.

Also if you are going to step up to 4K then plan to be upgrading cards often.
Posted on Reply
#56
John Naylor
spnidelso now we've got
12gb for 3060
8gb for 3060 ti
8gb for 3070
8gb AND 16gb for 3070 ti
10gb for 3080

jesus christ this just keeps getting even more retarded
As opposed to system RAM in dual, triple, and quad channel, in 2 GB, 4 GB, 8 GB, 16 GB, 32 GB .....

Just pick what works for your screen. If ya pay attention to what is actually "used" versus how much it allocates on installation, you won't have any issues. When a utility says your 8 GB card is using 5.8 GB, and then you insert the 4 GB version and get the same fps, same user experience, same screen quality as the 8 GB variant ... no, it is not ***using*** 5.8 GB.

1080p ==> 3 - 4 GB
1440p ==> 6 - 8 GB
2160p ==> 12 - 16 GB
64KIf Nvidia does release an 8 GB and a 16 GB card then we will have some proof of whether the extra VRAM is needed right now in games, but people don't buy a card just for today's games. From what I've seen, people generally buy for a 4-year lifespan.
That's not proof, it's taking advantage of uneducated consumers. It's been the same claim since the nVidia 500 series ... test two variants of a card at the resolution it's intended for and at the one above it, and there's no difference in user experience.

This was one of the first "exposés" on the myth ... but it's been done for the 600, 900, and 1000 series also:

alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
Solid State Soul ( SSS )This is Nvidia's plan now: release a card, then release an upgraded refresh of that card a year later.
Geez, imagine if car companies, women's fashion, textbooks, software, etc. started pulling that stuff.
AlexaPersonally I've seen my 3070 have all its VRAM allocated and even go 2 GB+ into the Shared GPU memory during Warzone, and I noticed no performance difference, stutters, or texture issues, at all.
Allocated and used are two very, very different things. Think of your MasterCard ... let's say you have a $5,000 limit and $500 charged. How much of your credit line are you using? ... $500.

Now, when you apply for a car loan ... what will the credit report show is allocated against your potential line of credit? MasterCard reports $5,000. You are not ***using*** it at the time ... but that $5,000 has been ***allocated*** to be used whenever you want. Same principle here, and Nvidia has explained this on several occasions.

See the above link: they tried to install Max Payne with the 2 GB card and it would not allow the install because of insufficient VRAM at that resolution. So they installed the 4 GB card, tested it ... then put the 2 GB card back in ... same fps, same graphics quality, same user experience.

"We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: GFX Card utilities "all report the amount of memory requested by the GPU, not the actual memory usage. Cards will larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available."

Look at the TPU reviews. The 3 GB and 6 GB 1060 reviews here on TPU are a great example: the 6 GB card has 11% more shaders than the 3 GB card, so it has an inherent speed advantage aside from VRAM. And when we look at the relative speed of the cards at 1080p ... the 6 GB card has a 6% speed advantage. So if this VRAM thing is true and 3 GB is wholly inadequate for 1080p ... then at 1440p we should see a critical drop in performance with 3 GB ... but there isn't one: it's the same 6%.
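Spelling that comparison out (a quick sketch; CUDA core counts from the GP106 spec sheets, performance deltas as the rounded figures quoted above):

```python
# GTX 1060 6GB vs 3GB: hardware gap vs measured gap.
cores_6gb, cores_3gb = 1280, 1152                 # GP106 shader counts
shader_gap = cores_6gb / cores_3gb - 1            # ~11.1% more shaders

measured_gap_1080p = 0.06                         # ~6% faster at 1080p
measured_gap_1440p = 0.06                         # ~6% at 1440p as well

# If 3 GB were truly inadequate, the 1440p gap would blow out well past
# the shader-count gap; instead it stays at the same ~6%.
print(f"shader advantage: {shader_gap:.1%}")
print(f"measured gap: {measured_gap_1080p:.0%} at 1080p, {measured_gap_1440p:.0%} at 1440p")
```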

Now that's not to say that no game will ever have issues at some point. Poor console ports, for example, have been one of the most common exceptions. But for the most part ... by the time you run out of VRAM, you will have run out of playable frame rates. Getting 33% more fps isn't really relevant when the result is 20 vs 15 fps; it's unplayable either way.

One thing I like about this abundance of options is the prospect of seeing W1zzard take this issue on, get a data set for the 3000 series, and see if it comes out the same way as the 500, 600, 700, 900, and 1000 series.
Posted on Reply
#57
Unregistered
That's fine by me, means my 8 GB will hold out for a bit longer then.
#58
Prima.Vera
spnidelso now we've got
12gb for 3060
8gb for 3060 ti
8gb for 3070
8gb AND 16gb for 3070 ti
10gb for 3080

jesus christ this just keeps getting even more retarded
This is nothing compared with Intel's utterly retarded Xeon naming schemes! o_O :laugh:
Or maybe nVidia just wants to compete with that too.
Posted on Reply
#59
lola46
It's out of stock again.
Posted on Reply
#60
Vayra86
rtwjunkieI'm not venting. Perhaps you're too young to remember, but Nvidia used to play this game 15 to 20 years ago, putting out multiple memory models of the same GPU, and it actually did make it confusing for many people. Don't forget, most people are not as technically clued in as people on this forum. Always remember that. Think like an average citizen, not a tech aficionado.
Sorry, but if you can't read the memory capacity on a box, you have some customer due diligence to do. Simple enough. It's basic info, much like travelling somewhere and checking what language they speak or whether they drive on the left or the right. It's not new either; memory capacity is a well-known concept, and if you haven't learned about it, it's about damn time. It's 2021, not 1995.

Are you saying it's better we get a single capacity per SKU 'so that customers understand'?

This is not the same thing as just using the bigger number for bigger sales. There is a price gap and the net performance is similar, but there are use cases for higher capacities and clearly there is also demand for it. Back in the day, when Nvidia played that game, there was also, for example, SLI with shared memory pools.

Let's make a distinction between tech aficionados and what we consider normal... people being oblivious to what they're spending on is not normal. Some go through life saying 'ignorance is bliss', and if they do trip over something it's always the fault of everyone else. Facilitating that kind of behavior is bad. Really bad. And we're knee deep in it, the US especially, because it's so easy to sue and succeed.
Prima.VeraThis is nothing compared with Intel's utterly retarded Xeon naming schemes! o_O :laugh:
Or maybe nVidia just wants to compete with that too.
Exactly. Nvidia is pretty clear on its type-numbering and how the stack is organized, and also about how much VRAM is on each GPU.

The one time they dropped the ball, they missed on details about a mere 0.5 GB. The box clearly said 4 GB, though. That episode also doesn't square with the idea that memory capacity is an unknown concept to buyers.

What's worse is the OEM versions of similar SKUs that are actually pretty different. I remember the Kepler versions well, but also their Maxwell pilot project in the 7xx series, and running old architectures on mobile chips. THAT is something to get pissed about, because it is truly misleading, same as rebranding half your stack and placing a few new things above it.
Posted on Reply
#61
bug
Prima.VeraThis is nothing compared with Intel's utterly retarded Xeon naming schemes! o_O :laugh:
Epyc 75F3 :wtf:

Though I'd rather have crappy names on great products than the other way around.
Posted on Reply
#62
medi01
Oh boy, the 6700 XT is only 336 mm².
AlexaCompared to me not being VRAM limited. Still rocking over 150 frames with all settings cranked up and DXR on, even when VRAM limited.
So we should take your personal result, ignore whatever is coming from actual reviewers, and pretend that the 3060 having more VRAM than the 3060 Ti, 3070, and 3080 is normal?
Posted on Reply
#63
Unregistered
medi01So we should take your personal result, ignore whatever is coming from actual reviewers, and pretend that the 3060 having more VRAM than the 3060 Ti, 3070, and 3080 is normal?
When did I ever say or imply that, and when did I ever say that NVIDIA's current VRAM allocations are completely normal? I believe the opposite.
#64
medi01
AlexaWhen did I ever say or imply that, and when did I ever say that NVIDIA's current VRAM allocations are completely normal? I believe the opposite.
The whole argument went "oh, but VRAM doesn't matter, it allocates but does not use it ... and I have not seen it matter", or did I miss something?
Because the main conclusion from that should be that the 3060 does not need more than half of those 12 GB.
Posted on Reply
#65
Unregistered
And I agree, the 3060's current VRAM size is dumb, solely because NVIDIA decided to go with a 192-bit bus.
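That constraint is easy to show. A back-of-the-envelope sketch, assuming 32-bit GDDR6 channels and the 1 GB / 2 GB module densities on the market at launch:

```python
# Why a 192-bit bus forces 6 GB or 12 GB (without clamshell placement).
BUS_WIDTH_BITS = 192
MODULE_WIDTH_BITS = 32
chips = BUS_WIDTH_BITS // MODULE_WIDTH_BITS       # 6 modules

for density_gb in (1, 2):                         # 8 Gb and 16 Gb GDDR6 parts
    print(f"{density_gb} GB modules -> {chips * density_gb} GB total")
# 1 GB modules -> 6 GB (too little for this tier, hence the jump)
# 2 GB modules -> 12 GB (what the 3060 shipped with)
```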
#66
medi01
Alexasolely because NVIDIA decided to go with a 192-bit bus
Solely (oh wait, that makes it not solely) because of not being able to call the 3070 a 3080.
Or rather, not being able to call the 3080 a 3080 Ti (it would be overpriced crap, like the 2080 Ti, so 20 GB of expensive VRAM would not be a problem).
Posted on Reply
#67
rtwjunkie
PC Gaming Enthusiast
Vayra86Sorry, but if you can't read the memory capacity on a box, you have some customer due diligence to do. Simple enough. It's basic info, much like travelling somewhere and checking what language they speak or whether they drive on the left or the right. It's not new either; memory capacity is a well-known concept, and if you haven't learned about it, it's about damn time. It's 2021, not 1995.

Are you saying it's better we get a single capacity per SKU 'so that customers understand'?

This is not the same thing as just using the bigger number for bigger sales. There is a price gap and the net performance is similar, but there are use cases for higher capacities and clearly there is also demand for it. Back in the day, when Nvidia played that game, there was also, for example, SLI with shared memory pools.

Let's make a distinction between tech aficionados and what we consider normal... people being oblivious to what they're spending on is not normal. Some go through life saying 'ignorance is bliss', and if they do trip over something it's always the fault of everyone else. Facilitating that kind of behavior is bad. Really bad. And we're knee deep in it, the US especially, because it's so easy to sue and succeed.
Don’t read into it. You attribute way too much to what I said. It was a simple comment.
Posted on Reply
#68
arconz
So this turned out to be bogus... the 16GB die variant was the RTX A4000 :/
Posted on Reply
#69
lexluthermiester
arconzSo this turned out to be bogus...
Not yet. The 3070 Ti has yet to be announced, and NVidia has not ruled it out.
Posted on Reply
#71
arconz
lexluthermiesterNot yet. The 3070 Ti has yet to be announced, and NVidia has not ruled it out.
This is wrong & sounds like baseless rumours - GN has sources at nVidia & has confirmed there is only an 8 GB G6X version for gamers.

The leaks were never about two 3070 Ti variants - they were about two variants of the GA104 die with 6144 CUDA cores. Then leakers assumed [incorrectly] that they were both gaming cards. But the 16 GB variant of that die has now been released... it's on nVidia's website lol.

It's insane to think nVidia would have any reason to sell a 16 GB gaming card when they could use the scarce VRAM to sell more expensive 12 GB 3080 Tis, which will beat AMD's best cards.
Posted on Reply
#72
Caring1
arconzThis is wrong & sounds like baseless rumours - GN has sources at nVidia & has confirmed there is only an 8 GB G6X version for gamers non-professional use.
Fixed it for you.
Posted on Reply
#73
Unregistered
If the 3070 Ti ends up being 8 GB G6X, fine by me; it makes me regret buying my 3070 instead of waiting for it a little less. I don't want G6X anyway -- it's a hot mess.
#74
arconz
Caring1Fixed it for you.
Well, in a way we're still both wrong lol; tons of professional studios like mine use 3060s or 3090s for 3D & GPU rendering... but it's just easier to say gaming cards lol.
Posted on Reply
#75
lexluthermiester
arconzThis is wrong
Oh really? Got a link showing an announcement from NVidia one way or the other? If not, then I'm not wrong.
Posted on Reply