Wednesday, October 26th 2022

NVIDIA Partners Beginning to Carve Out RTX 3070 Ti From Larger GA102 Dies

NVIDIA manufactured a heap of large "GA102" Ampere silicon to cater to demand from the crypto-mining boom, only to see that demand vanish. With the next-gen RTX 40-series still ramping up, the company has to digest these GA102 chips somehow, and is apparently letting its partners use them on performance-segment SKUs such as the GeForce RTX 3070 Ti. The RTX 3070 Ti is normally based on the GA104 silicon, which it maxes out, enabling all 6,144 CUDA cores, 48 RT cores, 192 Tensor cores, 192 TMUs, and 96 ROPs, besides the chip's full 256-bit wide GDDR6X memory interface. The SKU is now also being carved out of the larger GA102 by enabling 48 of its 84 streaming multiprocessors (just 57% of the chip's CUDA cores), and narrowing the memory bus from its native 384-bit down to 256-bit.
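
As a rough sanity check on those cut-down figures, the arithmetic can be reproduced with the quick Python sketch below (the 128 CUDA cores per SM value is the standard Ampere layout, assumed here rather than stated in the source):

# Sketch: how 48 of 84 SMs works out to roughly 57% of GA102's CUDA cores
CORES_PER_SM = 128                            # Ampere SM layout (assumption)

ga102_full_sms = 84                           # full GA102
enabled_sms = 48                              # GA102-based RTX 3070 Ti

full_cores = ga102_full_sms * CORES_PER_SM    # 10,752
enabled_cores = enabled_sms * CORES_PER_SM    # 6,144 - matches the RTX 3070 Ti spec

print(f"{enabled_cores} of {full_cores} CUDA cores enabled "
      f"({enabled_cores / full_cores:.0%} of the full die)")   # ~57%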

The memory size remains at 8 GB, the memory type GDDR6X, and the memory speed 19 Gbps, working out to 608 GB/s of bandwidth. The most interesting aspect of carving the RTX 3070 Ti out of the GA102 is board power: a ZOTAC-branded card lists it at 320 W, higher than the 290 W of the company's GA104-based cards. Sadly, this is a China-only SKU. Every custom-design graphics card, especially from a reputed AIC such as ZOTAC, has to go through qualification with NVIDIA, which means NVIDIA is not only aware of GA102-based RTX 3070 Ti cards, but is behind fusing off the SMs to carve out the SKU, and developing the video BIOS and driver support. ZOTAC is kind enough to list the ASIC code on its website, and for this SKU it is "GA102-150-xx."
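
Both the bandwidth figure and the board-power delta follow from simple back-of-the-envelope math (bandwidth = per-pin data rate × bus width ÷ 8); a small Python sketch using only the numbers quoted above:

# Memory bandwidth: 19 Gbps per pin across a 256-bit bus
data_rate_gbps = 19
bus_width_bits = 256
bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")        # 608 GB/s

# Board power: ZOTAC's GA102-based card vs. its GA104-based RTX 3070 Ti cards
ga102_power_w, ga104_power_w = 320, 290
print(f"Board power increase: +{ga102_power_w / ga104_power_w - 1:.0%}")  # ~+10%
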
Source: VideoCardz

25 Comments on NVIDIA Partners Beginning to Carve Out RTX 3070 Ti From Larger GA102 Dies

#1
Arco
Let's play the game of confusing the customer.
#2
btarunr
Editor & Senior Moderator
Arco: Let's play the game of confusing the customer.
This used to be a problem in the old days of SLI, where the same SKU carved out of different silicon would end up with SLI incompatibility; but not anymore (SLI is dead). You're getting an RTX 3070 Ti alright.
#3
wolf
Better Than Native
That board power though, same as a 3080 10GB... no bueno
#4
Arco
btarunr: This used to be a problem in the old days of SLI, where the same SKU carved out of different silicon would end up with SLI incompatibility; but not anymore (SLI is dead). You're getting an RTX 3070 Ti alright.
I wonder how many of these kinds of cards are out there, be it for miners or as some weird last-minute release.
#6
SOAREVERSOR
Crackong: 3070 Tie KO?
Bring back the ULTRA moniker.
#7
Eskimonster
I want a 3070 ti, but not with this additional TDP.
#8
neatfeatguy
btarunr: This used to be a problem in the old days of SLI, where the same SKU carved out of different silicon would end up with SLI incompatibility; but not anymore (SLI is dead). You're getting an RTX 3070 Ti alright.
I flashed a bad GTX 285 with a 280 BIOS and it worked. I ran the flashed 285 with a 280 in SLI for a couple of years. I do recall hearing about issues with SLI and certain GPUs sometimes; I just never had that issue with my odd setup.

As odd as it sounds, I do miss SLI, but I'm probably one of the few that feels this way. I never really had any issues with SLI.
#9
ZetZet
wolf: That board power though, same as a 3080 10GB... no bueno
Oh no, it's a whole 10% higher than a normal 3070 Ti. That's so crazy, who's going to save the trees now???
#10
bonehead123
Good for ya, nGreedia.. I sincerely hope you have to eat ALOT of lower margins on those chips that you used to justify the scalper's paradise card pricing during the pandemic..

aint Karma a filthy, rotten, smelly biotch ?
#11
wolf
Better Than Native
ZetZet: Oh no, it's a whole 10% higher than a normal 3070 Ti. That's so crazy, who's going to save the trees now???
Just an unnecessary 10% over what a 3070 Ti should draw, as a side effect of the larger die? I suppose it could also be that this specific model comes with a raised board power over the official spec.
#12
Steevo
What a missed opportunity to allow mods and bios flash unlocks.
#13
mechtech
Impossible. Was nothing ever on the shelves. Means there was nothing. Or was that all lies??
#14
maxfly
They've got to dump all that excess stock somehow. If they price them right, they should sell regardless of the increased power usage. Hah, who am I kidding!? They can't price them wisely; that would go against their mighty leader's recent mantra. They would rather sit on the rest of the 40-series stack for another year while feigning supply shortages of the 4080/4090s.
#15
Ware
Arco: Let's play the game of confusing the customer.
It's pretty standard. Unfortunately, most customers are confused no matter what.
If you start talking to a normal person about GA102s and GA104s, their eyes will just gloss over and they'll tune you out - normal people don't care.
The cards are for the Chinese market, so confusion should be minimal.
AMD does the same thing, only worse - they would happily sell the equivalent of a GA104 in China as a 3080, as they did with the "RX 580" - like that's not going to confuse people.
bonehead123: Good for ya, nGreedia.. I sincerely hope you have to eat ALOT of lower margins on those chips that you used to justify the scalper's paradise card pricing during the pandemic..
That's fine I guess, but Nvidia didn't create the pandemic, and they were certainly not the only corporation to take advantage of the situation.
Everyone was happy to sell their GPUs to miners, and people bought from the scalpers, so you could point fingers all day long.
I do miss the days of getting cheap GPUs. I just don't think that Nvidia failing is going to make GPUs cheaper and more available.
mechtech: Impossible. Was nothing ever on the shelves. Means there was nothing. Or was that all lies??
It's not like Samsung has the chips on a shelf for Nvidia to go pick up any time.
They were probably ordered a long time ago, and now demand has shifted, so they are going to end up being used differently.
#16
evernessince
ZetZet: Oh no, it's a whole 10% higher than a normal 3070 Ti. That's so crazy, who's going to save the trees now???
10% over a card that was already inefficient. If you continue to excuse power consumption increases, those increases add up over time.
#17
TheEndIsNear
I'm not a fanboy, but I hope they choke on all these cards. The fact that they are selling LHR cards new now tells me they screwed up.
#18
nexxusty
Couldn't just throw gamers a bone and give them 16GB?

I owned a 3070Ti 8GB and it was a trash gaming GPU. Only good for mining.

8GB in 2022 is a fucking literal joke.
#19
JAB Creations
8GB? Sniper Elite 5 maxed out uses 15.7GB at 1440p.

Let's not forget the "8GB 3060" to further screw with consumers.
#20
renz496
bonehead123: Good for ya, nGreedia.. I sincerely hope you have to eat ALOT of lower margins on those chips that you used to justify the scalper's paradise card pricing during the pandemic..

aint Karma a filthy, rotten, smelly biotch ?
Yes, good for them, because they still make money off a "reject" chip like this rather than having to throw it away outright. It's having the opposite effect of what you're hoping for.
#21
Vayra86
btarunr: This used to be a problem in the old days of SLI, where the same SKU carved out of different silicon would end up with SLI incompatibility; but not anymore (SLI is dead). You're getting an RTX 3070 Ti alright.
It's much like the 1070 Ti, which was clearly a lesser card than the 1080 it 'replaced'.

It's just less efficient, even if only a little; it's definitely a lesser product trying to disguise itself as an equal.
JAB Creations: 8GB? Sniper Elite 5 maxed out uses 15.7GB at 1440p.

Let's not forget the "8GB 3060" to further screw with consumers.
Inb4 'but that's just loaded, not used capacity'

...and the accompanying stutter on the vast majority of Nvidia's lineup when you start pushing it :)
#22
bonehead123
Ware: Nvidia didn't create the pandemic, and they were certainly not the only corporation to take advantage of the situation.

I just don't think that Nvidia failing is going to make GPUs cheaper and more available.
A. I neva said they created the pandemic, nor that they were the only ones to take advantage of it, but in terms of pc hdwr, they were among the worst of the bunch IMHO :(

B. And I neva implied that I wanted them to fail, merely that they deserve to suffer a bit as a direct result of their excessive greed now that the pandemic is almost over. The fact that they have yet to reduce their GPU prices back to anywhere near pre-pandemic/scalper-induced levels will hopefully speed up this process significantly....

With the highest inflation rate in many years and the dwindling pool of people who have an extra $1-2k just laying around that they can spend on a new GPU (or many other things too), and the overall decrease in demand for all things pc-related, they WILL get what is coming to them, one way or the other, at which time I shall be all smiles from ear to ear.... and yes, I realize this whole situation applies to many other aspects of our daily lives too....
#24
neatfeatguy
Sir Alex Ice: When has Zotac ever been a reputed AIB?!
Zotac did a helluva job with the Maxwell series. Then, from what I recall, they kind of let the quality slip as Pascal and Turing came out. If they had kept up the same quality they had in the Maxwell gen, they'd certainly be a much more reputable AIB.
#25
stimpy88
Trash because 8GB is not enough.