Wednesday, November 4th 2020

NVIDIA Reportedly Working on GeForce RTX 3080 Ti Graphics Card with 20 GB GDDR6X VRAM

A leak from renowned (and usually on-point) leaker Kopite7kimi claims that NVIDIA has finally settled on a new graphics card to combat AMD's RX 6800 threat after all. After being reported (though never confirmed) to be working on double-memory configurations of its RTX 3070 and RTX 3080 graphics cards (with 16 GB GDDR6 and 20 GB GDDR6X, respectively), the company is now said to have settled on a 20 GB RTX 3080 Ti to face an (apparently; pending independent reviews) resurgent AMD.

The RTX 3080 Ti specs paint a card with the same CUDA core count as the RTX 3090: 10,496 FP32 cores, paired with the same 320-bit memory bus as the RTX 3080. Kopite includes board and SKU numbers (PG133 SKU 15) along with a new GPU codename: GA102-250. The performance differentiators against the RTX 3090 stand to be the memory amount, the memory bus, and possibly core clockspeeds; memory speed and board TGP are reported to mirror those of the RTX 3080, so somewhat reduced clocks compared to that card are expected. That CUDA core count means NVIDIA is essentially divvying up the same GA102 die between its RTX 3090 (good luck finding one in stock) and the reported RTX 3080 Ti (so good luck finding one of those in stock as well, should the time come). It is unclear how pricing would work out for this SKU, but pricing comparable to that of the RX 6900 XT is the more sensible speculation. Take this report with the usual amount of NaCl.
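For perspective, theoretical memory bandwidth scales directly with bus width and per-pin data rate; a minimal back-of-the-envelope sketch, assuming the RTX 3080 Ti keeps the RTX 3080's 19 Gbps GDDR6X (unconfirmed) while the RTX 3090 runs its 19.5 Gbps modules:

    # Theoretical bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
    def bandwidth_gbs(data_rate_gbps, bus_width_bits):
        return data_rate_gbps * bus_width_bits / 8

    print(bandwidth_gbs(19.0, 320))   # RTX 3080 / reported RTX 3080 Ti: 760.0 GB/s
    print(bandwidth_gbs(19.5, 384))   # RTX 3090: 936.0 GB/s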
Sources: Kopite7kimi @ Twitter, via Videocardz

140 Comments on NVIDIA Reportedly Working on GeForce RTX 3080 Ti Graphics Card with 20 GB GDDR6X VRAM

#26
londiste
ne6togadno: so AMD pours oil on the fire by asking a partner to announce a 12 GB VRAM requirement (which obsoletes 2/3 of the current NVIDIA lineup and 100% of last gen) for their upcoming game.
Imagine the outrage if Nvidia did something like that ;)
owen10578: Power reasons not so much either, adding an extra 64-bit channel won't add much more than probably 10 W at the most.
This is something that seems very strange to me for 3080/3090. Unless GPU-Z returns crap data, MVDDC usage is quite high and GPU seems to draw less power than I would expect from total. Check the GPU Chip Power Draw and MVDDC Power Draw results in GPU-Z. The numbers themselves vary but the relative amounts seem surprising to me. Also, 3090 cards seem to have heavy-duty backplates, I do not remember seeing backplates with heatpipes before.

For comparison, my 2080 has 200-210W of the reported power draw on the GPU and about 20W on RAM plus minor amounts left over for other stuff. I seem to remember GPU taking the majority of power budget from earlier generations as well. Was something changed in how this stuff is reported or is GDDR6X really this power hungry?
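One way to sanity check those numbers is to poll the board-level power that NVML reports while the card is loaded and compare it against the per-rail figures GPU-Z shows; a minimal sketch, assuming the pynvml package is installed (note NVML only exposes total board power, not the GPU/MVDDC split):

    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)             # first GPU in the system
    for _ in range(10):                                       # sample for ~10 s while the card is loaded
        milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)   # total board power draw, in milliwatts
        print(f"{milliwatts / 1000:.1f} W")
        time.sleep(1)
    pynvml.nvmlShutdown()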

Just as an example, the first 3080 GPU-Z screenshot Google search returned:
Posted on Reply
#27
ZoneDymo
ShurikN: The 3090 is never going to get a price cut that soon, if ever; that's why we are getting a 3080 Ti
Sadly, this^. Nvidia does not drop pricing because that would mean admitting fault to a degree.
Instead they just launch new SKUs to compete.
Posted on Reply
#28
owen10578
londiste: This is something that seems very strange to me for 3080/3090. Unless GPU-Z returns crap data, MVDDC usage is quite high and GPU seems to draw less power than I would expect from total. Check the GPU Chip Power Draw and MVDDC Power Draw results in GPU-Z. The numbers themselves vary but the relative amounts seem surprising to me. Also, 3090 cards seem to have heavy-duty backplates, I do not remember seeing backplates with heatpipes before.

For comparison, my 2080 has 200-210W of the reported power draw on the GPU and about 20W on RAM plus minor amounts left over for other stuff. I seem to remember GPU taking the majority of power budget from earlier generations as well. Was something changed in how this stuff is reported or is GDDR6X really this power hungry?
Yes, GDDR6X seems to be much more power hungry on both the controller and the memory chips, judging by the more thought-out memory cooling solutions. But again, just one extra chip on an extra 64-bit channel won't draw that much more power. I'd be very surprised if someone could prove me wrong on this.
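A quick back-of-the-envelope check (treating Micron's widely quoted ~7.25 pJ/bit I/O energy figure for GDDR6X as an assumption and ignoring controller overhead):

    # Rough I/O power estimate for one extra 64-bit GDDR6X channel running flat out
    energy_pj_per_bit = 7.25        # assumed; roughly the figure Micron quotes for GDDR6X
    data_rate_bps = 19e9            # 19 Gbps per pin
    extra_bus_bits = 64             # one additional 64-bit memory channel
    watts = energy_pj_per_bit * 1e-12 * data_rate_bps * extra_bus_bits
    print(round(watts, 1))          # ~8.8 W at full utilization, less in practice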
Posted on Reply
#29
Calmmo
So... basically, Nvidia has been scrambling, trying various configs to fill in the perf/price gaps against AMD.
Posted on Reply
#30
TumbleGeorge
Nvidia's problem is the complacency it has fallen into over the years, back when it led the market and everything was allowed to it, including playing with the prices of its products however it pleased. They literally convinced themselves this would go on forever and that they wouldn't even have to make much of an effort to maintain the status quo.
Posted on Reply
#31
RedelZaVedno
LOL, Jensen has done it again! Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.

2080 Ti $999 -> 3080 Ti $999+ probably
2080 $699 -> 3080 $699
2070 $499 -> 3070 $499
2060S $399 -> 3060 Ti $399 probably

xx60 MID RANGE class GPU costing the same amount as an entire console. Pure madness.
Posted on Reply
#32
Shou Miko
I am not surprised; this is probably the reason they cancelled the 20 GB version of the RTX 3080, and it would have gotten too expensive anyway.

Welcome to the Nvidia Screw-over train, we take your money and screw you over after a little while :roll:

The same happened to Pascal Titan owners: when the GTX 1080 Ti launched it was cheaper and faster at a lot of things than the Titan was, then the Titan was phased out because we can't have a card that outperforms our Titan, and then a new Pascal Titan was launched with even more CUDA cores :laugh:
Posted on Reply
#33
Shatun_Bear
This might be the first Ampere card outside of the 3090 that's not hamstrung by a small memory pool.

Unfortunately it'll be priced $200 too high most probably.
Posted on Reply
#34
ShurikN
RedelZaVedno: Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.
I'm plenty pissed, but there's nothing to be done, considering AMD will follow suit on prices. The price war has died.
Posted on Reply
#35
SIGSEGV
Panic mode.
I am so happy with this news, haha.
Another nuclear reactor to warm up the house in the upcoming winter season.
Survivor? :(
Posted on Reply
#36
Legacy-ZA
This is going to grind the gears of early adopters. Want to know what will grind them even more? When refreshes pop up on the 7 nm node next year, they will probably be called the Super Hyper Ultra edition, for lolz. Then we can also have more models to pick from for extra confusion, at prices that aren't mainstream... and of course, you can get one for a mere $1,000,000 from a scalper near you.
Posted on Reply
#37
ne6togadno
Vayra86: You can run games with insufficient VRAM just fine. Stutter doesn't always appear in reviews and canned benchmarks. But I'm sure the W1zz is right on the money trying to figure that out if the moment arrives. Nvidia can deploy a lot of driver trickery to still provide a decent experience, they've done similar with the 970 for example - all the VRAM related bugs were fixed on that GPU.

Does that all TRULY mean there is enough VRAM though? That is debatable. The real comparison is side by side and careful analysis of frametimes. Gonna be interesting.
Through years of playing games and following game requirements, I have the impression that devs decide game requirements, and VRAM in particular, either based on the cards available (or coming within a month or two) on the market, or more often just by playing darts.

On my 290X 4 GB I've played titles that "required" 6 or even 8 GB of VRAM just fine. I've dialed texture quality up above the recommended setting when I didn't like how a game looked and still had no problems from a lack of VRAM. So judging what is enough based on game requirements is a bit pointless.
Set a price range, check what meets your performance requirements, buy the card with the highest amount of VRAM that fits your budget, and you are good to go. By the time games look too ugly because you had to lower textures, the card will be long dead FPS-wise.
As for the 970, the problem was never the amount of VRAM. The slow 0.5 GB segment was what caused the problems, as it tanked performance very hard. Once Nvidia isolated that 0.5 GB with drivers, 970s worked fine even with titles that required 4+ GB of VRAM; a rough sketch of why that segment hurt so much is below.
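(Using the commonly cited figures for the 970's split, 3.5 GB at ~196 GB/s and 0.5 GB at ~28 GB/s; the exact numbers and the linear mix are assumptions for illustration only:)

    # GTX 970 segmented memory: crude estimate of average bandwidth when accesses
    # spill into the slow 0.5 GB segment (figures approximate, linear mix assumed)
    fast_bw = 196.0   # GB/s over the 224-bit fast partition
    slow_bw = 28.0    # GB/s over the 32-bit slow partition

    def effective_bw(slow_fraction):
        # slow_fraction: share of accesses landing in the slow segment
        return (1 - slow_fraction) * fast_bw + slow_fraction * slow_bw

    print(effective_bw(0.0))     # 196.0 GB/s - driver keeps hot data out of the slow segment
    print(effective_bw(0.125))   # 175.0 GB/s - and worse still once accesses serialize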

On a technical level both camps have a different approach to solving VRAM limitations.
Nvidia's lossless compression allows them to get by with lower capacities and a narrower bus while preserving higher performance, so they fit as little memory as possible for bigger margins.

With GCN, AMD had to throw a lot of memory bandwidth at the problem (the 7970's bus was 384-bit, the 290X's was 512-bit, and Fury, Vega and the VII used 2048- to 4096-bit HBM) to provide enough "fuel" for the GPU, and it was never enough. Since RDNA, AMD's memory bus has topped out at 256-bit, which before was reserved for their midrange cards (no doubt the 5700 XT itself is a midrange card), and now with RDNA2 even their top-tier 6900 has a 256-bit bus. Sure, the new cache provides higher speeds, but you still need to feed that cache at adequate rates, and AMD thinks that what used to be a midrange bus is now enough even for a flagship.
I think the 16 GB of VRAM in AMD's cards is targeted more at feeding the cache (e.g. loading all textures into VRAM so the cache has instant access without calls out to system RAM or storage), and/or they believe they can get a significant performance boost from direct CPU access to VRAM, so they made sure to provide enough VRAM for devs to play with.
It will be interesting to see if those things really help AMD; a toy model of the cache bet is sketched below.
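(All numbers here are assumptions for illustration: ~512 GB/s of GDDR6 on a 256-bit bus, a much faster on-die cache, and a range of hit rates:)

    # Toy model: "effective" bandwidth of a 256-bit GDDR6 bus fronted by a large on-die cache
    dram_bw = 512.0     # GB/s, e.g. 16 Gbps GDDR6 on a 256-bit bus
    cache_bw = 1600.0   # GB/s, assumed cache bandwidth (illustrative only)

    def effective_bw(hit_rate):
        # hit_rate: share of accesses served from the cache
        return hit_rate * cache_bw + (1 - hit_rate) * dram_bw

    for hit_rate in (0.0, 0.5, 0.75):
        print(hit_rate, effective_bw(hit_rate))   # 512.0, 1056.0, 1328.0 GB/s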
londiste: Imagine the outrage if Nvidia did something like that ;)
I don't have to imagine anything. They already did it with HairWorks, forced tessellation, GameWorks (or whatever it was called) extensions and PhysX. I don't remember the outrage, though. :rolleyes:
Now that AMD holds the consoles and devs have to optimize for AMD's hardware, the coin has flipped and Nvidia gets quite jumpy whenever something comes close to taking away the "performance crown".
A single game announcement is enough to cause... leakages :rolleyes:
BTW, PhysX has been open source for some time now ;)
Posted on Reply
#39
RedelZaVedno
ShurikN: I'm plenty pissed, but there's nothing to be done, considering AMD will follow suit on prices. The price war has died.
Yeah, it seems like AMD has chosen higher profit margins over gaining larger market share. I'm getting out of the GPU market: I'll hold on to my 1080 Ti as long as I can and then buy something for 300 bucks on the second-hand market. I'm unwilling to support greed.
Posted on Reply
#40
Sicofante
You all think this is just another gaming card? How naive...

The 3090 is a workstation-class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVLink and get a much-needed 48 GB for 3D rendering.
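(Worth noting that memory only pools when the renderer explicitly supports NVLink/peer-to-peer; a minimal sketch for checking that the links are even up, assuming the pynvml package and that the installed bindings expose the NVLink queries:)

    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
    for link in range(4):                           # GA102 boards expose only a few NVLink links
        try:
            state = pynvml.nvmlDeviceGetNvLinkState(handle, link)
            print(f"link {link}:", "active" if state == pynvml.NVML_FEATURE_ENABLED else "inactive")
        except pynvml.NVMLError:
            break                                   # no more links, or NVLink unsupported on this board
    pynvml.nvmlShutdown()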

The 3080 Ti fills the gap for low-end workstations. The 3080's 10 GB just doesn't cut it for rendering or even complex video editing and FX. AMD's 6900 XT was looking like the right purchase until this announcement.

That's why this 3080 Ti makes tons of sense outside gaming. I, for one, will buy it the instant I can find it in stock.
Posted on Reply
#41
Calmmo
Sicofante: The 3090 is a workstation-class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVLink and get a much-needed 48 GB for 3D rendering.
lol, no.
Those VFX studios are buying Mac Pros
Posted on Reply
#42
Sicofante
Calmmo: lol, no.
Those VFX studios are buying Mac Pros
Clueless.
Posted on Reply
#43
Nater
Who cares.

"Auto-Notify"
"Not-In-Stock"
"Backordered"
Posted on Reply
#44
Calmmo
Sicofante: Clueless.
bro, they don't even have double precision or driver support. Wake up
Posted on Reply
#45
ShurikN
Calmmo: lol, no.
Those VFX studios are buying Mac Pros
And Quadros.
Posted on Reply
#46
CrAsHnBuRnXp
Called it. I said from the start, when the 3080 20 GB was rumored, that it would be a Ti version.
Posted on Reply
#47
N3M3515
Vya Domus: Nvidia is still acting arrogantly, refusing to compete in terms of price and instead offering a largely worthless single-digit performance differential.
Pride.
Posted on Reply
#49
N3M3515
RedelZaVedno: LOL, Jensen has done it again! Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.

1080 Ti $700 -> 2080 Ti $1200 -> 3080 Ti $1200+ probably
1080 $500 -> 2080 $699 -> 3080 $699
1070 $350 -> 2070 $499 -> 3070 $499
1060 $250 -> 2060S $399 -> 3060 Ti $399 probably

xx60 MID RANGE class GPU costing the same amount as an entire console. Pure madness.
Added for perspective.
Posted on Reply
#50
SIGSEGV
Sicofante: You all think this is just another gaming card? How naive...

The 3090 is a workstation-class GPU
Oh pls god, not this kind of shit again.
Posted on Reply