# NVIDIA Reportedly Working on GeForce RTX 3080 Ti Graphics Card with 20 GB GDDR6X VRAM



## Raevenlord (Nov 4, 2020)

A leak from renowned (and usually on-point) leaker Kopite7kimi claims that NVIDIA has finally settled on a new graphics card to combat AMD's RX 6800 threat after all. After being reported (though never confirmed) to be working on double-memory configurations of its RTX 3070 and RTX 3080 graphics cards (with 16 GB GDDR6 and 20 GB GDDR6X, respectively), the company is now reported to have settled on a 20 GB RTX 3080 Ti to face an (apparently; pending independent reviews) resurgent AMD.

The RTX 3080 Ti specs paint a card with the same CUDA core count as the RTX 3090, with 10,496 FP32 cores over the same 320-bit memory bus as the RTX 3080. Kopite includes board and SKU numbers (PG133 SKU 15) along with a new GPU codename: GA102-250. The performance differentiators against the RTX 3090 stand to be the memory amount, bus width, and eventually core clockspeed; memory speed and board TGP are reported to mirror those of the RTX 3080, so somewhat reduced clocks compared to that graphics card are expected. That number of CUDA cores means NVIDIA is essentially divvying up the same GA102 die between its RTX 3090 (good luck finding one in stock) and the reported RTX 3080 Ti (so good luck finding one of those in stock as well, should the time come). It is unclear how pricing would work out for this SKU, but pricing comparable to that of the RX 6900 XT is the more sensible speculation. Take this report with the usual amount of NaCl.





*View at TechPowerUp Main Site*


----------



## Vayra86 (Nov 4, 2020)

Again? Oh of course, the rumored Nvidia Ampere stack is full, as they've already had a 3060ti, 3070ti... next week we'll probably see another bit of rumor about the 3060ti. Rinse and repeat?


----------



## ExcuseMeWtf (Nov 4, 2020)

Godfall's announced system requirements suddenly make the borderline-vaporware RTX 3080 non-Ti not look so hot anymore with its 10 GB


----------



## Deleted member 24505 (Nov 4, 2020)

So chop 4gb off a 3090 and call it a 3080ti


----------



## Accelerator (Nov 4, 2020)

It might be one of the earlier ES (engineering sample) cards, just discovered recently and taken for a 3080 Ti. In the 20 series, there were ES cards of the 2080 Ti with 4352 CUDA cores and 12 GB, 4480 CUDA cores with 11 GB or 12 GB, and 4608 CUDA cores with 11 GB or 12 GB. However, these cards were never released; we only see them on some second-hand trading platforms.


----------



## the54thvoid (Nov 4, 2020)

It's all a bit silly when there's absolutely minimal stock in the channel. I'm guessing we'll see these in proper numbers early 2021?


----------



## Chomiq (Nov 4, 2020)

They need to start working on getting the cards to stores at MSRP instead of scalping left and right on pre-orders that take forever to deliver.


----------



## delshay (Nov 4, 2020)

Man, I really love competition. I'm expecting a price cut of the 3090 some time in the future (within next three months).


----------



## AnarchoPrimitiv (Nov 4, 2020)

Nvidia might as well announce five more cards that are $1000+ to target a market that represents an exceedingly small amount of consumers and with virtually no stock, so that for all intents and purposes, it only exists on paper, all for the sake of.... Pride? Fuel for online fanboy/trolls?  Maybe instead they should work on actually getting cards on the shelves so people can actually buy them instead of spending efforts on creating more paper launches.


----------



## londiste (Nov 4, 2020)

Vayra86 said:


> Again? Oh of course, the rumored Nvidia Ampere stack is full, as they've already had a 3060ti, 3070ti... next week we'll probably see another bit of rumor about the 3060ti. Rinse and repeat?


Interesting how Nvidia plans are leaking these days. AIBs, I suppose?
After all the rumors, this 20GB 3080 Ti does kind of make sense though. From the specs, its sole purpose is to match the RX 6900 XT (which undercuts the 3090 primarily on price).


----------



## ZoneDymo (Nov 4, 2020)

I wonder if the article below is somehow related….


----------



## Space Lynx (Nov 4, 2020)

lol nvidia just got a fire lit under their butt.  its glorious.  jensen get skill boi!  Lisa Su got them skills, chick is straight up level 90 Ninjitsu


----------



## Vya Domus (Nov 4, 2020)

There is something quite surreal about planning to launch a new product that provides maybe 10% more performance at best when the rest of the lineup is still almost nonexistent in stores. Their strategy is also bewildering: this is clearly going to be a $1000 card competing with the 6900 XT, but why? AMD was never going to sell many 6900 XTs, so why not lower the price of the 3080 or 3070? Nvidia is still acting arrogantly, refusing to compete in terms of price and instead offering a largely worthless single-digit performance differential. You know who kept doing that as well? Intel, and look what happened: AMD gained a colossal amount of mindshare in just 2-3 years, everyone marvels at every one of their products and scoffs whenever Intel comes out with something new, and Intel ended up having to lower their prices anyway.



londiste said:


> Interesting how Nvidia plans are leaking these days. AIBs, I suppose?



That is indeed quite odd; there are usually no significant leaks with regard to Nvidia products, not even right up to their launch. But this time it was very different: for instance, the FE design was leaked months before release.

The thing is, most of these leaks are more often than not intentional, meant to drive interest in upcoming products.


----------



## ShurikN (Nov 4, 2020)

delshay said:


> Man, I really love competition. I'm expecting a price cut of the 3090 some time in the future (within next three months).


The 3090 is never going to get a price cut that soon, if ever; that's why we are getting a 3080 Ti


----------



## ratirt (Nov 4, 2020)

londiste said:


> Interesting how Nvidia plans are leaking these days. AIBs, I suppose?
> After all the rumors, this 20GB 3080 Ti does kind of make sense though. From the specs, its sole purpose is to match the RX 6900 XT (which undercuts the 3090 primarily on price).


It kinda makes sense, but these cards will still be a niche. I'd rather focus on the 3070 and 6800 type of performance mostly.


----------



## medi01 (Nov 4, 2020)

I guess this means:


----------



## Vayra86 (Nov 4, 2020)

the54thvoid said:


> It's all a bit silly when there's absolutely minimal stock in the channel. I'm guessing we'll see these in proper numbers early 2021?



I'm having a strong Vega déjà vu, where the company in question realizes they've released a stack with weak margins that will still get overrun completely. Ampere seems to be moving into that space rapidly as more information gets out from both camps.

The result back then was pretty weak availability until much later, when the card's performance was hardly competitive. Sure, that was touted as a demand / production problem, too.

Smoke > Fire.



londiste said:


> Interesting how Nvidia plans are leaking these days. AIBs, I suppose?
> After all the rumors, this 20GB 3080 Ti does kind of make sense though. From the specs, its sole purpose is to match the RX 6900 XT (which undercuts the 3090 primarily on price).



The 20GB card made sense to begin with, but only because the 3080 is somehow based on 10GB. The VRAM capacity still makes no sense in the larger picture. 12 or 16 would have been much more suitable. Realistically, their 20GB release would completely cannibalize their halo 3090, even before they managed to sell them proper, and also makes the 3080 less competitive in a way.

Watch this gen's VRAM requirements unfold... already it's looking like the 3070 might not be in an optimal place either. This also fits right in with Nvidia pre-empting those console announcements and overpromising on availability. They know they're screwed.


----------



## Vya Domus (Nov 4, 2020)

Vayra86 said:


> The VRAM capacity still makes no sense in the larger picture.



It makes sense to them; see, 10 GB appears to be too little and 20 GB too much. Psychologically, you need to either settle for something that you know isn't quite enough or bite the bullet and pay the extra $300 or whatever this one will cost.


----------



## Vayra86 (Nov 4, 2020)

Vya Domus said:


> It makes sense to them; see, 10 GB appears to be too little and 20 GB too much. Psychologically, you need to either settle for something that you know isn't quite enough or bite the bullet and pay the extra $300 or whatever this one will cost.



It doesn't fit with previous Nvidia gens, where every time they offered sufficient VRAM with some wiggle room on top in the high end. Nvidia still has some outs like DLSS and other proprietary stuff they announced with Ampere, but still... I think they got blindsided here. Above all, Nvidia is a company that wants to sell and wants positive mindshare. That is what brought them where they are now. Not shitty releases that aren't quite enough. This is an exception to the rule.


----------



## TumbleGeorge (Nov 4, 2020)

10GB... 12GB via 384-bit vs 20GB via 320(?)-bit vs 24GB via 384-bit?

Guess:
RTX 3080 10GB: $100-150 off, from $699 down to $549-599, to compete with the RX 6700 XT?
RTX 3080 12GB: to compete with the RX 6800 (without "XT")
RTX 3080 Ti 20GB: to compete with the RX 6800 XT, RX 6900 XT, and the RTX 3090 too. To compete well with the RX 6800 XT, the RTX 3080 Ti 20GB would need a price equal to the launch MSRP of the RTX 3080 10GB ($699), or something close, $749-799(?)

Panic defense?


----------



## ne6togadno (Nov 4, 2020)

marketing wars.
nvidia has trouble providing quantities of their new lineup.
ppl on the internet are considering canceling preorders and switching to "big navi".
so amd pours oil on the fire by asking a partner to announce a 12gb vram requirement (which obsoletes 2/3 of the current nvidia lineup and 100% of last gen) for their upcoming game.
minutes later a leaker leaks news of a 3080ti with 20gb vram (because, you know, 20>16).
i'll bet a pint that we'll see more ti/super leaks soon.
and another pint that godfall at 4k ultra settings will play at 60 fps just fine on an 8gb card, if the gpu itself has enough hp to do 4k@60


----------



## The Quim Reaper (Nov 4, 2020)

People missing out on early 3080 stock haven't just dodged a bullet but a full on artillery barrage...


----------



## Vayra86 (Nov 4, 2020)

ne6togadno said:


> marketing wars.
> nvidia has trouble providing quantities of their new lineup.
> ppl on the internet are considering canceling preorders and switching to "big navi".
> so amd pours oil on the fire by asking a partner to announce a 12gb vram requirement (which obsoletes 2/3 of the current nvidia lineup and 100% of last gen) for their upcoming game.
> ...



You can run games with insufficient VRAM just fine. Stutter doesn't always appear in reviews and canned benchmarks. But I'm sure the W1zz is right on the money trying to figure that out if the moment arrives. Nvidia can deploy a lot of driver trickery to still provide a decent experience, they've done similar with the 970 for example - all the VRAM related bugs were fixed on that GPU.

Does that all TRULY mean there is enough VRAM though? That is debatable. The real comparison is side by side and careful analysis of frametimes. Gonna be interesting.


----------



## londiste (Nov 4, 2020)

Vayra86 said:


> The 20GB card made sense to begin with, but only because the 3080 is somehow based on 10GB. The VRAM capacity still makes no sense in the larger picture. 12 or 16 would have been much more suitable. Realistically, their 20GB release would completely cannibalize their halo 3090, even before they managed to sell them proper, and also makes the 3080 less competitive in a way.


You are looking at the reasoning from the wrong side. 12 or 16 do not make inherently more sense than 10 or 20, we are just more used to seeing these capacities.

GA102 has a 384bit memory bus that Nvidia can play with. For some reason they did not want to do a full-width card as x80, product segmentation is definitely a big reason but I would suspect not the only one (also power or yields perhaps) especially with the rumored 3080Ti still having 320-bit bus. 16gb would mean going down to 256-bit memory bus and they seem to want to avoid going there probably because of sizable hit to bandwidth. Basically, lots of considerations.
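The channel math behind this can be sketched numerically. Assuming the standard layout where each GDDR6/GDDR6X package sits on its own 32-bit channel and ships in 1 GB or 2 GB densities (the densities available at the time), the capacity options fall straight out of the bus width; a quick illustrative Python sketch:

```python
# Each GDDR6/GDDR6X package occupies its own 32-bit channel, so the chip
# count is fixed by the bus width; total capacity is chips * chip density.
CHANNEL_WIDTH = 32  # bits per memory package

def capacity_options(bus_width_bits, densities_gb=(1, 2)):
    """Map each chip density (GB) to the resulting total VRAM capacity (GB)."""
    chips = bus_width_bits // CHANNEL_WIDTH
    return {d: chips * d for d in densities_gb}

for bus in (256, 320, 384):
    opts = capacity_options(bus)
    print(f"{bus}-bit bus -> {bus // CHANNEL_WIDTH} chips -> "
          f"{opts[1]} GB (1 GB chips) or {opts[2]} GB (2 GB chips)")
```

Which is why a 320-bit card lands on 10 GB or 20 GB, and a 16 GB configuration would indeed mean dropping to a 256-bit bus.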



Vayra86 said:


> Nvidia can deploy a lot of driver trickery to still provide a decent experience, they've done similar with the 970 for example - all the VRAM related bugs were fixed on that GPU.


There were no real bugs to be fixed; that whole 3.5+0.5 GB split was a design problem in hardware. From what I could see at the time, what Nvidia eventually did as a workaround was forcing basic usage, like Windows' Aero and other non-gaming stuff (which did not care about speed that much), onto that last 0.5 GB and using it for 3D loads only when absolutely necessary. It did help that there were only a couple of games in the first couple of years of the GTX 970's life span that actually had memory consumption in the 3.5-4 GB range, effectively making the impact surface surprisingly small.


----------



## Vayra86 (Nov 4, 2020)

londiste said:


> You are looking at the reasoning from the wrong side. 12 or 16 do not make inherently more sense than 10 or 20, we are just more used to seeing these capacities.
> 
> GA102 has a 384bit memory bus that Nvidia can play with. For some reason they did not want to do a full-width card as x80, product segmentation is definitely a big reason but I would suspect not the only one (also power or yields perhaps) especially with the rumored 3080Ti still having 320-bit bus. 16gb would mean going down to 256-bit memory bus and they seem to want to avoid going there probably because of sizable hit to bandwidth. Basically, lots of considerations.



I agree, I think this started with the choice for Samsung and having to adapt to those limitations. But in the end we're looking at these products as customers right? 10GB doesn't seem like the optimal selling point for capacity, why would Nvidia design something that's a choice of evils? They even know they have work to do to gain parity with consoles if you look at some of the tech they announced, like RTX IO. You're right, lots of considerations, I question whether they made the right ones.


----------



## owen10578 (Nov 4, 2020)

londiste said:


> GA102 has a 384bit memory bus that Nvidia can play with. For some reason they did not want to do a full-width card as x80, product segmentation is definitely a big reason but I would suspect not the only one (also power or yields perhaps) especially with the rumored 3080Ti still having 320-bit bus. 16gb would mean going down to 256-bit memory bus and they seem to want to avoid going there probably because of sizable hit to bandwidth. Basically, lots of considerations.



Nah, definitely not yield reasons: if you look at all the 3080 cards, they all have missing VRAM chips in the same spot. If the GPU dies were binned for bad memory controllers, it would be impossible for all of them to have a defective controller on the same channel. This is purely a product segmentation decision. Power reasons not so much either; adding an extra 64-bit channel won't add much more than probably 10 W at the most. They kept the "3080 Ti", if true, at 320-bit just so the 3090 still has some sort of advantage, at least on paper, but I suspect it won't have a real performance impact other than in some edge cases.


----------



## londiste (Nov 4, 2020)

ne6togadno said:


> so amd pours oil on the fire by asking a partner to announce a 12gb vram requirement (which obsoletes 2/3 of the current nvidia lineup and 100% of last gen) for their upcoming game.


Imagine the outrage if Nvidia did something like that 



owen10578 said:


> Power reasons not so much either, adding an extra 64-bit channel won't add much more than probably 10W at the most.


This is something that seems very strange to me for 3080/3090. Unless GPU-Z returns crap data, MVDDC usage is quite high and GPU seems to draw less power than I would expect from total. Check the GPU Chip Power Draw and MVDDC Power Draw results in GPU-Z. The numbers themselves vary but the relative amounts seem surprising to me. Also, 3090 cards seem to have heavy-duty backplates, I do not remember seeing backplates with heatpipes before.

For comparison, my 2080 has 200-210W of the reported power draw on the GPU and about 20W on RAM plus minor amounts left over for other stuff. I seem to remember GPU taking the majority of power budget from earlier generations as well. Was something changed in how this stuff is reported or is GDDR6X really this power hungry?

Just as an example, the first 3080 GPU-Z screenshot Google search returned:


----------



## ZoneDymo (Nov 4, 2020)

ShurikN said:


> The 3090 is never going to get a price cut that soon, if ever; that's why we are getting a 3080 Ti



Sadly this ^. Nvidia does not drop pricing, because that would mean admitting fault to a degree.
Instead they just launch new SKUs to compete.


----------



## owen10578 (Nov 4, 2020)

londiste said:


> This is something that seems very strange to me for 3080/3090. Unless GPU-Z returns crap data, MVDDC usage is quite high and GPU seems to draw less power than I would expect from total. Check the GPU Chip Power Draw and MVDDC Power Draw results in GPU-Z. The numbers themselves vary but the relative amounts seem surprising to me. Also, 3090 cards seem to have heavy-duty backplates, I do not remember seeing backplates with heatpipes before.
> 
> For comparison, my 2080 has 200-210W of the reported power draw on the GPU and about 20W on RAM plus minor amounts left over for other stuff. I seem to remember GPU taking the majority of power budget from earlier generations as well. Was something changed in how this stuff is reported or is GDDR6X really this power hungry?



Yes, GDDR6X seems to be much more power hungry on the controller and memory chips, judging by all the more thought-out memory cooling solutions. But again, just one extra chip and an extra 64-bit channel won't draw that much more power. I'd be very surprised if someone can prove me wrong on this.


----------



## Calmmo (Nov 4, 2020)

So.. basically, nvidia have been scrambling trying various different configs to fill in the perf/price gap with AMD.


----------



## TumbleGeorge (Nov 4, 2020)

Nvidia's problem is complacency that built up over the years, when they led and everything was allowed to them, including playing with the prices of their products however they pleased. They literally lulled themselves into believing this would go on forever and that they shouldn't even have to make much of an effort to maintain the status quo.


----------



## RedelZaVedno (Nov 4, 2020)

LOL, Jensen has done it again! Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.

2080 Ti $999 -> 3080 Ti $999+ probably,
2080 $699 -> 3080 $699
2070 $499 -> 3070 $499
2060S $399 -> 3060 Ti $399 probably

xx60 MID RANGE class GPU costing the same amount as an entire console. Pure madness.


----------



## puma99dk| (Nov 4, 2020)

I am not surprised; this is probably the reason they cancelled the 20 GB version of the RTX 3080: it would get too expensive.

Welcome to the Nvidia screw-over train: we take your money and screw you over after a little while 

The same happened to Titan X (Pascal) owners: when the GTX 1080 Ti launched, it was cheaper and faster at a lot of things than the Titan was. Then Nvidia phased it out, because they can't have a card that outperforms their Titan, and launched a new Pascal Titan with even more CUDA cores


----------



## Shatun_Bear (Nov 4, 2020)

This might be the first Ampere card that's not hamstrung by small memory outside of the 3090.

Unfortunately it'll be priced $200 too high most probably.


----------



## ShurikN (Nov 4, 2020)

RedelZaVedno said:


> Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.


I'm pissed a lot, but there is nothing to do, considering AMD will follow suit with prices. The price war has died.


----------



## SIGSEGV (Nov 4, 2020)

Panic mode. 
I am so happy with this news. haha. 
another nuclear reactor to warm up the house in the upcoming winter season. 
survivor?


----------



## Legacy-ZA (Nov 4, 2020)

This is going to grind the gears of early adopters. Want to know what will grind them even more? When refreshes pop up on the 7nm node next year, they will probably be called the Super Hyper Ultra Edition, for the lolz. Then we can also have more models to pick from for extra confusion, at prices that aren't mainstream... and of course, you can have one for a mere $1 000 000 from a scalper near you.


----------



## ne6togadno (Nov 4, 2020)

Vayra86 said:


> You can run games with insufficient VRAM just fine. Stutter doesn't always appear in reviews and canned benchmarks. But I'm sure the W1zz is right on the money trying to figure that out if the moment arrives. Nvidia can deploy a lot of driver trickery to still provide a decent experience, they've done similar with the 970 for example - all the VRAM related bugs were fixed on that GPU.
> 
> Does that all TRULY mean there is enough VRAM though? That is debatable. The real comparison is side by side and careful analysis of frametimes. Gonna be interesting.


through the years of playing games and following game requirements, i have the impression that devs decide game requirements, and vram in particular, either by the video cards available (or soon to come, a month or 2 max) on the market, or more often just by playing darts.

on my 290x 4gb i've played titles that "required" 6 or even 8gb of vram just fine. i've dialed texture quality up above recommended when i didn't like how the game looked, and still had no problems from lack of vram. so judging what is enough based on game requirements is a bit pointless.
set a price range. check what meets your performance requirements. buy the card with the highest amount of vram that fits your price and you are good to go. by the time games look too ugly because you had to lower textures, the card will be long dead fps-wise.
as for the 970, the problem was never the amount of vram. the slow 0.5gb was what caused the problems, as it tanked performance very hard. once nvidia isolated those 0.5gb with drivers, 970s worked fine even with titles that required 4+ gb of vram.

on a tech level the two camps take different approaches to vram limitations.
nvidia's lossless compression allows them to keep lower capacities and narrower buses while preserving higher performance, so they fit as little memory as possible for bigger margins.

with gcn, amd had to throw a lot of memory bandwidth at the problem (the bus for the 7970 was 384-bit, the 290x was 512-bit, and the hbm cards went wider still: fury 4096-bit, vega 2048-bit, vii 4096-bit) to provide enough "fuel" for the gpu, but it was never enough. from rdna, amd's memory bus tops out at 256-bit, which before was for their midrange cards (no doubt the 5700xt itself is a midrange card), and now with rdna2 even their top-tier 6900 has a 256-bit bus. sure, the new cache provides higher speeds, but you still need to feed that cache at adequate speeds, and amd thinks that what used to be a midrange bus is now enough even for a flagship.
i think the 16gb of vram in amd's cards is more targeted at feeding that cache (like loading all textures into vram so the cache has instant access without calls to ram/storage), and/or they believe they can get a significant performance boost from direct cpu access to vram, so they made sure to provide enough vram for devs to play with.
it will be interesting to see if those things really help amd
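As a rough numbers-check on the bus-width point above: peak memory bandwidth is just bus width times per-pin data rate divided by 8. A small Python sketch using the public spec-sheet figures of a few of the cards mentioned (note this simple formula ignores RDNA2's on-die Infinity Cache, which is exactly the thing AMD is leaning on):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width (bits) / 8 * per-pin rate (Gbps)."""
    return bus_width_bits / 8 * data_rate_gbps

# (card, bus width in bits, per-pin data rate in Gbps) -- public spec numbers
cards = [
    ("R9 290X (512-bit GDDR5 @ 5 Gbps)",     512, 5.0),
    ("RX 5700 XT (256-bit GDDR6 @ 14 Gbps)", 256, 14.0),
    ("RX 6900 XT (256-bit GDDR6 @ 16 Gbps)", 256, 16.0),
    ("RTX 3080 (320-bit GDDR6X @ 19 Gbps)",  320, 19.0),
]
for name, bus, rate in cards:
    print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")
```

So a 256-bit RDNA2 card has raw bandwidth (512 GB/s) well below a 320-bit GDDR6X card (760 GB/s), and AMD is betting that the large on-die cache makes up the difference.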



londiste said:


> Imagine the outrage if Nvidia did something like that


i don't have to imagine anything. they already did it with hairworks, forced tessellation, gameworks (or whatever it was called) extensions, and physx. i don't remember the outrage though. 
now that amd holds the consoles and devs have to optimize for amd's hardware, the coin has flipped, and nvidia gets quite jumpy whenever something comes close to taking away the "performance crown". 
a single game announcement is enough to cause... leakages  
btw physx has been open source for some time now


----------



## okbuddy (Nov 4, 2020)

$50 cheaper than 3090


----------



## RedelZaVedno (Nov 4, 2020)

ShurikN said:


> I'm pissed a lot, but there is nothing to do, considering AMD will follow suit with prices. The price war has died.



Yeah, it seems like AMD has chosen higher profit margins over gaining larger market share. I'm getting out of the GPU market: holding on to my 1080 Ti as long as I can and then buying something for 300 bucks on the 2nd hand market. I'm unwilling to support greed.


----------



## Sicofante (Nov 4, 2020)

You all think this is just another gaming card? How naive... 

The 3090 is a workstation class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVlink and get a much needed 48 GB for 3D rendering. 

The 3080 Ti fills the gap for low-end workstations. The 3080 10 GB just doesn't cut it for rendering or even complex video editing and FX. AMD's 6900 XT was looking like the right purchase until this announcement. 

That's why this 3080 Ti makes tons of sense outside gaming. I, for one, will buy it the instant I can find it in stock.


----------



## Calmmo (Nov 4, 2020)

Sicofante said:


> The 3090 is a workstation class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVlink and get a much needed 48 GB for 3D rendering.



lol, no.
Those VFX studios are buying Mac Pro's


----------



## Sicofante (Nov 4, 2020)

Calmmo said:


> lol, no.
> Those VFX studios are buying Mac Pro's


Clueless.


----------



## Nater (Nov 4, 2020)

Who cares. 

"Auto-Notify"
"Not-In-Stock"
"Backordered"


----------



## Calmmo (Nov 4, 2020)

Sicofante said:


> Clueless.


bro, they don't even have double precision or driver support. Wake up


----------



## ShurikN (Nov 4, 2020)

Calmmo said:


> lol, no.
> Those VFX studios are buying Mac Pro's


And quadros


----------



## CrAsHnBuRnXp (Nov 4, 2020)

Called it. I said from the start when the 3080 20GB was rumored it would be a TI version.


----------



## N3M3515 (Nov 4, 2020)

Vya Domus said:


> Nvidia is still acting arrogantly, refusing to compete in terms of price and instead offering a largely worthless single-digit performance differential.



Pride.


----------



## medi01 (Nov 4, 2020)

Leaked video:


----------



## N3M3515 (Nov 4, 2020)

RedelZaVedno said:


> LOL, Jensen has done it again! Price hikes of Turing are here to stay and no one seems to be pissed about it anymore.
> 
> *1080Ti $700* - 2080TI *$1200* -> 3080TI *+$1200* probably,
> *1080 $500* - 2080 $699 -> 3080 $699
> ...



Added for perspective.


----------



## SIGSEGV (Nov 4, 2020)

Sicofante said:


> You all think this is just another gaming card? How naive...
> 
> The 3090 is a workstation class GPU



Oh pls god, not this kind of shit again.


----------



## nguyen (Nov 4, 2020)

If I'm not mistaken, Nvidia kept their secrets really close to their chest this time; no leaker got the core counts right for the Ampere lineup. It wouldn't surprise me if these leaked specs turn out false.


----------



## Durvelle27 (Nov 4, 2020)

I really don't understand. Why release a 3080 Ti that only offers up to 10% over the standard 3080? That makes little sense, as AIB versions will likely run $1200+. It would also make the 3090 obsolete, which is likewise targeted at gaming, as it's not a Titan replacement


----------



## medi01 (Nov 4, 2020)

nguyen said:


> If I'm not mistaken Nvidia kept their secrets really close to their chests, no leaker has got the core counts right for Ampere line ups, wouldn't surprise me if these leaked specs turns out false.



The leaked CU counts were spot on; then Huang thought "hey, since I can do fp+fp, why don't I claim double the per-CU figure, for the naive and clueless".



Durvelle27 said:


> I really don't understand. Why release a 3080 Ti that only offers up to 10% over the standard 3080?


Did you get "why release a card 20-30% faster than the 2080 Ti for $699"?

AMD is kicking NV's ass; NV is trying to react.
They don't seem to be able to make money with the 3080, and the Ti, for die-hard fans who are not die-hard enough to pay $1500, is coming for that market.


----------



## londiste (Nov 4, 2020)

N3M3515 said:


> 1080Ti $700 - 2080TI $1200 -> 3080TI +$1200 probably,
> 1080 $500 - 2080 $699 -> 3080 $699
> 1070 $350 - 2070 $499 -> 3070 $499
> 1060 $250 - 2060S 399$ -> 3060TI $399 probably
> ...


So, this gets a bit tricky.
1080 was released at $600 and got reduced to $500 10 months later when 1080Ti was released.
1070 was released at $380.

Turing is out of their usual line in any case but Ampere is only out of line when you look at the card/GPU names. The chips behind these names are not that far off.
1080Ti was (cut down) GP102 at $700 - 3080 is (cut down) GA102 at $700
1080 was GP104 at $500 - 3070 is GA104 at $500
1070 was (cut down) GP104 at $380 - no direct Ampere match yet.



Durvelle27 said:


> I really don't understand. Why release a 3080 Ti that only offers up to 10% over the standard 3080? That makes little sense, as AIB versions will likely run $1200+. It would also make the 3090 obsolete, which is likewise targeted at gaming, as it's not a Titan replacement


It is a response to 6900XT which should make 3090 obsolete in any case.



medi01 said:


> Did you get "why release card 20-30% faster than 2080Ti for $699"?
> AMD is kicking NV's ass, NV is trying to react.


Let's wait for reviews. Right now, AMD's Big Navi seems to have basically matched what Nvidia has in Ampere. They undercut the 3090 with the 6900 XT, which isn't likely to matter to a lot of people, and they found a nice niche for the 6800, with 3 SKUs of the high-end die compared to the 2 SKUs Nvidia has.

When it comes to prices - it is all about consoles. $500 box with a (hopefully) 2080/2080S class performance is the competition. Both Nvidia and AMD need to one-up that at the same price which is what 3070 did and AMD's response to that will do as well. Anything more expensive better come with a hefty performance boost over that and thus the cards we got from both.


----------



## AusWolf (Nov 4, 2020)

Another day, another nvidia rumour.


----------



## moproblems99 (Nov 4, 2020)

Sigh....I really don't care which gpu I get as long as it is 20% or more over a 2080ti.  I want one available before CP2077.  Red, green, blue.  Don't care.


----------



## N3M3515 (Nov 4, 2020)

londiste said:


> So, this gets a bit tricky.
> 1080 was released at $600 and got reduced to $500 10 months later when 1080Ti was released.
> 1070 was released at $380.
> 
> ...


Price hike anyway, end users don't care about codenames.


----------



## thesmokingman (Nov 4, 2020)

That's funny, they be reacting with more vapor cards that won't exist for a long time.


----------



## TumbleGeorge (Nov 4, 2020)

thesmokingman said:


> That's funny, they be reacting with more vapor cards that won't exist for a long time.


After Jensen semi-baked the 30-series in his oven, now it's getting boiled in water?! This is likely to cause hysteria in some beta testers.

I agree with the conclusion of hard times for the big scalpers who bet too much on Nvidia, as opposed to the success of those who bought only a few cards and managed to sell them on.


----------



## Makaveli (Nov 4, 2020)

RedelZaVedno said:


> Yeah, it seems like AMD has chosen higher profit margins over gaining larger market share. I'm getting out of the GPU market: holding on to my 1080 Ti as long as I can and then buying something for 300 bucks on the 2nd hand market. I'm unwilling to support greed.



That is how it works when you are a publicly traded company with shareholders.

Any company in this position would do the same, so I guess they would all be greedy, based on what you are saying.


----------



## xman2007 (Nov 4, 2020)

Funny seeing people hating on AMD. Ngreedia are the ones who set the absurd prices with Ampere, and most of you probably went out and bought them. Now AMD has parity and is pricing their GPUs the same, and suddenly they're the bad guys? Let's face it: those complaining would only have waited for Nvidia to drop their prices if AMD had an equal performer at a lower price. But AMD's cards seem to be equal or even better, and they're getting hate because they're not half the price of Nvidia's.


----------



## Dave65 (Nov 4, 2020)

Nvidia seems to be in panic mode over the 16 GB in the AMD flavour. And on top of all that, it will be a paper launch like the rest; not saying AMD won't be the same tho..


----------



## Max(IT) (Nov 4, 2020)

Here we go...  nvidia filling the gaps artificially created since the beginning.


----------



## TumbleGeorge (Nov 4, 2020)

Dave65 said:


> Nvidia seems to be in panic mode over 16 gb in the AMD flavor. And on top of all that it will be a paper launch like the rest, not saying AMD won't be the same tho..


...as if more VRAM were the only reason. The panic mode exists because there is more than one reason. Most of them have been exposed already, but the real number of reasons may only be counted after the RX 6000 reviews, or later.


----------



## TheLostSwede (Nov 4, 2020)

Vayra86 said:


> Again? Oh of course, the rumored Nvidia Ampere stack is full, as they've already had a 3060ti, 3070ti... next week we'll probably see another bit of rumor about the 3060ti. Rinse and repeat?


Why wait until then, when you can have it today?

GIGABYTE GeForce RTX 3060 Ti EAGLE already on sale in Saudi Arabia - VideoCardz.com
Saudi retailer already offers GeForce RTX 3060 Ti from Gigabyte: Silicon Valley, a retailer from Saudi Arabia, is already offering a GeForce RTX 3060 Ti graphics card, according to the company's website. The seller claims that the card is already in stock and is being offered in stores and online... (videocardz.com)

NVIDIA GeForce RTX 3060 and RTX 3050 Ti rumored to feature GA106 GPU - VideoCardz.com
A fresh rumor from Kopite7kimi about GA106-based GeForce RTX 30 graphics cards. The Twitter user that brought us numerous confirmed leaks in the past has now provided an update on GA106-based models. According to his information, both RTX 3060... (videocardz.com)


----------



## Max(IT) (Nov 4, 2020)

TheLostSwede said:


> Why wait until then, when you can have it today?
> 
> 
> 
> ...


Well, that is really strange...


----------



## InVasMani (Nov 4, 2020)

Unless it comes with a leather jacket not interested. I'd say Nvidia appears quite concerned though to be prepping all these additional SKU's to drop or at least be announced. It's clear enough AMD has got them at least a little bit startled and that's probably great for everyone that there appears to be some healthy competition.


----------



## MxPhenom 216 (Nov 4, 2020)

ExcuseMeWtf said:


> Announced Godfall system requirements and suddenly borderline vaporware RTX 3080 non-Ti not looking so hot anymore with its 10 GB



For 4K. Relax.


----------



## HD64G (Nov 4, 2020)

Panic button for nVidia. I think they will stop making 3090s and will sell the same die with a bit less VRAM as a 3080 Ti, in order not to cut the 3090's crazy price now that the 6900XT has made it irrelevant. And the 3080 Ti will go on sale for $999-1099 to match the 6900XT's place in the market. My 5c.


----------



## InVasMani (Nov 4, 2020)

Yeah, I think RDNA2 has been a bit disruptive to Nvidia's comfort level with Ampere's performance and price structure, for certain. I wouldn't say Nvidia is "worried", but at the same time they certainly aren't going to ignore it and hope it goes away. It's a real threat to potential profits and market share, and they want all the monies.


----------



## Sicofante (Nov 4, 2020)

Calmmo said:


> bro, they don't even have double precision or driver support. Wake up


Again: clueless. 3D rendering uses single precision and Studio drivers are good enough. That's why most studios don't use Quadros anymore. Even Nvidia is phasing out Quadros...



ShurikN said:


> And quadros


Nope, sorry. VFX Studios haven't used Quadros for a while and you definitely don't need them for Maya or Blender.



SIGSEGV said:


> Oh pls god, not this kind of shit again.


What kind of shit exactly?

You all take a look at what Puget Systems, for instance, is putting in the workstations they sell. Read a little.


----------



## lexluthermiester (Nov 4, 2020)

Vayra86 said:


> Again? Oh of course, the rumored Nvidia Ampere stack is full, as they've already had a 3060ti, 3070ti... next week we'll probably see another bit of rumor about the 3060ti. Rinse and repeat?


I'm ok with a full product stack. I'd like to see AMD follow that strategy. Lots of choice means people can fill their needs with something that fits their budget. Given what we know about how binning works, it only makes sense.


----------



## Vya Domus (Nov 4, 2020)

Sicofante said:


> You all think this is just another gaming card? How naive...
> 
> The 3090 is a workstation class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVlink and get a much needed 48 GB for 3D rendering.
> 
> ...



The only naive thing is believing that the 3090 really is a workstation card. It's hardly halfway there: its compute capabilities are severely cut down, and it only has half the memory of the actual workstation cards, which are the Quadros.


----------



## ExcuseMeWtf (Nov 4, 2020)

MxPhenom 216 said:


> For 4K. Relax.



Did that look stressed to you?

Or do you plan to buy RTX 3080 for exclusive 1080p 60 fps experience?


----------



## AusWolf (Nov 4, 2020)

lexluthermiester said:


> I'm ok with a full product stack. I'd like to see AMD follow that strategy. Lots of choice means people can fill their needs with something that fits into their budget. Given what we know about how binning works, it only make sense.


The thing is, nvidia already has a full product stack:

it starts with the almost ready 3060,
then at least 3 different variants of the non-existent 3060 Ti,
the only-real-product 3070,
then again 5 dreamed up variants of the 3070 Ti,
the gone-with-the-wind 3080 with 4 theoretical Ti revisions,
the can't-touch-this 3090 and the Maybe Titan to top it all off.
I'm pretty sure Jensen is cooking the latest Super news as we speak just to satisfy us even more.


----------



## Nater (Nov 4, 2020)

AusWolf said:


> The thing is, nvidia already has a full product stack:
> 
> it starts with the almost ready 3060,
> then at least 3 different variants of the non-existent 3060 Ti,
> ...


Man I just heard the Street Fighter announcer.

Nvidia RTX QUADRO GEFORCE 3100X Ti NITRO SUPER TOURNAMENT ULTRA TITAN EDITION!


----------



## Deleted member 190774 (Nov 4, 2020)

You'd hope NVidia were working on a 3080 Ti, but honestly, it's too soon. If I'd bought a 3080 at release, I would be fairly annoyed by a successor (flagship model) arriving so soon.

Clearly a knee-jerk reaction to AMD's line-up.


----------



## medi01 (Nov 4, 2020)

xman2007 said:


> Funny at people hating on AMD, Ngreedia are the ones who set the absurd prices with ampere, and most of you probably went out and bought them, now AMD has parity* and are pricing their gpu's the same their the bad guy*, let's face it, those complaining would have only waited for nvidia to drop their prices when AMD had an equal performer at a lower price, but they seem to be equal or even better and getting hate because they're not half the price of nvidia



You are getting that wrong.
It is the other way around: Huang knew new Navi was coming, so no more fun with pricing.
Compare it to Turing: 2080 Ti + 20-30% for $699. Why so cheap?


----------



## Bones (Nov 4, 2020)

medi01 said:


> I guess this means:
> 
> View attachment 174363


Nah - More like it's safe to get those bots ready to buy.......
And they will if history is any indication of things. 
Lather, rinse, repeat.


----------



## InVasMani (Nov 4, 2020)

ExcuseMeWtf said:


> Did that look stressed to you?
> 
> Or do you plan to buy RTX 3080 for exclusive 1080p 60 fps experience?


Based on what I've seen of RDNA2 at 4K and 1440p, and how things scale and tilt significantly going from 4K down to 1440p, I wouldn't hold my breath on Ampere being better at 1080p in value for dollar. We shall see in due time how it all plays out and unfolds.


----------



## MxPhenom 216 (Nov 4, 2020)

ExcuseMeWtf said:


> Did that look stressed to you?
> 
> Or do you plan to buy RTX 3080 for exclusive 1080p 60 fps experience?



I am going to buy whatever is fastest and available Q1 next year. I don't care if it's red or green.


----------



## xman2007 (Nov 4, 2020)

medi01 said:


> You are getting that wrong.
> It is the other way around, Huang new Navi is coming so no more fun with pricing.
> Compare it to turing, 2080Ti + 20-30% for $699, why so cheap?


It's not cheap though. You still have $500/$700 and now $1500 pricing, and as per the latest news, the 2080 Ti will slot nicely into the $800+ price point, regardless of the performance improvement. Here's the thing: with new GPUs the performance is MEANT to improve on last gen, go figure huh? That doesn't mean that in 2023 you should be paying $1500 for a **70 because it's 3x better than a 2070 from 2019.


----------



## Paganstomp (Nov 4, 2020)

get to work you Chinese slaves! Americans want their product!


----------



## lexluthermiester (Nov 4, 2020)

Paganstomp said:


> get to work you Chinese slaves! Americans want their product!


Actually, it's Korea and Taiwan.


----------



## WeeRab (Nov 4, 2020)

Sicofante said:


> You all think this is just another gaming card? How naive...
> 
> The 3090 is a workstation class GPU that will be sold in droves to VFX studios and freelancers, who will buy them in pairs to use NVlink and get a much needed 48 GB for 3D rendering.
> 
> ...



More fool you then.  The 3090 doesn't come with the production-specific drivers that would make it that good outside of gaming. (nor will a 3080ti)
 All that will come with the £3000 Titan variant.


----------



## wolf (Nov 5, 2020)

medi01 said:


> AMD is kicking NV's ass, NV is trying to react.


They haven't even launched their products yet, no consumers have them, and Nvidia is getting its ass... kicked?

RIP Nvidia, right? How could they possibly survive this flogging.


----------



## gmn 17 (Nov 5, 2020)

If its US$899 with these specs, then I’ll get one


----------



## Xex360 (Nov 5, 2020)

Besides availability, they just need to adjust their pricing: the 3070 should be priced around $450, the 3080 at $600, and obviously the prices should be the same everywhere there are no excessive taxes.


----------



## InVasMani (Nov 5, 2020)

wolf said:


> They haven't even launched their products yet, no consumers have them and Nvidia is getting it's ass.... kicked?
> 
> RIP Nvidia right? How could they possibly survive this flogging.


I get what you're saying, but Nvidia is certainly reacting, though getting its ***...kicked is a healthy exaggeration, of course. It's one generation of GPUs, and sure, AMD might have the better architecture, but they still have to move the product and have enough SKUs to stay flexible against the range Nvidia offers, which could take away sales that might otherwise go to AMD. Beyond all of that, Nvidia is financially very sound; they could probably survive three relatively terrible generations of architectures, kind of on par with Fermi, and still be on fairly even footing with AMD. Right now Nvidia's financial situation is sounder than AMD's by a large margin, and Ampere will in no way make or break the company.

On the other hand, RDNA2 won't make or break AMD either, but it will make a notable impact on their future if they execute well and move enough of those chips with enough flexible option points. How that all plays out is still somewhat undecided. I think AMD has a good design based on what I see, but honestly the RX 6800 is a bit higher in price and TDP than I'm leaning towards, though it's close. If they can scale it down a little further at a better power draw and price, that would be great. I want to see what an RX 6700 or RX 6750 looks like, possibly even an RX 6600 or RX 6650.


----------



## Metroid (Nov 5, 2020)

Yeah this was the only choice they have at moment. I predicted this.


----------



## wolf (Nov 5, 2020)

InVasMani said:


> I get what you're saying, but Nvidia is certainly reacting, though getting its ***...kicked is a healthy exaggeration of course.
> ...


Exactly, it's quite the exaggeration. Nvidia is likely just getting its ducks in a row to effectively answer AMD's products, maybe with price tweaks or additional SKUs, but this notion that they're getting their ass kicked and are sweating bullets, or crying in a corner, is just ridiculous. They will just do what they need to do, and you know what, they'll be just fine at the end of the day. To say that for the past ~two generations AMD got its ass kicked would be far more accurate.


----------



## dinmaster (Nov 5, 2020)

AMD having the console market will help a lot with optimizing for their products, even on PC. The CPU-GPU connection that lets the CPU use GPU resources is probably something that came from console R&D. AMD is in a good position for a comeback, like with their CPUs. Sure, RDNA 1, then 2, then 3 could be a push ahead like Zen 3; I like the strategy AMD is executing currently. If they master the power usage, then they will have a real winner on their hands even if performance is slightly behind. Power prices here have gone up, and at peak times it's 20.5c CAD per kWh...
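Since the post above turns on electricity pricing, here is a quick back-of-envelope sketch of what board power actually costs at the 20.5c CAD/kWh peak rate quoted. The wattages and gaming hours below are illustrative assumptions, not measured figures:

```python
# Rough electricity-cost estimate: how much does a card's board power cost
# per year at 20.5 cents CAD per kWh (the peak rate quoted above)?

def annual_cost_cad(watts, hours_per_day, rate_per_kwh=0.205, days=365):
    """Energy cost of running a load at `watts` for `hours_per_day` every day."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

# Hypothetical comparison: a 320 W card vs a 250 W card, 4 hours of gaming a day
cost_hi = annual_cost_cad(320, 4)
cost_lo = annual_cost_cad(250, 4)
print(f"320 W: ${cost_hi:.2f}/yr, 250 W: ${cost_lo:.2f}/yr, "
      f"difference: ${cost_hi - cost_lo:.2f}/yr")
```

On these assumed numbers the gap is around twenty CAD a year, which is why a perf/watt edge matters more the higher local rates climb.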


----------



## purecain (Nov 5, 2020)

Didn't the new PlayStation use a weaker GPU compared to the new Xbox, with the Infinity Cache helping out? They state the performance lift is there, so the 5900X and the 6900XT look like the CPU/GPU combo to go for.
Even if Nvidia releases a 3080 Ti now, the damage has already been done. Their availability and Nvidia's bad choices have really let them down this time around. I bet those who bought the 3080 are not feeling too great about the Ti model being released so quickly after they were reassured there wasn't going to be one (not for a while, in any case).
Although what's announced and what actually gets to market are two different things altogether this year.

I'm praying AMD pull it off and get back on top form rolling into 2021 as top dog... I can't wait to have a Radeon card in what will be an almost fully AMD system (again).


----------



## wolf (Nov 5, 2020)

purecain said:


> I bet those who bought the 3080 are not feeling too great about the Ti model being released so quickly after they were reassured there wasn't going to be one (not for a while, in any case).


Not even slightly feeling what you suggest. Tech ALWAYS gets superseded; the "sooner or later" changes, but it always happens. Anyone buying any tech should be comfortable with that notion, or you're setting yourself up for disappointment. I got a massive upgrade (up from a GTX 1080) at a very justifiable price point for the 200%+ performance uplift, considering I bought the GTX 1080 launch week too, and I have already been enjoying this performance level since launch (I will grant that while I put in some work to secure the order, I am 'lucky'). Add to that the card runs cool and virtually silent; I am still extremely happy with it and have finished Control, HZD and a full DOOM Eternal replay on it so far.


----------



## evernessince (Nov 5, 2020)

Nvidia is pulling out all the stops to distract people from upcoming RDNA2 cards.


----------



## purecain (Nov 5, 2020)

I'm glad you feel no remorse; I know I would feel a little sore about it personally. I'm glad I lost my 3090 Strix OC pre-order. I'm looking forward to seeing AMD's CPU and GPU work together; the pixel fillrate is double what my Titan V can output, and the performance of that card is amazing. I'm also interested in the performance uplift of just the 5900X over the 3900X.


----------



## thesmokingman (Nov 5, 2020)

Sicofante said:


> Again: clueless. 3D rendering uses single precision and Studio drivers are good enough. That's why most studios don't use Quadros anymore. Even Nvidia is phasing out Quadros...
> 
> 
> Nope, sorry. VFX Studios haven't used Quadros for a while and you definitely don't need them for Maya or Blender.
> ...



While all that may be true, it still doesn't make the 3090 a workstation-class card. And workstation-class doesn't necessarily mean a render station, which can get by on single or half precision. The 3090 would get pounded by an old VII in double precision, so you'd never pick it for a scientific workload, i.e. it's not a workstation card. It's a halo product!


----------



## DeathtoGnomes (Nov 5, 2020)

This so-called leak is a fishing expedition to see if there would be enough interest in a Ti revision. The small gains would not warrant a price tag competitive with the 3090 and would in fact likely take away from 3090 sales, because "everyone knows a Ti is the best you can get!"


----------



## thesmokingman (Nov 5, 2020)

DeathtoGnomes said:


> This so called leak is a fishing expedition to see if there would be enough interest in a Ti revision. The small gains would not warrant a competitive price tag with the 3090 and in fact would likely even take away from 3090 sales, because "everyone knows a Ti is the best you can get!"



Yea, that just highlights the problem Nvidia put themselves in with only a 10% performance gap between the 3080 and 3090. Even if they made a 3080 Ti, it would still lose to a 6900XT, if the 6900XT really turns out to be comparable to a 3090. There's nowhere for them to go with the 6900XT priced at $1k.


----------



## InVasMani (Nov 5, 2020)

wolf said:


> Not even slightly feeling what you suggest. Tech ALWAYS gets superseded, the sooner or later changes but it always happens, anyone buying any tech should be comfortable with that notion or you're setting yourself up for disappointment.


That's true, tech innovation never lasts forever and is normally quickly replaced with something better. The real question is how much better, and at what value point. I think AMD has done well with RDNA2 from what it appears, but they definitely gotta get more SKUs out there, and the sooner they do so the better, is my feeling.

I feel like 6 more SKUs from AMD would be smart: add 3 below the RX 6800 with 44/48/52 CUs and 3 above the RX 6800XT with 88/92/96 CUs. Both designs could reduce the Infinity Cache to 96 MB, with the higher-end design swapping out the GDDR6 for HBM2 to offset the bandwidth while reducing power and freeing die space for the additional CUs. I'd just retain the same bus width, or make one 192-bit and the other 320-bit; keeping the bus the same would probably be easiest and less involved, but either way. I think with those CU values and the same board design and Infinity Cache size, it would be easier to ramp up both. The lower-quality dies could be cut in half and turned into low-end models, while the better ones are used for the higher-end models.


----------



## beautyless (Nov 5, 2020)

Adding more CUs above an existing design takes time. They can just overclock the 6900XT by 5% and call it the 6900XTX.


----------



## TumbleGeorge (Nov 5, 2020)

The RTX 3090 is semi-pro/semi-gaming, not "full" pro. Problem with the RTX 3090 - solved!


----------



## lexluthermiester (Nov 5, 2020)

wolf said:


> Exactly, it's quite the exaggeration, Nvidia is likely just getting ducks in a row to effectively answer AMD products


That is exactly right. The next few years are going to be exciting to watch as AMD battles it out with both Intel and NVidia. As of right now, AMD is giving Intel serious trouble. They might be ready to do the same to NVidia. Also, you are completely correct about the comment above, "Ass-kicking" is quite the over-statement.


----------



## Crackong (Nov 5, 2020)

I am glad I did not get a regular 3080 now. 
So much relieved.


----------



## ratirt (Nov 5, 2020)

So this 3080Ti 20GB would compete with 6800 XT? It will definitely be slower than 3090. Or maybe it will be the gap-fill between 3080 and 3090 although there isn't much to fill anyway.


----------



## londiste (Nov 5, 2020)

ratirt said:


> So this 3080Ti 20GB would compete with 6800 XT? It will definitely be slower than 3090. Or maybe it will be the gap-fill between 3080 and 3090 although there isn't much to fill anyway.


Why 6800XT? Quite sure it will compete with 6900XT.


----------



## ratirt (Nov 5, 2020)

londiste said:


> Why 6800XT? Quite sure it will compete with 6900XT.


Well, the 3090 competes with the 6900XT, and the 3080 Ti (I think) will be slower than a 3090, so it's stuck between the 6900XT and 6800XT, I suppose. Wonder what the price will be for that 3080 Ti: $1200 or lower?


----------



## londiste (Nov 5, 2020)

ratirt said:


> Well the 3090 competes with 6900xt so if 3080 Ti (I think ) will be slower than a 3090 so stuck between the 6900xt and 6800xt i suppose. Wonder what the price will be for that 3080 TI $1200 or lower?


3090 also has $1500 MSRP vs 6900XT at $1000. The easiest response Nvidia has is a new SKU.


----------



## ZoneDymo (Nov 5, 2020)

londiste said:


> Why 6800XT? Quite sure it will compete with 6900XT.



Nvidia kinda messed up here: the 3090 is not a Titan card, it's a $1500 gaming card...
Previous Titan vs Ti pricing/performance was acceptable because of the pro features the Titan offered; the 3090 does not have these, and a Titan card is probably coming in the future.

So if they have a 3090 and a 3080 Ti basically going head to head... then the pricing becomes really odd. You just have 2 nearly identical cards at very different prices, within 2-3 months of each other, with no availability... I mean, they're just killing their own products this way.

Should just do a price drop on the 3090, but that would be admitting defeat/losing face, so they'd rather put in a "new" SKU.


----------



## ratirt (Nov 5, 2020)

londiste said:


> 3090 also has $1500 MSRP vs 6900XT at $1000. The easiest response Nvidia has is a new SKU.


You have to ask yourself: the 3090 and 3080 Ti are basically the same, so with the 4 GB memory decrease and some bandwidth changes, how much will that affect the price of the 3080 Ti?
The other thing is, if the 3080 Ti performs on par with the 3090 (or very close), what does that make of the 3090? I know why NV is releasing it, but the question still remains: what will the price for this 3080 Ti be? I'm 100% sure it won't be below or equal to $1k.


ZoneDymo said:


> Should just do a price drop on the 3090 but that would be admitting defeat/losing face so they rather put in a "new" SKU.


A price drop for the 3090 would have been the way to go, but NV is instead releasing the 3080 Ti (slightly different spec) as the cheaper card, which means a price drop for the 3090 is not an option; otherwise NV would have done it. This decision is (probably) because NV would lose money, since it costs more to make the 3090 than to sell it for $1k. A $500 difference in price is massive for products that perform equally.


----------



## londiste (Nov 5, 2020)

ZoneDymo said:


> Should just do a price drop on the 3090 but that would be admitting defeat/losing face so they rather put in a "new" SKU.


Dropping price on 3090 would be a big FU to anyone who bought a 3090. Having a 3080Ti with 4GB less memory and lower power limit allows them to retain the impression that 3090 is still a faster card. This has happened before with Titans and new SKUs are an answer.


ratirt said:


> You have to ask yourself, 3090 and 3080Ti are basically same so the 4GB memory decrease and some bandwidth changes, how much will that affect the price of the 3080TI?


Prices of cards, especially something like 3090 are not based on manufacturing costs. There is a very healthy margin there.


----------



## Vya Domus (Nov 5, 2020)

ZoneDymo said:


> Previous Titan vs Ti pricing/performance was acceptable because of the pro-features Titan offered, the 3090 does not have these, a titan card is probably coming in the future.



The only Titan cards that had "pro features" were the first one and the Titan V; all the others were simply gaming products that just happened to have more memory. You can tell this is true by looking at the GPUs used: the Titan V used the same V100 chip that was used for the Tesla product of the same name, while the Titan RTX, for instance, used the same TU102 the 2080 Ti had.


----------



## TumbleGeorge (Nov 5, 2020)

londiste said:


> Dropping price on 3090 would be a big FU to anyone who bought a 3090. Having a 3080Ti with 4GB less memory and lower power limit allows them to retain the impression that 3090 is still a faster card. This has happened before with Titans and new SKUs are an answer.
> Prices of cards, especially something like 3090 are not based on manufacturing costs. There is a very healthy margin there.


Buying hardware is not required by law. Whoever decides to rush in does so at his own risk; it is his personal choice and no one else is to blame, nor is anyone obliged to keep prices stable for long so as not to hurt him financially.


----------



## londiste (Nov 5, 2020)

TumbleGeorge said:


> Buying hardware is not required by law. Whoever decides to rush in does so at his own risk; it is his personal choice and no one else is to blame, nor is anyone obliged to keep prices stable for long so as not to hurt him financially.


Does not make it less of a problem for manufacturer.


----------



## Deleted member 190774 (Nov 5, 2020)

TumbleGeorge said:


> Buying hardware is not required by law. Whoever decides to rush in does so at his own risk; it is his personal choice and no one else is to blame, nor is anyone obliged to keep prices stable for long so as not to hurt him financially.


While this is absolutely true, it still represents an irrevocably botched product release.

NVidia should stick to their game plan, release the interim products, and accept they've stuffed this product line. They can focus on not underestimating the competition and getting back on track in 2022.


----------



## ZoneDymo (Nov 5, 2020)

Vya Domus said:


> Only Titan cards that had "pro-features" was the first one and Titan V, all the others were simply gaming products that just happened to have more memory. You can tell this is true by looking at the GPUs used, Titan V used the same V100 chip that was used for the Tesla product with the same name, meanwhile Titan RTX for instance used the same TU102 the 2080ti had.



I'll admit I don't know much about this, but in LTT's review of the 3090 they compared it to the previous Titan and it lost in some workloads, which I think they mentioned is due to intentional software limits.
Arbitrary or not, the 3090 did not get those optimizations because it is not considered a Titan card.


----------



## medi01 (Nov 5, 2020)

wolf said:


> They haven't even launched their products yet, no consumers have them and Nvidia is getting it's ass.... kicked?
> 
> RIP Nvidia right? How could they possibly survive this flogging.



Hard times in green lands, aren't they?

Going from "nVidia is years ahead", "I doubt AMD would match the 3070", "perhaps 20% slower than the 3080" to "oh, THAT is why Huang stopped ripping us off".
Yes, AMD just kicked NV's lower bottom part, rolling out the fastest GPU on the market, with a smile, superior power consumption, and for $500 less.

As for "NV not surviving" it, that's a strawman.
Even if the Ampere fiasco is followed by 3 more fiascos, there will still be enough green fanbois to buy overpriced green crap (with hilarious reasoning attached).


----------



## wolf (Nov 5, 2020)

medi01 said:


> Hard times in green lands, aren't they?


Are they?


medi01 said:


> Going from "nVidia is years ahead", "I doubt AMD would match 3070", "perhaps 20% slower than 3080" to "oh, THAT is why Huang stopped ripping us off".
> Yes, AMD just kicked NV's lower bottom part, rolling out the fastest GPU on the market, with smile, superior power consumption and for $500 less.


From rumour to rumour to rumour, to products that haven't launched or even been reviewed yet, with or without smile.

But let's, for argument's sake, say you're totally right and they actually "wipe the floor" product vs product, by basically offering what is *at best* a respectable price-to-performance benefit, with some extra placebo VRAM and a minor perf/watt advantage. Nvidia adjusts prices, actually *officially* announces SKUs that fit the gaps, and life goes on with both happily selling GPUs to happy consumers.


medi01 said:


> As for "NV not surviving" it, it's a strawman.
> Even if Ampere fiasco will be followed by 3 more fiascos, there will still be enough green fanboi to buy overpriced green crap (with hilarious reasoning attached).


I can't even, you made my day with this one, priceless.


----------



## RedelZaVedno (Nov 5, 2020)

Makaveli said:


> That is how this works when you are publicly traded company with shareholders.
> 
> Any company in this position would do the same so I guess they would all be greedy based on what you are saying.


That's how it is in a duopoly, something that was illegal not that long ago. We need working antitrust legislation ASAP.


----------



## medi01 (Nov 5, 2020)

wolf said:


> some extra placebo VRAM


The 6800 should be about 20% faster than the 3070, but speaking of VRAM...

You know DF managed to artificially drag down 2080 performance in the 3080 "preview" video?
Right, by not fitting the textures in VRAM.

Welp:


----------



## londiste (Nov 5, 2020)

medi01 said:


> 6800 should be about 20% faster than 3070, but speaking of VRAM...


Let's see the reviews first, but in addition to being ~16% faster than the 2080 Ti according to AMD's slides, it is also 15% more expensive.
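Taken at face value, those two percentages nearly cancel out. A quick sketch of performance per dollar, using the MSRPs and the ~16% figure discussed in-thread (AMD slide numbers, not independent benchmarks):

```python
# Relative performance per dollar, using the in-thread numbers:
# RTX 3070 ~ 2080 Ti performance at $499 MSRP (baseline),
# RX 6800 ~ 16% faster at $579 MSRP (per AMD's own slides).

def perf_per_dollar(relative_perf, price_usd):
    """Performance units per dollar, relative to an arbitrary baseline."""
    return relative_perf / price_usd

rtx_3070 = perf_per_dollar(1.00, 499)
rx_6800 = perf_per_dollar(1.16, 579)

# The two cards land within a fraction of a percent of each other in value.
print(f"RX 6800 value relative to RTX 3070: {rx_6800 / rtx_3070:.3f}x")
```

On those numbers the value proposition is essentially a wash, so the comparison comes down to VRAM, features, and real reviewed performance.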


----------



## medi01 (Nov 5, 2020)

londiste said:


> Lets see reviews first, but in addition to being ~16% faster than 2080Ti according to AMD slides, it is also 15% more expensive.


No, you poor thing, it is 18% faster than the 2080 Ti (which is a tad faster than the 3070) and 16% more expensive than the 3070.
Oh, and it has twice the VRAM.


----------



## mahirzukic2 (Nov 5, 2020)

AusWolf said:


> Another day, another nvidia rumour.


They need it, lest they fall into obscurity.


----------



## AusWolf (Nov 5, 2020)

mahirzukic2 said:


> They need it, lest they fall into obscurity.


What they need is actual physical graphics cards on store shelves, especially now before Christmas and the AMD launch.


----------



## purecain (Nov 5, 2020)

Exactly. We're being forced to buy online due to covid, and we have idiots using bots to buy what little stock there is. I'd have set my own bot up just to buy one for myself had I known it was going to be every single new product. This is going to have a terrible effect on the community; it would have been enough to put me off PCs altogether when I was first starting. I can't stand dishonesty.

Manufacturers should have made sure the public could get their products. We have shops using sister companies, which basically stops most of us from being able to buy at MSRP. If one shop had sold them at MSRP, no one would have been able to profit from scalped cards; that only works if people can't get them anywhere. It feels artificial, like this was planned, and I'm sure it was. Call me paranoid.


----------



## AusWolf (Nov 5, 2020)

purecain said:


> Exactly. We're being forced to buy online due to covid, and we have idiots using bots to buy what little stock there is. I'd have set my own bot up just to buy one for myself had I known it was going to be every single new product. This is going to have a terrible effect on the community. This would have been enough to put me off PCs altogether when I was first starting. I can't stand dishonesty. Manufacturers should have made sure the public could get their products; instead we have shops using sister companies, which basically prevents most of us from buying at the MSRP. If one shop had sold them at MSRP, no one would have been able to profit from scalped cards. That only works if people can't get them anywhere. It feels artificial, like this was planned, and I'm sure it was. Call me paranoid.


I'm sure scalpers are part of the problem, but I also think they're only a small part of it. Interestingly, I never heard so much about scalping during any of the previous hardware launches.

Personally, I think the real problem is Samsung's 8 nm node not being able to produce a reasonable number of GPUs for nvidia and board partners. Random fact: there is not a single fully unlocked GPU die in either the current or the rumoured 30-series product stack. Why is that? Is nvidia making room for a Super series coming later? Maybe, but I think yields are so low that they basically can't produce fully functional dies at all, and even partially functional ones are in short supply. That's probably why we're hearing rumours about the large GA102 seeping down into 3070 Ti territory and the GA104 into 3060 Ti, or even 3060, levels (instead of introducing new, smaller dies as usual).

In short: scalping is only ever an issue when there is a pre-existing shortage of a certain product.


----------



## purecain (Nov 5, 2020)

I'm just gutted. Usually we would all be overclocking and benchmarking our new cards, and others would be making their minds up based on our experiences. This is usually a lot of fun for our community. Let's pray for decent amounts of stock over the next month so we can at least get set up for Xmas, although I have a feeling we will all still be hunting parts into February 2021.


----------



## lexluthermiester (Nov 5, 2020)

purecain said:


> Let's pray for decent amounts of stock over the next month


They are working on it. Shipments are incoming.


----------



## Makaveli (Nov 5, 2020)

RedelZaVedno said:


> That's how it is in a duopoly, which was illegal not that long ago. We need working antitrust legislation ASAP.



Good luck with that; let me know how it goes.


----------



## ratirt (Nov 6, 2020)

AusWolf said:


> Personally, I think the real problem is Samsung's 8 nm node not being able to produce a reasonable number of GPUs for nvidia and board partners. Random fact: there is not a single fully unlocked GPU die in either the current or the rumoured 30-series product stack. Why is that? Is nvidia making room for a Super series coming later? Maybe, but I think yields are so low that they basically can't produce fully functional dies at all, and even partially functional ones are in short supply. That's probably why we're hearing rumours about the large GA102 seeping down into 3070 Ti territory and the GA104 into 3060 Ti, or even 3060, levels (instead of introducing new, smaller dies as usual).


I don't think so. Samsung's 8 nm is two-year-old news; I have no idea how it would be holding back production, since the yields are good.
If it really were Samsung's node, and yields were low due to defects on the 3090 and 3080, you could salvage those dies to get a decent number of 3070 GPUs, yet those have stock problems as well. The problem is elsewhere.


----------



## fynxer (Nov 6, 2020)

A 320-bit mem bus must be fake news, and Trump won the election if you only count his votes 

Why a 320-bit mem bus? If people are going to pay $300 more, they would want some upgrade to the mem bus, like 352-bit.

Whatever! If I feel like Nvidia is screwing me, I will jump to AMD


----------



## AusWolf (Nov 6, 2020)

ratirt said:


> I don't think so. Samsung's 8 nm is two-year-old news; I have no idea how it would be holding back production, since the yields are good.
> If it really were Samsung's node, and yields were low due to defects on the 3090 and 3080, you could salvage those dies to get a decent number of 3070 GPUs, yet those have stock problems as well. The problem is elsewhere.


My theory is that the originally planned 3090 and 3080 produced too many defects; that's why we have those awkward numbers of CUDA cores in those GPUs, and basically no fully unlocked chip anywhere in the product stack. Even the 3070 uses a GPU with disabled parts, and I'm pretty sure there isn't so much difference between the performance of 46 and 48 SMs that it would make any sense for nvidia to have plans for the unlocked chip in the future. Even the 3070 Ti rumours suggest a highly watered-down version of the GA102 instead of a full GA104. If it is financially beneficial to manufacture such a large chip and disable parts for a card several tiers below the 3090, I can only assume that something is awfully wrong with the node or the design (or both).


----------



## ratirt (Nov 7, 2020)

AusWolf said:


> My theory is that the originally planned 3090 and 3080 produced too many defects; that's why we have those awkward numbers of CUDA cores in those GPUs, and basically no fully unlocked chip anywhere in the product stack. Even the 3070 uses a GPU with disabled parts, and I'm pretty sure there isn't so much difference between the performance of 46 and 48 SMs that it would make any sense for nvidia to have plans for the unlocked chip in the future. Even the 3070 Ti rumours suggest a highly watered-down version of the GA102 instead of a full GA104. If it is financially beneficial to manufacture such a large chip and disable parts for a card several tiers below the 3090, I can only assume that something is awfully wrong with the node or the design (or both).


Sure, but you need to realize that the bigger the chip, the harder it is to make. Of course there will be defects, and you can use a defective chip in a lower tier. It's no fantasy that NV's chips are quite big, and that contributes to the number of defective chips. So the quantity might not be as impressive as with a smaller chip, but that's the way it is. The node is mature and does well with all the improvements it has had (to be clear, Samsung's 8 nm is an improved version of its 10 nm node), so I don't think the problem is the node. Maybe memory, if you consider the 3080 and 3090.
The other thing is, maybe NV was rushing the release so badly that they didn't have the stock to begin with. Maybe these cards were going to be released much later than August, but they wanted to release before AMD. Hard to tell, but the node or yield explanation (yields do suffer with large chips) is not convincing me.
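The "bigger chip, lower yield" intuition above can be put in rough numbers with the classic Poisson defect model. The die areas and the defect density below are loose assumptions for illustration only, not confirmed figures for Samsung's 8N:

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_cm2: float) -> float:
    """Fraction of fully working dies under a simple Poisson defect model."""
    expected_defects_per_die = defect_density_per_cm2 * die_area_mm2 / 100.0
    return math.exp(-expected_defects_per_die)

# Assumed numbers: GA102 ~628 mm^2, a GA104-sized die ~392 mm^2,
# and 0.1 defects/cm^2 (all illustrative guesses).
print(f"GA102-sized die: {poisson_yield(628, 0.1):.1%} fully working")
print(f"GA104-sized die: {poisson_yield(392, 0.1):.1%} fully working")
```

Under these assumed inputs only about half of the large dies come out fully working, which is why salvaging partially defective dies into lower tiers is standard practice.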


----------



## r9 (Nov 8, 2020)

What groundbreaking news... for the two people that will be able to buy it.


----------



## londiste (Nov 8, 2020)

AusWolf said:


> My theory is that the originally planned 3090 and 3080 produced defects, that's why we have those awkward numbers of cuda cores in those GPUs, and basically no fully unlocked chip anywhere in the product stack. Even the 3070 uses a GPU with disabled parts, and I'm pretty sure there isn't so much difference between the performance of 46 and 48 SMs that would make any sense for nvidia to have plans with the unlocked chip in the future. Even the 3070 Ti rumours suggest a highly watered-down version of the GA102 instead of a full GA104. If it is financially beneficial to manufacture such a large chip and disable parts for a card several tiers below the 3090, I can only assume that something is awfully wrong with the node or the design (or both).


Has the Big Navi die size been confirmed?
The rumored 536 mm² is only 15% smaller than Nvidia's 628 mm², and TSMC N7 is a smaller, newer and more complex node than Samsung's 8N.


----------



## Zach_01 (Nov 8, 2020)

wolf said:


> But let's, for argument's sake, say you're totally right and they actually "wipe the floor" product vs product, by basically offering what is at best a respectable price-to-performance benefit, *with some extra placebo VRAM* and a minor perf/watt advantage. Nvidia adjusts prices, actually officially announces SKUs that fit the gaps, and life goes on with both happily selling GPUs to happy consumers.


It's not placebo... they need it for Smart Access Memory to increase performance even further on their own platform. How good it is, and how game devs will exploit it in the future, is yet to be seen.


----------



## geogan (Dec 2, 2020)

ratirt said:


> A price drop for the 3090 would have been the way to go, but since NV is releasing a 3080 Ti (slightly different spec) to make the card cheaper, a price drop for the 3090 is not an option; otherwise NV would have done it. This decision is (probably) based on the idea that *NV would lose money, since it costs more to make the 3090 than to sell it for $1k*. A $500 difference in price is massive for a product that performs equally.



Where the hell did you get that idea from? I don't know why uninformed people think this. How do people not realise that 3080/3090 cards are more or less identical, down to atomic levels of similarity?

It costs exactly the same to make a 3090 GPU die as it does a 3080 GPU die - they are the SAME THING - the only difference is that the 3080 has a few broken SMs in the GPU die, so they disabled them and sell it for less.

If they are able to make and sell the 3080 for $800 PROFITABLY, then they DEFINITELY could make the 3090 and sell it profitably for $800 too. It's just that they don't have to - there are enough idiots around who will gladly give them another $700 in profit for the same thing.


----------



## londiste (Dec 2, 2020)

The 3090 includes more than just the die, leaving aside the validity of binning as a price-increase factor. The RTX 3090 uses 14 more RAM chips; even for plain GDDR6, 14 additional chips would be $150 if not more, and GDDR6X is undoubtedly more expensive. Most of the additional RAM chips sit on the opposite side of the PCB, and with the increased power consumption of GDDR6X that means cooling for some 20-30 W or more (I don't think I have ever seen backplates with heatpipes before the RTX 3090). Coolers and VRMs also seem more substantial on RTX 3090s.

There is also a marketing or PR angle to this. The people who have bought an RTX 3090 would be disappointed or angry if the same card were sold for 30% less so soon after release. So the new SKU is made slightly worse than the RTX 3090, and current RTX 3090 owners can use that small bit to feel better.
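To put a rough number on the memory-cost argument, here is a trivial sketch. The chip counts match the cards' known configurations, but the per-chip price is a loud assumption, since Micron does not publish GDDR6X pricing:

```python
# RTX 3080 carries 10 GDDR6X chips; RTX 3090 carries 24 (14 more, half on the PCB's back side).
chips_3080, chips_3090 = 10, 24
assumed_price_per_chip = 12.0  # USD, illustrative guess only

extra_memory_cost = (chips_3090 - chips_3080) * assumed_price_per_chip
print(f"Extra memory cost for {chips_3090 - chips_3080} more chips: ~${extra_memory_cost:.0f}")
```

Even at this guessed price the memory delta alone lands in the $150+ range, before counting the beefier cooler, VRM, and backplate cooling.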


----------



## im.thatoneguy (Dec 3, 2020)

WeeRab said:


> More fool you then.  The 3090 doesn't come with the production-specific drivers that would make it that good outside of gaming. (nor will a 3080ti)
> All that will come with the £3000 Titan variant.



NVIDIA Studio Driver - supported products:

*NVIDIA TITAN Series:*
NVIDIA TITAN RTX, NVIDIA TITAN V, NVIDIA TITAN Xp, NVIDIA TITAN X (Pascal)
*GeForce RTX 30 Series:*
*GeForce RTX 3090*, GeForce RTX 3080
*GeForce RTX 20 Series:*
GeForce RTX 2080 Ti, GeForce RTX 2080 SUPER, GeForce RTX 2080, GeForce RTX 2070 SUPER, GeForce RTX 2070, GeForce RTX 2060 SUPER, GeForce RTX 2060
*GeForce 16 Series:*
GeForce GTX 1660 SUPER, GeForce GTX 1650 SUPER, GeForce GTX 1660 Ti, GeForce GTX 1660, GeForce GTX 1650
*GeForce 10 Series:*
GeForce GTX 1080 Ti, GeForce GTX 1080, GeForce GTX 1070 Ti, GeForce GTX 1070, GeForce GTX 1060, GeForce GTX 1050 Ti, GeForce GTX 1050


----------



## ratirt (Dec 3, 2020)

geogan said:


> Where the hell did you get that idea from? I don't know why uninformed people think this. How do people not realise that 3080/3090 cards are more or less identical, down to atomic levels of similarity?
> 
> It costs exactly the same to make a 3090 GPU die as it does a 3080 GPU die - they are the SAME THING - the only difference is that the 3080 has a few broken SMs in the GPU die, so they disabled them and sell it for less.
> 
> If they are able to make and sell the 3080 for $800 PROFITABLY, then they DEFINITELY could make the 3090 and sell it profitably for $800 too. It's just that they don't have to - there are enough idiots around who will gladly give them another $700 in profit for the same thing.


Because you only get a handful of these 3090-grade dies, and they are priced accordingly. If you take a wafer and try to make only 3090s, you will get few of them per wafer. All the chips with defects will be salvaged as 3070s and 3080s. So yeah, the price of the 3090 is high because it is the top-notch binned silicon, and you have the extra RAM as well. They are the same generation, but they are not the same. Take a closer look at wafer economics: wafers cost a lot of cash, and profits must be recouped across the whole wafer. With few 3090s per wafer the price will be high, and they would not lower it, because the salvaged chips sold as 3070s and 3080s, with lower price tags on them, may not be that profitable in the long run.
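The "few 3090s per wafer" claim can be sketched with the standard dies-per-wafer approximation plus a Poisson yield term. The GA102 area comes from public reports; the 300 mm wafer and the defect density are assumptions for illustration:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Gross die count: wafer area over die area, minus a standard edge-loss correction."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

gross = dies_per_wafer(628)                   # GA102, reportedly ~628 mm^2
full = gross * math.exp(-0.1 * 628 / 100.0)   # Poisson yield, assumed 0.1 defects/cm^2
print(f"~{gross} GA102 dies per wafer, of which ~{full:.0f} are fully working candidates")
```

Under these assumptions only a few dozen dies per wafer come out defect-free, and binning for the top clocks thins that pool further, which is the economic argument the post above is making.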


----------

