Friday, October 23rd 2020
NVIDIA Readies New GeForce RTX 30-series SKU Positioned Between RTX 3070 and RTX 3080
Possibly unsure of the GeForce RTX 3070's ability to tackle AMD's Radeon RX 6000 series parts, NVIDIA is designing a new RTX 30-series SKU positioned between the RTX 3070 and RTX 3080. This is not a 16 GB variant of the RTX 3070, but rather a new SKU based on the 8 nm "GA102" silicon, according to kopite7kimi, a reliable source of NVIDIA leaks. The SKU carries the ASIC code "GA102-150-KD-A1." The silicon is configured with 7,424 CUDA cores across 58 streaming multiprocessors (29 TPCs), 232 tensor cores, 232 TMUs, 58 RT cores, and an unknown number of ROPs. According to kopite7kimi, the card features a 320-bit wide memory interface, although it's not known whether it uses conventional GDDR6, like the RTX 3070, or faster GDDR6X, like the RTX 3080.
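The leaked unit counts are internally consistent with Ampere's published per-SM ratios. Below is a quick, unofficial sanity-check sketch; it assumes 128 FP32 CUDA cores, 4 tensor cores, 4 TMUs, and 1 RT core per SM, as on other GA10x parts, and the memory speeds used are simply those of the RTX 3070 and RTX 3080, not confirmed figures for this SKU.

```python
# Back-of-the-envelope check of the leaked GA102-150 configuration.
SM_COUNT = 58          # streaming multiprocessors (2 SMs per TPC -> 29 TPCs)
BUS_WIDTH_BITS = 320   # leaked memory interface width

# Ampere (GA10x) per-SM ratios from NVIDIA's architecture material:
cuda_cores   = SM_COUNT * 128  # 128 FP32 cores per SM -> 7,424
tensor_cores = SM_COUNT * 4    # 4 tensor cores per SM -> 232
tmus         = SM_COUNT * 4    # 4 TMUs per SM         -> 232
rt_cores     = SM_COUNT * 1    # 1 RT core per SM      -> 58
print(cuda_cores, tensor_cores, tmus, rt_cores)

# Theoretical bandwidth depends on which memory the SKU ships with;
# 14 Gbps GDDR6 is what the RTX 3070 uses, 19 Gbps GDDR6X the RTX 3080.
for name, gbps in (("GDDR6 @ 14 Gbps", 14), ("GDDR6X @ 19 Gbps", 19)):
    print(f"{name}: {BUS_WIDTH_BITS * gbps / 8:.0f} GB/s")
```

On the leaked 320-bit bus, RTX 3070-style 14 Gbps GDDR6 would work out to 560 GB/s, while the RTX 3080's 19 Gbps GDDR6X would give the full 760 GB/s.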
NVIDIA recently "cancelled" a future 16 GB variant of the RTX 3070 and a 20 GB variant of the RTX 3080, possibly the company calibrating its response to the Radeon RX 6000 series. We theorize that the doubled memory amounts may not have hit the desired cost-performance targets, and that the company believes the competitive outlook of the RTX 3080 10 GB is secure. This explains the need for a SKU with performance halfway between that of the RTX 3070 and RTX 3080. As for pricing, with the RTX 3070 positioned at $500 and the RTX 3080 at $700, the new SKU could be priced somewhere in between. AMD's RDNA2-based Radeon RX 6000 series GPUs are expected to feature DirectX 12 Ultimate logo compliance, meaning there is a level playing field between AMD and NVIDIA in the performance segment.
Source:
kopite7kimi (Twitter)
86 Comments on NVIDIA Readies New GeForce RTX 30-series SKU Positioned Between RTX 3070 and RTX 3080
There has been talk for a while now that they decided to supply half the planned number of 3080 10 GB and 3070 8 GB cards, but the same number of 3090 24 GB cards.
Also, they canceled the extra-VRAM options for the 3070/3080. By the way, the 3070 16 GB was going to use GDDR6, so the claimed shortage of GDDR6X doesn't explain it.
This, if true, indicates that after the RDNA2 performance leaks/rumors, NVIDIA re-evaluated its segmentation and profit-margin plans. They saw that 16 GB/20 GB versions of the 3070/3080 are pointless, so they are bringing the GA102 down to a new 3070 (Super/Ti or something else), and this (sort of) confirms the halved supply numbers for the 3070 8 GB and 3080 10 GB.
At first they planned for more segmentation in the upper high-end tiers for bigger profit margins (I'm shocked!), but the RDNA2 leaks and rumors pushed them toward more segmentation in the middle tiers instead, to have more options there and "play" with performance and price.
In a nutshell, their plan has backfired right in their face...
They could have settled for a profit of 10, but they wanted 15~20, and now they're going to get <10.
Crying shame that the reviews are so good, but when you hit the forums it's polarizing, toxic, fanboy-driven drivel. :(
Why did you retract your "Haha" emoji from @Shatun_Bear's latest post? ...after I did the same and left the one on post #59.
Should I take this as an opportunity to tell on myself? I don't want to, to be honest.
At first I didn't agree with him, and I was trying to make the point that if this is true, then it must also apply to the 3080. You know... trying to start a conversation...
Just let them continue to spew their made-up rumor "facts" and other such drivel. Fortunately, it's relatively contained to these news sections. Mostly.
- Want a future-proof card? Wait and see how well AMD's offerings perform with 16 GB of VRAM.
- Want to play current games at 1080p? Keep what you have and be happy, or buy something second-hand.
8/10 GB of VRAM is enough now, but who knows what the future brings. The 3080/3090 might run out of VRAM sooner than raw GPU power, and it wouldn't be the first instance: my brother has a 2 GB GTX 960 which could play Kingdom Come: Deliverance just fine, if it didn't drop into single-digit FPS whenever it's updating the framebuffer. Personally, I'd rather avoid a similar situation.
Disclaimer: all of my comment is personal opinion. If it differs from anyone else's, so be it. No offense. :)
At the end of the day, if your rig plays games the way you want, who cares how big your theoretical future ePeen is.
Inferior hardware encoder?
Sure, you can sugar-coat it with the price-to-performance thingy, but there really is no feature that distinguishes the 5700 XT from a budget GPU. People who bought the 5700 XT are about to upgrade to Big Navi ASAP, so talking about future-proofing only when NVIDIA is involved is just pure hypocrisy.
Saying that Navi is inferior because it doesn't support hardware RT and DLSS is just as much BS as saying that RTX 20 series cards are inferior because they only communicate through PCI-E 3.0.
My point is: if AMD released the 5700 XT right now, it would be a product of questionable value, but at the time of release it was just as valid as NVIDIA's GTX 16 series (one of which I bought and used happily until a month ago).
Edit: I still don't see how it all connects to the main article.
People who switch between price-to-performance and future-proofing arguments as they see fit really shouldn't comment on either matter.
Btw DX12 Ultimate is more than RT.
Yet everyone bought a 5700 because of its rasterization performance (the performance that matters today), saying that buying a card for RT now is stupid because it's not relevant yet, except for some gimmicky effects in a handful of games with terrible performance, and that if you wanted proper RT support from both games and hardware, you had to wait at least a gen, probably two.
And guess what, that is exactly where we are: only the 3000 series can now run, at sorta playable FPS, what was released during the 2000 series era, and even now it's still quite gimmicky. If you really want RT, even this gen is not enough; you'll have to wait until the next one for actual full-on RT-based games (if even then... maybe another gen is needed).
I would pretty much prefer RT over Ultra High settings; at least it doesn't need 400% magnification to be distinguishable.
Besides, if you're happy to play at lower settings, then why is it an issue for you if a graphics card doesn't have hardware-accelerated RT? ;)
Not to mention, if you're lowering your settings right now, then what kind of future proofing are we talking about?
The RTX 2000 series was the first go at RT, and even now, with the 3000 series out, RT is not really a thing yet. So if you bought an RTX 2000-series card for RT, you were just being silly, yet that is what you base your opinion on as to why the 5700 XT was obsolete at release.
Future-proofing means more than just VRAM; it's also the feature set. Anyways, the only people I know who correlate VRAM with performance are tech noobies. I really have no clue what you are talking about; I finished 4 RTX games just fine. Who said you need 100+ FPS to enjoy single-player games? AMD owners? :roll:
Btw I can probably max out CP2077 settings with my 2080 Ti and it would still be enjoyable.