Friday, October 23rd 2020

NVIDIA Readies New GeForce RTX 30-series SKU Positioned Between RTX 3070 and RTX 3080

Possibly unsure of the GeForce RTX 3070's ability to tackle AMD's Radeon RX 6000 series parts, NVIDIA is designing a new RTX 30-series SKU positioned between the RTX 3070 and RTX 3080. This is not a 16 GB variant of the RTX 3070, but a new SKU based on the 8 nm "GA102" silicon, according to kopite7kimi, a reliable source of NVIDIA leaks. The SKU carries the ASIC code "GA102-150-KD-A1." The silicon is configured with 7,424 CUDA cores across 58 streaming multiprocessors (29 TPCs), 232 tensor cores, 232 TMUs, 58 RT cores, and an unknown number of ROPs. According to kopite7kimi, the card features a 320-bit wide memory interface, although it's not known whether it uses conventional GDDR6, like the RTX 3070, or the faster GDDR6X found on the RTX 3080.
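These numbers are internally consistent with NVIDIA's Ampere layout, in which each SM carries 128 FP32 CUDA cores, 4 tensor cores, 4 TMUs, and 1 RT core. As a back-of-the-envelope illustration of what the memory question means for bandwidth, here is a minimal sketch; the 14 Gbps (GDDR6) and 19 Gbps (GDDR6X) data rates are assumed from the RTX 3070 and RTX 3080 respectively, since the new SKU's memory speed is unconfirmed:

```python
# Back-of-the-envelope math for the rumored GA102-150 SKU.
# Assumptions: GA10x layout (128 FP32 CUDA cores per SM); memory data
# rates borrowed from the RTX 3070 (14 Gbps GDDR6) and the RTX 3080
# (19 Gbps GDDR6X) -- the actual SKU's memory type is not confirmed.

CUDA_CORES_PER_SM = 128

def cuda_cores(sm_count: int) -> int:
    return sm_count * CUDA_CORES_PER_SM

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical bandwidth: (bus width in bytes) x (per-pin data rate)."""
    return bus_width_bits / 8 * data_rate_gbps

print(cuda_cores(58))             # 7424 -- matches the leaked core count
print(bandwidth_gb_s(320, 14.0))  # 560.0 GB/s if conventional GDDR6
print(bandwidth_gb_s(320, 19.0))  # 760.0 GB/s if RTX 3080-style GDDR6X
```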

NVIDIA recently "cancelled" a planned 16 GB variant of the RTX 3070 and a 20 GB variant of the RTX 3080, possibly as part of the company calibrating its response to the Radeon RX 6000 series. We theorize that doubling the memory amounts may not have hit the desired cost-performance targets, and that the company probably believes the competitive outlook of the RTX 3080 10 GB is secure. This explains the need for a SKU with performance halfway between that of the RTX 3070 and RTX 3080. As for pricing, with the RTX 3070 positioned at $500 and the RTX 3080 at $700, the new SKU could be priced somewhere in between. AMD's RDNA2-based Radeon RX 6000 series GPUs are expected to feature DirectX 12 Ultimate logo compliance, meaning there is a level playing field between AMD and NVIDIA in the performance segment.
Source: kopite7kimi (Twitter)

86 Comments on NVIDIA Readies New GeForce RTX 30-series SKU Positioned Between RTX 3070 and RTX 3080

#51
AusWolf
saki630: Nvidia is one generation ahead of Nvidia. This article is full of fake news made up to fill up a web post.

What people need to understand is that if a $600 3070 is really good, then a cheaper, weaker AMD card might be a better choice.
Since they're having trouble producing GPUs, they might as well produce an unprecedented number of news articles just to keep the hype train going. What I've seen of the 30 series so far isn't that good anyway (meh amounts of VRAM, huge power consumption, weak 1080p performance...).
#52
purplekaycee
Camm: Nvidia must be bloody terrified of what AMD has if it's moving GA102 down to 70 Ti levels
When are the new AMD cards coming?
#53
Zach_01
NVIDIA was planning to flood the market by the end of October/start of November with a few hundred thousand Ampere cards (~300K of the 3080 10 GB, ~30K of the 3090 24 GB), plus a hefty number of RTX 3070 8 GB cards.

There has been talk for a while now that they have decided to supply half the originally planned number of 3080 10 GB and 3070 8 GB cards, but the same number of 3090 24 GB cards.
They also canceled the 20 GB options for the 3070/3080. By the way, the 3070 20 GB was going to come with GDDR6, so the claimed GDDR6X shortage is not true.

This, if true, indicates that after the RDNA2 performance leaks/rumors, NVIDIA re-evaluated its segmentation and profit-margin plans. They saw that 20 GB 3070/3080 cards are pointless, so they are bringing the GA102 down to a new 3070 (Super/Ti or something else?), and this (sort of) confirms the halved supply numbers for the 3070 8 GB and 3080 10 GB.

At first, they planned for more segmentation in the upper high tiers for bigger profit margins (I'm surprised!), but the RDNA2 leaks and rumors pushed them toward more segmentation in the middle-tier parts instead, giving them more options there to "play" with performance and price.

In a nutshell, their plan has backfired right in their face...
They could have settled for a profit of 10, but they wanted 15~20, and now they're going to get <10.
#54
P4-630
Zach_01: By the way, the 3070 20 GB was going to come with GDDR6
Rumors were 20 GB GDDR6X for the 3080 (Ti) and 16 GB GDDR6 for the 3070 (Ti).
#55
Zach_01
Yes, I made a mistake about the amount of VRAM on the 3070. The rest stands.
#56
phanbuey
Seems like NVIDIA is just having yield issues, and it's probably their only option / easier to make a slightly gimped 3080 with more RAM to fight the 6800 XT while fixing said issues.
#57
Shatun_Bear
You'd have to have a screw loose to buy a $500 graphics card with just 8 GB of memory (3070) going into 2021.
#58
Zach_01
Shatun_Bear: You'd have to have a screw loose to buy a $500 graphics card with just 8 GB of memory (3070) going into 2021.
Same goes for a $700~800 card with just 10 GB...
#59
EarthDog
Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are... users droning on about VRAM...

Crying shame that the reviews are so good, but when you hit the forum, it's polarizing, toxic, fanboy-driven drivel. :(
#60
Zach_01
EarthDog: Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are... users droning on about VRAM...

Crying shame that the reviews are so good, but when you hit the forum, it's polarizing, toxic, fanboy-driven drivel. :(
Probably me, since you haven't quoted anyone.
Why did you retract your "Haha" emoji from @Shatun_Bear's latest post? ...after I did the same and left post #59.
Should I take this as being aimed at me? I don't want to, to be honest.

First of all, I wasn't agreeing with him; I was trying to make the point that if this is true, then it must also apply to the 3080. You know... trying to start a conversation...
#61
Calmmo
EarthDog: Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are... users droning on about VRAM...

Crying shame that the reviews are so good, but when you hit the forum, it's polarizing, toxic, fanboy-driven drivel. :(
Yep, can't do much about it.
Just let them continue to spew their made-up rumor "facts" and other such drivel. Fortunately, it's relatively contained to these news sections. Mostly.
#62
AusWolf
EarthDog: Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are... users droning on about VRAM...

Crying shame that the reviews are so good, but when you hit the forum, it's polarizing, toxic, fanboy-driven drivel. :(
The way I see it, the reviews point out that the 3080/90 are better than the 20 series. That's true, yet you can't ignore certain factors: no availability in stores, high power consumption, mediocre amounts of VRAM, tiny performance gains at lower resolutions. Sure, if you want to play current titles in 4K, go ahead, buy a 3080, but if you...
  • want a future-proof card: wait and see how well AMD's offerings perform with 16 GB of VRAM,
  • want to play current games at 1080p: keep what you have and be happy, or buy something second-hand.
8/10 GB of VRAM is enough now, but who knows what the future brings. The 3080/90 might run out of VRAM sooner than raw GPU power, and it wouldn't be the first such instance. My brother has a 2 GB GTX 960 that could play Kingdom Come: Deliverance just fine, if it didn't drop into single-digit FPS whenever it's updating the framebuffer. Personally, I'd rather avoid a similar situation.

Disclaimer: my comment is entirely personal opinion. If it differs from anyone else's, so be it. No offense. :)
#63
Xmpere
EarthDog: Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are... users droning on about VRAM...

Crying shame that the reviews are so good, but when you hit the forum, it's polarizing, toxic, fanboy-driven drivel. :(
But if I point them out, the staff delete the comment, yet they don't do shit when people go into others' DMs to personally attack folks over their "views". It's hilarious.
#64
nguyen
Hypocrisy at the highest level: people who bought Navi are worrying about future-proofing when they bought something that was already obsolete the moment it was released.
#65
phanbuey
I'm actually pretty happy with my 2080 Ti's performance - overclocked, I get 7,600 in Time Spy Extreme and 16K in Time Spy, which is about 15% behind the 3080 at 1440p. All my games run in the 100-150 FPS range, and 80-100 if I crank the details, so I think I'll just wait for the GPU wars to start before picking anything up. Gonna hold on to it and hope DLSS 2.0 gives me good performance in Cyberpunk.

At the end of the day, if your rig plays games the way you want, who cares how big your theoretical future ePeen is.
#66
AusWolf
nguyen: Hypocrisy at the highest level: people who bought Navi are worrying about future-proofing when they bought something that was already obsolete the moment it was released.
How was the RX 5700 XT (or any equivalent GPU) obsolete at the moment of release? Also, how is this connected to the article about a new NVIDIA SKU?
#67
nguyen
AusWolf: How was the RX 5700 XT (or any equivalent GPU) obsolete at the moment of release?
No DX12 Ultimate?
An inferior hardware encoder?
Sure, you can sugar-coat it with the price-to-performance thing, but there really is no feature that distinguishes the 5700 XT from a budget GPU. People who bought the 5700 XT are about to upgrade to Big Navi ASAP, so talking about future-proofing where NVIDIA is involved is just pure hypocrisy.
#68
AusWolf
nguyen: No DX12 Ultimate?
An inferior hardware encoder?
Sure, you can sugar-coat it with the price-to-performance thing, but there really is no feature that distinguishes the 5700 XT from a budget GPU.
DX12 Ultimate was nowhere near a thing when the 5700 XT was released. Hardware-accelerated ray tracing and DLSS were (and still are) an NVIDIA niche represented in only a handful of games. Also, how is the hardware encoder inferior?

Saying that Navi is inferior because it doesn't support hardware RT and DLSS is just as much BS as saying that RTX 20 series cards are inferior because they only communicate through PCI-E 3.0.

My point is: if AMD released the 5700 XT right now, it would be a product of questionable value, but at the time of its release, it was just as valid as NVIDIA's GTX 16 series (one of which I bought and used happily until a month ago).

Edit: I still don't see how it all connects to the main article.
#69
nguyen
AusWolf: DX12 Ultimate was nowhere near a thing when the 5700 XT was released. Hardware-accelerated ray tracing and DLSS were (and still are) an NVIDIA niche represented in only a handful of games. Also, how is the hardware encoder inferior?
See? When you talk about RT and DLSS, it's "only in a handful of games", yet they were obviously the future-proofing features of that time. Do you know how many games require more than 10 GB of VRAM right now? Zero?
People who switch between price-to-performance and future-proofing as they see fit really shouldn't comment on either matter.

Btw, DX12 Ultimate is more than RT (it also covers mesh shaders, variable-rate shading, and sampler feedback).
#70
AusWolf
nguyen: See? When you talk about RT and DLSS, it's "only in a handful of games", yet they were obviously the future-proofing features of that time. Do you know how many games require more than 10 GB of VRAM right now? Zero?
People who switch between price-to-performance and future-proofing as they see fit really shouldn't comment on either matter.
I still can't say that RT will be dominantly present (not to mention required) in the near future, just like I can't see electric cars taking over the domestic vehicle market in the next few years. Sure, if you want the best experience, you'll buy an RT-enabled card, but I'm not sure you really need it to be future-proof. VRAM is a different story: you need it whether you use RT or not.
#71
ZoneDymo
nguyen: Hypocrisy at the highest level: people who bought Navi are worrying about future-proofing when they bought something that was already obsolete the moment it was released.
No, that's more the case for the RTX 2000 series, because people bought those cards "because of ray tracing, which is the future", which is exactly why you now claim the 5700 XT was "obsolete the moment it released".
Yet everyone bought a 5700 because of its rasterization performance (the performance that matters today), saying that buying a card now for RT is stupid because it's not relevant yet, except for some gimmicky effects in a handful of games with terrible performance, and that if you wanted proper RT support from both games and hardware, you'd have to wait at least a generation, probably two.

And guess what, that is exactly where we are: only the 3000 series can run at sorta-playable FPS what was released during the 2000 series era, and even now it's still quite gimmicky. If you really want RT, even this gen is not enough; you'll have to wait until the next one for actual full-on RT-based games (and even then... maybe another gen is needed).
#72
nguyen
AusWolf: I still can't say that RT will be dominantly present (not to mention required) in the near future, just like I can't see electric cars taking over the domestic vehicle market in the next few years. Sure, if you want the best experience, you'll buy an RT-enabled card, but I'm not sure you really need it to be future-proof. VRAM is a different story: you need it whether you use RT or not.
Even worse argument there: I can play with 4 GB of VRAM all the same; just lower the detail settings to Medium and voila. Not sure if you have tried, but RT reflections and transparent reflections are much more noticeable in-game than Ultra vs. Medium details. Furthermore, all RTX games will come with DLSS, so yeah, not sure what manual you read regarding VRAM for future-proofing.
ZoneDymo: No, that's more the case for the RTX 2000 series, because people bought those cards "because of ray tracing, which is the future", which is exactly why you now claim the 5700 XT was "obsolete the moment it released".
Yet everyone bought a 5700 because of its rasterization performance (the performance that matters today), saying that buying a card now for RT is stupid because it's not relevant yet, except for some gimmicky effects in a handful of games with terrible performance, and that if you wanted proper RT support from both games and hardware, you'd have to wait at least a generation, probably two.

And guess what, that is exactly where we are: only the 3000 series can run at sorta-playable FPS what was released during the 2000 series era, and even now it's still quite gimmicky. If you really want RT, even this gen is not enough; you'll have to wait until the next one for actual full-on RT-based games (and even then... maybe another gen is needed).
Well, if you state it that way, then there is no need for future-proofing; every GPU is obsolete when its successor comes out anyway. So why bother with 8 GB or 10 GB of VRAM? Can't play games with anything less than Ultra High settings?
I'd much prefer RT to Ultra High settings; at least it doesn't need 400% magnification to distinguish.
#73
AusWolf
nguyen: Even worse argument there: I can play with 4 GB of VRAM all the same; just lower the detail settings to Medium and voila. Not sure if you have tried, but RT reflections and transparent reflections are much more noticeable in-game than Ultra vs. Medium details. Furthermore, all RTX games will come with DLSS, so yeah, not sure what manual you read regarding VRAM for future-proofing.

Well, if you state it that way, then there is no need for future-proofing; every GPU is obsolete when its successor comes out anyway. So why bother with 8 GB or 10 GB of VRAM? Can't play games with anything less than Ultra High settings?
I'd much prefer RT to Ultra High settings; at least it doesn't need 400% magnification to distinguish.
It's not just the settings, but the general assets of the game, texture quality, etc. You can lower your settings, but you can't lower them indefinitely. The fact that you can still play games fine with 4 GB of VRAM has nothing to do with it. I currently play The Witcher 3 on my GT 1030 2 GB (since I sold my GTX 1660 Ti) and only use about 1.5 GB at 1080p medium settings. Microsoft Flight Simulator would be a totally different story, I guess. You can't say that 4 GB is enough just because it's enough for you.

Besides, if you're happy to play at lower settings, then why is it an issue for you if a graphics card doesn't have hardware-accelerated RT? ;)

Not to mention, if you're lowering your settings right now, then what kind of future-proofing are we talking about?
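For context on VRAM readings like the ~1.5 GB figure above: they are typically taken with tools such as GPU-Z or MSI Afterburner, and the same device-wide number can be queried programmatically through NVIDIA's NVML. A minimal sketch using the Python bindings, assuming the pynvml package and a single NVIDIA GPU at index 0:

```python
# Minimal device-wide VRAM usage readout via NVML (assumes the pynvml
# package, an NVIDIA GPU, and a driver that exposes NVML). Note this
# reports the whole GPU's allocation, not a single game's.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # total/used/free, in bytes
print(f"VRAM used: {mem.used / 2**20:.0f} MiB of {mem.total / 2**20:.0f} MiB")
pynvml.nvmlShutdown()
```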
#74
ZoneDymo
nguyen: Well, if you state it that way, then there is no need for future-proofing; every GPU is obsolete when its successor comes out anyway. So why bother with 8 GB or 10 GB of VRAM? Can't play games with anything less than Ultra High settings?
I'd much prefer RT to Ultra High settings; at least it doesn't need 400% magnification to distinguish.
It's not so much that; it's that the first go at a new tech is never going to be worth it, because what you buy it for is so new that it's not established yet, and by the time it is, we're two or so generations further along, which is what's needed to handle it properly anyway.
The RTX 2000 series was the first go at RT, and even now, with the 3000 series out, RT is not really a thing yet. So if you bought the RTX 2000 series for RT, you were just being silly, yet that is what you base your opinion on as to why the 5700 XT was obsolete on release.
#75
nguyen
AusWolf: It's not just the settings, but the general assets of the game, texture quality, etc. You can lower your settings, but you can't lower them indefinitely. The fact that you can still play games fine with 4 GB of VRAM has nothing to do with it. I currently play The Witcher 3 on my GT 1030 2 GB (since I sold my GTX 1660 Ti) and only use about 1.5 GB at 1080p medium settings. Microsoft Flight Simulator would be a totally different story, I guess. You can't say that 4 GB is enough just because it's enough for you.

Besides, if you're happy to play at lower settings, then why is it an issue for you if a graphics card doesn't have hardware-accelerated RT? ;)

Not to mention, if you're lowering your settings right now, then what kind of future-proofing are we talking about?
By your definition, the Radeon VII is the most future-proof GPU right now :roll: (except for the 3090).
Future-proofing means more than just VRAM; it's also the feature set. Anyway, the only people I know who correlate VRAM with performance are tech noobies.
ZoneDymo: It's not so much that; it's that the first go at a new tech is never going to be worth it, because what you buy it for is so new that it's not established yet, and by the time it is, we're two or so generations further along, which is what's needed to handle it properly anyway.
The RTX 2000 series was the first go at RT, and even now, with the 3000 series out, RT is not really a thing yet. So if you bought the RTX 2000 series for RT, you were just being silly, yet that is what you base your opinion on as to why the 5700 XT was obsolete on release.
I really have no clue what you're talking about; I finished four RTX games just fine. Who said you need 100+ FPS to enjoy single-player games? AMD owners? :roll:
Btw, I can probably max out CP2077's settings with my 2080 Ti, and it would still be enjoyable.