Friday, September 7th 2018

NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

NVIDIA's Turing-based RTX 20-series graphics cards are set to begin shipping on September 20th. Their most compelling selling point is the leap in ray-tracing performance, enabled by hardware-based acceleration via the RT cores added to NVIDIA's core design. NVIDIA has been pretty bullish about how this development reinvents graphics as we know it, and is quick to point out the benefits of this approach over other, shader-based approximations of real, physics-based lighting. In a Q&A at the Citi 2018 Global Technology Conference, NVIDIA's Colette Kress expounded on the new architecture's strengths - but also touched upon a possible segmentation of graphics cards by raytracing capabilities.

During that Q&A, Kress put Turing's performance at a cool 2x improvement over the 10-series graphics cards before any raytracing uplift is considered - and when raytracing is brought into the equation, she said performance increases by up to 6x compared to NVIDIA's last generation. There's some interesting wording when it comes to NVIDIA's 20-series lineup, though; as Kress puts it, "We'll start with the ray-tracing cards. We have the 2080 Ti, the 2080 and the 2070 overall coming to market," which, in context, seems to point towards a lack of raytracing hardware in lower-tier graphics cards (presumably those based on the potential TU106 silicon and lower-level variants).
This is just speculation based on Kress's comments - but if it translates to reality, it would be a tremendous misstep for NVIDIA and raytracing in general. The majority of the market games on sub-**70 tier graphics cards (and the 20-series has even seen a price hike, up to $499 for the RTX 2070...), so failing to add RT hardware to lower-tier graphics cards would exclude a huge portion of the playerbase from raytracing effects. This would mean that developers adopting NVIDIA's RTX technologies and implementing Microsoft's DXR would be spending development resources catering to the smallest portion of gamers - the ones with high-performance discrete solutions. And we've seen in the past what developers think of devoting their precious time to such features.
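On the development-cost point, it's worth seeing what that fragmentation looks like in practice. Below is a minimal, hypothetical sketch - ours, not NVIDIA's or Microsoft's sample code - of the capability check a DXR-aware engine performs before enabling raytraced effects; the function name and the assumed `ID3D12Device` pointer are illustrative, while the feature-support query is the standard one D3D12 exposes alongside DXR:

```cpp
#include <windows.h>
#include <d3d12.h>

// Hypothetical helper (our naming): returns true only when the device
// exposes hardware-accelerated DXR, i.e. a card with dedicated RT hardware
// and a driver/runtime that reports a raytracing tier.
bool SupportsHardwareRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &options, sizeof(options))))
        return false; // older runtime: the raytracing query doesn't even exist

    // Cards without RT hardware report D3D12_RAYTRACING_TIER_NOT_SUPPORTED,
    // so the engine must fall back to conventional shader-based effects.
    return options.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

Every card that fails this check is a card the developer still has to serve with a conventional, shader-based fallback - which is exactly the cost-benefit calculation described above.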
Additionally, if this graphics card segregation by RTX support (or lack thereof) were to happen, what would become of NVIDIA's lineup? GTX graphics cards up to the GTX 2060 (and maybe a 2060 Ti), and RTX upwards? Diluting NVIDIA's branding across GTX and RTX doesn't seem like a sensible choice, but of course, if it came to that, it would be much better than keeping the RTX prefix across the board.

It could also be a simple case of it not being feasible to include RT hardware on smaller, lower-performance GPUs. As performance leaks and previews have been showing us, even NVIDIA's top-of-the-line RTX 2080 Ti can only deliver 40-60 FPS at 1080p in games such as the upcoming Shadow of the Tomb Raider and Battlefield V (DICE has even said it had to tone down levels of raytracing to achieve playable performance). Performance improvements before release could bring FPS up to a point, but all signs point towards a needed decrease in rendering resolution for NVIDIA's new 20-series to cope with the added raytracing compute. And if performance looks like this on NVIDIA's biggest (revealed) Turing die, with its full complement of RT cores, we can only extrapolate what raytracing performance would look like on cut-down dies with a lower number of RT execution units. Perhaps it really wouldn't make much sense to add the increased cost and die area of this dedicated hardware if raytracing could only be supported at playable levels at 720p.
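For a rough sense of scale, here's a back-of-the-envelope calculation of our own, taking NVIDIA's quoted peak of roughly 10 gigarays/s for the RTX 2080 Ti at face value and assuming a 60 FPS target (real hybrid renderers spend far fewer rays per pixel and lean on denoising, so treat these numbers as purely illustrative):

```latex
% Peak ray budget per pixel per frame at 1080p and 60 FPS:
\[
  \frac{10 \times 10^{9}\ \text{rays/s}}{1920 \times 1080\ \text{px} \times 60\ \text{fps}}
  \approx 80\ \text{rays per pixel per frame}
\]
% A hypothetical cut-down die with half the RT cores halves that budget;
% dropping the render resolution to 720p claws most of it back, since
\[
  \frac{1280 \times 720}{1920 \times 1080} \approx 0.44.
\]
```

Halve the RT hardware and the per-pixel ray budget halves with it, which is why a drop in rendering resolution is the obvious lever for any cut-down die.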
All in all, it seems to this editor that segregating graphics cards by RTX capabilities would be a mistake, not only because of userbase fracturing, but also because the largest share of players game at the **60 tier and lower. Developers wouldn't be inclined to add RTX to their games for such a small userbase, and NVIDIA would be looking at diluting its gaming brand across RTX and GTX - or risk confusing customers by branding a non-RTX card with the RTX prefix. If any of these scenarios come to pass, I'd venture that it might have been too soon for the raytracing push - even as I applaud NVIDIA for making it anyway, and pushing graphics rendering forward. But perhaps the timing and technology could have been better? I guess we'd all better just wait for actual performance reviews, right?
Source: NVIDIA via Seeking Alpha

132 Comments on NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

#1
Metroid
So now we will have to pay $599 or higher for raytracing; I'm out.
#2
Durvelle27
Well, if someone thought RT would be on lower-tier GPUs, that's their fault.
#3
RejZoR
Durvelle27: Well, if someone thought RT would be on lower-tier GPUs, that's their fault.
Well, some expect it even if it runs at 10 FPS in the end. Remember the times when 3DMark ran like that on top-of-the-line cards? These days, I'm getting like 50+ FPS in anything I throw at graphics cards... Not much of a benchmark if it doesn't bring a top-of-the-line graphics card to its knees.
#4
IceShroom
Do those new RTX cards support 10-bit colour, or still 8-bit like the rest of the Nvidia GeForce series?
#5
Solidstate89
I don't know, this is hardly unprecedented. When tessellation was the new thing on the block, you could really only do it on the high-end cards for several years before performance and architectural improvements moved it downmarket. Also, given that 2080 Tis were only running those raytracing demos at 1080p, I cringe to think how bad performance would be on the 2060 cards and below.

If real-time raytracing does become the next big leap in videogame engine design, I fully expect AMD and nVidia mid-range cards to be able to adequately handle the task in the next 2-3 generations. Until then, though, it'll be the same issue as whenever a brand-new bleeding-edge rendering/engine feature is introduced: it's either hardware-exclusive or at the very least performance-exclusive to the high-end.
#6
phanbuey
Honestly, if they could release a 2080 Ti without ray tracing for $300-$400 less, I would be all over that.
#7
atomicus
They certainly can't call the lower-end cards RTX if they can't deliver ray tracing... and they clearly won't. Driver improvements/optimisation aren't going to deliver that much of a leap in performance, given how barely capable the 2080 Ti is. I'm even concerned the 2070 is going to struggle massively.

This could get really ugly for Nvidia in the months ahead. I think the only saving grace will be raw performance outside of ray tracing... if the 2080 and 2080 Ti deliver on that front, they will at least see decent sales to 1440p/4K gamers who want to push their hardware to the max and don't care about RTX. It will make Nvidia look like idiots for pushing this whole RTX thing in the first place, though... given that no one who can afford the GPU to run it has a low enough resolution monitor to do so... and those that do can't afford the GPU that can run it!! It's going to be one big farce!
#8
coonbro
Maybe a 2080 RTX at $1,500 and then a 2080 GTX at $750 [with a $20 mail-in rebate].
#9
atomicus
phanbuey: Honestly, if they could release a 2080 Ti without ray tracing for $300-$400 less, I would be all over that.
You do realise it's all built on the same chip, right? They won't do that. It's not like disabling RTX would be an option either... in fact, that would only add to the manufacturing cost.
#10
Captain_Tom
Why does this article act like this is new info? Everyone knows the 2060 is very unlikely to have RT cores - Nvidia wants its sheep to pay $600 for cards from now on.

Furthermore, the x60 series is now basically made for laptops first, and the new RT cores are HORRIBLY inefficient. It makes substantially more sense to create a Turing die that lacks RT and Tensor cores, so they can make it a tiny <200 mm² die that only uses 50-100 W.
#11
phanbuey
atomicus: You do realise it's all built on the same chip, right? They won't do that. It's not like disabling RTX would be an option either... in fact, that would only add to the manufacturing cost.
Of course; I'm referring to the lack of RTX as a segregator... as in, the inclusion of RTX is not worth the premium on the new cards at the moment.
#12
Ed_1
atomicus: They certainly can't call the lower-end cards RTX if they can't deliver ray tracing... and they clearly won't. Driver improvements/optimisation aren't going to deliver that much of a leap in performance, given how barely capable the 2080 Ti is. I'm even concerned the 2070 is going to struggle massively.

This could get really ugly for Nvidia in the months ahead. I think the only saving grace will be raw performance outside of ray tracing... if the 2080 and 2080 Ti deliver on that front, they will at least see decent sales to 1440p/4K gamers who want to push their hardware to the max and don't care about RTX. It will make Nvidia look like idiots for pushing this whole RTX thing in the first place, though... given that no one who can afford the GPU to run it has a low enough resolution monitor to do so... and those that do can't afford the GPU that can run it!! It's going to be one big farce!
There is a bigger issue here, IMO: all these more advanced features need DX12.
Well, many DX12 games just don't run smoothly, so first the game devs need to get them running well before adding more features, IMO (both Nvidia and AMD are affected by this).

On the performance of the RT cores, we have to remember they run alongside the normal CUDA cores, so until the RT cores start bottlenecking there shouldn't be too much of an issue, as long as there are options available in the game for RT quality.
#13
metalfiber
Everybody's up in arms over raytracing. Remember when games went from pixels to polygons? Then polygons got shading, and FPS dropped so badly it was unplayable with shading on. The same thing is happening with raytracing.
Wait a couple of years and you'll say to yourselves, "How did we ever play games without raytracing?"
#14
Solidstate89
atomicus: They certainly can't call the lower-end cards RTX if they can't deliver ray tracing... and they clearly won't. Driver improvements/optimisation aren't going to deliver that much of a leap in performance, given how barely capable the 2080 Ti is. I'm even concerned the 2070 is going to struggle massively.
I actually think that's the reason they created the RTX branding. The 2080 Ti, 2080 and 2070 get the RTX branding, and all the cards below them get the GTX branding.
#15
DeOdView
Hmmm... so the 2060 and below should be prefixed as GTX, right?

P.S.
On second thought, it's NV, so who knows?! :)
#16
eidairaman1
The Exiled Airman
This is a washed-up series.
#17
atomicus
phanbuey: Of course; I'm referring to the lack of RTX as a segregator... as in, the inclusion of RTX is not worth the premium on the new cards at the moment.
Well no, it's absolutely NOT worth the premium now... there's nothing to utilise it! And that's not going to change anytime soon. Furthermore, if you can only exploit RTX at 1080p, it makes it even less appealing for the vast majority of gamers who are at 1440p/4K. RTX gaming won't be available to them, so the cards had better deliver in raw performance terms or Nvidia are in serious trouble here.
metalfiber: Everybody's up in arms over raytracing. Remember when games went from pixels to polygons? Then polygons got shading, and FPS dropped so badly it was unplayable with shading on. The same thing is happening with raytracing.
Wait a couple of years and you'll say to yourselves, "How did we ever play games without raytracing?"
I don't disagree that ray tracing is the future, but it is a long way off... and it doesn't change the fact that buying a ray-tracing card today is largely a waste of money if that's the primary reason you're buying it... or at best just a terrible value proposition. But this is very much a chicken-and-egg scenario. You can't have ray-traced games without ray-tracing cards, but people have to buy the cards first, and then the games will (hopefully) follow. There needs to be tremendous confidence among the consumer base for this to succeed, and that was the entire thrust of Jensen's presentation... it's just a shame the pricing has offended so many. If they'd come in at Pascal prices, everyone would be cheering right now. This is what results from a lack of competition, though. AMD need to sort their act out.
#18
ppn
They intentionally ruin the 2080's raytrace performance until Pascal inventory is depleted, and today the variety of available 1080s is half what it was a month ago. They need to get rid of them. Do you seriously think 10 gigarays is not enough for 1080p?

If the 2060 only adds 20% more cores, and no raytracing or tensor cores, the die size would only be about 10% bigger than the 1060's - around 250 mm² - and if they shrink it to 7 nm, about 120 mm².
#19
phanbuey
ppn: They intentionally ruin the 2080's raytrace performance until Pascal inventory is depleted, and today the variety of available 1080s is half what it was a month ago. They need to get rid of them. Do you seriously think 10 gigarays is not enough for 1080p?

If the 2060 only adds 20% more cores, and no raytracing or tensor cores, the die size would only be about 10% bigger than the 1060's - around 250 mm² - and if they shrink it to 7 nm, about 120 mm².
I don't know about that; I think their stock price would tank if they did that.

They can just cram the chips down the throats of their board partners anyway, since there is no competition from AMD.
#20
neatfeatguy
IF the RT thing takes off for Nvidia, give it a couple more generations before it becomes a much more useful aspect of their cards. Right now, I see it as just some kind of selling gimmick. Sure, the new cards may be overall faster than the previous generation, but the whole ray-tracing aspect really hinders how useful the cards can actually be, from what we've been shown.

If things pan out, I could see the mid-range cards in the next gen (most likely two gens from now) being able to provide some kind of decent RT capability. Expecting the mid-range cards to do it now? Just not going to happen.
#21
Dammeron
Back in the day we had separate vertex and pixel shaders, which were replaced by unified ones, because nV and ATi understood that diversification was not the way to go. I think it will be the same with ray-tracing - in a few years we're going to get a universal unit.
#22
zelnep
Ray tracing will be fantastic, but in the future. Imagine: if a $1,250 GPU can't push RTX to 60 FPS at full HD, that means RTX is like two or more generations away. And even if you think 60-ish FPS at full HD is worthwhile and acceptable - guess what - there will be like 0.5% or fewer of PC gamers who actually have a $1,250 GPU, which means no developer will do anything (not even simple optimization or hotfixes) for such a small market (well, they'll say otherwise when Jensen invites them to promo events and pays them to tell you so). So RTX on the 20xx is stillborn! Wake me up when the 30xx hits the market.
#23
diatribe
phanbuey: Honestly, if they could release a 2080 Ti without ray tracing for $300-$400 less, I would be all over that.
The 2080 Ti shouldn't cost any more than its predecessors: either $699 or $749.
#24
Fleurious
diatribe: The 2080 Ti shouldn't cost any more than its predecessors: either $699 or $749.
That’s my take on these cards, regardless of how much better they may or may not perform.
#25
Vayra86
Glad to see you figured it out too @Raevenlord

I knew this when I saw Jensen shout TEN GIGA RAYS like a fool. He tried to pull a Steve Jobs on people. Everything was amazing, fantastic, never seen before, the first time you could get proper use out of this new tech... it was like watching an Apple keynote.

Except with Nvidia, what they showed were jerky tech demos at 30 FPS and a huge blurry mess of a dude dancing at the end. Oh yeah, it had reflections, too.

It's hilarious to see people on TPU echoing that RTRT is the next best thing. As if Nvidia re-invented the wheel. I guess this generation separates the fools from the realists.