Monday, August 20th 2018

NVIDIA's Move From GTX to RTX Speaks to Belief in Revolutionary Change in Graphics

NVIDIA at its Gamescom presentation finally took the lid off its long-awaited refresh to the GeForce lineup - and there's more to it than a thousand-point jump in model numbers (and a consonant change). At the Palladium venue in Cologne, Germany (chock-full with press and NVIDIA-invited attendees), Jensen Huang took the stage to present a video on the advancement of graphics simulation, running from the animation breakthroughs of Tron, the first Star Wars, and the original Tomb Raider, through multi-texturing on the RIVA TNT, to special effects in Hollywood - every incarnation of the pixels and triangles we've grown accustomed to.

We already know the juicy tidbits - the three models being released, when, and their pricing (with a hike to boot on the RTX 2070, which sees its price increased by $100 compared to last generation's 1070). We know the cooling solution official NVIDIA cards will sport, and how the company will be joining efforts with game developers to ensure the extra hardware it has invested time, money, and a name change into will bear fruit. But what's behind this change? What brought us to this point in time? What powered the company's impressive Sol demo?
It's been a long road for NVIDIA ever since Turner Whitted, now a contributor at NVIDIA Research, began his work on multi-bounce recursive ray-tracing way back in 1978. Jensen Huang says GPU development has been moving at ten times the pace Moore's Law demanded of CPUs: with transistor counts doubling roughly every eighteen months, Moore's Law works out to about 100x per decade, so ten times that pace is roughly 1,000x every ten years. But ray-tracing is - or was - expected to require petaflops of computing power: yet another step that would take some ten years to achieve.
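To make the recursion concrete, here is a minimal sketch of the Whitted-style idea in Python: a ray is cast into a scene, and every reflective hit spawns a secondary ray, recursively, until a depth cutoff. The scene, colors, and constants are invented for illustration (lighting and shadow rays are omitted); this is nothing like a production renderer.

```python
# A minimal sketch of Whitted-style recursive ray tracing. Scene and
# materials are made up for illustration; lights/shadow rays omitted.
from dataclasses import dataclass
import math

@dataclass
class Sphere:
    center: tuple
    radius: float
    color: tuple         # base surface color
    reflectivity: float  # 0 = matte, 1 = perfect mirror

def intersect(origin, direction, sphere):
    """Distance along a unit-length ray to the sphere, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, sphere.center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - sphere.radius ** 2
    disc = b * b - 4 * c            # quadratic with a == 1 (unit direction)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-4 else None  # small epsilon avoids self-hits

def trace(origin, direction, scene, depth=0, max_depth=3):
    """Shade one ray; reflective hits recurse - the multi-bounce part."""
    if depth > max_depth:
        return (0.0, 0.0, 0.0)
    t, sphere = min(((intersect(origin, direction, s), s) for s in scene),
                    key=lambda p: p[0] if p[0] is not None else math.inf)
    if t is None:
        return (0.1, 0.1, 0.2)      # background color
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple((p - c) / sphere.radius
                   for p, c in zip(point, sphere.center))
    # Mirror direction: r = d - 2(d.n)n, then recurse for the bounce color
    dn = sum(d * n for d, n in zip(direction, normal))
    reflected = tuple(d - 2 * dn * n for d, n in zip(direction, normal))
    bounce = trace(point, reflected, scene, depth + 1, max_depth)
    return tuple((1 - sphere.reflectivity) * c + sphere.reflectivity * b
                 for c, b in zip(sphere.color, bounce))

scene = [Sphere((0, 0, -3), 1.0, (0.9, 0.2, 0.2), 0.5),
         Sphere((2, 0, -4), 1.0, (0.2, 0.9, 0.2), 0.8)]
print(trace((0, 0, 0), (0, 0, -1), scene))
```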
NVIDIA, naturally, wanted none of that. According to Jensen Huang, that meant the company had to achieve an improvement equivalent to 1,000x more performance - ten years early. The answer to that performance conundrum is RTX: a simultaneous hardware, software, SDK, and library push, united in a single platform. RTX hybrid rendering unifies rasterization and ray tracing, with a first, highly parallel rasterization pass and a second ray-tracing pass that acts only upon the rendered pixels, yet allows for the materialization of effects, reflections, and light sources that lie outside the scene - and are thus virtually nonexistent with pre-ray-tracing rendering techniques. Now, RT cores can work in tandem with rasterization compute solutions to achieve reasonable rendering times for ray-traced scenes that would, according to Jensen Huang, take ten times longer to render on Pascal-based hardware.
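As a toy illustration of that two-pass split (not NVIDIA's actual API - the real pipeline is exposed through DXR and the RTX SDKs), here is a sketch where a cheap raster-style pass shades every pixel into a G-buffer and a second pass spends ray-tracing work only on the pixels flagged as needing it:

```python
# Toy model of hybrid rendering: raster everything cheaply first, then
# trace rays only where the G-buffer says rasterization can't fake it.
# All structures here are illustrative stand-ins, not a real GPU pipeline.
from dataclasses import dataclass

@dataclass
class GBufferPixel:
    color: tuple      # result of the rasterization pass
    needs_rays: bool  # reflective material, or lit by an off-screen source

def raster_pass(width, height):
    """Stand-in for the highly parallel first pass: plain shaded pixels."""
    return [[GBufferPixel(color=(0.5, 0.5, 0.5), needs_rays=(x % 7 == 0))
             for x in range(width)] for y in range(height)]

def ray_pass(gbuffer, trace_fn):
    """Second pass: only the flagged pixels pay the cost of tracing rays."""
    traced = 0
    for row in gbuffer:
        for px in row:
            if px.needs_rays:
                px.color = trace_fn(px.color)  # swap in the ray-traced shade
                traced += 1
    return traced

gbuffer = raster_pass(64, 64)
n = ray_pass(gbuffer, lambda c: tuple(min(1.0, v * 1.2) for v in c))
print(f"rays traced for {n} of {64 * 64} pixels")
```

The selectivity is the whole point: the expensive ray work is spent only where rasterization cannot fake the answer, which is what makes the hybrid approach tractable in real time.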
(NVIDIA CEO Jensen Huang quipped that for gamers to achieve ray-tracing before RT cores were added to the silicon and architecture design mix, they'd have to pay $68,000 for a DGX with four Tesla V100 graphics cards. He even offered to sell it in 3,000 easy payments of $19.95.)
Turing has been ten years in the making, and Jensen Huang says this architecture and its RT Cores are the company's greatest jump in graphics computing - and he likely meant the industry's as well - since CUDA. The pairing of three new or revised processing engines inside each piece of Turing silicon brings about this jump: the Turing SM, which allows for 14 TFLOPS of floating-point alongside 14 TIPS (tera integer operations per second) in concurrent FP and INT execution; the Tensor cores, with their 110 TFLOPS of FP16, doubling to 220 TOPS of INT8, and doubling again to 440 TOPS of INT4 performance; and the RT Core, with its 10 Giga Rays/sec (a figure Jensen Huang loves saying). For comparison, the 1080 Ti can achieve, in peak conditions, 1.21 Giga Rays per second - roughly an eighth of Turing's quoted throughput.
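Those figures hang together arithmetically - Tensor throughput doubles each time the operand width halves, and the ray-rate gap can be checked directly. A quick back-of-the-envelope pass over the numbers quoted above:

```python
# Sanity-checking the quoted Turing figures: throughput doubles as
# operand precision halves, and the RT-core rate vs. the 1080 Ti's.
fp16_tflops = 110
print("INT8:", fp16_tflops * 2, "TOPS")  # 220 - half-width operands
print("INT4:", fp16_tflops * 4, "TOPS")  # 440 - quarter-width operands
print("Ray speedup:", round(10 / 1.21, 2), "x")  # ~8.3x over the 1080 Ti
```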
And the overall effect on performance is nothing short of breathtaking, at least in the terms put forward by Jensen Huang: a single Turing chip replaces the four V100 GPUs found within the DGX, rendering a ray-traced scene in just 45 ms against the DGX's 55 ms. Pascal, on the other hand - in its 1080 Ti rendition, no less - would take 308 ms to render the same scene.
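Taking those times at face value, the implied speedups are easy to check (using only the numbers quoted above):

```python
# Quoted render times for the same ray-traced scene, and what they imply.
times_ms = {"Turing, single chip": 45,
            "DGX, 4x Tesla V100": 55,
            "Pascal 1080 Ti": 308}
base = times_ms["Turing, single chip"]
for name, t in times_ms.items():
    print(f"{name:20s} {t:4d} ms  ({t / base:.2f}x Turing's render time)")
```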
A New Standard of Performance
In a single Turing frame, ray tracing happens concurrently with FP32 shading - without RT cores, the green ray-tracing bar in NVIDIA's frame-time breakdown would be ten times larger. Ray tracing can now be completed alongside FP32 shading, followed by INT shading, and there are resources left over to add some DNN (deep neural network) processing to boot. NVIDIA is looking to generate artificially inferred pixels with that DNN processing: the 110 TFLOPS delivered by the Tensor cores (some 10x the equivalent 1080 Ti performance) will be used to fill in pixels - true to life - as if they had actually been rendered. Super-resolution applications may well follow, since this is essentially a way of increasing pixel density by filling in additional pixels in an image.
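NVIDIA has not published DLSS's internals, so as a stand-in here is a minimal sub-pixel-convolution super-resolution sketch in the same spirit (the ESPCN-style approach, written with PyTorch): a small network predicts the "missing" pixels of a higher-resolution frame. The weights are untrained and the layer sizes are arbitrary; a real model would be trained against ground-truth high-resolution renders.

```python
# Not DLSS itself - an illustrative sub-pixel-convolution upscaler.
# Untrained, arbitrary layer sizes; shown only to make "filling in
# pixels with a network" concrete.
import torch
import torch.nn as nn

class TinySuperRes(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            # Predict scale^2 sub-pixels per color channel per input pixel
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            # Rearrange those channels into a scale-times-larger image
            nn.PixelShuffle(scale),
        )

    def forward(self, x):
        return self.body(x)

net = TinySuperRes(scale=2)
low_res = torch.rand(1, 3, 540, 960)  # one 960x540 frame, NCHW layout
with torch.no_grad():
    high_res = net(low_res)
print(high_res.shape)  # torch.Size([1, 3, 1080, 1920])
```

PixelShuffle is what converts the extra predicted channels into extra spatial resolution - conceptually, the network infers sub-pixels instead of rendering them, which is the trade DLSS-style techniques make.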
Perhaps one of the least "sexy" tidbits of NVIDIA's new generation launch is also one of the most telling. The change from GTX to RTX pays its respects to years of history while leaving them behind, unapologetically, for a full push towards ray-tracing. It speaks of abandoning years upon years of pixel-rasterization refinement in search of what was only theoretically possible not that long ago: real-time ray-tracing of lighting across multiple, physically-based bodies.

The move from GTX to RTX means NVIDIA is putting its full weight behind the RTX platform for its product iterations and the future of graphics computing. It manifests in a re-imagined graphics pipeline, where costly, intricate, but ultimately faked solutions give way to steady, genuine improvements in image quality. And it speaks of a dream where AIs write software (and maybe rewrite themselves), where a perfect, ground-truth image is approximated via DLSS by deep-learning networks trained away from your local computing power and sent your way - true cloud-assisted rendering, of sorts. It's bold, and it's been emblazoned on NVIDIA's vision, professional and gamer alike. We'll be here to see where it leads - with actual ray-traced graphics, of course.
Sources: Ray Tracing and Global Illumination, NVIDIA Blogs, Image Inpainting

65 Comments on NVIDIA's Move From GTX to RTX Speaks to Belief in Revolutionary Change in Graphics

#1
Basard
I suppose a lot of people missed the live stream.... Luckily I took a vacation day from work. It was actually pretty informative and well presented.
I haven't seen it posted anywhere yet in TPU's news, so here it is: www.twitch.tv/videos/299680425
And, for the love of god, just skip to 1:52:00 and click the "X" after Huang gets off the stage.
#2
EntropyZ
I think in gaming it won't be anything impressive. Nvidia has no business fighting their own previous generation which is still competitive. Maybe those people that want to up their resolutions or run 4K comfortably on demanding games would upgrade. Or someone who has been sitting on Maxwell.

Also I didn't notice a big difference when RTX is turned on. I don't think the performance drop will be worth the minuscule visual quality improvement (unless the game devs crack their knuckles and actually do something nice). Currently ambient occlusion does enough already but pushing gaming graphics to new heights doesn't hurt. I can wait for all the benchmarks with RTX on/off.

GeForce is still a GPGPU. So I don't expect any big gaming performance gains either way.

I wonder if there will ever be a "pure gaming" card that dedicates all of the silicon real estate to things that do contribute to overall better frametimes. We had cards built specifically for mining after all. Maybe when they run out of tricks, they can experiment.

Anyway, time to give up your wallet to the leather jacket man, he has won. Resistance is futile.
#3
lexluthermiester
Jay did a summary which explained a few things. Worth a look;
#4
oxidized
EntropyZ: I think in gaming it won't be anything impressive. Nvidia has no business fighting their own previous generation which is still competitive. Maybe those people that want to up their resolutions or run 4K comfortably on demanding games would upgrade. Or someone who has been sitting on Maxwell.

Also I didn't notice a big difference when RTX is turned on. I don't think the performance drop will be worth the minuscule visual quality improvement. Currently ambient occlusion does enough already but pushing gaming graphics to new heights doesn't hurt. I can wait for all the benchmarks with RTX on/off.

GeForce is still a GPGPU. So I don't expect any big gaming performance gains either way.

I wonder if there will ever be a "pure gaming" card that dedicates all of the silicon real estate to things that do contribute to overall better frametimes. We had cards built specifically for mining after all. Maybe when they run out of tricks, they can experiment.

Anyway, time to give up your wallet to the leather jacket man, he has won. Resistance is futile.
Well, let's say there's not a single game that fully embraces what ray tracing is, and of course there's not a single card powerful enough to handle such a game - possibly not even their Quadro 8000, especially since the 2080 Ti has the same amount of "Gigarays".

Sounds to me like this is a very big Pascal chip + these new techs and additions; the computational power increase in "normal" games is probably only possible due to the increase in die size, and consequently in the number of transistors.
#5
ghazi
oxidized: Well, let's say there's not a single game that fully embraces what ray tracing is, and of course there's not a single card powerful enough to handle such a game - possibly not even their Quadro 8000, especially since the 2080 Ti has the same amount of "Gigarays".

Sounds to me like this is a very big Pascal chip + these new techs and additions; the computational power increase in "normal" games is probably only possible due to the increase in die size, and consequently in the number of transistors.
Yeah, and with most of that additional die area going to the new technologies and not to shaders, rasterizers, or anything of the sort, it seems pretty underwhelming for the price. The fact that it's up for preorder for a whole month with no actual performance numbers (only in software utilizing the RT and Tensor cores that the old cards lack, which for a gaming card is NOT a realistic usage scenario) makes me fear for the worst here.
#6
Steevo
250 W power consumption for roughly 20-40% more actual game performance means limited overclocking, high temps, and the addition of all the crap others have already done, stuffed into 5 games two years from now, where it will be supported only on newer cards - and then only half the frame is actually rendered, the other half precooked.


I wonder what happens if we look back at a competing brand's 250 W card... and how it was embraced by the community at $1200.....
#7
Manu_PT
Time to keep my GTX1060 i5 8400 rig for e-sports titles, indies, retro gaming and RTS.

AAA from now on will be on a console, PS4 Pro or Xbox One X (or maybe PS5 or Xbox 2). The PC high-end market is going Apple style. No thanks, I'm not willing to pay this type of money for that.

Congratulations Nvidia for making the PC high-end market a luxury and niche thing.
#8
lexluthermiester
Manu_PT: Time to keep my GTX1060 i5 8400 rig for e-sports titles, indies, retro gaming and RTS.

AAA from now on will be on a console, PS4 Pro or Xbox One X (or maybe PS5 or Xbox 2). The PC high-end market is going Apple style. No thanks, I'm not willing to pay this type of money for that.

Congratulations Nvidia for making the PC high-end market a luxury and niche thing.
What are you talking about? The prices stated are almost the same as current 1080ti, 1080 and 1070 prices. And those model prices will be dropping soon. NVidia is not going "Apple". Stop creating drama where there is none.
#9
ghazi
lexluthermiester: What are you talking about? The prices stated are almost the same as current 1080ti, 1080 and 1070 prices. And those model prices will be dropping soon. NVidia is not going "Apple". Stop creating drama where there is none.
$1200 is almost the same as $700? $800 is almost the same as $500? Maybe you can find one or two board partner cards out of dozens for something like $50 less, but NVIDIA's "MSRP" does not really exist.
#11
ViperXTR
This is like the dawn of GeForce 3 programmable shaders, or Nvidia's gamble on cinematic computing like GeForce FX. I'd wait for the 2nd or 3rd gen iteration of it before it can really take off.
#12
Raendor
lexluthermiester: What are you talking about? The prices stated are almost the same as current 1080ti, 1080 and 1070 prices. And those model prices will be dropping soon. NVidia is not going "Apple". Stop creating drama where there is none.
You drunk? The 1080 Ti was released for $649 and $699 for AIB and Founders respectively. The 1080 was $549 after that. This time it's $999/$1,200 and $699/$799 ffs!
#13
Flanker
If game developers start to rely on this ray-tracing hardware for their rendering, things could get pretty shitty for AMD, couldn't they? Unless they already have something similar coming out.
#14
RejZoR
Flanker: If game developers start to rely on this ray-tracing hardware for their rendering, things could get pretty shitty for AMD, couldn't they? Unless they already have something similar coming out.
Would be hilarious and ironic if relentlessly bashed GCN turns out to be excellent at raytracing compute XD
#15
Prima.Vera
I guess we'll see in 2-3 months what AMD has to bring to the table, but knowing how they've competed in the last few years, they most likely have something that's on the same performance level as the 1080 Ti...
#16
ZoneDymo
Prima.Vera: I guess we'll see in 2-3 months what AMD has to bring to the table, but knowing how they've competed in the last few years, they most likely have something that's on the same performance level as the 1080 Ti...
Sigh... the only problem AMD had was availability tbh, Vega 56 was and is a fantastic card and you know it.
#17
lexluthermiester
ghazi: $1200 is almost the same as $700? $800 is almost the same as $500? Maybe you can find one or two board partner cards out of dozens for something like $50 less, but NVIDIA's "MSRP" does not really exist.
Not sure where you're getting your info, but $999, $699 and $499 came straight from NVidia and are similar, if not identical, to pricing for current models. Check your facts.
Raendor: You drunk? The 1080 Ti was released for $649 and $699 for AIB and Founders respectively. The 1080 was $549 after that. This time it's $999/$1,200 and $699/$799 ffs!
Nope, quite sober and know how to read..
#18
Nkd
lexluthermiester: Not sure where you're getting your info, but $999, $699 and $499 came straight from NVidia and are similar, if not identical, to pricing for current models. Check your facts.


Nope, quite sober and know how to read..
Fanboy talk here. Where have you been? You forgot the Founders Edition tax? Where the fake MSRP happens after like 6 months lol. Stop talking like a fanboy. Please find me a card that's $999 right now. Or $699. Stop drinking the Nvidia Kool-Aid.
#19
lexluthermiester
Nkd: Fanboy talk here. Where have you been? You forgot the Founders Edition tax? Where the fake MSRP happens after like 6 months lol. Stop talking like a fanboy. Please find me a card that's $999 right now. Or $699. Stop drinking the Nvidia Kool-Aid.
The Titans are well over $1000, the 1080ti is in the $700 to $800 range, the 1080 is $550 to $700 and the 1070ti is below $500. Fanboy what now? Go do some homework.
#20
Nkd
lexluthermiester: The Titans are well over $1000, the 1080ti is in the $700 to $800 range, the 1080 is $550 to $700 and the 1070ti is below $600. Fanboy what now? Go do some homework.
I am not sure what you are talking about. You did not answer my question. Keep drinking the Kool-Aid lol!
#21
lexluthermiester
Nkd: I am not sure what you are talking about. You did not answer my question. Keep drinking the Kool-Aid lol!
Because it was a worthless, pointless question. Who's the fanboy?
#22
hat
If you have a problem with the price, then vote with your wallet. It's the most powerful tool you have as a consumer. I too think $1200 is a ridiculous price for a graphics card, even a Titan. That's why I don't have one. Hell, I paid an insane price for two 1070s (around $900), but that was justifiable because of mining, which fortunately has worked out in my favor. Normally I'd never buy two cards, only one, but I still wouldn't pay $450 for a gaming card. I'd still have my 660 Ti while eyeballing a 1060 or even a 1050 Ti.
#24
Xaled
Why call something that would give X% improvement for X% more money "revolutionary"? Not to mention that software supporting this technology would become much more expensive, because NVIDIA would charge the developers too...
So in the end we would be getting X% improvement at 2-3x the cost.
#25
ShurikN
Xzibit
Steve hit every nail on the head. Especially from 13:00 onward.
"For now Ray-Tracing looks to be more hype than anything, and nVidia are using this to hide the fact they don't actually have anything new to sell"