Monday, August 20th 2018

NVIDIA's Move From GTX to RTX Speaks to Belief in Revolutionary Change in Graphics

NVIDIA at its Gamescom presentation finally took the lid off its long-awaited refresh to its GeForce lineup - and there's more than a thousand-point jump in model numbers (and a consonant change) to it. At the Palladium venue in Köln, Germany (which was chock-full of press and NVIDIA-invited attendees), Jensen Huang went on stage to present a video on the advancement of graphics simulation, tracing milestones such as Tron, the first Star Wars, the original Tomb Raider, multi-texturing on the RIVA TNT, and Hollywood special effects - every incarnation of the pixels and triangles we've grown accustomed to.

We already know the juicy tidbits - the three models being released, when, and their pricing (with a hike to boot on the 2070 graphics card, which sees its price increased by $100 compared to last gen's 1070). We know the cooling solution NVIDIA's own cards will sport, and how the company will be joining efforts with game developers to ensure the extra hardware it has invested time, money, and a name change into will bear fruit. But what's behind this change? What brought us to this point in time? What powered the company's impressive Sol demo?
It's been a long road for NVIDIA ever since Turner Whitted - now with the company - began his work on multi-bounce recursive ray-tracing way back in 1978. Jensen Huang says GPU development has been improving at ten times the pace Moore's Law demanded of CPUs - a 1,000x gain every ten years. But ray-tracing is - or was - expected to require petaflops of computing power: yet another step that would take some ten years to achieve.
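To ground what "multi-bounce recursive ray-tracing" means, here is a deliberately minimal Whitted-style sketch: a single hard-coded sphere, one directional light, and mirror reflections followed recursively up to a fixed bounce depth. The scene, constants, and function names are all invented for illustration - this is the idea, not any production renderer.

```python
# Minimal Whitted-style recursive ray tracer: one sphere, one light,
# mirror reflections up to a fixed depth. Illustrative only.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def scale(a, s):
    return tuple(x * s for x in a)

def normalize(v):
    n = dot(v, v) ** 0.5
    return scale(v, 1.0 / n)

SPHERE = {"center": (0.0, 0.0, -3.0), "radius": 1.0, "reflectivity": 0.5}
LIGHT_DIR = normalize((1.0, 1.0, 1.0))

def hit_sphere(origin, direction):
    """Return distance t to the sphere along a normalized ray, or None on a miss."""
    oc = sub(origin, SPHERE["center"])
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE["radius"] ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - disc ** 0.5) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0, max_depth=3):
    """Shade a ray; recurse along the mirror reflection (the 'multi-bounce' part)."""
    t = hit_sphere(origin, direction)
    if t is None:
        return 0.1  # background intensity
    point = tuple(o + d * t for o, d in zip(origin, direction))
    normal = normalize(sub(point, SPHERE["center"]))
    local = max(0.0, dot(normal, LIGHT_DIR))  # simple diffuse term
    if depth >= max_depth:
        return local
    # reflect the ray: r = d - 2(d.n)n, then recurse for the next bounce
    refl = sub(direction, scale(normal, 2.0 * dot(direction, normal)))
    return local + SPHERE["reflectivity"] * trace(point, normalize(refl), depth + 1)
```

The recursion is exactly why the technique is so expensive: every shaded point can spawn further rays, and scene-intersection tests dominate the cost - which is the operation the RT cores accelerate in hardware.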
NVIDIA, naturally, didn't want any of that. According to Jensen Huang, that meant the company had to achieve an improvement equivalent to 1,000x more performance - ten years early. The answer to that performance conundrum is RTX: a simultaneous hardware, software, SDK, and library push, united in a single platform. RTX hybrid rendering unifies rasterization and ray tracing, with a first rasterization pass (highly parallel) and a second ray tracing pass that only acts upon the rendered pixels, yet allows for effects, reflections, and light sources that lie outside of the scene - and were thus virtually nonexistent with pre-ray-tracing rendering techniques. Now, RT cores can work in tandem with rasterization compute solutions to achieve reasonable rendering times for ray-traced scenes that would, according to Jensen Huang, take ten times longer to render on Pascal-based hardware.
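As a toy illustration of that two-pass split, the sketch below rasterizes per-pixel visibility into a G-buffer, then spends rays only on covered pixels. The one-dimensional "scene" dictionary and every name in it are made up for this example - this is the shape of hybrid rendering, not any real API.

```python
# Toy hybrid pipeline: pass 1 rasterizes visibility, pass 2 traces
# secondary rays only for pixels the rasterizer actually covered.

def add(c1, c2):
    return tuple(a + b for a, b in zip(c1, c2))

def rasterize(scene, width):
    """Pass 1: cheap, highly parallel visibility test per pixel.
    Returns a G-buffer of surface ids (or None where nothing was hit)."""
    return [scene["coverage"](x) for x in range(width)]

def ray_trace_pass(scene, gbuffer):
    """Pass 2: spend rays only on covered pixels - this is what keeps
    the ray budget manageable while still picking up off-screen
    reflectors and lights that rasterization alone can never see."""
    image = []
    for surf in gbuffer:
        if surf is None:
            image.append((0.0, 0.0, 0.0))  # background: no rays traced
        else:
            color = scene["base_color"][surf]
            # a secondary ray may hit geometry outside the camera frustum
            color = add(color, scene["trace_reflection"](surf))
            image.append(color)
    return image

# toy scene: only the two leftmost pixels are covered by surface 0
scene = {
    "coverage": lambda x: 0 if x < 2 else None,
    "base_color": {0: (0.5, 0.5, 0.5)},
    "trace_reflection": lambda s: (0.1, 0.1, 0.1),
}
image = ray_trace_pass(scene, rasterize(scene, 4))
```

The design point is the ordering: rasterization answers "what is visible?" cheaply, so the expensive ray tracing only pays for pixels that end up on screen.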
(NVIDIA CEO Jensen Huang quipped that for gamers to achieve ray-tracing before RT cores were added to the silicon and architecture design mix, they'd have to pay $68,000 for the DGX with four Tesla V100 graphics cards. He even offered to take it in 3,000 easy payments of $19.95.)
Turing has been ten years in the making, and Jensen Huang says this architecture and its RT cores are the company's greatest jump in graphics computing - and he likely meant the industry's as well - since CUDA. The combination of three new or revised processing engines inside each piece of Turing silicon brings about this jump. The Turing SM allows 14 TFLOPS alongside 14 TIPS (tera integer operations per second) of concurrent FP and INT execution; the Tensor cores deliver 110 TFLOPS of FP16, 220 TOPS of INT8, and double again to 440 TOPS of INT4; and the RT core delivers 10 Giga Rays/sec (a figure Jensen Huang loves saying). For comparison, the 1080 Ti would achieve, under peak conditions, 1.21 Giga Rays per second - roughly one-tenth the performance.
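A quick sanity check on what those "Giga Rays" buy: dividing the quoted ray rates by the pixel rate of common display modes gives a rough per-pixel ray budget per frame. The throughput figures are the ones quoted on stage; everything else below is plain arithmetic, ignoring real-world factors like ray coherence and denoising.

```python
# Back-of-the-envelope ray budget per pixel per frame.
TURING_RAYS_PER_SEC = 10e9      # quoted for the Turing RT cores
PASCAL_RAYS_PER_SEC = 1.21e9    # quoted peak for the 1080 Ti

def rays_per_pixel(rays_per_sec, width, height, fps):
    """Rays available per pixel each frame at a given resolution and frame rate."""
    pixels_per_sec = width * height * fps
    return rays_per_sec / pixels_per_sec

for name, rate in [("Turing", TURING_RAYS_PER_SEC), ("1080 Ti", PASCAL_RAYS_PER_SEC)]:
    print(f"{name}: {rays_per_pixel(rate, 3840, 2160, 60):.1f} rays/pixel at 4K60, "
          f"{rays_per_pixel(rate, 1920, 1080, 60):.1f} at 1080p60")
```

At 4K60 the quoted Turing figure works out to about 20 rays per pixel per frame, against roughly 2.4 for the 1080 Ti - which illustrates why hybrid rendering, spending rays only where they matter, is the practical approach rather than tracing every sample.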
And the overall effect on performance is nothing short of breathtaking, at least in the terms put forward by Jensen Huang: a single Turing chip replaces the four V100 GPUs found within the DGX - with a lowered render time of just 45 ms against the V100s' 55 ms for a ray-traced scene. Pascal, on the other hand, would take 308 ms to render the same scene - and that in its 1080 Ti rendition, no less.
A New Standard of Performance
Within a single Turing frame, ray tracing runs concurrently with the FP32 shading process - without RT cores, the green ray-tracing bar in NVIDIA's frame-time breakdown would be ten times larger. Now it completes within the FP32 shading window, followed by INT shading. And there are resources left over for some DNN (deep neural network) processing to boot: NVIDIA is looking to generate artificially-designed pixels with the 110 TFLOPS of its Tensor cores - which in Turing deliver some 10x the equivalent 1080 Ti performance - filling in pixels, true to life, as if they had actually been rendered. Super-resolution applications may follow: this could well be a way of increasing pixel density by filling in additional pixels in an image.
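The "fill in pixels" idea in miniature: upscale an image by synthesizing the missing pixels. In the sketch below, plain bilinear interpolation stands in for the trained network - DLSS's pitch is precisely that a DNN produces far more plausible pixels than this kind of fixed-function interpolation ever could. The function and its details are invented for illustration.

```python
# Super-resolution in miniature: double a grayscale image's resolution
# by inventing the in-between pixels (here: bilinear interpolation,
# standing in for a learned upscaler).

def upscale_2x(image):
    """Return a 2x-resolution copy of a row-major grayscale image."""
    h, w = len(image), len(image[0])
    out = [[0.0] * (2 * w) for _ in range(2 * h)]
    for y in range(2 * h):
        for x in range(2 * w):
            # map the output pixel back into source coordinates
            sy, sx = min(y / 2, h - 1), min(x / 2, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
            bot = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out
```

Swap the interpolation for a network trained against ground-truth high-resolution renders and you have the rough shape of what NVIDIA describes: most pixels rendered, the rest inferred.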
Perhaps one of the least "sexy" tidbits of NVIDIA's new generation launch is one of the most telling. The change from GTX to RTX speaks of years of history being paid their respects, then left behind, unapologetically, for a full push towards ray-tracing. It speaks of leaving behind years upon years of pixel rasterization improvement in search of that which was only theoretically possible not so long ago - real-time ray-tracing of lighting across multiple, physically-based bodies.

The move from GTX to RTX means NVIDIA is putting its full weight behind its RTX platform for product iterations and the future of graphics computing. It manifests in a re-imagined pipeline for graphics production, where costly, intricate, but ultimately faked solutions give way to steady improvements in graphics quality. And it speaks of a dream where AIs can write software themselves (and maybe rewrite themselves), where the perfect, ground-truth image is learned via DLSS in deep-learning-powered networks away from your local computing power and sent your way - true cloud-assisted rendering, of sorts. It's bold, and it's been emblazoned on NVIDIA's vision, professional and gamer alike. We'll be here to see where it leads - with actual ray-traced graphics, of course.
Sources: Ray Tracing and Global Illumination, NVIDIA Blogs, Image Inpainting

65 Comments on NVIDIA's Move From GTX to RTX Speaks to Belief in Revolutionary Change in Graphics

#26
Fabio
ZoneDymoSigh... the only problem AMD had was availability tbh, Vega 56 was and is a fantastic card and you know it.
Yes, at a $349 price...
Posted on Reply
#27
John Naylor
EntropyZAlso I didn't notice a big difference when RTX is turned on. I don't think the performance drop will be worth the minuscule visual quality improvement (unless the game devs crack their knuckles and actually do something nice). Currently ambient occlusion does enough already but pushing gaming graphics to new heights doesn't hurt. I can wait for all the benchmarks with RTX on/off.
But isn't that kinda like looking at Motion Blur comparisons between 60 Hz and 144 Hz on a 60 Hz monitor? Until the engine is designed for RT, you won't be able to see it.
lexluthermiesterJay did a summary which explained a few things. Worth a look;
I was tempted to look but after watching him drill the circuit traces on a motherboard to mount a CPU cooler, as well as his water cooling videos, I just can't take anything he does seriously. 15 seconds in I have flashbacks to that drill scene. :0

Actually, unless a youtube video is associated with a print web review site, I just don't watch. HardwareCanucks' case reviews are one I do look at. Looking forward to Wiz's return from Brazil.
Posted on Reply
#28
BluesFanUK
PS5/XB2 is threatening to eclipse PC gaming. They will never hit PC level graphical power but the difference is reducing to the naked eye with each generation.
Posted on Reply
#29
John Naylor
ghazi$1200 is almost the same as $700? $800 is almost the same as $500? Maybe you can find one or two board partner cards out of dozens for something like $50 less but NVIDIA's "MSRP" does not really exist.
You accounting for inflation and the 25% US tariffs in there? If ya do, the 2080 is actually significantly cheaper than the 10xx series and the 2070 is $2 more.
Prima.VeraI guess we'll see in 2-3 months what AMD has to bring to the table, but knowing how they'd compete in the last years, they most likely have something that's on the same performance levels as the 1080Ti...
Accounting for the huge difference in overclock ability, AMD hasn't been competitive at the top end in years ..

With the 7xx series, nVidia crushed AMD's marketing campaign for the 290X by dusting off the 780 Ti design they had sitting on the shelf ... and as it turned out, with both overclocked, the 780 was faster than the 290X. Then the 970 came out and nVidia took the top 3 tiers, with nVidia selling 2+ times more than all of AMD's 2xx and 3xx series combined. There was an illusionary battle for supremacy at the 1060 vs 480 level, but when OC'ing was figured in, the weak OC ability of AMD cards left them 10% behind. The 1060 took over as the most popular card in Steam's hardware survey, a position previously held by the 970. Here I think nVidia will be trying to establish a new tier ... something that can actually drive a 4k monitor with motion blur reduction. I'm hoping so, cause if they push to establish dominance down to the 1050 level, that pushes AMD to almost the point of irrelevancy.

I don't see nVidia doing that, as it could lead to anti-trust or other regulatory concerns ... It also doesn't seem like a good idea for AMD. The idea that AMD has something up their sleeve to challenge the Ti is hard to swallow; they haven't competed at the top end since the 7xxx series, and that's 6+ years ago. I think AMD's best move is to start from the bottom up ... head off any challenge posed by the 2050 and, instead of putting a card out to challenge the 2070, take that card, price it between the 2060 and 2070, and make it a competitive choice over the 2060. In other words, what nVidia did with the 970 .... at a relatively small price increase over the 960 and AMD's offering, it killed. And that's where the volume sales are.
Posted on Reply
#31
efikkan
BluesFanUKPS5/XB2 is threatening to eclipse PC gaming. They will never hit PC level graphical power but the difference is reducing to the naked eye with each generation.
No, the consoles are more and more targeting casual gamers, and the hardware is lagging further and further behind PCs, so gaming enthusiasts are slowly leaving the platform. Nintendo has dropped all the way down to a tablet, and in a few years there will be TVs with better hardware built in. Back in the day, consoles actually competed with PC gaming.
Posted on Reply
#32
lexluthermiester
BluesFanUKPS5/XB2 is threatening to eclipse PC gaming. They will never hit PC level graphical power but the difference is reducing to the naked eye with each generation.
Your comments will fall on deaf ears here. We don't care about consoles. Until consoles have the versatility and configurability of a PC, they will never have what it takes to compete.
Posted on Reply
#33
Vayra86
BluesFanUKPS5/XB2 is threatening to eclipse PC gaming. They will never hit PC level graphical power but the difference is reducing to the naked eye with each generation.
Good point. There is no ray tracing in console tech as of now, another confirmation it's not going to take off. All AMD needs to do is call out RTX for the cheap scam it really is, and profit.
Posted on Reply
#34
lexluthermiester
Vayra86Good point. There is no ray tracing in console tech as of now, another confirmation its not going to take off. All AMD needs to do is call out RTX for the cheap scam it really is, and profit.
It's not a cheap scam. It's a step in the right direction, bringing Hollywood-quality visuals to gaming - something that needed to happen years ago.
Posted on Reply
#35
Vayra86
lexluthermiesterIt's not a cheap scam. It's step in the right direction, bringing hollywood quality visuals to gaming, something that needed to happen years ago.
Sure thing. Good luck buying into it then, paying Pascal prices for Pascal performance once more. This won't fly, mark my words. Save this post and get back to it next year; or just look at the number of PhysX games on GPU today; or the number of HairWorks games, or Turf Effects... or...
Posted on Reply
#36
lexluthermiester
Vayra86Sure thing. Good luck buying into it then, paying Pascal prices for Pascal performance once more. This won't fly, mark my words. Save this post and get back to it next year; or just look at the number of PhysX games on GPU today; or the number of HairWorks games, or Turf Effects... or...
There is a difference between Physx and ray-tracing. Physx was a platform for doing physics modeling in a game, something that can be done in software at similar performance hits. Ray-tracing is a graphics rendering technique that is well known and well used, but needs dedicated hardware to run more efficiently. NVidia has just given us all that hardware. It was only a matter of time until it was brought into the gaming sector. Smart devs are going to use it. Foolish and moronic devs will ignore it and adopt the attitude of ignorance.
Posted on Reply
#37
Vayra86
lexluthermiesterThere is a difference between Physx and ray-tracing. Physx was a platform for doing physics modeling in a game, something that can be done in software at similar performance hits. Ray-tracing is a graphics rendering technique that is well known and well used, but needs dedicated hardware to run more efficiently. NVidia has just given us all that hardware. It was only a matter of time until it was brought into the gaming sector. Smart devs are going to use it. Foolish and moronic devs will ignore it and adopt the attitude of ignorance.
Wow, you should join Jensen on stage shouting TEN GIGA RAYS PER SECOND PEOPLE!

You mistake theory for practice. The current iteration isn't practical, and economically it won't work.

Do you seriously believe Nvidia didn't drop a huge bag of money at each of those devs? No sane mind optimizes a game for the top 5% and considers that good business. You can look at Crytek to see how those developers fare...
Posted on Reply
#38
lexluthermiester
Vayra86Wow, you should join Jensen on stage shouting TEN GIGA RAYS PER SECOND PEOPLE!

You mistake theory for practice. The current iteration isn't practical, and economically it won't work.

Do you seriously believe Nvidia didn't drop a huge bag of money at each of those devs? No sane mind optimizes a game for the top 5% and considers that good business. You can look at Crytek to see how those developers fare...
Do you understand just how useful ray-tracing is? When every movie studio on the planet uses it for their CGI, it's big and has been for decades. And now it's come to the consumer. If you fail to understand what ray-tracing has done for the world and its potential for gaming, that is only your failure.
Posted on Reply
#39
Steevo
lexluthermiesterDo you understand just how useful ray-tracing is? When every movie studio on the planet uses it for their CGI, it's big and has been for decades. And now it's come to the consumer. If you fail to understand what ray-tracing has done for the world and its potential for gaming, that is only your failure.
It's useful, sure, but not in the closed walled garden of GameWorks. It's useful when it gets implemented - and based on game consoles running AMD hardware with DX12/Vulkan technology for the next 3 years, it will only see the light of day in a few select titles on PC, and then half of it will be CPU-based.

Nvidia has a habit of making something that is only fully featured in games that suck ass. Arkham, anyone? PhysX: en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

Let's claim it's something great when we actually have a good set of games using it and everyone can use it. Till then, a few thousand people with hardware capable of something not yet implemented - something that will likely demand more performance than a first-generation card can produce - is worthless.
Posted on Reply
#40
Recus
ShurikNSteve hit every nail on the head. Especially from 13:00 onward.
"For now Ray-Tracing looks to be more hype than anything, and nVidia are using this to hide the fact they don't actually have anything new to sell"
lol, almost every new GPU generation introduces new technologies. RTX is not like the Unified Shader Architecture, so it won't take over rasterization overnight. Saying that the RTX 2000 series won't have any improvement in performance is silly.

Dumb people see fewer CUDA cores/TFLOPS = it must be slower. 780 Ti (2880c/5.1 TFLOPS) vs 980 (2048c/4.6 TFLOPS): which is faster?
Vayra86Wow, you should join Jensen on stage shouting TEN GIGA RAYS PER SECOND PEOPLE!

You mistake theory for practice. The current iteration isnt practical, and economically it wont work.

Do you seriously believe Nvidia didnt drop a huge bag of money at each of those devs? No sane mind optimizes a game for the top 5% and considers that good business. You can look at Crytek to see how those developers fare...
Wolfenstein 2 Beta patch: Async Compute brings 5% extra performance for RX Vega 64
AMD finds 5% more performance with async
Async Compute Only Boosted HITMAN’s Performance By 5-10% on AMD cards
Posted on Reply
#41
efikkan
Vayra86There is no ray tracing in console tech as of now, another confirmation its not going to take off. All AMD needs to do is call out RTX for the cheap scam it really is, and profit.
Raytracing is not a scam. It will take a while before it's very useful, but all new technology has to be introduced some way, and this is only the beginning of the next revolution in graphics. Give it a few iterations, and you'll see it's useful.

Even with the limited usefulness of raytracing in the GeForce 20xx series, the cards will still be the most powerful GPUs we've seen by far. It's astounding how all the "wise guys" on Youtube manage to know the performance of this new generation, failing to realize the obvious: the SMs are completely redesigned. What we normally refer to as "cores" in GPUs are not cores at all - they're FPUs (usually FPU/ALU combos). Turing uses a different structure where FPUs and ALUs are separate clusters, allowing much higher throughput. We'll have to wait and see how much of a difference this makes for gaming. I honestly don't know if we will see a large difference across the board, or more of a boost in certain types of games. But I do know that Nvidia wouldn't switch to this unless it was significantly better.
Posted on Reply
#42
lexluthermiester
SteevoLets claim its someting great when we actually have a good set of games using it and everyone can use it.
How about every major Hollywood production that uses CGI? Welcome to the real world. The future is now.
Posted on Reply
#43
Xzibit
ViperXTRwww.pcgameshardware.de/Grafikkarten-Grafikkarte-97980/Videos/Nvidia-RTX-2080-Ti-Performance-in-Shadow-of-the-Tomb-Raider-1263244/

looks like the 2080 Ti is running Tomb Raider and drops to 30+ fps in heavy scenarios; still an early version of the game
It's also at 1080p, and the game releases next month, Sep 14 - it's in its final stages.
PCGamesHardware.DeWe tried Shadow of the Tomb Raider with raytracing on a GeForce RTX 2080 Ti and show in the video the frames per second at Full HD, presumably at the highest detail level.
$1200 card with RTX at 1080p 30-60fps
Posted on Reply
#44
Raunhofer
People here are calling ray-tracing a scam.
After commenting they go back to watch the new Netflix movie filled with CGI they don't even know is CGI... because of ray-tracing. Ignorance never goes away, does it?

Comparing ray-tracing to HairWorks is the new "VR is 3D-TV".
Posted on Reply
#45
ShurikN
RaunhoferAfter commenting they go back to watch the new Netflix movie filled with CGI they don't even know is CGI... because of ray-tracing. Ignorance never goes away, does it?
That movie was made with probably half a million dollars in ray-tracing capable hardware. Not a single RTX 2080. Not in real time.
No one here is going to make CGI movies with a gaming card.
At this point in time ray-tracing is a gimmick. In two years it will be the best thing ever; right now it's worthless for gaming.

This entire series was made as a stopgap until 7nm matures. Hype people up with RT, get them to buy Turing saying it's the best thing since sliced bread and cocaine, and then in mid-2019 come out with a new card with everything improved, including ray-tracing capabilities. Get money from the sheep twice in less than one year. Cupertino style.
This might be one of the most pointless releases in recent history, along with Kaby Lake, the handful of Bulldozer iterations, and the Radeon 2000 series.
Posted on Reply
#46
efikkan
Raytracing is no more a "gimmick" than DirectX 12 was a "gimmick". We're still waiting for good games to use it natively, yet many jumped on the hype train and bought cards that were supposed to be more "future proof". Most of those buyers have already replaced, or are about to replace, their hardware anyway…
Posted on Reply
#47
Steevo
RaunhoferPeople here calling ray-tracing a scam.
After commenting they go back to watch the new Netflix movie filled with CGI they don't even know is CGI... because of ray-tracing. Ignorance never goes away, does it?

Comparing ray-tracing to hairworks is the new VR is 3D-TV.
www.cnet.com/news/vr-is-not-dying-insists-manufacturer-of-vr-headset/


store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

www.roadtovr.com/valve-monthly-active-vr-users-on-steam-are-up-160-year-over-year/


.07% winning .. sure looks funny if it looks like that
Posted on Reply
#48
Xzibit
efikkanRaytracing is no more a "gimmick" than DirectX 12 was a "gimmick". We're still waiting for the good games to use it natively, yet many jumped on the hype train and bought cards that were supposed to be more "future proof". Most of those buyers have already or are about to replace their hardware anyway…
Shadow of the Tomb Raider confirmed RTX support would come as a patch, not natively at release. EA won't commit to saying anything about performance metrics.

Game needs DX12 support, On top of that needs DXR support. Then it has to be supported by Nvidia GameWorks RT SDK.

How many games include DX12 support? How many games run better on DX12 over DX11 on Nvidia hardware? How many games include GameWorks without Nvidia backing?
Posted on Reply
#49
Vayra86
lexluthermiesterDo understand just how useful ray-tracing is? When every movie studio on the planet uses it for their CGI, it's big and has been for decades. And now's it's come to the consumer. If you fail to understand what ray-tracing has done for the world and it's potential for gaming, that is only your failure.
Read again. I never said it wasn't useful; I'm saying this implementation is a failure.
Posted on Reply
#50
Jism
Geezus, what were they thinking at the marketing department... Let's raise pricing like Apple does, and charge a broad $1,100 for the premium model.

The lack of AMD competing at the high end is what creates this madness. AMD needs to get its stuff together; that 7nm Vega ain't going to cut it.
Posted on Reply