Saturday, September 23rd 2023

NVIDIA Foresees Greater Reliance on AI Rendering Techniques in Games

Digital Foundry organized an AI Visuals Roundtable earlier this week, with Alex Battaglia (Video Producer) acting as host—the main topic being a recently visually upgraded version of Cyberpunk 2077. Guests included Bryan Catanzaro (NVIDIA's Vice President of Applied Deep Learning Research), Jakub Knapik (VP of Art and Global Art Director at CD Projekt RED), Jacob Freeman (GeForce Marketing Evangelist), and Pedro Valadas (PCMR founder). Team Green, naturally, advocates its newly released DLSS 3.5 and Ray Reconstruction technologies—brute-force rendering techniques are considered passé and no longer an ideal solution for upcoming GPU generations. Valadas was curious enough to ask whether NVIDIA had any plans to target reasonable performance levels—minus DLSS—at native resolutions.

Catanzaro focused on the benefits introduced in Update 2.0 and Phantom Liberty: "I think that DLSS 3.5 actually makes Cyberpunk 2077 even more beautiful than native rendering...that's my belief. The reason being that the AI is able to make smarter decisions about how to render a scene...and I think that's going to continue to develop." He half-jokingly states that rasterization is a "bag of fakeness." A combination of DLSS and path tracing is preferred over old-hat methods, thus attaining the most realistic visual results. He summarizes with his own slogan: "Native rendering is fake frames." Catanzaro predicts that real-time graphics industries will become increasingly reliant on AI-processed image reconstruction and rendering technologies in the future. He hopes that AAA game environments will become cheaper to make in due time—with a shift to neural rendering. DLSS 10 could interface with game engine technology at a higher level, with a possible outcome being the creation of more "immersive and beautiful" experiences.
Digital Foundry's video description states: "Many thanks to all participants in this roundtable chat: Bryan Catanzaro, Vice President Applied Deep Learning Research at Nvidia, Jakub Knapik VP Art and Global Art Director at CD Projekt RED, GeForce evangelist Jacob Freeman and Pedro Valadas of the PCMR sub-Reddit."


Cyberpunk 2077 2.0 is here—the first showcase for DLSS 3.5 ray reconstruction, integrated into a full DLSS package including super resolution and frame generation, all combining to produce a state-of-the-art visual experience. In this roundtable discussion, we discuss how ray reconstruction works, how it was developed and its applications outside of path-traced games. We also talk about the evolution of the original DLSS, the success of DLSS in the modding community and the future of machine learning in PC graphics.
Sources: Eurogamer, Tom's Hardware

47 Comments on NVIDIA Foresees Greater Reliance on AI Rendering Techniques in Games

#26
Vya Domus
Assimilator: I remember when new technology to improve visual fidelity was all that PC gamers cared about.
I remember that too, but back then it was worth it—games really did look better year after year. Now they don't; you have to squint to be able to tell that a reflection is more accurate using X technology instead of Y, but the impact on performance is catastrophic nonetheless.

Ray tracing was bad enough performance-wise, but you could argue it makes a big difference to visuals. Now Nvidia is pushing for full path tracing: you get a quarter of the performance, down to literal slideshow levels of framerate, for what, exactly? Even the most hardcore fanboy would have trouble telling the difference if the images weren't labeled. Are we supposed to just pretend it's amazing so we pay Nvidia $2,000 to make the games playable with a million levels of upscaling/interpolation/denoising/etc.?
Posted on Reply
#27
gffermari
Vya Domus: Are we supposed to just pretend it's amazing ...
It is amazing when the correct RT effects are implemented. Most console titles that come with RT support just low-res RT shadows or reflections, not even both.
Also, most people don't even understand what RT does. It won't improve the models, it won't improve the design, it won't improve the texture quality. The main thing it does is light simulation, which is critical in AAA titles.
Posted on Reply
#28
Sabotaged_Enigma
Frame gen just guesses frames between original ones, and those generated frames are still just "guessed"... it's not the way forward... it just makes the FPS number look good. In FPS games, the AI of course can't know my next move, even if it has a chance of guessing it right.
I don't think games should rely on AI techniques. Otherwise, let's just use AI to render the whole game—no need to try so hard to make models and optimise visuals. Why develop games in the first place?
Software is important, but it doesn't mean it's all.
Posted on Reply
#29
Vayra86
Imouto: Who's the genius writing their slogans and catchphrases? If rasterization is fakeness then DLSS is fake fakeness since most of the work on the render pipeline is still pure raster.

And people gulp this crap with a smile on their faces.

RT is incredibly taxing. Rendering a game with full RT at 1440p/60 FPS so it can be upscaled to 4K is beyond anything a graphics card can pull off in the next two decades. The missing performance is raster, as simple as that.
But DLSS so much betuuhhhrrrerrr

It's like missionary work, and it'll likely end the same way: a steady growth of atheism as people get their reality checks later down the line while a subgroup descends into extremism. I remember a few years ago everyone was going to be all over RT. Today you're happy if you can discover whether a god ray is rendered in raster or not. It's completely irrelevant, simply because the tax is too high—in every way: hardware cost, performance cost, game support cost...

Fools & money—undeniable, and proven with every marketing blurb: hardware solutions looking for new problems to create. Because we all know there isn't a problem. Starfield shouldn't require FSR, Cyberpunk shouldn't require DLSS, and neither game should be artifacting its way to decent FPS. We had games running fine for many years without all of this shit.

I haven't seen a single game actually be a better game because of these technologies, and no, being on the verge of playable with them and unplayable without them isn't a pro argument for using them...
Cyberpunk, for all its bells and whistles, is now trying to relaunch itself with a 2.0 version that is in fact aspiring to meet the expectations of what 1.0 should've been. But at least you have a dozen ways to run it!
ZoneDymo: And in the end I want to state again, I want to be in a time period where all this nonsense is just behind us, no more DLSS/FSR/XeSS and just one hardware-agnostic solution that just works for all, so we can once again properly focus on hardware.
And RT at levels where we don't even need a denoiser anymore.
This per-vendor focus on software that determines how a game looks is unprecedented, and I can't wait for it to be behind us.
So much this. I can't begin to describe the sadness I feel for all those fools pixel peeping their FSR/DLSS/XeSS comparisons and discussing what's what. Seriously, get a life and realize how knee-deep you are in the corporate cesspool, in full apathy toward a marketing machine used to sell inferior games and inferior GPUs and call them done. Just demand all vendors stop this bullshit and get with the program: yes, we know, you can do this trick, now unify it, deploy it, and move on.

This interview underlines it nicely.
We already know DLSS doesn't need AI; interpolation is as old as 1999. Stop bullshitting.
Dr. Dro: It's just TPU and its traditionally AMD-biased patronage; when AMD doesn't support something, doesn't do it correctly, or fumbles something, this is the bog-standard reaction of simply ignoring/downplaying/being hostile to it. Personally, it's a big reason why I stopped caring for AMD; their own diehards would rather stick with them through thick and thin instead of demanding that they do something about it.

Much as we love this forum, we're a very small segment... if you went by TPU's numbers, Radeon would have like a 70% market share. Don't let it get to you, mate.
This whole affair isn't and was never about AMD and what it does or doesn't do, that's just the easy cop out so we don't have to face reality and can keep blinders on about what's happening in GPU land.
I hate FSR in similar measure, except AMD isn't using it to sell. They just don't sell ;) :roll:

Also, AMD biased... you couldn't be more wrong. TPU is, however, a community where people like to think more and click less on baity videos and nonsense. Or at least there is a substantial group that does. Less marketing-influenced? Definitely. AMD biased? That's just your own bias talking, and if anything it should prompt you to reflect a little. The vast majority here has been running Nvidia for decades, and that includes me. People here generally just want whatever is best at the time; it's why they came here, influenced by some of the highest-quality reviewing on the web ;)
Posted on Reply
#30
PapaTaipei
Ray tracing and DLSS, alongside other useless features, were developed in the first place to make people need more powerful GPUs in order to sell more—that is, because rasterization has become so easy for GPUs that they absolutely needed a NEW REASON to market and sell GPUs. RT is not eco-friendly and will spawn a new generation of devs who don't know what optimisation is.
Posted on Reply
#31
gffermari
PapaTaipei: Ray tracing and DLSS, alongside other useless features, were developed in the first place to make people need more powerful GPUs in order to sell more—that is, because rasterization has become so easy for GPUs that they absolutely needed a NEW REASON to market and sell GPUs. RT is not eco-friendly and will spawn a new generation of devs who don't know what optimisation is.
RT is not a new reason. It's a necessity, because we can't path trace games yet.
Path tracing is the only way to simulate light correctly, but it will take decades before it can run natively.
Also, there's another aspect that hasn't been touched yet (or has been shot dead by nVidia so they can make money out of it later):
physics, destruction, and material behaviour.
Posted on Reply
#33
mb194dc
QUANTUMPHYSICS: DLSS and RTX are AI features NO ONE ASKED FOR.

Nvidia pushed them forward and made them benchmarks.

Benchmarks that they themselves sit atop of and control.
Agree, DLSS (and FSR) looks shite in a lot of circumstances, especially during fast movement. I'll use my ultra-expensive GPU for native res, thanks.

RTX = slightly better reflections for a ~50% performance hit, and you need a better CPU for it.

AI: no idea what LLMs do for games. Probably just dropping it as a buzzword in a desperate attempt at more stock-price pumping.

I was thinking earlier how Crysis with DX10 really was a revolution in visuals that made the performance hit seem worthwhile.

That isn't true of Cyberpunk with RT and DLSS at all.

We're very much in the tech snake oil era these days imo.
Posted on Reply
#34
Imouto
People tend to forget that if you want a middle ground between raster and RT/PT, you have PBR. RT/PT for games is absolute bullshit that's only being pushed because it brings graphics cards to their knees.

The natural evolution was PBR but Nvidia is pushing RT/PT so hard it won't stand a chance.
Posted on Reply
#35
dyonoctis
Imouto: People tend to forget that if you want a middle ground between raster and RT/PT, you have PBR. RT/PT for games is absolute bullshit that's only being pushed because it brings graphics cards to their knees.

The natural evolution was PBR, but Nvidia is pushing RT/PT so hard it won't stand a chance.
PBR is used in every modern game, including those making use of RT/PT. It's not an opposing technology; it's an integral part of modern 3D rendering. Most offline 3D renderers use PBR materials. Ray tracing, path tracing, and PBR are complementary, and PBR is already used in rasterization. The Witcher 3 makes extensive use of PBR, and it's a raster/RT game.
Minecraft with RTX PBR Texturing Guide | GeForce News | NVIDIA
Understanding Physically Based Rendering in Arnold | AREA by Autodesk
Physically-Based Shading at Disney
SIGGRAPH 2010 Course: Physically-Based Shading Models in Film and Game Production (renderwonk.com)
Posted on Reply
#36
GreiverBlade
ZoneDymo: And in the end I want to state again, I want to be in a time period where all this nonsense is just behind us, no more DLSS/FSR/XeSS and just one hardware-agnostic solution that just works for all, so we can once again properly focus on hardware.
You do realise that FSR and, iirc, XeSS are hardware agnostic; only DLSS is hardcore proprietary.

(And FSR, albeit needing some refinement, works perfectly fine, imho)
Posted on Reply
#37
evernessince
Vya Domus: Are we supposed to just pretend it's amazing so we pay Nvidia $2,000 to make the games playable with a million levels of upscaling/interpolation/denoising/etc.?
Well said.

You've made me realize I completely forgot the price argument as well. Nvidia jacked up pricing with the RTX 2000 series, arguing that was the cost of ray tracing, and every generation since has only become more expensive.

So really we are spending more than ever on graphics cards for a feature of questionable subjective benefit as of right now, a huge performance hit, and the addition of graphics artifacts and latency.

And now Nvidia is likely to increase prices next gen as well because according to Jensen, Moore's law is dead for everyone but the well off.
Posted on Reply
#38
tancabean
Imouto: People tend to forget that if you want a middle ground between raster and RT/PT, you have PBR. RT/PT for games is absolute bullshit that's only being pushed because it brings graphics cards to their knees.

The natural evolution was PBR, but Nvidia is pushing RT/PT so hard it won't stand a chance.
PBR is a shading technique. It has nothing to do with RT vs. rasterization, which are visibility algorithms. Both use PBR.
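A minimal sketch of that point, with hypothetical helper names and a Lambertian-only BRDF as deliberate simplifications: the shading function never asks how the visible surface was found, so the same PBR code serves a rasterizer's depth test and a ray tracer's intersection query alike.

```python
# Illustrative sketch (not from any shipping engine): a PBR shading
# function is independent of how visibility was resolved. A simple
# Lambertian diffuse term is evaluated the same way whether the visible
# surface came from rasterization or from a ray-tracing query.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def shade_lambert(albedo, normal, light_dir, light_intensity):
    """PBR-style diffuse shading: (albedo / pi) * max(0, N.L) * intensity."""
    n_dot_l = max(0.0, dot(normalize(normal), normalize(light_dir)))
    return tuple(a / math.pi * n_dot_l * light_intensity for a in albedo)

# The same call site serves both visibility algorithms:
hit_from_raster = {"albedo": (0.8, 0.2, 0.2), "normal": (0.0, 1.0, 0.0)}
hit_from_raytrace = {"albedo": (0.8, 0.2, 0.2), "normal": (0.0, 1.0, 0.0)}

light = ((0.0, 1.0, 0.0), 3.0)  # direction, intensity
for hit in (hit_from_raster, hit_from_raytrace):
    color = shade_lambert(hit["albedo"], hit["normal"], *light)
    # Identical inputs produce identical shading, whichever pass found the hit.
```

Swapping the visibility pass changes only where the `albedo`/`normal` inputs come from, never the shading math itself.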
Posted on Reply
#39
cmguigamf
GreiverBlade: You do realise that FSR and, iirc, XeSS are hardware agnostic; only DLSS is hardcore proprietary.

(And FSR, albeit needing some refinement, works perfectly fine, imho)
XeSS is only hardware agnostic up to a point. Unless you're using it on Arc, where it runs on Intel's XMX units, it falls back to a different code path and upsamples things differently on AMD or NV cards. And as rumors have it, AMD might change FSR to work better with their RT acceleration/cores, so even there things look grim.
Posted on Reply
#40
Vya Domus
cmguigamf: And as rumors have it, AMD might change FSR to work better with their RT acceleration/cores, so even there things look grim.
Not going to happen, not on current hardware anyway. AMD doesn't have the same kind of dedicated accelerators for ML; they're using instructions that run on the already existing ALUs/FPUs in the CU, which are mostly there just for better hardware utilization, not dedicated units. FSR 3 frame interpolation is an asynchronous compute task that apparently comes with a 2-3 ms cost; there's really no way to squeeze an ML task into that async pipeline without serious performance issues.
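To illustrate why frame interpolation fits in a cheap async compute slot, here is a deliberately oversimplified sketch (an assumption for illustration only: real FSR 3 uses optical-flow-guided reprojection, not a plain blend): synthesizing an in-between frame is per-pixel arithmetic that ordinary shader ALUs can run, with no ML units involved.

```python
# Toy sketch, NOT AMD's actual algorithm: generate an intermediate frame
# by linearly blending two rendered frames per pixel. The point is that
# this class of work is plain fixed arithmetic, the kind of task that
# fits an asynchronous compute queue on existing shader hardware.

def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Blend two frames: out = (1 - t) * prev + t * next, per pixel."""
    return [
        [(1.0 - t) * p + t * n for p, n in zip(prev_row, next_row)]
        for prev_row, next_row in zip(prev_frame, next_frame)
    ]

# Two tiny 2x3 grayscale "frames" standing in for full render targets.
frame_a = [[0.0, 0.2, 0.4],
           [0.6, 0.8, 1.0]]
frame_b = [[0.2, 0.4, 0.6],
           [0.8, 1.0, 1.0]]

mid = interpolate_frame(frame_a, frame_b)  # the "generated" in-between frame
```

A real implementation adds motion-vector reprojection and occlusion handling, but the workload stays in the same arithmetic-heavy, ML-free category.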
Posted on Reply
#41
stimpy88
99% of things called A.I. or that "use A.I. technology" are complete BS. Just marketing drivel around the latest buzzword.

I would say this about A.I. denoising too. It's just switching between a few different noise filters depending on contrast, angle, etc.
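The scheme stimpy88 describes, switching between classic filters by local contrast, would look something like this sketch. To be clear, this only illustrates the commenter's claim; NVIDIA describes Ray Reconstruction as a trained network, so this is not how DLSS 3.5 actually works, and the threshold and filters here are invented for the example.

```python
# Sketch of a filter-switching "denoiser" per the commenter's description
# (assumption: illustrative only, not NVIDIA's method): pick between two
# classic noise filters per pixel based on local contrast, smoothing flat
# regions with a mean while preserving edges with a median.
import statistics

def local_window(img, x, y):
    """3x3 neighborhood values, clamped at the image borders."""
    h, w = len(img), len(img[0])
    return [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

def switching_denoise(img, contrast_threshold=0.2):
    out = []
    for y, row in enumerate(img):
        out_row = []
        for x, _ in enumerate(row):
            window = local_window(img, x, y)
            contrast = max(window) - min(window)
            if contrast > contrast_threshold:
                # High contrast: an edge is likely, median preserves it.
                out_row.append(statistics.median(window))
            else:
                # Flat region: mean smooths the noise away.
                out_row.append(sum(window) / len(window))
        out.append(out_row)
    return out

# A flat dark region with one noisy pixel next to a bright edge.
noisy = [[0.0, 0.0, 1.0],
         [0.0, 0.1, 1.0],
         [0.0, 0.0, 1.0]]
clean = switching_denoise(noisy)
```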
Posted on Reply
#42
Assimilator
Vya Domus: Are we supposed to just pretend it's amazing so we pay Nvidia $2,000 to make the games playable with a million levels of upscaling/interpolation/denoising/etc.?
Ah yes, the same strawman that is trotted out in every thread involving NVIDIA. "NVIDIA overcharges for GPUs therefore LITERALLY EVERYTHING they do is bad/evil/WAAAAAAAAAAA I'M A CHILD WITH THE INABILITY TO THINK FOR MYSELF".

No you muppets, it's absolutely possible to be against NVIDIA's pricing while being a massive proponent of the technological progress in GPU rendering they've brought to the table. Before NVIDIA made real-time ray-tracing a thing, there was zero innovation in the GPU space, it was just "make next generation render more rasterised frames durrrrrrrrr". RTRT is the first step along the road to computer-generated graphics matching what our eyes can see, without all of the tricks and fakery and ultimately bullshit that half a century of rasterisation has taught game engineers and graphics artists to bake into their work so that the entire visual facade doesn't come crashing down.

Then there's the fact that all of the rasterisation fakery has allowed us to create a pretty good simulation of what the human eye sees, that we've been trained to accept, whereas ray-tracing is starting on floor zero and is naturally not going to be perfect. Yet time and time again we hear the same refrain, "ray-tracing looks worse so we should stop wasting time on it WAAAAAAA". THAT'S NOT HOW TECHNOLOGY WORKS.

Frame interpolation is part and parcel of the long road to replacing rasterisation with RT. It's not perfect, and NVIDIA knows that full well, but it's a necessary step until graphics hardware becomes powerful enough to ray- or path-trace in real time without it. When that happens, it will be left by the wayside.
Vayra86: We already know DLSS doesn't need AI
No, we don't. We know that standard interpolation doesn't need machine learning, but we literally don't know how DLSS works. It may be nothing more than interpolation plus marketing, or it might be a whole lot more; only NVIDIA knows how it's implemented. So stop repeating this BS that you've made up to justify your arguments.
Vayra86: Also, AMD biased... you couldn't be more wrong.
:laugh:

TPU has ALWAYS had a massive AMD bias, given that the site was basically created around ATITool. That utility is the reason I, and I'll bet many of the other old-timers, joined these forums.
Posted on Reply
#43
Vya Domus
Assimilator: No you muppets
Sorry dude, the only muppets here are the people such as yourself who refuse to admit Nvidia is marketing its products as solutions to problems that it continuously creates.

You cannot be against their absurd prices while simultaneously worshipping the strategies that allowed them to raise those prices in the first place; it's military-grade cognitive dissonance and ignorance.
Posted on Reply
#44
Assimilator
Vya Domus: Sorry dude, the only muppets here are the people such as yourself who refuse to admit Nvidia is marketing its products as solutions to problems that it continuously creates.

You cannot be against their absurd prices while simultaneously worshipping the strategies that allowed them to raise those prices in the first place; it's military-grade cognitive dissonance and ignorance.
The only cognitive dissonance is from those such as yourself, who are so deluded that they're willing to claim that attempting to advance the state of the art via real-time ray-tracing is a problem created by NVIDIA to sell more product.
Posted on Reply
#45
TheoneandonlyMrK
Assimilator: The only cognitive dissonance is from those such as yourself, who are so deluded that they're willing to claim that attempting to advance the state of the art via real-time ray-tracing is a problem created by NVIDIA to sell more product.
So should we be buying 500-700 W GPUs to play CoD in five years?

Should we all have to pay $2-3K to game at 1080p?

The future's bright.

But it's melting icebergs for god rays. How quaint the future is; possibly short, though.
Posted on Reply
#46
Vayra86
Assimilator: The only cognitive dissonance is from those such as yourself, who are so deluded that they're willing to claim that attempting to advance the state of the art via real-time ray-tracing is a problem created by NVIDIA to sell more product.
Ehhh... Nvidia didn't do this, then? Maybe you ought to look back at some of their SIGGRAPH presentations, et al.

Do you need a recap of their efforts throughout the industry since pre-Turing? I'm not sure what you're smoking here, honestly. The cognitive dissonance is with you, unable to accept a highly inconvenient truth: corporations are always chasing money, and it doesn't have to amount to anything. Many technologies and innovations simply die off. Less respectfully, you could say lots of companies with money are just continuously throwing shit at the wall, paid for by us and not by them through tax.

This is why I go back to what matters every time: have games become better since Turing? Did RT enable new worlds, new immersive qualities yet? I dare say it hasn't. So why would we want this? It's a major cost increase. The cost/benefit analysis is not advantageous to any of us gamers. So I think we have every reason to say no?

And again, it's not about Nvidia because it's Nvidia. It's about their approach, their timing, their strategy, and everything else surrounding their GeForce GPUs since Turing. It's been a complete and utter shitshow; and on the other side of the fence we have an AMD that's copying that shitshow because they haven't got any good GPU ideas left, apart from the hardware-related fine-tuning they do through chiplets. I don't applaud either company for this, tbh.
TheoneandonlyMrK: So should we be buying 500-700 W GPUs to play CoD in five years?

Should we all have to pay $2-3K to game at 1080p?

The future's bright.

But it's melting icebergs for god rays. How quaint the future is; possibly short, though.
Exactly. One way or another, the reality check is pending. I'm not diving into this hole only to figure out what I already know: this is a bubble waiting to burst.
Posted on Reply
#47
HOkay
Vayra86: This is why I go back to what matters every time: have games become better since Turing? Did RT enable new worlds, new immersive qualities yet? I dare say it hasn't. So why would we want this? It's a major cost increase. The cost/benefit analysis is not advantageous to any of us gamers.
Agreed. You can use existing lighting techniques to make a game look essentially the same as the current level of ray tracing that GPUs can handle. I've not yet played anything where it's been anywhere near worth the performance hit, imho. I thought Portal RTX might finally be something to wow me, but it didn't. It's not bad, and it's clearly miles better lighting than the original, but it's not better than a modern game with attention paid to its lighting. Again, just imho.
Posted on Reply