Saturday, September 23rd 2023

NVIDIA Foresees Greater Reliance on AI Rendering Techniques in Games

Digital Foundry organized an AI Visuals Roundtable earlier this week, with Alex Battaglia (Video Producer) acting as host—the main topic being the recently visually upgraded version of Cyberpunk 2077. Guests included Bryan Catanzaro (NVIDIA's Vice President of Applied Deep Learning Research), Jakub Knapik (VP Art and Global Art Director at CD Projekt RED), Jacob Freeman (GeForce Marketing Evangelist) and Pedro Valadas (PCMR founder). Team Green, naturally, advocates its newly released DLSS 3.5 and Ray Reconstruction technologies—brute-force rendering techniques are considered passé, and no longer an ideal solution for upcoming GPU generations. Valadas was curious enough to ask whether NVIDIA had any plans to target reasonable performance levels—minus DLSS—at native resolutions.

Catanzaro focused on the benefits introduced in Update 2.0 and Phantom Liberty: "I think that DLSS 3.5 actually makes Cyberpunk 2077 even more beautiful than native rendering...that's my belief. The reason being that the AI is able to make smarter decisions about how to render a scene...and I think that's going to continue to develop." He half-jokingly called rasterization a "bag of fakeness." A combination of DLSS and path tracing is preferred over old-hat methods—thus attaining the most realistic visual results. He summarized with his own slogan: "Native rendering is fake frames." Catanzaro predicts that the real-time graphics industry will become increasingly reliant on AI-processed image reconstruction and rendering technologies in the future. He hopes that AAA game environments will become cheaper to make in due time—with a shift to neural rendering. A hypothetical DLSS 10 could interface with game engine technology at a higher level, with a possible outcome being the creation of more "immersive and beautiful" experiences.
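For context on the "smarter decisions" claim: a path-traced frame starts out as a noisy image built from only a few rays per pixel, and a denoiser has to reconstruct the rest. Ray reconstruction's pitch is to replace hand-tuned denoising heuristics with a trained network. Below is a minimal sketch of that substitution, using toy one-dimensional buffers and hypothetical function names rather than NVIDIA's actual code:

```cpp
#include <cstdio>
#include <vector>

using Buffer = std::vector<float>;  // toy 1D "frame" of noisy radiance values

// Hand-tuned denoiser: a fixed 3-tap filter with hard-coded weights, standing
// in for the heuristic denoisers that ray reconstruction aims to replace.
Buffer handTunedDenoise(const Buffer& noisy) {
    Buffer out(noisy.size());
    for (std::size_t i = 0; i < noisy.size(); ++i) {
        float l = noisy[i > 0 ? i - 1 : i];                // clamp at edges
        float r = noisy[i + 1 < noisy.size() ? i + 1 : i];
        out[i] = 0.25f * l + 0.5f * noisy[i] + 0.25f * r;  // fixed heuristic weights
    }
    return out;
}

// "Learned" denoiser: same interface, but the weights come from training
// rather than hand tuning. A real network predicts filtering decisions per
// pixel from guide buffers (albedo, normals, motion vectors); a constant
// weight set is used here purely to illustrate the substitution.
Buffer learnedDenoise(const Buffer& noisy, const float trained[3]) {
    Buffer out(noisy.size());
    for (std::size_t i = 0; i < noisy.size(); ++i) {
        float l = noisy[i > 0 ? i - 1 : i];
        float r = noisy[i + 1 < noisy.size() ? i + 1 : i];
        out[i] = trained[0] * l + trained[1] * noisy[i] + trained[2] * r;
    }
    return out;
}

int main() {
    Buffer noisy = {0.9f, 0.1f, 0.8f, 0.2f};      // low-sample-count radiance
    const float trained[3] = {0.2f, 0.6f, 0.2f};  // placeholder "trained" weights
    std::printf("hand-tuned: %.3f, learned: %.3f\n",
                handTunedDenoise(noisy)[1], learnedDenoise(noisy, trained)[1]);
}
```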
Digital Foundry's video description states: "Many thanks to all participants in this roundtable chat: Bryan Catanzaro, Vice President Applied Deep Learning Research at Nvidia, Jakub Knapik VP Art and Global Art Director at CD Projekt RED, GeForce evangelist Jacob Freeman and Pedro Valadas of the PCMR sub-Reddit."


Cyberpunk 2077 2.0 is here—the first showcase for DLSS 3.5 ray reconstruction, integrated into a full DLSS package including super resolution and frame generation, all combining to produce a state-of-the-art visual experience. In this roundtable discussion, we discuss how ray reconstruction works, how it was developed and its applications outside of path-traced games. We also talk about the evolution of the original DLSS, the success of DLSS in the modding community and the future of machine learning in PC graphics.
Sources: Eurogamer, Tom's Hardware

47 Comments on NVIDIA Foresees Greater Reliance on AI Rendering Techniques in Games

#1
QUANTUMPHYSICS
DLSS and RTX are AI features NO ONE ASKED FOR.

Nvidia pushed them forward and made them benchmarks.

Benchmarks that they themselves sit atop and control.
Posted on Reply
#2
TheoneandonlyMrK
Wow, EVERYONE got the press notes, eh?

Talk about pushing a certain POV (Huang's).

Personally, I think Gamers Nexus showed its real worth.

And as for AI-based rendering: eww, the smears and effects it adds or deletes aren't for me.
Posted on Reply
#3
evernessince
Nvidia integrating its tech deeper into game engines...

Surely nothing bad could come of that /s
Posted on Reply
#4
cmguigamf
Weird from DF to have basically only nVidia's people on this "roundtable", which sounds more like free marketing for Huang than an actual place for discussion. Should have included all players (AMD and Intel), otherwise there's no point imo.
Posted on Reply
#5
dj-electric
Wanna hear a secret? It has to stay between us.

(AMD and Intel do too, it's why they invest so much in AI hardware and tools)
Posted on Reply
#6
aciDev
While I also believe that there is a need for a paradigm shift in 3D rendering and that neural networks are most likely the direction to go, without the 'bag of fakeness,' the current AI wouldn't render anything at all.
Posted on Reply
#7
Unregistered
Basically they can't, or more accurately don't want to, make faster GPUs, so they push fake stuff.
Why isn't DLSS used to improve upon native for example? Or why can't an €1,800+ card do RT? Stuff like that is more suitable for mid to low-end GPUs.
cmguigamf: Weird from DF to have basically only nVidia's people on this "roundtable", which sounds more like free marketing for Huang than an actual place for discussion. Should have included all players (AMD and Intel), otherwise there's no point imo.
They are basically part of the marketing division of nVidia.
Posted on Reply
#8
Aretak
cmguigamf: Weird from DF to have basically only nVidia's people on this "roundtable", which sounds more like free marketing for Huang than an actual place for discussion. Should have included all players (AMD and Intel), otherwise there's no point imo.
Digital Foundry is pretty much an arm of Nvidia's marketing department. They're constantly shilling Nvidia's new products and working closely with them on videos, whilst AMD and Intel are lucky to get a mention every few months (in a positive context, anyway). I still remember Alex having a little meltdown on Twitter when Hardware Unboxed pointed out some flaws with frame generation at launch, and saying they shouldn't be focusing on the negatives because it might damage people's perception of the technology. I'm not accusing them of faking their numbers or anything, but the topics they choose to cover (or not cover), their increased focus on subjective commentary over hard data, and their oddly close relationship with Nvidia make them one of the last places I go for information. I miss the days when they just let the benchmarks do the talking.
Posted on Reply
#9
Imouto
Who's the genius writing their slogans and catchphrases? If rasterization is fakeness, then DLSS is fake fakeness, since most of the work in the render pipeline is still pure raster.

And people gulp this crap with a smile on their faces.

RT is incredibly taxing. Rendering a game with full RT at 1440p@60 FPS so it can be upscaled to 4K is beyond anything a graphics card can pull off in the next two decades. The missing performance is raster, as simple as that.
Posted on Reply
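The pixel arithmetic behind Imouto's 1440p-to-4K scenario is easy to check; a few lines of C++ (the resolutions are the standard ones, nothing vendor-specific):

```cpp
#include <cstdio>

int main() {
    // Pixels shaded per frame when path tracing internally at 1440p
    // versus rendering natively at 4K.
    const double p1440 = 2560.0 * 1440.0;  // ~3.69 million pixels
    const double p4k   = 3840.0 * 2160.0;  // ~8.29 million pixels
    // Prints ~44%: the upscaler reconstructs the remaining ~56% of the frame.
    std::printf("1440p shades %.0f%% of a native 4K frame\n",
                100.0 * p1440 / p4k);
}
```

That 2.25× gap in shaded pixels is essentially the whole economic argument for upscaling in a path-traced title.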
#10
gffermari
Either we wait 20+ years for a GPU to be able to render a simple ray-traced reflection at native resolution, or we accept different techs to simulate the expected result. The nVidia representative is right. You can't just use brute force to make something better. You have to be smart.

For the same reason we don't complain about cube maps or screen space reflections and their ridiculous bugs and limitations, we shouldn't complain about any tech that provides a better image.

And no, there's no tech that delivers 100% with no bugs or glitches.
Posted on Reply
#11
QuietBob
"trust in the man in the leather jacket... and verily, thou shalt be rewarded"

thus sayeth Jacob Freeman, GeForce Evangelist
:roll:
Posted on Reply
#12
TheoneandonlyMrK
dj-electric: Wanna hear a secret? It has to stay between us.

(AMD and Intel do too, it's why they invest so much in AI hardware and tools)
There's a difference between generative and general AI and AI-based rendering fudges.

Yes, fudges.

Because that's what they are: path tracing requires fudges to make it viable, but those fudges don't equal better than native IMHO. Different, yes; closer to real, yes.
But streaky effects, effects going missing and chevron patches turning up now and again, and that's if you bought a 4090; on lower-end cards, this is a joke.
And Nvidia moves the goalposts past your hardware quicker than the console cycle.

Most sites are toeing the line and sticking to Nvidia's notes; GN are not and give a fair, not hyped assessment, and test fairer, though not like they were told to.
They also show, in motion, why it's a pass from me, and that's in Nvidia's own hardware showcase title.

Posted on Reply
#13
Assimilator
I remember when new technology to improve visual fidelity was all that PC gamers cared about. Now it's just people who hate NVIDIA, bitching about the fact that NVIDIA's approaches to increasing visual fidelity aren't themselves perfect, because of course rasterisation has ALWAYS been perfect /s. Do you people even listen to yourselves? Do you think before you type the next NON-NATIVE IS EVIL REEEE post? Or do you just care about circlejerking with the others who think the same as you do?

No wonder PC gaming is dying.
TheoneandonlyMrK: GN are not and give a fair, not hyped assessment, and test fairer, though not like they were told to
A fair assessment that concludes ray reconstruction is a good feature... and considering this is the first release, it will only get better.
Posted on Reply
#14
dragontamer5788
Company making AI accelerators tries to sell more AI chips.

News at 11.
Assimilator: I remember when new technology to improve visual fidelity was all that PC gamers cared about.
That still exists. But NVidia's job here is to sell AI because that's NVidia's key advantage over AMD. If something is "simply" a graphics effect, AMD offers similar, or possibly even greater performance (thanks to Infinity Cache). NVidia can't win on that front.
Posted on Reply
#15
HisDivineOrder
Assimilator: I remember when new technology to improve visual fidelity was all that PC gamers cared about. Now it's just people who hate NVIDIA, bitching about the fact that NVIDIA's approaches to increasing visual fidelity aren't themselves perfect, because of course rasterisation has ALWAYS been perfect /s. Do you people even listen to yourselves? Do you think before you type the next NON-NATIVE IS EVIL REEEE post? Or do you just care about circlejerking with the others who think the same as you do?

No wonder PC gaming is dying.


A fair assessment that concludes ray reconstruction is a good feature... and considering this is the first release, it will only get better.
If PC gaming is dying, it's Nvidia (and to a lesser extent AMD) killing it, with prices squeezing out anyone trying to game at the low end.
Posted on Reply
#16
TheoneandonlyMrK
Assimilator: I remember when new technology to improve visual fidelity was all that PC gamers cared about. Now it's just people who hate NVIDIA, bitching about the fact that NVIDIA's approaches to increasing visual fidelity aren't themselves perfect, because of course rasterisation has ALWAYS been perfect /s. Do you people even listen to yourselves? Do you think before you type the next NON-NATIVE IS EVIL REEEE post? Or do you just care about circlejerking with the others who think the same as you do?

No wonder PC gaming is dying.


A fair assessment that concludes ray reconstruction is a good feature... and considering this is the first release, it will only get better.
I never said they thought it was bad, but they're realistic and show, as I said, side-by-side comparisons. Their conclusion was fair, not hyped, unlike many others'.
Posted on Reply
#17
theouto
I greatly disliked how they had one of the founders of the PCMR asking questions that we would probably ask, and then they twisted his words to make him and the questions he asked sound stupid.

I generally liked the video, and I like Alex's coverage of topics, but in many segments it really just felt like an hour-long nvidia ad about how they are descending upon us mere mortals, granting us technologies of the future. Shit man, they even had an evangelist, a fucking evangelist. Why are they here? What do they bring to the discussion besides brainwashing?
Posted on Reply
#18
chrcoluk
theouto: I greatly disliked how they had one of the founders of the PCMR asking questions that we would probably ask, and then they twisted his words to make him and the questions he asked sound stupid.

I generally liked the video, and I like Alex's coverage of topics, but in many segments it really just felt like an hour-long nvidia ad about how they are descending upon us mere mortals, granting us technologies of the future. Shit man, they even had an evangelist, a fucking evangelist. Why are they here? What do they bring to the discussion besides brainwashing?
Yeah, he asks if DLSS is being used as a replacement for general optimisation, and the answer was, well, Starfield disproves that right away (ignoring that FSR is the same thing), and that it's allowing us to do RT this, RT that, but ignoring that he was likely referring to games that don't look great: they don't look fancy, but play like complete **** without DLSS, even with no RT. He then has to awkwardly laugh, knowing they were just sidetracking his point.

They even started saying normal frames are fake lol.
Posted on Reply
#19
ZoneDymo
cmguigamf: Weird from DF to have basically only nVidia's people on this "roundtable", which sounds more like free marketing for Huang than an actual place for discussion. Should have included all players (AMD and Intel), otherwise there's no point imo.
That is kinda the problem I have with DF: they want to be too buddy-buddy so they can get the information, just like game reviewers such as IGN. They don't want to be harsh or critical, as that will lock doors towards exclusive content... but that means they don't operate for the consumer.

DF tends to be pretty level-headed, but they have done Nvidia-sponsored stuff in the past, and they all run Nvidia stuff on their main rigs, which is just a bit... yeah, unbalanced.

And the extremely lame softball questions that Alex asked here are just cringeworthy imo.

And in the end I want to state again: I want to be in a time period where all this nonsense is just behind us, no more DLSS/FSR/XeSS, just one hardware-agnostic solution that works for all, so we can once again properly focus on hardware.
And RT at levels where we don't even need a denoiser anymore.
This focus on software that determines how a game looks per hardware vendor is just unprecedented, and I can't wait for it to be behind us.
Posted on Reply
#20
evernessince
Aretak: Digital Foundry is pretty much an arm of Nvidia's marketing department. They're constantly shilling Nvidia's new products and working closely with them on videos, whilst AMD and Intel are lucky to get a mention every few months (in a positive context, anyway). I still remember Alex having a little meltdown on Twitter when Hardware Unboxed pointed out some flaws with frame generation at launch, and saying they shouldn't be focusing on the negatives because it might damage people's perception of the technology. I'm not accusing them of faking their numbers or anything, but the topics they choose to cover (or not cover), their increased focus on subjective commentary over hard data, and their oddly close relationship with Nvidia make them one of the last places I go for information. I miss the days when they just let the benchmarks do the talking.
Well, they are sponsored by Nvidia. Even if they were to adhere to best benchmarking practices (which they really don't) to try to remain as impartial as possible, it's impossible to shake the potential bias that a financial partnership with a company you are supposed to be objectively reviewing injects.
Assimilator: I remember when new technology to improve visual fidelity was all that PC gamers cared about. Now it's just people who hate NVIDIA, bitching about the fact that NVIDIA's approaches to increasing visual fidelity aren't themselves perfect, because of course rasterisation has ALWAYS been perfect /s. Do you people even listen to yourselves? Do you think before you type the next NON-NATIVE IS EVIL REEEE post? Or do you just care about circlejerking with the others who think the same as you do?

No wonder PC gaming is dying.
Visual fidelity is a big part of the problem. The actual visual benefit of what we are currently able to do with RT is subjective for a lot of people; many disagree that it's even a visual benefit at all and prefer native. The CP2077 path tracing, for example, just looks way too uncanny-valley for me; I prefer native. For a feature that carries an absolutely massive performance hit, the bar is very high for what people expect visually in order to justify it.

On top of that, each AI technology has drawbacks, which often include reductions in visual quality in certain areas. In the past, technologies like FXAA were called out for reducing image quality despite being performant, so it stands to reason that something like RT + AI would be called out for its performance and visual-quality negatives, all to enable ray tracing whose benefit at this moment is questionable for many individuals.

Ray tracing will certainly get better and is the path forward for games that want a hyper-realistic style, but let's not pretend that as of right now it's all positives.
Posted on Reply
#21
R-T-B
Xex360: Why isn't DLSS used to improve upon native for example?
That's been done. It's just called DLAA.
Posted on Reply
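To make the DLAA point concrete: DLAA is commonly described as the same temporal network as DLSS, run with the render resolution equal to the output resolution, so the model's capacity goes into anti-aliasing instead of upscaling. A hedged sketch of the configuration difference, using a hypothetical config struct rather than NVIDIA's actual integration API:

```cpp
#include <cstdio>

// Hypothetical configuration: real integrations via NVIDIA's SDKs look
// different, but any temporal upscaler is parameterized by these two
// resolutions.
struct UpscalerConfig {
    int renderW, renderH;  // resolution the game actually shades
    int outputW, outputH;  // resolution presented to the display
};

int main() {
    UpscalerConfig dlssQuality = {2560, 1440, 3840, 2160};  // ~67% scale per axis
    UpscalerConfig dlaa        = {3840, 2160, 3840, 2160};  // 1:1, AA only
    std::printf("DLSS Quality: %.2fx per axis, DLAA: %.2fx per axis\n",
                (double)dlssQuality.outputW / dlssQuality.renderW,
                (double)dlaa.outputW / dlaa.renderW);
}
```

The trade-off falls out of the numbers: at 4K, DLAA shades 2.25× the pixels of DLSS Quality, which is why it improves image quality but costs performance rather than gaining it.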
#22
DemonicRyzen666
gffermari: Either we wait 20+ years for a GPU to be able to render a simple ray-traced reflection at native resolution, or we accept different techs to simulate the expected result. The nVidia representative is right. You can't just use brute force to make something better. You have to be smart.

For the same reason we don't complain about cube maps or screen space reflections and their ridiculous bugs and limitations, we shouldn't complain about any tech that provides a better image.

And no, there's no tech that delivers 100% with no bugs or glitches.
Not true. The cards we have now, with rasterization alongside raytracing features, are middle-ground cards and will be for a while, like 5 more years tops.
If they drop all the shader units and anything that's built for rasterization, and just add all the raytracing parts by themselves (tensor cores, RT cores, optical flow accelerators, BVH traversal units), the only thing you'll likely need more of is render outputs (ROPs) to do full raytracing without rasterization.
Posted on Reply
#23
evernessince
DemonicRyzen666: Not true. The cards we have now, with rasterization alongside raytracing features, are middle-ground cards and will be for a while, like 5 more years tops.
If they drop all the shader units and anything that's built for rasterization, and just add all the raytracing parts by themselves (tensor cores, RT cores, optical flow accelerators, BVH traversal units), the only thing you'll likely need more of is render outputs (ROPs) to do full raytracing without rasterization.
Ray tracing only simulates light propagation; it doesn't handle any of the other aspects that a GPU does. GPUs are still going to need all their existing hardware to handle drawing, coloring, transforming, etc. The 3D scene itself still needs to be rasterized into a 2D image regardless of the lighting technology used. In 5 years we won't even fully replace raster-based lighting; that's closer to 10-20 years away, as the current cost and limited spread of RT-capable hardware make it impossible for devs to consider a purely ray-traced lighting pipeline.
Posted on Reply
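The division of labour evernessince describes, where rasterization still resolves per-pixel geometry while rays handle only light transport, is roughly the shape of today's hybrid renderers. A conceptual sketch with placeholder data; the function names are illustrative, not any real engine's API:

```cpp
// One pixel's worth of a hybrid frame: raster hardware writes the geometry
// (G-buffer), and ray tracing is applied on top for light propagation only.
struct GBufferPixel { float depth, nx, ny, nz; };  // produced by raster stages
struct Color { float r, g, b; };

// Stage 1: classic raster work (shader units, ROPs) resolving which surface
// is visible at this pixel - the hardware the "drop rasterization" argument
// would remove.
GBufferPixel rasterizePrimaryVisibility() {
    return {1.0f, 0.0f, 0.0f, 1.0f};  // placeholder depth and surface normal
}

// Stage 2: light transport - the part RT cores and BVH traversal units
// accelerate, and the only part ray tracing actually replaces.
Color traceLighting(const GBufferPixel& g) {
    float lit = 0.5f * g.depth;  // placeholder for traced illumination
    return {lit, lit, lit};
}

int main() {
    GBufferPixel g = rasterizePrimaryVisibility();
    Color c = traceLighting(g);
    (void)c;  // a real renderer would denoise, composite and present this
}
```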
#24
DemonicRyzen666
evernessince: Ray tracing only simulates light propagation; it doesn't handle any of the other aspects that a GPU does. GPUs are still going to need all their existing hardware to handle drawing, coloring, transforming, etc. The 3D scene itself still needs to be rasterized into a 2D image regardless of the lighting technology used. In 5 years we won't even fully replace raster-based lighting; that's closer to 10-20 years away, as the current cost and limited spread of RT-capable hardware make it impossible for devs to consider a purely ray-traced lighting pipeline.
You're talking about the game engine, not the GPU. All the GPU does is run code, execute it, and return the workloads from the code.
Posted on Reply
#25
Dr. Dro
It's hilarious to read this PR speak, because Cyberpunk doesn't even come with DLSS 3.5; it's DLSS 3.1 with the Ray Reconstruction component.
Assimilator: I remember when new technology to improve visual fidelity was all that PC gamers cared about. Now it's just people who hate NVIDIA, bitching about the fact that NVIDIA's approaches to increasing visual fidelity aren't themselves perfect, because of course rasterisation has ALWAYS been perfect /s. Do you people even listen to yourselves? Do you think before you type the next NON-NATIVE IS EVIL REEEE post? Or do you just care about circlejerking with the others who think the same as you do?
It's just TPU and its traditionally AMD-biased patronage; when AMD doesn't support something, doesn't do it correctly, or fumbles something, this is the bog-standard reaction of simply ignoring/downplaying/antagonizing it. Personally, it's a big reason why I stopped caring for AMD: their own diehards would rather stick with them through thick and thin instead of demanding that they do something about it.

Much as we love this forum, we're a very small segment... if you went by TPU's numbers, Radeon would have like a 70% market share. Don't let it get to you, mate.
Posted on Reply