Tuesday, July 4th 2023
Intel Developing Efficient Solution for Path Tracing on Integrated GPUs
Intel's software engineers are working on path-traced light simulation and conducting neural graphics research, as documented in a recent company news article, with the ambition of creating a more efficient solution for integrated graphics. The company's Graphics Research Organization is set to present its path-tracing optimizations at SIGGRAPH 2023, and its papers have been showcased at recent EGSR and HPG events. The team is aiming to get iGPUs running path tracing in real time by reducing the number of calculations required to simulate light bounces.
The article covers three different techniques, all designed to improve GPU performance: "Across the process of path tracing, the research presented in these papers demonstrates improvements in efficiency in path tracing's main building blocks, namely ray tracing, shading, and sampling. These are important components to make photorealistic rendering with path tracing available on more affordable GPUs, such as Intel Arc GPUs, and a step toward real-time performance on integrated GPUs." Although the article emphasizes in-house products, Intel's "open source-first mindset" hints that its R&D could be shared with others; NVIDIA and AMD are likely still struggling to make ray tracing practical on their modest graphics card models.

The article concludes: "We're excited to keep pushing these efforts for more efficiency in a talk in Advances in Real-time Rendering, SIGGRAPH's most attended course. During this talk, titled Path Tracing a Trillion Triangles, we demonstrate that with efficient algorithms, real-time path tracing requires a much less powerful GPU, and can be practical even on mid-range and integrated GPUs in the future. In the spirit of Intel's open ecosystem software mindset, we will make this cross-vendor framework open source as a sample and a test-bed for developers and practitioners."
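To make the three building blocks named in the quote concrete, here is a minimal, hypothetical Monte Carlo sketch in plain Python: a ray-sphere intersection test (ray tracing), a Lambertian term (shading), and uniform hemisphere sampling with a Monte Carlo average (sampling). This is a generic textbook illustration, not code from Intel's papers; all function names and parameters are invented for this example.

```python
import math
import random

def intersect_sphere(origin, direction, center, radius):
    """Ray tracing: distance t along a normalized ray to a sphere hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed normalized, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def sample_hemisphere(rng):
    """Sampling: uniform random direction on the upper hemisphere (z >= 0)."""
    z = rng.random()
    phi = 2.0 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def shade(normal, light_dir, albedo=0.8):
    """Shading: simple Lambertian cosine term, clamped at zero."""
    ndotl = sum(n * l for n, l in zip(normal, light_dir))
    return albedo * max(0.0, ndotl)

def estimate_radiance(normal, samples=1000, seed=1):
    """Monte Carlo estimate of light reflected under a uniform white sky."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        total += shade(normal, sample_hemisphere(rng))
    # Divide by the uniform-hemisphere pdf, 1 / (2*pi).
    return (total / samples) * 2.0 * math.pi
```

Reducing the work in any of these three stages (cheaper intersections, cheaper shading, fewer but better-placed samples) is exactly the kind of saving that the quoted papers target; for instance, `estimate_radiance` here converges to the analytic answer `albedo * pi`, and smarter sampling reaches that answer with far fewer iterations of the loop.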
Sources:
Tom's Hardware, Intel Dev Blog
25 Comments on Intel Developing Efficient Solution for Path Tracing on Integrated GPUs
I liked the part where it says it's Open Source, I hope it doesn't have a huge hidden "but"
I'm pretty unimpressed :D
May it scale well... They are going to feel the squeeze of the market then...
What they want to sell is not what the majority buys, and if RT wants to survive/become a real thing, they need mass adoption.
But, don't discount improvements. Often, tiny improvements are indicative of 'mastering' a new technology, and can be instrumental in future 'leaps'.
IMO, we need a game w/ very 'simple' (read: low-cost) visuals and fantastic gameplay, that centers around Ray-Tracing. Even better if it was DXR, but vendor-specific implementations were also supported.
-a 'killer app' to really drive attention and development while demonstrating exactly why and how RT is so 'neat'.
Even the 4090 needs to do tricks like upscaling and fake frames
Nvidia has clearly only been gaining about a +6% increase in RT efficiency each generation, when compared to any last-generation card with the same count of ROPs, TMUs, RT cores, Tensor cores, and shaders/compute units.
That's without using DLSS or Frame Generation.
Excluding rasterization gains, that is the total increase in Nvidia's RT efficiency overall.
AMD and Intel going after Nvidia's marketing stunts just because they sell is a trap.
On the other hand, they should also bet more R&D on DLSS and the like, because it extends the GPU's lifecycle.
A proper game studio should be capable of making great-looking games without the use of RT... Gran Turismo, for example.
It's like playing eSports with RT... it's dumb.
The FPS sacrifice is just for more realistic light reflections that don't get noticed unless you look for them.
Personally, I don't care about ray tracing in games. Maybe it's because I enjoyed games in times when a whole scene had fewer polygons than one character's buttocks has nowadays - that's actually true: Lady Dimitrescu's butt has more polygons than an entire scene in Resident Evil 1 did. I'd rather play a good game than a good-looking game, and I'm not willing to sacrifice performance for a marginal improvement in how well defined the puddles look.