
Intel Developing Efficient Solution for Path Tracing on Integrated GPUs

T0@st

News Editor
Intel's software engineers are working on path-traced light simulation and conducting neural graphics research, as documented in a recent company news article, with an ambition to create a more efficient solution for integrated GPUs. The company's Graphics Research Organization is set to present their path-traced optimizations at SIGGRAPH 2023. Their papers have been showcased at recent EGSR and HPG events. The team is aiming to get iGPUs running path tracing in real time by reducing the number of calculations required to simulate light bounces.

The article covers three different techniques, all designed to improve GPU performance: "Across the process of path tracing, the research presented in these papers demonstrates improvements in efficiency in path tracing's main building blocks, namely ray tracing, shading, and sampling. These are important components to make photorealistic rendering with path tracing available on more affordable GPUs, such as Intel Arc GPUs, and a step toward real-time performance on integrated GPUs." Although there is an emphasis on in-house products in the article, Intel's "open source-first mindset" hints that their R&D could be shared with others—NVIDIA and AMD are likely still struggling to make ray tracing practical on their modest graphics card models.
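To make the three building blocks concrete, here is a heavily simplified, hypothetical Python sketch of a path tracer's inner loop (a single diffuse sphere under a constant sky; none of this is Intel's code). Each bounce runs exactly the stages the article names: ray tracing (intersection), shading (material evaluation), and sampling (choosing the next direction), so a speed-up in any of them multiplies across every bounce of every pixel.

```python
# Minimal one-sphere path tracer sketch (illustrative only, not Intel's framework).
# Shows the three stages named in the article: ray tracing, shading, sampling.
import math
import random


def intersect_sphere(origin, direction, center=(0.0, 0.0, -3.0), radius=1.0):
    """Ray tracing stage: return the hit distance along the ray, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None


def sample_hemisphere(normal, rng):
    """Sampling stage: pick a random bounce direction in the hemisphere around the normal."""
    while True:
        v = (rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0))
        if 1e-6 < sum(x * x for x in v) <= 1.0:
            break
    if sum(a * b for a, b in zip(v, normal)) < 0.0:
        v = tuple(-x for x in v)
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)


def trace_path(origin, direction, rng, max_bounces=4):
    """One light path: ray tracing, shading, and sampling repeat at every bounce."""
    center = (0.0, 0.0, -3.0)
    throughput, albedo, sky = 1.0, 0.7, 1.0
    for _ in range(max_bounces):
        t = intersect_sphere(origin, direction, center)       # 1. ray tracing
        if t is None:
            return throughput * sky                           # ray escaped: gather sky light
        hit = tuple(o + t * d for o, d in zip(origin, direction))
        normal = tuple(h - c for h, c in zip(hit, center))
        n = math.sqrt(sum(x * x for x in normal))
        normal = tuple(x / n for x in normal)
        throughput *= albedo                                  # 2. shading (flat diffuse;
                                                              #    pdf weighting omitted)
        origin, direction = hit, sample_hemisphere(normal, rng)  # 3. sampling next bounce
    return 0.0                                                # path terminated unlit


if __name__ == "__main__":
    rng = random.Random(0)
    samples = 256
    pixel = sum(trace_path((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), rng)
                for _ in range(samples)) / samples
    print(f"estimated radiance for one pixel: {pixel:.3f}")
```

Because this whole loop runs for every sample of every pixel, cheaper intersection, shading, or sampling is exactly where an iGPU-friendly path tracer has to win.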



The article concludes: "We're excited to keep pushing these efforts for more efficiency in a talk in Advances in Real-time Rendering, SIGGRAPH's most attended course. During this talk, titled Path Tracing a Trillion Triangles, we demonstrate that with efficient algorithms, real-time path tracing requires a much less powerful GPU, and can be practical even on mid-range and integrated GPUs in the future. In the spirit of Intel's open ecosystem software mindset, we will make this cross-vendor framework open source as a sample and a test-bed for developers and practitioners."

View at TechPowerUp Main Site | Source
 
I didn't recognise the formula in the image. My science is very bad. Eyes too.
 
If I understand correctly, it's only 3-7% improvement? To make (limited)RT possible on iGPUs the improvement would have to be in the order of 300-700% lol

I liked the part where it says it's Open Source, I hope it doesn't have a huge hidden "but"
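For scale, a purely illustrative calculation of that 3-7% vs. 300-700% point (the frame rates below are assumptions for the sake of the example, not figures from the article):

```python
# Purely illustrative arithmetic: how big a speed-up an iGPU would need for
# playable path tracing, versus a 7% gain. Both frame rates are assumptions.
baseline_fps = 4.0   # hypothetical path-tracing frame rate on an iGPU today
target_fps = 30.0    # hypothetical "playable" threshold

required_speedup = target_fps / baseline_fps    # 7.5x, i.e. +650%
after_seven_percent = baseline_fps * 1.07       # 4.28 fps

print(f"required speed-up: {required_speedup:.1f}x (+{(required_speedup - 1) * 100:.0f}%)")
print(f"after a 7% gain:   {after_seven_percent:.2f} fps")
```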
 
If I understand correctly, it's only 3-7% improvement? To make (limited)RT possible on iGPUs the improvement would have to be in the order of 300-700% lol
True, but the optimisation may very well be applicable to more powerful GPUs, and those might be able to better scale the effect.
 
I am convinced Nvidia does not want great ray tracing on lower end products. They prefer people wanting their high end products because they are increasingly moving away from the mass produced lower cost products. I wouldn't expect them to be happy about anything that makes their gimmick easier on lower end products because that would devalue what they want you paying $900+ for.
 
If I understand correctly, it's only 3-7% improvement? To make (limited)RT possible on iGPUs the improvement would have to be in the order of 300-700% lol

I liked the part where it says it's Open Source, I hope it doesn't have a huge hidden "but"
Well... the 'but' is that it's a single-digit percent effort.

I'm pretty unimpressed :D
May it scale well...

I am convinced Nvidia does not want great ray tracing on lower end products. They prefer people wanting their high end products because they are increasingly moving away from the mass produced lower cost products. I wouldn't expect them to be happy about anything that makes their gimmick easier on lower end products because that would devalue what they want you paying $900+ for.
They are going to feel the squeeze of the market then...

What they want to sell is not what the majority buys, and if RT wants to survive/become a real thing, they need mass adoption.
 
Man, things are rough when 7% improvement is some kind of breakthrough.
 
Man, things are rough when 7% improvement is some kind of breakthrough.

Yeah, it's a lil overblown (it's Marketing/PR *shrug*).
But, don't discount improvements. Often, tiny improvements are indicative of 'mastering' a new technology, and can be instrumental in future 'leaps'.

IMO, we need a game w/ very 'simple' (read: low-cost) visuals and fantastic gameplay, that centers around Ray-Tracing. Even better if it was DXR, but vendor-specific implementations were also supported.
-a 'killer app' to really drive attention and development while demonstrating exactly why and how RT is so 'neat'.
 
Good news, Intel taking this next leap in rendering technology seriously.
 
I am convinced Nvidia does not want great ray tracing on lower end products. They prefer people wanting their high end products because they are increasingly moving away from the mass produced lower cost products. I wouldn't expect them to be happy about anything that makes their gimmick easier on lower end products because that would devalue what they want you paying $900+ for.

Luckily the AMD RX 7600 is a ray tracing monster, and being AMD it comes with more than 8 GB of... oh wait.
 
I am convinced Nvidia does not want great ray tracing on lower end products. They prefer people wanting their high end products because they are increasingly moving away from the mass produced lower cost products. I wouldn't expect them to be happy about anything that makes their gimmick easier on lower end products because that would devalue what they want you paying $900+ for.
Even if the low end starts getting better through tweaks, the higher end will benefit from that too.
Even the 4090 needs to do tricks like upscaling and fake frames.
 
True, but the optimisation may very well be applicable to more powerful GPUs, and those might be able to better scale the effect.
An AMD patent just surfaced that might indicate RDNA 4 is getting dedicated hardware specifically for path/ray tracing acceleration. RDNA 2/3 only accelerate the intersection tests in hardware and still run the BVH traversal on the shaders.
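For anyone unfamiliar with the terms: a BVH (bounding volume hierarchy) is the tree of boxes a ray walks to find which triangles it might hit, and the box test at each node is one of the operations RT hardware accelerates. Below is a hypothetical, simplified Python sketch of that ray/AABB "slab" test; it is not AMD's or Intel's implementation, just the textbook form of the operation.

```python
# Simplified ray/AABB "slab" intersection test (illustrative, not vendor code).
# A BVH traversal performs a test like this at every tree node it visits.
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Return True if the ray (origin, 1/direction) intersects the axis-aligned box."""
    t_near, t_far = 0.0, float("inf")
    for o, inv_d, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t0 = (lo - o) * inv_d
        t1 = (hi - o) * inv_d
        if t0 > t1:
            t0, t1 = t1, t0
        t_near = max(t_near, t0)
        t_far = min(t_far, t1)
        if t_near > t_far:
            return False   # the slabs do not overlap: the ray misses this box
    return True


# Example: a ray along +Z from the origin against a unit box centred at (0, 0, 5).
origin = (0.0, 0.0, 0.0)
direction = (0.0, 0.0, 1.0)
inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
print(ray_hits_aabb(origin, inv_dir, (-0.5, -0.5, 4.5), (0.5, 0.5, 5.5)))  # True
```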
 
An AMD patent just surfaced that might indicate RDNA 4 is getting dedicated hardware specifically for path/ray tracing acceleration. RDNA 2/3 only accelerate the intersection tests in hardware and still run the BVH traversal on the shaders.
[joke]Here, I was hoping we'd get to see a dedicated RT accelerator card; ala Ageia PhysX[/joke]
 
Yeah, it's a lil overblown (it's Marketing/PR *shrug*).
But, don't discount improvements. Often, tiny improvements are indicative of 'mastering' a new technology, and can be instrumental in future 'leaps'.

IMO, we need a game w/ very 'simple' (read: low-cost) visuals and fantastic gameplay, that centers around Ray-Tracing. Even better if it was DXR, but vendor-specific implementations were also supported.
-a 'killer app' to really drive attention and development while demonstrating exactly why and how RT is so 'neat'.
So we wait for Nintendo to use it for something is what you're saying.
 
Man, things are rough when 7% improvement is some kind of breakthrough.
How so?
Nvidia has clearly only been gaining around a +6% increase in the efficiency of their entire RT feature set each generation, when you compare any card against a previous-generation card with the same ROP, TMU, RT core, Tensor core, and shader/compute unit counts.
That's without using DLSS or Frame Generation.

Setting aside the rasterization gains, that is the total increase in Nvidia's RT efficiency overall.
 
Not holding my breath waiting for some massive breakthrough, but I do wish them luck in what they're doing. RT is neat and impressive in its own rights, but in most cases the relatively minor upgrade to visuals just isn't worth absolutely annihilating your framerates.
 
Of course but wouldn't it be nice if we had 8GB of something like Crystal Well on a GPU? They already got 1GB+ cache on CPU, getting big amounts of cache on GPU is probably up next even if it's only for enterprise or HPC clients!
 
Not holding my breath waiting for some massive breakthrough, but I do wish them luck in what they're doing. RT is neat and impressive in its own rights, but in most cases the relatively minor upgrade to visuals just isn't worth absolutely annihilating your framerates.
I always considered RT to be a niche thing; not all games need it, and the showcase titles like CP2077 or Control have very poor graphics to begin with, with their PS1 geometry. For RT to make sense, games should be photorealistic, hence perfect textures and geometry.
 
Good news, Intel taking this next leap in rendering technology seriously.

Me: putting on my 7% seriously cool glasses.
the simpsons GIF
 
RT is for some games, and mostly a marketing stunt to sell GPUs (thanks Nvidia, but no thanks). Intel should focus more R&D on optimizing rasterization performance-per-watt.
AMD and Intel going after Nvidia's marketing stunts just because it sells is a trap.
On the other hand, DLSS and the like deserve more R&D too, because they extend a GPU's lifecycle.
A proper game studio should be capable of making great-looking games without the use of RT... Gran Turismo, for example.
It's like playing eSports with RT... it's dumb.
The FPS sacrifice is just for more realistic light reflections that don't get noticed unless you look for them.
 
To the naysayers: 7% doesn't seem like much, but that's because we're used to companies shouting about massive improvements with an asterisk saying they only apply to very niche, borderline situations. Actual progress is made in small steps.
Personally, I don't care about ray tracing in games. Maybe because I enjoyed games in times when the whole scene had fewer polygons than one character's buttocks does nowadays - that's actually true, Lady Dimitrescu's butt has more polygons than a whole scene in Resident Evil 1 had - and I'd rather play a good game than a good-looking game, maybe because I'm not willing to sacrifice performance for a marginal improvement in how well defined the puddles look.
 
To the naysayers: 7% doesn't seem like much, but that's because we're used to companies shouting about massive improvements with an asterisk saying they only apply to very niche, borderline situations. Actual progress is made in small steps.
Personally, I don't care about ray tracing in games. Maybe because I enjoyed games in times when the whole scene had fewer polygons than one character's buttocks does nowadays - that's actually true, Lady Dimitrescu's butt has more polygons than a whole scene in Resident Evil 1 had - and I'd rather play a good game than a good-looking game, maybe because I'm not willing to sacrifice performance for a marginal improvement in how well defined the puddles look.
How are ray tracing and the number of polygons related in the way you're looking at this situation?
 