Thursday, September 1st 2022

Intel Details its Ray Tracing Architecture, Posts RT Performance Numbers

Intel on Thursday posted an article that dives deep into the ray tracing architecture of its Arc "Alchemist" GPUs, which is particularly relevant for performance-segment parts such as the Arc A770, which competes with the NVIDIA GeForce RTX 3060. In the article, Intel posted ray tracing performance numbers that put the A770 at par with, or faster than, the RTX 3060, with which it has traditional raster performance parity. In theory, this would make Intel's ray tracing tech superior to that of AMD RDNA2, because while the AMD chips have raster performance parity, their ray tracing performance does not tend to be on par with NVIDIA parts at a price-segment level.

The Arc "Alchemist" GPUs meet the DirectX 12 Ultimate feature-set, and their ray tracing engine supports the DXR 1.0, DXR 1.1, and Vulkan RT APIs. The Xe Core is the indivisible subunit of the GPU, and packs its main number-crunching machinery. Each Xe Core features a Thread Sorting Unit (TSU) and a Ray Tracing Unit (RTU). The TSU is responsible for scheduling work between the Xe Core and the RTU, and is the core of Intel's "secret sauce." Each RTU has two ray traversal pipelines (fixed-function hardware tasked with calculating ray intersections against the bounding volume hierarchy, or BVH). The RTU can calculate 12 box intersections per cycle and 1 triangle intersection per cycle, and features a dedicated cache for BVH data.
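To make the traversal math concrete, here is a minimal, self-contained software sketch of the two intersection tests those fixed-function pipelines accelerate: a ray-vs-box slab test (used while walking the BVH) and a ray-vs-triangle Möller-Trumbore test (used at the leaves). It is purely illustrative; all types and function names are hypothetical and say nothing about how Intel's hardware actually implements them.

```cpp
// Illustrative sketch only: software versions of the two intersection tests a
// ray traversal pipeline evaluates against a BVH. Types and names are
// hypothetical and do not reflect Intel's hardware implementation.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <utility>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

struct Ray  { Vec3 origin, dir; };   // dir need not be normalized for these tests
struct AABB { Vec3 lo, hi; };        // axis-aligned box of a BVH node

// Slab test: the kind of ray-box intersection the RTU evaluates 12x per cycle.
bool intersectAABB(const Ray& r, const AABB& b, float tMax) {
    float t0 = 0.0f, t1 = tMax;
    const float o[3]  = {r.origin.x, r.origin.y, r.origin.z};
    const float d[3]  = {r.dir.x, r.dir.y, r.dir.z};
    const float lo[3] = {b.lo.x, b.lo.y, b.lo.z};
    const float hi[3] = {b.hi.x, b.hi.y, b.hi.z};
    for (int axis = 0; axis < 3; ++axis) {
        float inv   = 1.0f / d[axis];
        float tNear = (lo[axis] - o[axis]) * inv;
        float tFar  = (hi[axis] - o[axis]) * inv;
        if (tNear > tFar) std::swap(tNear, tFar);
        t0 = std::max(t0, tNear);
        t1 = std::min(t1, tFar);
        if (t0 > t1) return false;   // slabs no longer overlap: the ray misses the box
    }
    return true;
}

// Moller-Trumbore: the ray-triangle intersection the RTU evaluates once per cycle.
bool intersectTriangle(const Ray& r, Vec3 v0, Vec3 v1, Vec3 v2, float& tHit) {
    const float eps = 1e-7f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p  = cross(r.dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < eps) return false;          // ray is parallel to the triangle
    float invDet = 1.0f / det;
    Vec3 s = sub(r.origin, v0);
    float u = dot(s, p) * invDet;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(r.dir, q) * invDet;
    if (v < 0.0f || u + v > 1.0f) return false;
    tHit = dot(e2, q) * invDet;
    return tHit > eps;
}

int main() {
    Ray  r{{0, 0, -5}, {0, 0, 1}};
    AABB box{{-1, -1, -1}, {1, 1, 1}};
    float t = 0.0f;
    bool hitBox = intersectAABB(r, box, 1e30f);
    bool hitTri = intersectTriangle(r, {-1, -1, 0}, {1, -1, 0}, {0, 1, 0}, t);
    std::printf("box hit: %d, triangle hit: %d (t = %.2f)\n", hitBox, hitTri, t);
    return 0;
}
```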
The TSU, as we said, is the secret sauce of Intel's ray tracing performance. It's key to achieving what Intel calls "Asynchronous Ray Tracing." The TSU organizes ray tracing instructions and data such that rays invoking similar hit shaders are allocated the unified shader resources of the Xe cores together, for the best possible utilization of the hardware. The slide above details the ray tracing pipeline, where the TSU is shown playing a big role in optimizing things for the hit-shader execution stage.
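As a rough analogy for what the TSU does, the sketch below sorts ray hits by the hit shader they will invoke, so that each contiguous batch handed to the shader cores runs a single shader coherently. The real TSU does this in hardware on in-flight rays; the structures and names here are hypothetical.

```cpp
// Illustrative CPU-side analogy of "thread sorting": group ray hits by the hit
// shader they will invoke so each dispatched batch runs one shader coherently.
// All names are hypothetical; the TSU does this in hardware on in-flight rays.
#include <algorithm>
#include <cstdio>
#include <vector>

struct RayHit {
    int rayIndex;     // which ray in the wavefront produced this hit
    int hitShaderId;  // which material/hit shader must run for it
};

int main() {
    // Hits as they return from traversal: shader IDs arrive in arbitrary order.
    std::vector<RayHit> hits = {
        {0, 2}, {1, 0}, {2, 2}, {3, 1}, {4, 0}, {5, 2}, {6, 1}, {7, 0},
    };

    // Reorder so rays needing the same hit shader sit next to each other.
    std::sort(hits.begin(), hits.end(),
              [](const RayHit& a, const RayHit& b) {
                  return a.hitShaderId < b.hitShaderId;
              });

    // Each contiguous run of identical shader IDs becomes one coherent batch
    // handed to the unified shader cores.
    for (std::size_t i = 0; i < hits.size();) {
        std::size_t j = i;
        while (j < hits.size() && hits[j].hitShaderId == hits[i].hitShaderId) ++j;
        std::printf("dispatch hit shader %d for %zu ray(s)\n",
                    hits[i].hitShaderId, j - i);
        i = j;
    }
    return 0;
}
```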
Intel posted performance numbers for the Arc A770 at 1080p, compared with the RTX 3060 at the same resolution, across a selection of 17 games. These include Ghostwire: Tokyo, which was earlier found to run extremely sub-optimally on the "Alchemist" architecture, but has since been optimized in the latest beta drivers. The A770 trades blows with the RTX 3060, even if there are a few cases where the NVIDIA chip is slightly ahead. This is certainly a better showing than a Radeon RX 6650 XT pitted against the RTX 3060 in ray tracing.
The A770 isn't meant for 1440p with ray tracing (nor is the RTX 3060), but performance enhancements like XeSS and DLSS make both possible. While Intel didn't compare the A770 + XeSS to the RTX 3060 + DLSS at 1440p, it posted a slide showing how XeSS makes gaming with ray tracing more than playable at 1440p, in both its "balanced" and "performance" presets.

Below is the video presentation from Intel:

30 Comments on Intel Details its Ray Tracing Architecture, Posts RT Performance Numbers

#1
GerKNG
i'm excited to see what intel has to offer in 1-2 years.
that looks promising for their first GPUs.
Posted on Reply
#2
Crackong
GerKNG: i'm excited to see what intel has to offer in 1-2 years.
that looks promising for their first GPUs.
People believed this in 2020~2021.
But now it is the second half of 2022 and the cards still only exist in the form of vapourized thin air.
Posted on Reply
#3
Upgrayedd
Crackong: People believed this in 2020~2021.
But now it is the second half of 2022 and the cards still only exist in the form of vapourized thin air.
I think a lot of things were pushed back over the last two years...
Posted on Reply
#4
ModEl4
Interesting. Intel claims the A770 is on average around +14% faster than the RTX 3060 at 1080p when you enable ray tracing in this 17-title testbed.
Even if the A770 is 20% faster than the A750 (it won't be), that means the A750 is at worst -5% vs the RTX 3060 with ray tracing enabled, or dead even if the A770/A750 performance difference is 14%.
In synthetic tests Intel claims the difference is a lot bigger, and as shader complexity goes up, so does the difference.
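For what it's worth, here is the arithmetic behind that worst case, assuming both percentages are simple ratios against the same RTX 3060 baseline (my assumption):

```latex
% Worst case: A770 leads the RTX 3060 by 14% and the A750 by 20%.
\[
\text{A770} = 1.14 \times \text{RTX 3060},\qquad
\text{A770} = 1.20 \times \text{A750}
\;\Rightarrow\;
\text{A750} = \tfrac{1.14}{1.20}\,\text{RTX 3060} \approx 0.95\,\text{RTX 3060}
\]
% If the A770/A750 gap is only 14%, the A750 lands at parity.
\[
\text{A770} = 1.14 \times \text{A750}
\;\Rightarrow\;
\text{A750} = \tfrac{1.14}{1.14}\,\text{RTX 3060} = 1.00\,\text{RTX 3060}
\]
```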

Posted on Reply
#5
AusWolf
GerKNG: i'm excited to see what intel has to offer in 1-2 years.
that looks promising for their first GPUs.
Maybe the A770 will come out by that time?
Posted on Reply
#6
v12dock
Okay, Intel, this must really be a joke at this point... In other news, I personally created a brand new GPU that is 1.10x better than RTX at RT performance. I will make some pretty-looking graphs to prove it... My only issue is some driver bugs which may prevent some of the cards from ever releasing.
Posted on Reply
#7
ModEl4
For those wondering what SPP is in Intel's 3DMark charts, it's samples per pixel: as you go up in sample count, image quality increases and grain goes down, but it's more demanding.
Regarding theoretical throughput, which means absolutely nothing (the TSU and dedicated BVH cache are the secret sauce), the A770 has the same ray-tri peak rate per clock as the RX 6650 XT and 3x the ray-box peak per clock. (RDNA2 does 1 ray-triangle intersection per clock per CU and 4 ray-box intersections per clock per CU.)
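Putting rough per-clock totals on those rates, assuming the full configurations of 32 Xe cores on the A770 and 32 CUs on the RX 6650 XT (unit counts are my assumption, not figures from the article):

```latex
% Peak per-clock intersection throughput from the per-unit rates quoted above.
% Assumed unit counts: A770 = 32 Xe cores (one RTU each), RX 6650 XT = 32 CUs.
\[
\text{A770: } 32 \times 12 = 384\ \text{ray-box/clk},\qquad
32 \times 1 = 32\ \text{ray-tri/clk}
\]
\[
\text{RX 6650 XT: } 32 \times 4 = 128\ \text{ray-box/clk},\qquad
32 \times 1 = 32\ \text{ray-tri/clk}
\]
```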
Posted on Reply
#8
ixi
I'm shocked, just shocked. Look at those PR attempts; at least they are trying, which is nice. Intel wanted to release a GPU first in 2019? We are in 2022 and they have just released the A380, limited to China. Ok, now you can pre-order or even buy the ASRock model at Newegg, which is nice... a 480/720p gaming powerhouse.

We ain't far from 2023. AMD and NVIDIA will soon reveal newer GPUs. Intel needs to do PR for the current generation while they can... Intel calls this a win-win scenario, probably, while it is a fail in the real scenario...
Posted on Reply
#9
AusWolf
ixi: I'm shocked, just shocked. Look at those PR attempts; at least they are trying, which is nice. Intel wanted to release a GPU first in 2019? We are in 2022 and they have just released the A380, limited to China. Ok, now you can pre-order or even buy the ASRock model at Newegg, which is nice... a 480/720p gaming powerhouse.

We ain't far from 2023. AMD and NVIDIA will soon reveal newer GPUs. Intel needs to do PR for the current generation while they can... Intel calls this a win-win scenario, probably, while it is a fail in the real scenario...
I don't think the cards will be objectively bad. It's just that Intel will have to release them as soon as possible, or offer them at a seriously reduced price (compared to what they originally planned) and market them as 1080p cards instead of the 1440p cards they could have been 1-2 years ago.
Posted on Reply
#10
Unregistered
Same time last year and one month before launch, this would've made sense.
#11
gffermari
What's the point of being faster than a card that practically cannot run games with RT on?

Intel GPUs would have had value one or two years ago, when you couldn't find a GPU.
Now the second-hand market is full of powerhouses.
Posted on Reply
#12
watzupken
gffermari: What's the point of being faster than a card that practically cannot run games with RT on?

Intel GPUs would have had value one or two years ago, when you couldn't find a GPU.
Now the second-hand market is full of powerhouses.
I agree. Given that Intel's flagship dGPU was meant to compete with the likes of the RTX 3070/3070 Ti or RX 6800/6700 XT, cards in that price range are not really RT-worthy, even though they can run games with RT enabled. The performance hit is too much to be worth the eye candy. In the case of the RTX 3070, it's saved by DLSS. Intel may have XeSS to counter DLSS, but I think only one, or at most a handful of, games support it now, given that we are still talking about a phantom card/vaporware. People would be more convinced if there were an actual product to test, rather than some made-up marketing slides. I mean seriously, there is nothing to see here.
AusWolf: I don't think the cards will be objectively bad. It's just that Intel will have to release them as soon as possible, or offer them at a seriously reduced price (compared to what they originally planned) and market them as 1080p cards instead of the 1440p cards they could have been 1-2 years ago.
Given the timeline now, I don't think it will make a difference whether Intel releases it now or later. As it stands, even NVIDIA and AMD are experiencing declining sales for their GPUs. Intel may have some decent hardware there (and personally, I feel they did well for their first foray into enthusiast dGPUs), but they delayed the product too much and missed the opportunity. While they are still talking (with no actual products) about their RT hardware advantage here, whatever advantage they mention may be gone in another few months when facing off against Ada Lovelace and RDNA3.
Posted on Reply
#13
john_
I hope AMD dropping to 3rd place in RT will be humiliating enough to convince them to get better at RT performance and not treat it as just another secondary feature to support. It's becoming important.
ixi: AMD and NVIDIA will soon reveal newer GPUs.
Neither of those two will go under $500 in 2022, and let's wait and see if their newest lines will even have a model under $300 in 2023.
Posted on Reply
#14
watzupken
john_: I hope AMD dropping to 3rd place in RT will be humiliating enough to convince them to get better at RT performance and not treat it as just another secondary feature to support. It's becoming important.
I am sure AMD will not leave RT in its current state with the introduction of RDNA3 GPUs. Having said that, I do question whether RT is an important feature. Visually, it can look impressive, but the question is whether most people can tell the difference without a point of comparison. For example, if you were first given Control or Cyberpunk to play, I am sure you wouldn't be able to tell whether RT is on or off without reviewers putting up side-by-side comparisons. Likewise, Metro Exodus EE has a good amount of RT implemented, but without a point of comparison, I think most gamers will not be able to tell the difference while still enjoying the game and its graphics. Unless, of course, you are trained to look for such details, and not really playing the game.
Posted on Reply
#15
john_
watzupken: I am sure AMD will not leave RT in its current state with the introduction of RDNA3 GPUs. Having said that, I do question whether RT is an important feature. Visually, it can look impressive, but the question is whether most people can tell the difference without a point of comparison. For example, if you were first given Control or Cyberpunk to play, I am sure you wouldn't be able to tell whether RT is on or off without reviewers putting up side-by-side comparisons. Likewise, Metro Exodus EE has a good amount of RT implemented, but without a point of comparison, I think most gamers will not be able to tell the difference while still enjoying the game and its graphics. Unless, of course, you are trained to look for such details, and not really playing the game.
Little changes that might not be easily seen are, in the end, what makes the difference going from one quality level to the next higher one. They add up to give a better result on screen. And ray tracing has to do with lighting, meaning it affects the whole screen, so it probably does make a difference. I guess people who are artists, whose eyes and brains are trained to spot errors in lighting and colors, will probably see those differences more easily than most of us.
Posted on Reply
#16
AusWolf
watzupken: I am sure AMD will not leave RT in its current state with the introduction of RDNA3 GPUs. Having said that, I do question whether RT is an important feature. Visually, it can look impressive, but the question is whether most people can tell the difference without a point of comparison. For example, if you were first given Control or Cyberpunk to play, I am sure you wouldn't be able to tell whether RT is on or off without reviewers putting up side-by-side comparisons. Likewise, Metro Exodus EE has a good amount of RT implemented, but without a point of comparison, I think most gamers will not be able to tell the difference while still enjoying the game and its graphics. Unless, of course, you are trained to look for such details, and not really playing the game.
john_: Little changes that might not be easily seen are, in the end, what makes the difference going from one quality level to the next higher one. They add up to give a better result on screen. And ray tracing has to do with lighting, meaning it affects the whole screen, so it probably does make a difference. I guess people who are artists, whose eyes and brains are trained to spot errors in lighting and colors, will probably see those differences more easily than most of us.
In my opinion, a little ray tracing added to a rasterized scene makes little difference in overall picture quality. We can only talk about a meaningful improvement when we have enough graphical power to ray trace the whole scene. It won't happen with Arc Alchemist, and I doubt RDNA3 or Ada will be capable of it, either. Maybe in 3-5 generations' time.
Posted on Reply
#17
gffermari
Ray tracing (and DLSS/FSR) is not just important. It's the most important thing in GPUs today.

When you have just ray-traced shadows or reflections, you still see baked or incorrect lighting. GI is the most important thing in graphics since the GeForce 3 Ti 200's pixel shading.

AMD has to bring a Ryzen-like GPU to have success. FSR 2 matches DLSS by 99%. I'm ok with that.
But the RT performance is unacceptable.
Posted on Reply
#18
TheoneandonlyMrK
GerKNG: i'm excited to see what intel has to offer in 1-2 years.
that looks promising for their first GPUs.
That'll be the A770, at this trend-setting release pace. :D :)
Posted on Reply
#19
john_
gffermari: Ray tracing (and DLSS/FSR) is not just important. It's the most important thing in GPUs today.

When you have just ray-traced shadows or reflections, you still see baked or incorrect lighting. GI is the most important thing in graphics since the GeForce 3 Ti 200's pixel shading.

AMD has to bring a Ryzen-like GPU to have success. FSR 2 matches DLSS by 99%. I'm ok with that.
But the RT performance is unacceptable.
GeForce 3 Ti 200. My most expensive GPU purchase ever. I overpaid, but didn't know it back then.

RT is the future and it is super important. We can probably compare it with the 16-bit vs 32-bit color battle 20 years ago. I bet that with the 15'' CRT monitors we had back then, there were people who couldn't notice serious graphics differences in the games we were playing, not enough to justify the whole fuss about 32-bit color, especially considering the performance loss.
[Charts: 16-bit vs 32-bit Performance - ATI Radeon 32MB SDR and 3dfx Voodoo4 4500AGP]
The company that had the worst performance in 32-bit color didn't survive.
Posted on Reply
#20
ZoneDymo
watzupken: I am sure AMD will not leave RT in its current state with the introduction of RDNA3 GPUs. Having said that, I do question whether RT is an important feature. Visually, it can look impressive, but the question is whether most people can tell the difference without a point of comparison. For example, if you were first given Control or Cyberpunk to play, I am sure you wouldn't be able to tell whether RT is on or off without reviewers putting up side-by-side comparisons. Likewise, Metro Exodus EE has a good amount of RT implemented, but without a point of comparison, I think most gamers will not be able to tell the difference while still enjoying the game and its graphics. Unless, of course, you are trained to look for such details, and not really playing the game.
Difficult statements, really; we are used to how games look.
Heck, imo games have not progressed much at all visually for the last 5(?) years.

Every brand new game I have seen looks extremely underwhelming visually, if not downright old: Dying Light 2, the new Saints Row...
But we are all taken aback when something shows up that does move the needle: Far Cry, Doom 3, FEAR, Crysis, just to name a few.

RT is most definitely a feature you see and notice. Screen-space reflections, for example, are glaring in their shortcomings; you thought the lack of AA was distracting with its jittery lines? Try half the image changing drastically just by looking up or down... and yet we see it a lot.
A friend streamed, I think it was "Medium", for me, and that had cubemap reflections, which are still there when looking up and down; so if that were just the standard, then indeed RT there would be less impressive (probably; I don't really know the shortcomings of cubemap reflections).

But also light bouncing and lighting up dark areas more realistically is definitely something rather new in games that you do notice; lighting up a red wall and seeing that red reflected on other objects, you can't really miss that.
In The Last of Us PS5 remake they fake this effect, but only in some areas; if it could/would be faked always and everywhere, then I'm sure RT again would not be that impressive, but there is a reason they (or anybody, for that matter) don't do that.

You really don't need side-by-side comparisons to notice RT, though there are things that can fake it quite well.
And if you really don't notice, well, that is fine as well; no need to play on the highest settings then, turn that stuff down and enjoy high fps... unless you don't notice that either, of course.
Posted on Reply
#21
nguyen
AusWolf: In my opinion, a little ray tracing added to a rasterized scene makes little difference in overall picture quality. We can only talk about a meaningful improvement when we have enough graphical power to ray trace the whole scene. It won't happen with Arc Alchemist, and I doubt RDNA3 or Ada will be capable of it, either. Maybe in 3-5 generations' time.
Yeah, let's forgo two decades of improvements in rasterization; sounds like a good idea.
The current hybrid RT solution actually offers the best of both rasterization and RT, unless you like to play simple-looking path-traced games like Quake 2 RTX or Minecraft RTX.
Posted on Reply
#22
AnotherReader
nguyen: Yeah, let's forgo two decades of improvements in rasterization; sounds like a good idea.
The current hybrid RT solution actually offers the best of both rasterization and RT, unless you like to play simple-looking path-traced games like Quake 2 RTX or Minecraft RTX.
Let's not forget that the current RT solution, i.e. a few rays per pixel plus denoising and upscaling, is almost as much of a hack as rasterization.
Posted on Reply
#23
ModEl4
john_: GeForce 3 Ti 200. My most expensive GPU purchase ever. I overpaid, but didn't know it back then.

RT is the future and it is super important. We can probably compare it with the 16-bit vs 32-bit color battle 20 years ago. I bet that with the 15'' CRT monitors we had back then, there were people who couldn't notice serious graphics differences in the games we were playing, not enough to justify the whole fuss about 32-bit color, especially considering the performance loss.
[Charts: 16-bit vs 32-bit Performance - ATI Radeon 32MB SDR and 3dfx Voodoo4 4500AGP]

The company that had the worst performance in 32-bit color didn't survive.
32-bit (8 bits per RGBA channel) vs 16-bit was extremely noticeable even in PS1 1994-era (24-bit RGB) examples, imo.

And if my memory serves me right, Voodoo4 32-bit color performance was probably at the bottom of the barrel of reasons why 3dfx sold most of its assets to NVIDIA in late 2000. They (sadly) didn't deliver on many fronts and were behind NVIDIA in implementing new features (though they had better picture quality in some of the same features they did implement).
Posted on Reply
#24
nguyen
AnotherReader: Let's not forget that the current RT solution, i.e. a few rays per pixel plus denoising and upscaling, is almost as much of a hack as rasterization.
Everything is a hack, LOL; it's all just code made to simulate real life, one is just closer to real life than the other...
Posted on Reply
#25
AnotherReader
nguyen: Everything is a hack, LOL; it's all just code made to simulate real life, one is just closer to real life than the other...
That's true. I just wanted to point out that RT is not a panacea, but a complement to rasterization.
Posted on Reply