Friday, August 26th 2022

Intel Posts XeSS Technology Deep-Dive Video

Intel Graphics today posted a technical deep-dive video presentation on how XeSS (Xe Super Sampling), the company's rival to NVIDIA DLSS and AMD FSR, works. XeSS is a gaming performance-enhancement technology in which the GPU renders your game at a lower resolution than your display's native resolution, and a high-quality upscaling algorithm then scales it up to native resolution while minimizing the quality losses associated with classical upscaling methods.

The video mostly confirms what we gathered from our older articles on how XeSS works. A game's raster and lighting passes are rendered at a lower resolution; the frame data, along with motion vectors, is fed to the XeSS upscaling algorithm, and the result is then passed on to the renderer's post-processing, after which the native-resolution HUD is applied. The XeSS upscaler takes not just the motion vectors and the all-important frame inputs, but also temporal data from previously processed (upscaled) frames, so the pre-trained AI model can better reconstruct details.
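The frame flow described above can be sketched as a simple loop. This is a conceptual illustration only; every function name below is a hypothetical stand-in (not the real XeSS SDK API), and frames are modeled as plain dicts for clarity:

```python
# Conceptual sketch of where a temporal upscaler like XeSS sits in a game's
# frame loop. All names here are illustrative, not Intel's actual API.

def render_scene(w, h):
    # Stand-in for the game's raster + lighting passes at reduced resolution.
    color = {"size": (w, h)}
    motion_vectors = {"size": (w, h)}
    return color, motion_vectors

def xess_upscale(color, motion_vectors, history, out_size):
    # Stand-in for the AI reconstruction step: low-res frame + motion
    # vectors + temporal history in, native-resolution frame out.
    return {"size": out_size, "used_history": history is not None}

def apply_post_processing(frame):
    return frame  # tone mapping, bloom, etc. run at native resolution

def draw_hud(frame):
    return frame  # HUD is composited at native resolution, after upscaling

def render_frame(native_w, native_h, scale, history):
    # 1. Render raster and lighting at a reduced resolution.
    render_w, render_h = int(native_w / scale), int(native_h / scale)
    color, motion_vectors = render_scene(render_w, render_h)
    # 2. Reconstruct a native-resolution frame from the low-res inputs
    #    plus the previously upscaled frame (temporal data).
    upscaled = xess_upscale(color, motion_vectors, history,
                            out_size=(native_w, native_h))
    # 3. Post-processing and HUD are applied after upscaling.
    frame = draw_hud(apply_post_processing(upscaled))
    return frame, upscaled  # the upscaled frame feeds the next frame's history
```

The key point the sketch captures is ordering: the upscaler runs before post-processing and HUD compositing, so those stages operate at full native resolution.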
What's new in today's presentation is a set of performance numbers obtained on the flagship Arc A770 desktop graphics card, showing how XeSS impacts performance across a set of games that includes Ghostwire: Tokyo, Hitman 3, Arcadegeddon, Shadow of the Tomb Raider, The DioField Chronicle, Super People, Redout 2, and Chivalry 2. Intel also walked us through the various quality presets of XeSS: Ultra Quality, Quality, Balanced, and Performance.
Also revealed is that Intel Graphics is working with UL Benchmarks to add a new XeSS feature test to 3DMark, much like the existing DLSS feature test. Much like AMD FSR, Intel claims XeSS is easy to implement across a variety of game engines; and while it benefits tremendously from the XMX matrix-math accelerators of Xe-HPG GPUs, it has a DP4a-based fallback that lets it support not just older Intel architectures (such as Xe-LP), but also rival GPU brands.
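DP4a is a GPU instruction that computes a dot product of four packed 8-bit integers and accumulates the result into a 32-bit value, which lets XeSS run its network at 8-bit integer precision without XMX hardware. A minimal Python model of the arithmetic (the function name and list-based operands are illustrative):

```python
def dp4a(a_bytes, b_bytes, acc):
    """Model of the DP4a operation: dot product of four signed 8-bit
    values from each operand, accumulated into a 32-bit integer."""
    assert len(a_bytes) == len(b_bytes) == 4
    return acc + sum(a * b for a, b in zip(a_bytes, b_bytes))

# Four int8 multiply-adds folded into one accumulate step:
print(dp4a([1, 2, 3, 4], [5, 6, 7, 8], 0))  # 70
```

One DP4a instruction thus does the work of four multiply-accumulates, which is why it makes a practical fallback on GPUs that lack dedicated matrix units.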
Intel once again detailed the broad list of games that will support XeSS, and the big AAA name here is the upcoming "Call of Duty: Modern Warfare II," which will get XeSS support at launch (October 28).

The video can be watched here:
Source: HotHardware

32 Comments on Intel Posts XeSS Technology Deep-Dive Video

#26
ModEl4
Assimilator: How do you analyse claimed performance numbers?
From a first look, it's too much work and difficult to correlate the results.
If you check three months back, I said, based on Intel's delays so far, what Intel's strategy must be regarding launch date/optimization etc.
It was end of September.
I also said that the actual clocks of the OC models (like an ASRock Phantom Gaming variant) shouldn't exceed 2.5 GHz (without extra OC applied).
Regarding performance, I said that these OC variants, if accompanied by 18 Gbps memory, should allow Intel to reach 95% of RTX 3060 Ti performance, and if the silicon isn't so stressed (let's say 2.4 GHz) and with 16 Gbps memory, -5% from that.
And I gave a ±5% margin of error, since we didn't have any benchmarks and drivers were constantly improving.
So best case, 100% of RTX 3060 Ti performance if we are talking about 2.5 GHz with 18 Gbps memory and good driver improvements, down to a worst case of 86% of RTX 3060 Ti performance for 2.4 GHz with 16 Gbps memory and small driver improvements.
Regarding the A380, I also said three months ago that it would be (at 1080p in this case) -2% from the GTX 1650, again with a ±5% margin of error, based on 2250 MHz actual clocks and 16 Gbps memory, which were the rumors at the time.
The A380 prediction got close enough, so I expect the A770 to be close to my original assumptions too, depending on clocks/memory and how much drivers improve over the next month.
#27
Tom Yum
ModEl4: What is interesting is that no publication or leaker has tried to analyze the claimed A770 performance numbers (as far as I'm aware).

----------------------------------------------------------

Intel Arc A380 GPU tested in 50 games, “compatibility better than expected for a newcomer”
Source: PC Games Hardware

Argh, so over elements of the gaming press running the whole 'Intel is a newcomer to graphics, let's give them a pass on how broken the product is' line.

Intel absolutely is not a newcomer to graphics; they have been the largest GPU manufacturer since at least the late 2000s. Yes, that is in integrated graphics, but integrated graphics need drivers too, and they have introduced new architectures in iGPUs as well, so they should know how to develop and release functional graphics drivers for a new architecture at release.

At an architectural level, there is little difference between an iGPU and a dGPU; one is just a larger version of the other. It isn't as if Intel is having issues with its dGPU memory controllers, which are among the only genuinely new elements in a dGPU. The issues they are having are not a symptom of trying something new, just mismanagement and incompetence, which is a common theme for Intel outside its core x86 market, especially given their massive engineering and financial resources.
#28
80251
Tom Yum: Argh, so over elements of the gaming press running the whole 'Intel is a newcomer to graphics, let's give them a pass on how broken the product is' line. […]
But with iGPUs the emphasis isn't on gaming or cryptomining performance; it's on video-streaming performance and the Windows desktop.
#29
ModEl4
Tom Yum: Argh, so over elements of the gaming press running the whole 'Intel is a newcomer to graphics, let's give them a pass on how broken the product is' line. […]
Intel isn't new to graphics, if by graphics you mean iGPUs. But their focus there isn't gaming at all; they are far slower than the competitor's (AMD's) iGPU solutions (which in turn are a lot slower than desktop graphics). Essentially they are so slow that it doesn't make sense for developers to optimize much, since in many games they can't even maintain 30 fps at 720p on lowest settings with no anisotropic filtering, etc.
And when we talk about optimization from developers, we mean games concurrent with or released after the CPU/iGPU solution, not games from 3-5 years before the CPU launch...
On the other side, it also didn't make sense for Intel, given the performance and target market, to invest in software engineers to support developers or to fund optimization marketing deals.
Of course Intel underdelivered even taking the above into account, since driver quality and functionality weren't where they should have been at all.
Now they are trying (I guess by necessity, if you consider where things are going regarding CPU/GPU interoperability) to compete in gaming-oriented graphics solutions too, so they know what they have to do. I'm willing to cut them some slack; you aren't, and nothing wrong with that...
#30
watzupken
Tom Yum: Argh, so over elements of the gaming press running the whole 'Intel is a newcomer to graphics, let's give them a pass on how broken the product is' line. […]
Intel is not new to GPUs for sure, but that experience does not translate to dGPUs, as you can clearly tell. Even Intel's CEO acknowledged that it was a mistake for them to try to leverage their existing iGPU drivers. In my opinion, Intel only got serious about the dGPU space when they first introduced the Xe iGPU, which was meant to provide a glimpse of their dGPU future. Prior to the introduction of Xe, Intel's UHD graphics were mostly not game-worthy, or were unstable in games. I've experienced games crashing midway (and not uncommonly), and the other common problem is that a game will not even start once it detects Intel's iGPU. So there was little reason back then for Intel to optimize the drivers when the game would not start, or would run like a slide show.
#31
Jimmy_
the only question - When?
#32
Unregistered
ZoneDymo: XeSS is free for all to use (though the hardware needed to make efficient use of it might take some time to catch up), and if it is superior to FSR 1/2, well, there is its place.
I don't think that will be the case. FSR will catch up to DLSS, and XeSS will just die a painful death; needing specialised hardware is its downfall.
Intel could've added it to their iGPUs instead.