
Intel Posts XeSS Technology Deep-Dive Video

How could you forget multisample 2X/4X diagonal gamma with such a short and easy name to remember!?
I think that's MDXAMQXA, which isn't listed there, yeah. :kookoo::shadedshu:
 
How do you analyse claimed performance numbers?
It's too much work, and it's difficult to correlate the results from the first look I took.
If you check three months back, I said, based on Intel's delays so far, what Intel's strategy must be regarding launch date/optimization etc.
It was end of September.
I also said that the actual clocks of the OC models (like an ASRock Phantom Gaming variant) shouldn't exceed 2.5GHz (without extra OC applied).
Regarding performance, I said that these OC variants, if accompanied by 18Gbps memory, should allow Intel to reach 95% of RTX 3060 Ti performance, and if the silicon isn't so stressed (let's say 2.4GHz) and paired with 16Gbps memory, -5% from that.
And I gave a ±5% margin of error, since we didn't have any benchmarks and drivers were constantly improving.
So best case 100% of RTX 3060 Ti performance, if we are talking about 2.5GHz with 18Gbps memory and good driver improvements, to worst case 86% of RTX 3060 Ti performance for 2.4GHz with 16Gbps memory and small driver improvements.
Regarding the A380, I also said three months ago that it would be (at 1080p in this case) -2% from the GTX 1650, again with a ±5% margin of error, based on 2250MHz actual clocks and 16Gbps memory, which were the rumors at the time.
The A380 prediction got close enough, so I expect the A770 to be close to my original assumptions too, depending on clocks/memory and how much drivers will have improved a month from now.
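For what it's worth, here is a minimal sketch (my own illustration in Python, not anything from Intel) of the arithmetic behind those best/worst-case figures; the baseline and margin values are simply the assumptions stated above:

# Prediction arithmetic from the post above. All figures are fractions
# of RTX 3060 Ti performance; assumptions, not measurements.
base = 0.95               # OC variant at 2.5GHz with 18Gbps memory
lower_bin = base * 0.95   # 2.4GHz with 16Gbps memory: -5% from base
margin = 0.05             # ±5% margin of error for drivers/benchmarks

best_case = base * (1 + margin)        # ~1.00  -> 100% of a 3060 Ti
worst_case = lower_bin * (1 - margin)  # ~0.857 -> ~86% of a 3060 Ti

print(f"best case:  {best_case:.0%}")   # best case:  100%
print(f"worst case: {worst_case:.0%}")  # worst case: 86%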
 
What is interesting is that no publication or leaker has tried to analyze the claimed A770 performance numbers (as far as I'm aware).

----------------------------------------------------------

Intel Arc A380 GPU tested in 50 games, “compatibility better than expected for a newcomer”
Source: PC Games Hardware

[Image: Intel Arc Alchemist driver game compatibility chart, August 2022 (PCGH)]
Argh, so over elements of the gaming press running the whole 'Intel is a newcomer to graphics, let's give them a pass on how broken the product is' line.

Intel absolutely is not a newcomer to graphics; they have been the largest GPU manufacturer since at least the late 2000s. Yes, that is in integrated graphics, but integrated graphics need drivers too, and they have introduced new architectures in iGPUs as well. They should know how to develop and release functional graphics drivers for a new architecture at launch.

At an architectural level, there is little difference between an iGPU and a dGPU; one is just a larger version of the other. It isn't as if Intel is having issues with their dGPU memory controllers, which are among the only genuinely new elements in a dGPU. The issues they are having are not a symptom of trying something new, just mismanagement and incompetence, which is a common theme for Intel outside their core x86 market, especially given their massive engineering and financial resources.
 

But with iGPUs the emphasis isn't on gaming performance or cryptomining performance; it's on video streaming performance and the Windows desktop.
 
Intel isn't new to graphics, if by graphics you mean iGPUs, but their focus there isn't gaming at all. Their iGPUs are far slower than competitors' (AMD's) iGPU solutions (which in turn are a lot slower than desktop graphics); essentially they are so slow that it doesn't make sense for developers to optimize much, since in many games they can't even maintain 30fps at 720p on the lowest settings with no anisotropic filtering etc.
And since we are talking about optimization from developers, we are talking about games concurrent with or released after the CPU/iGPU solution, not games released 3-5 years before the CPU launch...
On the other side, for Intel it also didn't make sense, given the performance and target market, to invest in software engineers to support developers or to fund optimization marketing deals.
Of course Intel underdelivered even taking the above into account, since driver quality and functionality weren't where they should have been at all.
Now they are trying (I guess by necessity, if you consider where things are going regarding CPU/GPU interoperability) to compete in gaming-oriented graphics solutions as well, so they know what they have to do. I'm willing to cut them some slack; you aren't, and there's nothing wrong with that...
 
Intel is not new to GPUs for sure, but that experience does not translate to dGPUs, as you can clearly tell. Even Intel's CEO acknowledged that it was a mistake for them to try to build on their existing iGPU drivers. In my opinion, Intel only got serious in the dGPU space when they first introduced the Xe iGPU, which was meant to provide a glimpse of their dGPU future. Prior to the introduction of Xe, Intel's UHD graphics were mostly not game-worthy, or were unstable in games. I've experienced games crashing midway (and not uncommonly), and the other common problem is that a game would not even start once it detected Intel's iGPU. So there was little reason back then for Intel to optimize the drivers when the game would not start, or would run like a slide show.
 
The only question: when?
 
XeSS is free for all to use (though the hardware needed to make efficient use of it might take some time to catch up), and if it is superior to FSR 1/2, then there is your place for it.
I don't think that will be the case. FSR will catch up to DLSS; XeSS will just die a painful death, as needing specialised hardware is its downfall.
Intel could have added it to their iGPUs instead.
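For context, the "specialised hardware" point comes down to XeSS having two code paths: a fast XMX (matrix engine) path on Intel's own Arc GPUs and a slower DP4a fallback on other hardware. Below is a minimal sketch of how an engine might pick an upscaler at runtime; the capability names and function are hypothetical illustrations, not the actual XeSS SDK API:

# Hypothetical runtime upscaler selection; capability flag names are
# illustrative, not from any real SDK.
def pick_upscaler(gpu_caps: set[str]) -> str:
    if "xmx" in gpu_caps:       # Intel Arc: dedicated matrix units
        return "XeSS (XMX path)"
    if "dp4a" in gpu_caps:      # most recent GPUs: int8 dot-product fallback
        return "XeSS (DP4a path, slower)"
    return "FSR 2"              # pure shader-based upscaler, runs anywhere

# Example: an older GPU without DP4a support falls back to FSR 2.
print(pick_upscaler({"fp16"}))          # -> FSR 2
print(pick_upscaler({"dp4a", "fp16"}))  # -> XeSS (DP4a path, slower)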
 