
Intel Posts XeSS Technology Deep-Dive Video

Joined
Jul 16, 2014
Messages
8,216 (2.16/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Arctic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse SteelSeries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
How could you forget multisample 2X/4X diagonal gamma, with such a short and easy name to remember!!?
I think that's MDXAMQXA, and that isn't listed there, ya. :kookoo::shadedshu:
 
Joined
Oct 27, 2020
Messages
797 (0.53/day)
How do you analyse claimed performance numbers?
It's too much work, and it's difficult to correlate the results from the first look that I took.
If you check three months back, I said, based on Intel's delays up to that point, what Intel's strategy must be regarding launch date/optimization etc.
It was the end of September.
I also said that the actual clocks of the OC models (like an ASRock Phantom Gaming variant) shouldn't exceed 2.5GHz (without extra OC applied).
Regarding performance, I said that these OC variants, if accompanied by 18Gbps memory, should allow Intel to reach 95% of RTX 3060 Ti performance, and if the silicon isn't pushed so hard (let's say 2.4GHz) and paired with 16Gbps memory, -5% from that.
And I gave a ±5% margin of error, since we didn't have any benchmarks and drivers were constantly improving.
So: best case 100% of RTX 3060 Ti performance, if we are talking about 2.5GHz with 18Gbps memory and good driver improvements, to worst case 86% of RTX 3060 Ti performance for 2.4GHz with 16Gbps memory and small driver improvements.
Regarding the A380, I also said three months ago that it would be (at 1080p in this case) -2% from the GTX 1650, again with a ±5% margin of error, based on 2250MHz actual clocks and 16Gbps memory, which were the rumors at the time.
The A380 prediction got close enough, so I expect the A770 to be close to my original assumptions as well, depending on clocks/memory and how much drivers improve in the next month.
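For anyone who wants to check the arithmetic, here is a minimal sketch of how those bounds fall out of the stated assumptions: a 95%-of-3060-Ti baseline for 2.5GHz + 18Gbps, "-5% from that" for 2.4GHz + 16Gbps, and a ±5% margin applied multiplicatively (my reading of the post; the function and variable names are mine, not the poster's).

```python
# Sketch of the performance-window arithmetic described above.
# Assumption: the ±5% margin of error scales the baseline estimate.

def window(base_pct: float, margin: float = 0.05) -> tuple[float, float]:
    """Return (low, high) bounds around a baseline %-of-RTX-3060-Ti estimate."""
    return base_pct * (1 - margin), base_pct * (1 + margin)

baseline_18gbps = 95.0                     # 2.5 GHz clocks + 18 Gbps memory
baseline_16gbps = baseline_18gbps * 0.95   # "-5% from that": 2.4 GHz + 16 Gbps

worst, _ = window(baseline_16gbps)         # small driver improvements
_, best = window(baseline_18gbps)          # good driver improvements

print(f"best case:  {best:.0f}% of RTX 3060 Ti")   # ~100%
print(f"worst case: {worst:.0f}% of RTX 3060 Ti")  # ~86%
```

Rounding the two extremes reproduces the 100%/86% window quoted in the post.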
 
Joined
Apr 29, 2020
Messages
140 (0.08/day)
What is interesting is that no publication or leaker tried to analyze the claimed A770 performance numbers (as far as I'm aware).

----------------------------------------------------------

Intel Arc A380 GPU tested in 50 games, “compatibility better than expected for a newcomer”
Source: PC Games Hardware

Argh, so over elements of the gaming press running the whole 'Intel is a newcomer to graphics, let's give them a pass on how broken the product is' line.

Intel absolutely is not a newcomer to graphics; they have been the largest GPU manufacturer since at least the late '00s. Yes, that is in integrated graphics, but integrated graphics need drivers too, and they have introduced new architectures in iGPUs as well, so they should know how to develop and release functional graphics drivers for new architectures at launch.

At an architectural level, there is little difference between an iGPU and a dGPU; one is just a larger version of the other. It isn't as if Intel is having issues with their dGPU memory controllers, which are one of the only genuinely new elements in a dGPU. The issues they are having are not a symptom of trying something new, just mismanagement and incompetence, which is a common theme for Intel outside their core x86 market, especially given their massive engineering and financial resources.
 
Joined
Dec 12, 2020
Messages
1,755 (1.20/day)

But with iGPUs, the emphasis isn't on gaming or cryptomining performance; it's on video streaming performance and the Windows desktop.
 
Joined
Oct 27, 2020
Messages
797 (0.53/day)
Intel isn't new to graphics, if by graphics you mean iGPUs whose focus isn't gaming at all and which are far slower than the competitor's (AMD's) iGPU solutions (which in turn are a lot slower than desktop graphics). Essentially, they are so slow that it doesn't make sense for developers to optimize much, since in many games they can't even maintain 30fps at 720p on the lowest settings, with no anisotropic filtering etc.
And since we are talking about optimization by developers, we are talking about games concurrent with or released after the CPU/iGPU solution, not games released 3-5 years before the CPU launch...
On the other side, for Intel it also didn't make sense, given the performance/target market, to invest in software engineers to support developers or to fund optimization marketing deals.
Of course, Intel underdelivered even taking the above into account, since driver quality/functionality wasn't where it should have been at all.
Now they are trying (by necessity, I guess, if you consider where things are going regarding CPU/GPU interoperability) to compete in gaming-oriented graphics solutions as well, so they know what they have to do. I'm willing to cut them some slack; you aren't; nothing wrong with that...
 
Joined
Mar 28, 2020
Messages
1,759 (1.02/day)
Intel is not new to GPUs for sure, but that experience does not translate to dGPUs, as you can clearly tell. Even Intel's CEO acknowledged that it was a mistake for them to try to build on their existing iGPU drivers. In my opinion, Intel only got serious in the dGPU space when they first introduced the Xe iGPU, which was meant to provide a glimpse of their dGPU future. Prior to the introduction of Xe, Intel's UHD graphics were mostly not game-worthy, or were unstable in games. I've experienced games crashing midway (not uncommonly), and the other common problem is that a game will not even start once it detects Intel's iGPU. So there was little reason back then for Intel to optimize the drivers when the game would not start, or would run like a slideshow.
 
Joined
Aug 3, 2022
Messages
133 (0.15/day)
Processor i7-7700k @5ghz
Motherboard Asus strix Z270-F
Cooling EK AIO 240mm
Memory Hyper-X ( 16 GB - XMP )
Video Card(s) RTX 2080 super OC
Storage 512GB - WD(Nvme) + 1TB WD SDD
Display(s) Acer Nitro 165Hz OC
Case Deepcool Mesh 55
Audio Device(s) Razer Kraken X
Power Supply Asus TUF Gaming 650W Bronze
Mouse Razer Mamba Wireless & Glorious Model D Wireless
Keyboard Cooler Master K70
Software Win 10
The only question: when?
 
Deleted member 185088

Guest
XeSS is free for all to use (though the hardware needed to make efficient use of it might take some time to catch up), and if it is superior to FSR 1/2, then, well, there is your place for it.
I don't think that will be the case. FSR will catch up to DLSS, and XeSS will just die a painful death; needing specialised hardware is its downfall.
Intel could've added it to their iGPUs instead.
 