Saturday, October 31st 2020

Intel Storms into 1080p Gaming and Creator Markets with Iris Xe MAX Mobile GPUs

Intel today launched its Iris Xe MAX discrete graphics processor for thin-and-light notebooks powered by 11th Gen Core "Tiger Lake" processors. Dell, Acer, and ASUS are launch partners, debuting the chip in their Inspiron 15 7000, Swift 3x, and VivoBook TP470, respectively. The Iris Xe MAX is based on the Xe LP graphics architecture, targeted at compact, low-power implementations of the Xe SIMD for mainstream consumer graphics. Its most interesting features are Intel Deep Link and a powerful media-acceleration engine that includes hardware encode acceleration for popular video formats, including HEVC, which should make the Iris Xe MAX a formidable video content-production solution on the move.

The Iris Xe MAX is a fully discrete GPU built on Intel's 10 nm SuperFin silicon fabrication process. It features a dedicated LPDDR4X memory interface with 4 GB of memory at 68 GB/s of bandwidth, and uses PCI-Express 4.0 x4 to talk to the processor, but those are just the physical layers. On top of these sits what Intel calls Deep Link, an all-encompassing hardware-abstraction layer that not only enables explicit multi-GPU with the Xe LP iGPU of "Tiger Lake" processors, but also certain implicit multi-GPU functions, such as fine-grained division of labor between the dGPU and iGPU to ensure the right kind of workload lands on each. Intel refers to this as GameDev Boost, and we detailed it in an older article.
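For a sense of scale, here is some back-of-the-envelope math of our own (not Intel figures) on those two physical layers:

```python
# Rough per-direction bandwidth of the PCIe 4.0 x4 host link versus the
# dGPU's local LPDDR4X pool. Our arithmetic from standard PCIe parameters,
# not an Intel-published comparison.
PCIE4_GT_PER_LANE = 16e9      # PCIe 4.0 raw signaling rate: 16 GT/s per lane
LANES = 4
ENCODING = 128 / 130          # 128b/130b line-coding overhead

pcie_gbps = PCIE4_GT_PER_LANE * LANES * ENCODING / 8 / 1e9
print(f"PCIe 4.0 x4: ~{pcie_gbps:.2f} GB/s per direction")        # ~7.88 GB/s
print(f"LPDDR4X: 68 GB/s, ~{68 / pcie_gbps:.1f}x the host link")  # ~8.6x
```

The host link being roughly an order of magnitude narrower than local memory is likely part of why Deep Link divides work between the two GPUs rather than shuttling intermediate data across the bus.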
Deep Link goes beyond the 3D graphics rendering domain; it also augments the Xe Media Multi-Format Encoders of the iGPU and dGPU to linearly scale video-encoding performance. Intel claims that an Xe iGPU+dGPU combination offers more than double the encoding performance of NVENC on a GeForce RTX 2080 graphics card. All this is possible because a common software framework ties together the media-encoding capabilities of the "Tiger Lake" processor and the Iris Xe MAX GPU, ensuring the solution is more than the sum of its parts. Intel refers to this as Hyper Encode.
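Intel hasn't published how Hyper Encode schedules work across the two engines, but the general idea of splitting an encode at GOP boundaries can be sketched. The toy model below is our illustration only: the device names, round-robin policy, and GOP size are assumptions, and the real logic lives in Intel's media driver, not in application code.

```python
# Conceptual model of GOP-level encode splitting in the spirit of Hyper
# Encode: alternate closed GOPs go to two independent workers (stand-ins
# for the iGPU and dGPU encode engines), then the bitstream chunks are
# stitched back together in presentation order.
from concurrent.futures import ThreadPoolExecutor

GOP = 60  # frames per closed GOP; a safe split point with no cross-references

def encode_segment(device: str, frames: range) -> bytes:
    # Stand-in for a hardware encode job submitted to one engine.
    return f"[{device}:{frames.start}-{frames.stop}]".encode()

def hyper_encode(total_frames: int) -> bytes:
    segments = [range(s, min(s + GOP, total_frames))
                for s in range(0, total_frames, GOP)]
    devices = ["iGPU", "dGPU"]  # round-robin across the two engines
    with ThreadPoolExecutor(max_workers=2) as pool:
        jobs = [pool.submit(encode_segment, devices[i % 2], seg)
                for i, seg in enumerate(segments)]
        # Joining in submission order preserves the original timeline.
        return b"".join(job.result() for job in jobs)

print(hyper_encode(300).decode())
```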
Deep Link also scales up AI deep-learning performance between "Tiger Lake" processors and the Xe MAX dGPU, since the chip features a DLBoost DP4a accelerator. As of today, Intel has onboarded major brands in the media-encoding software ecosystem to support Deep Link: HandBrake, OBS, XSplit, Topaz Gigapixel AI, Huya, Joyy, etc., and is working with Blender, CyberLink, Fluendo, and MAGIX for full support in the coming months.
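DP4a, for reference, is a four-element dot product of signed 8-bit integers accumulated into a 32-bit result, the primitive that INT8 inference kernels reduce to. A minimal software model of its semantics (illustrative only; not Intel's API):

```python
# Software model of the DP4a primitive: dot product of two vectors of four
# signed 8-bit integers, accumulated into a 32-bit value. Hardware does this
# in one instruction; this just shows the arithmetic that DLBoost-style
# INT8 inference reduces to.
def dp4a(a: list[int], b: list[int], acc: int = 0) -> int:
    assert len(a) == len(b) == 4
    assert all(-128 <= x <= 127 for x in a + b), "operands are signed INT8"
    return acc + sum(x * y for x, y in zip(a, b))

# Example: one multiply-accumulate step of a quantized dot product.
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], acc=100))  # 100 + 5 - 12 - 21 + 32 = 104
```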
Under the hood, the Iris Xe MAX, as we mentioned earlier, is built on the 10 nm SuperFin process. This is a brand-new piece of silicon, not a "Tiger Lake" with its CPU component disabled, as its specs might otherwise suggest. It features 96 Xe execution units (EUs), translating to 768 programmable shaders, along with 96 TMUs and 24 ROPs. Its LPDDR4X memory interface is good for 68 GB/s of memory bandwidth, and the GPU is clocked at 1.65 GHz. It talks to "Tiger Lake" processors over a PCI-Express 4.0 x4 bus. Notebooks with the Iris Xe MAX ship with both the iGPU and dGPU enabled to leverage Deep Link.
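Those numbers permit a quick theoretical peak-throughput estimate (our arithmetic, assuming the customary one FMA per FP32 lane per clock; Intel quotes no TFLOPS figure in the deck):

```python
# Back-of-the-envelope peak-throughput estimate from the published specs.
# Assumes one FMA (2 FLOPs) per FP32 lane per clock, which is standard GPU
# peak-rate math but still our assumption, not an Intel-quoted number.
eus = 96
lanes_per_eu = 8             # 96 EUs x 8 lanes = 768 "shaders"
clock_hz = 1.65e9
flops_per_lane = 2           # multiply + add from one FMA

peak_fp32 = eus * lanes_per_eu * flops_per_lane * clock_hz
print(f"Peak FP32: {peak_fp32 / 1e12:.2f} TFLOPS")              # ~2.53 TFLOPS
print(f"FP16 (2x packed): {2 * peak_fp32 / 1e12:.2f} TFLOPS")   # ~5.07 TFLOPS
```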
Media and AI only paint half the picture; the other half is gaming. Intel is taking a swing at the mainstream 1080p gaming segment, with the Iris Xe MAX offering over 30 FPS (playable) in AAA games at 1080p. It trades blows with notebooks that use the NVIDIA GeForce MX450 discrete GPU. We reckon that most e-sports titles should be playable at over 45 FPS at 1080p. Over the coming months, one should expect Intel and its ISVs to invest more in GameDev Boost, which should increase performance further. The Xe LP architecture features DirectX 12 support, including Variable Rate Shading (Tier 1).
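For readers unfamiliar with Variable Rate Shading: Tier 1 lets an application set a coarse shading rate per draw call, so one shader invocation covers a block of pixels. A toy model of the savings follows (our illustration; real VRS is requested through the D3D12/Vulkan API, not written this way):

```python
# Toy model of Tier-1 variable-rate shading: at a per-draw 2x2 rate, one
# shader invocation covers a 2x2 pixel quad, quartering shading work.
import numpy as np

def shade(x, y):                      # stand-in for an expensive pixel shader
    return (x * 31 + y * 17) % 256

H, W, RATE = 8, 8, 2                  # 2x2 coarse shading rate
coarse = np.array([[shade(x, y) for x in range(0, W, RATE)]
                   for y in range(0, H, RATE)])
frame = coarse.repeat(RATE, axis=0).repeat(RATE, axis=1)  # one result per quad
print(f"shader invocations: {coarse.size} instead of {frame.size}")  # 16 vs 64
```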
But what about other mobile platforms, and desktop, you ask? The Iris Xe MAX is debuting exclusively in thin-and-light notebooks based on 11th Gen Core "Tiger Lake" processors, but Intel plans to develop desktop add-in cards with Iris Xe MAX GPUs sometime in the first half of 2021. We predict that if priced right, this card could sell in droves to the creator community, which could leverage its media-encoding and AI DNN acceleration capabilities. It should also appeal to the HEDT and mission-critical workstation crowds that require discrete graphics while minimizing the number of vendors they source software from.
Update Nov 1st: Intel clarified that the desktop Iris Xe MAX add-in card will be sold exclusively to OEMs for pre-builts.

The complete press deck follows.

74 Comments on Intel Storms into 1080p Gaming and Creator Markets with Iris Xe MAX Mobile GPUs

#26
crazyeyesreaper
Not a Moderator
What I find funny, and most don't pick up on, is that unlike gaming laptops, where the GPU gets castrated on battery power and performs worse than an iGPU (I've literally tested it and bitched about it in reviews), if you game on the go, i.e. on battery power, and Intel puts these GPUs in power-efficient laptops and lets them stretch their legs, it's honestly not a bad showing.
Posted on Reply
#27
X71200
If I'm rocking any form of laptop, I'm probably putting it down somewhere I can plug into the wall. I don't get gaming on the go without a charger; who even does that? You're ultimately emptying the battery for stupid games, and if you need to do something actual with the laptop, such as checking bus or plane times, good luck when the laptop is dead. Sure, you have your phone, but it'll be a lot more annoying. If you're working with a limited battery/laptop, you normally want to keep the charge as much as you can, so you can squeeze out enough to finish your hotel day or whatever. If you have nothing better to do than open the laptop to play games in public, then I wouldn't know what to say to that. More battery is great, on the other hand, especially if you go around doing things such as diagnosing cars with your laptop; that can take time, and you can't give the laptop a chance to die when you're checking for errors in the ECU.
Posted on Reply
#28
crazyeyesreaper
Not a Moderator
X71200: If I'm rocking any form of laptop, I'm probably putting it down somewhere I can plug into the wall. I don't get gaming on the go without a charger; who even does that? You're ultimately emptying the battery for stupid games, and if you need to do something actual with the laptop, such as checking bus or plane times, good luck when the laptop is dead. Sure, you have your phone, but it'll be a lot more annoying. If you're working with a limited battery/laptop, you normally want to keep the charge as much as you can, so you can squeeze out enough to finish your hotel day or whatever. If you have nothing better to do than open the laptop to play games in public, then I wouldn't know what to say to that. More battery is great, on the other hand, especially if you go around doing things such as diagnosing cars with your laptop; that can take time, and you can't give the laptop a chance to die when you're checking for errors in the ECU.
What's the point of allowing a laptop CPU to suck down 100 watts of power while the GPU is castrated and limited to just 30 watts? Games don't max out the CPU, nor do most workloads, yet on battery I can hammer the CPU all day while the GPU will never actually be used to any meaningful extent. Say the CPU is 100 watts under turbo and 60 watts typical, while the GPU is TDP-capped at 80 watts but knocked down to 30 watts on battery. In today's world, where many tasks are GPU-accelerated, the fact is a dedicated GPU performs worse than an iGPU in those situations.

What's the point of a 94 Wh battery if you're not gonna use it?

Fact is, a mobile CPU + decent iGPU under load with a good battery can last 3-4 hrs playing games when properly configured. I can get nearly 60 FPS out of a 25-watt Ryzen 4800U in GTA V at 720p, yet in a similar test with, say, a 1660 Ti you get about 15-20. That said, maybe the use case doesn't apply to you, but it does apply to me, and with each new generation battery performance gets worse and worse.

So if the Intel GPU is competitive and low-power enough, it's possible to get a good on-the-go gaming experience on battery power while still getting 4-5 hrs of use when paired with a larger battery.

So it really comes down to how much power this thing will use. If it's below 30 watts and avoids the throttling issue, that's a win for me. Hopefully future RDNA-based iGPUs from AMD are coming soon.
Posted on Reply
#29
Chrispy_
What a pointless announcement.

It's for thin-and-lights, so the single most important metric is performance/Watt.
I skimmed all those slides and didn't see a single mention of it.

Let's, just for one second, assume it has competitive performance/Watt - and I very much doubt that, because if it did, Intel would be shouting how great their efficiency is from the rooftops - how much does the damn thing cost?
Posted on Reply
#30
silentbogo
InVasMani: There needs to be a more viable alternative to CUDA that isn't a proprietary API and that actually gains traction
There's already OpenCL and Vulkan Compute. The issue is not "proprietary vs. open source," but rather fundamental differences in GPU architectures. You cannot make a platform-independent API that outperforms a platform-optimized one. There's always some kind of tradeoff.
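Case in point, the same OpenCL kernel runs unchanged on any vendor's GPU; here's a minimal sketch (it assumes the third-party pyopencl package and a working OpenCL runtime):

```python
# Minimal portable vector-add: the OpenCL C kernel is compiled by whatever
# vendor driver is present (Intel, AMD, NVIDIA), with no per-architecture
# tuning anywhere in sight -- which is exactly the tradeoff being discussed.
import numpy as np
import pyopencl as cl

KERNEL = """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
"""

ctx = cl.create_some_context()          # picks any available OpenCL device
queue = cl.CommandQueue(ctx)
a = np.random.rand(1 << 20).astype(np.float32)
b = np.random.rand(1 << 20).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prog = cl.Program(ctx, KERNEL).build()  # vendor driver compiles for its ISA
prog.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```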
Posted on Reply
#31
mak1skav
At least they can "storm" the Steam charts once again for the most-used graphics cards.
Posted on Reply
#32
X71200
crazyeyesreaper: What's the point of allowing a laptop CPU to suck down 100 watts of power while the GPU is castrated and limited to just 30 watts? Games don't max out the CPU, nor do most workloads, yet on battery I can hammer the CPU all day while the GPU will never actually be used to any meaningful extent. Say the CPU is 100 watts under turbo and 60 watts typical, while the GPU is TDP-capped at 80 watts but knocked down to 30 watts on battery. In today's world, where many tasks are GPU-accelerated, the fact is a dedicated GPU performs worse than an iGPU in those situations.

What's the point of a 94 Wh battery if you're not gonna use it?

Fact is, a mobile CPU + decent iGPU under load with a good battery can last 3-4 hrs playing games when properly configured. I can get nearly 60 FPS out of a 25-watt Ryzen 4800U in GTA V at 720p, yet in a similar test with, say, a 1660 Ti you get about 15-20. That said, maybe the use case doesn't apply to you, but it does apply to me, and with each new generation battery performance gets worse and worse.

So if the Intel GPU is competitive and low-power enough, it's possible to get a good on-the-go gaming experience on battery power while still getting 4-5 hrs of use when paired with a larger battery.

So it really comes down to how much power this thing will use. If it's below 30 watts and avoids the throttling issue, that's a win for me. Hopefully future RDNA-based iGPUs from AMD are coming soon.
This seems too general a post. You're not saying what laptop, or what KIND of laptop, is even the topic, and I already mentioned that most people are going to plug the laptop in. CPU power does matter; you don't get anywhere near the power of a desktop CPU in most laptops.

Fact is, you're coming up with a configuration that I don't really care about, and that should tell you something about the end user. I'm not going to run some easy-to-run game like GTA V on battery at 720p.

I do actually use the 94 Wh battery; it depends on what kind of laptop it is. As long as the machine is not overly heavy, it will come in handy. Though it's more of a thing with gaming and workstation laptops that have 240 W chargers.

Furthermore, if you look at the machines in the slides, you will see convertible laptops and a Swift. Those are not gaming laptops, and when you want to game on them, you'll probably want to get the performance out properly by plugging into the wall... or you'll not only get worse performance, but also run out of battery, because your thin convertible doesn't come with a 94 Wh battery.
Posted on Reply
#33
Caring1
This press release reads like an infomercial on TV.
I was waiting for the bit at the end that said, "But wait, there's more! We'll even throw in six FREE steak knives." :laugh:
Posted on Reply
#34
Vya Domus
Practically DOA. What I find particularly baffling is that it uses LPDDR4X. If you go as far as to build a dedicated die with its own DRAM, why not use modern memory optimized for GPUs? The MX350 is practically the closest thing NVIDIA can do to an integrated chip in terms of cost and performance, and this barely outperforms it? Damn.

I can't believe I am saying this, but Intel, of all people, can't seem to comprehend that they need mindshare. You simply can't enter this space with the lowest imaginable tier of performance; this will seem like a joke to people, and by the time they scrape something together that's higher performance, it may as well be over for them.
Posted on Reply
#35
Frick
Fishfaced Nincompoop
TheLostSwede: It's a start, I guess, but MX350-beating performance is hardly something to brag about.
To be impressive, this would have to beat the GTX 1650 in mobile.
Can't find any mention of which codecs are supported for encoding, beyond H.265.
This is very dependent on price and TDP. MX350 laptops are >€700. Bring the performance to a lower price and I'm in.
Posted on Reply
#36
IceShroom
Hopefully Intel fixed their horrible frametime problem on their iGPUs/dGPUs.
Posted on Reply
#37
kardeon
"Storms" Are you kidding ?
Posted on Reply
#38
Sybaris_Caesar
This begs the question: where are the leaks for AMD's Big Navi laptop GPUs? Since AMD is touting better efficiency than NVIDIA this generation, shouldn't AMD's laptop GPU division get a boost as well? They only went up to the RX 5600M last year, if I remember correctly, which was a little slower than the RTX 2060 Max-Q, the slower version of the RTX 2060 Mobile. AMD did announce the RX 5700M, but I think they quietly pulled the plug, since no laptop was available with it.
Posted on Reply
#39
Flanker
At least this has gone further than Larrabee, right? Right?
Caring1: This press release reads like an infomercial on TV. I was waiting for the bit at the end that said, "But wait, there's more! We'll even throw in six FREE steak knives." :laugh:
If you call in the next 30 minutes, you will get another GPU absolutely free!!!
Posted on Reply
#40
mkppo
TechLurker: The only novel thing here is their ability to have an iGPU and dGPU link up and work together, which isn't something AMD is really doing currently. Although I recall AMD did have a more rudimentary version of it with CrossFire, and was revisiting the idea for future MCM GPUs and heterogeneous computing via Infinity Architecture in general.

Still, it is a small edge (feature-wise) Intel has for now, and if they're able to go further with a SAM-like equivalent, they could potentially squeeze out a bit more performance that way. That said, it'll be a while until they can sufficiently catch up in actual performance, unless AMD or NVIDIA trips up hard.
The article is incorrect. AnandTech has a detailed writeup, and basically there's no multi-GPU support at all for games, nor any plan to add it in the future. Only certain productivity workloads can utilize them together.

It's basically the same speed as the Tiger Lake iGPU, but gives added flexibility and speed for those content creation workloads.

"Storms into the market" LMAO
Posted on Reply
#43
Rares
Intel is great in "slides", but slow in motion... Sad times for Intel...
Posted on Reply
#44
yotano211
Khonjel: This begs the question: where are the leaks for AMD's Big Navi laptop GPUs? Since AMD is touting better efficiency than NVIDIA this generation, shouldn't AMD's laptop GPU division get a boost as well? They only went up to the RX 5600M last year, if I remember correctly, which was a little slower than the RTX 2060 Max-Q, the slower version of the RTX 2060 Mobile. AMD did announce the RX 5700M, but I think they quietly pulled the plug, since no laptop was available with it.
Dude, chill, Big Navi is not even out yet. Laptop versions usually come out much later.
Posted on Reply
#45
AsRock
TPU addict
XL-R8R: I arrived to say the same; "storming" is hardly the word to use when beating a lowly MX350.
If anyone is storming, it's AMD.
Posted on Reply
#46
AusWolf
"We reckon that most e-sports titles should be playable at over 45 FPS at 1080p."

And I reckon that players of such titles won't be satisfied with that performance. I really don't think 96 EUs at 1.65 GHz will be enough to "storm into" 1080p gaming, especially if the chips are only sold to laptop manufacturers and OEMs. Don't get me wrong, I'd be happy if AMD and NVIDIA had some competition in the gaming GPU market - but I can't see it happening just yet.
Posted on Reply
#47
medi01
"Storms into... 1080p market", savage.

Bad news for NV MX series though.
Posted on Reply
#48
Tsukiyomi91
To a certain someone who said that "1080p is so yesterday": then why are the majority of users on that resolution to this day?
Posted on Reply
#49
MrMilli
dyonoctis: That's interesting, but I wonder how long it will take to become mainstream. AMD ROCm was also supposed to make Radeon run CUDA... since 2016, but there's still no news of AMD support in mainstream CUDA apps
That was never the point of ROCm. It facilitates CUDA code translation.

I think most people are too focused on the gaming aspects of this chip. While yes, you can game on it, and its gaming performance is comparable to an MX350/GTX 1050, that's not the main selling point of DG1. You don't buy a laptop with an MX350 for gaming, do you? You mainly want it to improve content creation. Case in point: the three laptops it launches in are not meant for gaming whatsoever.
The combined power of the Iris Xe and Iris Xe MAX is nothing to be scoffed at considering the power envelope. FP16 => 8 TFLOPS. That's GTX 1650 mobile level.
I don't know if anybody here uses their computer for work, but Gigapixel AI acceleration, video encoding... these things really matter for content creators.
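For anyone checking the math on that FP16 number, a rough sketch (the iGPU clock varies by Tiger Lake SKU, so treat it as an assumption):

```python
# Rough combined FP16 peak for Iris Xe (iGPU) + Iris Xe MAX (dGPU).
# Assumes 8 FP32 lanes per EU, 2 FLOPs per FMA, and 2x packed FP16 per
# lane; the ~1.35 GHz iGPU clock is an assumption, not a spec for every SKU.
def fp16_tflops(eus: int, clock_ghz: float) -> float:
    lanes = eus * 8                       # 8 FP32 lanes per EU
    return lanes * 2 * 2 * clock_ghz / 1e3

dgpu = fp16_tflops(96, 1.65)              # ~5.1 TFLOPS
igpu = fp16_tflops(96, 1.35)              # ~4.1 TFLOPS
print(f"combined: ~{dgpu + igpu:.1f} TFLOPS FP16")  # ~9.2, ballpark of ~8
```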

In this multi-stream video encoding test, it beat an RTX 2080 & i9-10980HK combination. Can you imagine a puny, lightweight Acer Swift 3 laptop beating a >€2000 heavy gaming laptop in any task? I would say mission accomplished.

We're going to have to wait another year before Intel launches their actual gaming product, DG2.
Posted on Reply
#50
_Flare
Storm with a small group of very expensive 10nm Unicorn-Soldiers :toast:
Posted on Reply