Saturday, October 31st 2020

Intel Storms into 1080p Gaming and Creator Markets with Iris Xe MAX Mobile GPUs

Intel today launched its Iris Xe MAX discrete graphics processor for thin-and-light notebooks powered by 11th Gen Core "Tiger Lake" processors. Dell, Acer, and ASUS are launch partners, debuting the chip in their Inspiron 15 7000, Swift 3x, and VivoBook TP470, respectively. The Iris Xe MAX is based on the Xe LP graphics architecture, a compact-scale implementation of the Xe SIMD aimed at mainstream consumer graphics. Its most interesting features are Intel Deep Link and a powerful media acceleration engine that includes hardware encode acceleration for popular video formats, including HEVC, which should make the Iris Xe MAX a formidable video content production solution on the move.

The Iris Xe MAX is a fully discrete GPU built on Intel's 10 nm SuperFin silicon fabrication process. It features a dedicated LPDDR4X memory interface with 4 GB of memory at 68 GB/s of bandwidth, and uses PCI-Express 4.0 x4 to talk to the processor, but those are just the physical layers. On top of these sits what Intel calls Deep Link, an all-encompassing hardware abstraction layer that not only enables explicit multi-GPU with the Xe LP iGPU of "Tiger Lake" processors, but also certain implicit multi-GPU functions, such as fine-grained division of labor between the dGPU and iGPU so that each kind of workload is routed to the GPU best suited to it. Intel refers to this as GameDev Boost, and we detailed it in an older article.
Deep Link goes beyond the 3D graphics rendering domain, and also augments the Xe Media Multi-Format Encoders of the iGPU and dGPU to scale video encoding performance nearly linearly. Intel claims that an Xe iGPU+dGPU combination offers more than double the encoding performance of NVENC on a GeForce RTX 2080 graphics card. All this is possible because a common software framework ties together the media encoding capabilities of the "Tiger Lake" iGPU and the Iris Xe MAX, making the solution more than the sum of its parts. Intel refers to this as Hyper Encode.
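
Intel hasn't published the internals of Hyper Encode, but the general principle of splitting one encode job across two media engines can be sketched in a few lines. The following is a minimal conceptual sketch, not Intel's API; the encode_chunk() stub, the device labels, and the GOP-aligned chunking are hypothetical placeholders.

# Conceptual sketch of splitting one encode job across two media engines.
# NOT Intel's Hyper Encode API: encode_chunk(), the device labels, and the
# GOP-aligned chunking below are hypothetical placeholders for illustration.
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(device: str, frames: list) -> bytes:
    """Stub: pretend to encode a group of frames on the given device."""
    return f"{device}:{len(frames)}f".encode()  # placeholder bitstream

def hyper_encode_like(frames: list, gop: int = 60) -> bytes:
    # Split the source into GOP-aligned chunks so each chunk is independently
    # decodable, then alternate chunks between the iGPU and dGPU encoders.
    chunks = [frames[i:i + gop] for i in range(0, len(frames), gop)]
    devices = ["igpu", "dgpu"]
    with ThreadPoolExecutor(max_workers=2) as pool:
        parts = pool.map(encode_chunk,
                         (devices[i % 2] for i in range(len(chunks))),
                         chunks)
    return b"".join(parts)  # stitch the per-chunk bitstreams back together

print(hyper_encode_like(list(range(300))))

The real gains come from both hardware encoders running concurrently on independent chunks; the software framework's job is the scheduling and the reassembly of the final bitstream.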
Deep Link also scales up AI deep-learning performance between "Tiger Lake" processors and the Xe MAX dGPU, since the chip includes a DL Boost DP4a accelerator (8-bit integer dot products with 32-bit accumulation). As of today, Intel has onboarded major brands in the media encoding software ecosystem to support Deep Link—HandBrake, OBS, XSplit, Topaz Gigapixel AI, Huya, Joyy, etc.—and is working with Blender, CyberLink, Fluendo, and Magix for full support in the coming months.
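
For reference, a single DP4a operation computes a dot product of four packed 8-bit integers and accumulates the result into a 32-bit integer; the sketch below illustrates that arithmetic (the dp4a() function name is ours for illustration, not an actual Intel intrinsic).

# What one DP4a-style operation computes: a dot product of four packed
# 8-bit integers, accumulated into a 32-bit result. The function name is
# illustrative only, not an actual Intel intrinsic.
def dp4a(a: list, b: list, c: int = 0) -> int:
    assert len(a) == len(b) == 4              # four int8 lanes per operand
    acc = c + sum(x * y for x, y in zip(a, b))
    return ((acc + 2**31) % 2**32) - 2**31    # wrap to signed 32-bit

# Example: one step of an int8 convolution/GEMM inner loop.
print(dp4a([127, -3, 5, 8], [2, 4, -6, 1]))   # -> 220

Quantized inference spends most of its time in exactly these int8 multiply-accumulate loops, which is why a DP4a path on both the iGPU and dGPU lets Deep Link spread DNN work across the two.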
Under the hood, the Iris Xe MAX, as we mentioned earlier, is built on the 10 nm SuperFin process. This is a brand new piece of silicon, and not a "Tiger Lake" with its CPU component disabled, as its specs might otherwise suggest. It features 96 Xe execution units (EUs), translating to 768 programmable shaders, along with 96 TMUs and 24 ROPs. Its LPDDR4X memory interface provides 68 GB/s of memory bandwidth, and the GPU is clocked at 1.65 GHz. It talks to "Tiger Lake" processors over a common PCI-Express 4.0 x4 bus. Notebooks with Iris Xe MAX ship with both the iGPU and dGPU enabled, to leverage Deep Link.
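
The headline figures hang together under some quick arithmetic. The sketch below assumes Xe LP's eight FP32 ALUs per EU and a 128-bit LPDDR4X-4266 memory interface; neither detail is spelled out in the press material, but both are consistent with the 768-shader and 68 GB/s figures quoted above.

# Sanity check of the quoted specs. Assumes 8 FP32 ALUs per Xe LP EU and a
# 128-bit LPDDR4X interface at 4266 MT/s, which line up with the shader
# count and ~68 GB/s bandwidth figures quoted above.
eus, alus_per_eu, clock_ghz = 96, 8, 1.65
shaders = eus * alus_per_eu                       # 768 programmable shaders
fp32_tflops = shaders * 2 * clock_ghz / 1000      # 2 FLOPs per FMA -> ~2.53 TFLOPS

bus_bits, transfer_rate_mts = 128, 4266
bandwidth_gbs = bus_bits / 8 * transfer_rate_mts / 1000   # ~68.3 GB/s

print(shaders, round(fp32_tflops, 2), round(bandwidth_gbs, 1))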
Media and AI only paint half the picture; the other half is gaming. Intel is taking a swing at the 1080p mainstream gaming segment, with the Iris Xe MAX offering over 30 FPS (playable) in AAA games at 1080p. It trades blows with notebooks that use the NVIDIA GeForce MX450 discrete GPU. We reckon that most e-sports titles should be playable at over 45 FPS at 1080p. Over the coming months, one should expect Intel and its ISVs to invest more in GameDev Boost, which should increase performance further. The Xe LP architecture supports DirectX 12, including Tier 1 Variable Rate Shading.
But what about other mobile platforms, and desktop, you ask? The Iris Xe MAX is debuting exclusively in thin-and-light notebooks based on 11th Gen Core "Tiger Lake" processors, but Intel plans to develop desktop add-in cards with Iris Xe MAX GPUs sometime in the first half of 2021. We predict that, if priced right, such a card could sell in droves to the creator community, which could leverage its media encoding and AI DNN acceleration capabilities. It should also appeal to HEDT and mission-critical workstation buyers who need basic discrete graphics while keeping the number of hardware and software vendors in their stack to a minimum.
Update Nov 1st: Intel clarified that the desktop Iris Xe MAX add-in card will be sold exclusively to OEMs for pre-builts.

The complete press-deck follows.

74 Comments on Intel Storms into 1080p Gaming and Creator Markets with Iris Xe MAX Mobile GPUs

#51
fynxer
HAHA, dGPU with 30 FPS @1080p cannot be called gaming.

To be able to call it gaming it should be at least 60 FPS @1080p

30FPS @1080p is gaming on life support at best, you get a computer you can play older games

BUT what happens when that next gen games comes in a month and you get 20FPS in that game and it is unplayable, then you realized you wasted your money on this Intel gaming dGPU.
Posted on Reply
#52
RedelZaVedno
We'll need DDR5 for better iGPU performance. I believe APU with RDNA2 and Zen4 will be awesome budget 1080p gaming option IF DDR5 really launches with 6400MHz frequency (It'll probably be only 4800 at the beginning). I can see RDNA2+DDR5 iGPU reaching 1080p/60fps on high in most titles once SDRAM passes 6000 million transfers per second point.
Posted on Reply
#53
bonehead123
hahahahahahahahahahahahhahahahahahahahahahaha, SURELY THIS IS A HALLOWEEN JOKE, RIGHT ?

I'm laughin so hard, I need to part my hair just to take a dump...:roll:..:eek:..:D..:kookoo:..:clap:..:peace:
Posted on Reply
#54
Unregistered
MrMilli: That was never the point of ROCm. It facilitates CUDA code translation.

I think that most are too focused on the gaming aspects of this chip. While yes, you can game on this and its gaming performance is comparable to a MX350/GTX1050, that's not the main selling point of DG1. You don't even buy a laptop with a MX350 for gaming, do you? You mainly want it to improve content creation. Case in point, the three laptops it launches in. Not meant for gaming whatsoever.
The combined power of the Iris Xe and Iris Xe Max is nothing to be scoffed at considering the power envelope. FP16 => 8TFLOPS. That's GTX 1650m level.
I don't know if anybody here uses their computer for work but Gigapixel AI acceleration, video encoding, ... these things really matter for content creators.



In this multi-stream video encoding test, it beat an RTX 2080 & i9-10980HK combination. Can you imagine a puny lightweight Acer Swift 3 laptop beating a >€2000 heavy gaming laptop in any task? I would say mission accomplished.

We're going have to wait another year before Intel launches their actual gaming product, DG2.
Don't even think anyone reads your post. they just laughing coz intels first gpu cant do 1080/60fps
#55
bug
I guess it's a good thing they stormed into gaming then. If they took things more slowly, they'd be crushing Adreno 400 now instead of MX450.
Posted on Reply
#56
dyonoctis
MrMilli: That was never the point of ROCm. It facilitates CUDA code translation.
My bad, but it's still a shame that it's only for HPC. As things stand now, it looks like you'll get 3 brands for 3 types of users: AMD for pure gamers, Nvidia for 3D and VFX, Intel for video editors.
Posted on Reply
#57
dicktracy
It's just a basic iGPU for small laptop. Calm down, AMD activists.
Posted on Reply
#58
TechLurker
mkppo: The article is incorrect. Anandtech has a detailed writeup and basically there's no multi-gpu support at all for games, nor any plan to in the future. Only certain productivity workloads can utilize them together.

It's basically the same speed as the Tiger Lake iGPU, but gives added flexibility and speed for those content creation workloads.

"Storms into the market" LMAO
Well that's real disappointing. And here I thought Intel might have actually beaten AMD to a modern multi-GPU implementation.
Posted on Reply
#59
GreiverBlade
well ... reading that title ... i read "storm in a glass of water" ... (tempest in a teapot basically) oh well, good enough for a basic ultraportable dGP but nothing exceptional, encoding side it's not bad tho...
also, if it's their "Max" i wonder what is their "Min" if the "Max" only compete vs a MX450

classical Intel tho ... they need to revise their PR strategy :laugh: some slide give a very childish vibe ...
Posted on Reply
#60
AnarchoPrimitiv
TechLurker: Well that's real disappointing. And here I thought Intel might have actually beaten AMD to a modern multi-GPU implementation.
AMD already did an iGPU+dGPU crossfire years ago in systems using their APUs and dGPUs. As far as MCMs go, AMD is already on that, they have a few recent patents for something called "GPU masking" where multiple GPUs are seen as a single, logical device by the API/OS. I'm guessing that this will be a main part of their upcoming MCM dGPUs to come...
Posted on Reply
#61
bug
GreiverBlade: well ... reading that title ... i read "storm in a glass of water" ... (tempest in a teapot basically) oh well, good enough for a basic ultraportable dGP but nothing exceptional, encoding side it's not bad tho...
also, if it's their "Max" i wonder what is their "Min" if the "Max" only compete vs a MX450

classical Intel tho ... they need to revise their PR strategy :laugh: some slide give a very childish vibe ...
The thing is, a dGPU is about the worst thing for an ultraportable...
Posted on Reply
#62
GreiverBlade
bug: The thing is, a dGPU is about the worst thing for an ultraportable...
yeah i just realized that after xD, but as Intel designed it then it's good enough for the worst thing to be in an ultraportable...
[joke]typical intel[/joke] "we are the best at that because no one did better than us at doing the best for the worst part of something!" :laugh:
Posted on Reply
#63
bug
GreiverBlade: yeah i just realized that after xD, but as Intel designed it then it's good enough for the worst thing to be in an ultraportable...
[joke]typical intel[/joke] "we are the best at that because no one did better than us at doing the best for the worst part of something!" :laugh:
I don't think that's what they meant to do. It's just what they ended up with isn't good for anything else.
Posted on Reply
#64
GreiverBlade
bug: I don't think that's what they meant to do. It's just what they ended up with isn't good for anything else.
oh no, i am sure it's what they aimed to do ... their recent PR "we are king of yaddah yaddah real life perf yaddah yaddah gaming crown" are kinda backfiring thus they needed a new "we are the best for [insert new product]"

take it as a joke ;)
Posted on Reply
#65
ValenOne
GeForce MX450 is based on TU117 with 64-bit bus.
Posted on Reply
#66
bug
rvalencia: GeForce MX450 is based on TU117 with 64-bit bus.
I wouldn't know. I have never considered those for anything. For me, it's mid-range dGPU (or better), or just stick to the IGP. I have no use for parts so low-end, you need a different nomenclature to describe them.
Posted on Reply
#67
MrMilli
tigger: Don't even think anyone reads your post. they just laughing coz intels first gpu cant do 1080/60fps
Yeah, I didn't expect anything else.
Posted on Reply
#68
Minus Infinity
More like Intel stormed into mediocrity. Yeah yeah, it's a start, but surely they should have set the bar higher than the crapulent MX350, given that it will probably be replaced within 3-6 months and the next gen will be vastly better.
Posted on Reply
#69
thesmokingman
TechLurker: Well that's real disappointing. And here I thought Intel might have actually beaten AMD to a modern multi-GPU implementation.
What was old is new again? Lmao... Intel, smh.
Posted on Reply
#70
londiste
So, how much of this ridicule is due to what Intel has actually said and how much is from a clickbait headline?
Intel's slides are about mobile creation and integration with iGPU, gaming is mentioned less prominently.
Posted on Reply
#71
chstamos
fynxer: HAHA, dGPU with 30 FPS @1080p cannot be called gaming.

To be able to call it gaming it should be at least 60 FPS @1080p

30FPS @1080p is gaming on life support at best, you get a computer you can play older games

BUT what happens when that next gen games comes in a month and you get 20FPS in that game and it is unplayable, then you realized you wasted your money on this Intel gaming dGPU.
I don't think it's even that. It's slower than a Geforce 1050, and 1050 cannot even run high settings on 1080p/30fps.

The negative reactions are probably because nvidia and AMD are neglecting the low-mid end and people (myself included) are anxious for a competitive low end card. Intel overpromised "enthusiast" level discrete cards, so we were hoping that no matter how much they underdelivered, we'd at least get a respectable 100-120 or 200 dollar price-range GPU.

Maybe we will, next year. Doesn't seem promising so far, I have to say.

So far we've gotten a look at a crap discrete GPU back in February and an iterative igpu step with Xe branding.

I'm still hoping intel gets it right, eventually, because GPU prices are ridiculous right now.
Posted on Reply
#72
seth1911
There are some Firestrike Scores out, it scores between 1050 Mobile and MX 450.
Amd have nothing in this Class, only a RX 550 and this Crap is slower than a GT 1030:laugh:


Yeah sure AMD have the APU, but the Ryzen 3 3400G is still slower than a GT 1030 (GDDR5) in Games:slap:
fynxer: HAHA, dGPU with 30 FPS @1080p cannot be called gaming.
Yeah Gaming is exclusive for the Great 1000$ Card buyers, i play atm on a massive OC GT 710 GDDR5 my Games like WoW and FFXIV runs very well with it.
Posted on Reply
#73
Mouth of Sauron
I have very mixed feelings about this.

1) This essentially looks a lot like GPU-coprocessor (thanks to Anandtech for a metaphor), which is good, of course. There is a question of cost and software which uses it (which I do expect to grow).
2) One thing I passionately hate Intel for is the active part they took in HSA's downfall, and *now* they are implementing it themselves. Hypocrisy of the highest possible level (oh, and people on Anandtech missed it completely). And their solution is proprietary, as I understand it. Disgusting.

On the other hand, I highly doubt that AMD and NVIDIA will have troubles making competitive products (on die or separate), and open standard would be nice (one already exists, *coughs*).
Posted on Reply
#74
TechLurker
AnarchoPrimitivAMD already did an iGPU+dGPU crossfire years ago in systems using their APUs and dGPUs. As far as MCMs go, AMD is already on that, they have a few recent patents for something called "GPU masking" where multiple GPUs are seen as a single, logical device by the API/OS. I'm guessing that this will be a main part of their upcoming MCM dGPUs to come...
Between that and AMD's 4/8-way specialized communication over Infinity Architecture that lets various Instinct Accelerators talk directly amongst themselves as well as with EPYC, it does feel like AMD could potentially nail MCM GPUs and some sort of heterogeneous fusion of CPU+GPU appearing as a singular, giant APU within the next year or two (probably along with a homage to AMD Fusion).

I'm familiar with the asymmetric Crossfire capability; but it never quite lived up to its potential at the time. I was thinking Intel had beaten AMD to a modern implementation of the same concept with this announcement, but it turns out that was just misleading and not fully usable that way; rather, it's only usable in content creation rather than all the time.
Posted on Reply