Thursday, April 16th 2020

Intel Gen12 Xe iGPU Could Match AMD's Vega-based iGPUs

Intel's first integrated graphics solution based on its ambitious new Xe graphics architecture could match AMD's "Vega" architecture-based iGPU solutions, such as the one found in its latest Ryzen 4000 series "Renoir" processors, according to leaked 3DMark FireStrike numbers put out by @_rogame. Benchmark results of a prototype laptop based on Intel's "Tiger Lake-U" processor surfaced on the 3DMark database. This processor embeds Intel's Gen12 Xe iGPU, which is purported to offer significant performance gains over current Gen11 and Gen9.5 based iGPUs.

The prototype 2-core/4-thread "Tiger Lake-U" processor with Gen12 graphics yields a 3DMark FireStrike score of 2,196 points, with a graphics score of 2,467 and a physics score of 6,488. These scores are comparable to those of 8 CU Radeon Vega iGPU solutions. "Renoir" tops out at 8 CUs, but shores up performance to 11 CU "Picasso" levels by other means: tapping the 7 nm process to increase engine clocks, improving the boosting algorithm, and modernizing the display and multimedia engines. Beyond that, AMD's iGPU is largely based on the same three-year-old "Vega" architecture. Intel Gen12 Xe makes its debut with the "Tiger Lake" microarchitecture slated for 2021.
Source: _rogame (Twitter)

45 Comments on Intel Gen12 Xe iGPU Could Match AMD's Vega-based iGPUs

#1
qcmadness
We need actual products rather than a paper launch.
Posted on Reply
#2
Melvis
Cool, they "might" be able to match an arch that's 3 yrs old now... and with Navi just around the corner this is pretty much DOA.
Posted on Reply
#3
voltage
I cannot believe I have to wait another year... I have been waiting for this for nearly 5 years. hmmm
Posted on Reply
#4
Cheeseball
Not a Potato
That score is around 8% to 10% higher than the i5-1035G4's (48 EUs). So if we scale up to a possible 96 EUs, it can probably match a GTX 1050 or 1050 Ti. That's not bad for an iGPU.
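A back-of-the-envelope version of that extrapolation, purely as a sketch: the 48 EU count for the leaked part, perfect linear scaling, and the absence of bandwidth or power limits are all assumptions, not confirmed figures.

```python
# Naive EU-scaling estimate - illustrative only; assumes perfectly linear
# scaling with EU count and ignores memory bandwidth and power limits.
leaked_graphics_score = 2467   # 3DMark FireStrike graphics score from the leak
assumed_leak_eus = 48          # assumption: the leaked prototype has 48 EUs enabled
target_eus = 96                # the "possible 96 EUs" mentioned above

scaled_score = leaked_graphics_score * target_eus / assumed_leak_eus
print(f"Hypothetical 96 EU graphics score: ~{scaled_score:.0f}")  # ~4934
# In practice scaling would be well below linear, since the iGPU shares
# memory bandwidth and package power with the CPU cores.
```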
Melvis: Cool, they "might" be able to match an arch that's 3 yrs old now... and with Navi just around the corner this is pretty much DOA.
Technically Intel did match it with the i7-1065G7 (64 EUs) as it was very competitive in GPU performance with the older Vega 8 (non-Renoir).
Posted on Reply
#5
R0H1T
Melvis: Cool, they "might" be able to match an arch that's 3 yrs old now... and with Navi just around the corner this is pretty much DOA.
Yeah by the time this is available in any meaningful quantities AMD will likely have RDNA2(?) based APU ready for launch, at this point AMD is their own competition!
Posted on Reply
#6
NC37
Given that AMD has simply rehashed the same design since APUs began, it makes sense that Intel can catch them. The core problem is the shared VRAM. AMD solves for it in consoles but does nothing for the PC market. All Intel needs to do is solve for that and they can beat any APU. Which they have shown a willingness to do in their IRIS platform.
Posted on Reply
#7
Vya Domus
NC37: All Intel needs to do is solve for that and they can beat any APU. Which they have shown a willingness to do in their IRIS platform.
There is nothing to "beat"; you simply need faster memory, and at this point that means DDR5. AMD had the closest thing to a solution with the HBC thing, but that never made its way to APUs.
voltage: I cannot believe I have to wait another year... I have been waiting for this for nearly 5 years. hmmm
Why would you even wait so long for this?
Posted on Reply
#8
SL2
qcmadness: We need actual products rather than a paper launch.
Which paper launch are you talking about?
TPU: The prototype..
Posted on Reply
#9
IceShroom
Cheeseball: Technically Intel did match it with the i7-1065G7 (64 EUs) as it was very competitive in GPU performance with the older Vega 8 (non-Renoir).
Technically it matched the performance, but with worse frametimes. And from my Haswell iGPU experience, Intel will not support the GPU for more than 2-3 years.
Posted on Reply
#10
Valantar
Melvis: Cool, they "might" be able to match an arch that's 3 yrs old now... and with Navi just around the corner this is pretty much DOA.
Pretty sure they mean to say they can match Renoir performance. Then again, once next gen APUs arrive (likely 3-ish months after these) with RDNA (2 would be nice, but likely 1) these will once again be significantly behind. Especially if AMD gets an LPDDR5 controller going by then.
Posted on Reply
#11
john_
Competition in the iGPU market. This can only be good for the casual consumer who doesn't want to pay extra for a low- to mid-range GPU.
Posted on Reply
#12
medi01
Vega in the 4000 series is a nice piece; it comfortably beats the MX250/330 (the latter is a rebrand).
Posted on Reply
#13
yeeeeman
qcmadness: We need actual products rather than a paper launch.
This is a leak, not a paper launch.
Posted on Reply
#14
holyprof
Matching an AMD iGPU in a synthetic benchmark is one thing, matching AMD's (or NVidia's) game software optimizations... is another.
The only thing Intel can do is bribe game developers (like NV has done before - TWIMTBP) to "optimize" their games to run faster on Intel GPU arch (or artificially slow down on non-Intel GPUs). They have done that in the past with CPUs, so why not.

Fanboy stuff aside, more players means more options so I'm happy. This gen I bought an AMD CPU + NVidia GPU, maybe next will be Intel (probably my next laptop), so I'm cheering for them - I rarely game on my laptop, but a nice iGPU is always welcome for some casual gaming outside my home.
Posted on Reply
#15
Valantar
holyprof: Matching an AMD iGPU in a synthetic benchmark is one thing, matching AMD's (or NVidia's) game software optimizations... is another.
The only thing Intel can do is bribe game developers (like NV has done before - TWIMTBP) to "optimize" their games to run faster on Intel GPU arch (or artificially slow down on non-Intel GPUs). They have done that in the past with CPUs, so why not.

Fanboy stuff aside, more players means more options so I'm happy. This gen I bought an AMD CPU + NVidia GPU, maybe next will be Intel (probably my next laptop), so I'm cheering for them - I rarely game on my laptop, but a nice iGPU is always welcome for some casual gaming outside my home.
I have to ask (not critical, just curious): why will your next laptop probably be Intel? The way things are looking now, they have some serious catching up to do for AMD not to have an advantage in every aspect.
Posted on Reply
#16
holyprof
Valantar: I have to ask (not critical, just curious): why will your next laptop probably be Intel? The way things are looking now, they have some serious catching up to do for AMD not to have an advantage in every aspect.
I agree with you.
I'm not saying it will be. After seeing the new AMD APUs, and having tested the Ryzen 2200G vs. i5-4690K iGPU myself, I decided my next laptop would be AMD-based. Now I just saw that I don't need to narrow down my choice, because Intel might be back in the iGPU game.
Posted on Reply
#17
Darmok N Jalad
I don’t think AMD’s current IGP design is really their best effort. With 4000, they were able to drop the number of CUs and ramp up clocks and still supposedly get similar/better performance. As others said, they are dealing with memory bandwidth issues, and I think once that gets better, they can put more effort into iGPU performance. Until then, there’s only so much they can do.
Posted on Reply
#18
zlobby
Yeah, and we COULD be out there, playing in the parks and sitting in a fancy restaurant, but we are not. So much for the 'could' part.

As someone above mentioned, AMD are not even focusing on the iGPU uarch in mobile Ryzen 4000 series. They just significantly bumped the clocks, and together with the new high-speed DDR4 support in Ryzen 4000, these little chips will still pack a punch.

Also, last time I checked intel's GPU drivers were even worse than AMD's. This can change any time now but I wouldn't bet too much on it.

My bet is mobile Ryzen 4000 will hold up against anything intel has to release this year, or even the next. If the friggin virus doesn't screw the fabs' latest 5 nm process, I expect intel to lose even the next year due to the process inferiority itself.

Heck, even 2700U could decode and output 4K@60fps 10-bit color in hardware. What was the cheapest intel iGPU that can do this?
Posted on Reply
#19
Valantar
Darmok N Jalad: I don’t think AMD’s current IGP design is really their best effort. With 4000, they were able to drop the number of CUs and ramp up clocks and still supposedly get similar/better performance. As others said, they are dealing with memory bandwidth issues, and I think once that gets better, they can put more effort into iGPU performance. Until then, there’s only so much they can do.
A significant portion of their current-gen performance increase is due to moving to LPDDR4X-4266, which nearly doubles bandwidth from most previous-gen implementations (DDR4-2400 has been the most common). The clock speed bump and being supported by a faster CPU with better IPC and efficiency also obviously helps, as they can allocate more power to the iGPU while still delivering more CPU performance than last time around. DDR4-3200 laptops will be better than previous-gen ones, but not by much - from Anandtech's G14 review the 4900HS beats previous gen 8CU desktop APUs, but not 11CU ones. Of course these have more power headroom, but memory should be the same or slightly slower for the desktop chips.
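For reference, here is a quick sketch of the peak theoretical numbers behind that bandwidth comparison, assuming a 128-bit total memory bus in every case (dual-channel DDR4, or e.g. a 4x32-bit LPDDR4X configuration); actual laptop configurations may differ.

```python
# Peak theoretical memory bandwidth for a 128-bit total bus width.
def bandwidth_gbps(transfers_per_s: int, bus_width_bits: int = 128) -> float:
    """MT/s * bits per transfer, converted to GB/s."""
    return transfers_per_s * bus_width_bits / 8 / 1000

for name, speed in [("DDR4-2400", 2400), ("DDR4-3200", 3200), ("LPDDR4X-4266", 4266)]:
    print(f"{name:13s} ~{bandwidth_gbps(speed):5.1f} GB/s")
# DDR4-2400     ~ 38.4 GB/s
# DDR4-3200     ~ 51.2 GB/s
# LPDDR4X-4266  ~ 68.3 GB/s  -> roughly 1.8x DDR4-2400, hence "nearly doubles"
```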

The implementation of Vega is likely their best effort within the constraints of a mass-market mobile chip at the time of its design - one where die area needs to stay reasonable (without sacrificing cores or maximum CPU performance, which would necessitate an expensive second H-series die) and development schedules for GPU architectures don't align well with APU development schedules (RDNA likely wasn't ready when Renoir was designed). Still, there are obvious roads towards better iGPU performance now and in the future, mainly in implementing RDNA with its near 50% perf/CU increase (when comparing the 5700 XT vs. the Radeon VII, which are on the same node) - which was obviously ready when design work on next-gen APUs started even if it wasn't for the 4000-series. If they get LPDDR5 into the next-gen chips (which is unlikely given that it just arrived in mobile, LPDDR4X just hit PCs, and LPDDR4X hit mobile several years ago) that's another 25+% bandwidth increase.
Posted on Reply
#20
R0H1T
Valantar: A significant portion of their current-gen performance increase is due to moving to LPDDR4X-4266, which nearly doubles bandwidth from most previous-gen implementations (DDR4-2400 has been the most common). The clock speed bump and being supported by a faster CPU with better IPC and efficiency also obviously helps, as they can allocate more power to the iGPU while still delivering more CPU performance than last time around. DDR4-3200 laptops will be better than previous-gen ones, but not by much - from Anandtech's G14 review the 4900HS beats previous gen 8CU desktop APUs, but not 11CU ones. Of course these have more power headroom, but memory should be the same or slightly slower for the desktop chips.

The implementation of Vega is likely their best effort within the constraints of a mass-market mobile chip at the time of its design - one where die area needs to stay reasonable (without sacrificing cores or maximum CPU performance, which would necessitate an expensive second H-series die) and development schedules for GPU architectures don't align well with APU development schedules (RDNA likely wasn't ready when Renoir was designed). Still, there are obvious roads towards better iGPU performance now and in the future, mainly in implementing RDNA with its near 50% perf/CU increase (when comparing the 5700 XT vs. the Radeon VII, which are on the same node) - which was obviously ready when design work on next-gen APUs started even if it wasn't for the 4000-series. If they get LPDDR5 into the next-gen chips (which is unlikely given that it just arrived in mobile, LPDDR4X just hit PCs, and LPDDR4X hit mobile several years ago) that's another 25+% bandwidth increase.
Yeah, I don't remember seeing any major reviews publish their numbers with LPDDR4X @ 4266 MHz; the vast majority of reviews you are seeing have 2666 or 3200 MHz regular DDR4 & yet they smash every other Intel IGP out there & nearly match or beat the MX250 ~ in that sense there's still plenty of performance gains to be had. Remember, at CES we didn't have final retail versions of laptops nor drivers fine-tuned to make the IGP shine. I'd say (IGP) Vega is still king of the hill for about a year or so!
Posted on Reply
#21
Cheeseball
Not a Potato
zlobby: Heck, even 2700U could decode and output 4K@60fps 10-bit color in hardware. What was the cheapest intel iGPU that can do this?
Since the Intel HD Graphics 530 in the first Skylake CPUs (2015).
Posted on Reply
#22
zlobby
Cheeseball: Since the Intel HD Graphics 530 in the first Skylake CPUs (2015).
Yeah, no. HEVC on the HD 530 (the whole 500 series, actually) is limited to the Main 10 profile, i.e. 8-bit per sample or 10-bit with 4:2:0 chroma sampling. No 4:4:4 at 4K@60fps.

My entire post was about mobile processors. IDK how you got the idea of a 35 W "mobile" CPU, although back then that was a pretty low number.
Valantar: A significant portion of their current-gen performance increase is due to moving to LPDDR4X-4266, which nearly doubles bandwidth from most previous-gen implementations (DDR4-2400 has been the most common). The clock speed bump and being supported by a faster CPU with better IPC and efficiency also obviously helps, as they can allocate more power to the iGPU while still delivering more CPU performance than last time around. DDR4-3200 laptops will be better than previous-gen ones, but not by much - from Anandtech's G14 review the 4900HS beats previous gen 8CU desktop APUs, but not 11CU ones. Of course these have more power headroom, but memory should be the same or slightly slower for the desktop chips.

The implementation of Vega is likely their best effort within the constraints of a mass-market mobile chip at the time of its design - one where die area needs to stay reasonable (without sacrificing cores or maximum CPU performance, which would necessitate an expensive second H-series die) and development schedules for GPU architectures don't align well with APU development schedules (RDNA likely wasn't ready when Renoir was designed). Still, there are obvious roads towards better iGPU performance now and in the future, mainly in implementing RDNA with its near 50% perf/CU increase (when comparing the 5700 XT vs. the Radeon VII, which are on the same node) - which was obviously ready when design work on next-gen APUs started even if it wasn't for the 4000-series. If they get LPDDR5 into the next-gen chips (which is unlikely given that it just arrived in mobile, LPDDR4X just hit PCs, and LPDDR4X hit mobile several years ago) that's another 25+% bandwidth increase.
We really don't know if RDNA was made with mobile in mind, or at least I don't. Porting RDNA to mobile may not bring much benefit if it was not tailored for mobile platforms in the first place.
Posted on Reply
#23
Melvis
Valantar: Pretty sure they mean to say they can match Renoir performance. Then again, once next gen APUs arrive (likely 3-ish months after these) with RDNA (2 would be nice, but likely 1) these will once again be significantly behind. Especially if AMD gets an LPDDR5 controller going by then.
That's exactly what I meant :laugh: since RDNA isn't 3 yrs old yet.

Yep, that's right, once RDNA is implemented into these APUs their iGPU is once again way behind.
Posted on Reply
#24
Vayra86
Melvis: Cool, they "might" be able to match an arch that's 3 yrs old now... and with Navi just around the corner this is pretty much DOA.
Raja at his finest, eh. Xe so far is nothing more than a marketing slide and a weak dev kit. These IGPs are just another iteration of what they have always had. Xe is branding. But branding offers zero FPS.

Meanwhile the momentum of Xe discrete seems to have died down right around the time Navi released and actually brought a perf/watt jump. I think Intel is slowly but surely seeing this is yet another area where they lack the node and are just behind the curve... even the curve of the GPU underdog, go figure.
Posted on Reply
#25
Valantar
zlobby: We really don't know if RDNA was made with mobile in mind, or at least I don't. Porting RDNA to mobile may not bring much benefit if it was not tailored for mobile platforms in the first place.
Wait, what? A GPU architecture is a GPU architecture, and AMD builds all of theirs to be modular and scalable (which, frankly, all GPU architectures are to some extent due to the parallel nature of the workload). The only criterion for it being "built for mobile" or not is efficiency, where RDNA clobbers GCN - a 40 CU 5700 XT at ~220W matches or beats a 60 CU Radeon VII at ~275W on the same node after all, and that's the least efficient implementation of RDNA. AMD has specifically said that RDNA is their GPU architecture (singular) for the coming decade, so GCN is going the way of the dodo in all markets - it's just that iGPUs generally lag a bit architecturally (due to having to combine multiple architectures it's more of a challenge to have everything line up properly, leading to delays). Of course they also have CDNA for compute accelerators, but those aren't technically GPUs. Of course RDNA dGPUs all have GDDR6, which is a significant advantage compared to any laptop or desktop platform, but the advantage isn't any bigger than in the DDR3/GDDR5 era - and arguably current/upcoming LPDDR4X designs are much more usable than anything previous. I would be shocked if next-gen APUs didn't use some version of RDNA, as there is absolutely no reason for them not to implement it at this point.
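The arithmetic behind those efficiency figures, as a minimal sketch using the CU counts and board-power numbers quoted above and the simplifying assumption that the two cards deliver roughly equal performance:

```python
# Rough per-CU and per-watt comparison of RDNA vs. GCN at (assumed) equal performance.
navi_cus, navi_watts = 40, 220    # Radeon RX 5700 XT (RDNA), ~220 W
vega_cus, vega_watts = 60, 275    # Radeon VII (GCN "Vega 20"), ~275 W

perf_per_cu_gain = vega_cus / navi_cus - 1        # 0.50 -> ~50% more performance per CU
perf_per_watt_gain = vega_watts / navi_watts - 1  # 0.25 -> ~25% more performance per watt
print(f"~{perf_per_cu_gain:.0%} higher perf/CU, ~{perf_per_watt_gain:.0%} higher perf/W")
# ~50% higher perf/CU, ~25% higher perf/W
```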
R0H1T: Yeah, I don't remember seeing any major reviews publish their numbers with LPDDR4X @ 4266 MHz; the vast majority of reviews you are seeing have 2666 or 3200 MHz regular DDR4 & yet they smash every other Intel IGP out there & nearly match or beat the MX250 ~ in that sense there's still plenty of performance gains to be had. Remember, at CES we didn't have final retail versions of laptops nor drivers fine-tuned to make the IGP shine. I'd say (IGP) Vega is still king of the hill for about a year or so!
Have there been any proper reviews of the U-series at all? I've only seen leaked ones (that I trust to a reasonable extent, particularly that Notebookcheck Lenovo leak, though we don't know all the details of the configurations for those laptops), and otherwise there are H-series reviews with DDR4-3200 like the Asus G14. And as AnandTech has shown, that implementation roughly sits between the desktop 3200G and 3400G with DDR4-2933 (slightly closer to the 3400G on average), and soundly beats the 3500U (15W Picasso Vega 8) with DDR4-2400. Of course this is a 35W chip in a chassis with significant cooling capacity, so 15W versions might perform worse, but might make up for that or even beat it if they have LPDDR4X - all iGPUs are starved for bandwidth, after all. Also, at least according to that Notebookcheck leak, the (possibly 25W-configured) 4800U with LPDDR4X consistently beats the MX 250 and 330, while lagging about 5% behind the MX 350.

But as this news post says, it's possible that Tiger Lake Xe iGPUs can match it - though frankly I doubt that given Intel's driver track record. They have often managed to get close to AMD iGPUs with Iris Plus SKUs in synthetics like 3DMark, yet have consistently lagged far, far behind in real-world gaming. I expect a push for better and more frequently updated drivers with Xe, but it'll take time to get them out of the gutter. And by then, RDNA APUs will be here.
Posted on Reply