
Intel's First Discrete Graphics Solution, Iris Xe MAX, Debuts in Acer's Swift 3X Featuring Intel 11th Gen Tiger Lake

Raevenlord

News Editor
Acer today announced the Swift 3X, a new laptop that will give consumers their first taste of Intel's discrete graphics solution powered by Xe. Remember that Xe is Intel's first discrete-class graphics architecture, whose development was helmed by former AMD graphics head Raja Koduri after Intel hired him just a week after he tendered his resignation at AMD. This is the first materialization of an Intel-developed discrete graphics product for the consumer market, and thus should lift the lid on Intel's Xe performance. Whether or not the blue giant cements itself, at first try, as a third player in the discrete graphics accelerator space depends on the performance of this architecture.

The Swift 3X features the new Intel Iris Xe MAX discrete graphics solution paired with 11th Gen Intel Core processors "in order to offer creative professionals such as photographers and YouTubers unique capabilities and powerful on-the-go performance for work and gaming." The Swift 3X comes in at 1.37 kg (3.02 lbs), and Acer quotes up to 17.5 hours of uptime on a single charge; if necessary, the Swift 3X can also be fast-charged, providing four hours of use from just 30 minutes of charging.

The Swift 3X features a 14-inch FHD IPS screen that covers 72% of the NTSC color gamut with an 84% screen-to-body ratio, offers Intel Wi-Fi 6 (Gig+), and provides USB-C, Thunderbolt 4, and USB 3.2 Gen 2 ports for expanded connectivity.

Jerry Kao said:
The Swift series has always been about pushing the envelope, trying to fit as much power into as portable a package as possible. The new Swift 3X continues that mindset, with discrete graphics in a sleek chassis for those who need style and performance on the go.



Chris Walker said:
It is exciting to see the new Aspire, Spin and Swift series of laptops take advantage of the real-world performance and platform integration delivered in new 11th Gen Intel Core processors. With the Swift 3X, we've partnered closely with Acer to unlock new capabilities for creators on thin-and-light laptops with the unmatched performance of 11th Gen plus the all-new Intel Iris Xe MAX discrete graphics.

Price and Availability
The Acer Swift 3X (SF314-510G) will be available in North America in December starting at USD 899.99; in EMEA in November starting at EUR 849; and in China in October starting at RMB 4,999.

 
It would be hilarious to see this paired with an AMD CPU.
 
Such a GPU, aimed at YouTubers...
 
I'm looking forward to seeing how the discrete GPU performs; I'm hopeful, honest.
 
and in China in October,
Well, I'm expecting some reviews coming from China, then. It's unlikely that whatever Intel launches during 2020-2021 could push me into buying an Intel dGPU (I'm almost set on acquiring an Nvidia GPU, mostly for FAH), but I'm still interested.
I'm looking forward to seeing how the discrete GPU performs; I'm hopeful, honest.
More competition is good.
 
I wouldn't hold my breath for anything more than "productivity-level" performance.
As far as current leaks go, it's the same 96 EUs, but with a 64-bit memory interface running 4 GB of LPDDR4X and a much tighter thermal/power envelope.
In other words, it's meant to replace the likes of the MX200/MX300 series in the thin-and-light segment, since the MX450 hasn't really materialized yet.
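For a sense of scale, here's a quick back-of-the-envelope peak-bandwidth estimate from those leaked numbers (a rough sketch; the LPDDR4X-4266 speed grade is my assumption, not part of the leak):

```python
# Rough peak-bandwidth estimate for the leaked Iris Xe MAX memory config.
# Assumption (mine, not from the leak): LPDDR4X-4266, i.e. 4266 MT/s.
bus_width_bits = 64            # leaked 64-bit memory interface
transfer_rate_mts = 4266       # assumed LPDDR4X speed grade (MT/s)

bytes_per_transfer = bus_width_bits / 8                        # 8 bytes per transfer
peak_bandwidth_gbs = bytes_per_transfer * transfer_rate_mts / 1000

print(f"Peak bandwidth: {peak_bandwidth_gbs:.1f} GB/s")        # ~34.1 GB/s
```

If the leak is right, ~34 GB/s is roughly a quarter of what even an entry-level desktop card like the GTX 1650 (128 GB/s) has to work with, which lines up with the "productivity-level" expectation.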
 
and thus should lift the lid on Intel's Xe performance. Whether or not the blue giant cements itself, at first try, as a third player in the discrete graphics accelerator space depends on the performance of this architecture.

Not really. IMO that will be decided by whether they make discrete PCIe x16 cards.

This appears to be a separate low-power mobile chip. I don't expect anything from this at all, even if it can technically be called "discrete".
 
Not really. IMO that will be decided by whether they make discrete PCIe x16 cards.

This appears to be a separate low-power mobile chip. I don't expect anything from this at all, even if it can technically be called "discrete".
True, they're teaming up with Acer; I'd be surprised if I liked it in use, is all I'll say.
 
True, they're teaming up with Acer; I'd be surprised if I liked it in use, is all I'll say.

Shit, it being Acer, I'd be surprised if I liked using it regardless of what was in it.
 
The more I hear about Xe dGPU and Tiger Lake Xe IGP, the less excited I am.

Xe's IGP was touted as having 40% better performance than Renoir; that turned out to actually be 15% better than Renoir, but only in cherry-picked synthetics, and it's actually worse than Vega 7 (not Vega 8) in real-world gaming tests.

Given that Xe as a dGPU was never promising that much in the first place, if it misses the mark as much as the laptops have then we've got a damp squib on our hands in Q1 2021.

Competition is good, but it has to actually compete to be useful to us consumers.
 
Not really hopeful about the performance at this price, but Intel has to start somewhere. Kudos to Acer for experimenting.
 
WOW! All this time and this is all Intel has come up with so far?
 
I wouldn't hold my breath for anything more than "productivity-level" performance.
As far as current leaks go, it's the same 96 EUs, but with a 64-bit memory interface running 4 GB of LPDDR4X and a much tighter thermal/power envelope.
In other words, it's meant to replace the likes of the MX200/MX300 series in the thin-and-light segment, since the MX450 hasn't really materialized yet.

IMO Nvidia might want to clean up some GP107 stock before actually deploying the MX450; in Hong Kong there have been at least three new laptops with 1050 variants in the last couple of weeks.
 
Not really. IMO that will be decided by whether they make discrete PCIe x16 cards.

This appears to be a separate low-power mobile chip. I don't expect anything from this at all, even if it can technically be called "discrete".

Exactly.

This is just a rebrand of Intel's IGP, trying to look fancy running the Windows desktop. At least they managed to cram another X into the name; can't go wrong.

Laptops are the worst possible way to show off your GPU performance, especially in this thin-and-light form factor.

I've still not seen a sliver of what would make Intel's Xe even remotely compete with the current midrange.
 
IMO Nvidia might want to clean up some GP107 stock before actually deploying the MX450; in Hong Kong there have been at least three new laptops with 1050 variants in the last couple of weeks.
Or the yield on Turing is so good that there aren't that many chopped-up TU117s left for ultrabooks.
Either way, this has snowballed into things like Xe graphics on Acer, and the Lenovo ThinkPad E series switching to integrated Vega on AMD variants and the discrete RX 640 on Intel variants.
 
The more I hear about Xe dGPU and Tiger Lake Xe IGP, the less excited I am.

Xe's IGP was touted as having 40% better performance than Renoir; that turned out to actually be 15% better than Renoir, but only in cherry-picked synthetics, and it's actually worse than Vega 7 (not Vega 8) in real-world gaming tests.

Given that Xe as a dGPU was never promising that much in the first place, if it misses the mark as much as the laptops have then we've got a damp squib on our hands in Q1 2021.

Competition is good, but it has to actually compete to be useful to us consumers.

I have little confidence in Intel's Tiger Lake Xe graphics, so I am not surprised by the results. But objectively, I would say Intel did pretty well for their first try at graphics. The poor results from Tiger Lake could be attributed to (1) poor driver support in games, and (2) a lack of power/cooling headroom.
 
Methinks 10 nm, but that's a guess. I wouldn't be so quick to knock it; Intel is quite good at pulling rabbits out of hats :). The thing that's got me interested with this lappy is the battery life, 17.5 hours. At the mo, when I'm out in the field astro imaging, I have to lug around 2x 12 V power tanks; I could probably lose the one I use to top my lappy up.
 
Considering that 5,000 yuan is a significantly lower sum than the 900 dollar or 850 euro price, there must be an extra-shitty version for the Chinese market, or the rest of the world is getting shafted with those prices.
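For illustration, a quick conversion at rough exchange rates from around launch (the 6.7 RMB/USD and 0.85 EUR/USD figures are my ballpark assumptions, not exact rates):

```python
# Back-of-the-envelope comparison of the three launch prices in USD.
# Exchange rates are approximate autumn-2020 figures, for illustration only.
RMB_PER_USD = 6.7     # assumed CNY per USD
EUR_PER_USD = 0.85    # assumed EUR per USD

prices_usd = {
    "North America": 899.99,               # listed directly in USD
    "EMEA": 849 / EUR_PER_USD,             # EUR 849  -> ~USD 999
    "China": 4999 / RMB_PER_USD,           # RMB 4,999 -> ~USD 746
}

for region, usd in prices_usd.items():
    print(f"{region}: ~USD {usd:.0f}")
```

That puts the Chinese price roughly USD 150 below the North American one (and the EMEA price about USD 100 above it), so the gap checks out unless the Chinese SKU is a lower-spec configuration.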
 
I have little confidence in Intel's Tiger Lake Xe graphics, so I am not surprised by the results. But objectively, I would say Intel did pretty well for their first try at graphics. The poor results from Tiger Lake could be attributed to (1) poor driver support in games, and (2) a lack of power/cooling headroom.
Not really their first try at graphics. They've been making (bad) graphics solutions for 22 years, and for the last three years they've had AMD's previous lead graphics architect working on Xe. Raja Koduri knows how to make a GPU, so if Xe sucks, it's because Intel aren't currently very good at fabricating silicon or writing drivers.

Raja didn't lead AMD through its best products of the last two decades, but he certainly moved things on successfully and managed to get architectural improvements out of 28 nm over and over again when all the fabs screwed up at once and we were stuck on 28 nm for 3-4 generations.
 
Not really their first try at graphics. They've been making (bad) graphics solutions for 22 years, and for the last three years they've had AMD's previous lead graphics architect working on Xe. Raja Koduri knows how to make a GPU, so if Xe sucks, it's because Intel aren't currently very good at fabricating silicon or writing drivers.

Raja didn't lead AMD through its best products of the last two decades, but he certainly moved things on successfully and managed to get architectural improvements out of 28 nm over and over again when all the fabs screwed up at once and we were stuck on 28 nm for 3-4 generations.

Exactly. I know a lot of us don't consider integrated graphics as part of the GPU game, but it is. Obviously not on the same level as AMD or Nvidia, and I'm not naïve enough to imply that scaling up an IGP into a discrete solution is feasible or competitive, but they are not completely new to the graphics arena. True, they are still pretty green (in the inexperienced sense of the word) when it comes to discrete graphics, so all in all... I guess I'm not saying much.
 