Thursday, January 9th 2020

Intel Unveils Xe DG1-SDV Graphics Card, Demonstrates Intent to Seriously Compete in the Gaming Space

At a media event on Wednesday, Intel invited us to check out their first working modern discrete graphics card, the Xe DG1 Software Development Vehicle (developer edition). Leading the event was our host Ari Rauch, Intel Vice President and General Manager for Graphics Technology Engineering and dGPU Business. Much like the rough developer editions of game consoles released to developers several quarters ahead of market launch, the DG1-SDV lets software developers discover and learn the Xe graphics architecture, and develop optimization processes for their current and future software within their organizations. We walked into the event expecting to see a big ugly PCB with a bare fan-heatsink, a contraption that only sort-of looks like a graphics card; but we were pleasantly surprised by what we saw: a rather professional product design.

What we didn't get at the event, though, was a juicy technical breakdown of the Xe graphics architecture and the various components that add up to the GPU. We still left pleasantly surprised by what we were shown: it works! The DG1-SDV is able to play games at 1080p, even if they are technically lightweight titles like "Warframe," and not at maxed-out settings. The SDV is a 15.2 cm-long graphics card that relies entirely on the PCI-Express slot for power (and hence pulls less than 75 W).
We already know from Intel's Xe slides throughout 2019 that the Xe architecture is designed to be extremely scalable, with a single ISA spanning everything from iGPUs and tiny discrete GPUs like the DG1-SDV all the way up to double-digit TFLOP-scale compute processors for HPC. Along the way, though, Intel intends to compete in the gaming and client-graphics space by developing products at just the right scale of the Xe architecture, with just the right performance-per-Watt to compete with specific products from the NVIDIA-AMD duopoly. Forget the high-end for a moment: if Intel is able to match even the GTX 1650 and RX 5500 (or their future $150 successors) in performance and power, they end up tapping into a double-digit percentage of the client-graphics TAM, and that spells trouble for Santa Clara and Markham.
The Xe DG1 is backed by Intel's robust software stack, which has seen breakneck development and feature additions in recent times, such as a modern Control Center app and support for technologies such as variable-rate shading and integer scaling. Intel has, for over a decade, held a foothold in the client media-acceleration space with its Quick Sync video encoders, and Xe dials that up a notch. The DG1 features Intel's entire media-acceleration and display-controller feature-set. Intel is also designing Xe to be extremely configurable by OEMs, so the GPU can finely match their products' thermal and power targets. Responding to a question from us, Intel didn't rule out the possibility of discrete Xe graphics cards working in tandem with Intel iGPUs (possibly starting with "Tiger Lake").
Xe DG1-SDV in action (video)
Here are a couple of brief videos we took of the DG1-SDV alive and kicking.

Here's the card itself:

And here's the money-shot of Intel's presentation, a "Warframe" gaming session. There were no performance numbers put out, but the game is being rendered at 1080p, and appears playable.

Xe DG1-SDV Physical Design
The Xe DG1-SDV (software development vehicle) is a contraption that's more evolved than a working prototype, but stops short of being a production product. It is designed to be stable and durable enough for its target audience: ISVs, individual software developers, and systems engineers evaluating it for major hardware OEMs. The card has the exact same physical dimensions as the Radeon R9 Nano, and fits into any machine that has two full-height expansion slots and a PCI-Express x16 interface, with no additional power cables needed. A single fan cools an aluminium fin-stack heatsink underneath. Throughout the demo, the cooler was more than audible and in need of acoustic optimization. The cooler shroud and back-plate bear a futuristic silvery design, and there's also a row of LEDs near the I/O shield that shine light into the grooves of the shroud.
"Tiger Lake" gate-crashes the party
Intel is still trying to break its habit of being a CPU maker first and foremost. "Tiger Lake" is a significant inflection point: it debuts Intel's next-gen "Willow Cove" CPU core and is the first implementation of Xe as an iGPU solution. Given the volume of CPUs with iGPUs Intel pushes, Xe is expected to hit critical mass in the client segment, with "Tiger Lake" serving as the launchpad. There are some interesting tidbits in the "Tiger Lake" slide:
  • There's a "massive leap" in graphics performance compared to the current Gen11 architecture, thanks to Xe
  • Unless we're badly mistaken, Intel just put out CPU IPC guidance for "Willow Cove" of "double digit" gains (we assume in comparison to the current Ice Lake / Sunny Cove)
  • A "massive" AI performance improvement from DL Boost and support for more AVX-512 instructions
Complete Intel Slide-Deck
Official Intel Board Shots

44 Comments on Intel Unveils Xe DG1-SDV Graphics Card, Demonstrates Intent to Seriously Compete in the Gaming Space

#1
MDWiley
Looks like the product line has potential. It’ll be interesting to see what they come up with as time goes on, since they plan to compete with Nvidia and Radeon.
#2
notb
Ultimately this will not be about discrete desktop GPUs anyway.
If these are similarly efficient to Nvidia's cheap mobile Pascal/Turing (1050, 1650) and are well integrated with Intel's mobile platforms, they'll just conquer most of the laptop market on day 1.

I believe splitting tasks between IGP and dGPU has already been confirmed (from Tiger Lake onward). AMD has it as well.
#3
dicktracy
Since AMD also decided to join Nvidia in overcharging their low and midrange GPUs, this looks like a good time to break up this duopoly. AMD fans sure miss Raja Koduri’s breathtaking prices.
#4
Darmok N Jalad
This seems like a pretty good showing. 1080p gaming on a single-fan card with no power connector is a pretty good accomplishment, as it suggests the architecture has decent efficiency and room to scale into more powerful applications. It would seem like they might be able to reach mid-range performance, and maybe even challenge AMD for custom design wins. Only problem is, next gen console hardware is already set, and who knows how long before another upgrade will be needed. I think we are reaching diminishing returns unless RT gains traction.
#5
notb
Darmok N Jalad: Only problem is, next gen console hardware is already set, and who knows how long before another upgrade will be needed. I think we are reaching diminishing returns unless RT gains traction.
Unless it turns out we open the huuuuge next Xbox and it's just a custom small PC with a replaceable MXM GPU (which for me would be a disaster and a blow to console robustness...).

Anyway, as mentioned earlier, mobile GPUs seem like the obvious aim. Seriously, why would any laptop maker go with a 1650 or similar Radeon, when they have a solution from the party that provides the CPU? Especially when it would work with the CPU...
#6
mak1skav
Hoping for the best but waiting for the worst. I hope Intel will be more serious in their GPU attempt this time and they will invest heavily in the drivers department too. I guess their ultimate target will be to win a big chunk of AI market so I am not sure how serious they will be about the gaming abilities of their GPUs. We need another competitor in the GPU market but Intel pricing history isn't very promising lol.
#7
notb
dicktracy: Since AMD also decided to join Nvidia in overcharging their low and midrange GPUs, this looks like a good time to break up this duopoly. AMD fans sure miss Raja Koduri’s breathtaking prices.
The fact that you'd like GPUs to be cheaper doesn't make them overpriced. That's just a price. On a free market. GPU makers can set it however they deem fit.

For a long time we had higher prices from Nvidia and lower ones from AMD. Nvidia was making money, AMD wasn't.
Now AMD has raised prices to Nvidia's level, which is just a confirmation of who was right all along.

Intel isn't exactly known for selling at break-even. Don't expect them to be the cheapest option. :)

Seriously, PC gaming is a cheap hobby anyway. Don't be such a miser...
#8
danbert2000
I suppose it's smart to start small, but I wonder if Intel really does have the ability to scale this up to 5700 XT/2070 level performance. I'm guessing that a release like this shows that Intel is close to putting out a real product, but at the same time it's not going to be until the second half of the year if they're still trying to get developers on board with the architecture. It's probably the case that Intel wants some early prebuilt design wins in the lower end of the gaming market in order to get a foothold, but if the card is just coming out now, products with it will take a while. And we have no idea if this is just a small taste of Xe, or if there's a midrange card already in testing. Perhaps this year all we see is something like a 5500 XT or 1660. That would still be faster than anything Intel has ever made by about 3x.
#9
notb
danbert2000: I suppose it's smart to start small, but I wonder if Intel really does have the ability to scale this up to 5700 XT/2070 level performance.
But do they really need to go for the high-end?

Do we know what the GPU sales profile looks like by segment?
Going by the Steam survey, cards up to $300 provide the vast majority of sales. Not sure about profits.

This isn't and will never be a major product line for Intel. I doubt they're willing to spend on developing niche high-end cards.
Even AMD, making roughly half of their revenue on GPUs, hasn't always been keen to fight with Nvidia's best.
#10
T4C Fantasy
CPU & GPU DB Maintainer
in their "ask you anything" someone should ask why they hide the transistor count when nobody else does... it may not be important, but it's interesting to know density and all of that.
#11
Dave65
I bash on Intel pretty good, but this is good for everyone competition-wise, to keep pricing at decent levels.
#12
Anymal
Much needed correction:
We walked into the event expecting to see a big ugly PCB with a bare fan-heatsink and a contraption that sort-of looks like a graphics card; but were PEASANTLY surprised with what we saw: a rather professional product design.
#13
sepheronx
Good! More competition the better.
#15
silentbogo
Looks a bit underwhelming... Warframe @1080p Low and no FPS counter? Even the current-gen HD630 can do a fairly playable 45-50FPS at these settings.
This discrete card doesn't look any more powerful than its upcoming mobile counterpart (e.g. Gen12, or Xe-LP or whatever).
Fine for a dev kit, but I don't think it's anything even remotely near a consumer-grade discrete graphics product.
#16
ZoneDymo
notb: The fact that you'd like GPUs to be cheaper doesn't make them overpriced. That's just a price. On a free market. GPU makers can set it however they deem fit.

For a long time we had higher prices from Nvidia and lower ones from AMD. Nvidia was making money, AMD wasn't.
Now AMD has raised prices to Nvidia's level, which is just a confirmation of who was right all along.

Intel isn't exactly known for selling at break-even. Don't expect them to be the cheapest option. :)

Seriously, PC gaming is a cheap hobby anyway. Don't be such a miser...
The fact that you think "overpriced" can be anything but an opinion, and trying to fight that opinion as if it's a fact, is just...silly to say the least.
#17
Steevo
So, noisy, 1080p Warframe performance is above 30FPS, so somewhere around a 1060/580 if it was Ultra settings.

Reviews will be needed before I buy into this.
#18
iO
Only 8 PCIe lanes on the PCB and the complete lack of any details still aren't promising, but at least they're showing actual hardware now instead of just renders.
#19
Darmok N Jalad
Steevo: So, noisy, 1080p Warframe performance is above 30FPS, so somewhere around a 1060/580 if it was Ultra settings.

Reviews will be needed before I buy into this.
I’m looking more at the power budget factor. The cards you mention need a supplemental power connection, as does even the 5500XT. The question becomes, can this scale beyond that into a very powerful GPU when given much more than 75W to work with? We’ve not really seen Intel push their GPU architecture into high-power territory. Their consumer line of GPUs has always been a small part of the power budget.
#20
londiste
This is a development vehicle and definitely not a product or the final form of what Xe will be.
They are showing precisely one thing - it exists/works.

Wasn't DG1 rumored/revealed to be a 96EU GPU and <25W TDP?
96EU should be 768 shaders, so that would make it more of an iGPU size than the usual dGPU as we know it.

Might be preparation for the Xe reveal in Tiger Lake, which should be the same size?
#21
Cheeseball
Not a Potato
londiste: This is a development vehicle and definitely not a product or the final form of what Xe will be.
They are showing precisely one thing - it exists/works.

Wasn't DG1 rumored/revealed to be a 96EU GPU and <25W TDP?
96EU should be 768 shaders, so that would make it more of an iGPU size than the usual dGPU as we know it.

Might be preparation for the Xe reveal in Tiger Lake, which should be the same size?
Yup, 96 EU sounds right. Core i7-1065G7 is 64 EU which is 512 cores. So this does seem better than Iris Pro Graphics 580.

Does this have double-precision FP64? Or would that be limited to the HP and HPC variants?
#22
silentbogo
londiste: Wasn't DG1 rumored/revealed to be a 96EU GPU and <25W TDP?
96EU should be 768 shaders, so that would make it more of an iGPU size than the usual dGPU as we know it.

Might be preparation for the Xe reveal in Tiger Lake, which should be the same size?
Since Tiger Lake is an MCM, I suspect it's just the Gen12 iGPU slapped on a board with some vRAM. Performance-wise it kinda lines up with the "up to 100% uplift over Gen11" teased by Intel.
Though, compared with the older HD620/630, I don't see how it can be twice as fast as Ice Lake graphics, if it's not even close to "+100%" over 9th gen.
Cheeseball: Yup, 96 EU sounds right.
There were also rumors that the first "desktop" DG1 is a 128EU, but so far no confirmation on this.
#23
Vya Domus
Even for an early prototype it's just too underwhelming; it's also obvious this is still nowhere near a real product that will be out there in the hands of people anytime soon.

It seems like Intel has developed this annoying strategy of releasing or announcing stuff that's not up to snuff just to remind people that they "are doing something", just like that crappy 10nm dual-core CPU they released 2 years ago and the "glued" 56-core Xeons. Just doing something isn't good enough.

To all the "bring competition to AMD/Nvidia" prophets, buckle up for some serious disappointment. I have a feeling the purpose of this demo is to sort of bring down the expectations a bit; there have been some wild hopes and dreams put in Intel's discrete GPUs.
#25
Jism
It's a compute card being set up as a general-purpose graphics card.