
Intel Xe DG1 SDV PCB Pictured, Looks Desolate

btarunr

Editor & Senior Moderator
Here are some of the first pictures of the Intel Xe DG1 SDV, taken apart to reveal its rather desolate PCB. The Xe DG1 SDV isn't commercially available; rather, Intel distributes it to ISVs so they can begin developing and optimizing for the Gen12 Xe graphics architecture. The board features a GPU ASIC that's nearly identical to the Iris Xe MAX mobile discrete GPU, and four LPDDR4 memory chips making up 8 GB of video memory.

The Xe DG1 GPU is based on the Xe LP graphics architecture, and the silicon is built on the 10 nm SuperFin fabrication node. The chip features 96 execution units (768 unified shaders) and apparently makes do with the 75 W of power supplied by the PCI-Express slot. A frugal 2-phase VRM powers the GPU. The card uses a conventional 4-pin PWM header to control the fan, which ventilates a simple aluminium mono-block heatsink. Three DisplayPort and one HDMI 2.1 connectors make up the output configuration. While you won't be able to buy an Xe DG1 SDV on the market (unless an ISV decides to break their NDA and put one up on eBay), Intel has allowed a small number of board partners to develop custom-design cards. ASUS is ready with one. Igor's Lab has more pictures, a list of benchmark fails, and other interesting commentary in the source link below.
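For those wondering how the shader count follows from the EU count: each Xe LP execution unit contains 8 FP32 ALUs, so 96 EUs work out to 768 unified shaders. A back-of-the-envelope sketch in Python; note the ~1.55 GHz boost clock is an assumption borrowed from Iris Xe MAX-class parts, not a confirmed DG1 SDV spec:

```python
# Back-of-the-envelope Xe LP shader math and theoretical throughput.
# ASSUMPTION: the 1.55 GHz clock is borrowed from Iris Xe MAX-class
# silicon; Intel has not published a boost clock for the DG1 SDV.
EUS = 96           # execution units on the DG1 SDV
ALUS_PER_EU = 8    # FP32 ALUs per Xe LP EU
CLOCK_GHZ = 1.55   # assumed boost clock (not confirmed)

shaders = EUS * ALUS_PER_EU
# Each ALU can issue one fused multiply-add (2 FLOPs) per cycle.
tflops_fp32 = shaders * 2 * CLOCK_GHZ / 1000

print(f"{shaders} unified shaders")             # 768
print(f"~{tflops_fp32:.2f} TFLOPS FP32 peak")   # ~2.38
```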



 
You guys do realize that @T4C Fantasy has had board pictures, and much clearer ones at that, of the DG1 on your very own GPU Database for months... Right?


[Attached: three DG1 board pictures from the TPU GPU Database]


The DG1 SDV is over a year old now, too. Not sure why we're reliving the same news cycle a full year later...
 
Well, if you don't have to pump 500 watts into the GPU, there's not much to put on the board...
 
Some interesting tidbits for those interested:

  • The lack of compatibility outside of 9th, 10th, and 11th-gen Intel Core systems isn't an artificial limitation; it's because the GPU lacks its own EEPROM firmware chip, requiring the motherboard BIOS to contain the necessary data to make it work. Reportedly the GPU package/die itself lacks the necessary SPI connections for this, though the server/compute card with four of these does seem to have some hacked-on EEPROM solution (see the sketch after this list).
  • The display outputs are essentially non-functional; Intel recommends using the motherboard's display outputs.
  • Power consumption is reportedly in the 20-27 W range, and performance is supposedly in the GT 1030 range, though there are no benchmarks to back this up.
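On the missing EEPROM point above: a discrete GPU normally stores its VBIOS in an onboard SPI flash chip, which the OS sees as a PCI expansion ROM. A minimal sketch of how one could check for that on Linux, where the kernel only creates the `rom` sysfs attribute for devices that actually report an expansion ROM (the PCI address used here is a placeholder):

```python
# Check whether a PCI GPU exposes an expansion (option) ROM on Linux.
# The kernel only creates the "rom" sysfs attribute for devices that
# report an expansion ROM BAR - a card with no onboard EEPROM, as the
# DG1 SDV reportedly is, would lack it.
from pathlib import Path

PCI_ADDR = "0000:03:00.0"  # placeholder: substitute your card's address

rom = Path(f"/sys/bus/pci/devices/{PCI_ADDR}/rom")
if rom.exists():
    print("Expansion ROM present - firmware lives on the card")
else:
    print("No expansion ROM - firmware must come from elsewhere "
          "(e.g. the motherboard BIOS, as reported for the DG1)")
```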
 
Are these salvaged from defective packages? Stripped down from faulty mobile packages, put on a separate board, and sold as "dedicated"?
 
You guys do realize that @T4C Fantasy has had board pictures, and much clearer ones at that, of the DG1 on your very own GPU Database for months... Right?


[Attached: three DG1 board pictures from the TPU GPU Database]


The DG1 SDV is over a year old now, too. Not sure why we're reliving the same news cycle a full year later...

Come on, Intel was just running out of product and dev announcements, so they asked TPU to rehash last year's.

Give them a break. I mean they need to have that company name plastered on a front page at least four times a week. The power of repetition!
 
Why would such a puny GPU carry 8 GB of RAM? I really don't see it...
 
I would really love to see how Gen12 Xe stretches its legs given a higher EU count and GDDR6.
That 512 EU variant is what I'm really waiting to hear about.
 
Some interesting tidbits for those interested:
  • The lack of compatibility outside of 9th, 10th, and 11th-gen Intel Core systems isn't an artificial limitation; it's because the GPU lacks its own EEPROM firmware chip, requiring the motherboard BIOS to contain the necessary data to make it work. Reportedly the GPU package/die itself lacks the necessary SPI connections for this, though the server/compute card with four of these does seem to have some hacked-on EEPROM solution.
  • The display outputs are essentially non-functional; Intel recommends using the motherboard's display outputs.
  • Power consumption is reportedly in the 20-27 W range, and performance is supposedly in the GT 1030 range, though there are no benchmarks to back this up.
This is the SDV card, which is about a year old. I think the same Twitter thread that discussed the lack of firmware also mentioned that Igor's drivers were a year old.
Iris Xe has 4 GB of VRAM and 80 EUs, versus the 8 GB and 96 EUs on the SDV.

I wonder if this story represents the current state of Iris Xe at all.
 
This is the SDV card, which is about a year old. I think the same Twitter thread that discussed the lack of firmware also mentioned that Igor's drivers were a year old.
Iris Xe has 4 GB of VRAM and 80 EUs, versus the 8 GB and 96 EUs on the SDV.

I wonder if this story represents the current state of Iris Xe at all.
It's the same silicon, though, so the lack of SPI lanes will still prevent them from connecting an EEPROM (in a conventional manner, at least). They might have hacked one on, but if that's the case, it should work as a standard PCIe device on any platform.
 
I dig the regular 4-pin fan header.

Makes it easier to switch to Noctua fans.

If only all GPUs had a regular 4-pin.
 
Why would such a puny GPU carry 8 GB of RAM? I really don't see it...
When others do it, it's called future proofing ;)

This doesn't look desolate at all. It's not a high end card. In fact, I'm really not sure why it needs to take up two PCIe slots.
 
When others do it, it's called future proofing ;)

This doesn't look desolate at all. It's not a high end card. In fact, I'm really not sure why it needs to take up two PCIe slots.
It really doesn't; it could keep itself nice and cool even with a single-slot passive heatsink, given some case airflow.
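A rough sanity check on the passive-cooling claim: at the reported 20-27 W, the required heatsink thermal resistance is modest. A quick sketch, where the 85 °C die limit and 40 °C in-case ambient are illustrative assumptions rather than Intel specs:

```python
# Rough feasibility check for passive cooling at DG1-class power.
# ASSUMPTIONS: 85 C max die temperature, 40 C in-case ambient.
T_JUNCTION_MAX_C = 85.0
T_AMBIENT_C = 40.0
POWER_W = 27.0  # upper end of the reported 20-27 W draw

# Required sink-to-ambient thermal resistance (K/W).
r_required = (T_JUNCTION_MAX_C - T_AMBIENT_C) / POWER_W
print(f"Heatsink must achieve <= {r_required:.2f} K/W")  # ~1.67 K/W
```

Roughly 1.7 K/W is comfortably within reach of a finned aluminium heatsink with some case airflow.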
 
Apart from not being news... this is an IGP turned into a GPU. Low power, low performance... no need for massive overly-populated PCBs. Did anyone expect anything else?
 
Apart from not being news... this is an IGP turned into a GPU. Low power, low performance... no need for massive overly-populated PCBs. Did anyone expect anything else?
Well, 3rd party designs tend to go overboard even for mid or low end cards. On the other hand, this isn't a 3rd party design.
 
OK. So what's the point of this useless card compared to an integrated solution??
 
OK. So what's the point of this useless card compared to an integrated solution??
Many low-end cards aren't faster than a good IGP. At the very least, they come with dedicated memory that doesn't eat into your system RAM and is significantly faster.
This one is probably a stepping stone towards more capable cards down the road. That's probably why it wasn't released to retail.
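To put numbers on the dedicated-memory point: even modest dedicated VRAM tends to beat sharing system RAM. A quick comparison sketch; the LPDDR4X-4266 speed and 128-bit bus are assumptions borrowed from the related Iris Xe MAX, with dual-channel DDR4-3200 standing in for a typical host system:

```python
# Peak-bandwidth comparison: dedicated LPDDR4X vs. shared system DDR4.
# ASSUMPTIONS: DG1-class card with LPDDR4X-4266 on a 128-bit bus
# (borrowed from Iris Xe MAX specs); host with dual-channel DDR4-3200.

def bandwidth_gbps(mt_per_s: int, bus_bits: int) -> float:
    """Peak bandwidth in GB/s from transfer rate and bus width."""
    return mt_per_s * (bus_bits / 8) / 1000

vram = bandwidth_gbps(4266, 128)    # ~68.3 GB/s, all for the GPU
sysram = bandwidth_gbps(3200, 128)  # ~51.2 GB/s, shared with the CPU

print(f"Dedicated LPDDR4X: ~{vram:.1f} GB/s")
print(f"Shared DDR4-3200:  ~{sysram:.1f} GB/s (minus CPU traffic)")
```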
 
Intel marketed the laptop versions of these as an accelerator card for compute, video editing and other similar workloads, so I assume this is targeting the same thing. They're meant to work on non-graphics tasks in conjunction with the iGPU - combining multiple GPUs for compute is far easier than doing the same for graphics, after all.
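For the compute angle: OpenCL simply exposes the iGPU and a card like this as separate devices, and a workload can be dispatched to each independently. A minimal enumeration sketch using pyopencl, assuming it and an Intel OpenCL runtime are installed:

```python
# List every OpenCL GPU the runtime can see - on an Intel system with a
# DG1-class card, both the iGPU and the discrete GPU should appear,
# and compute work can be split across them.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices(device_type=cl.device_type.GPU):
        print(f"{platform.name}: {device.name}, "
              f"{device.max_compute_units} CUs, "
              f"{device.global_mem_size // 2**20} MiB")
```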
 
Sooo, basically it can't run anything because support is mad buggy? Nice.
 
I don't see why everyone's hating on Intel. I really hope they release something competitive in the near future, as we can only benefit from one more player in the discrete GPU space.
 
Well, 3rd party designs tend to go overboard even for mid or low end cards. On the other hand, this isn't a 3rd party design.
Good point. Are we expecting 3rd-party designs? What is Intel's AIB strategy? I'd expect the entire thing to be a flash in the pan and to work miserably as a companion card for Gen 9, 10, and 11 CPUs, boosting iGPU performance. I can't see a single scenario where this would justify taking it to 3rd parties or, in fact, buying a retail version of this Intel card at all...

What is Intel offering the casual user here? A sub-par Fortnite/Rocket League/LoL experience? Or are there other gains to be had from this add-in card?
 
I know Intel gets a lot of heat, and generally it's been earned, but I do like the restrained, simplistic design.
Also, adding another option for DIYers and builders in the future can only improve matters regarding GPUs.
 
I can't see a single scenario where this would justify taking it to 3rd parties or, in fact, buying a retail version of this Intel card at all...
Justify for whom? Intel and 3rd parties benefit from this from several angles. As a consumer: it's not marketed at, or even meant for, consumers at all.
If they plan to release a real retail GPU, they need a test run to get the cooperation, timings, problem areas, etc. right.
 