Monday, October 28th 2019
Intel Powers-on the First Xe Graphics Card with Dev Kits Supposedly Shipping
Intel is working hard to bring its first discrete GPU lineup to market, after years of past attempts at launching such a lineup ended in failure. During its Q3 earnings call, the company shared some exciting news, with Intel CEO Bob Swan announcing that "This quarter we've achieved power-on exit for our first discrete GPU DG1, an important milestone." By "power-on exit," Mr. Swan refers to post-silicon debug techniques that involve putting a prototype chip on a custom PCB, powering it up, and verifying that it boots and works. With a successful test, Intel now has working silicon capable of running real-world workloads and software, bringing the product a big step closer to sale.
Additionally, the developer kit for the "DG1" graphics card is supposedly being sent to various developers around the world, according to Eurasian Economic Commission (EEC) listings. Called the "Discrete Graphics DG1 External FRD1 Accessory Kit (Alpha) Developer Kit," this bundle is marked as an alpha-stage prototype, suggesting that the launch of discrete Xe GPUs is only a few months away. This backs up previous rumors that Xe GPUs will launch around mid-2020, possibly in the July/August time frame.
Source:
PCGamesN
30 Comments on Intel Powers-on the First Xe Graphics Card with Dev Kits Supposedly Shipping
:rockout:RIGHT SIDE UP! :roll:
Would be a killer move, AMD should have done it a long time ago.
www.notebookcheck.net/New-evidence-of-Intel-s-multi-GPU-support-for-upcoming-Xe-discrete-cards-uncovered-in-Linux-drivers.440252.0.html
Are these consumer/workstation GPUs, or something else?
As the article mentions, this has been claimed before; heck, there have been multiple iterations of the idea of multiple different GPUs working together, but it has never actually come to anything good or usable.
Next to that... AMD should have done this a long time ago, with the APUs? Why?
The more powerful CPUs from AMD don't have a built-in GPU, and those that do are fairly low end, meant for office work, laptops etc., where it makes sense.
Pairing one of those with a dedicated GPU is a bit silly, really.
On the article itself, more non-news from Intel it seems:
Intel Marketing: "We won't let you forget us"
Rumors are pretty much always around, as there is constant research going on into multi-GPU solutions. Crossfire and SLI were/are the most efficient methods for now, at least for gaming and other real-time rendering applications. GPGPU and similar non-real-time workloads, especially in data centers and distributed setups with looser latency/timing requirements, are much easier to split across GPUs and have largely been figured out for a long while (research into improving these has obviously not stopped and keeps making things better in that area as well; the pain points and bottlenecks are simply different).
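To illustrate why the non-real-time case is so much easier (this is just a generic sketch of the idea, not anything specific to Intel's stack or any particular compute API): a GPGPU job is typically carved into independent slices, one per device, with no per-frame synchronization between them.

```cpp
// Generic chunking sketch: split `total` work items across `deviceCount`
// GPUs as evenly as possible. Each device then processes its slice
// independently; the lack of cross-device, per-frame dependencies is what
// makes this kind of workload scale so readily.
#include <cstddef>
#include <vector>

struct Chunk {
    std::size_t offset; // first work item this device handles
    std::size_t count;  // number of work items it handles
};

std::vector<Chunk> splitAcrossDevices(std::size_t total, std::size_t deviceCount) {
    std::vector<Chunk> chunks;
    const std::size_t base = total / deviceCount;
    const std::size_t rem  = total % deviceCount;
    std::size_t offset = 0;
    for (std::size_t d = 0; d < deviceCount; ++d) {
        const std::size_t count = base + (d < rem ? 1 : 0); // spread the remainder
        chunks.push_back({offset, count});
        offset += count;
    }
    return chunks;
}
```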
We knew it was coming, we knew the expected release date, and... nothing has changed, except that we now know it's coming with a bit more certainty? Yay.
Delays would be news, but reinforcing what we already know really isn't, unless there were previous rumors that the originally expected date was not going to be met.
What Vulkan and D3D12 can do is schedule work based on each GPU's load. An Xe card, for example, could get 25 frames scheduled per second while the Iris Pro gets 5. I'm not entirely sure how that works in regards to VRAM, but all of the GPUs work on a need-to-know basis rather than cloning everything like SLI/Crossfire does. It's much more efficient from a performance and power consumption perspective.
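For what it's worth, here is a minimal sketch of that "need-to-know" model in Vulkan terms (creating the VkInstance is assumed and omitted): the explicit API hands the application every adapter, integrated or discrete, and the application itself decides what each one renders, rather than the driver mirroring everything as SLI/Crossfire did.

```cpp
// Minimal sketch: enumerate every physical GPU an explicit API exposes.
// The application can then create a logical device per adapter and assign
// each a different share of the frame (e.g. post-processing on the iGPU,
// geometry on the discrete card).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

void listAdapters(VkInstance instance) {
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);

    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props{};
        vkGetPhysicalDeviceProperties(gpu, &props);
        // deviceType distinguishes integrated from discrete parts, which is
        // what lets the app split work between, say, an Iris iGPU and a dGPU.
        printf("%s (type %d)\n", props.deviceName, static_cast<int>(props.deviceType));
    }
}
```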
I have not looked very closely into what exactly the couple of available explicit mGPU games do in terms of memory usage (and I do not have Crossfire or SLI right now), but VRAM is not the biggest concern with it right now.
VRAM is high bandwidth and high latency. In GPU design, VRAM performance is the number one factor to engineer around. For every clock on which the VRAM can accept a request, the GPU has to keep its request queues full while still (hopefully) executing on previously fetched data. Anything that hinders VRAM performance can severely reduce work throughput.
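A rough back-of-the-envelope illustration of that queuing requirement (the bandwidth and latency figures below are assumptions for illustration, not measurements of any specific card): to keep the memory bus saturated, roughly bandwidth times latency worth of requests must be outstanding at any moment.

```cpp
// Back-of-the-envelope: how much data must be "in flight" to hide memory
// latency. With the assumed numbers below (448 GB/s, 100 ns), that works
// out to roughly 44 KB of outstanding requests at all times.
#include <cstdio>

int main() {
    const double bandwidth_gb_per_s = 448.0; // assumed VRAM bandwidth
    const double latency_ns         = 100.0; // assumed effective latency
    // GB/s * ns conveniently cancels to plain bytes (1e9 * 1e-9 = 1).
    const double in_flight_bytes = bandwidth_gb_per_s * latency_ns;
    printf("~%.1f KB of requests must be in flight\n", in_flight_bytes / 1024.0);
    return 0;
}
```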
I think this is getting off the main topic, sorry.
Post processing effects do make some sense.
Currently there are games that scale perfectly, but there is an increasing number of games where multi-GPU does not scale at all or scales very poorly. That takes away a lot of the value.
As for the release date: this is first silicon, and it may require a few revisions before it goes to shops. Each revision takes some five months (three for manufacturing, two for analyzing it and preparing the next revision).
The whole dual-card idea sounds good, but realizing it depends on devs, and they have rarely invested time in making dual-card games with significant gains. Most of them actually had no gain at all. The "rest" were AAA titles, and probably sponsored by one of the manufacturers to include dual-card optimization.
I also don't expect much of the first cards. Yeah, there is Raja-Raja-Raja all over, but if we compare this with AMD... Supposedly, Raja had *some* influence in optimizing the old architecture, and his real contribution only shows from now on. The time he has spent at Intel is much shorter, and there are those rumours about a "dual-Iris" or "double Iris" that inspire no confidence at all, for me at least.
It may be different for some future generation, but this one, I really, really doubt. Raja's expertise is a nice motivation, but one man doesn't make top products for a long time. Besides, NVIDIA/AMD have had, like, 10 zillion GPU patents for decades, so it's a really narrow maneuvering space for Intel. Again, not forever, but for the 1st generation, definitely.