Wednesday, November 13th 2019

7nm Intel Xe GPUs Codenamed "Ponte Vecchio"

Intel's first Xe GPUs built on the company's 7 nm silicon fabrication process will be codenamed "Ponte Vecchio," according to a VideoCardz report. These are not gaming GPUs but compute accelerators designed for exascale computing; they leverage the company's CXL (Compute Express Link) interconnect, which offers bandwidth comparable to PCIe gen 4.0 along with scalability features slated to arrive with future generations of PCIe. Intel is preparing its first enterprise compute platform featuring these accelerators, codenamed "Project Aurora," in which the company will exert end-to-end control over not just the hardware stack but also the software.

"Project Aurora" combines up to six "Ponte Vecchio" Xe accelerators with up to two Xeon multi-core processors based on the 7 nm "Sapphire Rapids" microarchitecture, and OneAPI, a unifying API that lets a single kind of machine code address both the CPU and GPU. With Intel owning the x86 machine architecture, it's likely that Xe GPUs will feature, among other things, the ability to process x86 instructions. The API will be able to push scalar workloads to the CPU, and and the GPU's scalar units, and vector workloads to the GPU's vector-optimized SIMD units. Intel's main pitch to the compute market could be significantly lowered software costs from API and machine-code unification between the CPU and GPU.
Image Courtesy: Jan Drewes
Source: VideoCardz

50 Comments on 7nm Intel Xe GPUs Codenamed "Ponte Vecchio"

#26
Mirkoskji
Self-moderation applied based on Ahhzz's intervention.
Posted on Reply
#27
Ahhzz
I'm not sure why so many of you insist on visiting the etymology of a name, or economics of Italy in 1990, but you are straying far, far off-topic. Return to the topic at hand, and stay there, or lose posting privileges in this thread. No further warnings posted, and no further responses to off-topic posts accepted.
Posted on Reply
#28
eidairaman1
The Exiled Airman
I knew Intel's focus was AI, not gamers.
Posted on Reply
#29
TheGuruStud
eidairaman1: I knew Intel's focus was AI, not gamers.
We know they can't be competitive with large dies, so this only leaves one option...Intel glue.
Clearly, they went balls deep on it, so this tells you how confident they are about 7nm, too.

AMD should start taunting them and make fun of them for gluing everything.
Posted on Reply
#31
R-T-B
ratirt: Don't get me wrong here but it does look bad despite the war you've mentioned.
This story?

How?
Posted on Reply
#33
R-T-B
Khonjel: Well excuse us epyc jokesters for not being knowledgeable on the matter.

So tell us o wise one, what will fINalLy uniFYInG cPu and gPU gonna accomplish? Let's hear your epyc tale of cpu and gpu doing a fusion dance to create an epu, epyc processing unit.
Ideally, it would be a godsend to programmers, as it would reduce coding complexity by orders of magnitude.
Posted on Reply
#34
R-T-B
lynx29: I may be wrong, but didn't you spend a lot of time "hacking" Intel CPUs to get rid of the management software? Maybe I am reading it wrong. I mean, I don't have to mess with Ryzen at all.
Yeah, I did. I never found any BIOS on the CPU silicon itself... not even sure that would be practical. They are pretty open about which part of your BIOS image stores it. It basically has its own BIOS partition.
Khonjel: No serious reply pls. Apparently our humble moderator thinks that post is a troll post.
Unsure what you mean by this. I was being serious. I have no idea if Intel will accomplish that, but that is the idea anyway.
Posted on Reply
#35
Space Lynx
Astronaut
R-T-B: Ideally, it would be a godsend to programmers, as it would reduce coding complexity by orders of magnitude.
I'll wait for benchmarks. ^^
Posted on Reply
#36
R-T-B
lynx29: I'll wait for benchmarks. ^^
I have my doubts there'll be anything meaningful to see. You'd need to completely rewrite existing software, if it even works.

Yeah, I'm kinda skeptical, too.
Posted on Reply
#37
Xuper
This doesn't matter. Can they provide good drivers for gaming?
Posted on Reply
#38
Space Lynx
Astronaut
R-T-B: I have my doubts there'll be anything meaningful to see. You'd need to completely rewrite existing software, if it even works.

Yeah, I'm kinda skeptical, too.
Well, it's useless for me then, 'cause my backlog is massive; I'm not really in the market for new games.
Posted on Reply
#39
R-T-B
Khonjel: Btw, what does CPU and GPU fusion mean for gaming? Will SLI be viable then? Will GPUs finally be free from the clutches of drivers, like CPUs?
That may be a side effect. I could picture a world where a firmware update is akin to updating GPU drivers, only done far less often. That would be, of course, if they get it right. Looking at the bugginess of common GPU drivers, I remain skeptical.
Posted on Reply
#40
CrAsHnBuRnXp
lynx29: Enjoy your 70 security fixes that will occur bi-monthly.

Intel has lost my trust, personally. I have never seen so many security vulnerabilities in my life.

Another new one, ZombieLoad V2, was announced yesterday, and Intel announced 70 security patch fixes incoming. lol, how many times this year now? Wow... Ryzen... none needed since launch, lol
You have to realize, too, that Intel has been using basically the same architecture for around 10 years. Intel needs a brand-new design built from the ground up to fix the issue.
Posted on Reply
#41
Vayra86
The Egg: So when they’re trying to sell companies Xe to go with their Xeons, I’m sure that won’t be confusing for anyone.
"Do you stutter?" "No, this is my new rig"
notb: Seriously... a news piece about a system that may finally unify CPU and GPU interfaces, but all it gets from the TPU community is two posts joking about a name. That's just epyc.
I think that's because all we've got so far is 'will be' and 'are going to'... where r u, substance?
Posted on Reply
#42
Bones
From the OP:
("Intel is preparing its first enterprise compute platform featuring these accelerators codenamed "Project Aurora," in which the company will exert end-to-end control over not just the hardware stack, but also the software.")

This, to me, means Intel is up to its usual dirty tricks to leverage things in their favor.
While it may not mean that directly, it could be used to block others (AMD?) from being able to "do" certain things AND to make developers dance to their tune, including the gaming industry - at the very least, to have the ability to manipulate things in their favor.

One possible example of such abuse/dirty tricks:
If you don't favor Intel products by slowing/crippling gaming performance when the software sees an AMD CPU in use, you get restricted/cut off from being able to use the software for your game development, or from having it work with titles already released.

And don't try to convince me it's not possible for them (or anyone else, for that matter, TBF) to try it.
Intel has been caught red-handed in many schemes over time - fact, not fiction - and it certainly isn't above doing it again as a company, especially one that's now becoming desperate, and you know what desperate folks will do as a rule...
Posted on Reply
#43
Vayra86
Bones: From the OP:
("Intel is preparing its first enterprise compute platform featuring these accelerators codenamed "Project Aurora," in which the company will exert end-to-end control over not just the hardware stack, but also the software.")

This, to me, means Intel is up to its usual dirty tricks to leverage things in their favor.
While it may not mean that directly, it could be used to block others (AMD?) from being able to "do" certain things AND to make developers dance to their tune, including the gaming industry - at the very least, to have the ability to manipulate things in their favor.

One possible example of such abuse/dirty tricks:
If you don't favor Intel products by slowing/crippling gaming performance when the software sees an AMD CPU in use, you get restricted/cut off from being able to use the software for your game development, or from having it work with titles already released.

And don't try to convince me it's not possible for them (or anyone else, for that matter, TBF) to try it.
Intel has been caught red-handed in many schemes over time - fact, not fiction - and it certainly isn't above doing it again as a company, especially one that's now becoming desperate, and you know what desperate folks will do as a rule...
It's all up to AMD to catch them doing it, or sue. I'm sure they have their eyes open...
Posted on Reply
#44
Bones
The software companies could speak up too if they discover it happening.
In either instance it is up to them to speak up and do something about it should it occur.
Posted on Reply
#45
ratirt
R-T-B: This story?

How?
I mean, I understand why people bash the new stuff Intel released. Considering Phi, it is another attempt to create something, and the history doesn't back up the Xe GPU Intel announced. This new Xe is Phi in a different, maybe more advanced form, but I don't think it is a great piece of technology. Not saying it will not be if Intel pursues this idea, but the chances are slim to none. That's just how I see it.
The other thing is, since Intel is not trying to compete with NV and AMD on the desktop with their GPU, it gives the impression that Intel feels weak in that department and can't compete with the two companies. So this Xe is kind of an attempt to exist and deliver some sort of GPU, since it's been said Intel is working on one. More like being true to its word.
Posted on Reply
#46
londiste
Vayra86: It's all up to AMD to catch them doing it, or sue. I'm sure they have their eyes open...
Doing what exactly, tighter CPU and GPU cooperation and coupling? AMD has been working on HSA for years.
ratirt: Considering Phi, it is another attempt to create something, and the history doesn't back up the Xe GPU Intel announced. This new Xe is Phi in a different, maybe more advanced form, but I don't think it is a great piece of technology. Not saying it will not be if Intel pursues this idea, but the chances are slim to none. That's just how I see it.
The other thing is, since Intel is not trying to compete with NV and AMD on the desktop with their GPU, it gives the impression that Intel feels weak in that department and can't compete with the two companies. So this Xe is kind of an attempt to exist and deliver some sort of GPU, since it's been said Intel is working on one. More like being true to its word.
Where did that Phi comparison come from? There has been no talk about a mass-x86-cores type of approach, and everything we know points to the contrary - a fairly standard GPU architecture as we know them. Xe is not exactly brand new either; Intel has Gen9.5 and Gen11 that show the evolution towards Xe.

Intel is trying to make money. Margins in compute, especially AI, are far higher than on the desktop. If they lock down an architecture that works there, it is easier to bring that down to the desktop than to start here. Even with Intel having worked on their iGPU for years, remember that even here the common assumption is that Intel cannot create a discrete GPU at all.

Edit:
Oh, right. That x86 part is in the original post/news blurb. I call bullshit.
Posted on Reply
#47
ratirt
londiste: Where did that Phi comparison come from? There has been no talk about a mass-x86-cores type of approach, and everything we know points to the contrary - a fairly standard GPU architecture as we know them. Xe is not exactly brand new either; Intel has Gen9.5 and Gen11 that show the evolution towards Xe.

Intel is trying to make money. Margins in compute, especially AI, are far higher than on the desktop. If they lock down an architecture that works there, it is easier to bring that down to the desktop than to start here. Even with Intel having worked on their iGPU for years, remember that even here the common assumption is that Intel cannot create a discrete GPU at all.
The Phi, even if it contained x86 cores, was considered a GPU, just like the Xe is considered a GPU. The Xe is derived from Xeon, and it is stated that the Xe might be capable of x86 instructions. We don't know much about it because it's not out, but it will basically do the same thing the Phi was meant to do. That is why I've pointed out the similarities between the two, and that's why the Xe is so peculiarly similar to the Phi GPU. Xe, as mentioned in the OP's article, is supposedly an exascale compute part, just like the Phi was.
Making money is an obvious thing. Not sure why you bring that one up.
Posted on Reply
#48
londiste
ratirt: The Phi, even if it contained x86 cores, was considered a GPU, just like the Xe is considered a GPU.
Actually, Phi was not considered a GPU. It was always a compute accelerator. The GPU side of that died with Larrabee.
ratirt: The Xe is derived from Xeon, and it is stated that the Xe might be capable of x86 instructions. We don't know much about it because it's not out, but it will basically do the same thing the Phi was meant to do. That is why I've pointed out the similarities between the two, and that's why the Xe is so peculiarly similar to the Phi GPU.
Where did you get that Xe is derived from Xeon?
We know enough about Xe to say it is not doing x86.
ratirt: Making money is an obvious thing. Not sure why you bring that one up.
I brought up money because there is a lot of it in the compute/AI market. And you said Xe not coming to the desktop first is a sign Intel feels weak. When you are designing a new GPU, that is the obvious market to go for. Desktop GPUs are dirt cheap and require a lot of twitchy, game-specific software work.
Posted on Reply
#49
ratirt
londiste: Actually, Phi was not considered a GPU. It was always a compute accelerator. The GPU side of that died with Larrabee.
Phi is considered a GPU and it was competing against NV's Tesla.
londiste: Where did you get that Xe is derived from Xeon?
We know enough about Xe to say it is not doing x86.
Well, the Xe-derived-from-Xeon thing is my guess, not something official. It might turn out that the Xe is capable of x86; I'm not saying it is capable now, but it might be. Still, it is similar to Phi in the concept of what this Xe GPU will do and which NV and AMD counterparts it will go up against.
londiste: I brought up money because there is a lot of it in the compute/AI market. And you said Xe not coming to the desktop first is a sign Intel feels weak. When you are designing a new GPU, that is the obvious market to go for. Desktop GPUs are dirt cheap and require a lot of twitchy, game-specific software work.
And money is obvious, but feeling weak in one area doesn't mean you can't make money in the other and be good at it. The desktop market ("dirt cheap," as you said) is still a market that exists, and Intel is not competing with NV and AMD here, so there must be a reason for that. How I see it, and how I shared it, is that Intel can't make a decent GPU for gaming right now. It could be because of software support, and that is why Xe is not a gaming GPU - to get more traction, etc. Anyway, Intel was going to release a desktop GPU (as far as I remember), but it didn't happen.
Posted on Reply