
7nm Intel Xe GPUs Codenamed "Ponte Vecchio"

It's a "let's keep the project up as much time as possible while the budget increases tenfolds or more and have problems all the way trough, so we get new funding" approach.
one of the latest problem was that metal junctions of the dams are affected by corrosion due to salty water (who could've imagined that eh...). If i remember well they cheaped out on junctions and soldering materials. Also they discovered that while the dams are active, debris fill the space where they should fold while unactive (sand and sea, who could have imagined again). It is basically a legalized scam, funded with public money with the excuse of job creation and industry boosting.
Come to think of it when you say. It has been done on purpose. I don't believe there are people that dumb.

If they release it as an accelerator for 3D or Photoshop or creative stuff, it would be interesting.
It has been said to be for the server market, so why not 3D/Photoshop editing as well?
 
Self-moderation applied, based on Ahhzz's intervention.
 
I'm not sure why so many of you insist on visiting the etymology of a name, or economics of Italy in 1990, but you are straying far, far off-topic. Return to the topic at hand, and stay there, or lose posting privileges in this thread. No further warnings posted, and no further responses to off-topic posts accepted.
 
I knew Intel's focus was AI, not gamers.
 
We know they can't be competitive with large dies, so this only leaves one option...Intel glue.
Clearly, they went balls deep on it, so this tells you how confident they are about 7nm, too.

AMD should start taunting them and make fun of them for gluing everything.
 
There's a lot of drama on Videocardz about this.


The lineup will be for business users. The gaming lineup will come in 2020.
 
I don't think entry-level cards are for gaming. But whatever floats their sinking boat.
 
Well, excuse us epyc jokesters for not being knowledgeable on the matter.

So tell us, o wise one, what is fINalLy uniFYInG cPu and gPU gonna accomplish? Let's hear your epyc tale of a CPU and GPU doing a fusion dance to create an EPU, the epyc processing unit.

Ideally, it would be a godsend to programmers, as it would reduce coding complexity by orders of magnitude.
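
For what it's worth, "unified" is already roughly what programming looks like in SYCL, the open model behind Intel's oneAPI stack. A minimal sketch, assuming a device with unified shared memory support - the host and the GPU kernel touch the same pointer, with no explicit copies:

```cpp
#include <sycl/sycl.hpp>

int main() {
    sycl::queue q;  // picks a default device (a GPU, if one is available)
    const size_t n = 1024;

    // One allocation, visible to both host and device
    float* data = sycl::malloc_shared<float>(n, q);
    for (size_t i = 0; i < n; ++i) data[i] = float(i);

    // Runs on the device, reading and writing the same pointer
    q.parallel_for(sycl::range<1>{n},
                   [=](sycl::id<1> i) { data[i] *= 2.0f; }).wait();

    // Host reads the results directly - no copy-back step
    float sum = 0.0f;
    for (size_t i = 0; i < n; ++i) sum += data[i];

    sycl::free(data, q);
    return sum > 0.0f ? 0 : 1;
}
```

That single address space, instead of the usual copy-in/copy-out dance, is the kind of complexity reduction I mean.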
 
I may be wrong, but didn't you spend a lot of time "hacking" Intel CPUs to get rid of the management software? Maybe I am reading it wrong. I mean, I don't have to mess with Ryzen at all.

Yeah, I did. Never found any BIOS on the CPU silicon itself... not even sure that would be practical. They are pretty open about what part of your BIOS image stores it. It basically has its own BIOS partition.

No serious reply pls. Apparently our humble moderator thinks that post is a troll post.

Unsure what you mean by this. I was being serious. I have no idea if Intel will accomplish that, but that's the idea anyway.
 
Ideally, it would be a godsend to programmers, as it would reduce coding complexity by orders of magnitude.

I'll wait for benchmarks. ^^
 
I have my doubts there'll be anything meaningful to see. You'd need to completely rewrite existing software, if it even works.

Yeah, I'm kinda skeptical, too.
 
This doesn't matter. Can they provide good drivers for gaming?
 
I have my doubts there'll be anything meaningful to see. You'd need to completely rewrite existing software, if it even works.

Yeah, I'm kinda skeptical, too.

Well, it's useless for me then, 'cause my backlog is massive; I'm not really in the market for new games.
 
BTW, what does CPU and GPU fusion mean for gaming? Will SLI be viable then? Will GPUs finally be free from the clutches of drivers, like CPUs?

That may be a side effect. I could picture a world where a firmware update is akin to updating GPU drivers, only done far less often. That would be, of course, if they get it right. Looking at the bugginess of common GPU drivers, I remain skeptical.
 
Did he say Pancho Villa?
:laugh::laugh::laugh::laugh::laugh::lovetpu:
 
Enjoy your 70 security fixes that will occur bi-monthly.

Intel has lost my trust personally. I have never seen so many security vulnerabilities in my life.

Another new one, ZombieLoad V2, was announced yesterday, and Intel announced 70 incoming security patch fixes. lol, how many times this year now? wow... Ryzen... none needed... since launch lol
You have to realize, too, that Intel has been using basically the same architecture for around 10 years. Intel needs a brand-new series built from the ground up to fix the issue.
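
As an aside, anyone on Linux can check which of these mitigations their own machine is carrying; the kernel exposes one status file per issue under sysfs. A minimal sketch (standard path on recent kernels):

```cpp
#include <filesystem>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>

// Prints the kernel's mitigation status for each known CPU issue
// (spectre_v1, spectre_v2, mds, taa, ...).
int main() {
    const std::filesystem::path dir = "/sys/devices/system/cpu/vulnerabilities";
    for (const auto& entry : std::filesystem::directory_iterator(dir)) {
        std::ifstream f(entry.path());
        std::string status((std::istreambuf_iterator<char>(f)),
                           std::istreambuf_iterator<char>());
        std::cout << entry.path().filename().string() << ": " << status;
    }
    return 0;
}
```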
 
So when they’re trying to sell companies Xe to go with their Xeons, I’m sure that won’t be confusing for anyone.

"Do you stutter?" "No, this is my new rig"

Seriously... a news piece about a system that may finally unify CPU and GPU interfaces, but all it gets from the TPU community is 2 posts joking about a name. That's just epyc.

I think that is because all we've got so far is "will be" and "are going to"... where r u, substance?
 
From the OP:
("Intel is preparing its first enterprise compute platform featuring these accelerators codenamed "Project Aurora," in which the company will exert end-to-end control over not just the hardware stack, but also the software.")

This, to me, means Intel is up to its usual dirty tricks to leverage things in their favor.
While it may not mean that directly, it could be used to block others (AMD?) from being able to "do" certain things AND to make developers dance to their tune, including the gaming industry - at the very least, to have the ability to manipulate things in their favor.

One possible example of such abuse/dirty tricks:
You don't favor Intel products? Then gaming performance gets slowed/crippled if the software sees an AMD CPU in use, or you get restricted/cut off from being able to use the software for your gaming platform development, or from having it work with titles already released.
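
To make it concrete, this is the kind of vendor gate being described - a purely hypothetical sketch, not anything Xe is known to do, though it echoes the old Intel compiler dispatcher controversy. Software reads the x86 CPUID vendor string and branches on it:

```cpp
#include <cpuid.h>   // GCC/Clang helper for the x86 CPUID instruction
#include <cstring>

// CPUID leaf 0 returns the vendor string in EBX, EDX, ECX (in that order);
// on Intel parts it reads "GenuineIntel".
static bool is_genuine_intel() {
    unsigned eax, ebx, ecx, edx;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) return false;
    char vendor[13];
    std::memcpy(vendor + 0, &ebx, 4);
    std::memcpy(vendor + 4, &edx, 4);
    std::memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

// Hypothetical gate: pick the code path by vendor string,
// regardless of what features the CPU actually supports.
void run_workload() {
    if (is_genuine_intel()) {
        // optimized path
    } else {
        // deliberately generic/slower path
    }
}
```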

And don't try to convince me it's not possible for them (or anyone else, for that matter, TBF) to try it.
Intel has been caught red-handed in many schemes over time - that's fact, not fiction - and it certainly isn't above doing it again as a company, especially one that's now becoming desperate, and you know what desperate folks will do as a rule...
 
It's all up to AMD to catch them doing it, or sue. I'm sure they have their eyes open...
 
The software companies could speak up, too, if they discover it happening.
In either case, it is up to them to speak up and do something about it should it occur.
 
This story?

How?
I mean, I understand why people bash the new stuff Intel released. Considering Phi, it is another attempt to create something, and the history doesn't back up the Xe GPU Intel released. This new Xe is Phi in a different, maybe more advanced form, but I don't think it is a great piece of technology. Not saying it will not be if Intel pursues this idea, but the chances are slim to none. That's just how I see it.
The other thing is: since Intel is not trying to compete with NV and AMD on the desktop with their GPU, it gives the impression that Intel feels weak in that department and can't compete with the two companies. So this Xe is kind of an attempt to exist and to deliver some sort of GPU, since it's been said Intel is working on one. More like staying true to its word.
 
It's all up to AMD to catch them doing it, or sue. I'm sure they have their eyes open...
Doing what exactly, tighter CPU and GPU cooperation and coupling? AMD has been working on HSA for years.

Considering Phi, it is another attempt to create something, and the history doesn't back up the Xe GPU Intel released. This new Xe is Phi in a different, maybe more advanced form, but I don't think it is a great piece of technology. Not saying it will not be if Intel pursues this idea, but the chances are slim to none. That's just how I see it.
The other thing is: since Intel is not trying to compete with NV and AMD on the desktop with their GPU, it gives the impression that Intel feels weak in that department and can't compete with the two companies. So this Xe is kind of an attempt to exist and to deliver some sort of GPU, since it's been said Intel is working on one. More like staying true to its word.
Where did that Phi comparison come from? There has been no talk of a mass-x86-cores type of approach, and everything we know points to the contrary - a fairly standard GPU architecture as we know them. Xe is not exactly brand new either; Intel has Gen9.5 and Gen11 that show the evolution towards Xe.

Intel is trying to make money. Margins in compute, especially AI, are far higher than on the desktop. If they lock down an architecture that works there, it is easier to bring it down to the desktop than to start here. And even with Intel having worked on their iGPU for years, remember that the common assumption is still that Intel cannot create a discrete GPU at all.

Edit:
Oh, right. That x86 part is in the original post/news blurb. I call bullshit.
 
The Phi, even if it contained x86 cores, was considered a GPU, just like the Xe is considered a GPU. The Xe is derived from Xeon, and it is stated that the Xe might be capable of x86 instructions. We don't know much about it because it's not out, but it will basically do the same thing the Phi was meant to do. That is why I've pointed out the similarities between the two, and that's why the Xe is so peculiarly similar to the Phi GPU. Xe, as mentioned in the OP's article, is supposed to be an exascale compute part, just like the Phi GPU was.
Making money is an obvious thing. Not sure why you bring that one up.
 
The Phi, even if it contained x86 cores, was considered a GPU, just like the Xe is considered a GPU.
Actually, Phi was not considered a GPU. It was always a compute accelerator. The GPU side of that died with Larrabee.
The Xe is derived from Xeon, and it is stated that the Xe might be capable of x86 instructions. We don't know much about it because it's not out, but it will basically do the same thing the Phi was meant to do. That is why I've pointed out the similarities between the two, and that's why the Xe is so peculiarly similar to the Phi GPU.
Where did you get that Xe is derived from Xeon?
We know enough about Xe to say it is not doing x86.
Making money is an obvious thing. Not sure why you bring that one up.
I brought up money because there is a lot of it in the compute/AI market. And you said Xe not coming to desktop at first is a sign Intel feels weak. When you are designing a new GPU, that is the obvious market to go for. Desktop GPUs are dirt cheap and require a lot of twitchy, games-specific software work.
 