Sunday, May 3rd 2020
Intel Teases "Big Daddy" Xe-HP GPU
The Intel Graphics Twitter account was on fire today, posting an update on the development of the Xe graphics processor and mentioning that samples are ready and sitting in quite an interesting package. The processor in question was identified as an Xe-HP GPU variant with an estimated die size of 3700 mm², which means we are surely looking at a multi-chip package here. We concluded that it is the Xe-HP GPU from the words of Raja Koduri, senior vice president, chief architect, and general manager for Architecture, Graphics, and Software at Intel. In a since-deleted tweet, he called this processor the "baap of all", which translates from Hindi as "big daddy of them all".
Mr. Koduri previously tweeted a photo of the Intel Graphics team in India, which has been working on the same "baap of all" GPU, suggesting this is an Xe-HP chip. This does not appear to be the version of the GPU made for HPC workloads (that role is reserved for the Xe-HPC GPU); instead, this model could be a direct competitor to offerings like NVIDIA's Quadro or AMD's Radeon Pro. Mr. Koduri has since confirmed that this GPU will be used only for data-centric applications, as it is needed to "keep up with the data we are generating". He also added that the focus for gaming GPUs is to start off with better integrated GPUs, and low-power chips above those, which could reach millions of users. That would be a good beginning, as it would let software mature ahead of possible high-performance GPUs in the future. We can't wait to learn more about Intel's Xe GPUs, so stay tuned.
Update May 2: changed "father" to "big daddy", as that's the better translation for "baap".
Update 2, May 3rd: The GPU is confirmed to be a Data Center component.
Sources:
VideoCardz (original), VideoCardz (update)
68 Comments on Intel Teases "Big Daddy" Xe-HP GPU
So until then, long live the king.
This is a GPU. Totally different beast. Turing is still better than RDNA on an older node. RDNA 2 will close the gap, but too late, because Ampere will widen it again. TBH, AMD is pretty lame with GPUs. You do see it here, so it does exist. This is a workstation GPU, not a consumer gaming GPU. If one of the chiplets has 512 EUs, then one chiplet is close to an RTX 2080 Ti (10+ TFLOPS).
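For context, that 10+ TFLOPS figure works out as rough back-of-the-envelope arithmetic if you assume the Xe EU keeps Xe-LP's 8 FP32 ALUs, count an FMA as 2 FLOPs per clock, and guess a clock around 1.3 GHz; the per-EU width and especially the clock are assumptions here, not confirmed specs:

# Hypothetical FP32 throughput of a 512-EU Xe chiplet (illustrative only)
eus = 512
alus_per_eu = 8      # assumed: same FP32 ALU count per EU as Xe-LP
ops_per_alu = 2      # fused multiply-add counts as 2 FLOPs per clock
clock_ghz = 1.3      # assumed clock, purely a guess

tflops = eus * alus_per_eu * ops_per_alu * clock_ghz / 1000
print(f"{tflops:.1f} TFLOPS")  # ~10.6 TFLOPS, roughly RTX 2080 Ti territory

Change the assumed clock and the estimate moves proportionally, so treat the comparison as order-of-magnitude, not a spec.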
videocardz.com/newz/raja-koduri-intel-confirms-xe-hp-gpu-is-data-center-targeted
Why is it that people expect AMD, with a far, far smaller budget, to compete with Nvidia? Not only does Nvidia have a bigger budget, but we know it also pulls stuff like the GeForce Partner Program and other underhanded forms of "marketing" (just like Intel, which was convicted of bribing OEMs to such an extent that even if AMD had given them free chips, they claimed they'd still lose money). I'm not saying that something nefarious is definitely going on, but it'd be pretty naive to think it's impossible, based on Nvidia's past precedent of doing this exact stuff.
It's just like the double standard with pricing... Nvidia gouged with its RTX line, but as soon as Navi was priced similarly due to similar performance, everyone cried murder. Somehow people think they're entitled to dirt-cheap GPUs from AMD, but Nvidia can price however it wants.
And then of course, I can't stand hearing people wishing that AMD was more competitive and blaming AMD for not being more competitive. The undeniable fact is that in the past, like in the late 2000s, even when AMD had a far better GPU at a better price, you computer enthusiasts STILL bought more Nvidia, and that has basically happened every time AMD has had a better product. If you think that's crazy, just look at Ryzen 4000 for mobile: clearly a better product, but will it outsell the competing Intel chip (even if it wins an equal number of OEM designs)? Most likely not.

Basically what I'm saying is, no one really has a right to blame AMD, because even when they gave you exactly what you claimed you wanted... you STILL didn't buy it. As long as there are consumers out there who DON'T take the literal 5 minutes to Google some reviews and instead trust whatever idiots on the internet say, or uninformed Best Buy salesmen, Nvidia will always outsell AMD, and Intel too for that matter. It's really frustrating to me because I won't even buy a $40 piece of electronics without doing as much research as possible to be absolutely sure I'm making the best possible purchase, so I don't understand how any consumer can buy computer hardware that costs multiple hundreds of dollars without doing the same... and yet the vast majority do. As long as Nvidia and Intel can depend on the willful ignorance of consumers, it doesn't matter how much AMD's products outperform them... they just never win.
*I'm not an AMD fan; my motivation for these statements is a desire to ensure that people know the true history behind this subject, because I'm sure most of them think that Nvidia has the market share because it always had better products, which is not true at all. Marketing is why Nvidia has the most market share; that's the only way to make sense of the fact that even when AMD had better products for less, people still didn't buy them.
I'll leave the rest of this curious rant alone. :)
AMD doesn't have the PR, marketing, or money to compete with Nvidia successfully, but that's going to change too.
They're all going to have to draw deep from their innovation stores to get and remain competitive.
Good times.
Hindi slang is very context-dependent. The sense Koduri was going for is "x is the biggest of them all," or "the baap of all silicon." So he meant "big daddy" (a superlative).
I was only thinking the other day that it's been a while since we had some fresh BS from Intel's GFX department.