
Intel Teases "Big Daddy" Xe-HP GPU

That's probably a nice huge GPU for compute stuff, but I just can't help it.

I see the size of that thing and I think something like this



Size of GPU needed for an average performance of 60fps in top ten games of 2020 (or something)
View attachment 153530
Except you have it wrong; Nvidia's in the middle.

And Intel still isn't actually on the chart.
 
I believe you've gotten your Nvidia and AMD sizes switched around. The RX 5700 XT is 251 mm², the RTX 2060 Super is 445 mm². The RX 5700 XT is faster overall (though the 2060 Super also has RT cores).
No I haven't, just for the record. Look, first of all, I am an AMD fan. Secondly, I was just making fun of the Intel chip; it wasn't meant as an AMD vs Nvidia comparison. But let's do that comparison anyway.

First, you should compare with the 2070, not the 2060 Super. They use the same chip, and pricing is not a factor here.
Secondly, let's simplify everything and say that the Nvidia chip at 7 nm would be (445×7)/12 ≈ 260 mm². It would probably be smaller if we check the transistor-density numbers I found here: (445×33.8)/66.7 ≈ 226 mm².
If I am not doing some stupid maths in my head, and if we also consider that, as you say, there are also Tensor and RT cores in that chip, then Nvidia's architecture seems to be (much) more efficient. And let's not forget that it is also old.
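That back-of-envelope scaling can be sketched as follows. The die area and transistor densities are the figures quoted above; the linear-scaling assumption is an oversimplification (density gains rarely apply uniformly to a whole die):

```python
# Back-of-envelope die-area scaling from the post above.
# 445 mm^2 is the TU106 (RTX 2070 / 2060 Super) die; 33.8 and 66.7
# MTr/mm^2 are the transistor-density figures quoted in the thread.
# Assumes area scales linearly, which is a rough simplification.

TU106_AREA_MM2 = 445

# Naive scaling by node name (12 nm -> 7 nm):
by_node_name = TU106_AREA_MM2 * 7 / 12
print(f"scaled by node name: {by_node_name:.0f} mm^2")  # ~260 mm^2

# Scaling by published transistor density (MTr/mm^2):
by_density = TU106_AREA_MM2 * 33.8 / 66.7
print(f"scaled by density:   {by_density:.0f} mm^2")    # ~226 mm^2
```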

Except you have it wrong; Nvidia's in the middle.

And Intel still isn't actually on the chart.
The same explanation as for the other post.
 
No I haven't, just for the record. Look, first of all, I am an AMD fan. Secondly, I was just making fun of the Intel chip; it wasn't meant as an AMD vs Nvidia comparison. But let's do that comparison anyway.

First, you should compare with the 2070, not the 2060 Super. They use the same chip, and pricing is not a factor here.
Secondly, let's simplify everything and say that the Nvidia chip at 7 nm would be (445×7)/12 ≈ 260 mm². It would probably be smaller if we check the transistor-density numbers I found here: (445×33.8)/66.7 ≈ 226 mm².
If I am not doing some stupid maths in my head, and if we also consider that, as you say, there are also Tensor and RT cores in that chip, then Nvidia's architecture seems to be (much) more efficient. And let's not forget that it is also old.


The same explanation as for the other post.
"Will be"? I'm not on about fantasy unreleased chips but what's out now and known about. Try again: Nvidia has no 7 nm parts we know about, and they're going to be massive, I assure you.

AMD fan? You implied Intel will be as shit as AMD and Nvidia are best. Comedy rebuttal.
 
Welcome to the kilowatt GPU era… :roll:
 
Welcome to the kilowatt GPU era… :roll:
With an Intel CPU and Xe GPU you might well need a kilowatt PSU; makes your joke less funny, more true. Good times :shadedshu::D
 
The RTX 3080 and Navi 21 should be very similar in size and performance, so what remains to be seen is whether Intel can match that.
 
The RTX 3080 and Navi 21 should be very similar in size and performance, so what remains to be seen is whether Intel can match that.
Compared to what Intel GPU? This one? Nope. A single 500 mm² one? Sure.
 
I can't wait to see how these will perform! We really need more competition in the GPU market.
 
Knowing Intel and GPUs... it ain't gonna be the dog's bollocks, that's for sure...
 
Remember that 1500W chiller? Yeah, you'll need that again.
 
No I haven't, just for the record. Look, first of all, I am an AMD fan. Secondly, I was just making fun of the Intel chip; it wasn't meant as an AMD vs Nvidia comparison. But let's do that comparison anyway.

First, you should compare with the 2070, not the 2060 Super. They use the same chip, and pricing is not a factor here.
Secondly, let's simplify everything and say that the Nvidia chip at 7 nm would be (445×7)/12 ≈ 260 mm². It would probably be smaller if we check the transistor-density numbers I found here: (445×33.8)/66.7 ≈ 226 mm².
If I am not doing some stupid maths in my head, and if we also consider that, as you say, there are also Tensor and RT cores in that chip, then Nvidia's architecture seems to be (much) more efficient. And let's not forget that it is also old.
...so you're not comparing actual products on actual production nodes, but comparing theoretical products based on oversimplified calculations? Cool. I would argue a far better approach is basing what you say on actual real-world products. No, the Intel GPUs don't exist yet, but the other two do, and we know nothing concrete about how next-gen versions of these will look, so for your point (which isn't bad in and of itself) to come across in the best way possible the logical comparison point is with currently available products. You can of course update it when RDNA 2 and Ampere hit retail should you want to. And yes, 2070 and 2060S is the same die - and also the same performance, so the point is moot, no? 5700 XT beats both.

I think most next-gen GPUs will target 8K, as it seems HDMI 2.1 is now implemented.
Given that 4k60 at Ultra settings is still a struggle for most GPUs, 8k is a pipe dream (and nothing more than a PR bullet point). 4x the resolution for near zero perceptible quality increase, with performance tanking completely ... 1440p will stay the sweet spot for quality and performance in games for years to come, though the next generation will likely be when 4k becomes somewhat accessible.
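For reference, the raw pixel counts behind that argument (standard 16:9 resolutions; treating required performance as proportional to pixel count is itself a simplification):

```python
# Pixel counts for the resolutions discussed above (16:9).
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 4K pushes 2.25x the pixels of 1440p; 8K pushes 4x the pixels of 4K
# (and 9x the pixels of 1440p), which is why performance tanks.
print(pixels["4K"] / pixels["1440p"])  # 2.25
print(pixels["8K"] / pixels["4K"])     # 4.0
```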
 
Indeed, this brings up the question: just how much power would something like this draw? I mean, I seriously can't see how these things wouldn't be several-hundred-watt parts. How do you even cool that? It gets to a point where even extreme custom solutions become impractical.
 
Indeed, this brings up the question: just how much power would something like this draw? I mean, I seriously can't see how these things wouldn't be several-hundred-watt parts. How do you even cool that? It gets to a point where even extreme custom solutions become impractical.
Might get one for winter if they fold well; it gets cold as shit here.

Well, except this isn't consumer. If Intel were to release this 4/8-tile monstrosity, imagine the price; ain't no effing way this aligns with what they said they were making, a mid-tier GPU. This would out-BOM an effing Titan V, and even to consumers it would go beyond the 2080 Ti's price.

And again the pin-out: PCIe plus PHYs for memory, which incidentally look likely not to be needed since it's HBM. Don't think so, personally.

Then again, they needed 49 extra pins for power on their new CPU:shadedshu::p:laugh::laugh::laugh::rockout::banghead::confused::eek::kookoo: for a couple more cores.

Still no, this is enterprise tackle.
 
Indeed, this brings up the question: just how much power would something like this draw? I mean, I seriously can't see how these things wouldn't be several-hundred-watt parts. How do you even cool that? It gets to a point where even extreme custom solutions become impractical.

To get any performance at all, it would be 1000 W, I assume. But the fabric at this scale would eat most of the power... so 2000 W at low clocks, 3000 W turbo? Lol

Intel is DESPERATE for anything beefy.
 
...so you're not comparing actual products on actual production nodes, but comparing theoretical products based on oversimplified calculations? Cool. I would argue a far better approach is basing what you say on actual real-world products. No, the Intel GPUs don't exist yet, but the other two do, and we know nothing concrete about how next-gen versions of these will look, so for your point (which isn't bad in and of itself) to come across in the best way possible the logical comparison point is with currently available products. You can of course update it when RDNA 2 and Ampere hit retail should you want to. And yes, 2070 and 2060S is the same die - and also the same performance, so the point is moot, no? 5700 XT beats both.


Given that 4k60 at Ultra settings is still a struggle for most GPUs, 8k is a pipe dream (and nothing more than a PR bullet point). 4x the resolution for near zero perceptible quality increase, with performance tanking completely ... 1440p will stay the sweet spot for quality and performance in games for years to come, though the next generation will likely be when 4k becomes somewhat accessible.

8K/60 is possible on 2080 Ti SLI using DLSS on max and every setting at medium or lower... Just because it's rendering at below 1440p doesn't make it less valid. :laugh:
 
8K/60 is possible on 2080 Ti SLI using DLSS on max and every setting at medium or lower... Just because it's rendering at below 1440p doesn't make it less valid. :laugh:
True :laugh:
But sure, if we go about it that way any DP 1.4 GPU can game at 8k60 if you set the render resolution to whatever the GPU can handle and just set the GPU driver to upscale it. Now we can all play our (NES) games at 8k on our RX 540s and GTX 1030s! 8k* gaming to the people!
 
Indeed, this brings up the question: just how much power would something like this draw? I mean, I seriously can't see how these things wouldn't be several-hundred-watt parts. How do you even cool that? It gets to a point where even extreme custom solutions become impractical.

One thing in its favour... plenty of cooling surface area...
 
8K and 16K are coming.

DisplayPort Alt Mode 2.0 is a new standard from the Video Electronics Standards Association that allows USB 4 to offer all the bells and whistles of the DisplayPort 2.0 standard as well as transmitting USB data. That means support for 8K displays at 60Hz with HDR, 4K displays at 144Hz with HDR, or even 16K (15360x8640) displays at 60Hz with compression. It's a big step towards USB Type-C becoming a true jack-of-all-trades connector.

The USB 4 spec can already transmit DisplayPort data, but AnandTech reports that the new standard remaps USB-C’s high speed data pins to unlock more bandwidth for video. USB 4 is bidirectional, meaning it can carry up to 40Gbps of data in either direction. However, video doesn’t need to go both ways — you only really need data to pass from your laptop to your monitor (for example). This alt mode means that all that bandwidth can be used to just send video one way, meaning you get a maximum raw bandwidth of up to 80Gbps.
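As a rough sanity check on those figures, here is an uncompressed-bandwidth estimate. It assumes 10-bit RGB (30 bits per pixel) and ignores blanking intervals and link-layer overhead, so real requirements are somewhat higher:

```python
# Rough uncompressed video bandwidth for the modes quoted above.
# Assumes 10-bit RGB (30 bits/pixel); ignores blanking and link
# overhead, so actual requirements are somewhat higher.

def raw_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 30) -> float:
    """Raw video payload in gigabits per second."""
    return width * height * hz * bits_per_pixel / 1e9

# 8K60 fits within the one-way 80 Gbps figure without compression;
# 16K60 clearly needs compression (hence "with compression" above).
print(f"8K60:  {raw_gbps(7680, 4320, 60):.1f} Gbps")   # ~59.7
print(f"16K60: {raw_gbps(15360, 8640, 60):.1f} Gbps")  # ~238.9
```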

 
One thing in its favour... plenty of cooling surface area...

Yes and no. It is massive, so sure, lots of surface area for heat transfer. But most GPUs are bare die to heatsink.
I wonder why they chose an IHS. Ease of installation? Apparently those special socketed Nvidia GPUs are a PITA to install.
Anyway, that IHS will not improve temperatures to keep that GPU cool.
 
...so you're not comparing actual products on actual production nodes, but comparing theoretical products based on oversimplified calculations? Cool. I would argue a far better approach is basing what you say on actual real-world products. No, the Intel GPUs don't exist yet, but the other two do, and we know nothing concrete about how next-gen versions of these will look, so for your point (which isn't bad in and of itself) to come across in the best way possible the logical comparison point is with currently available products. You can of course update it when RDNA 2 and Ampere hit retail should you want to. And yes, 2070 and 2060S is the same die - and also the same performance, so the point is moot, no? 5700 XT beats both.

Well... I wasn't expecting people to take that post - with the picture - so seriously rather than as a fun post.
As for comparing architectures, in efficiency at least, Nvidia is still (far) ahead of AMD. In my opinion it's a fact. We probably disagree here, but anyway.
 
Well... I wasn't expecting people to take that post - with the picture - so seriously rather than as a fun post.
As for comparing architectures, in efficiency at least, Nvidia is still (far) ahead of AMD. In my opinion it's a fact. We probably disagree here, but anyway.


This is not an apples-to-apples comparison between architectures, because Nvidia uses compression everywhere and optimises only for performance, not for correct rendering:

- colour compression;
- full texture compression;
- DLSS.
 
Compared to what Intel GPU? This one? Nope. A single 500 mm² one? Sure.


Compared to what Intel GPU?

Where can we buy one, where are the benchmarks, the multiple games tested from third party reviewers?

There is no comparison, as Intel still doesn't have a GPU beyond the little ones in their CPUs and a few crappy add-in cards from way back. But they have a lot of fanbois, smoke and mirrors, tweets, and thoughts about how great they are...
 
Compared to what Intel GPU?

Where can we buy one, where are the benchmarks, the multiple games tested from third party reviewers?

There is no comparison, as Intel still doesn't have a GPU beyond the little ones in their CPUs and a few crappy add-in cards from way back. But they have a lot of fanbois, smoke and mirrors, tweets, and thoughts about how great they are...

"Vaporware is a real product!" - Intel, 2017 to the foreseeable future.
 