Wrong, the only reason Intel wanted to get into dGPUs is that their MBAs wanted to tap the lucrative margins enabled by the crypto bubble. Except that bubble burst before Intel's dGPUs launched, leaving them with a product without a market, so they quickly pivoted it to the consumer space instead of throwing it away. The AI hype bubble is irrelevant to Intel's dGPUs because (a) they didn't predict it and thus didn't design their GPUs for it, and (b) like AMD, they lack the software ecosystem that would let them compete with NVIDIA.
Except there is no opportunity to compete with NVIDIA, because no other company has invested as much in building a rich and compelling ecosystem around GPU compute, and no other company will manage to build a competing ecosystem before the AI bubble bursts. So once again Intel will find itself with a product that doesn't come close to justifying the resources ploughed into it, a product that's almost certainly never going to deliver a positive return on investment (never mind the kind of ROI its MBAs originally hoped for) - and Pat Gelsinger is very much a positive-ROI kinda guy.
He also understands Intel: it's a CPU company, not a little-bit-of-everything company. It's good at CPUs when it's allowed to be, and for the past decade it hasn't been allowed to be. That's why he's already cut so many projects: to bring the focus back to the thing the company does best. Whether he can pull that off, given how much ground Intel has already lost to its competitors, is still up for debate - but if he decides the dGPU division has to go because those resources would be better spent on CPUs, he'll swing the axe without a second thought, sunk costs be damned. dGPUs can always be a fourth-time's-the-charm project for someone else; right now he has a company to save.
Except that in one sentence you decide history doesn't matter and isn't a predictor of the future. I do wish I could live in your interpretation of reality, where neither the past nor the future matters.
Let me bottom-line this, since you seem determined to take the narrowest of views, one that excludes both past and future. Say I'm an Intel investor. Yes, Intel theoretically came late to the crypto bubble...but look at the first card's scheduler and its mining performance: it launched as a pretty trash option for crypto even if crypto hadn't been a bubble. Maybe Intel takes that to heart as an experiment...but in that case it doesn't plow huge amounts of resources into developing a card that started from the word go with heavy disadvantages. As an investor I also fry the CEO and board for plowing money and resources into Battlemage...because that's throwing more money at the same problem. Now put me inside Intel...my investors are angry...I've already cut the smaller investments that were easy to write off as experimental losses...but I need a future.
What do I see?
dGPUs can suck down enormous amounts of power
PCI-e offers plenty of bandwidth to keep secondary processors saturated
AI right now is basically LLMs...so the more accelerators you pack into a given area, the better it performs its dog-and-pony show
Nvidia is making bank...and they are officially no longer a silicon company...they call themselves a software company...and demonstrate it by releasing worse-specified products at higher prices and still making bank
Nvidia can also be trusted to show investors the way forward...because blue monkey copies green monkey...discrete cards with novel accelerators baked in make bank
You want to claim they were too late to the dGPU gouge-fest...and I agree. You want to claim the AI bubble will collapse before Intel gets its foot in the door...because...well, not because of any actual reasons: without knowing how long it'll take Intel, you still expect it's guaranteed to take too long. Funny...I think that's a pretty stupid crystal ball when Intel already has plenty of hardware out there. Gaudi 3 being "50% faster than H100" is a questionable claim...but if it's true, slapping that sucker onto an add-in card is a license to print money, just like it is for Nvidia.
So...I have to ignore the past, pretend the future is bleak, and ignore a present in which Intel is lining up AI successes that look like a win if it can slap them onto any board within a few months (Gaudi 3, April 2024), all so I can conclude that Intel is going to axe its GPU division. Kinda seems like you have a hate boner for their dGPUs that will only be sated by their destruction. For my part...I just think it's Intel failing upward somehow...and they seem to have done so because of a conflagration of failures rather than despite it.