
ARC "driver issues" turning out to be actually hardware deficiencies - Battlemage reveal

Really keen to see what Battlemage can BRR up to; it's shaping up to be a more exciting gen than RDNA4 at this point. In some ways Intel are showing more interest and promise than AMD are.
 
SSDs -> Solidigm
NUC -> Asus
Optane -> Binned
1st-party Intel boards -> Binned
Gelsinger is known for cutting cruft out of companies to make them leaner and meaner. He did it at VMware and was very successful. The categories above aren't known for high margins, and he trained under Andy Grove, one of the co-founders of Intel, famous for transitioning Intel from a memory company to a CPU company. He has said in the past "I never want to be in memory", which automatically cuts out SSDs and Optane.

Now, under heavy financial pressure, the GPU division could be cut, but they won't be able to cut the iGPU division anyway.
 
Now, under heavy financial pressure, the GPU division could be cut, but they won't be able to cut the iGPU division anyway.
dGPU was always an extremely strange segment for Intel to try to play in, given the incumbents and the abject failures that were the company's previous attempts in this space. I also feel like they've intentionally set their dGPUs up to fail, given that they always seem to launch them at the same time as their competitors launch theirs. Unless BM is something really special, I foresee its launch playing out exactly as Alchemist's: strong initial market penetration that drops off to negligible within a year. If that happens, Gelsinger will without a doubt kill the dGPU business, regardless of how much capital has already been plowed into it.

This is especially true because when Arc was conceptualised, the expectation was that dGPU and iGPU would be able to share hardware and drivers and thus save money. That quickly proved impossible, and I doubt it ever will be possible, because the iGPU team has a completely different focus to the dGPU team. The former exists to provide the most minimal hardware that can do basic desktop tasks in the smallest physical, power and thermal envelope; the latter to do... basically the opposite. And of course Intel is unlikely to be willing to rock the iGPU boat when they already have issues with heat and power on their CPUs; the last thing they need is to make that problem worse by trying to shove more dGPU-focused tech in there.
 
You're all very unrealistically positive. Someone has to say something negative to balance it out; I accept the burden:
Intel is just losing money on the GPU business, billions on top of billions. If this generation doesn't work out, it will be the last to come in the form of dGPUs for desktops. Bye bye.
 
Hardware has been bottlenecking ARC
NOT
Driver has been bottlenecking ARC
How about both? Clearly, ARC numbers got a whole lot better with successive driver updates, as they fixed many broken games. This does NOT mean there is no hardware bottleneck, just that it's not an either/or.
 
dGPU was always an extremely strange segment for Intel to try to play in, given the incumbents and the abject failures that were the company's previous attempts in this space. ...

So I read this...and I look at this differently.

Yes, a lot of the "low impact, high return" moonshot items failed to materialize. Likewise, the dGPU division started out crippled in performance by some design choices. Despite this, Intel plowed resources into the dGPU market to be a lower-middle-tier option...for many games. None of that inherently guarantees success, but it showed Intel was willing to keep this on life support despite knowing that it'd be a loss leader in costs and returns with a primarily mid-market target.

Where we strongly diverge is that I think Intel is invested in the dGPU market not as an end, but a means to an end. Think like a corporate schmuck, and you'll see that the opportunity is to copy Nvidia. Release a dGPU as an experimental platform, get it working right, then plow your resources into novel accelerators. Think RT cores, AI cores, etc... By focusing on a market able to absorb products priced substantially over cost, you get guinea pigs who'll spend $269 for a 3060 in 2024 (as of 6/12/24 on Newegg) or $289 for a 4060 with 2/3 the VRAM. It's relatively easy to experiment with that kind of fat...plow it into research losses, then turn around and release an AI accelerator card with all of the lessons learned. As such, I view this as a "win" on paper whether Intel becomes the next big dGPU force....or whether they package this as a grand experiment and cash in on the AI craze.


Side note...300+ watts for a dGPU...whereas current processors at that power level require liquid cooling. I think they've side-stepped the need for power management with this too...so win-win-win in the old Intel playbook.
 
I swear I've known about this since the beginning, but I can't quite seem to remember where I learned it. Perhaps it was just speculation that turned out to be right. idk.

Where we strongly diverge is that I think Intel is invested in the dGPU market not as an end, but a means to an end. ...
Yeah, I agree. Intel has realized the advancements in silicon technology have moved away from CPUs and into GPUs, and they really can't afford to miss another boat like this (as they did with smartphones). They nearly missed this boat too, but they may just have caught it in time if they work their butts off. Sure, they are spending $500 to make a GPU that sells for $300 now, but in a few years, perhaps they'll be selling that GPU to a tech firm running LLMs that's willing to pay thousands of dollars. I don't think Intel is ducking out of this one. That would essentially be suicide.

And of course Intel is unlikely to be willing to rock the iGPU boat when they already have issues with heat and power on their CPUs; the last thing they need is to make that problem worse by trying to shove more dGPU-focused tech in there.
Well, isn't that exactly what they are doing with Lunar Lake? Having those Xe cores in there? Or perhaps that's just marketing speak for their existing iGPU technology. Or perhaps you mean only on desktop chips? I'm not really sure, but I thought Arrow Lake had Arc technology in there too.

Besides, we don't yet know if the next gen of Intel chips will have the same heat/power issues the last 5(?) or so gens have had, LOL.

I just hope Arrow Lake won't take up too much of the die with AI cores that are useless to me. They could have used that space for more cache or cores; anything other than AI would have been fine. Maybe they could put a little cigarette lighter in there. Now that would be way more interesting than AI cores. A little dangerous perhaps, but definitely eye catching.
 
Well, isn't that exactly what they are doing with Lunar Lake? Having those Xe cores in there? Or perhaps that's just marketing speak for their existing iGPU technology. Or perhaps you mean only on desktop chips? I'm not really sure, but I thought Arrow Lake had Arc technology in there too.
Not only LNL and ARL, it is already inside MTL. Without it there would be no MSI Claw.
It's a great upgrade from their earlier iGPUs, but it has the same weaknesses as their desktop ARC, exacerbated by fewer cores and a smaller power budget.
 
Where we strongly diverge is that I think Intel is invested in the dGPU market not as an end, but a means to an end. ...
Wrong. The only reason Intel wanted to get into dGPUs is that their MBAs wanted to tap the lucrative margins enabled by the crypto bubble. Except that bubble burst before Intel's dGPUs launched, leaving them with a product lacking a market, so they quickly pivoted it to the consumer space instead of throwing it away. The AI hype bubble is irrelevant to Intel's dGPUs because (a) they didn't predict it and thus didn't design their GPUs for it, and (b) like AMD, they lack the software ecosystem that would allow them to compete with NVIDIA.

Except there is no opportunity to compete with NVIDIA, because no other company has invested as much in building a rich and compelling ecosystem around GPU compute, and no other company will be able to build a competing ecosystem before the AI bubble bursts. So once again Intel will find itself with a product that really doesn't justify the amount of resources they've ploughed into it, a product that's almost certainly never going to provide a positive return on investment (never mind the kind of ROI that their MBAs originally hoped for) - and Pat Gelsinger is very much a positive ROI kinda guy.

He also understands Intel: it's a CPU company, not a little-bit-of-everything company. It's good at CPUs when it's allowed to be, and for the past decade it hasn't been allowed to. That's why he's cut so many projects already, to bring the focus back to the thing the company does best. Whether he can accomplish that given how much ground Intel has already lost to its competitors is still up for debate, but if he thinks the dGPU division has to go because those resources will be better spent on CPU - he'll swing the axe without a second thought, sunk costs be damned. dGPUs can always be a fourth-time's-the-charm project for someone else, but right now he has a company to save.
 
Wrong. The only reason Intel wanted to get into dGPUs is that their MBAs wanted to tap the lucrative margins enabled by the crypto bubble. ...

Except in one sentence you decide that history doesn't matter, and that it isn't a predictor of the future. I do wish I could live in your interpretation of reality where both the past and the future don't matter.

Let me bottom-line this, as you seem to want to take the narrowest of views that excludes both past and future. I'm an Intel investor. Theoretically I came late to the crypto bubble...but if you look at my scheduler and performance in crypto mining, I was also released as a pretty trash option for crypto even if it wasn't a bubble. Maybe I take this to heart as an experiment...but if that's the case I don't plow huge amounts of resources into developing a card that started from the word go with heavy disadvantages. I also fry the CEO and board for plowing money and resources into Battlemage...because it's throwing more money at a problem. I'm now in Intel's shoes...I see my investors are angry...I've cut the smaller investments which were easy to write off as experimental losses...but I need a future.

What do I see?
dGPUs can suck down enormous amounts of power
PCI-e offers plenty of bandwidth to feed secondary processors (see the rough numbers after this list)
AI is basically LLMs...so the more accelerators you pack into an area, the better it performs its dog-and-pony show
Nvidia is making bank...and they are officially no longer a silicon company....they call themselves a software company...and demonstrate it by releasing worse-specified products at higher prices and still making bank
Nvidia can also be trusted to show the way forward to investors...because blue monkey copies green monkey....separate cards with novel accelerators baked in make bank
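
For a rough sense of scale on that bandwidth point, here's a quick sketch of the theoretical one-direction throughput of an x16 slot per generation, assuming 128b/130b line encoding and ignoring other protocol overhead:

```python
# Rough theoretical PCIe x16 bandwidth per generation, one direction.
# Assumes 128b/130b encoding (Gen3 and later) and ignores packet overhead.
LANES = 16
ENCODING = 128 / 130  # usable bits per transferred bit

for gen, gts_per_lane in {"Gen3": 8, "Gen4": 16, "Gen5": 32}.items():
    gb_per_s = gts_per_lane * ENCODING / 8 * LANES  # GT/s -> GB/s across 16 lanes
    print(f"PCIe {gen} x16: ~{gb_per_s:.1f} GB/s")
```

Whether ~16-63 GB/s counts as "plenty" depends on the workload, but it gives an idea of the ceiling being talked about.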


You want to pretend that they were too late to the dGPU gouge fest...and I agree. You want to pretend that the AI bubble will collapse before Intel gets its foot in the door...because...well, not because of any stated reason; without knowing how long it'll take Intel, you assume it's still guaranteed to take too long. Funny...I believe that's a pretty stupid crystal ball when Intel has plenty of stuff out there already. Gaudi 3 being "50% faster than H100" is a questionable claim...but if true, slapping that sucker on an add-in card is a license to print money, just like Nvidia's.



So...I have to ignore the past, pretend the future is bleak, and ignore the present where Intel is lining up AI successes that look like a win if they can slap them on any board within a few months (Gaudi 3, April 2024), all so that I can say that Intel is going to axe its GPU division. Kinda seems like you have a hate boner for their dGPUs that will only be sated by their destruction. For my part...I just think it's Intel failing upward somehow...and they seem to have done so because of a confluence of failures instead of despite them.
 
He also understands Intel: it's a CPU company, not a little-bit-of-everything company. It's good at CPUs when it's allowed to be, and for the past decade it hasn't been allowed to. That's why he's cut so many projects already, to bring the focus back to the thing the company does best. Whether he can accomplish that given how much ground Intel has already lost to its competitors is still up for debate, but if he thinks the dGPU division has to go because those resources will be better spent on CPU - he'll swing the axe without a second thought, sunk costs be damned. dGPUs can always be a fourth-time's-the-charm project for someone else, but right now he has a company to save.
That's not what he said. He said dGPUs are a way to make money on something significant development cost and time is being spent on anyway - iGPUs. Right now it may not seem like it, but if they are able to get harmony between the two and have good products, it'll essentially work out that way.

Also, he has said he regrets Intel not being able to continue with Larrabee and GPU development as a major goal. He's not just some accountant schmuck.
 
If Arc gets axed, then we're in trouble!
 
If Arc gets axed, then we're in trouble!

What gave you the idea that it would be? It's probably one of the most promising divisions within Intel right now. They have momentum and a golden chance - AMD's graphics division has been blundering non-stop for the entire generation and Nvidia's all but given up on the entry and midrange markets. You should expect Battlemage to be a very competitive product indeed.
 
You should expect Battlemage to be a very competitive product indeed.
It's going to be better than Alchemist's situation for sure.

But the die with A770-level performance is going to be the first one released, in a few months. The one that'll perform at RTX 4070 Ti level is a year away - summer of 2025.
 
But Intel did make the best motherboards long ago, right?!
 
But Intel did make the best motherboards long ago, right?!

Wouldn't call them the best - but they were reliable and generally of high quality. None of the Intel boards I've owned sucked.
 
I'm Intel's biggest cheerleader in this space; we need as much competition as we can get...

But Intel did make the best motherboards long ago, right?!
Wouldn't call them the best - but they were reliable and generally of high quality. None of the Intel boards I've owned sucked.
System integrator boards, they were ok...

Yeah, nothing special apart from Skulltrail maybe, but those were not for most.

They also had some weird policies that affected CPU warranties while they were still a thing.
In that, any warranty claims submitted for CPUs needed to be validated on Intel branded boards.

Only reason I even found that out was due to a friend's bad luck.
He had a Core 2 Duo with some borked temp sensors, tested it in both his and my rig (different boards, same issue).
 
He had a Core 2 Duo with some borked temp sensors, tested it in both his and my rig (different boards, same issue).
Yeah, some 45 nm Core 2s, especially early ones (C0 in particular!), are likely to have the minimum reported temp stuck at or around 45 °C.

The E0s are more likely to actually be able to report lower core temps.
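
Tangent, but if anyone wants to sanity-check what their chip actually reports, here's a minimal sketch for Linux that reads the coretemp driver's values straight from sysfs (assumes the coretemp module is loaded); on an affected chip you'd see cores that never drop below ~45 °C at idle:

```python
from pathlib import Path

# Print per-core temperatures exposed by the Linux coretemp driver.
# Each temp*_input file holds millidegrees Celsius.
for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
    if (hwmon / "name").read_text().strip() != "coretemp":
        continue
    for temp_file in sorted(hwmon.glob("temp*_input")):
        label_file = temp_file.with_name(temp_file.name.replace("_input", "_label"))
        label = label_file.read_text().strip() if label_file.exists() else temp_file.name
        print(f"{label}: {int(temp_file.read_text()) / 1000:.1f} °C")
```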
 
We can see from Lunar Lake reviews that it's already doing a lot better in games relative to its Time Spy results. This is the result of micro-architectural changes in Xe2, aka Battlemage.

Apparently Celestial is the much bigger change though, the one that'll dwarf the improvements in Battlemage. It is said that this is what can get them competitive. Interestingly, the same sources are saying that they'll skip Xe3/Celestial, which is a puzzling decision. If anything, Battlemage should be skipped in favor of Celestial if the big change in Celestial is indeed real. Ideally they shouldn't skip anything, if only for software stability.

A 2-year GPU development cadence means 2024-2025 for BMG and 2026-2027 for Celestial. If there's no dGPU for Celestial, the successor to BMG is 2028-2029. I would classify that as a mini failure of strategy. Intel often makes these almost brain-dead decisions and then wonders why they can't get into a new market.
 
I suspect there are issues with Celestial and Druid is progressing along nicely.
Similar to how Intel abandoned 20A and focused on 18A when the timelines started blurring together.
 
Similar to how Intel abandoned 20A and focused on 18A when the timelines started blurring together.
And losing 10% performance.

I think they just rebranded 20A to 18A. 20A was supposed to be 15% over Intel 3, and 18A another 10% over 20A. Now they are saying 15% for 18A over Intel 3, which is just the old 20A claim. And just a 30% density improvement.

The real 18A seems to be 18A-P, a year away.
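
For what it's worth, the "losing 10%" reading falls out of simple compounding, assuming the quoted figures are perf/W multipliers that stack (my assumption about how the claims combine):

```python
# Compounding check on the node claims quoted above (assumed multiplicative).
intel3_to_20a = 1.15    # original claim: 20A = +15% over Intel 3
node_20a_to_18a = 1.10  # original claim: 18A = +10% over 20A

original_18a = intel3_to_20a * node_20a_to_18a  # ~1.265, i.e. ~26.5% over Intel 3
current_18a = 1.15                              # what is now quoted for 18A

print(f"old 18A vs Intel 3: +{(original_18a - 1) * 100:.1f}%")          # ~+26.5%
print(f"new 18A vs Intel 3: +{(current_18a - 1) * 100:.1f}%")           # +15.0%
print(f"missing step: ~{(original_18a / current_18a - 1) * 100:.0f}%")  # ~10%
```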
 
It's interesting how the move to SIMD16, improving compatibility, was the most noticeable part of Battlemage. It still underperforms in UE5.

Also, it still needs quite a bit of driver work:
-Driver overhead at 1080p and with slower CPUs, arguably worse on the B580 than on its predecessor. They need to "fix" it before the B770, or by the B770, because it'll get worse. When Celestial comes out, it'll become even worse, because it'll be faster.
-The promised DX11 driver with performance optimizations that don't require whitelisting has not launched yet.
-When is VR support coming? Probably de-prioritized. A promised feature too.

It still has ReBar performance/compatibility issues and still has high idle power.

Changing from ARC Control to new software shows a bit of dysfunction within the group. Hopefully this is their low point.

Battlemage overall is a good improvement. There seem to be fewer glitches and graphics errors overall than on Alchemist. They need to keep at it.
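
On the ReBar point, a quick way to confirm whether Resizable BAR is actually active on a given card under Linux is to look for the capability in lspci's verbose output (assumes pciutils is installed; run as root for the full capability listing, and swap in your own GPU's PCI address):

```python
import subprocess

GPU_ADDR = "03:00.0"  # hypothetical example; find yours with `lspci | grep -i vga`

# Verbose PCI info includes a "Physical Resizable BAR" capability block on
# cards/drivers that expose it, listing the current and supported BAR sizes.
out = subprocess.run(
    ["lspci", "-vv", "-s", GPU_ADDR],
    capture_output=True, text=True, check=True,
).stdout

if "Resizable BAR" in out:
    for line in out.splitlines():
        if "Resizable BAR" in line or "BAR 0:" in line:
            print(line.strip())
else:
    print("No Resizable BAR capability reported for this device.")
```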
 
My 13th gen CPU has Xe2 integrated graphics, and games galore work better than expected. Go figure.

I have 64 GB of RAM, but my iGPU lacks ReBar, so what else is new. Guess the integrated graphics has limited resources.
 
My 13th gen CPU has Xe2 integrated graphics, and games galore work better than expected. Go figure.

I have 64 GB of RAM, but my iGPU lacks ReBar, so what else is new. Guess the integrated graphics has limited resources.
I am confused by this post. Raptor uses Iris Xe. ARC first showed up in Meteor Lake. Xe2 is Lunar Lake and Arrow Lake.
 