Thursday, November 28th 2024
Intel Continues to Develop GPUs Beyond Arc Battlemage
New rumors emerged via Notebookcheck that point to Intel's intention to continue developing discrete desktop GPUs after the imminent launch of its Arc Battlemage series. Leaker JayKihn confirmed in a comment on X that Intel is actively working on future GPU generations, namely Arc Celestial and Arc Druid, contradicting earlier rumors of a possible cancellation should Battlemage underperform. Sources say Intel remains committed to its GPU roadmap. With the Arc Battlemage series led by the B580 model, Intel is targeting the budget segment, putting it in direct competition with NVIDIA's RTX 4060 and RTX 4060 Ti. Price-wise, we can expect Arc Battlemage graphics cards to sell for around $250 (for the 12 GB model), and although the performance will not be groundbreaking, it could attract interest from buyers on a tight budget.
Since Intel has reportedly canceled all plans to launch discrete laptop Battlemage cards, Arc Druid and Arc Celestial could follow the same path. Although details regarding Arc Celestial and Arc Druid are scarce, confirmation of their development is good news for the PC graphics card market. What we do know now is that Arc Celestial models are expected to arrive in 2025 or early 2026, which could coincide with the launch of Intel's Panther Lake CPUs. The GPUs are expected to take advantage of the new Xe3 graphics architecture (the Arc B580 will feature Intel's Xe2-HPG architecture). However, given Intel's recent challenges, the ultimate success of these next-generation GPUs remains to be seen; we still believe the success of Arc Battlemage will be decisive for Intel's future GPU development.
Sources:
Notebookcheck, @jaykihn0
26 Comments on Intel Continues to Develop GPUs Beyond Arc Battlemage
I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
This might turn out to be a great card for a legacy box then
And then people wonder why we have a monopoly in discrete GPUs and why people buy only Nvidia.
I got 2 A770s in 2023. The second one is a Sparkle Titan 16 GB.
Nothing stops a manufacturer from doing an ARM CPU + AMD/NVIDIA GPU combo, which is already possible on ARM platforms with PCIe (either something like an Ampere workstation, servers, or hacking some SBC with a PCIe slot). However, your discrete GPU options are still the same as on an equivalent x86 counterpart.
We would just have another player in the market, x86 would still be a thing, maybe with a lower (but still major) market share for the next decade. If we're talking about iGPUs, yeah, but it's not like we have many options for those anyway.
For desktop with your regular PCIe, I don't see how it'd be the case. 3 things:
- I already believe the desktop form factor is dying, and it currently has way too many drawbacks for performance and power efficiency. Gamers may hate it, but the likely future is to have mini-PCs with an SoC that has everything integrated, something like Strix Halo. Unified memory on a large memory bus. I don't think we will be seeing quad-channel support on consumer DIY platforms anytime soon, whereas we will be seeing Strix Halo with its 256-bit LPDDR5X bus next year. With this in mind, I can very well see Nvidia making such an integrated product, which can be used in both laptops (goes hand in hand with those MediaTek rumors) and also mini-PCs, meaning that the dGPU market would just shrink, no matter if x86 or ARM.
- Another thing is that dGPUs have no strict requirement for x86 or an "Intel platform". As I said, you can very well slap an Nvidia, AMD or even Intel dGPU into an ARM platform and use it. PCIe is a standard and works across many different ISAs; you can even run AMD and Intel dGPUs on a RISC-V CPU if you so desire (on Linux, just to be clear).
- All your worries seem to be just related to games, since there's this whole Windows-x86 relationship in the dGPU market. There are already many products using Nvidia and AMD GPUs with ARM, and Nvidia already has CPUs; it's just that none of those are meant to run games.
Still, I don't see what would change. x86 would still be a thing. dGPUs might become more expensive and not so accessible, but will still be a thing, same goes for DIY desktop building.
On the Windows side I believe it's more of a chicken and egg problem. Once there's some traction (be it on a single SoC or DIY form factor), drivers for the ARM version should become available.
Thinking positively, I can't see DIY literally being dead. Any such negativity reminds me of the early-Surface era: we got very close to losing DIY already, before there was a lot of pushback against Surface RT and Windows RT.
On top of that, we aren't being forced to get a laptop, despite Microsoft's advertisement showing off laptops on Windows 10 users' screens.
I still have high hopes for Arc! Especially if GeForce cards are going to be too expensive for what they should be. I'd rather have an Arc A770 than a GeForce RTX 4080 anyway, based on how expensive they are!
I feel like Nvidia is pushing it, even with the price of the RTX 4070. :(
The A770 is a failed upper-mid-range card being sold at a steep discount against a newer generation of 5/4 nm cards. If you pick up one of the $229 open-box units, sure, you could argue the price offsets its low performance, if you are willing to put up with Intel's weirdness. Most consumers don't want to take that risk; especially if $250 is a big deal for them, they want a card that will last. So far Intel isn't inspiring much confidence.
OK... I am talking about a full attack on the consumer market, not a few boards for developers. I am talking about the top discrete GPUs, for example, an RTX 5090 performing better on the ARM platform, getting released 3 months earlier and being 10% cheaper than the same model for the x86 platform. Players will start building Nvidia-powered, ARM-based systems for gaming, instead of x86 ones with Intel or AMD CPUs. I can't see mini-PCs taking over, but I could see laptops, consoles and pre-built systems, designed by Nvidia, with minimal upgradability, becoming the preferred options for the majority of gamers. Those pre-built systems could come not just as mini-PCs, but also as typical desktop form factors, offering the chance for multiple storage solutions and probably a couple of PCIe slots for upgrades, also targeting semi-professionals.
Nvidia is known for creating restrictions, walled gardens, for its customers. 15 years ago, when it was trying to dictate PhysX and CUDA as features that would be available only to loyal customers, it had locked its drivers in such a way that you couldn't use an AMD card as a primary card and an Nvidia as a secondary card. Doing so, putting an AMD card as primary, meant CUDA and PhysX not running at all on hardware, or running with badly unoptimised software code on one core of a multi-core CPU, using the ancient MMX instruction set. It was a joke. So, it wouldn't be strange for Nvidia to put restrictions on its products that would make them refuse to work at full speed, or at all, or disable some of their best features when used with platforms that are not made by Nvidia.
Gaming is what teenagers do, compute is what many professionals want. If x86 loses the advantage of being the de facto option for high-end gaming and high-performance software, it will start losing the battle against ARM in the PC market. What is the biggest failure of the Qualcomm-based Windows on ARM laptops? They are terrible at gaming. One of the reasons they are not flying off the shelves.
Desktops alone are already a minority of shipments, and have a far lower growth rate when compared to laptops (source).
Given how DIY is an even smaller fraction of the overall desktop market, I think it's fair to say that an off-the-shelf mini-PC that's powerful enough for most users will likely achieve a higher market share compared to DIY setups. Mac minis have already shown that this is doable for many people, maybe an equivalent with Strix Halo can prove the same for the gaming crowd.
Another thing is how a gaming setup has only become more and more expensive over time, making it harder for people to keep up with the hobby.
Of course that's just my opinion and I could be totally wrong. If I were able to predict the future with certainty, I'd have won the lottery already :p No, because I don't see how a 5090 would be different for ARM or x86. How would any of that be an "attack"? If anything, having more options in the market would be a good thing. That's the point I'm trying to understand. I don't see any way a "5090 ARM-exclusive edition" would exist, unless you're talking about fully integrated devices, which would fall into my mini-PC point. If you can elaborate on how such a thing would exist in a discrete manner, I guess it'd be better for both of us.
Strix Halo will be a nice example of a device that has no proper equivalent desktop-wise, but it's an SoC, just like consoles. Fair enough. Technically such a pre-built system as you mentioned wouldn't be much different from a mini-PC, but the somewhat increased size for some expansion (like SATA, extra M.2, and 1 or 2 PCIe slots) could still make it way smaller than your regular DIY setup. Maybe more similar to those ITX ones (without any of the upgradeability, apart from some peripherals).
I can also totally see Nvidia downscaling their DGX stations for a more "affordable" workstation with proper PCIe slots and whatnot, but that would be hella expensive still and outside of the range of your regular desktop buyer. Compute has no hard requirements for x86. As I said, Nvidia on ARM is already a thing in the enterprise. Your only point seems to be gaming, for which the market, albeit significant, is a really small portion of the overall PC market share. I totally disagree with this point, and Apple products are a great counter argument to that.
Qualcomm products failed because software integration is still bad, performance was unimpressive against the competition, and pricing made no sense compared to any other option.
People don't buy ultralight laptops for gaming anyway, at least I never even attempted to open a single game in my LG Gram.