Thursday, November 28th 2024

Intel Continues to Develop GPUs Beyond Arc Battlemage

New rumors that emerged via Notebookcheck point to Intel's intention to continue developing discrete desktop GPUs after the imminent launch of its Arc Battlemage series. Leaker JayKihn confirmed in a comment on X that Intel is actively working on future GPU generations, namely Arc Celestial and Arc Druid, contradicting earlier rumors that pointed to a possible cancellation should Battlemage underperform. Sources say Intel remains committed to its GPU roadmap. With the Arc Battlemage series led by the B580 model, Intel is targeting the budget segment, putting it in direct competition with NVIDIA's RTX 4060 and RTX 4060 Ti. Price-wise, we can expect Arc Battlemage graphics cards to sell for around $250 (for the 12 GB model), and although the performance will not be groundbreaking, it could attract buyers on a tight budget.
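As a rough illustration of that budget positioning, here is the VRAM-per-dollar arithmetic (a sketch only: the B580 figure is the rumored price above, and the NVIDIA figures are their launch MSRPs):

```python
# Rough VRAM-per-dollar comparison. The B580 entry uses the rumored $250
# price above; the NVIDIA entries use their launch MSRPs. Illustrative only.
cards = {
    "Arc B580 (rumored)": (250, 12),  # (price USD, VRAM GB)
    "RTX 4060":           (299, 8),
    "RTX 4060 Ti 8GB":    (399, 8),
}

for name, (price, vram) in cards.items():
    print(f"{name:19s} ${price} / {vram} GB = ${price / vram:5.2f} per GB")
```

At roughly $21 per GB versus $37-$50 for the NVIDIA cards, the memory-capacity angle is the clearest part of the pitch.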

Since Intel has reportedly canceled all plans to launch discrete laptop Battlemage cards, Arc Druid and Arc Celestial could follow the same path. Although details regarding Arc Celestial and Arc Druid are scarce, confirmation of their development is good news for the PC graphics card market. What we do know is that Arc Celestial models are expected to arrive in 2025 or early 2026, which could coincide with the launch of Intel's Panther Lake CPUs. The GPUs are expected to take advantage of the new Xe3 graphics architecture (the Arc B580 will feature Intel's Xe2-HPG architecture). However, given Intel's recent challenges, the ultimate success of these next-generation GPUs remains to be seen; we still believe the success of Arc Battlemage will be decisive for Intel's future GPU development.
Sources: Notebookcheck, @jaykihn0

28 Comments on Intel Continues to Develop GPUs Beyond Arc Battlemage

#1
Vayra86
So what do we define as 'success' in the case of Battlemage I wonder. Some % of market share? Some number of sales? General quality of product being in order, including drivers and legacy gaming on it? Being able to hit the performance of the competition? All of the above?

I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
Posted on Reply
#2
Nomad76
News Editor
Vayra86So what do we define as 'success' in the case of Battlemage I wonder. Some % of market share? Some number of sales? General quality of product being in order, including drivers and legacy gaming on it? Being able to hit the performance of the competition? All of the above?

I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
TBH, with almost zero market share in dGPUs and dropping out of the laptop sector, my only guess is "not to be trashed by reviews and public feedback at least". This way Intel can hope it will get some OEM deals for corporate PCs fitted with the next Celestial and Druid. (This is my personal view and it can be way off.)
Posted on Reply
#3
Dristun
Vayra86So what do we define as 'success' in the case of Battlemage I wonder. Some % of market share? Some number of sales? General quality of product being in order, including drivers and legacy gaming on it? Being able to hit the performance of the competition? All of the above?

I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
Considering they're currently hovering around zero, I'd say 3-4% would already be a massive success. General quality is alright, and legacy gaming too (HW Unboxed tested 250 games; I played around 40 from 1996-2013, software rendering had to come into play in only a few, and all bar two, Legacy of Kain 1 and Revenant, worked). Unless someone is trying to play old games at 360 fps on a HRR display, but then I have to question them skimping so hard on a video card, haha.
Posted on Reply
#4
Vayra86
DristunConsidering they're currently hovering around zero, I'd say 3-4% would already be a massive success. General quality is alright, and legacy gaming too (HW Unboxed tested 250 games; I played around 40 from 1996-2013, software rendering had to come into play in only a few, and all bar two, Legacy of Kain 1 and Revenant, worked). Unless someone is trying to play old games at 360 fps on a HRR display, but then I have to question them skimping so hard on a video card, haha.
Oh, that's nice info indeed.

This might turn out to be a great card for a legacy box, then.
Posted on Reply
#5
Solaris17
Super Dainty Moderator
Nomad76Since Intel has reportedly canceled all plans to launch discrete laptop Battlemage cards
Discrete shouldn't be confused with integrated, which Intel already does in laptops. For the readers, anyway.
Posted on Reply
#6
john_
People who think that Intel has the choice to stop developing discrete GPUs can't really understand how vital GPUs are for the future of a platform. Imagine tomorrow ARM having better GPU options than the x86 platform.
putting it in direct competition with NVIDIA's RTX 4060 and RTX 4060 Ti
OK, the author at Notebookcheck ignores the existence of AMD, and the article here copies and pastes the same text.
And then people wonder why we have a monopoly in discrete GPUs and why people buy only Nvidia.
Posted on Reply
#7
Nomad76
News Editor
DristunConsidering they're currently hovering around zero, I'd say 3-4% would already be a massive success.
Maybe in 2-3 years; 3-4% from Battlemage alone is not going to happen.
Posted on Reply
#8
john_
Vayra86So what do we define as 'success' in the case of Battlemage I wonder. Some % of market share? Some number of sales? General quality of product being in order, including drivers and legacy gaming on it? Being able to hit the performance of the competition? All of the above?

I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
Not dropping behind AMD and Nvidia in the sub-$300 market in performance, maybe even narrowing the gap. That wouldn't be difficult if Nvidia and AMD decide that buyers of sub-$300 graphics cards aren't paying enough, so they shouldn't get more performance, just some more features (RTX 4060 vs RTX 3060 and RX 7600 vs RX 6650 XT). Also: improving drivers, not having another Starfield, sponsoring some AAA games, and offering frame generation with XeSS instead of relying on AMD's FSR 3.1, because obviously fake resolutions and fake frames are mandatory features today, necessary to run all those great unoptimized new games that are released or coming.
Nomad76"not to be trashed by reviews and public feedback at least".
Why? It's fun when EVERYONE is doing it to AMD......the last 20 years.
DristunI'd say 3-4% would already be a massive success
They got to 6% for a quarter, before dropping back to 0%.
Posted on Reply
#9
Vayra86
john_Not dropping behind AMD and Nvidia in the sub-$300 market in performance, maybe even narrowing the gap. That wouldn't be difficult if Nvidia and AMD decide that buyers of sub-$300 graphics cards aren't paying enough, so they shouldn't get more performance, just some more features (RTX 4060 vs RTX 3060 and RX 7600 vs RX 6650 XT). Also: improving drivers, not having another Starfield, sponsoring some AAA games, and offering frame generation with XeSS instead of relying on AMD's FSR 3.1, because obviously fake resolutions and fake frames are mandatory features today, necessary to run all those great unoptimized new games that are released or coming.

Why? It's fun when EVERYONE is doing it to AMD......the last 20 years.

They got to 6% for a quarter, before dropping back to 0%.
Sounds like a solid goal, and that would also mean competitiveness in that segment. When I read 12 GB at 250 bucks I was like, OKAY. Let's go!
Posted on Reply
#10
Dristun
john_They got to 6% for a quarter, before dropping back to 0%.
Which is weird, if you ask me; the A770 is a much better deal today at $240 than when I bought it for nearly $400 out of curiosity.
Posted on Reply
#11
RJARRRPCGP
Will the 12 GB Arc Battlemage outperform the Radeon RX 6750 XT?

I got 2 A770s in 2023. The second one is a Sparkle Titan 16 GB.
Posted on Reply
#12
igormp
john_ARM having better GPU options than the x86 platform.
How so? So far, the only available option for an ARM PC that's not a Mac is Qcom with their laptops; there's no desktop-like option available, so you only get integrated graphics.
Nothing stops a manufacturer from doing an ARM CPU + AMD/NVIDIA GPU combo, which is already possible on ARM platforms with PCIe (something like an Ampere workstation, servers, or hacking some SBC with a PCIe slot). However, your discrete GPU options are still the same as on an equivalent x86 counterpart.
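As a concrete illustration of that ISA-agnostic point, here is a minimal sketch that lists display-class PCI devices through Linux sysfs; it runs unchanged on an x86, ARM, or RISC-V host. The sysfs paths and PCI class/vendor IDs are standard; everything else is illustrative.

```python
# Minimal sketch: list display-class PCI devices via Linux sysfs. The sysfs
# layout and PCI class/vendor IDs are standard, so this runs unchanged on
# x86, ARM, or RISC-V hosts; only the output differs per machine.
from pathlib import Path

VENDORS = {"0x10de": "NVIDIA", "0x1002": "AMD", "0x8086": "Intel"}

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if pci_class.startswith("0x03"):  # PCI base class 0x03 = display controller
        vendor = (dev / "vendor").read_text().strip()
        print(f"{dev.name}: {VENDORS.get(vendor, vendor)} GPU (class {pci_class})")
```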
Posted on Reply
#13
john_
igormpHow so? So far, the only available option for an ARM PC that's not a Mac is Qcom with their laptops; there's no desktop-like option available, so you only get integrated graphics.
Nothing stops a manufacturer from doing an ARM CPU + AMD/NVIDIA GPU combo, which is already possible on ARM platforms with PCIe (something like an Ampere workstation, servers, or hacking some SBC with a PCIe slot). However, your discrete GPU options are still the same as on an equivalent x86 counterpart.
Don't cut my phrase in half. It changes its meaning.
john_Imagine tomorrow ARM having better GPU options than the x86 platform.
Anyway, what I have been posting (for years) is this question: what will happen to Intel's x86, if Nvidia tomorrow starts selling a full Nvidia ARM platform? Nvidia is rumored to introduce SOCs, or APUs, or CPUs, or whatever, based on ARM next year, targeting the mini PC and laptop market and, why not, even the desktop market. They have the money and the brand recognition today to go against everyone: Intel, AMD, Qualcomm, MediaTek and even Apple. Their goal is to grab more than the GPU market, and that's why they tried to buy ARM. About 20 years ago Jensen wanted an x86 license and Intel said no. Nvidia was also building chipsets, until Intel decided to close that market to others and only allow its own chipsets to be paired with its CPUs. So, going beyond GPUs in the retail market is probably an old wish of Jensen's.

Now, imagine Nvidia's ARM platform getting more and more popular with OEMs and consumers year after year. Nvidia could start giving advantages to its own platform by releasing its new GPU models first for the ARM platform and months later for the x86 platform, or introduce a new interface for its GPUs, making its GPUs faster and cheaper on the ARM platform than on the x86 platform. In the end, if the ARM platform becomes the most popular one, Nvidia could decide to just stop producing discrete GPUs for the x86 platform, forcing everyone who wants the best graphics to buy its ARM-based platform. That is something that could happen in 10-15-20 years. If it does, and AMD and Intel are nowhere near Nvidia in GPU performance, the retail market and many professional systems that need GPU compute power could go to ARM. AMD could also follow and leave the Intel platform without even mid-range GPUs. That's why Intel needs to keep pushing GPU development: the GPU today is as important as, and in some cases even more important than, the CPU, or at least the CPU architecture.
Posted on Reply
#14
windwhirl
Vayra86So what do we define as 'success' in the case of Battlemage I wonder. Some % of market share? Some number of sales? General quality of product being in order, including drivers and legacy gaming on it? Being able to hit the performance of the competition? All of the above?

I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
Not sure, personally. IMO, it'd be a success if Intel delivers a product that is competitive price/performance-wise, doesn't have asterisks attached (like having to enable ReBAR to actually take advantage of most of its potential), has OK drivers, and sells more units than the previous Arc boards (that last one we won't know for sure, since it's data Intel will probably not share beyond "GPUs are selling well"). No one is expecting them to suddenly deliver the Radeon/GeForce killer.
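For the curious, a minimal sketch of how one could check that ReBAR "asterisk" on Linux: with Resizable BAR enabled, the VRAM aperture reported through sysfs covers the whole VRAM rather than the legacy 256 MiB window. The PCI address in the example is hypothetical, and which BAR maps VRAM varies by vendor.

```python
# Sketch: read a GPU's PCI BAR sizes from Linux sysfs. Each line of the
# "resource" file is "start end flags" in hex; size = end - start + 1.
# With Resizable BAR enabled, the VRAM aperture covers the full VRAM
# (e.g. 16 GiB) instead of the legacy 256 MiB window. Which BAR maps VRAM
# varies by vendor; the PCI address below is purely hypothetical.
from pathlib import Path

def bar_sizes(pci_addr: str) -> list[int]:
    lines = Path(f"/sys/bus/pci/devices/{pci_addr}/resource").read_text().splitlines()
    sizes = []
    for line in lines[:6]:  # the first six entries are the standard BARs
        start, end, _flags = (int(field, 16) for field in line.split())
        sizes.append(end - start + 1 if end else 0)
    return sizes

for i, size in enumerate(bar_sizes("0000:03:00.0")):
    if size:
        print(f"BAR{i}: {size / 2**20:.0f} MiB")
```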
Posted on Reply
#15
zmeul
They need to keep going and become a thorn in Nvidia's side; Nvidia is getting too cocky, and customers will be the ones hurting.
Posted on Reply
#16
igormp
john_Don't cut my phrase in half. It changes its meaning.
Sorry, I don't see how it changes the meaning.
john_What will happen to Intel's x86, if Nvidia tomorrow starts selling a full Nvidia ARM platform?
I mean, they already have such products, albeit not in the consumer space. Fun fact: you can even boot Windows on ARM on a modern Jetson with UEFI, without issues or hacks (albeit without GPU acceleration, since there are no public Nvidia drivers for Windows on ARM, unlike Linux).
We would just have another player in the market; x86 would still be a thing, maybe with a lower (but still major) market share for the next decade.
john_Nvidia could start giving advantages to its own platform by releasing its new GPU models first for the ARM platform and months later for the x86 platform
If we're talking about iGPUs, yeah, but it's not like we have many options for those anyway.
For desktop with your regular PCIe, I don't see how it'd be the case.
john_or introduce a new interface for its GPUs, making its GPUs faster and cheaper on the ARM platform than on the x86 platform. In the end, if the ARM platform becomes the most popular one, Nvidia could decide to just stop producing discrete GPUs for the x86 platform, forcing everyone who wants the best graphics to buy its ARM-based platform. That is something that could happen in 10-15-20 years. If it does, and AMD and Intel are nowhere near Nvidia in GPU performance, the retail market and many professional systems that need GPU compute power could go to ARM. AMD could also follow and leave the Intel platform without even mid-range GPUs. That's why Intel needs to keep pushing GPU development: the GPU today is as important as, and in some cases even more important than, the CPU, or at least the CPU architecture.
3 things:
- I already believe the desktop form factor is dying; it currently has way too many drawbacks for performance and power efficiency. Gamers may hate it, but the likely future is mini-PCs with an SoC that has everything integrated, something like Strix Halo: unified memory on a large memory bus. I don't think we will be seeing quad-channel support on consumer DIY platforms anytime soon, whereas we will be seeing Strix Halo with its 256-bit LPDDR5X bus next year (rough bandwidth math in the sketch below). With this in mind, I can very well see Nvidia making such an integrated product, which could be used in both laptops (in line with those MediaTek rumors) and mini-PCs, meaning that the dGPU market would just shrink, no matter if x86 or ARM.
- Another thing is that dGPUs have no strict requirement for x86 or an "Intel platform". As I said, you can very well slap an Nvidia, AMD or even Intel dGPU into an ARM platform and use it. PCIe is a standard and works across many different ISAs; you can even run AMD and Intel dGPUs with a RISC-V CPU if you so desire (on Linux, just to be clear).
- All your worries seem to be related to games, since there's this whole Windows-x86 relationship in the dGPU market. There are already many products using Nvidia and AMD GPUs with ARM, and Nvidia already has CPUs; it's just that none of those are meant to run games.

Still, I don't see what would change. x86 would still be a thing. dGPUs might become more expensive and not so accessible, but they will still be a thing; the same goes for DIY desktop building.
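To put the 256-bit LPDDR5X point above in numbers, a quick bandwidth sketch (assuming the commonly cited 8533 MT/s for LPDDR5X, and dual-channel DDR5-5600 as a stand-in desktop baseline; neither figure comes from the post itself):

```python
# Theoretical peak bandwidth = bus width (bytes) x transfer rate.
# Both configurations are illustrative assumptions: 8533 MT/s is a commonly
# cited LPDDR5X speed, and dual-channel DDR5-5600 stands in for a typical
# DIY desktop.
def bandwidth_gb_s(bus_bits: int, mega_transfers: int) -> float:
    return bus_bits / 8 * mega_transfers * 1e6 / 1e9

print(f"256-bit LPDDR5X-8533 (Strix Halo-style): {bandwidth_gb_s(256, 8533):.0f} GB/s")
print(f"128-bit DDR5-5600 (desktop dual-channel): {bandwidth_gb_s(128, 5600):.0f} GB/s")
```

Roughly 273 GB/s versus about 90 GB/s, which is the gap the unified-memory argument rests on.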
Posted on Reply
#17
Wirko
igormpAnother thing is that dGPUs have no strict requirement for x86 or an "Intel platform". As I said, you can very well slap an Nvidia, AMD or even Intel dGPU into an ARM platform and use it. PCIe is a standard and works across many different ISAs; you can even run AMD and Intel dGPUs with a RISC-V CPU if you so desire (on Linux, just to be clear).
For the purpose of this discussion it also matters that PCIe works over a cable. External GPUs may become more common in the long term, and they may take the shape of a mini PC, meant to be stacked together with one.
Posted on Reply
#18
igormp
WirkoFor the purpose of this discussion it also matters that PCIe works over a cable. External GPUs may become more common in the long term, and they may take the shape of a mini PC, meant to be stacked together with one.
True, those dGPU Thunderbolt boxes are already a thing, so they could get more traction if the mini-PC form factor gets more adoption. That's still pretty ISA-agnostic, nonetheless.
Posted on Reply
#19
Wirko
igormpTrue, those dGPU Thunderbolt boxes are already a thing, so they could get more traction if the mini-PC form factor gets more adoption. That's still pretty ISA-agnostic, nonetheless.
Yes. Whoever brings this sort of GPU to the market will have to ensure both x86 and Arm compatibility and support, because both architectures will be common. That part of fortune telling is easy; the hard part is predicting the fragmentation of the Arm world, and the effort needed to support green, orange, red and other colours of Arm processors.
Posted on Reply
#20
igormp
WirkoYes. Whoever brings this sort of GPU to the market will have to ensure both x86 and Arm compatibility and support, because both architectures will be common. That part of fortune telling is easy; the hard part is predicting the fragmentation of the Arm world, and the effort needed to support green, orange, red and other colours of Arm processors.
I don't think it's that complicated. As I already said, the support already exists on Linux; you can run any of the current dGPU offerings with many different CPUs.
On the Windows side, I believe it's more of a chicken-and-egg problem. Once there's some traction (be it on a single SoC or a DIY form factor), drivers for the ARM version should become available.
Posted on Reply
#21
RJARRRPCGP
Anyone predicting the end of DIY sounds like the he-said/she-said from back in the Windows 8.x era. :(

Thinking positively, I can't see DIY literally being dead; in fact, there was backlash over Surface RT. Any such negativity reminds me of the early Surface era. But we got very close to losing DIY already, before there was a lot of pushback against Surface RT and Windows RT.

On top of that, we aren't being forced to get a laptop, despite Microsoft's advertisements showing off laptops on Windows 10 users' screens.

I still have high hopes for Arc! Especially if GeForces are going to be too expensive for what they should be. I'd rather have an Arc A770 than a GeForce RTX 4080 anyway, based on how expensive they are!

I feel like Nvidia is pushing it, even with the price of the RTX 4070. :(
Posted on Reply
#22
TheinsanegamerN
DristunWhich is weird, if you ask me; the A770 is a much better deal today at $240 than when I bought it for nearly $400 out of curiosity.
It's not really that weird. The 7600 is 9% faster at the same price. The 4060 and 7600 XT are both around $300-ish and are 12% and 18% faster, respectively. Nvidia has far better support for RT and DLSS. All these options pull considerably less power than the A770, as in 80+ fewer watts; the 4060 uses nearly half the power. And of course, both AMD and Nvidia offer far superior drivers to Intel.

The A770 is a failed upper-mid-range card being sold at a steep discount against a newer generation of 5/4 nm cards. If you pick up one of the $229 open-box units, sure, you could argue the price offsets its low performance, if you are willing to put up with Intel's weirdness. Most consumers don't want to take that risk; especially if $250 is a big deal for them, they want a card that will last. So far Intel isn't inspiring much confidence.
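Taking the post's figures at face value, here is a quick sketch of the value math; the prices and performance deltas are the numbers quoted above, and the board-power values are approximate TBPs, so treat the output as illustrative rather than measured:

```python
# Quick relative-value sketch using the figures quoted above plus approximate
# board-power (TBP) numbers; all inputs are approximations, not measurements.
cards = {
    # name: (street price USD, relative performance, approx. board power W)
    "Arc A770":   (240, 1.00, 225),
    "RX 7600":    (240, 1.09, 165),
    "RTX 4060":   (300, 1.12, 115),
    "RX 7600 XT": (300, 1.18, 190),
}

base_price, base_perf, base_watts = cards["Arc A770"]

for name, (price, perf, watts) in cards.items():
    value = (perf / price) / (base_perf / base_price)       # perf per dollar vs A770
    efficiency = (perf / watts) / (base_perf / base_watts)  # perf per watt vs A770
    print(f"{name:10s} perf/$: {value:.2f}x  perf/W: {efficiency:.2f}x")
```

The interesting nuance this surfaces: on perf per dollar the A770 is roughly competitive with the $300 cards, but on perf per watt it loses to all of them by a wide margin.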

Posted on Reply
#23
john_
igormpSorry, I don't see how it changes the meaning.
Really? I am talking about a possible future; you translate it as if I am talking about what is happening today, and you don't see the difference?
OK.....
igormpI mean, they already have such products, albeit not in the consumer space. Fun fact: you can even boot Windows on ARM on a modern Jetson with UEFI, without issues or hacks (albeit without GPU acceleration, since there are no public Nvidia drivers for Windows on ARM, unlike Linux).
We would just have another player in the market; x86 would still be a thing, maybe with a lower (but still major) market share for the next decade.
I am talking about a full attack on the consumer market, not a few boards for developers.
igormpIf we're talking about iGPUs, yeah, but it's not like we have many options for those anyway.
For desktop with your regular PCIe, I don't see how it'd be the case.
I am talking about the top discrete GPUs; for example, an RTX 5090 performing better on the ARM platform, getting released 3 months earlier and being 10% cheaper than the same model for the x86 platform. Players would start building Nvidia-powered, ARM-based systems for gaming instead of x86 ones with Intel or AMD CPUs.
igormp3 things:
- I already believe the desktop form factor is dying; it currently has way too many drawbacks for performance and power efficiency. Gamers may hate it, but the likely future is mini-PCs with an SoC that has everything integrated, something like Strix Halo: unified memory on a large memory bus. I don't think we will be seeing quad-channel support on consumer DIY platforms anytime soon, whereas we will be seeing Strix Halo with its 256-bit LPDDR5X bus next year. With this in mind, I can very well see Nvidia making such an integrated product, which could be used in both laptops (in line with those MediaTek rumors) and mini-PCs, meaning that the dGPU market would just shrink, no matter if x86 or ARM.
- Another thing is that dGPUs have no strict requirement for x86 or an "Intel platform". As I said, you can very well slap an Nvidia, AMD or even Intel dGPU into an ARM platform and use it. PCIe is a standard and works across many different ISAs; you can even run AMD and Intel dGPUs with a RISC-V CPU if you so desire (on Linux, just to be clear).
- All your worries seem to be related to games, since there's this whole Windows-x86 relationship in the dGPU market. There are already many products using Nvidia and AMD GPUs with ARM, and Nvidia already has CPUs; it's just that none of those are meant to run games.

Still, I don't see what would change. x86 would still be a thing. dGPUs might become more expensive and not so accessible, but they will still be a thing; the same goes for DIY desktop building.
I can't see mini PCs taking over, but I could see laptops, consoles and prebuilt systems designed by Nvidia, with minimal upgradability, becoming the preferred options for the majority of gamers. Those prebuilt systems could come not just as mini PCs but also in typical desktop form factors, offering room for multiple storage options and probably a couple of PCIe slots for upgrades, also targeting semi-professionals.
Nvidia is known for creating restrictions, walled gardens, for its customers. 15 years ago, when it was trying to dictate PhysX and CUDA as features available only to loyal customers, it locked its drivers in such a way that you couldn't use an AMD card as a primary card and an Nvidia card as a secondary one. Doing so, putting an AMD card as primary, meant CUDA and PhysX not running on hardware at all, or running with badly unoptimized software code on one core of a multi-core CPU, using the ancient MMX instruction set. It was a joke. So, it wouldn't be strange for Nvidia to put restrictions on its products that would make them refuse to work at full speed, or at all, or disable some of their best features when used with platforms not made by Nvidia.
Gaming is what teenagers do; compute is what many professionals want. If x86 loses the advantage of being the de facto option for high-end gaming and high-performance software, it will start losing the battle against ARM in the PC market. What is the biggest failure of the Qualcomm-based Windows on ARM laptops? They are terrible at gaming. That's one of the reasons they are not flying off the shelves.
Posted on Reply
#24
TheinsanegamerN
john_Really? I am talking about a possible future; you translate it as if I am talking about what is happening today, and you don't see the difference?
OK.....

I am talking about a full attack on the consumer market, not a few boards for developers.

I am talking about the top discrete GPUs; for example, an RTX 5090 performing better on the ARM platform, getting released 3 months earlier and being 10% cheaper than the same model for the x86 platform. Players would start building Nvidia-powered, ARM-based systems for gaming instead of x86 ones with Intel or AMD CPUs.
That would assume an ARM CPU from Nvidia that pushes performance anywhere near what the X3D chips can do. From what we've seen so far... that is extremely unlikely, to put it mildly. Nvidia's last custom core attempt was a complete disaster, and typical Cortex designs are nowhere close to the performance needed for desktop applications.
john_I can't see mini PCs taking over, but I could see laptops, consoles and prebuilt systems designed by Nvidia, with minimal upgradability, becoming the preferred options for the majority of gamers. Those prebuilt systems could come not just as mini PCs but also in typical desktop form factors, offering room for multiple storage options and probably a couple of PCIe slots for upgrades, also targeting semi-professionals.
Nvidia is known for creating restrictions, walled gardens, for its customers. 15 years ago, when it was trying to dictate PhysX and CUDA as features available only to loyal customers, it locked its drivers in such a way that you couldn't use an AMD card as a primary card and an Nvidia card as a secondary one. Doing so meant CUDA and PhysX not running on hardware at all, or running with badly unoptimized software code on one core of a multi-core CPU, using the ancient MMX instruction set. It was a joke. So, it wouldn't be strange for Nvidia to put restrictions on its products that would make them refuse to work at full speed, or at all, or disable some of their best features when used with platforms not made by Nvidia.
Gaming is what teenagers do; compute is what many professionals want. If x86 loses the advantage of being the de facto option for high-end gaming and high-performance software, it will start losing the battle against ARM in the PC market. What is the biggest failure of the Qualcomm-based Windows on ARM laptops? They are terrible at gaming. That's one of the reasons they are not flying off the shelves.
Given that DIY PCs are a multi-billion-dollar industry, I do not see Nvidia controlling that market. Perhaps they will make inroads into the prebuilt sector, but I doubt it. People still like to upgrade their prebuilts.
Posted on Reply
#25
igormp
RJARRRPCGPAnyone predicting the end of DIY sounds like the he-said/she-said from back in the Windows 8.x era. :(

Thinking positively, I can't see DIY literally being dead; in fact, there was backlash over Surface RT. Any such negativity reminds me of the early Surface era. But we got very close to losing DIY already, before there was a lot of pushback against Surface RT and Windows RT.

On top of that, we aren't being forced to get a laptop, despite Microsoft's advertisements showing off laptops on Windows 10 users' screens.

I still have high hopes for Arc! Especially if GeForces are going to be too expensive for what they should be. I'd rather have an Arc A770 than a GeForce RTX 4080 anyway, based on how expensive they are!

I feel like Nvidia is pushing it, even with the price of the RTX 4070. :(
I don't think the DIY market will literally die (sales are still increasing, even!), but rather shrink in the bigger picture and become more niche as time goes on.

Desktops alone are already a minority of shipments, and have a much lower growth rate compared to laptops (source).
Given how DIY is an even smaller fraction of the overall desktop market, I think it's fair to say that an off-the-shelf mini-PC that's powerful enough for most users will likely achieve a higher market share compared to DIY setups. Mac minis have already shown that this is doable for many people; maybe an equivalent with Strix Halo can prove the same for the gaming crowd.
Another thing is how a gaming setup has only become more and more expensive over time, making it harder for people to keep up with the hobby.

Of course that's just my opinion and I can be totally wrong. If I were able to predict the future with certainty I'd have won the lottery already :p
john_Really? I am talking about a possible future; you translate it as if I am talking about what is happening today, and you don't see the difference?
OK.....
No, because I don't see how a 5090 would be different for ARM or x86.
john_I am talking about a full attack on the consumer market, not a few boards for developers.
How would any of that be an "attack"? If anything, having more options in the market would be a good thing.
john_I am talking about the top discrete GPUs; for example, an RTX 5090 performing better on the ARM platform, getting released 3 months earlier and being 10% cheaper than the same model for the x86 platform. Players would start building Nvidia-powered, ARM-based systems for gaming instead of x86 ones with Intel or AMD CPUs.
That's the point I'm trying to understand. I don't see how a "5090 ARM-exclusive edition" could exist, unless you're talking about fully integrated devices, which falls under my mini-PC point. If you can elaborate on how such a thing would exist in discrete form, I guess it'd be better for both of us.
Strix Halo will be a nice example of a device that has no proper equivalent desktop-wise, but it's a SoC, just like consoles.
john_I can't see mini PCs taking over, but I could see laptops, consoles and prebuilt systems designed by Nvidia, with minimal upgradability, becoming the preferred options for the majority of gamers. Those prebuilt systems could come not just as mini PCs but also in typical desktop form factors, offering room for multiple storage options and probably a couple of PCIe slots for upgrades, also targeting semi-professionals.
Fair enough. Technically, such a prebuilt system wouldn't be much different from a mini-PC, but the somewhat increased size for some expansion (like SATA, extra M.2, and 1 or 2 PCIe slots) could still make it way smaller than your regular DIY setup. Maybe more similar to those ITX ones (without any of the upgradeability, apart from some peripherals).

I can also totally see Nvidia downscaling their DGX stations for a more "affordable" workstation with proper PCIe slots and whatnot, but that would be hella expensive still and outside of the range of your regular desktop buyer.
john_Gaming is what teenagers do; compute is what many professionals want. If x86 loses the advantage of being the de facto option for high-end gaming and high-performance software, it will start losing the battle against ARM in the PC market.
Compute has no hard requirements for x86. As I said, Nvidia on ARM is already a thing in the enterprise. Your only point seems to be gaming, for which the market, albeit significant, is a really small portion of the overall PC market share.
john_What is the biggest failure of the Qualcomm-based Windows on ARM laptops? They are terrible at gaming. That's one of the reasons they are not flying off the shelves.
I totally disagree with this point, and Apple products are a great counterargument. Qualcomm products failed because software integration is still bad, the performance advantage was insignificant against the competition, and pricing made no sense compared to any other option.
People don't buy ultralight laptops for gaming anyway; at least, I never even attempted to open a single game on my LG Gram.
Posted on Reply