Sunday, December 29th 2024

Intel Rumored to Launch Arc Battlemage GPU With 24GB Memory in 2025

Intel could be working on a new Arc graphics card, according to Quantum Bits as quoted by VideoCardz. The card is based on the Battlemage architecture and carries 24 GB of memory, twice as much as current models, and it appears to be aimed at professionals rather than gamers. Intel's Battlemage lineup currently consists of the Arc B580 with 12 GB of GDDR6 memory on a 192-bit bus, soon to be joined by the upcoming B570 with 10 GB on a 160-bit bus. The new 24 GB model would reportedly use the same BMG-G21 GPU as the B580, with the increased VRAM coming from higher-capacity memory modules or a dual-sided (clamshell) module setup. No further technical details are available at this moment.

Intel looks to be aiming this 24 GB version at professional tasks such as AI workloads, including Large Language Models (LLMs) and generative AI. The card would be useful in data centers, edge computing, education, and research, which makes sense for Intel, as the company does not yet have a high-memory GPU for the professional productivity market. Intel reportedly wants to launch this higher-memory Arc Battlemage card in 2025; it might be announced in late spring, or ahead of next year's Computex if there is no rush. In the meantime, Intel will keep producing its current gaming cards, as the latest Arc series was very well received, a big win for Intel after all its struggles. This rumor suggests that Intel is expanding its GPU roadmap rather than letting it fade away, a scenario that looked doubtful before the launch of Battlemage. Now it seems the company wants to compete in the professional and AI acceleration markets as well.
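To put the LLM angle in rough numbers, here is a back-of-the-envelope sketch (my own illustrative assumptions, not Intel's figures) of how many model parameters fit in a given VRAM budget at common weight precisions, after reserving some headroom for the KV cache, activations, and runtime overhead:

```python
# Rough VRAM sizing sketch for LLM inference. The 20% overhead reserve and
# the precision choices are illustrative assumptions, not Intel specifications.
def max_params_billions(vram_gb: float, bytes_per_param: float,
                        overhead_frac: float = 0.2) -> float:
    """Largest model (in billions of parameters) whose weights fit after
    reserving a fraction of VRAM for KV cache, activations, and runtime."""
    usable_bytes = vram_gb * 1e9 * (1.0 - overhead_frac)
    return usable_bytes / bytes_per_param / 1e9

for vram_gb in (12, 24):
    for precision, bpp in (("FP16", 2.0), ("INT8", 1.0), ("4-bit", 0.5)):
        print(f"{vram_gb} GB @ {precision}: "
              f"~{max_params_billions(vram_gb, bpp):.1f}B params")
```

By this estimate, 24 GB roughly doubles the model size hostable at any given precision, e.g. around a 9-10B-parameter model at FP16 versus 4-5B on a 12 GB card, which is why extra VRAM matters more for LLM work than for gaming.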
In 2025, Intel plans to launch a larger-memory version of the Battlemage series graphics cards, with capacity increased to 24 GB.
Going forward, the existing version will continue to serve consumer markets such as gaming, while the 24 GB version will target the "productivity market."
Target users in the "productivity market" include data centers, edge computing rooms, education and scientific research, and individual developers.
— Quantum Bits
Sources: Quantum Bits, VideoCardz

48 Comments on Intel Rumored to Launch Arc Battlemage GPU With 24GB Memory in 2025

#1
Solaris17
Super Dainty Moderator
Hm, for DC I would be surprised if it was "ARC". If it carries the ARC name it would be the second-gen ARC Pros. Though they would undoubtedly use the BMG cores in FLEX and MAX DC parts eventually, as they did with Alchemist.
Posted on Reply
#2
evernessince
Has the potential to eat Nvidia's lunch in regard to professional and semi-professional workloads, particularly AI if priced right. VRAM is the primary bottleneck for AI right now on consumer cards.
Posted on Reply
#3
dj-electric
24GB is cool, a powerful GPU for a fair price that has "only" 16GB of VRAM would be cooler.
Posted on Reply
#4
docnorth
And they could ask a much higher price (like Nvidia and AMD too), for a 'professional' GPU.
Posted on Reply
#5
TPUnique
evernessince: Has the potential to eat Nvidia's lunch in regard to professional and semi-professional workloads, particularly AI if priced right. VRAM is the primary bottleneck for AI right now on consumer cards.
AMD's as well. It could be the budget card for LLMs, assuming it's priced right.
Posted on Reply
#6
Macro Device
dj-electric: a powerful GPU for a fair price that has "only" 16GB of VRAM would be cooler
It exists and it's called an aftermarket RX 6800 XT/6900 XT.

I'm fairly certain you can get much more out of this 192-bit, 12 GB VRAM setup than Intel has so far. GDDR7 (~30 GT/s) plus doubling the CU count, for starters. It would be reasonable to clock the GPU itself 200 MHz slower so it doesn't burn your house down.
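For reference, the bandwidth arithmetic behind that suggestion: peak memory bandwidth is bus width in bytes times transfer rate. A quick sketch (the 30 GT/s GDDR7 figure is the poster's hypothetical, not a confirmed spec):

```python
# Peak theoretical memory bandwidth = (bus width / 8 bits per byte) * data rate.
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gts: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * transfer_rate_gts

print(peak_bandwidth_gbs(192, 19))  # 456.0 GB/s, the B580's GDDR6 at 19 GT/s
print(peak_bandwidth_gbs(192, 30))  # 720.0 GB/s, hypothetical GDDR7 at ~30 GT/s
```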
Posted on Reply
#7
TheinsanegamerN
Macro Device: It exists and it's called an aftermarket RX 6800 XT/6900 XT.
Believe it or not, there are problems with relying on second-hand used hardware from unknown sellers with unknown histories. Shocking, I know, but many prefer to buy hardware that HASN'T been driven to within an inch of its life.
Macro Device: I'm fairly certain you can get much more from this 192-bit 12-gigabyte VRAM buffer than Intel had got so far. GDDR7 (~30 GT/s) + doubling the CU count for starters. Reasonable to make the GPU itself 200 MHz slower so it doesn't burn your house down.
Or you could just learn to plug your power connectors in correctly so they don't melt.
Posted on Reply
#8
friocasa
Makes sense, is easy to do, and they can increase the price to the point it becomes profitable, giving them an incentive to increase production.

I hope they do it, and also do their best to support AI applications.
Posted on Reply
#9
Hardcore Games
I have seen gaming cards refitted with more VRAM, which does help with some titles. I have used 16 GB cards, and monitoring VRAM usage is eye-opening.

The Arc Xe graphics in my laptop just use the DDR4 I have stuffed into my machine. I would think Arc Xe as a discrete GPU with 4 GB or 8 GB of VRAM would have done much more for performance.
Posted on Reply
#10
LabRat 891
Nomad76: The new 24 GB model will use the same BMG-G21 GPU as the B580
I'm not sure the extra VRAM can be put to good use (for gamers).
Posted on Reply
#11
Timbaloo
I never got the "Intel is dropping dGPU" rumors. Never made sense. Well, unless you're Lisa Su, that is... :rolleyes: (scnr)
Posted on Reply
#12
Bruno_O
Hardcore Games: I have seen gaming cards refitted with more VRAM, which does help with some titles. I have used 16 GB cards, and monitoring VRAM usage is eye-opening.

The Arc Xe graphics in my laptop just use the DDR4 I have stuffed into my machine. I would think Arc Xe as a discrete GPU with 4 GB or 8 GB of VRAM would have done much more for performance.
As a 20 GB 7900 XT owner, I always laugh at claims that 8-12 GB is "enough".

For years I've been seeing 16-18 GB in use when playing at 4K native plus AA and RT.
Posted on Reply
#13
Hecate91
Timbaloo: I never got the "intel is dropping dGPU" rumors. Never made sense. Well, unless you're Lisa Su that is... :rolleyes: (scnr)
Well, given how Intel has dropped or cancelled products, or sold off entire divisions such as storage, I don't have a lot of confidence in Intel continuing the effort, especially with the co-CEOs in charge, who came from the board or management.
Hopefully Intel continues, though, because the market really needs more competition, and maybe Intel can take away some of the market share Nvidia holds in data center and AI.
TheinsanegamerN: Or you could just learn to plug your power plugs in correctly so they dont melt
Or you could just not accept a flawed power connector; it got updated to be less crappy for a good reason. And funnily enough, so far Intel isn't using the new connector.
Posted on Reply
#14
Timbaloo
Hecate91: Well given how Intel has dropped or cancelled products, or sold off entire divisions such as storage, I don't have a lot confidence in Intel continuing the effort especially with the co-ceos in charge who used to either be on a board or management.
Although hopefully Intel continues because the market really needs more competition and maybe Intel can take some of the market away that Nvidia has on datacenter and ai.
Don't think of it as what you want. It's a company, they will do what makes sense from a company POV. Intel did not do Battlemage to give consumers cheap GPUs. They're doing it because mid/long-term they expect a good business in it. And that's why it does not make sense to drop the dGPU R&D, not because they want consumers to enjoy cheap HW.
Posted on Reply
#15
Hardcore Games
For the most part, GPU prices were affected by Bitcoin etc., leaving gamers out in the cold.
Posted on Reply
#16
evernessince
Hecate91: Well given how Intel has dropped or cancelled products, or sold off entire divisions such as storage, I don't have a lot confidence in Intel continuing the effort especially with the co-ceos in charge who used to either be on a board or management.
Although hopefully Intel continues because the market really needs more competition and maybe Intel can take some of the market away that Nvidia has on datacenter and ai.

Or you could just not accept a flawed power connector, it got updated to be less crappy for a good reason. And funny enough so far Intel isn't using the new connector.
I assume Intel will keep GPUs around because they are another AI product in their portfolio that could allow them to crack that market. Even if they don't earn gobs of money from the gaming side in the short term, the long term upsides are very appealing from a business perspective. They are probably the last thing you cut in order for the company to recover.

Intel cannot afford to miss another huge market like they did with mobile (tablets, phones, etc.).
Posted on Reply
#17
eidairaman1
The Exiled Airman
Hardcore Games: for the most part GPU prices were affected by bitcoin etc leaving gamers out in the cold
Yup, manufacturers forgot who their primary audience was when idiots bought skids full of them.

And game devs got lazy, so most suffer consolitis.
Posted on Reply
#18
kondamin
More interested in the higher-end Battlemage cards, but if they manage to get good support going for CAD/CAM software, that extra RAM will be helpful for large projects.
Posted on Reply
#19
Solaris17
Super Dainty Moderator
kondamin: More interested in the higher end battle mage cards, but if they manage to get good support going for cad/cam software that extra ram will be helpful for large projects
Man, yes. I hope they release a B7xx series. As for the ecosystem, I am hopeful it will catch up. Intel and AMD invested HEAVILY in open source; these things trickle down, not to mention they are pretty proactive with the likes of Topaz and Adobe. They have been reaching out to help get support implemented.

I think with CAD specifically, well, you know how it goes with industrial software. The one unfortunate aspect is that by the time they come out with a version that supports it, getting the companies that rely on it to switch might be a lot of trouble. Design firms maybe, but steel mills and industrial ops as a whole work at a snail's pace regardless of OEM backing, unfortunately.

The B series does run cool though! So I imagine they would make great "get me a picture on the screen" cards like the A310/A380 for office use.
Posted on Reply
#20
Visible Noise
I don’t see the point of 24GB on a B5-series card. B7 sure, but not B5.
Posted on Reply
#21
_roman_
Solaris17: Intel and AMD invested HEAVILY in opensource
Regarding open source:

The firmware for my Intel Wi-Fi chip and AMD graphics card is closed source. Why else would I need the package "sys-kernel/linux-firmware-20241210-r1"?
Just saying: no firmware, no working network connection for my Intel WLAN. No firmware, just basic VESA compatibility mode for my Radeon 7800 XT.
The same goes for the firmware for processors and mainboards, and for the poor compiler support.

I cannot set a fan curve for my AMD Radeon 7800 XT graphics card in Gentoo GNU/Linux; the last time I tried was in summer 2024. No visual tool from AMD, nothing.
My Ryzen 5800X (sold) and current 7600X have minimal GCC optimisation support. I would expect more given AMD's supposedly heavy open-source support. In my view, AMD offers only a little of it.
I want to see coreboot on AMD mainboards.

--

Back to topic: when the price is right and the software works, some will buy these cards.
Posted on Reply
#22
Solaris17
Super Dainty Moderator
_roman_: In regards of open source:

The firmware for my intel wifi chip, amd graphic card is closed source. Why else do I need this package "sys-kernel/linux-firmware-20241210-r1" ?
Just saying. No Firmware - no working network connection for my intel wlan. No firmware - just basic Vesa (VGA?) compatibility mode for my Radeon 7800XT.
same for the firmware for the processors / mainboards
I'm sorry to hear that; my AX210 is affected as well, but this thread is about Arc B series cards. The driver teams are not the same, and the contributions to GPUs are easily tracked in the kernel mailing list and on sites like Phoronix, as I'm sure you know. While I appreciate the extra detail in most things, let's try to keep it on topic.
Posted on Reply
#23
Hardcore Games
Because of predatory pricing I am using integrated graphics and seeing what works.
Posted on Reply
#24
wolf
Better Than Native
Bruno_O: as a 20GB 7900XT owner I always laugh at claims that 8-12GB is "enough"

for years I've been seeing 16-18GB in use when playing at 4k native plus AA and RT
Allocation is not utilisation.
Posted on Reply
#25
Jism
wolf: Allocation is not utilisation.
Exactly.

Most of the stuff is just cached so it's there whenever it's needed. That prevents pulling from system DRAM, which is obviously slower and might inflict a small performance penalty.

Other than that, the whole VRAM debate is quite useless. 12 GB is plenty for 1440p. Not even the latest TechPowerUp test shows any GPU taxing its full VRAM.

Not even I manage to get more than 8 GB through Doom.
Posted on Reply