Sunday, December 29th 2024

Intel Rumored to Launch Arc Battlemage GPU With 24GB Memory in 2025

Intel could be working on a new Arc graphics card, according to Quantum Bits as quoted by VideoCardz. It is based on the Battlemage architecture and carries 24 GB of memory, twice as much as current models. This new card seems to be oriented more towards professionals than gamers. Intel's Battlemage lineup currently consists of the Arc B580 with 12 GB of GDDR6 memory on a 192-bit bus, with the upcoming B570 offering 10 GB on a 160-bit bus. The new 24 GB model is said to use the same BMG-G21 GPU as the B580, with the increased VRAM coming either from higher-capacity memory modules or from a dual-sided (clamshell) module setup. No further technical details are available at this moment.
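As a rough, back-of-the-envelope illustration of how the rumored capacity could line up with the bus widths above (assuming standard 16 Gbit / 2 GB GDDR6 devices, one per 32-bit channel; the clamshell option sketched here corresponds to the dual-sided setup mentioned in the rumor and is not a confirmed specification):

def gddr_capacity_gb(bus_width_bits: int, device_density_gb: int = 2,
                     clamshell: bool = False) -> int:
    """Capacity = number of 32-bit memory channels x device density.
    In a clamshell (dual-sided) layout, two devices share each channel,
    doubling capacity without widening the bus."""
    devices = bus_width_bits // 32
    if clamshell:
        devices *= 2
    return devices * device_density_gb

print(gddr_capacity_gb(192))                  # 12 GB -- today's Arc B580
print(gddr_capacity_gb(192, clamshell=True))  # 24 GB -- the rumored configuration
print(gddr_capacity_gb(160))                  # 10 GB -- the upcoming B570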

Intel appears to be aiming this 24 GB version at professional workloads, particularly AI tasks such as Large Language Models (LLMs) and generative AI. The card would be useful in data centers, edge computing, education, and research, and this makes sense for Intel, which does not yet have a high-memory GPU for the professional productivity market. The company reportedly wants to launch this higher-memory Arc Battlemage in 2025; we guess it might be announced in late spring, or ahead of next year's Computex if there is no rush. In the meantime, Intel will keep producing its current gaming cards, as the latest Arc series was very well received, a much-needed win for the company after its recent struggles. The rumor suggests that Intel is expanding its GPU plans rather than letting them fade away, a scenario that looked plausible before the launch of Battlemage. Now it seems the company wants to compete in the professional and AI acceleration markets as well.
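To put 24 GB in context for local LLM inference, here is a minimal sizing sketch based on the common rule of thumb of parameter count times bytes per weight, with an assumed ~20% overhead for the KV cache and runtime (the model sizes and overhead factor are illustrative assumptions, not figures from the report):

def llm_vram_gib(params_billion: float, bits_per_weight: int,
                 overhead: float = 1.2) -> float:
    """Rough VRAM estimate for inference: weights * bytes per weight * overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

for params in (7, 13, 34):
    for bits in (16, 8, 4):
        est = llm_vram_gib(params, bits)
        fits = "fits" if est <= 24 else "does not fit"
        print(f"{params}B model @ {bits}-bit: ~{est:.1f} GiB ({fits} in 24 GB)")

On that rough basis, a 13B-parameter model at 8-bit or a ~30B model at 4-bit quantization would sit comfortably within 24 GB, which is where the jump from the B580's 12 GB becomes meaningful.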
In 2025, Intel plans to launch a larger-memory version of the Battlemage series graphics cards, with capacity increased to 24 GB.
Going forward, the existing version will continue to serve consumer markets such as gaming, while the 24 GB higher-memory version will target the "productivity market."
The target users of the "productivity market" include data centers, edge computing facilities, education and scientific research, and individual developers.
— Quantum Bits
Sources: Quantum Bits, VideoCardz

48 Comments on Intel Rumored to Launch Arc Battlemage GPU With 24GB Memory in 2025

#26
igormp
Wow, that sounds sweet. If it comes out at a reasonable price, I might grab one for a second system meant just for inference.

On the other hand, this gives me some hope that, if they release a B700 model (likely with 16GB), then a pro model with 32gb could become a thing. If they're priced more reasonably than a 5090 (like in the ~$1k range) I might grab a couple to upgrade from my current setup.
Macro Device: It exists and it's called an aftermarket RX 6800 XT/6900 XT.

I'm fairly certain you can get much more from this 192-bit, 12-gigabyte VRAM buffer than Intel has gotten so far. GDDR7 (~30 GT/s) + doubling the CU count for starters. Reasonable to make the GPU itself 200 MHz slower so it doesn't burn your house down.
The 6000 series is pretty much shit for AI workloads given that it lacks the WMMA instructions the 7000 series has.
Intel's software support is also way better than AMD's on that front; too bad they lack proper hardware to make use of that stack.
AMD is the other way around: great hardware with lacking software.
evernessince: I assume Intel will keep GPUs around because they are another AI product in their portfolio that could allow them to crack that market. Even if they don't earn gobs of money from the gaming side in the short term, the long-term upsides are very appealing from a business perspective. They are probably the last thing you cut in order for the company to recover.

Intel cannot afford to miss another huge market like they did with mobile (tablets, phones, etc.).
Afaik Falcon Shores is their upcoming platform for AI, which resembles a GPU but doesn't have much to do with the current Arc lineup.
Their Gaudi offerings are pretty much meant solely for cheap inferencing, and their other GPU offerings in that space haven't gained any traction.
eidairaman1: primary audience
The level of entitlement lol
#27
Macro Device
TheinsanegamerN: Believe it or not, there are problems with relying on second-hand hardware from unknown sellers with unknown lives. Shocking, I know, but many prefer to buy hardware that HASN'T been driven to within an inch of its life.
Apart from straight-up scammers, there's very little risk of a GPU dying shortly after you buy it second-hand. I only buy them used, and only two out of more than 50 have died on me so far (some chonker from the 00s and an R7 370). The failure rate in brand-new cards is even higher. With used, you at least know it starts.
Besides, RDNA2 is only 4 years old, which means it was only mined on for a couple of years, and miners nowadays are mostly good guys who don't abuse their wares. Sure, you might get a slightly worse VRAM overclocking margin, but that doesn't matter all that much. My 6700 XT is more than alive and still overclocks no worse than it did when I bought it (Feb '23). RDNA2 PCBs are overengineered. The cooling systems are also overengineered. These cards are built to last. You must be completely paranoid to avoid them.
TheinsanegamerN: Or you could just learn to plug your power connectors in correctly so they don't melt.
I'm not talking about connectors. I know reading comprehension is rare these days, but I was talking about pure PSU overload. And yes, 40 Xe2 cores at these speeds might produce power spikes of over 500 W. Not every PSU under the sun can handle that. It's also not that easy to cool in the first place. Better safe than sorry.
#28
Marsil
This new card seems to be oriented more towards professionals than gamers
There are professional gamers, you know?!
#29
evernessince
igormp: Wow, that sounds sweet. If it comes out at a reasonable price, I might grab one for a second system meant just for inference.

On the other hand, this gives me some hope that, if they release a B700 model (likely with 16GB), then a pro model with 32gb could become a thing. If they're priced more reasonably than a 5090 (like in the ~$1k range) I might grab a couple to upgrade from my current setup.

The 6000 series is pretty much shit for AI workloads given that it lacks the WMMA instructions the 7000 series has.
Intel's software support is also way better than AMD's on that front; too bad they lack proper hardware to make use of that stack.
AMD is the other way around: great hardware with lacking software.

Afaik Falcon Shores is their upcoming platform for AI, which resembles a GPU but doesn't have much to do with the current Arc lineup.
Their Gaudi offerings are pretty much meant solely for cheap inferencing, and their other GPU offerings in that space haven't gained any traction.


The level of entitlement lol
If they release a 32 GB card at a decent price, I'd pick that up immediately. Forget the 5090; that's likely to be super overpriced.
#30
Stephen.
Interesting rumor! If such a card exists, it leads me to think that the current B580 is the full implementation of the BMG-G21 chip in every respect except memory capacity.

This could answer the riddle of that detail, which Intel has not provided, as noted in the B580 review.
#31
JustBenching
wolf: Allocation is not utilisation.
Yeap, I've seen 22 GB allocated on my 4090 (Hogwarts Legacy); I'm pretty confident it's not actually using that much VRAM.
#32
Bomby569
for the AI crowd on a budget :laugh:
#33
Vayra86
wolf: Allocation is not utilisation.
Until it is - which is just a matter of time and games progressing.
JustBenching: Yeap, I've seen 22 GB allocated on my 4090 (Hogwarts Legacy); I'm pretty confident it's not actually using that much VRAM.
While that might be true, there is a performance penalty for fetching data from elsewhere. A lot of that penalty you don't see, and some of it you do, for example when games dynamically adjust resolution or quality. Whether you see it or not, it's there, and we consider that acceptable, to an extent. But the bottom line is: allocation doesn't happen if the GPU would be just as well off NOT allocating that memory. Because then why would it?
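(As a concrete way to see what "allocation" means in practice, here is a minimal sketch using NVIDIA's pynvml bindings, assuming the nvidia-ml-py package is installed and an NVIDIA GPU is present. The "used" figure it reports is memory the driver has handed out to processes, i.e. the allocation side of the allocation-vs-utilisation distinction above; it says nothing about how much of that memory a game actually touches per frame.)

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # reports allocated memory, not the working set
print(f"VRAM allocated: {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()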
#35
inquisitor1
Good news. Light a fire under AMD's asz so they wake up too. Poor NVIDIA has no company at the top. So lonely.

I want to see Intel push hard, and AMD get spooked that Intel will pass them, so that they too wake up and bring the goods.

Bring xx70/xx80-level cards with 14 GB+ for gaming and video editing.
#36
wolf
Better Than Native
Vayra86: Until it is - which is just a matter of time and games progressing.
Technology marches forward and games get heavier, who'd have thought :rolleyes:
#37
Vayra86
wolf: Technology marches forward and games get heavier, who'd have thought :rolleyes:
The issue with that is that you don't really have the ability to predict when a game might actually utilise what you always thought was merely allocated. And as the bar keeps moving, you run into the situation many gamers eventually face: the card isn't sufficient anymore, or it drops the ball in very specific situations only. Except it's a shame to find it insufficient when there's lots of core power left, because frankly that's just a waste of a good GPU (one that would otherwise run things admirably at the desired settings). It's much better to run out of core before your VRAM, or just about in tandem.

Another issue with a lack of VRAM is that the GPU becomes impossible to resell at a half-decent price, further killing your ROI. But what stings most is that lacking VRAM effectively turns perfectly fine chips and boards into e-waste, something determined by the presence or absence of a single GB or a couple of GB worth of memory chips.

So yes, tech marches forward, indeed; better to get products that can withstand a bit of that, no? I know I wouldn't want to be double-checking every game review to see whether the VRAM usage creeps up to the dreaded 10 GB all the time, just ONE generation later... ;)
#38
wolf
Better Than Native
Vayra86: further killing your ROI
I mean, you're talking about a card that paid me back 2x its purchase price in mined Eth, which I jumped on after buying it for gaming anyway.

It's almost 4.5 years old now and still plays whatever I throw at it extremely well; it's very hard for me to be annoyed at the card for its VRAM.

From where I sit, I hear far more people who didn't buy one whine about the 10 GB than people who did, but hey, maybe that's just my perspective. I wouldn't change a thing about what I bought then and how I've held out since.
#39
JustBenching
Vayra86: The issue with that is that you don't really have the ability to predict when a game might actually utilise what you always thought was merely allocated. And as the bar keeps moving, you run into the situation many gamers eventually face: the card isn't sufficient anymore, or it drops the ball in very specific situations only. Except it's a shame to find it insufficient when there's lots of core power left, because frankly that's just a waste of a good GPU (one that would otherwise run things admirably at the desired settings). It's much better to run out of core before your VRAM, or just about in tandem.

Another issue with a lack of VRAM is that the GPU becomes impossible to resell at a half-decent price, further killing your ROI. But what stings most is that lacking VRAM effectively turns perfectly fine chips and boards into e-waste, something determined by the presence or absence of a single GB or a couple of GB worth of memory chips.

So yes, tech marches forward, indeed; better to get products that can withstand a bit of that, no? I know I wouldn't want to be double-checking every game review to see whether the VRAM usage creeps up to the dreaded 10 GB all the time, just ONE generation later... ;)
I don't get the notion that insufficient vram turns the card into dead weight. Surely you know you can drop textures from ultra to very high. I mean come on, it's not the end of the world
#40
_roman_
Hardcore Games: Because of predatory pricing I am using integrated graphics and see what works
If it works.

I tried gaming with the Ryzen 7600X integrated graphics. I tried gaming with an MSI NVIDIA GTX 960 4 GB (~45€, bought and sold on the second-hand market in 2023).
That is no fun.

For basic and office tasks the Ryzen 7600X graphics worked well for several months in 2023 under Windows 11 Pro / Gentoo GNU/Linux.

The Radeon 6600 is reasonably priced in my view. Those Intel cards may be an option for Windows gamers who do not care about driver quality.
JustBenching: I don't get the notion that insufficient vram turns the card into dead weight. Surely you know you can drop textures from ultra to very high. I mean come on, it's not the end of the world
I doubt it works that way.

Please just test it yourself for several months with a 4 GB VRAM card, an 8 GB VRAM card, and a 16 GB VRAM card.
I did my share of processor, DRAM, and graphics card testing over the past few years.

8 GB -> check WQHD and the game Control
4 GB -> check WQHD and Subnautica -> textures pop in after a few seconds on an NVIDIA GTX 960 with 4 GB VRAM in Windows at WQHD

I wanted to know which class of graphics card WQHD needs for my games: which GPU chip and how much VRAM. For testing I used those giveaway Epic games.
#41
Bruno_O
wolf: Allocation is not utilisation.
4k ultra with AA and RT on

it's utilisation and not having enough memory kills your fps, no matter jow much nvidia fanboys with 10-12GB may disagree
#42
freeagent
You can have enough RAM but not enough GPU; stranger things have happened :/
Bruno_O: no matter jow much
You spelled how wrong :rolleyes:
#43
wolf
Better Than Native
Bruno_O: 4k ultra with AA and RT on
Surprised you're doing that at all on a 7900XT honestly.
Bruno_O: it's utilisation and not having enough memory kills your fps, no matter jow much nvidia fanboys with 10-12GB may disagree
Having 20 GB has skewed your perception. Having more VRAM is of course better; fanboying has nothing to do with it, it's a flawed argument and a silly statement to throw out, and it doesn't change the fact that memory allocation and utilisation are different things. I'm well aware of what running out looks and feels like; I can purposely oversaturate the VRAM to test it.
#44
JustBenching
_roman_: Please just test it yourself for several months with a 4 GB VRAM card, an 8 GB VRAM card, and a 16 GB VRAM card.
I did my share of processor, DRAM, and graphics card testing over the past few years.

8 GB -> check WQHD and the game Control
4 GB -> check WQHD and Subnautica -> textures pop in after a few seconds on an NVIDIA GTX 960 with 4 GB VRAM in Windows at WQHD

I wanted to know which class of graphics card WQHD needs for my games: which GPU chip and how much VRAM. For testing I used those giveaway Epic games.
I did; I have two 8 GB cards, and even in those VRAM-heavy titles (Hogwarts with RT, for example) dropping the textures one notch works perfectly fine.
Bruno_O: 4k ultra with AA and RT on

it's utilisation and not having enough memory kills your fps, no matter jow much nvidia fanboys with 10-12GB may disagree
Because we all know how great AMD cards can handle RT thanks to their high amount of VRAM...
#45
Vayra86
JustBenching: I don't get the notion that insufficient vram turns the card into dead weight. Surely you know you can drop textures from ultra to very high. I mean come on, it's not the end of the world
Except that's not the issue I'm talking about. We all know you can tune settings. But if you keep cards for a long time, you're already solidly in that territory. I'm talking about years 4-6 of a GPU's life, when it has an easy 4 left and can and will run games on the current API just fine, but VRAM prevents it from being a GPU you can still play on.

For Ampere, this is already happening with all the 8 GB GPUs right now. They're either there already, depending on what you play, or you'll find yourself tweaking far more than you'd like, OR you'll still find you can't eliminate stutter entirely.

There's no denying this. I've experienced it with several GPUs over the past few decades: how they run out of juice, what you can do to prevent it, and what you can't do to eliminate the issue entirely. I've also seen what type of cards last FAR longer and return far greater value for money. They're never the midrangers with so-so VRAM. Those always go obsolete first, and when they do, their resale value is already in the shitter.

You mentioned Hogwarts. Now let's get real here. Say you bought an Ampere GPU. It was released some TWO years ago. Hogwarts is about the same age. That's not a great lifetime for a GPU to already be cutting back on settings, most notably texture quality, which improves the presentation at a low GPU performance cost. It's easy to have at maximum; all you really need is VRAM, and it always helps the presentation quite a bit, which can't be said of various post effects. It's not the end of the world, no, but can you really say you've got a well-balanced GPU at that point, if it can otherwise quite easily produce enough frames?

I believe we can and should expect more of products that keep rapidly increasing in price.
#46
JustBenching
Vayra86: Except that's not the issue I'm talking about. We all know you can tune settings. But if you keep cards for a long time, you're already solidly in that territory. I'm talking about years 4-6 of a GPU's life, when it has an easy 4 left and can and will run games on the current API just fine, but VRAM prevents it from being a GPU you can still play on.

For Ampere, this is already happening with all the 8 GB GPUs right now. They're either there already, depending on what you play, or you'll find yourself tweaking far more than you'd like, OR you'll still find you can't eliminate stutter entirely.

There's no denying this. I've experienced it with several GPUs over the past few decades: how they run out of juice, what you can do to prevent it, and what you can't do to eliminate the issue entirely. I've also seen what type of cards last FAR longer and return far greater value for money. They're never the midrangers with so-so VRAM. Those always go obsolete first, and when they do, their resale value is already in the shitter.

You mentioned Hogwarts. Now let's get real here. Say you bought an Ampere GPU. It was released some TWO years ago. Hogwarts is about the same age. That's not a great lifetime for a GPU to already be cutting back on settings, most notably texture quality, which improves the presentation at a low GPU performance cost. It's easy to have at maximum; all you really need is VRAM, and it always helps the presentation quite a bit, which can't be said of various post effects. It's not the end of the world, no, but can you really say you've got a well-balanced GPU at that point, if it can otherwise quite easily produce enough frames?

I believe we can and should expect more of products that keep rapidly increasing in price.
Well again, the one and only VRAM culprit is textures (and RT to an extent, but if you're talking about a 4-6 year timeline, you're not running RT regardless of VRAM).

Hogwarts is indeed 2 years old, but not many games actually use more VRAM than Hogwarts maxed out, even today. I literally picked one of the worst VRAM hogs, if not the worst. Also, you have to take into account that the VRAM issue occurs when using maxed-out RT on TOP of the ultra textures. And the context here is: sure, goddamn it ngreedia, had they put 12 GB of VRAM on my 3060 Ti, I would be able to run Hogwarts maxed out with RT, whereas now I have to lower texture quality to high to do that. GREAT. Problem is, the alternative product (the one with more VRAM) wouldn't let me run the game with RT at all (too freaking slow).

So sure, you can judge a product in a vacuum, but isn't it better to compare it to what else is available at that price point? That's where NVIDIA thrives, because sure, a lot of their products lack this, that, or the other, but they are usually better overall than the competition. In which case, it doesn't matter what they lack if they're still the better choice, does it?

E.g., it's quite fascinating that a measly 3060 Ti, with its 8 GB of VRAM choking it, can run that game with better image quality than a 6800 XT (unless you don't like RT, I guess). Fascinating, no?
#47
Vayra86
JustBenching: Well again, the one and only VRAM culprit is textures (and RT to an extent, but if you're talking about a 4-6 year timeline, you're not running RT regardless of VRAM).

Hogwarts is indeed 2 years old, but not many games actually use more VRAM than Hogwarts maxed out, even today. I literally picked one of the worst VRAM hogs, if not the worst. Also, you have to take into account that the VRAM issue occurs when using maxed-out RT on TOP of the ultra textures. And the context here is: sure, goddamn it ngreedia, had they put 12 GB of VRAM on my 3060 Ti, I would be able to run Hogwarts maxed out with RT, whereas now I have to lower texture quality to high to do that. GREAT. Problem is, the alternative product (the one with more VRAM) wouldn't let me run the game with RT at all (too freaking slow).

So sure, you can judge a product in a vacuum, but isn't it better to compare it to what else is available at that price point? That's where NVIDIA thrives, because sure, a lot of their products lack this, that, or the other, but they are usually better overall than the competition. In which case, it doesn't matter what they lack if they're still the better choice, does it?

E.g., it's quite fascinating that a measly 3060 Ti, with its 8 GB of VRAM choking it, can run that game with better image quality than a 6800 XT (unless you don't like RT, I guess). Fascinating, no?
You're absolutely not wrong. I just expect more when spending this kind of money. That's really what it comes down to: what does the overall deal look like, and how far do you want to go along with what a near-monopolist is presenting you with? That doesn't mean I'm a big AMD fan either, by the way.
#48
JustBenching
Vayra86: You're absolutely not wrong. I just expect more when spending this kind of money. That's really what it comes down to: what does the overall deal look like, and how far do you want to go along with what a near-monopolist is presenting you with? That doesn't mean I'm a big AMD fan either, by the way.
For sure, it's kinda silly not being able to run ultra textures in every game on any card that costs $300 and above (regardless of whether ultra textures actually make a difference or not), but looking at the whole picture it's a small price to pay for what you are getting. Again, assuming you care about what you are getting (RT / DLSS etc.).

With that said, I think the minimum for the new cards should be 12 GB (5060, 9060 XT - the $300 tier), but I don't think that's going to happen, lol.