
Intel Rumored to Launch Arc Battlemage GPU With 24GB Memory in 2025

Allocation is not utilisation.

Exactly.

Most of that memory is just cached so it's there whenever the game needs it. It keeps the GPU from pulling data from system DRAM, which is obviously slower and can inflict a small performance penalty.

Other than that, the whole VRAM debate is quite useless. 12GB is plenty for 1440p. Not even the latest TechPowerUp test shows any GPU taxing its full VRAM.

Not even I manage to get more than 8GB in Doom.
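A quick way to see the cached-versus-used split in practice, at least on the compute side (a minimal sketch, assuming a CUDA build of PyTorch; game engines manage VRAM their own way, but the "hold it just in case" idea is the same):

```python
import torch

# Allocate ~1 GiB of fp32 data, then free the tensor.
x = torch.empty(1024, 1024, 256, device="cuda")
del x

# The caching allocator keeps the block reserved so the next allocation is fast,
# which is exactly the "allocated but not actually in use" situation.
used = torch.cuda.memory_allocated() / 2**20    # MiB backing live tensors
cached = torch.cuda.memory_reserved() / 2**20   # MiB held by the allocator
print(f"in use: {used:.0f} MiB, reserved: {cached:.0f} MiB")
```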
 
Wow, that sounds sweet. If it comes out at a reasonable price, I might grab one for a second system meant just for inference.

On the other hand, this gives me some hope that, if they release a B700 model (likely with 16GB), a Pro model with 32GB could become a thing. If they're priced more reasonably than a 5090 (like in the ~$1k range), I might grab a couple to upgrade from my current setup.
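For rough sizing of what would fit, a back-of-the-envelope like this is what I'd go by (illustrative numbers only; the 20% overhead for KV cache and runtime is an assumption, not a measurement):

```python
def vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Weights only, padded ~20% for KV cache, activations and runtime."""
    return params_billion * bits_per_weight / 8 * overhead

for params, bits in [(13, 8), (13, 4), (32, 4), (70, 4)]:
    print(f"{params}B @ {bits}-bit ≈ {vram_gb(params, bits):.1f} GB")
# 13B @ 8-bit ≈ 15.6 GB and 32B @ 4-bit ≈ 19.2 GB: that's where 24GB starts to
# matter, and 70B @ 4-bit ≈ 42 GB is why a pair of cards looks tempting.
```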
It exists and it's called an aftermarket RX 6800 XT/6900 XT.

I'm fairly certain you can get much more out of this 192-bit, 12GB VRAM buffer than Intel has so far. GDDR7 (~30 GT/s) plus doubling the CU count, for starters. It's reasonable to clock the GPU itself 200 MHz lower so it doesn't burn your house down.
The 6000 series is pretty much shit for AI workloads, given that it lacks the WMMA instructions the 7000 series has.
Intel's software support is also way better than AMD's on that front; too bad they lack the hardware to make proper use of that stack.
AMD is the other way around: great hardware with lacking software.
I assume Intel will keep GPUs around because they are another AI product in their portfolio that could let them crack that market. Even if they don't earn gobs of money from the gaming side in the short term, the long-term upsides are very appealing from a business perspective. GPUs are probably the last thing you'd cut if you want the company to recover.

Intel cannot afford to miss another huge market like they did with mobile (tablets, phones, etc.).
AFAIK Falcon Shores is their upcoming AI platform; it resembles a GPU but doesn't have much to do with the current Arc lineup.
Their Gaudi offerings are pretty much meant solely for cheap inference, and their other GPU offerings in that space haven't gained any traction.

primary audience
The level of entitlement lol
 
Believe it or not, there are problems with relying on second-hand hardware from unknown sellers with unknown histories. Shocking, I know, but many prefer to buy hardware that HASN'T been driven to within an inch of its life.
Apart from straight-up scammers, there's very little risk of a GPU dying shortly after you buy it second hand. I only buy them used; only two have died on me so far (some chonker from the '00s and an R7 370), and I've bought more than 50 of them. The failure rate of brand-new cards is even higher. With used, you at least know it starts.
Besides, RDNA2 is only four years old, which means it was only mining for a couple of years, and miners nowadays are mostly good guys who don't abuse their wares. Sure, you might get a bit less VRAM overclocking headroom, but that doesn't matter all that much. My 6700 XT is more than alive and still overclocks no worse than when I bought it (Feb '23). RDNA2 PCBs are overengineered, and so are the coolers. These cards are built to last. You'd have to be completely paranoid to avoid them.
Or you could just learn to plug your power connectors in correctly so they don't melt.
I'm not talking about connectors. I know reading comprehension is rare these days, but I was talking about pure PSU overload. And yes, 40 Xe2 cores at these clocks might produce power spikes of over 500 W. Not every PSU under the sun can handle that. It's also not that easy to cool in the first place. Better safe than sorry.
 
Wow, that sounds sweet. If it comes out at a reasonable price, I might grab one for a second system meant just for inference.

On the other hand, this gives me some hope that, if they release a B700 model (likely with 16GB), a Pro model with 32GB could become a thing. If they're priced more reasonably than a 5090 (like in the ~$1k range), I might grab a couple to upgrade from my current setup.

The 6000 series is pretty much shit for AI workloads, given that it lacks the WMMA instructions the 7000 series has.
Intel's software support is also way better than AMD's on that front; too bad they lack the hardware to make proper use of that stack.
AMD is the other way around: great hardware with lacking software.

AFAIK Falcon Shores is their upcoming AI platform; it resembles a GPU but doesn't have much to do with the current Arc lineup.
Their Gaudi offerings are pretty much meant solely for cheap inference, and their other GPU offerings in that space haven't gained any traction.


The level of entitlement lol

If they release a 32GB card at a decent price, I'd pick that up immediately. Forget the 5090; that's likely to be super overpriced.
 
Interesting rumor! If such a card exists, it leads me to think that the current B580 is the full implementation of the BMG-G21 chip in every respect except memory capacity.


That could answer the riddle of why Intel didn't provide this information, as noted in the B580 review.
 
for the AI crowd on a budget :laugh:
 
Allocation is not utilisation.
Until it is - which is just a matter of time and games progressing.

Yep, I've seen 22GB on my 4090 (Hogwarts Legacy); I'm pretty confident it's not actually using that much VRAM.
While that might be true, there is a performance penalty for fetching data from elsewhere. A lot of that penalty you don't see, and some of it you do, for example when games dynamically adjust resolution or quality. Whether you see it or not, it's there, and we consider that acceptable to an extent. But the bottom line is still this: the allocation doesn't happen because the GPU would be just as well off NOT allocating that memory. If it were, why would it allocate it at all?
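To put a rough number on that penalty (peak figures only; the 4090 and PCIe 4.0 x16 bandwidths below are approximations, and real transfers are messier):

```python
# Approximate peak bandwidths, in GB/s.
VRAM_BW = 1008       # e.g. a 4090's GDDR6X
PCIE4_X16_BW = 32    # theoretical PCIe 4.0 x16; real-world is lower

asset_gb = 1.0       # a 1 GB chunk of assets the GPU suddenly needs
print(f"from VRAM:     {asset_gb / VRAM_BW * 1000:5.1f} ms")
print(f"over PCIe 4.0: {asset_gb / PCIE4_X16_BW * 1000:5.1f} ms")
# ~1 ms vs ~31 ms: the spill costs more than a whole 60 fps frame (16.7 ms),
# which is where the stutter or the sudden quality/LOD drop comes from.
```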
 
Good news. Light a fire under AMD's ass so they wake up too. Poor Nvidia has no company at the top. So lonely.

I want to see Intel push hard and AMD get spooked that Intel will pass them, so they too wake up and bring the goods.

Bring xx70/xx80-level cards with 14GB+ for gaming and video editing.
 
Until it is - which is just a matter of time and games progressing.
Technology marches forward and games get heavier, who'd have thought :rolleyes:
 
Technology marches forward and games get heavier, who'd have thought :rolleyes:
The issue with that is that you can't really predict when a game might actually utilize what you always thought was merely allocated. And as the bar keeps moving, you run into what many gamers eventually run into: the card's not sufficient anymore, or it drops the ball only in very specific situations. It's a shame to find it insufficient when there's lots of core power left, because frankly that's just a waste of a good GPU (one that would otherwise run things admirably at the desired settings). It's much better to run out of core before your VRAM, or just about in tandem.

Another issue with a lack of VRAM is that the GPU becomes impossible to resell at a half-decent price, further killing your ROI. But what stings most is that lacking VRAM effectively turns perfectly fine chips and boards into e-waste, something determined by the presence or absence of just one or two GB worth of memory chips.

So yes, tech marches forward indeed; all the more reason to get products that can withstand a bit of that, no? I know I wouldn't want to be double-checking every game review to see whether VRAM usage has crept up to the dreaded 10GB yet, just ONE generation later... ;)
 
further killing your ROI
I mean, you're talking about a card that paid me back 2x its purchase price in mined Eth, which I jumped on after buying it for gaming anyway.

It's almost 4.5 years old now and still plays whatever I throw at it extremely well; it's very hard to be annoyed at the card over its VRAM.

From where I sit, I hear far more people who didn't buy one whine about the 10GB than people who did, but hey, maybe that's just my perspective. I wouldn't change a thing about what I bought then and how I've held out since.
 
The issue with that is that you can't really predict when a game might actually utilize what you always thought was merely allocated. And as the bar keeps moving, you run into what many gamers eventually run into: the card's not sufficient anymore, or it drops the ball only in very specific situations. It's a shame to find it insufficient when there's lots of core power left, because frankly that's just a waste of a good GPU (one that would otherwise run things admirably at the desired settings). It's much better to run out of core before your VRAM, or just about in tandem.

Another issue with a lack of VRAM is that the GPU becomes impossible to resell at a half-decent price, further killing your ROI. But what stings most is that lacking VRAM effectively turns perfectly fine chips and boards into e-waste, something determined by the presence or absence of just one or two GB worth of memory chips.

So yes, tech marches forward indeed; all the more reason to get products that can withstand a bit of that, no? I know I wouldn't want to be double-checking every game review to see whether VRAM usage has crept up to the dreaded 10GB yet, just ONE generation later... ;)
I don't get the notion that insufficient vram turns the card into dead weight. Surely you know you can drop textures from ultra to very high. I mean come on, it's not the end of the world
 
Because of predatory pricing I'm using integrated graphics and seeing what works, if it works at all.

I tried gaming on the Ryzen 7600X's integrated graphics. I tried gaming on an MSI Nvidia GTX 960 4GB (~45€, bought and later sold on the second-hand market in 2023). That is no fun.

For basic and office tasks the Ryzen 7600X's graphics worked well for several months in 2023, under Windows 11 Pro / Gentoo GNU/Linux.

The Radeon 6600 is reasonably priced in my view. Those Intel cards may be an option for Windows gamers who don't care about driver quality.

I don't get the notion that insufficient vram turns the card into dead weight. Surely you know you can drop textures from ultra to very high. I mean come on, it's not the end of the world

I doubt it works that way.

Please just test it yourself for several months with a 4GB card, an 8GB card and a 16GB card. I did my share of CPU, DRAM and graphics card testing over the past few years.

8GB -> check WQHD and Control
4GB -> check WQHD and Subnautica -> textures pop in after a few seconds on an Nvidia GTX 960 4GB in Windows at WQHD

I wanted to know which class of graphics card WQHD needs for my games: which GPU chip and how much VRAM. For testing I used those Epic giveaway games.
 
You can have enough ram but not enough GPU, stranger things have happened :/
no matter jow much
You spelled how wrong :rolleyes:
 
4k ultra with AA and RT on
Surprised you're doing that at all on a 7900XT honestly.
it's utilisation and not having enough memory kills your fps,
Having 20GB has skewed your perception. Having more VRAM is of course better, and it doesn't change the fact that memory allocation and utilisation are different things. I'm well aware of what running out looks and feels like; I can purposely oversaturate the VRAM to test that.
 
Please just test it yourself for several months with a 4GB card, an 8GB card and a 16GB card. I did my share of CPU, DRAM and graphics card testing over the past few years.

8GB -> check WQHD and Control
4GB -> check WQHD and Subnautica -> textures pop in after a few seconds on an Nvidia GTX 960 4GB in Windows at WQHD

I wanted to know which class of graphics card WQHD needs for my games: which GPU chip and how much VRAM. For testing I used those Epic giveaway games.
I did. I have two 8GB cards, and even in those VRAM-heavy titles (Hogwarts with RT, for example), dropping the textures one notch works perfectly fine.

4k ultra with AA and RT on

it's utilisation and not having enough memory kills your fps, no matter jow much nvidia fanboys with 10-12GB may disagree
'Cause we all know how great AMD cards handle RT thanks to their high amount of VRAM...
 
I don't get the notion that insufficient vram turns the card into dead weight. Surely you know you can drop textures from ultra to very high. I mean come on, it's not the end of the world
Except that's not the issue I'm talking about. We all know you can tune settings. But if you keep cards for a longer time, you're already solidly in that territory. I'm talking about years 4-6 of a GPU's life, when it has an easy four left and can and will run games on the current API just fine, but VRAM prevents it from being a GPU you can still play on.

For Ampere, this is already happening with all the 8GB GPUs right now. They're either there already, depending on what you play, or you'll find yourself tweaking far more than you'd like, OR you'll still find you can't eliminate stutter entirely.

There's no denying this. I've experienced it across several GPUs over the past few decades: how they run out of juice, what you can do to delay it, and what you can't do to eliminate the issue entirely. I've also seen which types of cards last FAR longer and return far greater value for money. They're never the midrangers with so-so VRAM. Those always go obsolete first, and when they do, their resale value is already in the shitter.

You mentioned Hogwarts. Now let's get real here. Say you bought an Ampere GPU. It was released some TWO years ago. Hogwarts is also about that age. That's not a great lifetime for a GPU to already be cutting back on settings, most notably texturing, which costs little GPU performance relative to its image-quality benefit. It's easy to keep at maximum; all you really need is VRAM, and it always helps the presentation quite a bit, which can't be said of various post effects. It's not the end of the world, no, but can you really say you've got a well-balanced GPU at that point, if it can otherwise quite easily produce enough frames?

I believe we can and should expect more of products that keep rapidly increasing in price.
 
Well again, the one and only VRAM culprit is textures (and RT to an extent, but if you're talking about a 4-6 year timeline, you're not running RT regardless of VRAM).

Hogwarts is indeed two years old, but not many games actually use more VRAM than Hogwarts maxed out, even today. I literally picked one of the, if not the, worst VRAM hogs. Also, you have to take into account that the VRAM issue occurs when using maxed-out RT on TOP of the ultra textures. And the context here is: sure, goddamn it ngreedia, had they put 12GB of VRAM on my 3060 Ti, I would be able to run Hogwarts maxed out with RT, while now I have to lower texture quality to High to do that. GREAT. Problem is, the alternative product (the one with more VRAM) wouldn't let me run the game with RT at all (too freaking slow).

So sure, you can judge a product in a vacuum, but isn't it better to compare it to what else is available at that price point? That's where Nvidia thrives, because sure, a lot of their products lack this, that or the other, but they're usually better overall than the competition. In which case, it doesn't matter what they lack if they're still the better choice, does it?

E.g., it's quite fascinating that a measly 3060 Ti, with its 8GB of VRAM choking it, can run that game with better image quality than a 6800 XT (unless you don't like RT, I guess). Fascinating, no?
 
You're absolutely not wrong. I just expect more when spending this kind of money. That's really what it comes down to: what does the overall deal look like, and how far do you want to go along with what a near-monopolist puts in front of you? That doesn't mean I'm a big AMD fan either, by the way.
 
For sure, it's kind of silly not being able to run ultra textures in every game on any card that's $300 and above (regardless of whether ultra textures actually make a difference or not), but looking at the whole picture, it's a small price to pay for what you're getting. Again, assuming you care about what you're getting (RT / DLSS, etc.).

With that said, I think the minimum for the new cards should be 12GB (5060, 9060 XT, the $300 tier), but I don't think that's going to happen, lol.
 