
Intel Arc B580 Battlemage Unboxing & Preview

At this point, Intel and AMD are almost extinct in discrete GPU market share; they are behind Nvidia in technology, power efficiency, and performance.
This might be the last card they can play: a lower price. If AMD thinks that undercutting Nvidia by $30-50 at the same performance is enough, they are dead; they need to be more than $100 cheaper, considering FSR 4 will be inferior, ray tracing will probably be inferior, and in content creation they barely exist because they don't optimize their drivers much for those workloads.
 
The GPU market is an upgrader's market. If Intel can't provide RTX 3080 level performance for less, it doesn't matter what they do. Used cards would be better.
 
270 mm² to perform like a 7600 XT? It can't be true; they look like those Chinese cards you hear about every now and then... RIP Intel
 
There was a live stream yesterday from PCWorld. They had Tom Petersen from Intel on. The AMD card he compared this to was the 7600. He spent at least 15% of his time focusing on VRAM and how the 12 GB buffer was much better than what is available on the 4060 or 7600. They even produced slides showing that the B580 was 23% faster than those two cards. The problem Intel has is that XeSS might sound good, but how many games in your library actually have XeSS? Of all the games I play, only Redout 2 supports XeSS. While I don't see the 4060 Ti coming down in price, I could see the 7600 XT dropping to this level and the 7700 XT falling into that price bracket.

The GPU market is an upgrader's market. If Intel can't provide RTX 3080 level performance for less, it doesn't matter what they do. Used cards would be better.
The 6700XT could become very popular. Or even used 6800XTs. The 8800XT is looking like a perfect upgrade for people rocking 6000 series GPUs so they should be on the market soon too.
 
Personally, I look at Intel's cards as productivity cards. I mean, for the money, their DaVinci Resolve performance is insane with the AV1 support. They also do very well in CAD. Gaming is just a sidekick.

Most people are raving about the gaming, but I watched a few reviews from a productivity standpoint, and these are hidden gems for video editors, drawing, Blender, etc.

Can't wait for them to launch and be available. Shame the Pro A40/50/60 never reached the shelves.
 
There was a live stream yesterday from PCWorld. They had Tom Petersen from Intel on. The AMD card he compared this to was the 7600. He spent at least 15% of his time focusing on VRAM and how the 12 GB buffer was much better than what is available on the 4060 or 7600. They even produced slides showing that the B580 was 23% faster than those two cards. The problem Intel has is that XeSS might sound good, but how many games in your library actually have XeSS? Of all the games I play, only Redout 2 supports XeSS. While I don't see the 4060 Ti coming down in price, I could see the 7600 XT dropping to this level and the 7700 XT falling into that price bracket.


The 6700XT could become very popular. Or even used 6800XTs. The 8800XT is looking like a perfect upgrade for people rocking 6000 series GPUs so they should be on the market soon too.
Or what about the next generation of AMD also offering a more affordable 12 GB option? We're not that far from the next generation, or from it pushing the 7600 XT / 7700 XT toward the B580's price point.
Don't get me wrong, I want another player in this market, but based on everything I've seen, Intel seems content with getting zero market share and holding the crown for being marginally better value than AMD at MSRP. Great. Amazing.
At least their integrated GPUs will remain competitive against AMD, I guess.
 
https://www.techpowerup.com/review/intel-arc-b580-battlemage-unboxing-preview/images/arch59.jpg
https://www.techpowerup.com/review/intel-arc-b580-battlemage-unboxing-preview/images/arch60.jpg
https://www.techpowerup.com/review/intel-arc-b580-battlemage-unboxing-preview/images/arch61.jpg

Now, is it 10% faster on average than the RTX 4060, or 23-25% faster? Intel and their stupid slides playing tricks on me again. Damn!

[attached slide image]

10% at the date of release, with more performance unlocking as drivers mature? Other factors, such as a conjunction of the stars? I can't find the slide with the footnotes.

Also, do they really compare a 190 W TDP card to the RTX 4060, which is a 115 W TDP card? If AMD did that, oh boy...
The B700-series is bound to consume 300+ W.
 
https://www.techpowerup.com/review/intel-arc-b580-battlemage-unboxing-preview/images/arch59.jpg
https://www.techpowerup.com/review/intel-arc-b580-battlemage-unboxing-preview/images/arch60.jpg
https://www.techpowerup.com/review/intel-arc-b580-battlemage-unboxing-preview/images/arch61.jpg

Now, is it 10% faster on average than the RTX 4060, or 23-25% faster? Intel and their stupid slides playing tricks on me again. Damn!

10% at the date of release, with more performance unlocking as drivers mature? Other factors, such as a conjunction of the stars? I can't find the slide with the footnotes.

Also, do they really compare a 190 W TDP card to the RTX 4060, which is a 115 W TDP card? If AMD did that, oh boy...
The B700-series is bound to consume 300+ W.
Nah, the 23-25% is in selected ray-tracing games, including a bunch where the 4060 keels over and dies because it doesn't have enough VRAM.
Once you remove ray tracing, VRAM usage goes down and it puts up a better fight.
Also, it's not 25% better, it's 25% better in performance/$ at MSRP.

For convenience, see this.
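To make that distinction concrete, here's a minimal Python sketch of how a modest raw-performance lead turns into a much larger performance-per-dollar lead. The 10% figure and the $299 price for the 4060 are illustrative assumptions, not measured results; $249 is the B580's announced MSRP.

```python
# Illustrative only: an assumed ~10% average performance lead and assumed
# prices; swap in real review numbers to reproduce a perf/$ comparison.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance divided by price (higher is better)."""
    return relative_perf / price_usd

b580 = perf_per_dollar(relative_perf=1.10, price_usd=249)      # announced MSRP
rtx_4060 = perf_per_dollar(relative_perf=1.00, price_usd=299)  # assumed price

advantage = b580 / rtx_4060 - 1
print(f"perf/$ advantage: {advantage:.0%}")  # ~32% here, from only a 10% raw lead
```

The point being: a perf/$ headline number says as much about the chosen comparison price as it does about the silicon.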
 
@maxus24 When can we expect your frame generation review of the new XeSS 2 frame generation feature? Looking forward to comparisons with FSR and DLSS frame gen.
 
Nah, the 23-25% is in selected ray-tracing games, including a bunch where the 4060 keels over and dies because it doesn't have enough VRAM.
Once you remove ray tracing, VRAM usage goes down and it puts up a better fight.
Also, it's not 25% better, it's 25% better in performance/$ at MSRP.
I'd also say that those numbers are probably with upscaling in use.
 
@W1zzard

Good preview

maybe you could test DXVK vs. DX11 and VKD3D vs. DX12, to see how much the DX path has improved, or whether Vulkan is still the better option for Intel GPUs

:)
 
10% faster than the 4060 at 1440p? An interesting resolution to show, when the 4060 is a 1080p card, not a 1440p one. On top of that, the 4060 is already over a year old and soon to be replaced.
Yeah, the 1440p resolution is beyond a 4060's abilities - because it's fallen off the memory bandwidth cliff by that point - so using it at 1440p isn't an apples-to-apples comparison of general compute/shader performance.

The 4060 sucks at 1440p, so saying that B580 is better than "it sucks" is disingenuous.
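For anyone wondering where that bandwidth cliff comes from, a quick back-of-the-envelope sketch in Python using the published memory specs of both cards (128-bit / 17 Gbps GDDR6 on the 4060, 192-bit / 19 Gbps on the B580):

```python
# Peak GDDR6 bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# Bus widths and data rates below are the published specs for each card.

def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"RTX 4060: {mem_bandwidth_gb_s(128, 17.0):.0f} GB/s")  # 272 GB/s
print(f"Arc B580: {mem_bandwidth_gb_s(192, 19.0):.0f} GB/s")  # 456 GB/s
```

Nvidia leans on the 4060's large L2 cache to compensate, which works at 1080p but falls apart as the working set grows at 1440p.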
 
The issue I have with this card is the narrative. PCWorld just posted an article with the headline "The GPU we have been asking for since the pandemic". It had better live up to all that hype.
 
The hardware is OK-ish; however, they seriously need to pick up the pace on the software side. Their XeSS is still miles behind DLSS Quality, both in image quality and FPS.
Also, they should bring frame generation to their products ASAP, otherwise the RTX 4060 will still be the evident winner and the obvious buy.
Funny how you're one of the individuals that put the angry face on my post when you clearly have the same issues I have with Intel's shameful attempt at entering the gaming market. I was insanely optimistic about Intel's initial launch; however, I watched what they did and didn't do since then. This seems like more of the same. Now, if Intel is serious and makes massive strides with their drivers, I would be willing to support them with a product purchase. Until then, I have no hope that Intel can pull this off as I watch the CPU division falter. If I were a betting man, which I'm not, I'd bet that Intel diverts resources from the GPU sector to the CPU sector.
 
What? You think people are going to switch to Intel from Nvidia, not AMD, because of price? AMD has the "bad drivers" stigma, but it is not true!
Yet the claim is that "it is gonna happen".

Jeez, if only people bought the cheaper card instead of blindly buying Nvidia because of FOMO.
 
At some point, you have to let old technology go. ReBAR support with no user intervention started with Intel 10th gen and AMD Zen+. If you are willing to spend a little time, the ReBarUEFI project will add ReBAR support all the way back to 4th-gen Intel.
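If you want to check what your system currently exposes before going the ReBarUEFI route, here's a minimal sketch, assuming a Linux box with a kernel recent enough to provide the resourceN_resize sysfs attribute (run as root to read the values):

```python
# Minimal sketch: list PCI devices exposing resizable BARs via sysfs.
# Each resource<N>_resize file holds a hex bitmask; bit n set means a
# 2^n MB BAR size is supported. Assumes Linux with a recent kernel.
from pathlib import Path

for resize in sorted(Path("/sys/bus/pci/devices").glob("*/resource*_resize")):
    try:
        mask = int(resize.read_text().strip(), 16)
    except (OSError, ValueError):
        continue  # not readable without root, or unexpected contents
    sizes = [f"{2 ** bit} MB" for bit in range(64) if mask & (1 << bit)]
    print(f"{resize.parent.name} ({resize.name}): {', '.join(sizes)}")
```

On Windows, GPU-Z shows the same information in its "Resizable BAR" field.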


I've been pretty pleased with my A770. I always find it interesting that the folks who don't actually own one of these cards make the most sweeping statements.


I replaced my 3060 with my A770 and saw a performance increase - the A770 is more of a 1440p card than a 1080p card.

Unlike you, I actually own two Intel cards: my A770 is my daily driver, and my A310 is in my AV1 encoding box.
I'm seriously happy for you, and I actually mean it - no shade. Hardware Unboxed had a whole video listing all the issues in the games he owned when played on his Arc card. I'm not dropping that kind of money to deal with issue after issue. That video was uploaded by Tom in July of this year. If this launch is even 50% better than the last, I'll call it a success. If it's just as bad with games such as Fortnite, it's a problem and I can't get behind them.
 
I see we're already firing the furnaces for Another Major Disappointment.

Is the mindshare department right behind you? Are they whispering sweet nothings into your ears? It couldn't have anything to do with generation after generation of disappointment, could it? Maybe it's just that AMD has made it clear they don't care about budget users, so Intel making advancements here is actually exciting, as opposed to another 6600 XT 8 GB rebrand.

If Intel has actually made some significant arch improvements, the B770 is going to be actually interesting. A 4070 for $400?
I agree, it all hinges on whether they have made significant improvements in the new architecture; if so, it will be a very interesting proposition. I'm a little excited to see some of these new mid-tier GPUs in the wild. I still miss buying GTX 580s and HD 6970s for around $500, lol.

I don't really care about the ray-tracing improvements, mostly because at the tier these GPUs sit at, you're going to struggle to run it anyway unless they make some ridiculous improvements.
 
Ahh ok. So bout that 5090...
I can only speak for myself personally: coming from a GTX 970 OCII (yeah, I know), at that price point it looks super interesting for sure, as I'm not into the latest and greatest. This will certainly be my replacement (the B580, if not the coming B700-series). Good one :)
 
I can only speak for myself personally: coming from a GTX 970 OCII (yeah, I know), at that price point it looks super interesting for sure, as I'm not into the latest and greatest. This will certainly be my replacement (the B580, if not the coming B700-series). Good one :)

When you come from an Nvidia GTX 970, you will not necessarily be happy with an Intel, AMD, or Nvidia GPU.

Newer graphics cards need a mainboard with a firmware feature called Resizable BAR (ReBAR). Some cards will not even display a picture on certain older mainboards; I read that from time to time, while the same card works flawlessly on other mainboard + processor + RAM combinations.

I wonder how many people need a low-end entry graphics card in the "performance range" of an Nvidia 4060.

What? You think people are going to switch to Intel from Nvidia, not AMD, because of price? AMD has the "bad drivers" stigma, but it is not true!

That is your opinion. I disagree.
If your standard is merely that a game will run on an AMD graphics card, then the stigma isn't true.

If your standard is like mine, it's a different story: decent GNU/Linux drivers; decent Windows drivers; no awfully high idle power consumption caused by high idle memory clocks (seen in Windows and Gentoo Linux with an MSI Radeon 6800 Z Trio); fan curves that actually get applied in Windows 11 Pro; no random driver crashes in Windows 11 Pro 23H2 and 24H2 with a PowerColor Radeon 7800 XT Hellhound.
These are long-term bugs spanning many years. As a consumer with several different AMD graphics cards, I expect much higher driver quality for Windows 11 Pro and Gentoo Linux, and I expect AMD to work on bugs for their graphics cards, mainboards, and processors. That is a short summary of the many known bugs I have experienced. In short, my opinion: AMD has bad GPU drivers for Windows 11 Pro and nearly nonexistent Gentoo Linux drivers. (I have written several times in the past about why, on this graphics card topic.)

It is very sad. I have filled out the little bug-report form in the Windows GPU driver software several times, with a new explanation and screenshot each time, and the bug still exists after more than two years. It's the fan curve / game profile bug; I think I have reported that particular bug at least seven times in the past three years. You can check visually whether a Radeon 7800 XT Hellhound's fans are spinning after you create a game profile.
 
I replaced my 3060 with my A770 and saw a performance increase - the A770 is more of a 1440p card than a 1080p card.
It objectively isn't; all the latest reviews here show that it even struggles at 1080p and is clearly slower than a 7600 XT, trading blows with a 6600 / 6600 XT. It's an inconsistent 1080p card; calling it a 1440p card isn't right.
At this point, Intel and AMD are almost extinct in discrete GPU market share; they are behind Nvidia in technology, power efficiency, and performance.
That's what happens when the only serious GPU builder is one company and the others are only in it "on the side", being mainly CPU companies; for both of them, GPUs only matter in the data center, and, oh surprise, AMD is very competitive with Nvidia in the data center - and that's it. AMD's data-center GPUs are also huge and expensive, really not comparable to something like the 7900 XTX, which is all about "let me save costs, let me save 5 nm wafers". It's a night-and-day difference between where there is money to be made and where there isn't.
Or what about the next generation of AMD also offering a more affordable 12 GB option? We're not that far from the next generation, or from it pushing the 7600 XT / 7700 XT toward the B580's price point.
Next gen will surely be faster; whether it has more VRAM remains to be seen, but 8 GB of VRAM is still mostly sufficient for 1080p and even 1440p. Heck, the reviewer here mostly even says it's enough for 4K in his tests. Most games, not all of them, mind you. So 8 GB is still safe for 1080p at least.
Ask Sapphire why they can't make a cooler that keeps their VRAM below 90 °C at a 22 °C ambient with their fans at over 2,000 RPM.
Sounds more like a heat pad / TIM issue than a bad cooler. Sapphire is pretty solid in general.
I was insanely optimistic about Intel's initial launch; however, I watched what they did and didn't do since then. This seems like more of the same.
The whole Battlemage fiasco is two years too late; it should have been competing with Ada / RDNA 3, but instead it will compete with their successors. Just kind of bad, tbh.
Hardware Unboxed had a whole video listing all the issues in the games he owned when played on his Arc card.
Yes, but to be honest, they have released 50 drivers; I think they did a lot of work, and I will give them that. So Arc is probably pretty usable now, compared to back then, when it was clearly an alpha/beta product and kind of unusable.
At 272 mm^2, this is an expensive GPU to fabricate. Intel's margin on this must be extremely low. For context, Navi 31's GCD is only 12% bigger. It is also 45% larger than the 4060 Ti's AD106. This doesn't bode well for larger Intel GPUs in this generation.
Yep, again: Intel can't get the performance with small GPUs. They have to build one at 250+ mm² to hit the performance of AMD parts at only ~200 mm² on 6 nm (!), while Intel uses 5 nm (!). Nvidia at 5 nm is HALF the size, at less than 150 mm² (!). They are so far away from what NV/AMD can do, it's funny; it's at least 1-2 generations. Only their RT cores are somewhat impressive, and I will admit that. XeSS isn't bad either. They did some things right.
 
At 272 mm^2, this is an expensive GPU to fabricate. Intel's margin on this must be extremely low. For context, Navi 31's GCD is only 12% bigger. It is also 45% larger than the 4060 Ti's AD106. This doesn't bode well for larger Intel GPUs in this generation.
Yeah, and it's on TSMC N5, which is the more expensive node that AMD uses only for the higher-end GCDs. The 7000-series MCDs and the lower-end Navi 33, at the B580's $250 price point, are on el-cheapo N6.

I don't think it's quite as bad as you make out, though; the 7700 XT is selling for $400 or so, and that's a 25% larger die than the B580 that also requires a VRM capable of delivering 25% more power. It only looks bad compared to Nvidia, but they're using TSMC N4 to get their low power consumption and teeny-tiny die size with AD106.

On an unrelated note, does anyone know why Intel have singled out this one particular DP output?

[attached photo of the card's display outputs]
 
Nvidia but they're using TSMC N4 to get their low power consumption and teeny-tiny die size with AD106.
Nvidia marketing; I'm not blaming you, but 4N != N4. It's TSMC N5P with "Nvidia optimisations" and a different name; those "optimisations" are something every company that buys from TSMC gets, it's NOTHING special. RTX 5000, on the other hand, will indeed use N4P (the real thing).
On an unrelated note, does anyone know why Intel have singled out this one particular DP output?
It's the only port with the DisplayPort UHBR13.5 mode (13.5 Gbit/s per lane); the other ones are "just" UHBR10.
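For a sense of what that difference buys you, here's a quick Python sketch of the raw DP 2.1 payload math; it assumes four lanes and 128b/132b coding, and ignores FEC and other protocol overhead, so real-world figures come out slightly lower:

```python
# DisplayPort 2.1 payload estimate: per-lane rate x lanes x 128b/132b coding
# efficiency. Ignores FEC/protocol overhead, so this slightly overstates
# usable bandwidth.
def dp_payload_gbit_s(lane_rate_gbps: float, lanes: int = 4) -> float:
    return lane_rate_gbps * lanes * 128 / 132

print(f"UHBR10:   {dp_payload_gbit_s(10.0):.1f} Gbit/s")  # ~38.8 Gbit/s
print(f"UHBR13.5: {dp_payload_gbit_s(13.5):.1f} Gbit/s")  # ~52.4 Gbit/s
```

In practice that extra headroom matters for high-refresh 4K without display stream compression.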
 
Yeah, and it's on TSMC N5, which is the more expensive node that AMD uses only for the higher-end GCDs. The 7000-series MCDs and the lower-end Navi 33, at the B580's $250 price point, are on el-cheapo N6.

I don't think it's quite as bad as you make out, though; the 7700 XT is selling for $400 or so, and that's a 25% larger die than the B580 that also requires a VRM capable of delivering 25% more power. It only looks bad compared to Nvidia, but they're using TSMC N4 to get their low power consumption and teeny-tiny die size with AD106.

On an unrelated note, does anyone know why Intel have singled out this one particular DP output?

The 7700 XT uses a smaller GCD and only 3 minuscule MCDs built on an older and cheaper process. It also has the luxury of using defective dies for the GCD, whereas the B580 needs perfect dies. MCD yield, using the latest available defect rate for N6, is about 98%. All in all, the 7700 XT should cost nearly the same, as the higher VRM cost is offset by the lower chip costs. Nevertheless, as long as they make a profit, it's good enough. However, the poor PPA (Power, Performance, Area) does indicate that larger Battlemage GPUs won't be very competitive versus AMD, let alone Nvidia.
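That ~98% figure is consistent with the simple Poisson die-yield model, yield = exp(-D0 * A). A small sketch; the defect density D0 = 0.05/cm² is an assumption chosen to reproduce the quoted MCD number, not a published TSMC figure, and applying the same D0 to the B580's N5 die is only for illustration:

```python
# Poisson die-yield model: yield = exp(-D0 * A), with D0 in defects/cm^2
# and die area A in cm^2. D0 below is an assumed value, not a TSMC figure.
from math import exp

def poisson_yield(area_mm2: float, d0_per_cm2: float) -> float:
    return exp(-d0_per_cm2 * area_mm2 / 100)  # 1 cm^2 = 100 mm^2

print(f"MCD  (~37.5 mm^2): {poisson_yield(37.5, 0.05):.1%}")   # ~98.1%
print(f"B580 (272 mm^2):   {poisson_yield(272.0, 0.05):.1%}")  # ~87.3%
```

Small dies win twice: more candidates per wafer and a higher fraction of them defect-free, which is exactly why the chiplet MCDs are so cheap to harvest.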

Yep, again: Intel can't get the performance with small GPUs. They have to build one at 250+ mm² to hit the performance of AMD parts at only ~200 mm² on 6 nm (!), while Intel uses 5 nm (!). Nvidia at 5 nm is HALF the size, at less than 150 mm² (!). They are so far away from what NV/AMD can do, it's funny; it's at least 1-2 generations. Only their RT cores are somewhat impressive, and I will admit that. XeSS isn't bad either. They did some things right.
Definitely, raytracing performance and XeSS are wins for Intel and AMD would do well to learn from that.
 