Monday, November 25th 2024

Intel Arc B580 GPU Leaked Ahead of Imminent Launch

Intel's upcoming Arc B580, part of the next-generation Battlemage GPU series, has been leaked, giving tech enthusiasts an early glimpse of what's to come. Amazon unintentionally revealed the ASRock Arc B580 Steel Legend graphics card, confirming key specifications and indicating that new discrete GPU contenders are about to arrive. The Arc B580 will feature Intel's Xe2-HPG architecture, with a factory-overclocked speed of 2.8 GHz. The board is equipped with 12 GB of GDDR6 memory running at 19 Gbps. While the core count remains unconfirmed, the product listing highlights support for 8K resolution (7680 x 4320) and dual 8-pin power connectors. Two variants are expected: the Steel Legend revealed in Amazon's listing, and the Challenger, which has yet to appear in detail.
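For context, the leaked numbers allow a rough memory bandwidth estimate, since bandwidth is simply the per-pin data rate times the bus width divided by eight. The listing does not state a bus width, so the 192-bit figure in this sketch is an assumption (typical for 12 GB GDDR6 cards), not a confirmed spec:

#include <iostream>

int main() {
    const double data_rate_gbps = 19.0;   // per-pin data rate, from the listing
    const double bus_width_bits = 192.0;  // assumed: common for 12 GB GDDR6, not confirmed by the leak
    const double bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8.0;
    std::cout << "Estimated memory bandwidth: " << bandwidth_gb_s << " GB/s\n";  // ~456 GB/s
    return 0;
}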

The leak also aligns with Intel's upcoming plans for the Battlemage series, as references to the Arc B-Series have been uncovered in Intel's official documentation, including the "Intel oneAPI Math Kernel Library" and "Intel Platform Monitoring Technology" files. The Arc B580 will lead the charge, potentially alongside a slightly less powerful Arc B570 model. Targeting the mid-range market, Intel seems poised to offer a competitive alternative to rival GPUs. AMD will also stick to the mid-range segment with RDNA 4, so the mid-range battle is the one to watch this upcoming season. With support pages for the B-Series already live and public leaks increasing, an official announcement appears imminent, possibly within days. We previously learned that Battlemage is set to launch in December, ahead of CES in January, to steal some of the show's attention.
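As an aside on the oneAPI reference: once drivers land, a Battlemage card should show up through the standard SYCL device query, just like current Arc parts. Below is a minimal, generic SYCL 2020 sketch for enumerating visible GPUs; nothing in it is specific to the B580:

#include <sycl/sycl.hpp>
#include <iostream>

int main() {
    // Print the name and memory size of every GPU the SYCL runtime can see.
    for (const auto& dev : sycl::device::get_devices(sycl::info::device_type::gpu)) {
        std::cout << dev.get_info<sycl::info::device::name>() << ", "
                  << dev.get_info<sycl::info::device::global_mem_size>() / (1024 * 1024)
                  << " MiB of device memory\n";
    }
    return 0;
}

With Intel's oneAPI toolchain, this compiles as icpx -fsycl list_gpus.cpp.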
Sources: VideoCardz #1, VideoCardz #2

20 Comments on Intel Arc B580 GPU Leaked Ahead of Imminent Launch

#1
john_
I am watching the latest video from (you all love him here) Moore's Law Is Dead, which says the B580 is expected to perform like an RX 7600 XT with 12 GB of VRAM, and I wonder if Intel can market such a card at a low price of less than $230-$250 and still expect to make some profit. It will be really difficult to sell at a higher price, even against current models.
#2
bonehead123
So basically a mid-range card with mid-range performance & specs...

And if they price it accordingly, it will probably fly off the shelves; if not, it will be DOA in most respects :D
#3
kapone32
Wow, that is kind of sad if the performance is in the range of the 7600 XT, with 12 GB of VRAM instead of 16. It would be interesting to see the difference in VRAM-intensive games.
#4
Darmok N Jalad
The toughest part is not knowing what driver support is going to look like. I know Intel has been pretty diligent with drivers so far, but it's another generation and Intel has been cutting costs. You're taking a bit of a chance in order to save some money. Plus, there's not much stopping the financially sound AMD and NVIDIA from just doing some price cuts if things start to get interesting.
#5
bug
So, 2.8 GHz and 12 GB of 19 Gbps VRAM. That's all that leaked.
#6
Legacy-ZA
I was actually very impressed with their previous Arc offerings, especially how well they did with ray tracing. I am very excited to see what Intel GPUs will be capable of in the future.
#7
lexluthermiester
kapone32: Wow, that is kind of sad if the performance is in the range of the 7600 XT, with 12 GB of VRAM instead of 16. It would be interesting to see the difference in VRAM-intensive games.
Not really. If the price is right, it's a good thing.
Legacy-ZA: I was actually very impressed with their previous Arc offerings, especially how well they did with ray tracing. I am very excited to see what Intel GPUs will be capable of in the future.
Exactly!
#8
Solaris17
Super Dainty Moderator
lexluthermiester: Not really. If the price is right, it's a good thing.
Agreed. I also think people around the net are missing that the 5xx is the mid-range lineup. These aren't their high-end models. A 5xx card getting 12 GB of VRAM is good, IMO!
#9
lexluthermiester
Solaris17: A 5xx card getting 12 GB of VRAM is good, IMO!
There's that too. If the B750/B770 have 16 GB/24 GB, things are going to be interesting.
#10
Solaris17
Super Dainty Moderator
lexluthermiester: There's that too. If the B750/B770 have 16 GB/24 GB, things are going to be interesting.
Especially if you consider that its predecessor, the A580, only got 8 GB.
#11
bug
Legacy-ZA: I was actually very impressed with their previous Arc offerings, especially how well they did with ray tracing. I am very excited to see what Intel GPUs will be capable of in the future.
I thought it was a good first step as well. Too power hungry, and maybe it could have used a bit better scaling so it didn't top out where it did. The drivers could have been better at launch, too. But considering all the things that could have gone wrong, I thought it wasn't bad. And I said it back then: the first gen is OK; the second gen will probably be make or break.

I wasn't particularly impressed by RT, though. It was an easy win because AMD tried to ignore RT and thus lagged (lags?) behind. But again, it's another area where Intel could have faltered, yet managed to do pretty well.
#12
Macro Device
12 GB of VRAM isn't bad, but it's kinda meaningless if we don't know the actual performance level. Both the GTX Titan X (Maxwell) and the RTX 3080 Ti are 12 GB GPUs, with a light-year of difference in actual gaming and productivity finesse.
If it beats the 6700 XT, then it deserves to be priced closer to $300.
If it neither beats nor loses to the 6700 XT, then the price tag should be as close to $200 as possible.
If it doesn't even match the 6700 XT, then why does it exist (don't forget it has TWO 8-pin power connectors!)?
#13
claylomax
john_: I am watching the latest video from (you all love him here) Moore's Law Is Dead, which says the B580 is expected to perform like an RX 7600 XT with 12 GB of VRAM, and I wonder if Intel can market such a card at a low price of less than $230-$250 and still expect to make some profit. It will be really difficult to sell at a higher price, even against current models.
Did you watch the video where he said that Battlemage had been cancelled?
#14
kapone32
Solaris17: Agreed. I also think people around the net are missing that the 5xx is the mid-range lineup. These aren't their high-end models. A 5xx card getting 12 GB of VRAM is good, IMO!
Well, the 7600 XT was as low as $269 Canadian for the ASRock variant, and people still bashed it even though it came with a 16 GB frame buffer. In the age of SAM and ReBAR, that can make a real difference vs. a 6600 XT. The thing is, though, that the GPU itself is not the fastest, and as a result it is not that much faster than a 6650 XT in non-VRAM-intensive scenarios. I struggle to understand how, in this narrative-based world, the fastest card, period, is the only solution. Value no longer holds any sway in this market: a 5090 will cost more than most AMD-based builds that deliver 80-105% of its performance, but most people will still only get more entrenched.
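Since SAM/ReBAR comes up here: on Linux, one rough way to check whether a card exposes a resizable BAR is to read the BAR sizes from sysfs; a BAR roughly the size of the card's VRAM suggests ReBAR is active. A minimal sketch, with the PCI address 0000:03:00.0 as a placeholder (find yours with lspci):

#include <cstdint>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main() {
    // Placeholder PCI address; substitute your GPU's (see `lspci | grep -i vga`).
    const std::string path = "/sys/bus/pci/devices/0000:03:00.0/resource";
    std::ifstream file(path);
    if (!file) {
        std::cerr << "cannot open " << path << '\n';
        return 1;
    }
    std::string line;
    int bar = 0;
    while (std::getline(file, line)) {
        // Each line holds start, end, and flags for one BAR, in hex.
        std::istringstream iss(line);
        std::uint64_t start = 0, end = 0, flags = 0;
        iss >> std::hex >> start >> end >> flags;
        if (end > start) {
            const double gib = static_cast<double>(end - start + 1) / (1 << 30);
            std::cout << "BAR " << bar << ": " << gib << " GiB\n";
        }
        ++bar;
    }
    return 0;
}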

I remember posting, when Intel announced they were investing in the GPU space, that the day would come when we would see Intel and AMD start offering OEM packages at deep discounts to HP, Lenovo, and others to create their own ecosystems, and then Nvidia tried to buy ARM. Now they are making their own branded CPUs. This had better be received well by the narrative, or we will all regret the loss of some aspects of DIY. You know companies like HP would love to take us back to the days of Compaq, and it seems some of the manufacturers are keen to make that happen as well. Kudos to ASRock; they are about the only GPU vendor that makes and stocks just about every card from Nvidia, AMD, and Intel.
bug: I thought it was a good first step as well. Too power hungry, and maybe it could have used a bit better scaling so it didn't top out where it did. The drivers could have been better at launch, too. But considering all the things that could have gone wrong, I thought it wasn't bad. And I said it back then: the first gen is OK; the second gen will probably be make or break.

I wasn't particularly impressed by RT, though. It was an easy win because AMD tried to ignore RT and thus lagged (lags?) behind. But again, it's another area where Intel could have faltered, yet managed to do pretty well.
RT is the best snake oil Nvidia ever sold the market.
#15
bug
claylomax: Did you watch the video where he said that Battlemage had been cancelled?
Must be the best cancelled card ever?
#16
Daven
Given that all Intel GPUs are low-end, I'm not sure there is a large customer pool for the middle of the low end. So why release the B580 model first?

Last time, the A580 was released last, about a year after the A770.
#17
claylomax
bug: Must be the best cancelled card ever?
I was being ...
forget it; it doesn't work online :laugh:
#18
Eternit
The main problem with this card is that it will hit the market at least a year too late. It will arrive after Christmas and just before the next generation from NVIDIA and AMD.
#19
Legacy-ZA
@W1zzard

Speaking of upcoming GPUs... in future reviews, will you be taking notes on which GPUs have fuses on the PCB? I have seen some horrible stuff lately, and it would be great to have some extra surety, especially with that idiotic power connector causing issues here and there. A fuse might just save one's GPU and make it salvageable once it falls out of warranty.
#20
john_
claylomax: Did you watch the video where he said that Battlemage had been cancelled?
Is this a question out of curiosity, or do you consider it an argument? Because it is not one.
kapone32: Wow, that is kind of sad if the performance is in the range of the 7600 XT, with 12 GB of VRAM instead of 16. It would be interesting to see the difference in VRAM-intensive games.
It depends on pricing. If a 12 GB model comes in at an MSRP of less than $250, it will do us consumers good. There are models on the market with 12 GB or more at low prices, like the A750/A770 and the 12 GB RTX 3060, but with the exception of the A750, which had a $280 MSRP, the rest started at much higher MSRPs. We need to see that memory capacity at a low MSRP. At least before 2040, because this is getting ridiculous.
Darmok N Jalad: The toughest part is not knowing what driver support is going to look like. I know Intel has been pretty diligent with drivers so far, but it's another generation and Intel has been cutting costs. You're taking a bit of a chance in order to save some money. Plus, there's not much stopping the financially sound AMD and NVIDIA from just doing some price cuts if things start to get interesting.
Drivers will lag behind AMD's and Nvidia's, not because Intel is cutting costs, but because no one is developing games on Arc hardware. It's the same thing AMD was facing 15 years ago, when all games were optimized for Intel CPUs and Nvidia GPUs on day one of their release, and that was the main reason AMD was getting all that hatred for its drivers.
As for pricing, I have been saying for years that AMD can't price its hardware much lower than Nvidia's, because it doesn't have the wafers, the money, the architecture advantage, the brand name, the tech press support, or the consumer support to even start imagining winning a price war against Nvidia. But Intel might have a better chance here, because no one takes them seriously in the GPU market. So they could start with aggressive pricing and hope that AMD will not respond with price drops, out of fear of dragging Nvidia into a price war that, in the end, only Nvidia can win.
kapone32: RT is the best snake oil Nvidia ever sold the market.
DLSS is. The marketing campaign against FSR proves that. While Nvidia is using some games as technology demos for advertising RT performance, with those games implementing path tracing that demolishes everything below an RTX 4080, the real huge online campaign over the last two years has been against FSR. Even if FSR 4.0 turns out better than any future DLSS (it won't, but for the sake of argument let's say it will), we will be reading for years that "DLSS is better than real life and FSR is cr@p".
Daven: Given that all Intel GPUs are low-end, I'm not sure there is a large customer pool for the middle of the low end. So why release the B580 model first?

Last time, the A580 was released last, about a year after the A770.
Because Nvidia's and probably AMD's lower-end models will come much later. Nvidia will only release RTX 5000 models priced probably north of $800 for the first few months, and AMD might play ball between $350 and $600 in the beginning, so the B580 will only have to go up against 7600 and 4060 models. If it is better than those AMD and Nvidia models and priced at an attractive MSRP, it could offer Intel some positive publicity, which would be a welcome change considering all the negativity around Intel lately.