Monday, November 25th 2024
Intel Arc B580 GPU Leaked Ahead of Imminent Launch
Intel's upcoming Arc B580, part of the next-generation Battlemage GPU series, has been leaked, giving tech enthusiasts an early glimpse of what's to come. Amazon unintentionally revealed the ASRock Arc B580 Steel Legend graphics card, confirming key specifications and indicating that new discrete GPU contenders are about to arrive. The Arc B580 will feature Intel's Xe2-HPG architecture, with a factory-overclocked speed of 2.8 GHz. The board is equipped with 12 GB of GDDR6 memory running at 19 Gbps. While the core count remains unconfirmed, the product listing highlights support for 8K resolution (7680 x 4320) and dual 8-pin power connectors. Two variants are expected: the Steel Legend revealed in Amazon's listing, and the Challenger, which has yet to appear in detail.
The leak also aligns with Intel's upcoming plans for the Battlemage series, as references to the Arc B-Series have been uncovered in Intel's official documentation, including the "Intel oneAPI Math Kernel" and "Intel Platform Monitoring Technology" files. The Arc B580 will lead the charge, potentially alongside a slightly less powerful Arc B570 model. Targeting the mid-range market, Intel seems poised to offer a competitive alternative to rival GPUs. AMD will also stick to the mid-range segment with RDNA 4, so the battle of the mid-range is the one to watch this upcoming season. With support pages for the B-Series already live and public leaks increasing, an official announcement appears imminent, possibly within days. We previously learned that Battlemage is set to launch in December, grabbing attention ahead of CES in January.
Sources:
VideoCardz #1, VideoCardz #2
46 Comments on Intel Arc B580 GPU Leaked Ahead of Imminent Launch
And if they price it accordingly, it will probably fly off the shelves; if not, it will be DOA in most respects :D
I wasn't particularly impressed by RT though. It was an easy win because AMD tried to ignore RT and thus lagged (lags?) behind. But again, another area where Intel could have faltered, yet managed to do pretty well.
If it beats 6700 XT then it deserves being closer to $300.
If it doesn't beat, nor does it lose to 6700 XT then the price tag should be as close to $200 as possible.
If it doesn't even match 6700 XT then why does it exist (don't forget it has TWO 8-pin power connectors!).
I remember posting, when Intel announced they were investing in the GPU space, that the day would come when we would see Intel and AMD start offering OEM packages at deep discounts to HP, Lenovo and others to create their own ecosystems. Then Nvidia tried to buy ARM, and now they are making their own branded CPU. This had better be received well, or we will all regret the loss of some aspects of DIY. You know companies like HP would love to take us back to the days of Compaq, and it seems some of the manufacturers are keen to make that happen as well. Kudos to ASRock; they are about the only GPU vendor that makes and stocks just about every card from Nvidia, AMD and Intel. RT is the best snake oil Nvidia ever sold the market.
Last time around, the A580 was released last, about a year after the A770.
forget it; it doesn't work online :laugh:
Speaking of upcoming GPUs... in future reviews, will you be taking notes on which GPUs have fuses on the PCB? I have seen some horrible stuff lately, and it would be great to have some extra surety, especially with that idiotic power connector causing issues here and there. A fuse might just save one's GPU and make it salvageable when it falls out of warranty.
As for pricing, I have been saying for years that AMD can't price its hardware much lower than Nvidia's, because it doesn't have the wafers, the money, the architecture advantage, the brand name, the tech press support or the consumer support to even start imagining winning a price war against Nvidia. But Intel might have a better chance here, because no one takes them seriously in the GPU market. So they could start with aggressive pricing and hope that AMD will not respond with price drops, out of fear of dragging Nvidia into a price war that in the end only Nvidia can win.

Nvidia's real weapon is DLSS, and the marketing campaign against FSR proves that. While Nvidia is using some games as technology demos for advertising RT performance, with those games implementing Path Tracing that demolishes everything less than an RTX 4080, the real huge online campaign of the last two years has been against FSR. Even if FSR 4.0 is better than any future DLSS (it won't be, but for the sake of the argument let's say it will), we will be reading for years that "DLSS is better than real life and FSR is cr@p".

Because Nvidia's and probably AMD's lower-end models will come much later, the B580 will only have to go against 7600 and 4060 models at first: Nvidia will only release RTX 5000 models priced probably north of $800 for the first few months, while AMD might play ball between $350 and $600 in the beginning. If it is better than those AMD and Nvidia models and priced at an attractive MSRP, it could give Intel some positive publicity, which would be a welcome change considering all the negatives around Intel lately.
Also, I suspect NVIDIA's overhead is quite a bit higher than AMD's, because they make big dies and there's no doubt a premium to maneuver your way into whatever fab space you need. It's quite similar for Intel with CPUs: they would rather discontinue a product line than lower prices too much. This is why it's so surprising how much money Intel is willing to lose on their dGPU aspirations. They have to be losing billions, and I don't know that their ship will ever come in here. AMD can't even chip away at NVIDIA, and they have way more experience and a long-standing presence in this space.