
Intel Arc B580 GPU Leaked Ahead of Imminent Launch

7600 XT performance would aim it squarely at 1080p/2K gaming, so 12GB will be fine.
I suppose you mean QHD, since "2K" would only make sense as marketing speak for Full HD (1920×1080 is roughly 2000 pixels wide; QHD is 2560×1440).
 
The repeated failure of AMD and Intel to launch properly before the Xmas season shows how disconnected those companies are from reality ...

Sadly there is nobody to challenge them, as NVIDIA has already ascended to AI godhood and ceased to care about GPU land
 
Did you watch the video where he said that Battlemage had been cancelled?
He said "practically cancelled", and he is looking to be right.

Intel hasn't said anything publicly about Battlemage for months - this is why RedTechGaming is still running videos with Raj & showing roadmaps from Q3 2022.

BG31 (the B770 chip) apparently hasn't taped out yet, so we are looking at late Q1 or early Q2 for the B770 - if it releases.

BG21 (the B580 chip) - low volume; so far only 3 AIBs (ASRock, Biostar, Gunnir) appear to be on board. 7600 XT performance will need a price lower than $300 USD to be viable in the market.

No word on a low end, unless the B580 is the low-end card.

This is really disappointing - I have been very impressed with my A770 & my A310.
 
I dunno about that. NVIDIA is seen as the premium option. Their floor will never be that low, but rather they’ll make you pay for the cost of entry. When you’re in a position of dominance, you don’t even mess with low-margin segments until your position is threatened, and considering their profitability these days, I don’t think they feel much pressure to compete in the race to the bottom.

Also, I suspect NVIDIA's overhead is quite a bit higher than AMD's, because they make big dies and there's no doubt a premium to maneuver your way into whatever fab space you need.
You could be right. But AMD seems too scarred to flood the market with products, fearing that a response from Nvidia could leave them with inventory they wouldn't be able to sell.

They would rather discontinue a product line than lower prices too much. This is why it’s so surprising how much money Intel is willing to lose on their dGPU aspirations. They have to be losing billions, and I don’t know that their ship will ever come in here. AMD can’t even chip away at NVIDIA, and they have way more experience and a long-standing presence in this space.
My theory, for years, has been that without GPUs they will have a huge problem. Not immediately, but surely in the future. Nvidia is rumored to start selling ARM SoCs for laptops and desktops next year. If they manage to get a good share of the market, thanks to their brand and GPU dominance, they could slowly start abandoning Intel's and AMD's x86 platform for their own ARM platform. AMD could also start making GPUs that perform much better with their own CPUs than with CPUs from Intel. That could leave Intel with a platform that is terrible for gaming and for applications that also use GPUs for compute tasks, only usable for basic office work. I think they know that no matter how much money they will lose trying to close the gap with AMD and Nvidia in GPUs, they have to do it.
Now, this could be a future that is 5-10-15-20 years away. But Intel is a company that wants to still be here after 30-50-100 years, so they have to look far ahead. If 20 years of absence from the discrete GPU market makes them desperate enough to lose billions every year trying to become competitive, imagine if they decided to abandon GPUs today and had to reconsider after 10 years. Nvidia and even AMD, or who knows, even Qualcomm and Apple, would be so far ahead that it would be pointless to even try. And that's why I never agree with those saying that Intel will abandon its GPU business after Arc's failure. They just can't. It would be suicidal, unless they manage to find a huge money stream elsewhere. Well, buying Altera didn't work for them, buying Mobileye didn't work, Gaudi doesn't seem to work either; their x86 business is the only one that still works for them.
 
The repeated failure of AMD and Intel to launch properly before the Xmas season shows how disconnected those companies are from reality ...

Sadly there is nobody to challenge them, as NVIDIA has already ascended to AI godhood and ceased to care about GPU land

They probably realized there's no need to release before the holidays. People would expect discounts right off the bat. They probably do well enough discounting old models and clearing out the old stock before pushing new cards to the market at whatever price they want.
 
I was actually very impressed with their previous Arc offerings, especially how well they did with ray tracing. I am very excited to see what Intel GPUs will be capable of in the future.
I agree. The only downside is the driver, which I feel was perhaps too rough around the edges at launch. The cards are very feature-rich when compared to Nvidia and AMD.

The main problem with this card is that it will be on the market at least a year too late. It will arrive after Christmas and just before the next gen from nVidia and AMD.
This is the same problem as with Arc Alchemist. It's a cycle late and will face very strong competition from next-gen cards, which is why the Arc Alchemist cards mostly compete on price even though they are feature-rich. The very rough drivers in the early stages of the cycle did not help either.
 
GDDR6? These Intel GPUs should use GDDR7 and have a lot of cache memory to be able to compete with AMD and Nvidia GPUs.
 
GDDR6? These Intel GPUs should use GDDR7 and have a lot of cache memory to be able to compete with AMD and Nvidia GPUs.
Why? A GPU of this speed doesn't need GDDR7.

I agree. The only downside is the driver, which I feel was perhaps too rough around the edges at launch. The cards are very feature-rich when compared to Nvidia and AMD.


This is the same problem as with Arc Alchemist. It's a cycle late and will face very strong competition from next-gen cards, which is why the Arc Alchemist cards mostly compete on price even though they are feature-rich. The very rough drivers in the early stages of the cycle did not help either.
"Driver issues" is an understatement. They were straight-up experimental. Fun to toy with, but man, that launch was rough.

Long-term support is also a major question. Their integrated GPUs typically get abandoned in 3-5 years. Will the desktop GPUs follow the same path? AMD usually hits 6 years, and Nvidia routinely gets a full decade.
 
I agree. The only downside is the driver, which I feel was perhaps too rough around the edges at launch. The cards are very feature-rich when compared to Nvidia and AMD.

To be fair, that was to be expected; I am very pleased to see how Intel continuously rolls out driver updates, which actually fix a lot of performance issues. This is something AMD could take a hint from; AMD still doesn't roll out fixes fast enough for my taste, and it's one of the reasons I moved to nVidia long ago.

Though Intel might be the "underdog" for now, they are the only company that generates excitement on my end when new models are announced; they just need to get faster.

When I think about nVidia, the first thought that comes to mind is: "minimal performance gain, twice the price, you are going to be ripped off." When I think about AMD: "Why haven't you improved your features two generations on? Why don't you fix this driver issue? And you expect me to pay that much even though you are not on top?"

Anyways, I am rambling. I guess I am just pleased there is a third player on the board. Then again, these days I get price-fixing vibes all the more often, so maybe the "competition" won't mean much in the end.
 
He said "practically cancelled", and he is looking to be right.

Intel hasn't said anything publicly about Battlemage for months - this is why RedTechGaming is still running videos with Raj & showing roadmaps from Q3 2022.

BG31 (the B770 chip) apparently hasn't taped out yet, so we are looking at late Q1 or early Q2 for the B770 - if it releases.

BG21 (the B580 chip) - low volume; so far only 3 AIBs (ASRock, Biostar, Gunnir) appear to be on board. 7600 XT performance will need a price lower than $300 USD to be viable in the market.

No word on a low end, unless the B580 is the low-end card.

This is really disappointing - I have been very impressed with my A770 & my A310.
Yeah, I saw the same video. Whenever I hear someone say "MLID said Arc was cancelled", it makes me question whether they actually watched the video in question. So far, on this topic at least, what he said seems to bear out, unfortunately. Supposedly, Intel already axed the "enthusiast" BG10 die that had 56 Xe cores.
 
The repeated failure of AMD and Intel to launch properly before the Xmas season shows how disconnected those companies are from reality ...

Sadly there is nobody to challenge them, as NVIDIA has already ascended to AI godhood and ceased to care about GPU land
I am sure this is because the client computing market makes up a smaller share of their overall revenue now. Most profit comes from B2B sales like data centers, corporate systems and integrators.
 
The repeated failure of AMD and Intel to launch properly before the Xmas season shows how disconnected those companies are from reality ...

Sadly there is nobody to challenge them, as NVIDIA has already ascended to AI godhood and ceased to care about GPU land

They keep pissing all over the gaming community. One day they will wake up and find everyone has moved on.

Lately I've been playing older games, because newer games are poorly optimised and need a $1K GPU to run properly.
 
Yeah, I saw the same video. Whenever I hear someone say "MLID said Arc was cancelled", it makes me question whether they actually watched the video in question. So far, on this topic at least, what he said seems to bear out, unfortunately. Supposedly, Intel already axed the "enthusiast" BG10 die that had 56 Xe cores.

They didn't watch it - they are repeating what they read from someone else.

MLID appears to be tied more into manufacturing & retail channels than marketing.

I am hoping we get a B770 next year - my A770 has been great for productivity, it's a solid 1440p gaming card, and my A310 is chugging away converting Blu-ray rips to AV1.
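That AV1 conversion is exactly the kind of job the Arc cards' hardware AV1 encoder is good at. As a minimal sketch (assuming an FFmpeg build with Quick Sync/QSV support; the file names are placeholders and exact flags vary by build), the whole decode-and-encode pipeline can stay in hardware:

# Hardware decode + hardware AV1 encode on an Arc card via Quick Sync
# (illustrative sketch, not the poster's exact command)
ffmpeg -hwaccel qsv -hwaccel_output_format qsv -i input.mkv \
       -c:v av1_qsv -preset slow -global_quality 28 \
       -c:a copy output.mkv

Here -global_quality sets a constant-quality target (lower means higher quality and bigger files) and -c:a copy passes the audio through untouched.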
They keep pissing all over the gaming community. One day they will wake up and find everyone has moved on.
Not likely - gamers aren't the sharpest knives in the drawer. If they were, they would all be running AMD cards.

The only reason to run with an Nvidia card is if you need CUDA. Otherwise, they are outperformed by both AMD & Intel at their respective price points.
 
Not likely - gamers aren't the sharpest knives in the drawer. If they were, they would all be running AMD cards.
Just flippant fanboyism. Many people don't buy AMD cards because of their absurd power draw (although high-end Nvidia cards have the exact same issue).
Nobody wants to spend $500 on a 300W 6900 XT vs a 200W RTX 4070 if they have to pay for their own electricity, unless they want a barely functioning space heater in the winter just to play video games. The same will be true 6 months from now, when that performance target will hopefully be drawing 150-200W.

Because people keep on "forgetting" these cards are years old at this point.

Unless AMD has some magical 4060 Ti/4070-performance card at sub-$500 with sub-225W power draw that I am just blanking on.
 
Just flippant fanboyism. Many people don't buy AMD cards because of their absurd power draw (although high-end Nvidia cards have the exact same issue).
Nobody wants to spend $500 on a 300W 6900 XT vs a 200W RTX 4070 if they have to pay for their own electricity, unless they want a barely functioning space heater in the winter just to play video games. The same will be true 6 months from now, when that performance target will hopefully be drawing 150-200W.

Because people keep on "forgetting" these cards are years old at this point.

Unless AMD has some magical 4060 Ti/4070-performance card at sub-$500 with sub-225W power draw that I am just blanking on.
Speaking of fanboyism - comparing a card from 2020 (AMD) and a card from 2022 (Nvidia) - what would you call that?

The 6900 XT & its Nvidia equivalent (the RTX 3080) have the same power draw.

A 7800 XT outperforms a 4070 for less money. A theoretical 30-watt difference isn't enough to matter.
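To put rough numbers on the power-draw argument a few posts up (a back-of-envelope sketch; the 4 hours of gaming per day and the $0.15/kWh rate are illustrative assumptions, not anyone's actual bill):

100 W × 4 h/day × 365 days ≈ 146 kWh per year
146 kWh × $0.15/kWh ≈ $22 per year

So a 100W gap costs on the order of $20-25 a year at typical rates, while a 30W gap is closer to $7, which is roughly the scale both sides here are arguing over.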
 
Aren't you going to play the monkey game?

Maybe, and if they release a card that can run the game properly for ~$350, then sure, why not. Then again, it's just a matter of time; in 10 years a new $350 card will surely be able to do just that.

Maybe I'll get into the habit of living 5 years in the past: getting old tech at good prices and playing games that were released 3-5 years ago.

At least the games should have all their patches by then and be optimised.
 
Maybe, and if they release a card that can run the game properly for ~$350, then sure, why not. Then again, it's just a matter of time; in 10 years a new $350 card will surely be able to do just that.

Maybe I'll get into the habit of living 5 years in the past: getting old tech at good prices and playing games that were released 3-5 years ago.

At least the games should have all their patches by then and be optimised.

It is very seldom that I will play a game on release. To add to what you said, graphics card drivers are usually stable by then as well.
 
Given that all Intel GPUs are low end, I'm not sure there is a large customer pool for the lower mid-range. So why release the B580 model first?

Last time, the A580 was released last, about a year after the A770.
Because they are having problems with the larger BG31 silicon, that's why. They aren't delaying it by choice; it's because they need to.

I heard that Battlemage had 5 different iterations. There was a quad-tile, 2560 EU setup. There was also a 64 Xe-core setup with 112MB of cache, and now we have this one.
BG21 (the B580 chip) - low volume; so far only 3 AIBs (ASRock, Biostar, Gunnir) appear to be on board. 7600 XT performance will need a price lower than $300 USD to be viable in the market.
$229 to $259, depending on whether it's the B570 or B580, and various other things such as whether it's an Intel Limited Edition (LE) card.
 