Thursday, August 4th 2022

Intel Arc Board Partners are Reportedly Stopping Production, Encountering Quality Issues

According to sources close to Igor Wallossek of Igor's Lab, Intel's upcoming Arc Alchemist discrete graphics card lineup is in trouble. The anonymous sources state that certain add-in board (AIB) partners are having difficulty adding a third GPU manufacturer to their offerings. Firstly, AIBs are reportedly sitting on piles of NVIDIA and AMD GPUs; this inventory is losing value daily, so it needs to be moved quickly. Secondly, Intel is reportedly suggesting that AIBs ship cards to OEMs and system integrators to kick-start market penetration of the new Arc dGPUs. This business model carries inherently lower margins than selling GPUs directly to consumers.

Last but not least, at least one major AIB is reportedly stopping production of custom Arc GPUs due to quality concerns. What exactly this means is yet to be uncovered, and we will have to wait and see which AIB (or AIBs) steps out of the game. All of this suggests that the new GPU lineup is on the verge of extinction, even before it has launched. However, we are sure that the market will adapt and make a case for the third GPU maker. Of course, these predictions should be taken with a grain of salt, and we await more information to confirm these issues.
Source: Igor's Lab

133 Comments on Intel Arc Board Partners are Reportedly Stopping Production, Encountering Quality Issues

#51
freeagent
Honestly, I was hoping Intel would come to the world of GPUs and help keep everyone on point.

They were probably bribed to not finish :D
Posted on Reply
#52
efikkan
This kind of claim of quality concerns doesn't tell us much without knowing what kind of concern they are talking about. It could mean a whole range of problems, from a bad reference design/spec to die package quality to all kinds of problems in the architecture, if it's true at all.

One thing that's important to keep in mind: the lessons Intel has learned from running this generation of GPUs will probably not be implemented in their next generation, but the one following, as their next gen is probably in late design stages by now.
Assimilator: Intel wants AIBs to eat shit on profits by selling to OEMs and system integrators to gain marketshare, yet at the same time Intel is unwilling to subsidise the prices of their own fucking GPUs in order to build that same marketshare.
Assuming these claims are true at all, of course.

Anyway, I'm puzzled by this claim, as AIBs don't usually sell volume to OEMs, so asking AIBs to sell to OEMs seems like a strange request, unless the AIBs have already made the cards and "can't" sell them. NVIDIA and AMD usually sell reference designs to OEMs directly, as there is usually no reason for an AIB to get involved in those "bare bones" cards. And as we know, OEM GPUs are high-volume, low-margin products.
Posted on Reply
#53
OneMoar
There is Always Moar
Compared to the aftermarket, OEM is the bigger cash flow, initial investment notwithstanding.
The number of people who buy dGPUs is incomprehensibly small compared to the volumes that Dell and HP move.
Posted on Reply
#54
Sybaris_Caesar
With every passing day, Moore's Law Is Dead's leak/report/hearsay seems more and more likely. The first time I heard about it was on LTT's WAN Show, and those guys rarely, if ever, cover the rumor mill.

Posted on Reply
#55
eidairaman1
The Exiled Airman
truehighroller1: Ahhhh, recession across the globe is affecting the rich now. I didn't see every market possible crashing when they raised prices of food, who would have thunk it?

Oh, well I'm canceling everything possible not needed right now to save money so I can still eat good and raise my family. GOOD LUCK EVERYONE ELSE!
Yup, I'm using my hard-earned pay to maintain my vehicle.

PC upgrades are dead last for me.
Mussels: You're all weird, is what you are.

I wonder if this was an issue with the GPUs themselves, or a component? Look what happened to the 30 series at launch with the MOSFET issue.
So does the toilet water rotate in the opposite direction? Do you drive on the wrong side of the road? ;)
freeagent: Honestly, I was hoping Intel would come to the world of GPUs and help keep everyone on point.

They were probably bribed to not finish :D
I disagree; Intel wouldn't have caused GPU prices to change, as they would have priced theirs where both AMD and NVIDIA have theirs, given the time they spent engineering this trash just to recover their losses.

To me, this is turning into another i740 or Larrabee: crappy drivers, as proven by the Intel iGPUs, aka a non-starter / dead on arrival.
Posted on Reply
#56
Mussels
Freshwater Moderator
Intel wants Intel-only builds from OEMs: no third-party WiFi, Ethernet, graphics, etc.

I'm guessing they don't have enough GPUs to do so with whatever fault has popped up, and want those cards meant for retail to fill the gap?
Posted on Reply
#57
watzupken
Since nothing is mentioned about these supposed QC issues, it may be too early to conclude it's a bad chip. It could be the reference design that needs improvement, but hey, there were issues with the Ampere release as well, if you recall. Considering that Intel is inexperienced in designing dGPUs, I don't think they will deliver a flawless product. Objectively, Arc seems to be in a mess now, but seriously, nobody, including Intel, should expect flawless success on the first attempt. Deep pockets only mean more people working on R&D and more money for cutting-edge technology and marketing. But that does not mean they WILL get it right. Veterans like Nvidia and AMD have stumbled before and are expected to stumble again at some point.
I do hope Intel keeps up the tempo, because I feel they have crossed a significant milestone in delivering a working dGPU. Performance may not be great, but it is still decent, let down mostly by driver stability, which should not be impossible to fix.
Posted on Reply
#58
Quicks
Too many delays killed it. This had to be released when there was a GPU shortage due to the cryptocurrency rush. Now it's just too late, unless they can price it right, but ROI might not allow this.
Posted on Reply
#59
ZoneDymo
INSTG8R: Fury and Vega were his "hype". Neither lived up to it. Yes, I still owned both…
It's so weird to me; I was perfectly active in the space when those came out, but I never saw this "hype", never saw things being overpromised or whatever.
I always thought, and still think, that the Fury X is one of the coolest cards ever released: small, black/stealthy, watercooled, with HBM memory. And whenever a game said it needed 6 GB of VRAM, they always added that the Fury X would work as well because of its fast memory.

And the Vega 56, at least, was a superior price/performance card compared to Nvidia's equivalent. Yes, the 64 was not worth what they asked, sure, but the 56 was a fine option.

So I never understood, and still don't understand, the retroactive negativity it gets.
Posted on Reply
#60
Chas778
Intel needs to be broken into two companies: one designing chips and the other running the fabrication. The whole design-and-fab thing STILL isn't working for them.

If they want to continue on this course, they will end up relying on low-margin, high-quantity products to make money. The high-end desktop and server markets are being consumed by Apple, ARM, AMD, and Nvidia.

I seriously doubt Intel will be able to get their shit together.
Posted on Reply
#61
Bomby569
Everybody loses; I don't get all the happiness.
Posted on Reply
#62
Vayra86
TheoneandonlyMrK: Shit, Vega was his best work!!
You might be right there. Not sure if that's a positive remark or not, though :D

1.5-2 years too late, zero profit margin, built on rare HBM, and no future roadmap in gaming ;)
Sounds remarkably like Arc, sans HBM. Heck, it even shares the early availability woes!
Bomby569: Everybody loses, i don't get all the happiness
There wasn't anything to lose with Intel; we didn't have anything yet. Except Raja showing us chips. And more chips.
Posted on Reply
#63
Flanker
Well, this time it got further than Larrabee.
Posted on Reply
#64
Vayra86
ZoneDymo: its so weird to me, I was perfectly active in the space when those came out but never saw this "hype", never saw things being overpromised it or whatever.
I always thought and still think the Fury X is one of the coolest cards ever release, small, black/stealthy, watercooled, HBM memory and with that always the exception when a game said it needed 6 gb of Vram, they always added that the Fury X would work as well because of its fast memory.

and Vega 56 atleast was a superior price/performance card compared to Nvidia's equivalent, yes the 64 was not worth what they asked sure but the 56 was a fine option.

so I never understood or understand this retroactive negativity it gets.
You must have had some pretty big blinders on

Amd/comments/6pwx3e
www.overclock3d.net/news/gpu_displays/amd_vega_marketing_takes_shots_at_nvidia_s_volta_architecture/2

This wasn't just a GPU, man. This was 'an uprising', 'a revolution'; it was the mobilization of gamers worldwide to finally show team green what's what.
This was the culmination of efforts post RX480, Raja's favorite dropped baby:
DeathtoGnomes: Wow, look at all of you bashing Intel! If Intel didn't try to rush to market before the mining bubble collapsed, everyone would be praising Raja.

But since we got the Pin The Tail On The Donkey game out....Intel still employs Raja?
Hang on.. this was Intel 'rushing to market'? :oops:
Posted on Reply
#65
Bomby569
Vayra86: There wasn't anything to lose with Intel, we didn't have anything yet. Except Raja showing us chips. And more chips.
We lost a possible competitor; I doubt they will keep betting on GPUs for much longer, pressed by the financial results. Either Gelsinger gets fired, or he will can it to keep his job.
Posted on Reply
#66
Jism
ZoneDymo: its so weird to me, I was perfectly active in the space when those came out but never saw this "hype", never saw things being overpromised it or whatever.
I always thought and still think the Fury X is one of the coolest cards ever release, small, black/stealthy, watercooled, HBM memory and with that always the exception when a game said it needed 6 gb of Vram, they always added that the Fury X would work as well because of its fast memory.

and Vega 56 atleast was a superior price/performance card compared to Nvidia's equivalent, yes the 64 was not worth what they asked sure but the 56 was a fine option.

so I never understood or understand this retroactive negativity it gets.
Even though your Fury or Vega 56 worked well as a graphics card, it was originally designed as a compute-based card. Yes, both excelled when you threw computational workloads at them, but they performed poorly in games, sometimes at considerably more power than the competition (Nvidia). It was a card that was basically wasting quite a few resources to achieve the same result. Polaris was a good 1080p card, but a clear demonstration that the chip was clocked past its efficiency sweet spot in order to compete with the 1060.

They excelled at mining, but they had issues with, for example, DX11 titles. Nvidia was just faster, and it took AMD quite a few driver revisions to get on par.

Now the exact same thing is happening with Intel and its Arc lineup. They are all leftovers from a computational graphics card line. They do provide graphics, but usually at 50% higher power consumption compared to the rest. On top of that, there are quite a few driver issues going on, which makes the card a complete gamble when you buy and try it. The delays were simply revisions, because the card just didn't perform as expected or didn't meet quality targets. Raja isn't a bad individual; I just think he needs to stop doing the obvious, because it's not working.

AMD completely redesigned their GPU with RDNA, with success. Raja left prior, and it was probably the best thing for them to let him go.
Posted on Reply
#68
DeathtoGnomes
Vayra86: Hang on.. this was Intel 'rushing to market'?
I was leaning towards coincidence, since it wasn't that obvious, but the timing of events won out.
Posted on Reply
#69
TheoneandonlyMrK
Vayra86: You must have had some pretty big blinders on

Amd/comments/6pwx3e
www.overclock3d.net/news/gpu_displays/amd_vega_marketing_takes_shots_at_nvidia_s_volta_architecture/2

This wasn't just a GPU man. This was 'an uprising' - 'a revolution', it was the mobilization of gamers worldwide to finally show team green what's what.
This was the culmination of efforts post RX480, Raja's favorite dropped baby:


Hang on.. this was Intel 'rushing to market'? :oops:
Kinda funny that you're calling out the 7-year-old RX 580, yet it beats this.
And it and Vega are still viable to use, as opposed to the Arc A###.

Not a dig, just an observation.
Posted on Reply
#70
AusWolf
My take is the same as with any other Arc-related article: let's not hype it, let's not bury it - let's wait and see.
Posted on Reply
#71
Gundem
"All of this suggests that the new GPU lineup is on the verge of extinction, even before it has launched. However, we are sure that the market will adapt and make a case for the third GPU maker."

???
Posted on Reply
#72
zlobby
Bomby569: We lost a possible competitor, i doubt they will continue the bet on gpu for much longer being pressed by the financial results. Either Gelsinger gets fired or he will can it to keep his job
For as long as they put finance guys in top-level technical positions, the results will always be the same. AMD and Nvidia have Lisa and Mr. Elon-wannabe Huang (ironically, relatives), but those two are engineers first and good businesspeople second (did that come across PC enough?).

Intel, on the other hand, does the opposite: put in some big duck swingers with attitude and see the results. I honestly wonder how their board allows this. It looks to me like the board is none the wiser.
Posted on Reply
#73
Gundem
watzupken: Since nothing is mentioned about this supposed QC issues, it may be too early to conclude it’s a bad chip. It could be reference design that needs to improve, but hey, there were issues with Ampere release as well if you recall. Considering that Intel is inexperienced in designing dGPU, I don’t think they will deliver a flawless product. Objectively, ARC seems to be in a mess now, but seriously, nobody including Intel should expect flawless success in the first attempt. Deep pockets only mean more people working on the R&D and development, more money for cutting edge technology and marketing. But that does not mean they WILL get it right. Veterans like Nvidia and AMD have stumbled before and expected to stumble at some point.
I do hope Intel keep up the tempo because I feel they have crossed a significant milestone to deliver a working dGPU where I think performance may not be great, but still decent, letdown mostly due to driver stability which should not be impossible to fix.
You make the most sense here compared to all the other comments.
Posted on Reply
#74
Bomby569
zlobby: For as long as they put financial guys in top level technical positions the results will always be the same. AMD and nvidia both have Lisa and Mr. Elon-wannabe Huang (ironically, relatives), but those guys are engineers first and then good businespeople (did that come PC enough?).

Intel on the other hand does the opposite. Put some big duck swingers with attitude and see the results. I honestly wonder how their board allows for this? It looks to me that the board is none the wiser.
Gelsinger is an engineer
Posted on Reply
#75
zlobby
Bomby569: Gelsinger is an engineer
And how did it go for them? ;)
Posted on Reply