Friday, April 8th 2022

Laptops with Arc Graphics Nowhere in Sight, Intel Says Wait Till June

Intel in March 2022 kicked off its ambitious campaign to grab a slice of the consumer graphics market with its Arc "Alchemist" line of discrete GPUs, based on the Xe-HPG graphics architecture. The announcement mentioned immediate availability of at least the entry-level Arc 3-series GPU models, the Arc A350M and Arc A370M, in generally available notebooks. People on social media are beginning to ask Intel why these notebooks are nowhere in sight, and the company has responded.

In response to one such query by a user, Intel Support stated that laptops with Arc will be available "by the end of the second quarter of 2022." This would put general availability in June 2022, two months from now. Interestingly, this hasn't stopped laptop manufacturers from raking in pre-orders, with the likes of the Acer Swift X and Samsung Galaxy Book2 Pro up for "grabs." You can "purchase" the Swift X, but shipping dates are stated to be as late as May 23 (now pushed to June 13).
Sources: Intel Support (Twitter), VideoCardz

52 Comments on Laptops with Arc Graphics Nowhere in Sight, Intel Says Wait Till June

#26
DeathtoGnomes
bugRead Intel's announcement, it's so vague, the first thing I thought about after reading it was "they put so much effort into not mentioning actual product availability".
Now that you've pointed that out, the smokescreen shows.
Dr. DroI think Arc will have a newness issue for the first year,
That would explain all the issues with the driver deployment.
Posted on Reply
#27
Fouquin
Vayra86Rather everywhere he went, stuff went tits up shortly after: S3, ATI, RTG and now Intel.
Things went tits up for ATi after he joined in 2001? R300 and their rise to performance dominance didn't happen, I guess... He also oversaw the transition to unified shaders and the shift to GPGPU architectures that put ATi on the map for HPC. He almost assuredly did more good than harm. If anything, being one of the senior people at ATi who didn't jump ship during the acquisition probably made the biggest impact. The turmoil at ATi in those years is rumored to have been legendary. I'm not suggesting he was single-handedly responsible for their return to competition, but he held a leading role during three periods of trend reversal for ATi/RTG (R300-R520, Cypress/Terascale 2, Polaris). Looks pretty good to me.
Posted on Reply
#28
chodaboy19
Let's pause coverage on ARC until Intel releases the product.
Posted on Reply
#29
bug
chodaboy19Let's pause coverage on ARC until Intel releases the product.
This is a mobile part; it's released already. Only, that means squat for us until someone actually releases a laptop making use of it.
Posted on Reply
#30
Vayra86
FouquinThings went tits up for ATi after he joined in 2001? R300 and their rise to performance dominance didn't happen, I guess... He also oversaw the transition to unified shaders and the shift to GPGPU architectures that put ATi on the map for HPC. He almost assuredly did more good than harm. If anything, being one of the senior people at ATi who didn't jump ship during the acquisition probably made the biggest impact. The turmoil at ATi in those years is rumored to have been legendary. I'm not suggesting he was single-handedly responsible for their return to competition, but he held a leading role during three periods of trend reversal for ATi/RTG (R300-R520, Cypress/Terascale 2, Polaris). Looks pretty good to me.
And over all this time, AMD never made a dime, breaking even at best... The man isn't a fool, but he has no handle on the market whatsoever. Over all this time, AMD slowly lost its competitive edge; sure, they moved forward, but not fast enough. And remember what we saw during the times of mild success, too: major screw-ups. And the transition to Polaris... yes. But late, and that specific transition was the moment Nvidia took its lead and ran with it, because the high end was simply gone. And when Vega, the high end, actually came, it was already close to midrange territory.

What are we looking at right now? Big chips for what they offer in performance. Shaky PR.

Let's just say I hope I'm wrong on this one and Raja delivers ;)
Posted on Reply
#31
Fouquin
Vayra86And over all this time, AMD never made a dime, breaking even at best... The man isn't a fool, but he has no handle on the market whatsoever.

What are we looking at right now? Big chips for what they offer in performance.
First off, AMD's financial success doesn't rest on the shoulders of a single person. Not even the CEO's; they still need teams to successfully deliver on their goals. Second, ATi turned a profit and took majority market share in multiple years while Raja was employed... Did you suddenly forget that AMD's financials after the ATi purchase were almost entirely held up by the graphics division? I understand you really want to vilify the dude, but come on, man. You're way out in la-la land.
Posted on Reply
#32
Assimilator
FouquinThings went tits up for ATi after he joined in 2001? R300 and their rise to performance dominance didn't happen, I guess... He also oversaw the transition to unified shaders and the shift to GPGPU architectures that put ATi on the map for HPC. He almost assuredly did more good than harm. If anything, being one of the senior people at ATi who didn't jump ship during the acquisition probably made the biggest impact. The turmoil at ATi in those years is rumored to have been legendary. I'm not suggesting he was single-handedly responsible for their return to competition, but he held a leading role during three periods of trend reversal for ATi/RTG (R300-R520, Cypress/Terascale 2, Polaris). Looks pretty good to me.
R300 through R500 and TeraScale, maybe... Raja was CTO at the time, a role that is primarily a management position, not a technology position.
GCN was first released with HD 7000 in January 2012, which is a year before he returned to AMD as a VP.
All GPUs during his second tenure were iterations of GCN, no new architecture.
Vega was only notable because it tried to use HBM2 to sidestep GCN's inability to scale further, which had already brought Polaris to the edge of feasibility with the RX 590.
Posted on Reply
#34
Fouquin
AssimilatorR300 through R500 and TeraScale, maybe... Raja was CTO at the time, a role that is primarily a management position, not a technology position.
CTO and director of technology development. You can quite literally fact check that by looking at his linkedin.
AssimilatorGCN was first released with HD 7000 in January 2012, which is a year before he returned to AMD as a VP.
All GPUs during his second tenure were iterations of GCN, no new architecture.
That actually doesn't matter, like at all. See, iterations are still new designs. The core logic blocks stayed mostly the same, but each new generation still has to be designed, masked, printed, debugged, developed, and properly brought up. They also kept developing key blocks of the architecture such as fixed function encoding blocks, ROP and backends, core interconnects, etc. So just because it's iterative doesn't mean they aren't still designing and building new ASICs. Also, the seeds of an architecture start many years in advance. I guarantee that GCN's building blocks started to take shape around the time Raja left in 2009, because bringing up an entire new core architecture in 3 years for a bleeding edge node is absolutely insane for a company that was hemorrhaging cash quarter to quarter. Doesn't mean any team he was involved with by that point was heading development of GCN, but it definitely crossed his desk once or twice given the position he held.

It's likely true his direct architectural contributions began with Vega and ended with Navi, which is still no small contribution, but he was hands-on and had some input on how previous architectures were delivered.

Not sure why everyone is putting so much effort into discrediting the guy. Because he was promoting and giving interviews leading up to Vega, and people were disappointed by Vega? That's some peak entitlement from the hardware community if so. How dare he be successful because REMEMBER he said Vega was going to be the BEST and it WASN'T! WAH!
Posted on Reply
#35
bonehead123
hahahahaha........

If you're STILL waitin on this product to actually be available to purchase, then I have some really nice beachfront property that you may be interested in....

It's over near good ole Dodge City.... you know, the one where the Sheriff & Festus & the dear Ms. Kitty all hung out at :D
Posted on Reply
#36
ThrashZone
Vayra86Been doing that since he works at Intel. So far so good :D


What exactly is the man known for? I have yet to see a clear design win coming from him, but maybe I missed something?

Marketing wise he is a complete and utter fail, that's clear enough though :D
Hi,
Well, he is well known for frying a lot of X99 chips by recommending leaving VCCIO CPU and VCCIO PCH on auto :laugh:

The Haswell-E default for both was 1.05 V in the BIOS before Broadwell-E was released.

After Broadwell-E was released, simply activating an XMP profile shot VCCIO CPU and PCH up to, I believe, 1.3 V.
Good old Raja said it's okay, it's only for weak chips, lol. Weak and now dead; "jackass," say lots of people, including at least one member here I haven't seen in a while, @xkm1948. He knows first hand.
Posted on Reply
#37
looniam
idk, maybe folks want to forget 6 months of horrid drivers (a la fine wine, my butt)
amd announcing it was leaving the high end market (to nvidia only)
the back room politics and infighting trying to split RTG off from amd.

i'm sure the site's search function can find all the posts calling for his head ~2016
Posted on Reply
#38
Dr. Dro
FouquinNot sure why everyone is putting so much effort into discrediting the guy. Because he was promoting and giving interviews leading up to Vega, and people were disappointed by Vega? That's some peak entitlement from the hardware community if so. How dare he be successful because REMEMBER he said Vega was going to be the BEST and it WASN'T! WAH!
It's a gag and/or cope from /r/AMD that got out of hand, IMO.

I trust Raja did all he could to improve GCN with the resources that were made available to the GPU division at the time. It seems rather distant now, but pre-Ryzen we would often read forum posts (circa 2015-2016) predicting that AMD would soon run out of cash and begin divesting IP, eventually filing for bankruptcy, if a breakthrough wasn't found.
Posted on Reply
#39
aQi
The teasers are therefore just to keep the boat floating until the time comes :)
Posted on Reply
#40
looniam
aQiThe teasers are therefore just to keep the boat floating until the time comes :)
yep.


but the clock is ticking.
Posted on Reply
#42
AusWolf
The announcement said "they're available now". They just forgot to add that one should read the announcement in June.
Posted on Reply
#43
watzupken
FouquinCTO and director of technology development. You can quite literally fact check that by looking at his linkedin.



That actually doesn't matter, like at all. See, iterations are still new designs. The core logic blocks stayed mostly the same, but each new generation still has to be designed, masked, printed, debugged, developed, and properly brought up. They also kept developing key blocks of the architecture such as fixed function encoding blocks, ROP and backends, core interconnects, etc. So just because it's iterative doesn't mean they aren't still designing and building new ASICs. Also, the seeds of an architecture start many years in advance. I guarantee that GCN's building blocks started to take shape around the time Raja left in 2009, because bringing up an entire new core architecture in 3 years for a bleeding edge node is absolutely insane for a company that was hemorrhaging cash quarter to quarter. Doesn't mean any team he was involved with by that point was heading development of GCN, but it definitely crossed his desk once or twice given the position he held.

It's likely true his direct architectural contributions began with Vega and ended with Navi, which is still no small contribution, but he was hands-on and had some input on how previous architectures were delivered.

Not sure why everyone is putting so much effort into discrediting the guy. Because he was promoting and giving interviews leading up to Vega, and people were disappointed by Vega? That's some peak entitlement from the hardware community if so. How dare he be successful because REMEMBER he said Vega was going to be the BEST and it WASN'T! WAH!
The sad fact is that people will remember one for how they last performed. Raja's last dedicated GPU releases were disappointing. I think the disappointment is compounded by the fact that the marketing hyped the products too much. Turns out that the Polaris that "shines 2.5 times brighter" wasn't that bright. Vega, which Raja teased over and over again, only managed to keep up with a GTX 1070, and at the expense of significantly higher power consumption. And we can observe the same pattern as with Vega this time round. So let's see.
Posted on Reply
#44
DeathtoGnomes
watzupkenThe sad fact is that people will remember one for how they last performed. Raja's last dedicated GPU releases were disappointing. I think the disappointment is compounded by the fact that the marketing hyped the products too much. Turns out that the Polaris that "shines 2.5 times brighter" wasn't that bright. Vega, which Raja teased over and over again, only managed to keep up with a GTX 1070, and at the expense of significantly higher power consumption. And we can observe the same pattern as with Vega this time round. So let's see.
yep.
The thing about marketing is that it always uses some sort of trigger phrase like "shines 2.5 times brighter", usually leaves the 'better than... [product x]' part out, and lets people make assumptions that lead to foot-in-mouth disease. Yes, the wise choice is the wait-and-see path.
Posted on Reply
#46
Assimilator
gasolinaRaja is a joke tbh
let's take an early review from korea where they test the arc, which is terribly optimized
videocardz.com/newz/intel-arc-a350m-gpu-has-finally-been-tested-slower-than-gtx-1650-up-to-2-2-ghz-clock
Intel will probably charge a higher price than what amd/nvidia offer for the same performance.
Holy fucking shit.

That is fucking abysmal. It's barely faster than the soon-to-be-2-generations-old bottom-of-the-barrel MX 450 (which also has only half the VRAM) and the driver situation is, as expected, a joke.

But more than that, the frametimes... oh my god, those frametimes. Overwatch (bottom left and bottom middle) is reading 66 average FPS yet the frametimes are all over the place. That's not the fluid playable experience that you need in a MOBA, that's an unplayable stuttery mess.
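
For anyone wondering how a 66 FPS average and an unplayable mess can coexist: the average is computed from total elapsed time, so a handful of long stutter frames barely move it, while percentile metrics collapse. A minimal sketch with made-up frametime numbers (not the figures from this review):

import statistics

# Hypothetical frametime data purely for illustration: 95 smooth frames, 5 big stutters
frametimes_ms = [14.0] * 95 + [60.0] * 5

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)               # ~61 FPS "average"
worst_1pct_ms = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms))]  # worst ~1% frame
one_pct_low_fps = 1000 / worst_1pct_ms                                 # ~17 FPS "1% low"

print(f"average: {avg_fps:.0f} FPS, 1% low: {one_pct_low_fps:.0f} FPS")
print(f"frametime stdev: {statistics.stdev(frametimes_ms):.1f} ms")

With those invented numbers the counter still reports a ~61 FPS average while the 1% low sits around 17 FPS, which is the kind of disconnect those frametime plots suggest.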

It's quite clear that Intel's graphics division has been lying to their execs about what Arc can and can't do and how far along it is, except this test launch in South Korea has made it abundantly clear to the execs how much of a clusterfuck Arc really is, with the result that they've given the graphics team three months' grace to fix their shit. Except that you can't fix fundamental problems like this in three months.

This is going to be i740 all over again, and a lot of former Intel employees are going to be out of a job in June. Honestly the execs need to burn down that entire graphics division and rebuild it from scratch, because it's become quite obvious that it's fundamentally rotten to the core in terms of how it's (mis)managed, and that continuing to throw money at the graphics division isn't working and won't work as long as the wrong people are in charge.

I was expecting Arc to be bad; I was not expecting it to be a disaster of this magnitude. But I guess this is what happens when you have the kind of broken corporate culture that allows something like the 10nm debacle to happen.
Posted on Reply
#47
looniam
i guess google/youtube translate was close this time:
Posted on Reply
#48
mama
With the success of Alder Lake, Intel doesn't want to release a second-rate product. Not good from a marketing/brand standpoint. A paper launch and rumours are all they'll go for, to preserve their reputation as a performance leader. They really don't want the product properly pitted against superior competition.
Posted on Reply
#49
Vayra86
FouquinCTO and director of technology development. You can quite literally fact check that by looking at his linkedin.

It's likely true his direct architectural contributions began with Vega and ended with Navi, which is still no small contribution, but he was hands-on and had some input on how previous architectures were delivered.

Not sure why everyone is putting so much effort into discrediting the guy. Because he was promoting and giving interviews leading up to Vega, and people were disappointed by Vega? That's some peak entitlement from the hardware community if so. How dare he be successful because REMEMBER he said Vega was going to be the BEST and it WASN'T! WAH!
Not sure it's 'everyone', but I've been shitting on Raja since "Poor Volta" and his repeated shitty attempts at marketing. The move to RTG did no-one a favor; I'm not even sure what the purpose of Polaris was, but one might say AMD developed the perfect el-cheapo mining card there, apart from serving up midrange performance that was already a year old at the time.

Prior to that, well, we've spoken about ATI; let's give him the benefit of the doubt for that time period, I'll concede that one ;)

But really, take a long look at the info coming out on Arc. I'm seeing trends, more so than a new beginning. I'm seeing a mix of the Intel hit-or-miss driver regime combined with Raja's overly optimistic projections of time to market. The man is senior management. If you still have such a bad handle on product development as it seems now, what the actual fuck are you doing there? And this is a trend, in overall time to market, that existed at AMD too. It's part of the reason Vega failed: it came too late. And not because of factors unknown, either: Fury X was already plagued by delays and limited availability because of HBM. And HBM had already shown its weakness compared to simple GDDR5 when Fury got smoked by a much simpler and cheaper (cut down!!!) 980 Ti.

And now Intel is apparently diving into the same holes AMD and Nvidia already climbed out of, for reasons unknown. We're looking at a discrete GPU family here that launches numerous new technologies on its maiden voyage, but seems to be lacking in all the ways that truly count for gaming. It wants to tick all the boxes at once instead of being a lean, mean little chip. It's already a pretty large die for what it offers in performance. The design appears to be largely shared between enterprise/datacenter and gaming.

Now note this: both Nvidia and AMD have rigorously separated their gaming and non-gaming stacks over the last 2-3 generations. Nvidia definitively went there with Pascal and gained a major efficiency boost, enabling them to re-introduce new blocks from Volta; AMD completed the journey with RDNA2 and can now do a similar trick. But even then we're seeing TDPs move up to cater to further performance bumps along with RT. Here's Raja in 2022H2: 'Look at my tiles! And they can game too, one chip to do it all!'... Dude, you already went there before and it didn't work. And piled onto that: the TDP of Arc is higher for substantially lower (raster!) performance.

Frankly, I don't even care about Raja the man; I care about what's coming out of his hands, and it's simply not looking good at all, just as it hasn't the last half-dozen times. The design simply trails the market reality by a full generation or more. We can talk for hours about cause and effect, but I only really care about the product I can buy ;)
AssimilatorHoly fucking shit.

That is fucking abysmal. It's barely faster than the soon-to-be-2-generations-old bottom-of-the-barrel MX 450 (which also has only half the VRAM) and the driver situation is, as expected, a joke.

But more than that, the frametimes... oh my god, those frametimes. Overwatch (bottom left and bottom middle) is reading 66 average FPS yet the frametimes are all over the place. That's not the fluid playable experience that you need in a MOBA, that's an unplayable stuttery mess.

It's quite clear that Intel's graphics division has been lying to their execs about what Arc can and can't do and how far along it is, except this test launch in South Korea has made it abundantly clear to the execs how much of a clusterfuck Arc really is, with the result that they've given the graphics team three months' grace to fix their shit. Except that you can't fix fundamental problems like this in three months.

This is going to be i740 all over again, and a lot of former Intel employees are going to be out of a job in June. Honestly the execs need to burn down that entire graphics division and rebuild it from scratch, because it's become quite obvious that it's fundamentally rotten to the core in terms of how it's (mis)managed, and that continuing to throw money at the graphics division isn't working and won't work as long as the wrong people are in charge.

I was expecting Arc to be bad; I was not expecting it to be a disaster of this magnitude. But I guess this is what happens when you have the kind of broken corporate culture that allows something like the 10nm debacle to happen.
Yep. A few ifs and buts, though.
We don't know the settings, and these are power-limited mobile GPUs at or under entry-level discrete. You can't expect things to run steady constantly; they never did. Some frame-smoothing tech could easily push this into a lower-peak / higher-dip situation, making it a lot more stable.

That said, it's clear the thing craps itself in almost every game and drops to low FPS all the time; stuff is clearly missing across the board, it's not game-specific.
Posted on Reply
#50
ZoneDymo
AssimilatorHoly fucking shit.

That is fucking abysmal. It's barely faster than the soon-to-be-2-generations-old bottom-of-the-barrel MX 450 (which also has only half the VRAM) and the driver situation is, as expected, a joke.

But more than that, the frametimes... oh my god, those frametimes. Overwatch (bottom left and bottom middle) is reading 66 average FPS yet the frametimes are all over the place. That's not the fluid playable experience that you need in a MOBA, that's an unplayable stuttery mess.

It's quite clear that Intel's graphics division has been lying to their execs about what Arc can and can't do and how far along it is, except this test launch in South Korea has made it abundantly clear to the execs how much of a clusterfuck Arc really is, with the result that they've given the graphics team three months' grace to fix their shit. Except that you can't fix fundamental problems like this in three months.

This is going to be i740 all over again, and a lot of former Intel employees are going to be out of a job in June. Honestly the execs need to burn down that entire graphics division and rebuild it from scratch, because it's become quite obvious that it's fundamentally rotten to the core in terms of how it's (mis)managed, and that continuing to throw money at the graphics division isn't working and won't work as long as the wrong people are in charge.

I was expecting Arc to be bad; I was not expecting it to be a disaster of this magnitude. But I guess this is what happens when you have the kind of broken corporate culture that allows something like the 10nm debacle to happen.
Honestly, your reaction is so over the top it reads like a parody.
Posted on Reply