Friday, April 8th 2022
Laptops with Arc Graphics Nowhere in Sight, Intel Says Wait Till June
Intel in March 2022 kicked off its ambitious campaign to grab a slice of the consumer graphics market with its Arc "Alchemist" line of discrete GPUs, based on the Xe-HPG graphics architecture. The announcement mentioned immediate general availability of at least the entry-level Arc 3-series GPU models in notebooks; these include the Arc A350M and Arc A370M. People on social media are beginning to ask Intel why these notebooks are nowhere in sight, and the company has responded.
In response to one such query by a user, Intel Support stated that laptops with Arc will be available "by the end of the second quarter of 2022." This would put general availability in June 2022, two months from now. Interestingly, this hasn't stopped laptop manufacturers from raking in pre-orders, with the likes of the Acer Swift X and Samsung Galaxy Book2 Pro up for "grabs." You can "purchase" the Swift X, but shipping dates are stated to be as late as May 23 (now pushed to June 13).
Sources:
Intel Support (Twitter), VideoCardz
52 Comments on Laptops with Arc Graphics Nowhere in Sight, Intel Says Wait Till June
What are we looking at right now? Big chips for what they offer in performance. Shaky PR.
Let's just say I hope I'm wrong on this one and Raja delivers ;)
GCN was first released with HD 7000 in January 2012, which is a year before he returned to AMD as a VP.
All GPUs during his second tenure were iterations of GCN, no new architecture.
Vega was only notable because it used HBM2 to try to sidestep GCN's inability to scale further, an inability that had already pushed Polaris to the edge of feasibility with the RX 590.
There are also some early reviews; it looks like they might have some issues with the drivers:
videocardz.com/newz/intel-arc-a350m-gpu-has-finally-been-tested-slower-than-gtx-1650-up-to-2-2-ghz-clock
It's likely true his direct architectural contributions began with Vega and ended with Navi, which is still no small contribution, but he was also hands-on and had some input on how earlier architectures were delivered.
Not sure why everyone is putting so much effort into discrediting the guy. Because he was promoting and giving interviews leading up to Vega, and people were disappointed by Vega? That's some peak entitlement from the hardware community if so. How dare he be successful because REMEMBER he said Vega was going to be the BEST and it WASN'T! WAH!
If you're STILL waitin on this product to actually be available to purchase, then I have some really nice beachfront property that you may be interested in....
It's over near good ole Dodge City.... you know, the one where the Sheriff & Festus & the dear Ms. Kitty all hung out at :D
Well, he is well known for frying a lot of X99 chips by recommending leaving VCCIO CPU and VCCIO PCH on auto :laugh:
The Haswell-E default for both was 1.05 V in the BIOS before Broadwell-E was released.
After Broadwell-E was released, simply activating an XMP profile shot VCCIO CPU and PCH up to, I believe, 1.3 V (roughly 24% above that default).
Good old Raja said it's okay, it only happens to weak chips, lol. Weak and now dead, say lots of people, including at least one member here I haven't seen in a while, @xkm1948; he knows first-hand.
AMD announcing it was leaving the high-end market (to NVIDIA only),
the back-room politics and infighting trying to split RTG off from AMD...
I'm sure the site's search function can find all the posts calling for his head circa 2016.
I trust Raja did all he could to improve GCN with the resources made available to the GPU division at the time. It seems rather distant now, but pre-Ryzen we would often read forum posts (circa 2015-2016) predicting that AMD would run out of cash and begin divesting IP, or eventually file for bankruptcy, if a breakthrough wasn't found soon.
but the clock is ticking.
The thing about marketing is that it always uses trigger phrases like "shines 2.5 times brighter" while usually leaving the "better than... [product X]" part out, letting people make assumptions that lead to foot-in-mouth disease. Yes, the wise choice is the wait-and-see path.
Let's take an early review from Korea where they test Arc, which is terribly optimized:
videocardz.com/newz/intel-arc-a350m-gpu-has-finally-been-tested-slower-than-gtx-1650-up-to-2-2-ghz-clock
Intel will probably charge a higher price than what AMD/NVIDIA offer for the same performance.
That is fucking abysmal. It's barely faster than the soon-to-be-two-generations-old, bottom-of-the-barrel MX 450 (which also has only half the VRAM), and the driver situation is, as expected, a joke.
But more than that, the frametimes... oh my god, those frametimes. Overwatch (bottom left and bottom middle) reads an average of 66 FPS, yet the frametimes are all over the place. That's not the fluid, playable experience you need in a MOBA; that's an unplayable stuttery mess.
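To make that concrete, here's a minimal sketch of why an average-FPS number hides exactly this kind of stutter. The frametime traces are invented for illustration, not taken from the review:

```python
# Two made-up frametime traces (milliseconds) that average the same ~66 FPS,
# but with very different delivery: one flat, one spiky like the Overwatch chart.
import statistics

def fps_stats(frametimes_ms):
    avg_fps = 1000 / statistics.mean(frametimes_ms)
    p99_ms = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms)) - 1]
    return avg_fps, p99_ms, 1000 / p99_ms  # avg FPS, 99th-pct frametime, "1% low" FPS

steady  = [15.2] * 1000                   # flat frametime line: ~66 FPS
jittery = [10.0, 10.0, 10.0, 30.4] * 250  # same ~66 FPS average, spiky delivery

for name, trace in (("steady", steady), ("jittery", jittery)):
    avg, p99, low = fps_stats(trace)
    print(f"{name:8s} avg {avg:5.1f} FPS | 99th pct {p99:4.1f} ms | 1% low {low:5.1f} FPS")
```

Both traces report the same headline average, but you feel the 30 ms spikes in a MOBA, which is why frametime consistency (the "1% lows") matters more than the average FPS figure.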
It's quite clear that Intel's graphics division has been lying to their execs about what Arc can and can't do and how far along it is, except this test launch in South Korea has made it abundantly clear to the execs how much of a clusterfuck Arc really is, with the result that they've given the graphics team three months' grace to fix their shit. Except that you can't fix fundamental problems like this in three months.
This is going to be i740 all over again, and a lot of former Intel employees are going to be out of a job in June. Honestly the execs need to burn down that entire graphics division and rebuild it from scratch, because it's become quite obvious that it's fundamentally rotten to the core in terms of how it's (mis)managed, and that continuing to throw money at the graphics division isn't working and won't work as long as the wrong people are in charge.
I was expecting Arc to be bad; I was not expecting it to be a disaster of this magnitude. But I guess this is what happens when you have the kind of broken corporate culture that allows something like the 10nm debacle to happen.
Prior to that, well, we've spoken about ATI; let's give him the benefit of the doubt for that period, I'll concede that one ;)
But really, take a long look at the info coming out on Arc. I'm seeing trends, more so than a new beginning: a mix of Intel's hit-and-miss driver regime combined with Raja's overly optimistic time-to-market projections. The man is senior management. If you still have such a bad handle on product development as it seems now, what the actual fuck are you doing there? And this is a trend, in overall time to market, that existed at AMD too. It's part of the reason Vega failed: it came too late. And not because of unknown factors, either: Fury X was already plagued by delays and limited availability because of HBM, and HBM had already shown its weakness against simple GDDR5 when Fury got smoked by a much simpler and cheaper (cut-down!!!) 980 Ti.
And now Intel is apparently diving into the same holes AMD and Nvidia already climbed out of, for reasons unknown. We're looking at a discrete GPU family that launches numerous new technologies on its maiden voyage, yet seems to be lacking in all the ways that truly count for gaming. It wants to tick all the boxes at once instead of being a lean, mean little chip. It's already a pretty large die for what it offers in performance, and the design appears to be largely the same for enterprise/datacenter and gaming.
Now note this: both Nvidia and AMD have rigorously separated their gaming and non-gaming stacks over the last 2-3 generations. Nvidia definitively went there with Pascal and gained a major efficiency boost, enabling it to later re-introduce new blocks from Volta; AMD completed the journey with RDNA2 and can now do a similar trick. But even then, we're seeing TDPs move up to cater to further performance bumps along with RT. And here's Raja in 2022H2: "Look at my tiles! And they can game too, one chip to do it all!"... Dude, you already went there before and it didn't work. Piled onto that: the TDP of Arc is higher for substantially lower (raster!) performance.
Frankly, I don't even care about the man Raja; I care about what comes out of his hands, and it's simply not looking good at all, same as it hasn't the last half-dozen times. The design trails the market reality by a full generation or more. We can talk for hours about cause and effect, but I only really care about products I can buy ;) Yep. A few ifs and buts, though.
We don't know the settings, and these are power-limited mobile GPUs at or below entry-level discrete. You can't expect things to run steady constantly; they never did. Some frame-smoothing tech could easily push this into a lower-peak/higher-dip situation and make it a lot more stable; a toy sketch of the idea follows below.
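For what that smoothing amounts to in its simplest form, here's a toy sketch of a fixed-cadence frame cap; the 20 ms target and the render_frame() stand-in are assumptions for illustration, not anything Intel's driver is documented to do:

```python
# Toy frame pacer: trade peak FPS for a steady cadence. Fast frames sleep off
# their headroom; slow frames start the next one immediately and resync.
import time

TARGET_MS = 20.0  # pace to 50 FPS: below the peaks, above most of the dips

def paced_loop(render_frame, n_frames):
    deadline = time.perf_counter()
    for _ in range(n_frames):
        render_frame()  # hypothetical stand-in for the game's per-frame work
        deadline += TARGET_MS / 1000.0
        headroom = deadline - time.perf_counter()
        if headroom > 0:
            time.sleep(headroom)            # fast frame: wait out the remainder
        else:
            deadline = time.perf_counter()  # slow frame: resync the cadence
```

Capping the peaks sacrifices FPS you barely perceive; the dips that remain are what actually determine how the game feels.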
That said, it's clear the thing craps itself in almost every game and drops to low FPS all the time; stuff is clearly missing across the board, it's not game-specific.