Saturday, June 4th 2022

Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

The Intel Extreme Masters (IEM) event is currently taking place in Dallas, and Intel had one of its Arc Limited Edition graphics cards on display there. It's unclear whether it was a working sample or just a mockup, as it wasn't running in a system or even mounted inside one. Instead, Intel apparently thought it was a great idea to stand the card up on its port side inside an acrylic box, on top of a rotating base. The three pictures snapped by @theBryceIsRt and posted on Twitter don't reveal anything we haven't seen so far, except the placement of the power connectors.

It's now clear that Intel has gone for a typical placement of the power connectors, and the card in question has one 8-pin and one 6-pin power connector. Intel will, in other words, not be using the new 12-pin power connector that is expected to be used by most next-generation graphics cards. We should mention that @theBryceIsRt is an Intel employee and is the Intel Arc community advocate according to his Twitter profile, so the card wasn't just spotted by some passerby. Intel has yet to reveal any details as to when it plans to launch its Arc-based graphics cards.
Source: @theBryceIsRt

108 Comments on Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

#26
64K
AusWolfFair enough. Correction: pointless / overpriced.

I'm still fine with 1080p and I feel no need to upgrade. An expensive 4K monitor would necessitate a costly GPU upgrade. No thanks.
That's what has been holding back 4K gaming adoption for years now. At first it was the cost of the monitors, but those have come way down in price; now it's the cost of a video card to run it properly that has held down, and will continue to hold down, 4K gaming adoption for most people.

I have been gaming for almost 4 decades now, and in all those years I've seen it over and over again. Games always have, and always will, require more and faster resources.
Posted on Reply
#27
eidairaman1
The Exiled Airman
DavenOnly the ego’s at Intel would showcase an otherwise insignificant piece of technology as if it were some powerful artifact found in an ancient temple.
Every time Intel feels they are being left behind, they wave their arms and dangle the carrot in front of suckers.

I say the news should be brought up when it's physically on store shelves or being sold by e-tailers. I mean, an RX 6000 series refresh came out before these are to see the light of day, if that ever happens.

Deadware to me.
Posted on Reply
#28
Dragokar
Couldn't they at least show working eval samples.....
Posted on Reply
#29
bonehead123
AusWolfI agree. Graphics cards are getting ridiculously overpowered - not just in terms of power consumption, but also performance. I mean, game technology isn't evolving as fast as graphics performance is. That's why they're advocating completely pointless 4K gaming, because there's nothing else that would need the performance of a 3080-level graphics card, not to mention a 4080. They have to sell those oversized monsters somehow. I've just recently downgraded from a 2070 to a 6500 XT because of the noise, and I'm happy... with an entry-level graphics card! This has never happened to me before.
Yea, but in light of the above, iNtEl has probably been conflaborating & colluding with the game makers for quite some time to make it so that all new games released near or after the "overpowered monsters" are out in the wild will REQUIRE a minimum of a high-end ARC (or equivalent) card just to run at mid-high res/FR's....

In fact, this may well be the reason for the continued launch delays, so that ALL new games are highly optimized to run on high-end iNtEl's cards way moar betta than the other brands of cards.....

I hope this is not the case, but this IS intel we are talking about here, so who knows what kinda sneaky sh*t they have been doing behind the scenes.....
Posted on Reply
#30
TheLostSwede
News Editor
AusWolfFair enough. Correction: pointless / overpriced.

I'm still fine with 1080p and I feel no need to upgrade. An expensive 4K monitor would necessitate a costly GPU upgrade. No thanks.
Well, some of us use our computers for more than playing games.
Then again, right now I can't afford to buy anything, but at least most games that I play, I can play at 4K.
But yes, you're right that this is becoming a stupidly expensive hobby.
DragokarCouldn't they at least show working eval samples.....
Sorry, the drivers aren't in a state that would allow anything to be shown in public.
Posted on Reply
#31
bonehead123
TheLostSwedeWell, some of us use our computers for more than playing games.
^^THIS too^^
Posted on Reply
#32
Daven
DragokarCouldn't they at least show working eval samples.....
They probably still don’t have working silicon fit for public demos even after all this time. Given everything over the last five years, I can’t understand why anyone cares about Intel GPUs.
Posted on Reply
#33
ZoneDymo
DragokarCouldn't they at least show working eval samples.....
No, they couldn't, and we all know it.
Posted on Reply
#34
Bomby569
DavenOnly the ego’s at Intel would showcase an otherwise insignificant piece of technology as if it were some powerful artifact found in an ancient temple.
Do you think Intel are peasants, to present technology in someone's kitchen?
Posted on Reply
#35
64K
DavenThey probably still don’t have working silicon fit for public demos even after all this time. Given everything over the last five years, I can’t understand why anyone cares about Intel GPUs.
I think most of what we are seeing is just hope that Intel will release something that will bring the cost of GPUs down to reasonable levels, but the fact is those days are long gone. Most probably won't buy an Intel discrete GPU anyway until Intel proves it will support its GPUs with proper drivers. It's like what you see with AMD GPUs: so many claim they will buy AMD if the value is better, but they buy Nvidia anyway, and that's why Nvidia holds 80% of the discrete GPU market.
Posted on Reply
#36
eidairaman1
The Exiled Airman
Just like their iGPs, the i740 and Larrabee.
TheLostSwedeWell, some of us use our computers for more than playing games.
Then again, right now I can't afford to buy anything, but at least most games that I play, I can play at 4K.
But yes, you're right that this is becoming a stupidly expensive hobby.


Sorry, the drivers aren't in a state that would allow anything to be shown in public.
Posted on Reply
#37
Bomby569
64KI think most of what we are seeing is just hope that Intel will release something that will bring the costs of GPUs down to reasonable levels but the fact is those days have long gone. Probably most won't buy an Intel discrete GPU anyway until they prove that they will support their GPUs with proper drivers. It's like you see with AMD GPUs where so many claim they will buy AMD if the value is better but they buy Nvidia anyway and that's why Nvidia holds 80% of the discrete GPU market.
To be fair, the driver side of things is very important on a GPU, almost as important as the hardware.
Posted on Reply
#38
trsttte
For all the haters: they do have demo computers running Intel GPUs at the event. Laptops with the smaller version of the silicon are also becoming available worldwide (I'm not sure the drivers are up to par yet, but at least they should work better than the 5700 XT's initial launch).

There are also a lot of reasons to buy an Intel GPU other than games, like Linux support, the advanced media engine, or if you want to dabble with oneAPI stuff.

Anyway, now you've made me sound like an Intel fanboy :D It's fun to mock the forever-delayed Intel GPU, but can we keep some semblance of reality?

The 8+6-pin power connector setup is disappointing. I hate 6-pin connectors; more often than not, if you're not running custom cables, you'll be left with a 2-pin connector dangling off the GPU. Guess this was designed so long ago that the new Gen5 power connector wasn't available :D
Posted on Reply
#39
TheLostSwede
News Editor
DavenThey probably still don’t have working silicon fit for public demos even after all this time. Given everything over the last five years, I can’t understand why anyone cares about Intel GPUs.
The rumours suggest the silicon is just fine, but the drivers aren't anywhere near something that could be handed over to the general public.
That said, laptops are going up for pre-order in Australia and New Zealand, so there must be some kind of working drivers... Still about a month out though, maybe...
videocardz.com/newz/intel-arc-a370m-laptops-are-now-available-for-preorder-in-us-new-zealand-and-australia
Bomby569do you think Intel are peasents to present technology in someones kitchen?
Pat will be holding a BBQ for the launch...
Posted on Reply
#40
64K
Bomby569To be fair, the driver side of things is very important on a GPU, almost as important as the hardware.
I would say even more so than the hardware. It's pretty much useless to have the fastest hardware if the drivers won't run it properly.
Posted on Reply
#41
trsttte


We need an Intel version of the meme.
Posted on Reply
#42
TheLostSwede
News Editor
64KI would say even more so than the hardware. It's pretty much useless to have the fastest hardware if the drivers won't run it properly.
Something that's very obvious when it comes to Arm-based SoCs made by companies in the PRC.
There's a reason why the RPi is so popular, despite being ho-hum in the hardware department.
Posted on Reply
#43
Rares
Nice... p.o.s. intel. Thank you.
Posted on Reply
#44
Bomby569
I've seen very hyped products turn out garbage and low-key ones turn out amazing. We just have to wait. But my expectations are very low: they will probably do great in the low-end market with a good price point, where AMD and Nvidia have been shooting themselves in the foot constantly, and that's it. I doubt they can compete in the other markets.
Posted on Reply
#45
Daven
Regarding the comments about Intel laptops on the market: all of these contain the Arc A3 series (96 and 128 EUs), which is the bare minimum IGP implementation in a discrete form factor. Intel has had this tech for many years now. The trick is making something above IGP levels of performance, which Intel has only slightly exceeded with the A370M. No higher-end mobile SKUs, and certainly no desktop SKUs, have appeared yet.

By the way, working silicon implies working hardware AND software. It's not so easy to have one without the other, so it's implied.
Posted on Reply
#46
mechtech
TomgangInteresting. But coming this late to the game in the current GPU market, I would say Intel's Arc GPUs will end up competing with AMD RDNA3 and Nvidia RTX 4000 series cards, and not so much the current generation of graphics cards.
Late is a relative term. But yes, I would agree that they will be up against RDNA3 and the NV 4000 series.
Posted on Reply
#47
chodaboy19
One question: when is the release date, Intel?

Honestly, I just want to see how it performs in real life off the shelf...
Posted on Reply
#48
Frick
Fishfaced Nincompoop
AusWolfFair enough. Correction: pointless / overpriced.

I'm still fine with 1080p and I feel no need to upgrade. An expensive 4K monitor would necessitate a costly GPU upgrade. No thanks.
Ohhh man you haven't tried Freelancer on a big 4K monitor. Generally I agree with you, but that game (and the texture mod) REALLY shines at higher resolution and 32".
Posted on Reply
#49
mechtech
TheLostSwedePointless in your opinion.
Been playing at 4K for several years already and in some games I wouldn't want to go back.
No one really wants to downgrade, but something is better than nothing. Also, you can only do what budget/time allow. I play Terraria at 4K with my RX 480 :)
Posted on Reply
#50
Dr_b_
InVasManiIf this is the best Raja can come up with, AMD made the right choice.
agreed
Posted on Reply