Saturday, June 4th 2022

Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

The Intel Extreme Masters (IEM) event is currently taking place in Dallas, and Intel had one of its Arc Limited Edition graphics cards on display there. It's unclear whether it was a working sample or just a mockup, as it wasn't running in, or even mounted inside, a system. Instead, Intel apparently thought it was a great idea to stand the card up on its port side inside an acrylic box, on top of a rotating base. The three pictures snapped by @theBryceIsRt and posted on Twitter don't reveal anything we haven't seen so far, except the placement of the power connectors.

It's now clear that Intel has gone for a typical placement of the power connectors: the card in question has one 8-pin and one 6-pin power connector. In other words, Intel will not be using the new 12-pin power connector that is expected to be used by most next-generation graphics cards. We should mention that @theBryceIsRt is an Intel employee and, according to his Twitter profile, the Intel Arc community advocate, so the card wasn't just spotted by some passerby. Intel has not yet revealed any details as to when it plans to launch its Arc-based graphics cards.
Source: @theBryceIsRt

108 Comments on Intel Shows Off its Arc Graphics Card at Intel Extreme Masters

#76
spnidel
JismNah, their initial batch failed. Didn't meet expectations. So they fix it, respin it and hope for the best. By the time it's released, next generation is knocking at the door with figures of 120 FPS 4K-ready cards.
120fps 4k is possible already with a 3080 or a 6800 xt, so next gen will be more like 165+ fps 4k
...which is sick, can't wait to eventually get a 144hz 4k monitor once they become cheaper lol
Posted on Reply
#77
AusWolf
PapaTaipeiAgreed. RT is also 100% useless. However, I can attest that 4K is nice for non-competitive games. If the 20xx and 30xx cards didn't have those 100% useless tensor/RT cores, they could have 30-40% more performance per die. Also, I saw the UE5 demos; yes, it looks cool, but at the cost of MASSIVE storage requirements and a complete rework of the data pipeline, and even then it would still have bad framerate and, more importantly, extremely bad mouse input for competitive games.
RT could be nice when implemented right. That is, when the whole scene is ray traced. The problem is that current gen RT cores on both AMD's and Nvidia's side are too slow for this. That's why game devs limit what they do with RT (for example, shadows only) and use traditional rasterization for the rest of the effects - which is what makes RT in its current state kind of pointless, imo. Otherwise, it would be great.

As for Tensor cores, I agree. Everybody praises Nvidia for DLSS, when in fact, if RT cores were strong enough to do their job at a minimal performance loss vs rasterization, then nobody would need DLSS in the first place. Besides, AMD has shown that the whole DLSS shenanigans can be done without Tensor cores so... meh. :rolleyes:
chrcolukThey are kind of; there are two things which, in my opinion, are keeping the market alive.

One is generated demand via hardware-exclusive features, so e.g. with Nvidia: Gsync, DLSS and hardware-based RT. All three features, thanks to AMD, have alternatives that don't require hardware lock-in: VRR, FSR and RT using rasterisation hardware.
Exactly. Nearly every single monitor is Freesync capable nowadays, whereas you have to find and pay more for Gsync ones. This is where the slogan "Nvidia - The Way It's Meant To Be Paid" is valid.
chrcolukI get the merit of VRR, Nvidia deserve praise for introducing the concept, but they did attempt vendor lock-in, not only via GPU but also using expensive chips in monitors. DLSS I also get the merit of, and of the three it is for me by far the most useful tech in terms of potential, but it was initially limited to games where devs implement it (so limited to a fraction of new games released); however, we have more recently seen a new variant of DSR that uses DLSS, so it can now be implemented driver-side. Not sure if FSR can be done via driver (if someone can clarify this for me, that would be great). Finally RT, this one I have little love for. I feel lighting can be done very well in games via traditional methods, and I thought it was lame that games developed with RT would have heavily nerfed non-RT lighting, so to have good lighting you needed RT hardware. AMD have at least proven the hardware isn't needed, which makes this a bit less lame. But ultimately RT has served to increase the level of hardware required to achieve a given quality level, so it's a bad thing for me.
Well, the point of DLSS is to increase performance at a minimal image quality loss. But then the question is, why don't you have enough performance in the first place? Is it the gamer's fault for sitting on the 120+ fps craze train, or is it Nvidia's way of selling their otherwise useless Tensor cores when they could have used the same die space for more raster cores? RT is nice, but like I said above...
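For a rough sense of the scale involved, here is a minimal Python sketch of how much shading work drops when a frame is rendered at a lower internal resolution and upscaled to a 4K output. The internal resolutions below are illustrative assumptions, not any vendor's exact figures.

```python
# Illustrative only: pixels shaded per frame when rendering at a lower
# internal resolution and upscaling to a 4K output target.
# The internal resolutions below are assumptions picked for the example.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)

internal_modes = {
    "native 4K": (3840, 2160),
    "internal 1440p": (2560, 1440),
    "internal 1080p": (1920, 1080),
}

for name, (w, h) in internal_modes.items():
    shaded = pixels(w, h)
    share = shaded / native_4k
    print(f"{name:>15}: {shaded:>9,} pixels shaded ({share:.0%} of native 4K work)")
```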
chrcolukThe second is that demand for new GPUs is being fuelled by the high FPS/Hz craze, which is primarily desired by first-person shooter and esports gamers. If there were no high refresh rate monitors, then Nvidia and AMD would be having a much harder time convincing people to buy new-gen GPUs, as we're starting to get to the point where a single GPU can handle anything thrown at it, to a degree, at 60 fps.
They have a much harder time convincing me for sure. I honestly think the high refresh rate craze is probably the stupidest thing the gaming industry has invented so far.
Posted on Reply
#78
Ferrum Master
I guess that's the only card they have :D :D

But in reality... I guess everyone remembers the "no driver" meme? I have a bad feeling the "no driver" treatment will go to team blue.
Posted on Reply
#80
wolf
Better Than Native
AusWolfI've just recently downgraded my 2070 to a 6500 XT because of the noise
Were there other incentives? From noise alone, I'd have just done a fairly heavy-handed undervolt and set a very relaxed fan profile. I'd assume that you made good coin out of the changeover.
Posted on Reply
#81
Vayra86
aQiIntel won't stand tough against RDNA 3 and the RTX 4000 series, but they could just try to catch up. I remember Intel job openings for discrete GPU graphics driver development in the US and China on an urgent basis. I guess they want the bells to ring, but they just need more and more time to suit up against both the green and red teams.


Well, I'm not saying their hardware is immortal; at least the build quality engineering is a bit ahead of others. At least their hardware carries their reputation: whether it's a processor or even some controller on PCIe x1, it will perform as well as it should even after a decade.
Ah I see what you mean now. But then again, other smaller and controller chips on boards the world over perform much the same! And let's not forget the many issues around bent chips/IHS lately. When they have to start doing something new, there are absolutely no guarantees on quality. In fact, we know now 'something is gonna give' and you need a delid, a clamp, or some other workaround to really get what was on the marketing sheet.
Posted on Reply
#82
AusWolf
wolfWere there other incentives? From noise alone, I'd have just done a fairly heavy-handed undervolt and set a very relaxed fan profile. I'd assume that you made good coin out of the changeover.
Nope, I still have the 2070 as a backup in case I need more graphics performance later. I know it doesn't make much sense, but I like saving interesting pieces of hardware, and my bone-stock EVGA 2070 is interesting enough to be saved as a prime example of the first generation of RT cards. :ohwell:

Apart from noise, my other incentives were heat (as in air temperature inside the PC case and in the room) and curiosity about RDNA 2, not to mention having an Asus TUF-themed build now, which wasn't really a reason to swap, just a bonus.
Posted on Reply
#83
Assimilator
TheLostSwedeSorry, the drivers aren't at a state that would allow anything to be shown in public.
Apparently they aren't even at a state that would allow them to show a static picture on a monitor. Which gives me incredible, I mean just massive confidence that this product absolutely isn't vapourware.
Posted on Reply
#84
TheLostSwede
News Editor
AssimilatorApparently they aren't even at a state that would allow them to show a static picture on a monitor. Which gives me incredible, I mean just massive confidence that this product absolutely isn't vapourware.
Intel is launching in the PRC next month, with select partners.
Posted on Reply
#85
Ferrum Master
I missed the Limited Edition label...

So far all of them seem to be limited editions :D :D :D
Posted on Reply
#86
Vya Domus
I typically don't think it is "too late", even if they're one generation behind, but the problem here is that Intel is a new player; they have no consumer base. It will be very difficult, nigh on impossible, to convince people to buy something that not only won't be as good as the competition but is completely new altogether.
Posted on Reply
#87
Daven
TheLostSwedeIntel is launching in the PRC next month, with select partners.
I've been saying all along, Intel's first generation desktop GPUs will only launch in obscure markets in OEM systems. If Intel can keep going, the second generation might get AIB cards for the DIY market. TechPowerUp readers, don't expect to buy an Intel GPU anytime soon.
Posted on Reply
#88
R0H1T
Well, I wasn't, but I was hoping it could lower prices from the other two. Guess that isn't happening either :wtf:
Posted on Reply
#89
aQi
Vayra86Ah I see what you mean now. But then again, other smaller and controller chips on boards the world over perform much the same! And let's not forget the many issues around bent chips/IHS lately. When they have to start doing something new, there are absolutely no guarantees on quality. In fact, we know now 'something is gonna give' and you need a delid, a clamp, or some other workaround to really get what was on the marketing sheet.
Yes, that's another story. It's not the same as it used to be. The IHS used to be top quality back in the day, but now it's not what it should be. Let's look at this from a business perspective: these consumer units are simplified to the extent that they can get away with a low-grade IHS, suitable for the business market but not admirable to enthusiasts or gamers. We need more juice!!! And if we look at the ratio of the professional working class to enthusiasts and gamers, business-wise the professionals will bring in more profit.
It starts to matter when reviewers and users make a point all at once. I remember Intel using a poor IHS on 3rd gen (if I remember correctly); reviewers and users called it out so loudly that later batches had it revised immediately.

Now let's talk about the gamer/enthusiast class. These Arc GPUs are built for them. Intel keeps a very close eye on every market segment, and it's obvious that every bit of an Intel GPU is going to go through a hell of a lot of reviews, breakdowns and loads of tests. If I were in the same business, I would work on things other competitors do not offer, that is, top quality, low price and the same performance.
Posted on Reply
#90
ncrs
DavenI've been saying all along, Intel's first generation desktop GPUs will only launch in obscure markets in OEM systems. If Intel can keep going, the second generation might get AIB cards for the DIY market. TechPowerUp readers, don't expect to buy an Intel GPU anytime soon.
What about the Intel DG1 that has been shipping in OEM systems since January 2021 as Intel Iris Xe Graphics, albeit with limited compatibility? Was that the first generation desktop GPU or Gen 0.5? ;)
Posted on Reply
#91
AusWolf
Vya DomusI typically don't think it is "too late", even if they're one generation behind, but the problem here is that Intel is a new player; they have no consumer base. It will be very difficult, nigh on impossible, to convince people to buy something that not only won't be as good as the competition but is completely new altogether.
They're a new player in the GPU market, but they have the financial background to endure for one generation. It reminds me of RDNA 1. The top offering, the 5700 XT, had bad driver stability and cooling issues, and could only compete with the RTX 2070 (without ray tracing, at that), but it was still successful enough for AMD to build RDNA 2 upon. I kind of expect the same from first gen Arc, to be honest: not to kick the GPU market in the backside, but to be just good enough for Intel to build the next generation upon. A test run, basically.
Posted on Reply
#92
Chrispy_
Can we rename the thread title to "Intel Teases Vaporware Again"?

If it was a working product, they'd be demoing it, not showing off a mock-up that will most likely change before launch anyway.

It's about 3 years late already because Intel started teasing the DG2 in 2018 and it was supposed to tape out on 10nm 2H 2019. There was a little problem with Intel's 10nm in 2019 though - they axed it.

That 10nm process was so unusable that Intel actually scrapped plans for 11th Gen CPUs on 10nm in 2019 and spent 6 months backporting Rocket Lake to 14nm. Xe HPG "DG2" Arc Alchemist was supposed to be on shelves alongside Rocket Lake CPUs, and Rocket Lake was already 6-9 months late over two years ago.
Posted on Reply
#93
Steevo
AusWolfRT could be nice when implemented right. That is, when the whole scene is ray traced. The problem is that current gen RT cores on both AMD's and Nvidia's side are too slow for this. That's why game devs limit what they do with RT (for example, shadows only) and use traditional rasterization for the rest of the effects - which is what makes RT in its current state kind of pointless, imo. Otherwise, it would be great.

As for Tensor cores, I agree. Everybody praises Nvidia for DLSS, when in fact, if RT cores were strong enough to do their job at a minimal performance loss vs rasterization, then nobody would need DLSS in the first place. Besides, AMD has shown that the whole DLSS shenanigans can be done without Tensor cores so... meh. :rolleyes:


Exactly. Nearly every single monitor is Freesync capable nowadays, whereas you have to find and pay more for Gsync ones. This is where the slogan "Nvidia - The Way It's Meant To Be Paid" is valid.


Well, the point of DLSS is to increase performance at a minimal image quality loss. But then the question is, why don't you have enough performance in the first place? Is it the gamer's fault for sitting on the 120+ fps craze train, or is it Nvidia's way of selling their otherwise useless Tensor cores when they could have used the same die space for more raster cores? RT is nice, but like I said above...


They have a much harder time convincing me for sure. I honestly think the high refresh rate craze is probably the stupidest thing the gaming industry has invented so far.
Ray tracing every pixel in Z depth, with every light source, with 16-bit or 24-bit color and transparency plus texture, is so hardware intensive that it will never be an on-demand real-time function.

We will probably figure out a way to create matrix tables with vertex data precooked in and use fewer ray samples added to the pipeline with current global illumination lighting to achieve the most realistic effect with far less overhead.

I remember tessellated concrete barriers. Maybe, hopefully, Intel entering the scene with midrange hardware will put the brakes on stupid implementations of RT.
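As a rough back-of-the-envelope illustration of that scale, here is a minimal Python sketch of the ray budget for a fully ray-traced 4K frame; the light count, sample count and target frame rate are assumptions chosen purely for the example.

```python
# Back-of-the-envelope ray budget for a fully ray-traced frame.
# Every number below is an illustrative assumption, not a measurement.

width, height = 3840, 2160      # 4K frame
primary_rays = width * height   # one primary (camera) ray per pixel
lights = 8                      # one shadow ray per light per pixel
bounce_samples = 4              # indirect/GI samples per pixel
target_fps = 60

rays_per_frame = primary_rays * (1 + lights + bounce_samples)
rays_per_second = rays_per_frame * target_fps

print(f"Rays per frame : {rays_per_frame / 1e6:.0f} million")
print(f"Rays per second: {rays_per_second / 1e9:.1f} billion at {target_fps} fps")
```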
Posted on Reply
#94
64K
SteevoRay tracing every pixel in Z depth, with every light source, with 16-bit or 24-bit color and transparency plus texture, is so hardware intensive that it will never be an on-demand real-time function.

We will probably figure out a way to create matrix tables with vertex data precooked in and use fewer ray samples added to the pipeline with current global illumination lighting to achieve the most realistic effect with far less overhead.

I remember tessellated concrete barriers. Maybe, hopefully, Intel entering the scene with midrange hardware will put the brakes on stupid implementations of RT.
I remember a Star Wars demo that was running on a $60,000 DGX Station with 4 Titan V GPUs, and it managed only 24 FPS. It's going to be a long, long time (if ever) before tech advances enough that such a machine will be affordable.


arstechnica.com/gaming/2018/03/star-wars-demo-shows-off-just-how-great-real-time-raytracing-can-look/
Posted on Reply
#95
Mad_foxx1983
These cards are kinda giving off RX Vega 56/64 vibes.
Posted on Reply
#97
eidairaman1
The Exiled Airman
Mad_foxx1983These cards are kinda giving off RX Vega 56/64 vibes.
No, because Vega actually is a card; this here is not.
Posted on Reply
#98
ixi
So big, but so weak. Little big, sound nice.
Posted on Reply
#99
InVasMani
64KI remember a Star Wars demo that was running on a $60,000 DGX Station that had 4 Titan V GPUs and managed only 24 FPS. It's going to be a long, long time (if ever) that tech will advance enough that such a machine will be affordable.


arstechnica.com/gaming/2018/03/star-wars-demo-shows-off-just-how-great-real-time-raytracing-can-look/
That shows just how badly Nvidia manipulated and misled consumers, quite frankly, with deceptive advertising. A lot of people bought into RTRT cards based on that video thinking they would end up with something similar, or at least modestly similar, and yet look at Cyberpunk: they had to handicap the poly count in the end just to insert the bits of RTRT they did insert into the game. Plus, that demo of Nvidia's runs at 24 FPS, and no one in their right mind is going to consider gaming at 24 FPS reasonably fluid; there is just too much input lag at that frame rate. Input lag is noticeably better even at 30 FPS and still not great, and further still at 36 FPS; it's really not until about a 48 FPS average that things start to approach a generally decent experience that feels fairly responsive. Until GPUs can integrate better frame interpolation around a render target like that to compensate for it, that type of frame rate will never be very satisfying to end users. The fact is, even 60 FPS is a bit of a crutch for input responsiveness.
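For reference, here is a trivial Python conversion of those frame rates into per-frame times; these are frame times only, and perceived input lag is larger once input polling, game simulation and display latency are added on top.

```python
# Frame time in milliseconds for the frame rates mentioned above.
# Perceived input lag is larger still: it also includes input polling,
# game simulation, driver/display queueing and monitor response time.

for fps in (24, 30, 36, 48, 60, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")
```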
Posted on Reply
#100
64K
InVasManiThat shows just how badly Nvidia manipulated and misled consumers, quite frankly, with deceptive advertising. A lot of people bought into RTRT cards based on that video thinking they would end up with something similar, or at least modestly similar, and yet look at Cyberpunk: they had to handicap the poly count in the end just to insert the bits of RTRT they did insert into the game. Plus, that demo of Nvidia's runs at 24 FPS, and no one in their right mind is going to consider gaming at 24 FPS reasonably fluid; there is just too much input lag at that frame rate. Input lag is noticeably better even at 30 FPS and still not great, and further still at 36 FPS; it's really not until about a 48 FPS average that things start to approach a generally decent experience that feels fairly responsive. Until GPUs can integrate better frame interpolation around a render target like that to compensate for it, that type of frame rate will never be very satisfying to end users. The fact is, even 60 FPS is a bit of a crutch for input responsiveness.
It is deceptive. Nvidia can call it RTRT and leave out the part where it's only partly RTRT. It's actually a mixture of Ray Tracing and Rasterization.

It will be interesting to see how Intel markets their gaming cards.
Posted on Reply