Saturday, June 4th 2022
Intel Shows Off its Arc Graphics Card at Intel Extreme Masters
Currently the Intel Extreme Masters (IEM) event is taking place in Dallas, and Intel had one of its Arc Limited Edition graphics cards on display there. It's unclear whether it was a working sample or just a mockup, as it wasn't running in, or even mounted inside, a system. Instead, Intel chose to display the card standing upright on its port side inside an acrylic box, on top of a rotating base. The three pictures snapped by @theBryceIsRt and posted on Twitter don't reveal anything we haven't seen so far, except the placement of the power connectors.
It's now clear that Intel has gone for a typical placement of the power connectors, and the card in question has one 8-pin and one 6-pin power connector. In other words, Intel will not be using the new 12-pin power connector that is expected to be used by most next-generation graphics cards. We should mention that @theBryceIsRt is an Intel employee and, according to his Twitter profile, the Intel Arc community advocate, so the card wasn't just spotted by some passerby. Intel has not yet revealed when it's planning to launch its Arc-based graphics cards.
Source:
@theBryceIsRt
108 Comments on Intel Shows Off its Arc Graphics Card at Intel Extreme Masters
...which is sick, can't wait to eventually get a 144hz 4k monitor once they become cheaper lol
As for Tensor cores, I agree. Everybody praises Nvidia for DLSS, when in fact, if RT cores were strong enough to do their job at a minimal performance loss vs rasterization, then nobody would need DLSS in the first place. Besides, AMD has shown that the whole DLSS shenanigans can be done without Tensor cores, so... meh. :rolleyes:
Exactly. Nearly every single monitor is FreeSync capable nowadays, whereas you have to hunt for G-Sync ones and pay more for them. This is where the slogan "Nvidia - The Way It's Meant To Be Paid" is valid.
Well, the point of DLSS is to increase performance at a minimal image quality loss. But then the question is, why don't you have enough performance in the first place? Is it the gamer's fault for sitting on the 120+ fps craze train, or is it Nvidia's way of selling their otherwise useless Tensor cores when they could have used the same die space for more raster cores? RT is nice, but like I said above... They have a much harder time convincing me, for sure. I honestly think the high refresh rate craze is probably the stupidest thing the gaming industry has invented so far.
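Since the point of DLSS-style upscaling is to render fewer pixels and reconstruct the rest, here is a minimal back-of-the-envelope sketch of where that performance headroom comes from. The scale factors are the commonly cited approximations, not values taken from NVIDIA's SDK, and the mode names are just labels for illustration:
```python
# Rough illustration only (not NVIDIA's implementation): approximate internal
# render resolutions for typical upscaler quality modes at a 4K output target,
# showing how much of the native pixel count actually gets shaded.
OUTPUT = (3840, 2160)  # 4K output target

MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

native_pixels = OUTPUT[0] * OUTPUT[1]
for mode, scale in MODES.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    print(f"{mode:17} {w}x{h}  shades ~{w * h / native_pixels:.0%} of native pixels")
```
Even the "Quality" mode shades well under half the pixels of native 4K, which is where the frame-rate gain comes from before any reconstruction cost is added back.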
But in reality... I guess everyone remembers the "no drivers" meme? I have a bad feeling the "no drivers" meme is going to pass to team blue.
www.tomshardware.com/news/intel-raptor-lake-cpu-benchmarked-20-percent-faster-than-core-i9-12900k
@Tigger made a thread:
www.techpowerup.com/forums/threads/raptor-lake-20-faster-than-12900k.295518/
Apart from noise, my other incentive was heat (as in air temperature inside the PC case and in the room), as well as curiosity about RDNA 2, not to mention having an Asus TUF themed build now, which wasn't really a reason to swap, just a bonus.
So far all of them seem to be limited editions :D :D :D
It starts to matter when reviewers and users make a point all at once. I remember Intel using a poor IHS on 3rd gen (if I remember correctly); reviewers and users called it out so loudly that later batches had it revised almost immediately.
Now let's talk about the gamer/enthusiast class. These Arc GPUs are built for them. Intel keeps a very close eye on every market segment, and it's obvious that every bit of an Intel GPU is going to go through a hell of a lot of reviews, breakdowns, and loads of tests. If I were in the same business, I would work on things the other competitors don't offer: top quality, low price, and the same performance.
If it was a working product, they'd be demoing it, not showing off a mock-up that will most likely change before launch anyway.
It's about 3 years late already, because Intel started teasing DG2 in 2018 and it was supposed to tape out on 10nm in 2H 2019. There was a little problem with Intel's 10nm in 2019 though - they axed it.
That 10nm process was so unusable that Intel actually scrapped plans for 11th Gen CPUs on 10nm in 2019 and spent 6 months backporting Rocket Lake to 14nm.
Xe HPG "DG2"/Arc Alchemist was supposed to be on shelves alongside Rocket Lake CPUs, and Rocket Lake was already 6-9 months late over two years ago.
We will probably figure out a way to create matrix tables with vertex data precooked in, and use fewer ray samples added to the pipeline alongside current global illumination lighting, to achieve the most realistic effect with far less overhead.
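On the idea of getting away with fewer ray samples: a toy sketch of one general principle (accumulating cheap per-frame samples over time), under my own illustrative assumptions rather than anything Intel or the poster above has described. A noisy one-sample-per-pixel estimate averaged across frames converges toward the same answer as an expensive many-samples-at-once estimate:
```python
import random

# Toy illustration only (not any engine's actual pipeline): 1 ray sample per
# pixel per frame, accumulated over many frames, approaches the quality of a
# large single-frame sample count at a fraction of the per-frame cost.

def sample_radiance():
    # Stand-in for tracing one ray: a noisy estimate around a "true" value of 0.5.
    return random.uniform(0.0, 1.0)

def estimate(frames, samples_per_frame):
    total = sum(sample_radiance() for _ in range(frames * samples_per_frame))
    return total / (frames * samples_per_frame)

random.seed(1)
print("1 frame   @ 64 spp:", round(estimate(1, 64), 3))  # expensive single-frame estimate
print("64 frames @ 1 spp :", round(estimate(64, 1), 3))  # same total work spread over time
```
This is the rough intuition behind pairing low sample counts with denoising and temporal accumulation instead of brute-forcing more rays every frame.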
I remember tessellated concrete barriers. Maybe, hopefully, Intel entering the scene with midrange hardware will put the brakes on stupid implementations of RT.
arstechnica.com/gaming/2018/03/star-wars-demo-shows-off-just-how-great-real-time-raytracing-can-look/
www.science.org/doi/10.1126/sciadv.abj5014
www.nature.com/articles/d41586-022-01476-7
It will be interesting to see how Intel markets their gaming cards.