Saturday, June 4th 2022
Intel Shows Off its Arc Graphics Card at Intel Extreme Masters
The Intel Extreme Masters (IEM) event is currently taking place in Dallas, and at the event Intel had one of its Arc Limited Edition graphics cards on display. It's unclear if it was a working sample or just a mockup, as it wasn't running in, or even mounted inside, a system. Instead, Intel apparently thought it was a great idea to mount the card standing upright on its port side inside an acrylic box, on top of a rotating base. The three pictures snapped by @theBryceIsRt and posted on Twitter don't reveal anything we haven't seen so far, except the placement of the power connectors.
It's now clear that Intel has gone for a typical placement of the power connectors, and the card in question has one 8-pin and one 6-pin power connector. In other words, Intel will not be using the new 12-pin power connector that is expected to be used by most next-generation graphics cards. We should mention that @theBryceIsRt is an Intel employee and, according to his Twitter profile, the Intel Arc community advocate, so the card wasn't just spotted by some passerby. Intel has not yet revealed any details as to when it's planning on launching its Arc-based graphics cards.
Source:
@theBryceIsRt
108 Comments on Intel Shows Off its Arc Graphics Card at Intel Extreme Masters
AMD's delay of TeraScale by nearly a year could be covered by continued refreshes of the beefcake X1900 XTX, and NVIDIA survived the 8-month delay of Fermi with endless re-brands of G92 and GT200 x2.
What happens to a delayed architecture when there are two other strong releases waiting on their doorstep? Will Intel even get a foothold, without massively discounted parts?
Classic vaporware.
I just don't feel like a bigger monitor would add enough to my general gaming experience to justify its own cost, let alone the cost of a faster, noisier GPU. That is, I'd much rather game at 1080p on a 24" screen with a dead-silent 6500 XT than pay for a 4K monitor and a not-so-silent 3080.
You should try trolling elsewhere or read a little more before spouting off literal fanboy BS.
Flexing boards, flexing CPUs, the terrible design of the original socket 775 coolers that caused noobs and even pros to kill boards, their relabeled and unsupported Killer NIC series, their hardware security issues, the number of overheated chips that died in OEM systems.
Intel isn't magical; they are a for-profit company just like AMD and their bullshitdozer that got stuck in the ponds.
Quality is quality: a $300 board from either vendor should last longer than a $79 board from either. Same for the coolers they don't make, memory, PSUs, and hard drives.
One is demand generated via hardware-exclusive features, e.g. with Nvidia: G-Sync, DLSS and hardware-based RT. Thanks to AMD, all three features have alternatives that don't require hardware lock-in: VRR, FSR and RT done on rasterisation hardware.
I get the merit of VRR, and Nvidia deserve praise for introducing the concept, but they did attempt vendor lock-in, not only via the GPU but also via expensive chips in monitors. DLSS I also see the merit of, and out of the three it is for me by far the most useful tech in terms of potential, but it was initially limited to games where the devs implement it (so a fraction of new games released); however, we have more recently seen a new variant of DSR that uses DLSS, so it can now be applied driver-side. I'm not sure if FSR can be done via the driver (if someone can clarify this for me, that would be great). Finally RT, this one I have little love for. I feel lighting can be done very well in games via traditional methods, and I thought it was lame when RT-focused games shipped with heavily nerfed non-RT lighting, so that to get good lighting you needed RT hardware; AMD have at least proven dedicated hardware isn't needed, which makes this a bit less lame. But ultimately RT has served to raise the level of hardware required to achieve a given quality level, so it's a bad thing in my view.
The second driver of demand for new GPUs is the high-FPS/Hz craze, which is primarily fuelled by first-person-shooter and esports gamers. If there were no high-refresh-rate monitors, then Nvidia and AMD would be having a much harder time convincing people to buy new-gen GPUs, as we're starting to reach the point where a single GPU can, to a degree, handle anything thrown at it at 60 fps. The "to a degree" caveat is that game developers, whether through laziness or bungs thrown at them by AMD/Nvidia, are optimising their games less, so more grunt is needed to hit a given quality and frame rate target. Nvidia have been caught out in the past when a developer was rendering tessellation underneath the ground in a game, which helped sell Nvidia cards. There were also problems noticed in the FF15 benchmark tool, where it was rendering things out of view, which lowered the framerate. In that game, Nvidia-exclusive features also absolutely tanked the framerate and sent VRAM demand skywards.
I think 4K is worth it... text sharpness, graphics fidelity, it's really nice... I think a 4080 would be a good upgrade to my gaming experience, especially on larger screens.
At least the compute GPUs they can sort of justify (via the supercomputer contracts), but how long do you expect Intel to continue to be #3 in the consumer market while bleeding these losses? Until they fire the entire driver team and absorb the best of them into the compute division?
You are right, and that's what I mentioned earlier. Intel has software issues, and the harmony between its software and hardware is weaker than Nvidia's and AMD's. What I am trying to say is that Intel wants to get this side covered for its Arc GPUs, which is causing the massive delay.
Of course, knowing Intel, they'll probably charge you for OCing these cards or turning 3D acceleration on via their premium (upgrade) plans :laugh: