Wednesday, July 13th 2022

Intel's Arc A750 Graphics Card Makes an Appearance

Remember the Limited Edition card that Intel was teasing at the end of March? It turns out it could very well be the Arc A750, at least based on a brief appearance of a card in Gamers Nexus' review of the Gunnir Arc A380. For a few seconds of the review video, Gamers Nexus showed off a card that looked nigh-on identical to the renders Intel released back in March. There was no mention of specs or anything else, except that Gamers Nexus has tested the card and that, based on what was said in the video, it will presumably be getting its own video in the near future.

Based on leaked information, the Arc A750 GPU should feature 24 Xe cores and 3072 FP32 cores, and it's expected to be paired with 12 GB of GDDR6 memory on a 192-bit bus. For reference, the Arc A380 features eight Xe cores and 1024 FP32 cores, and its cards ship with 6 GB of GDDR6 memory on a 96-bit bus. In related news, Intel is said to be touring some gaming events in the US to promote its as-yet-unavailable Arc graphics cards. LANFest Colorado is said to be the first stop, so if you're planning on attending, this could be your first chance to get some hands-on time with an Arc graphics card.
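The leaked core counts line up with Xe-HPG's design of 128 FP32 lanes per Xe core (16 vector engines of 8 ALUs each). A quick sanity check of that arithmetic, as a sketch; note the 16 Gbps GDDR6 speed used for the bandwidth figure is an assumption on our part, not part of the leak:

```python
# Sanity-checking the leaked Arc Alchemist specs.
# 128 FP32 lanes per Xe core is Intel's published Xe-HPG layout
# (16 XVEs x 8 FP32 ALUs); the GDDR6 per-pin speed is assumed.

FP32_LANES_PER_XE_CORE = 128  # 16 XVEs x 8 FP32 ALUs each

def fp32_cores(xe_cores: int) -> int:
    """Total FP32 cores for a given Xe core count."""
    return xe_cores * FP32_LANES_PER_XE_CORE

def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits * gbps_per_pin / 8

print(fp32_cores(24))          # Arc A750: 3072, matching the leak
print(fp32_cores(8))           # Arc A380: 1024, matching the leak
print(bandwidth_gbs(192, 16))  # A750 at an assumed 16 Gbps: 384.0 GB/s
```

Both leaked FP32 counts fall straight out of the 128-lanes-per-core figure, which lends the rumored 24-core configuration some plausibility.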
Sources: Gamers Nexus, @TheMalcore, via Videocardz

61 Comments on Intel's Arc A750 Graphics Card Makes an Appearance

#2
the54thvoid
Super Intoxicated Moderator
Blaeza: Good or Bad? Place your bets...
If it's £3.57, then it's the best thing ever. If it's £700, it's a steaming pile. Frankly, the price point dictates whether it is good or bad.
#3
BlaezaLite
Probably steaming poop then. I'd get one as long as it's 6700XT levels.
#4
ModEl4
So Resizable BAR is a must, and you probably lose a lot of performance in some old DX11 games (only one game, GTA V, showed this behavior, but it still seems a logical assumption). If you exclude two games, DX11 GTA V, where the RX 6400 is +80% faster, and F1 2022, where the Arc A380 is +22% faster than the RX 6400, Arc comes in around -10% from the RX 6400 at 1080p on average in the remaining games.
So according to this preview, if the RX 6400 is 100%, the A380 looks like 90% (and the GTX 1650 like 103%) in the current driver state and for these games (still too early and too game-dependent; we should certainly wait for testing from TPU and others to get a better overall picture).
Going by that 90% assumption, performance/W could potentially be -5%/-10% from an RX 6500 XT (or -35%/-40% vs the RX 6400), but the testing was based on FurMark, so that's another result you can't really extrapolate much from.
At 1440p the performance gap is smaller, but that is probably due to the memory size and bandwidth disadvantage of the RX 6400/GTX 1650.
I wonder if GamersNexus will cover media engine in a future video.
I also wonder if @W1zzard will have a surprise for us soon, probably not?
#5
usiname
Blaeza: Good or Bad? Place your bets...
1070ti - rx5600xt performance
170-180w consumption
Edit: I will change my prediction to RX 5700 non-XT
#6
BlaezaLite
usiname: 1070ti - rx5600xt performance
Currently on 1660 super so would be very unhappy if that turned out to be the case.
#7
usiname
Blaeza: Currently on 1660 super so would be very unhappy if that turned out to be the case.
That is what I expect. Let's all post our predictions; maybe tomorrow GN will post the review and we can see who is closest.
#8
Harakhti
Blaeza: Probably steaming poop then. I'd get one as long as it's 6700XT levels.
Funny you mention that; my colleagues and I were doing some performance bingo based on what to expect from the specs. We guessed the small card would land around GTX 1050 levels, which it seems to do, and with that in mind we figured the top end would cap out at 6700 XT performance. So I'm hoping the good ol' crystal ball didn't lie too much. ;)
#9
64K
This Arc series has been a disappointment all around. Especially the price for what you get. I don't expect the Arc A750 to be any different.
#10
Dristun
Well, I guess I was wrong to expect the A770 to beat the 3070 Ti. I suppose it's A750 vs 2060S/5700 and A770 vs 2080/5700XT, unless Intel pulls some driver magic out of the hat before release.
Still slower than 1080ti innit?
#11
ravenhold
I'm expecting something around 3060 performance.
#12
usiname
I bet that the A770 will be 5% slower than the 1080 Ti. Let Raja suffer :laugh:
#13
bonehead123
Well, I'm not placin' any bets just yet, but I'll stick with my "till I have one runnin' in my rig, it's STILL just moar intelly-wishy-washy v*A*p*O*u*R*w*A*r*E*z to me" position......:roll:
#14
HisDivineOrder
My guess: 3050-class performance in games, 3070-class performance in benchmarks. Runs hot, chews through power. Drivers fail in three-ish games. Obviously you're paying for the chance to do beta testing.
#15
ModEl4
In the current driver state it seems the Arc A750 will be slower than the RTX 3060 at QHD, probably around -10% or even more, so slower at that resolution than the RX 5700/RTX 2060 Super as well.
#16
john_
ARC series are like UFOs. So many rumors, photographs, stories, but still no first contact for the consumers.
#17
Frick
Fishfaced Nincompoop
usiname: I bet that the A770 will be 5% slower than the 1080 Ti. Let Raja suffer :laugh:
Which would be absolutely fine if the price is right.
#18
ModEl4
Right now, with July's street prices, it seems the A750 could get away with a $299 SRP, but if we're talking about an end-of-September international on-shelf launch, then I certainly don't think $299 will cut it!
#19
R0H1T
Give me one for free, that'll decide my vote :pimp:
#20
64K
john_: ARC series are like UFOs. So many rumors, photographs, stories, but still no first contact for the consumers.
It would be helpful if Intel would send some cards to W1zzard to review. Some of the rumors could be sorted out.
I think I know why they don't though. Because the reviews might make the cards look bad for the money that Intel wants.
#21
Assimilator
64K: It would be helpful if Intel would send some cards to W1zzard to review. Some of the rumors could be sorted out.
I think I know why they don't though. Because the reviews might make the cards look bad for the money that Intel wants.
That's exactly the reason why Intel is only launching the A3 cards in China. They know how bad they are.
#22
cvaldes
64K: It would be helpful if Intel would send some cards to W1zzard to review. Some of the rumors could be sorted out.
I think I know why they don't though. Because the reviews might make the cards look bad for the money that Intel wants.
This makes zero sense.

It's unlikely that Intel is naive enough to think that these graphics cards won't ever be reviewed. At some point tech sites are going to get their hands on them regardless of whether it's a review sample from Intel (or an AIB partner) or a card from the retail marketplace.

If Intel doesn't want a particular CPU, NUC, network card, whatever reviewed, what do you think happens? Do they stick their head in the sand and hope no one notices? NO. This is not Intel's first rodeo, not by a long shot. Intel is not being run by a bunch of snot-nosed 22-year-old interns whose mantra is "fake it until you make it."

Furthermore, if Intel knew the cards were disappointing, wouldn't it be in their best interest to send key reviewers cherry-picked golden samples instead of letting reviewers play the silicon lottery at retail? Or maybe if Intel did send out review samples for presumed stinkers, people would accuse them of that ploy. It has been done countless times before, and not just with PC components.

Even if TPU doesn't review this card someone else will eventually.
#23
Assimilator
cvaldes: This makes zero sense.

It's unlikely that Intel is naive enough to think that these graphics cards won't ever be reviewed. At some point tech sites are going to get their hands on them regardless of whether it's a review sample from Intel (or an AIB partner) or a card from the retail marketplace.

If Intel doesn't want a particular CPU, NUC, network card, whatever reviewed, what do you think happens? Do they stick their head in the sand and hope no one notices? NO. This is not Intel's first rodeo, not by a long shot. Intel is not being run by a bunch of snot-nosed 22-year-old interns whose mantra is "fake it until you make it."

Furthermore if Intel knew the cards were disappointing, wouldn't it be in their best interest to send key reviewers cherry picked Golden Samples instead of letting reviewers play the silicon lottery from retailers? Or maybe if Intel did send out review samples, people would accuse them of that ploy. It has been done before countless times and not just PC components either.

Even if TPU doesn't review this card someone else will eventually.
Except that Intel has done this before. Witness the Core i3-8121U, the only 10 nm 8th-generation part, a 2c/4t CPU with a defective iGPU, a chip so bad that Intel didn't even tell Western reviewers it existed, never mind shipping them review samples. Anandtech had to import a Chinese laptop to get hold of it, and their coverage was, as expected, not positive.

Intel is doing the exact same thing with the Arc A3 series. It's only launching in China, Western reviewers will not get samples. Gamers Nexus has just posted a review of it - they got a card from China - and it's as bad as everyone who knows Intel's history with GPUs has been saying ever since Arc was announced.

Intel is too proud to kill products they know are bad before those products are launched, but they don't want to be confronted with the failure of said products. So the answer is to dump those products in China, so they can tell investors "yes we launched it just like we said we would", and not have to confront the negative reviews from the West. It's the equivalent of playing a game with a 3-year-old, and when they lose, they flip the board and declare "nobody won, therefore I didn't lose". It's embarrassing and pathetic for such a large and powerful company to essentially be run by petulant children, and yet here we are.
#24
cvaldes
Assimilator: Except that Intel has done this before. Witness the Core i3-8121U, the only 10nm 8th generation part, a 2c/4t CPU with a defective iGPU, a chip so bad that Intel didn't even tell Western reviewers that it exists, never mind shipping them review samples. Anandtech had to import a Chinese laptop to get hold of it, and their coverage was, as expected, not positive.

Intel is doing the exact same thing with the Arc A3 series. It's only launching in China, Western reviewers will not get samples. Gamers Nexus has just posted a review of it - they got a card from China - and it's as bad as everyone who knows Intel's history with GPUs has been saying ever since Arc was announced.
Congratulations, you have illustrated some of my points.

First of all, someone is going to get wind of a bad product, get their hands on it, and eventually give a bad review. Gamers Nexus did exactly as I described. And if it hadn't been GN, it would have been someone else including sites that don't publish in the English language (there are lots of those FYI).

And Anandtech has been calling out manufacturers for pisspoor products since the Nineties.

And Arc Graphics is not the same BECAUSE INTEL HAS PROMINENTLY ANNOUNCED THIS PRODUCT LINE.

It's on their corporate website. And it's not just one little tiny webpage buried someplace in the sitemap.

www.intel.com/content/www/us/en/products/details/discrete-gpus/arc.html

Intel is not surreptitiously launching this. Arc Graphics is not being snuck in under the radar. They made a BIG deal about it.

Industry eyes have been following this ever since they announced their intention to get into the discrete GPU market. That was years ago. Designing, manufacturing, and marketing GPUs is hard.

My guess is that Pat Gelsinger knows this first generation is less than stellar and doesn't want to send out samples. And yes, they might limit distribution to certain markets and shunt most of the GPUs to large system builders like HP, Dell, Lenovo, etc. rather than focus on the DIY AIB market (which is in shambles right now).

I doubt that Intel will give up. Eventually they will figure it out. It might require them to poach more engineers from the competition. After all Intel, AMD, NVIDIA, and Apple headquarters are all in about a 10-15 minute radius of each other. Maybe their current staff didn't get the job done but there is nothing that says that they can't add more people to the project.

No one expects Arc Graphics to contribute a sizable portion of the company revenue initially. So if the first generation products don't sell well, it's not like Intel is going to file Chapter 11.
#25
80251
Hmmmmm, "limited edition"... that sounds like it's going to translate to "priced at more than it's worth" (especially if cryptominers begin flogging their dead horses en masse on fleabay).

I'm hoping Intel eventually turns the videocard market into a three ring circus instead of the present duopoly.