# Intel Arc A770



## W1zzard (Oct 5, 2022)

With the amazing-looking Intel Arc A770, the blue team is making a push to offer a capable mid-range graphics card at affordable pricing. Intel is including a lot of modern tech, like AV1 video encoding and hardware-accelerated ray tracing units, in their newest release.



----------



## ZetZet (Oct 5, 2022)

Conclusion, buy an RX 6650 XT. Unless you want a piece of history instead of a graphics card.


----------



## P4-630 (Oct 5, 2022)

Conclusion, my MSI RTX 2070 Super Gaming X Trio might even perform better at 1440p....


----------



## Colddecked (Oct 5, 2022)

ZetZet said:


> Conclusion, buy an RX 6650 XT. Unless you want a piece of history instead of a graphics card.


Lol little harsh. 

$350 for a card that performs as well as a 3060 Ti is not bad on the surface. There are a lot of asterisks though.


----------



## ZetZet (Oct 5, 2022)

Colddecked said:


> Lol little harsh.
> 
> $350 for a card that performs as well as a 3060 Ti is not bad on the surface. There are a lot of asterisks though.


You can get a card that sometimes reaches 3060 Ti levels, or falls way below them. The RX 6650 XT is $299 and way more consistent. I guess you could bet on Intel to improve their drivers and continue support.


----------



## dj-electric (Oct 5, 2022)

ZetZet said:


> Conclusion, buy an RX 6650 XT. Unless you want a piece of history instead of a graphics card.


Noting that Radeon has a sale right now: you can get an RX 6650 XT for $329 on Amazon.


----------



## Dirt Chip (Oct 5, 2022)

The A770 scales beautifully as resolution increases, jumping over NV and AMD.


----------



## Pumper (Oct 5, 2022)

Looks like drivers are still not there to give consistent performance, but the fact that it's beating the 3060Ti in the heavy games and has better RT performance than RDNA2 is impressive as far as I'm concerned.

Hopefully the idle power draw is a software issue and can be fixed.


----------



## usiname (Oct 5, 2022)

Dirt Chip said:


> The A770 scales beautifully as resolution increases, jumping over NV and AMD.


Yes, if you're planning to watch a PowerPoint presentation.


----------



## KrazedOmega (Oct 5, 2022)

On page one. "Intel is pricing the 16 GB version of the A770 at $50."


----------



## W1zzard (Oct 5, 2022)

KrazedOmega said:


> On page one. "Intel is pricing the 16 GB version of the A770 at $50."


Thanks, fixed. There will be more typos, I just finished writing up all these articles.


----------



## Chrispy_ (Oct 5, 2022)

Still reading through the review, but 1080p consistency is all over the place, and too many games struggle for this to be called a true 1440p card.
I suspect driver immaturity is to blame for a lot of this, and potentially the Arc A750 might be the card to get if you play the longer game and wait for performance improvements via drivers.

Honestly, that's not bad for their first real dGPU. If it had released in 2020 as initially planned, Intel would have made a killing but this 18 month delay sure has hurt its prospects. Here's hoping they don't give up because the second gen is likely to be much better than this - they're likely learning _a lot_ from their first attempt's mistakes.

The thing that stops me from buying one is that although it's clear the performance is there in many titles, there are plenty of instances where it's outperformed by half-decade-old cards that you can pick up on eBay for $100 now. Until those drivers are good enough that you can expect proper utilisation and performance in at least 90% of games, you can't look at its average performance, because you'll almost certainly have several games you want to play where it runs like a dog.

The average performance is somewhere between a 6600 and 6600XT but you'll only care about performance when there's not enough of it, at which point you'd have been much better off buying an RX 6600 for $224.99:





When a $224 card (it's definitely not hard to find sub-$250 RX6600 cards somewhere on any given day) is consistently providing a superior overall experience, it's really hard to justify paying $290 or $350 for an Arc card. You're effectively paying now for _maybe_ 6700XT performance 12 months down the line but if you're only worried about performance 12 months* from now, buy an RX7600 or RTX4060 instead, which will presumably run circles around Arc, the 3060, and the 6600-series.

* - a number I pulled out of my ass as a very rough estimate of how long I think it'll take Intel to deal with the worst-performing games through driver updates


----------



## Solid State Brain (Oct 5, 2022)

The 44W idle power consumption, even with a single display connected, is a deal breaker for me. That must also be the reason why there's no fan-stop feature.


----------



## wolf (Oct 5, 2022)

Well done Intel honestly, great to have another player in the game and a very respectable product for a first entry given the asking price imo.

Particularly interesting to me, and something I'd love to drill into more, is the RT performance; check out Metro Exodus EE... and XeSS seems very performant with the XMX instruction set, bloody well done.

Color me impressed.


----------



## ZetZet (Oct 5, 2022)

Chrispy_ said:


> I suspect driver immaturity is to blame for a lot of this, and potentially the Arc A750 might be the card to get if you play the longer game and wait for performance improvements via drivers.


Considering the card is like 1.5 years late you might be doing a lot of waiting.


----------



## Dirt Chip (Oct 5, 2022)

usiname said:


> Yes, if you planning to watch powerpoint presentation


1080p:
A770 - 100%
6600 XT - 109%
3060 Ti - 125%

1440p:
6600 XT - 97%
A770 - 100%
3060 Ti - 117%

4K:
6600 XT - 80%
A770 - 100%
3060 Ti - 112%

So, ?
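Read as scaling ratios, the figures above make the point directly. A minimal sketch (the numbers are the relative-performance percentages quoted in this post, with the A770 fixed at 100% at every resolution):

```python
# Relative performance vs. the A770 (A770 = 100% at each resolution),
# taken from the percentages in this post.
rel_6600xt = {"1080p": 109, "1440p": 97, "4K": 80}
rel_3060ti = {"1080p": 125, "1440p": 117, "4K": 112}

def a770_catchup(rel):
    """Factor by which the A770 closes the gap going from 1080p to 4K
    against a card with the given relative performance."""
    return rel["1080p"] / rel["4K"]

print(f"vs 6600 XT: {a770_catchup(rel_6600xt):.2f}x")  # ~1.36x
print(f"vs 3060 Ti: {a770_catchup(rel_3060ti):.2f}x")  # ~1.12x
```

In other words, relative to the 6600 XT the A770 gains roughly 36% moving from 1080p to 4K; that is the "scales better with resolution" claim in numbers.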


----------



## birdie (Oct 5, 2022)

To sum it up:

- Idle power consumption is crazy and has to be fixed
- Power efficiency is a lot lower than the GeForce 30 series, despite the latter being produced on a worse process
- Pricing is good but not excellent
- Drivers are a concern
- RTRT performance is actually really good
- XeSS is great but needs minor fixes and adjustments
- Excellent open-source Linux drivers
- In some cases, horrible performance in DirectX 9 titles (not limited to CS:GO); Intel cards are meant to run Vulkan/DirectX 12 titles

For a first-gen product it's actually great. Hopefully people will start buying Intel GPUs, so the company won't give up on its graphics division, and it will hopefully result in saner/lower prices for NVIDIA _and_ AMD, both of which have seemingly lost touch with reality pricing-wise.


Spoiler: CSGO performance in 1080p


----------



## Dirt Chip (Oct 5, 2022)

birdie said:


> To sum it up:
> 
> Idle power consumption is crazy and has to be fixed
> Power efficiency is a lot lower than the GeForce 30 series despite the latter being produced using a worse process
> ...


Yep.
Bottom line: don't buy; wait for Arc second gen (Battlemage).


----------



## gridracedriver (Oct 5, 2022)

DOA, almost 2 years late to market.
Low perf/watt; OK, they're good at ray tracing, but in 2 or 3 months they'll be surpassed. The only strong point is the price; indeed, Intel is giving them away.


----------



## Flanker (Oct 5, 2022)

Not there yet. It's something to improve upon, I guess.


----------



## usiname (Oct 5, 2022)

Dirt Chip said:


> 1080:
> A770 - 100%
> 6600XT - 109%
> 3060TI - 125%
> ...


48 fps? For $40 more you can get a 6700 XT: 15% faster at 4K, 25% faster at 1440p and 30% faster at 1080p.
The 6650 XT is faster than the 6600 XT, costs $299, still has better performance/$ at 4K, and smashes the A770 at 1080p.
Should I mention the 40% worse performance per watt?


----------



## theGryphon (Oct 5, 2022)

ZetZet said:


> Conclusion, buy an RX 6650 XT. Unless you want a piece of history instead of a graphics card.



Good observation. 20 yrs later these will be collectors' items selling for 100 times the MSRP.


----------



## BSim500 (Oct 5, 2022)

Colddecked said:


> Lol little harsh. 350 for a card that performs as good as a 3060ti is not bad on the surface.  There's alot of asterisks though.


Depends on the game. Consistency is a serious issue with these it seems...


----------



## dirtyferret (Oct 5, 2022)

Am I going to go out and buy one? No.
But it's a good first entry into a two-party market that is notorious for price fixing and ever-increasing prices.

Let's hope Intel keeps at it so the answer to the question above becomes a "yes" or "sure, why not".


----------



## usiname (Oct 5, 2022)

theGryphon said:


> Good observation. 20 yrs later these will be collectors' items selling for 100 times the MSRP.


Like the $25 Intel i740 on eBay?


----------



## Legacy-ZA (Oct 5, 2022)

This is much better than I expected, well done Intel, I am looking forward to what you will bring next. 

Next time I upgrade, I might just choose you.


----------



## Dirt Chip (Oct 5, 2022)

usiname said:


> 48 fps? For $40 more you can get a 6700 XT: 15% faster at 4K, 25% faster at 1440p and 30% faster at 1080p.
> The 6650 XT is faster than the 6600 XT, costs $299, still has better performance/$ at 4K, and smashes the A770 at 1080p.
> Should I mention the 45% worse performance per watt?
> View attachment 264294


Cool, but what does that have to do with what I wrote?
The A770 scales better than AMD and NV as resolution increases. That's a fact.
My opinion: don't buy any Arc first-gen card.
As for your 6650 XT / 6700 XT input: wonderful.


----------



## LFaWolf (Oct 5, 2022)

Not bad for a first card. It could be a value card. I actually like the mismatched power plug color.


----------



## Dirt Chip (Oct 5, 2022)

Wonderful card, great achievement. Wait for the flash sale at 50% off in a few months.


----------



## Count von Schwalbe (Oct 5, 2022)

Dirt Chip said:


> A770 scale beautiful as resolution increase, jumping over NV and AMD.


Not gonna disagree, it looks like Raja gave it some actual memory bandwidth. The issue is that the raster performance is a little low to be gaming at 4K unless you're using an older game, which "hopefully" supports DX12, as the drivers are mostly focused on current games.

@W1zzard 


> Taking the Arc A770 is slightly more complicated than cards from other brands


Missed an "apart" it looks like


----------



## TheinsanegamerN (Oct 5, 2022)

I was hoping that in the month+ since the A380 came out, Intel would smooth the drivers out a little.

Looking at performance, it's still all over the place. How many years will it take, I wonder? Also, so heavily dependent on ReBar...

Does TPU have any plans to do a re-review in, say, 3-6 months' time to see if drivers have improved at all?


birdie said:


> To sum it up:
> 
> Idle power consumption is crazy and has to be fixed
> Power efficiency is a lot lower than the GeForce 30 series despite the latter being produced using a worse process
> ...


IIRC the A380 that Phoronix tested on Linux also ran terribly; the drivers are open source, but not quite the same branch as the iGPU drivers.


----------



## gridracedriver (Oct 5, 2022)

_"Not bad for a first release"_ BUT _"They are not cards I would buy: immature drivers, low perf/watt, idle power, pricing is good but not excellent"_
etc...
This only means one thing: good enough on paper, but nobody buys → FAIL


----------



## TheinsanegamerN (Oct 5, 2022)

gridracedriver said:


> _"Not bad for a first release"_ BUT _"They are not cards I would buy: immature drivers, low perf/watt, idle power"_ etc.
> 
> This only means one thing: good enough on paper, but nobody buys → FAIL


Honestly, if these had come out during the great GPU drought of 2020, Intel would have made bank.

Now though, just being there isn't enough. Hopefully Intel doesn't abandon the product...


----------



## ZetZet (Oct 5, 2022)

theGryphon said:


> Good observation. 20 yrs later these will be collectors' items selling for 100 times the MSRP.


Highly doubt it considering both AMD and Nvidia have had "legendary" cards which are paperweights these days and they do not cost a lot. Not even above MSRP, ignoring inflation.


----------



## Space Lynx (Oct 5, 2022)

I welcome the extra competition, it's good for the consumer.


----------



## gridracedriver (Oct 5, 2022)

TheinsanegamerN said:


> Honestly, if these had come out during the great GPU drought of 2020, Intel would have made bank.
> 
> Now though, just being there isn't enough. Hopefully Intel doesn't abandon the product...


True, but they came out too late, imho.


----------



## Count von Schwalbe (Oct 5, 2022)

ZetZet said:


> Highly doubt it considering both AMD and Nvidia have had "legendary" cards which are paperweights these days and they do not cost a lot. Not even above MSRP, ignoring inflation.


Depends if ARC is canceled or not. A working Larrabee card just sold for over $5k.


----------



## ZetZet (Oct 5, 2022)

Count von Schwalbe said:


> Depends if ARC is canceled or not. A working Larrabee card just sold for over $5k.


That was a prototype, there aren't many of those, possibly only one. This won't be that.


----------



## gridracedriver (Oct 5, 2022)




----------



## ZetZet (Oct 5, 2022)

gridracedriver said:


> View attachment 264300


Sad part is that this difference is clearly visible in the synthetic benchmarks. Real-world performance is just reaaaaaaaaaaaaally bad. If you were to believe in Intel, then these cards would probably get faster in time; that's if they don't cancel the whole project.


----------



## HD64G (Oct 5, 2022)

Unfortunately for Raja, AMD, with their GPU pricing lately, buried his newborn just after it was declared alive.


----------



## W1zzard (Oct 5, 2022)

Count von Schwalbe said:


> Missed an "apart" it looks like


Fixed, thanks



TheinsanegamerN said:


> I was hoping that in the month+ since the A380 came out, Intel would smooth the drivers out a little.


They made huge improvements, actually. There's still a lot to do of course, but Intel is VERY actively hunting down those bugs. They know that they have to get their software improved.


----------



## mindfractal (Oct 5, 2022)

I wonder whether the FP64 (double) performance can really be 4.301 TFLOPS (1:4), as shown in the database?
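For what it's worth, that database figure looks like a straight 1:4 derivation from the FP32 peak. A minimal sketch, assuming the usual peak-throughput formula (shader count × 2 FMA ops per clock × boost clock) and the A770's listed 4096 shaders at a 2100 MHz boost; the formula and the 1:4 ratio are assumptions, not confirmed hardware behavior:

```python
# Hypothetical reconstruction of the database numbers; both the peak formula
# and the 1:4 FP64 ratio are assumptions taken from typical spec-sheet math.
shaders = 4096
boost_hz = 2.1e9  # 2100 MHz boost clock

fp32_tflops = shaders * 2 * boost_hz / 1e12  # 2 ops/clock from fused multiply-add
fp64_tflops = fp32_tflops / 4                # assumed 1:4 FP64:FP32 rate

print(f"FP32: {fp32_tflops:.3f} TFLOPS")  # 17.203 TFLOPS
print(f"FP64: {fp64_tflops:.3f} TFLOPS")  # 4.301 TFLOPS
```

That lands exactly on the 4.301 TFLOPS in the database, so the entry is most likely derived from the spec sheet rather than measured.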


----------



## Blueberries (Oct 5, 2022)

What I found most interesting is the ray tracing performance: Intel has already caught up in this area and has proven they can be competitive in the GPU space.

Efficiency leaves much to be desired, I'm not sure how much of that is hardware and how much of it is driver level optimizations that need to occur.

I won't be in the market for an Intel GPU this tier, but I'm very interested to see where they go with Battlemage and Celestial.


----------



## TheDeeGee (Oct 5, 2022)

So at 1080p it's as fast as my GTX 1070... GG.


----------



## RandallFlagg (Oct 5, 2022)

This is better than expected.  The fact it bumps up in ranking as resolution increases, and in some cases matches a 3070, shows this is a lot of hardware for the money that is just held back in some areas by drivers.

At 1440P this is showing +12% FPS over the 3060 on average.  In places where the drivers are optimized, it's trading blows with a 3060 Ti and 3070.  

I call this a win for a premium card at $350 when most 3060s are still selling for $350. As drivers improve, this A770 will probably morph into a 3060 Ti or 3070.


----------



## Anarkie (Oct 5, 2022)

The Resizable BAR requirement is the biggest killer for me. I'm looking for cheap upgrades for a few computers, and that requirement rules it out. Hopefully they can either remove that requirement in some way, or Nvidia/AMD will reduce their prices a little more.

Right now, I'm eyeing the RX 6600/XT for a few upgrades, but still waiting just a bit longer to see if the prices get a little better by the end of the year.


----------



## GunShot (Oct 5, 2022)

Meanwhile, ole lying / hateful conspiracy theorists, TweakTown, etc., are all mad and extremely salty right now that Intel refused to send them a review sample.


----------



## vmarv (Oct 5, 2022)

theGryphon said:


> Good observation. 20 yrs later these will be collectors' items selling for 100 times the MSRP.


For that to happen, the item should be an icon of its times, like the Neo Geo. 
No chance in this case.


----------



## Dirt Chip (Oct 5, 2022)

The most important thing in this launch is to see that the drivers improved, a lot.
Keep up improvement like this, and by the time Arc second gen arrives, Intel will have a truly worthy product to offer (subject to cost, of course).


----------



## TheinsanegamerN (Oct 5, 2022)

GunShot said:


> Meanwhile, ole lying / hateful conspiracy theorists, TweakTown, etc., are all mad and extremely salty right now that Intel refused to send them a review sample.


Is the conspiracy in the room with you right now?



Dirt Chip said:


> The most important thing in this launch is to see that the drivers improved, a lot.
> Keep up improvement like this, and by the time Arc second gen arrives, Intel will have a truly worthy product to offer (subject to cost, of course).


If they can get around requiring ReBar and make a low-profile card that's faster than a 1650 Super, I'd jump on it. The complete and utter lack of an upgrade path is annoying AF.


----------



## kapone32 (Oct 5, 2022)

What this proves is that we definitely need a standard for Ray Tracing and FSR, DLSS, XeSS as soon as possible.


----------



## Fleurious (Oct 5, 2022)

Better than I was expecting.  Hopefully this is the footing they need to push into the discrete GPU market.   Looking forward to seeing their next line of GPUs.


----------



## thegnome (Oct 5, 2022)

About what I expected... If only they had more Xe cores (like 36-40) along with a less "max power" approach.


----------



## MrDweezil (Oct 5, 2022)

They'd have sold as many of these as they could make if they had released 6 months ago. But with current pricing I think most people would rather spend a little less on a 6600/6650, or spend a little more on a 6700, which you can find for $380-$390 now.


----------



## Solid State Brain (Oct 5, 2022)

I missed this initially, it should explain why idle power consumption is so high.





I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.


----------



## chr0nos (Oct 5, 2022)

To all those bashing the drivers: try the RX 5700 XT launch drivers, now that was FUN.


----------



## ZetZet (Oct 5, 2022)

Solid State Brain said:


> I missed this initially, it should explain why idle power consumption is so high.
> 
> 
> I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.


Looks like a band-aid fix to some issue they had, considering the state of the drivers. Not that it helped much.


----------



## GunShot (Oct 5, 2022)

kapone32 said:


> What this proves is that we definitely need a standard for Ray Tracing and FSR, DLSS, XeSS as soon as possible.


Won't happen, and it doesn't need to happen, because it would severely limit innovation.

We have XeSS, FSR, accelerated RT cores, etc. now ONLY because NVIDIA is LEADING the market (not jelly about others' innovations) and forcing its rivals to do better, catch up, and stop being lazy and/or inferior.


----------



## Testsubject01 (Oct 5, 2022)

> Decent midrange performance
> Reasonable pricing
> Support for DirectX 12 and hardware-accelerated ray tracing
> Better RT performance than AMD, slightly worse than NVIDIA
> ...



Might we add “Dual-slot card” as a Pro? Especially in light of the absolute behemoths called the 40-series.

B2T:
Nice improvements to the drivers since the introduction of the Arc A380, and very promising performance (raster and ray tracing) for a first generation, even though it was delayed quite a bit.
The price could be better too; $275-$300 would have been excellent, $350 is too much for what it has to offer at the moment.

Battlemage might actually take a real stab at the enthusiast level, and then we'll have 3 options throughout the line-up. Exciting prospects ahead!


----------



## birdie (Oct 5, 2022)

kapone32 said:


> What this proves is that we definitely need a standard for Ray Tracing and FSR, DLSS, XeSS as soon as possible.



Intel has said they are working with Microsoft to create an API for that, where you can plug in any implementation that works on your GPU and choose it at runtime. I sure hope it comes sooner rather than later, because game developers are currently wasting a ton of resources implementing all three of them.

When it's done, you just go to the game settings, choose DLSS/FSR/XeSS and the amount of sharpening, and you're all set.


----------



## N3utro (Oct 5, 2022)

I was not expecting the A770 to run faster than a 3060 Ti in several games and benchmarks, and this even on first-release drivers. In other games, though, the performance is concerning because it's way lower. What would make sense is that the drivers are holding back the real performance in these games, and hopefully it will be on par with the other benchmarks after some updates.

I disagree with W1zzard's conclusion, though: $350 for a card whose performance is tied with a 3060 Ti sounds like a bargain. I feel the overall result showing the 3060 Ti 10% faster is probably due to the drivers, because in other benchmarks you clearly see that the A770 is the same or even better. But even if I disagree with the conclusion, thank you W1zzard for the review!

I agree with other users here who request that TPU review this card again after some time, when better drivers come out, because with fresh first-release drivers it's pretty certain we are not seeing the A770's full performance potential.

I must say I'm pretty tempted to buy one, even knowing that early adopter = beta tester. But if the promises of the card come true and you get 3060 Ti performance with 16 GB of VRAM for $350, then hell yeah!

Who cares if the card is not as power efficient as others? At this TDP it's not like it heats up like crazy anyway, and I'd much rather have a higher-TDP card with better performance for $350 than the same performance with a better TDP at $450.


----------



## Chrispy_ (Oct 5, 2022)

birdie said:


> RTRT performance is actually really good


Yup, pleasantly surprised by that. 

They've done a better job than AMD did with RDNA2, though arguably two years after AMD and with a die that is larger and has more transistors than even Navi 22 found in the much faster 6750XT.

Still, it doesn't suck - and that's what we need for healthy competition in DXR titles.


----------



## kapone32 (Oct 5, 2022)

GunShot said:


> Won't happen and it doesn't need to happen because it will extremely limit innovation.
> 
> We have XeSS, FSR, accelerated RT cores, etc. now ONLY due to NVIDIA is LEADING (not jelly about others innovations) the market and forcing its rivals to do better and catch up and stop being lazy and/or inferior.


That is your opinion. One could also say that Nvidia focused and influenced the market to think that "rasterization" was no longer enough by pumping Ray Tracing but needed to develop DLSS to counteract the performance hit. Yes AMD had to respond but FSR is much more group friendly than DLSS. XeSS seems to be at least just as good as both. Are you telling me that combining that technology would not benefit the user?


----------



## mechtech (Oct 5, 2022)

What’s with the abysmal borderlands performance?!?!



ZetZet said:


> Conclusion, buy an RX 6650 XT. Unless you want a piece of history instead of a graphics card.


But who wouldn’t want a piece of history???


----------



## Count von Schwalbe (Oct 5, 2022)

kapone32 said:


> That is your opinion. One could also say that Nvidia focused and influenced the market to think that "rasterization" was no longer enough by pumping Ray Tracing but needed to develop DLSS to counteract the performance hit. Yes AMD had to respond but FSR is much more group friendly than DLSS. XeSS seems to be at least just as good as both. Are you telling me that combining that technology would not benefit the user?


My personal opinion is that the Arc team should have worked with the Radeon team to come up with an industry standard for AI (tensor) cores and upscaling, and revised OpenCL to actually compete with CUDA.

With Nvidia dominating the market like they do, splintering the alternatives will only hurt both of them.


----------



## Xex360 (Oct 5, 2022)

Beautiful paperweight.
More seriously, it is impressive in RT, showing that nVidia's RT cores aren't all that impressive.
The card isn't very good: it is similar to the 3070 in size, yet it is miles behind; Intel is losing a lot on each card.


----------



## BSim500 (Oct 5, 2022)

mechtech said:


> What’s with the abysmal borderlands performance?!?!


I asked on the other thread, and apparently the GPU is very good at playing 234x DX12 games, but severely gimped in the many thousands of DX11-or-older games... I'm in the market for a low-end GPU upgrade, but this seems far too variable vs the RX 6600 / RTX 3060 for playing a mix of old and new games...


----------



## Kissamies (Oct 5, 2022)

Better than I thought. For their first generation of discrete cards in 25 years, this looks to be a capable card, at around 1080 Ti/2080 levels of performance.


----------



## efikkan (Oct 5, 2022)

A very big negative was forgotten: no native DirectX 9 support. So gamers have to rely on emulation for half of their game collection.



Legacy-ZA said:


> This is much better than I expected, well done Intel, I am looking forward to what you will bring next.


So what was your expectation then?
It's not that long ago that the A770 was supposed to be an RTX 3060 Ti competitor; now, in reality, it's more of an RTX 3060 competitor.



Dirt Chip said:


> A770 scale better than AMD and NV as resolution increase. That's a fact.


So what? The cards are barely suitable for 1440p, so how are you going to enjoy that "scaling"? 



N3utro said:


> I was not expecting the A770 to run faster than a 3060 Ti in several games and benchmarks, and this even on first-release drivers. In other games, though, the performance is concerning because it's way lower. What would make sense is that the drivers are holding back the real performance in these games, and hopefully it will be on par with the other benchmarks after some updates.


People need to stop making excuses for bad performance. Bugs will be fixed, but the overall performance characteristics are unlikely to change. For there to be a significant change in performance, they would have to make major changes to one of their implementations of the graphics APIs, and all the games using it would be affected accordingly.



N3utro said:


> I feel like the general performance, which shows that the 3060 Ti is 10% faster, is probably due to the drivers, because in other benchmarks you clearly see that the A770 is the same or even better.


What's the logic here?
The general performance is the average of all the games. There is no reason to conclude the driver is holding it back based on this. It seems like you are looking at outliers and expecting everything to scale the same. This is kind of the point of averaging a large sample set; it eliminates the cherry-picked cases.



N3utro said:


> I agree with other users here who request that TPU review this card again after some time, when better drivers come out, because with fresh first-release drivers it's pretty certain we are not seeing the A770's full performance potential.


This happens periodically anyway.
They've had this card in testing for ~6 months; there is no reason why there should be a major untapped performance potential here.



N3utro said:


> I must say I'm pretty tempted to buy one, even knowing that early adopter = beta tester. But if the promises of the card come true and you get 3060 Ti performance with 16 GB of VRAM for $350, then hell yeah!


At this price I would rather have an RTX 3060.


----------



## mechtech (Oct 5, 2022)

BSim500 said:


> I asked on the other thread and apparently the GPU is very good at playing 234x DX12 games, but severely gimped on many thousands of DX11 API or less games... I'm in the market for a low-end GPU upgrade but this seems far too variable vs RX 6600 / RTX 3060 for playing a mix of old / new games...


No doubt. Wow. I think my library is probably 90% not DX12.


----------



## Xebec (Oct 5, 2022)

Solid State Brain said:


> I missed this initially, it should explain why idle power consumption is so high.
> 
> View attachment 264306
> I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.


This is my main concern with the card: 44 W idle power is huge nowadays. I hope it's something they can fix in software. Since you can't OC the memory clock, I'm wondering if there's something preventing stability at different VRAM clocks.

I seem to remember some previous-gen AMD or Nvidia cards having problems a while ago with memory clocks not being stable below a certain frequency.


----------



## Hyderz (Oct 5, 2022)

Nice review! Expected results, and somehow this reminded me of the first-gen Ryzen CPUs at launch, offering reasonable price and reasonable performance.


----------



## lexluthermiester (Oct 5, 2022)

W1zzard said:


> Intel Arc A770 Review - Finally a Third Competitor​


I love this headline!

I find it very interesting that this card is trading blows with my RTX 2080 and seems to handily beat the RTX 3060, for less money!


----------



## spnidel (Oct 5, 2022)

could you please add a d3d9 performance per dollar chart? I'd love to see it be 0 FPS/$


----------



## r9 (Oct 5, 2022)

For where they priced it, from a customer standpoint it's not bad; from Intel's standpoint it's a complete fail.




Needing twice the memory bus width and transistor count to achieve the same performance is absolutely terrible.
I would like nothing more than for them to stick around in the GPU space, but if they don't make huuuuge improvements with the next gen (if we ever get another one), this is unsustainable.


----------



## The Quim Reaper (Oct 5, 2022)

This isn't a bad first showing, IMO.

The 770 is as fast as my old RTX 2080, which has kind of surprised me.


----------



## shovenose (Oct 5, 2022)

Someone please help me.

I play GTA V at 1920x1080 and MSFS 2020 at 3840x1080. Really, those are the only two games I care about; the rest, if they work at least OK, that's good enough for me. My monitor's only 60 Hz, but I play GTA with the settings maxed out and it's already fine. I can't manage 60 FPS in MSFS even at medium with my RX 6600, though.

I don't see benchmarks for these two games.
Can someone please do a comparison of the A770 16 GB to the RX 6600 8 GB?

I want to know if it’s worth upgrading. Thank you.


----------



## GunShot (Oct 5, 2022)

kapone32 said:


> That is your opinion. One could also say that Nvidia focused and influenced the market to think that "rasterization" was no longer enough by pumping Ray Tracing but needed to develop DLSS to counteract the performance hit. Yes AMD had to respond but FSR is much more group friendly than DLSS. XeSS seems to be at least just as good as both. Are you telling me that combining that technology would not benefit the user?


...and that's your "opinion" too. 

See how that works on the other side of the fence? Facts have also proven that everyone who screams "this needs to be standard for everyone" about DLSS, NVIDIA Broadcast, or just ANY industry-leading NVIDIA tech is usually a user who has NVIDIA competitors' hardware or still has a relic NVIDIA GPU. I wonder why there aren't any screams from users for standardizing any of Intel/AMD's industry game-changers?!

Interesting! /s


----------



## ZetZet (Oct 5, 2022)

shovenose said:


> I don’t see benchmarks for these two games.
> can someone please do a comparison of the A770 16GB to the RX6600 8GB?
> 
> I want to know if it’s worth upgrading. Thank you.


GTA V is a DX11 game; it's going to be much worse on the A770. Also, what kind of an upgrade is this? Same generation, same price tier, and (mostly) the same performance tier. Wait for RDNA3 and get a next-gen graphics card in the middle of 2023.


----------



## evernessince (Oct 5, 2022)

Hyderz said:


> Nice review! Expected results and somehow this reminded me of first gen ryzen cpu when it first launched offering reasonable price and reasonable performance



When Ryzen first launched, AMD was offering an 8-core for the same price as Intel's 4-core. Suffice it to say, it won in multi-threaded benchmarks big time.

I don't feel like this GPU is providing the same level of value.


----------



## shovenose (Oct 5, 2022)

ZetZet said:


> GTA V is a DX11 game; it's going to be much worse on the A770. Also, what kind of an upgrade is this? Same generation, same price tier, and (mostly) the same performance tier. Wait for RDNA3 and get a next-gen graphics card in the middle of 2023.


What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.


----------



## W1zzard (Oct 5, 2022)

Solid State Brain said:


> I missed this initially, it should explain why idle power consumption is so high.
> 
> View attachment 264306
> I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.


It seems they designed the GPU with fixed memory clock in mind, which is why there's no memory OC either. On all hardware that I'm aware of this is purely a firmware/software limitation. The biggest issue is changing the frequency without things breaking during the switch, so it's not something trivial. I confirmed with them that they will look into memory OC for next-gen



N3utro said:


> I disagree with the conclusion from Wizzard though: $350 for a card whose performance is tied with a 3060 Ti sounds like a bargain. I feel like the overall numbers showing the 3060 Ti 10% faster are probably down to drivers, because in other benchmarks you clearly see the A770 matching it or even doing better. But even if I disagree with the conclusion, thank you Wizzard for the review!


Good. Read my reviews, look at the data, come to your own conclusions, don't parrot the opinion someone else feeds you



N3utro said:


> I agree with other users here requesting that TPU review this card again after some time, when better drivers come out, because with fresh first-release drivers we are pretty surely not seeing the A770's full performance potential.


The A770 will be part of my test group for the foreseeable future, and I update that with new drivers and new games every few months. Whether I'll make a separate article depends on what they can achieve in terms of product improvements


----------



## ZetZet (Oct 5, 2022)

shovenose said:


> What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.


If you need an improvement now then the only option is to step up to a higher tier. If you want gaming performance Intel Arc is the LAST place you would ever want to look. The issues with drivers pretty much make them unbuyable, especially if you are unhappy now.


----------



## Jism (Oct 5, 2022)

It took one or two respins, which caused the 18-month delay.

I wouldn't say it's a bad performer; it sits nicely in the middle. But that idle power consumption, and the beta-quality drivers... I guess if you need a "quick" esports machine for 1080p, this is an OK card. But next-gen AMD and Nvidia will make Arc look stupid.


----------



## Count von Schwalbe (Oct 5, 2022)

shovenose said:


> What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.


MSFS 2020 does support DX12 and (imo) will likely get XeSS at some point, way down the line. It could be better, but unless it is running out of VRAM (unlikely), it will probably not show a large improvement.


----------



## Makaveli (Oct 5, 2022)

Not bad for a first attempt; it just still needs lots of work on drivers. And the timing... well, both NV and AMD have a new gen coming out in weeks.


----------



## Chrispy_ (Oct 5, 2022)

Xex360 said:


> Intel is losing a lot on each card.


Yeah, it's going to be an acceptable loss (hopefully) to gain marketshare and experience in the GPU business. In terms of die size and transistor count, it's not a lot lower than Navi 21, and it's significantly higher than Navi 22.

You can't compare die size with Ampere as TSMC6 is *very *different to Samsung 8nm, but in terms of transistor count, VRM complexity, and bus width (which affects PCB costs) - it's definitely costing Intel more than a 3070Ti is costing Nvidia. Don't forget that TSMC6 is a more expensive node than Samsung 8nm too, and I've not even accounted for that.


----------



## neatfeatguy (Oct 5, 2022)

They're not bad, but they're not that good, either.

Some ups and downs with performance consistency, but as others have mentioned, the power consumption is a bit of a glaring issue and hopefully they can get that resolved.

Here's hoping they can fix issues with future drivers and future builds. I'll be patiently waiting and hoping we keep a third party in this GPU race.


----------



## ToTTenTranz (Oct 5, 2022)

Looks like there's very good potential for competitive performance/power ratio when we look at properly optimized games like Metro Exodus with raytracing.
A year from now, we could be looking at the A770 competing with the 3070 and 3070 Ti all around, which is pretty great considering Intel is a newcomer.




r9 said:


> For where they priced it, from a customer standpoint it's not bad; from Intel's standpoint it's a complete fail.
> View attachment 264313
> It needs twice the memory width and transistor count to achieve the same performance, which is absolutely terrible.
> I would like nothing more than for them to stick around in the GPU space, but if they don't make huuuuge improvements with the next gen (if we ever get another one), this is unsustainable.


Why would you compare transistor counts between a fully enabled Navi 23 and a cut-down G10? The A750 isn't using a bunch of those transistors anyway.
A better comparison would be the GA104 with 17B transistors, because that way we're comparing GPUs with similar hardware capabilities (dedicated tensor cores, larger RT units with more accelerated stages, etc.). In this case Intel is spending a bunch more transistors on the big 16 MB L2 cache inside the chip, whereas Nvidia depends on more expensive GDDR6X outside the chip.
The only obvious loss on Intel's side is die size (transistor density), but that just goes to show how Intel does have room to grow.


----------



## Sithaer (Oct 5, 2022)

BSim500 said:


> I asked on the other thread and apparently the GPU is very good at playing 234x DX12 games, but severely gimped on many thousands of DX11 API or less games... I'm in the market for a low-end GPU upgrade but this seems far too variable vs RX 6600 / RTX 3060 for playing a mix of old / new games...



Overall the card itself is better than I expected for a first real try, plus I really like the design; RT is also surprisingly decent.

But since I'm also a variety gamer who plays both new and older games, not knowing what it's gonna do when I start up whatever game would bother me a lot._ 'That and I do have 1k+ hours in BL 3..'_
I bought a second-hand RTX 3060 Ti about 3 weeks ago and so far I'm very satisfied with it. I did consider waiting for Intel's offerings, but honestly I got tired of waiting, and the prices are always questionable in my country anyway.

Only time will tell how these cards age vs the 3060 Ti, though I kinda expect the 770 16GB model to end up being the better card in the long run._ 'I do intend on keeping this 3060 Ti for a long time unless it dies on me'_


----------



## shovenose (Oct 5, 2022)

ZetZet said:


> If you need an improvement now then the only option is to step up to a higher tier. If you want gaming performance Intel Arc is the LAST place you would ever want to look. The issues with drivers pretty much make them unbuyable, especially if you are unhappy now.



More than anything I'm concerned I'll have the same issue as I did with the ASRock Arc A380 where it didn't output to 3/4 monitors I tried it on across 3 different builds/PCs. And the one I finally got it to post/boot on, after installing the driver and rebooting, stopped working as well. So I sent it back for RMA. My RX6600 isn't flawless, there are some driver issues (GTA crashes if I load into Online directly, for it to work I have to load into Story Mode then switch to Online) and also I can't play some games full screen on monitor one with YouTube playing on the second monitor because it'll lock up occasionally. Not hardware issues, definitely AMD driver issues, and I've tried everything to resolve it. I would upgrade to an RX6700XT but I'm leery of having the same issues. That's why the A770 might be a worthy gamble.

Edit: also, Eyefinity is broken, there is no way to compensate for bezels like on NVIDIA Surround. There is a way to get to the old "Eyefinity Pro" control panel to try to do it but while it opens it's totally broken. Does Intel Arc have this?


----------



## Luminescent (Oct 5, 2022)

The lesson to be learned from this is that brute force is not going to cut it.
I like that Intel was very humble all the way and didn't exaggerate expectations; with proper drivers the A770 can equal the RTX 3070/6700 XT, maybe in a year or two.


----------



## FeelinFroggy (Oct 5, 2022)

I won't buy the card, but I hope people will. It would be nice to have another player in the market driving competition.


----------



## vmarv (Oct 5, 2022)

shovenose said:


> What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.


MSFS is Microsoft Flight Simulator, right?
NVIDIA showed this game as an example of the better performance of the 40-series cards with DLSS 3. If you love the game, why on earth would you pick a different brand and something without their technology? Just wait some months, wait to see if the 4060 has the specs leaked recently, and then read the reviews to make a good decision.
Intel just joined the desktop GPU market and their drivers are a mess. Also, prices are still high. Wait!


----------



## ZetZet (Oct 5, 2022)

shovenose said:


> More than anything I'm concerned I'll have the same issue as I did with the ASRock Arc A380 where it didn't output to 3/4 monitors I tried it on across 3 different builds/PCs. And the one I finally got it to post/boot on, after installing the driver and rebooting, stopped working as well. So I sent it back for RMA. My RX6600 isn't flawless, there are some driver issues (GTA crashes if I load into Online directly, for it to work I have to load into Story Mode then switch to Online) and also I can't play some games full screen on monitor one with YouTube playing on the second monitor because it'll lock up occasionally. Not hardware issues, definitely AMD driver issues, and I've tried everything to resolve it. I would upgrade to an RX6700XT but I'm leery of having the same issues. That's why the A770 might be a worthy gamble.
> 
> Edit: also, Eyefinity is broken, there is no way to compensate for bezels like on NVIDIA Surround. There is a way to get to the old "Eyefinity Pro" control panel to try to do it but while it opens it's totally broken. Does Intel Arc have this?


So if you keep having problems, why not just buy something like an RTX 3070? Seems weird to go back to the same platform and expect different outcomes.


----------



## Solid State Brain (Oct 5, 2022)

W1zzard said:


> It seems they designed the GPU with fixed memory clock in mind, which is why there's no memory OC either. On all hardware that I'm aware of this is purely a firmware/software limitation. The biggest issue is changing the frequency without things breaking during the switch, so it's not something trivial. I confirmed with them that they will look into memory OC for next-gen



With that, did they also imply that automatic idle memory clock adjustment just for the purposes of lowering idle power consumption is not expected to be implemented for the current-gen? I imagine that real-time memory frequency switching upon user intervention may pose problems, but perhaps things could be simpler with only 2 frequencies (high-low) engaged in a predictable manner by the driver/firmware.

A potential workaround for the high idle power consumption is using an iGPU as a primary adapter in a _hybrid graphics_ configuration, but whether this actually works depends on how well the drivers play with this function and if the discrete GPU (i.e. the A770) supports getting completely switched off and turned on seamlessly. Furthermore, it introduces additional inefficiencies that lower GPU performance and can bring software compatibility issues, at least on Windows.
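The two-state scheme suggested above can be illustrated with a toy model (purely hypothetical Python; Intel's actual driver/firmware logic is not public, and the clock values are illustrative): the driver would only ever pick between a low idle frequency and the full frequency, with hysteresis so brief load spikes don't cause constant switching.

```python
# Toy model of two-state memory clock management with hysteresis.
# Entirely hypothetical: not Intel's driver logic, just the idea above.

IDLE_MHZ = 500     # assumed low-power memory clock
FULL_MHZ = 2187    # ~A770 GDDR6 memory clock (17.5 Gbps effective)

class TwoStateMemClock:
    def __init__(self, enter_idle_after=30, exit_idle_load=0.10):
        self.enter_idle_after = enter_idle_after  # quiet frames before dropping clocks
        self.exit_idle_load = exit_idle_load      # load threshold that forces full clocks
        self.low_load_frames = 0
        self.clock = FULL_MHZ

    def update(self, gpu_load):
        """Called once per frame with GPU load in [0, 1]; returns the chosen clock."""
        if gpu_load >= self.exit_idle_load:
            # Any real work: switch up immediately and reset the idle counter.
            self.low_load_frames = 0
            self.clock = FULL_MHZ
        else:
            # Only drop clocks after a sustained quiet period (hysteresis).
            self.low_load_frames += 1
            if self.low_load_frames >= self.enter_idle_after:
                self.clock = IDLE_MHZ
        return self.clock

mem = TwoStateMemClock()
for _ in range(29):
    mem.update(0.01)
print(mem.clock)          # still full clocks: idle not yet sustained
print(mem.update(0.01))   # 30th quiet frame: drops to the idle clock
print(mem.update(0.50))   # load returns: immediately back to full clocks
```

With only two fixed operating points and a predictable switch, the "things breaking during the switch" problem W1zzard mentions is at least bounded to one known transition.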


----------



## Vayra86 (Oct 5, 2022)

@W1zzard what happened here? An anomaly, or did those cards really choke completely on this game/load?

Is this that famous memory wall they run into? They're all 8GB cards, and apparently Intel can do a whole lot more with 8GB? That idea is supported by the fact that the RTX 3070 Ti 8GB is also at rock bottom there, while the 12GB 3060 is fighting Arc. This is a very interesting result if so.

Overall, great review as usual, and I have a much better feeling about Arc altogether now. Frametimes look very good, I had expected a massacre there, nothing of the sort, efficiency on par with Turing top end is also impressive. Looks like Raja did quite a few things right, too, damn.


----------



## trsttte (Oct 5, 2022)

Chrispy_ said:


> Yeah, it's going to be an acceptable loss (hopefully) to gain marketshare and experience in the GPU business. In terms of die size and transistor count, it's not a lot lower than Navi 21, and it's significantly higher than Navi 22.



It was certainly designed to compete at the 3070 level, like the old rumours said. It just... couldn't really


----------



## Vayra86 (Oct 5, 2022)

Luminescent said:


> I like that Intel was very humble all the way


That's definitely a saving grace, but everybody knows first impressions matter a lot. The pessimist could conclude they're buying goodwill for the future 



Makaveli said:


> Not bad for a first attempt; it just still needs lots of work on drivers. And the timing... well, both NV and AMD have a new gen coming out in weeks.


Well, Nvidia for sure isn't competing on Intel's level for a while to come... not with ADA at least.


----------



## dayne878 (Oct 5, 2022)

Well, I'm not in the market for a mid-range card right now, but it is good to have more competition. Now all we need is NVIDIA to release a desktop CPU and we can have a 3-way competition for CPUs as well.

All this talk of graphics brings up something I wish they would branch into and this is APUs. AMD once had awesome (for the time) APUs and it seems they're bringing them back (to an extent). I know there was talk with DX12 that you could one day combine the power of integrated graphics with discrete graphics cards, but I haven't heard much more about that in years. It would be cool if Intel brought their iGPUs up to match AMD's iGPUs and then there was a way to combine the power of both integrated and discrete.


----------



## Readlight (Oct 5, 2022)

Lots of transistors and power.


----------



## QuietBob (Oct 5, 2022)

The A770 actually came out better than I had expected. It's a solid 1440p60 card with good RT performance. The 16GB of VRAM will make it relevant going forward, and AV1 encode is a boon.

That said, it has way too many disadvantages for me. As a person for whom efficiency is far more important than raw performance, I find the power figures completely unacceptable. 44W idle, 50W during video playback and 211W at 60Hz V-synced is simply atrocious. My GPU spends most of its time either idling or playing back videos, and it uses 4 and 10W doing so. The fact that the ARC requires ReBAR and Win10 makes it useless as an upgrade option. The final straw is the lack of native DX9 support and subpar performance in DX11, further limiting its appeal.

And then there's the price. For what it offers, $250 would be fair. Here in Europe the RX6600 starts at $215, and the RX6600XT at $280 sans VAT. I'm glad Intel entered the dGPU market, but I see no way it can influence it with its current pricing.


----------



## qubit (Oct 5, 2022)

I'd like to see a top end Arc card which I hope shows up in time.


----------



## W1zzard (Oct 5, 2022)

Vayra86 said:


> what happened here? An anomaly, or did those cards really choke completely on this game/load?


VRAM



Vayra86 said:


> Frametimes look very good, I had expected a massacre there, nothing of the sort


Yeah nothing noteworthy there. Actually they look a tiny bit better sometimes than competitors, but again, not enough to mention imo


----------



## solarmystic (Oct 5, 2022)

I do believe the Arc A770 page (https://www.techpowerup.com/gpu-specs/arc-a770.c3914) in the GPU Database needs to be updated now that TechPowerUp's review has been released; the estimated relative performance positioning was indeed too hopeful.


----------



## sLowEnd (Oct 5, 2022)

Colour me impressed with the RT performance. Intel hit the ground running there.

Still, there seems to be a lot of variance in performance between games. The Borderlands and Divinity results are kind of weird: the same performance at 1080p and 1440p?


----------



## Sithaer (Oct 5, 2022)

@W1zzard

Was the card tested with DX 11 in Borderlands 3 instead of 12? I'm watching the Jayztwocents review now, and he got completely different numbers with 12:




I know it was once mentioned in a TPU BL 3 test that DX 11 is preferred because of early DX 12 build issues, but as someone who has played this game on and off since launch day, I've tried it a few times with different cards.
DX 12 used to be in beta, but after a certain update it's not beta anymore; it depends on the card, but for the most part DX 12 is the better one to use, imo.


----------



## W1zzard (Oct 5, 2022)

Sithaer said:


> Was the card tested with DX 11 instead of 12? I'm watching the Jayztwocents review now, and he got completely different numbers with 12:


As indicated I'm using DX11 because DX12 was pretty much unusable the last time I checked due to constant shader recompile on startup (that was a year or so after release). 
Is this fixed now, and is DX12 a first-class citizen in BL3? Which API does the majority of players use?


----------



## erek (Oct 5, 2022)

usiname said:


> Like the $25 Intel i740 on eBay?


Larrabee, seen some go for over 4000 USD


----------



## Sithaer (Oct 5, 2022)

W1zzard said:


> As indicated I'm using DX11 because DX12 was pretty much unusable the last time I checked due to constant shader recompile on startup (that was a year or so after release).
> Is this fixed now, and is DX12 a first-class citizen in BL3? Which API does the majority of players use?



I can't speak for the player base, but it was indeed patched some time ago and it's out of the early stage; from what I heard it even defaults to 12 with AMD cards now, and Wonderlands, which uses the same engine, also defaults to 12.

On my system DX 12 is much better than 11 with the current game version, so I switched to that.
It does use up a lot more VRAM/memory, though, but that shouldn't be an issue here.


----------



## W1zzard (Oct 5, 2022)

Sithaer said:


> using the same engine also defaults to 12.


That's why I'm testing Days Gone. Same engine, proper DX12 implementation


----------



## Sithaer (Oct 5, 2022)

W1zzard said:


> That's why I'm testing Days Gone. Same engine, proper DX12 implementation



Yeah, to be honest BL 3 was never a well-optimized title; I was just wondering what happened there with those numbers. And yes, Days Gone is well made considering it's UE 4.
I guess if you have nothing better to do you could give it a quick check. That early long shader-compile loading is gone now; at least I did not see it happen when I switched to 12 recently.


----------



## AusWolf (Oct 5, 2022)

Thanks for the review. 

Now I'm sure: if AIB cards come out with idle fan-stop, I'll buy one. I'm specifically hunting for an Asus TUF to match the rest of my build. I'm just not sure whether I want the 8 or 16 GB version. I'm on 1080p, so I don't need the extra VRAM now, but I might later.


----------



## RandallFlagg (Oct 5, 2022)

So how can we buy one of the Intel FE cards?  Is this going through traditional retailers?


----------



## ZoneDymo (Oct 5, 2022)

I have very little faith in Intel's driver team currently; everything points to problems (maybe even shady ones) in that area.


----------



## dragontamer5788 (Oct 5, 2022)

Other sites have done some OpenCL benchmarks / Blender, and it seems like the A770 is pretty good for compute tests.

I'm cautiously optimistic here.


----------



## pavle (Oct 5, 2022)

Raytracing, yes, but where is the shader, TMU, and ROP performance? Looks like 256 TMUs and 128 ROPs aren't helping much. I wonder how powerful occlusion culling is on these cards.


----------



## RainingTacco (Oct 5, 2022)

What the hell is happening with that idle power consumption? Looks like a bug similar to one I had with the 5700 XT. You could use another monitor in the review, or check whether VRAM clocks stay elevated, because increased power consumption looks like exactly that symptom.


----------



## R-T-B (Oct 5, 2022)

GunShot said:


> I wonder why there aren't the same screams from users about not getting Intel's/AMD's industry game-changers?!


You mean like how Mantle became Vulkan, and heavily influenced d3d12?

Turn down the fanboy please.  Every company has a good idea once in a while.


----------



## Absolution (Oct 5, 2022)

So a Vega 64 but slightly better lol (well with AI cores)


----------



## RedBear (Oct 6, 2022)

Absolution said:


> So a Vega 64 but slightly better lol (well with AI cores)


Or, to consider it another way: if you were playing only Metro Exodus, it's RTX 3070 Ti performance at a little above the price of an RTX 3050. More seriously, that game probably gives an idea of what is possible to achieve with these GPUs, which Intel might or might not achieve in other titles too.


----------



## wheresmycar (Oct 6, 2022)

As a less complex tech-know-how performance enthusiast... I only look at 1080p/1440p game performance, power consumption, thermals and noise levels (and a few features which I won't bore you with). IMO, Intel's done a great job for getting their feet wet (again) in the GPU dome of riches, where 2 gladiators have stood the test of time for too long and are now turning on consumers, wreaking havoc to fill their pockets.

ARC: Looking at the performance charts, yep, a little pricey for my liking too, and the ReBAR limitation kinda sucks (for system oldies). But I'm sure the price isn't set in stone, and seeing as it's Intel's first attempt, forthcoming updates/optimizations should rally up some much-deserved added performance margins (maybe even lifting non-ReBAR woes to a finer degree of acceptance). Bottom line, these cards are competing at the highest level if we account for the majority of GPU sales, which sit in the ~$350'ish plenty-of-perf-fuelled-visual-candy/mid-tier space. If Intel can somehow tighten up that wider performance hit on non-ReBAR systems and offer better value going forward (slashing prices), we've definitely got something well established already without having to build our hopes on next gen.

I would love to see these cards reviewed again at a later time, assuming Intel is already working at full pace to roll out updates.

The funny thing is, I'm not even in the market for a mid-tier card... my current one competes at the 3070 Ti/RX 6800 level. Whether it's Intel or Bob in his backyard, a new kid on the block only makes things so much more interesting... [hopefully] even a tiny share of the market for now is enough to eventually put holes in NVIDIA's fine timber boat, which is costing the consumer a submarine + annual crew wages. Yeah, that's right, I'm all in for the DISRUPTION of waves and would like to see (one day) the big boys squeal in ripples. Fat chance, but there u have it! (...meanwhile I'll end up grabbing a 40-series or RDNA3 at silly prices... so don't mind my non-conformist eruption and weak-ass hypnotised fattened-up-beyond-reasonable-enthusiasm consumerist hypocrisy)


----------



## ModEl4 (Oct 6, 2022)

Driver quality is killing it.
The hardware is mostly OK; the software needs a lot of improvement.
If they don't drastically improve driver efficiency and also make it scale well with CPU multithreading, the problem (the % of performance they lose to driver inefficiency) will be even bigger with Battlemage.
Raja didn't even seem that optimistic to me!
This is the first time I've watched him in a video interview; he seemed to give mostly honest answers and didn't sugarcoat the whole situation!


----------



## AusWolf (Oct 6, 2022)

RainingTacco said:


> What the hell is happening with that idle power consumption? Looks like a bug similar to one I had with the 5700 XT. You could use another monitor in the review, or check whether VRAM clocks stay elevated, because increased power consumption looks like exactly that symptom.


I'd like to know that too. I'm wondering if we can expect some improvement with newer drivers.


----------



## Minus Infinity (Oct 6, 2022)

Honestly, if this had come out 12 months ago it might have made an impression. The A770 is slower than a 2080! And we'll soon be comparing it to RDNA3 and Lovelace. Competition is good, but it'll only make a dent at the lower end. I hope drivers can eke out another 10-20% more performance and that power consumption can be lowered. I still wonder if they will go all in on development of Battlemage, though, as that needs to be a massive upgrade, at least double the performance, because it'll be facing Blackwell and RDNA4.


----------



## BunGFIzaN (Oct 6, 2022)

This card would have been a great buy during the pandemic; post-pandemic, not so much. Still a nice try, though.


----------



## AusWolf (Oct 6, 2022)

Minus Infinity said:


> Honestly, if this had come out 12 months ago it might have made an impression. The A770 is slower than a 2080! And we'll soon be comparing it to RDNA3 and Lovelace. Competition is good, but it'll only make a dent at the lower end. I hope drivers can eke out another 10-20% more performance and that power consumption can be lowered. I still wonder if they will go all in on development of Battlemage, though, as that needs to be a massive upgrade, at least double the performance, because it'll be facing Blackwell and RDNA4.


For me, this kind of performance is absolutely fine. The only thing I don't like so far is the idle power consumption. I hope it'll be fixed with driver updates.


----------



## RandallFlagg (Oct 6, 2022)

Interesting...

At 3440X1440 (UWQHD which I run at) - 

Riftbreaker :






This site runs 1080p/1440p/3440x1440/4K with 20 games and averages them together to get a ranking. Using that method, the A770 is 14.6% faster than the 3060 and ~3.3% faster than the 6650 XT:
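For context, a ranking like that is just an average of per-game performance ratios. A minimal sketch with made-up FPS numbers (whether a site uses an arithmetic or a geometric mean of the ratios is an assumption here, and the two can differ slightly):

```python
from math import prod

# Made-up per-game FPS results for two hypothetical cards;
# real rankings average 20+ games across several resolutions.
results = {              # game: (card_a_fps, card_b_fps)
    "Game 1": (60.0, 50.0),
    "Game 2": (45.0, 48.0),
    "Game 3": (120.0, 100.0),
}

# Per-game performance ratio of card A relative to card B.
ratios = [a / b for a, b in results.values()]

arith = sum(ratios) / len(ratios)        # arithmetic mean of ratios
geo = prod(ratios) ** (1 / len(ratios))  # geometric mean, less sensitive to outliers

print(f"arithmetic: {arith:.3f}x, geometric: {geo:.3f}x")
```

With a card as inconsistent as the A770, the choice of mean matters: a few outlier wins pull an arithmetic average up more than a geometric one.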


----------



## bhappy (Oct 6, 2022)

Should be called the Intel Arc A770 Beta Edition, not Limited Edition.


----------



## 64K (Oct 6, 2022)

Too little, too late imo

It's really going to show when the next-gen GPUs from Nvidia and AMD come out, probably in a couple of months.


----------



## Zubasa (Oct 6, 2022)

64K said:


> Too little, too late imo
> 
> It's really going to show when the next-gen GPUs from Nvidia and AMD come out, probably in a couple of months.


I am pretty sure Intel is counting on both AMD and nVidia not releasing any mainstream GPUs this year, keeping prices high.
Otherwise they might have just shoved it under the rug and forgotten it ever existed. I suspect the 40-series announcement has something to do with that.


----------



## Dirt Chip (Oct 6, 2022)

Any info on using the A770 as GPU acceleration for Adobe Premiere or After Effects?
Utilizing the AV1 encoder during editing?
Does it have CUDA-like hardware, so I could shift from NV to Intel sometime?


----------



## Easo (Oct 6, 2022)

People are yet again proving that they don't want a third player if it doesn't instantly provide them the fastest GPU at the cheapest price. It's probably fascinating from a psychology viewpoint.
Come on...


----------



## 64K (Oct 6, 2022)

Easo said:


> People are yet again proving that they don't want a third player if it doesn't instantly provide them the fastest GPU at the cheapest price. It's probably fascinating from a psychology viewpoint.
> Come on...



It's not just that Intel can't provide a faster GPU. They can't even get the drivers to function properly.


----------



## Zubasa (Oct 6, 2022)

64K said:


> It's not just that Intel can't provide a faster GPU. They can't even get the drivers to function properly.


TBH, Intel taught many people a good lesson: software isn't as simple as throwing money at the screen until drivers come out.


----------



## Bwaze (Oct 6, 2022)

It's a bit sad the drivers are in the state they are in. But it's not just a driver issue, I guess; the whole architecture is very compute-oriented. As if it was made for mining first. ;-) 

And just days ago people were convinced that Intel wouldn't release a card that doesn't work properly without ReBAR...


----------



## OneMoar (Oct 6, 2022)

They need to do something about the driver overhead in both D3D9On12 and DX11/12 scenarios.
The performance issues at lower resolutions SCREAM driver overhead.
Thankfully, this should be fixable.

Not sure what they are smoking with that idle power consumption, though.
I'd buy one just to play with it, but they gotta fix their driver overhead.

They need to write their own implementations of D3D9On12 and DX11 and figure out how to accelerate the draw calls.
AMD had similar issues with draw-call overhead tanking performance in DX11 due to a poor scheduler implementation.

Thankfully, Intel's architecture looks like it might be flexible enough for a driver-level fix.
D3D9On12 is relatively new.








GitHub - microsoft/D3D9On12: The Direct3D9-On-12 mapping layer (github.com)


----------



## Zubasa (Oct 6, 2022)

OneMoar said:


> They need to do something about the driver overhead in both D3D9On12 and DX11/12 scenarios.
> The performance issues at lower resolutions SCREAM driver overhead.
> Thankfully, this should be fixable.
> 
> ...


The issue for AMD was more than just drivers; it was inherent to the GCN architecture.
It's the same reason AMD never made a GCN gaming card with more than 4096 cores and was not able to compete with Pascal.
One of RDNA's main goals was to address that bottleneck, so I wouldn't be so sure that it's just drivers for Arc.
Software overhead and GPU architecture go hand in hand; it could also be that Arc being so compute-oriented makes it harder to optimize.


----------



## GunShot (Oct 6, 2022)

R-T-B said:


> You mean like how Mantle became Vulkan, and heavily influenced d3d12?
> 
> Turn down the fanboy please.  Every company has a good idea once in a while.


Mantle?! WTF! Mantle is ~9 years old and still only supports ~180 titles out of tens of thousands. How is that even comparable to NVIDIA's tech features' prowess, which support gazillions more on top of that?!   

People like you need to understand that, like for others here, *today's* OBSERVATION / PERSONAL EXPERIENCE (not going back to the stone age) has nothing to do with fanaticism, etc.

Find another term, because that one doesn't apply here.


----------



## erek (Oct 6, 2022)

64K said:


> It's not just that Intel can't provide a faster GPU. They can't even get the drivers to function properly.


I still say massively scale up Gen with its stable and mature drivers


----------



## AusWolf (Oct 6, 2022)

Easo said:


> People yet again proving that they don't want a 3rd player if it doesn't instantly provide them the fastest GPU for the cheapest price. It's probably fascinating from a psychology viewpoint.
> Come on...


It's a shame. They want Nvidia and AMD to give them the best they can for a good price even though there's proof that it's not gonna happen. Then a 3rd player comes up with something, and suddenly it's not good enough. It's as if everybody needed 4K 120 FPS all of a sudden.

When Nvidia came up with the (massively crippled) 3050, people rejoiced, even though it was too expensive for what it is. Now we have a card that can approach the 2080 in some games at a 2060 price range, and it's not good enough.

I don't understand people, really.


----------



## Jimmy_ (Oct 6, 2022)

Intel and high power consumption - nothing better than this love story

Awesome review  kudos to TPU


----------



## Solid State Brain (Oct 6, 2022)

There's also this:


----------



## Legacy-ZA (Oct 6, 2022)

I am very pleased with the ray tracing performance compared to its competitors; it looks great, to be honest. If XeSS gets implemented by devs into current games, we might see even better results. I am very excited to see where things are going to go post RTX 4000 / RX 7000 and Arc A770.

nVidia killed my enthusiasm for the GPU space, but Intel kindled something, mmm.


----------



## Felix123BU (Oct 6, 2022)

First of all, nice review. Even though most shit on the A770, I kinda like it. It's rough, and there are much better options out there, but it's an OK first step for Intel; I really hope they continue.
Second, the page here showing the A770 (Intel Arc A770 Specs | TechPowerUp GPU Database) is kind of misleading and wrong - it shows the A770 being better than the 3070 Ti???


----------



## pavle (Oct 6, 2022)

OneMoar said:


> ...not sure what they are smoking with that idle power consumption tho...


Idle consumption can't really be lower with that 600MHz GPU clock. It should be at 300MHz or lower for lower power...


----------



## AusWolf (Oct 6, 2022)

pavle said:


> Idle consumption can't really be lower with that 600MHz GPU clock. It should be at 300MHz or lower for lower power...


I'm still hoping that they'll fix it in newer drivers. The latest driver is still in beta. Their iGPUs run at 350 MHz with basically zero power consumed when idle. I see no reason why the A770 can't do the same.


----------



## Solid State Brain (Oct 6, 2022)

pavle said:


> Idle consumption can't really be lower with that 600MHz GPU clock. It should be at 300MHz or lower for lower power...



The VRAM is running at 2 GHz no matter the load. That's most probably where the high idle power comes from.
Many other GPUs, mainly from AMD, have a similar problem, but only when using multiple display outputs or high refresh rate displays.


----------



## AusWolf (Oct 6, 2022)

Solid State Brain said:


> The VRAM is running at 2 Ghz no matter the load. That's most probably where the high idle power comes from.
> Many other GPUs, mainly from AMD, have a similar problem, but only when using multiple display outputs or high refresh rate displays.


The review was done with this testing method:


> Idle: Windows 10 sitting at the desktop (2560x1440) with all windows closed and drivers installed.


I'm wondering whether we would see the same with a 1080p desktop. I know it shouldn't happen at 1440p either; I'm just curious.


----------



## Ferrum Master (Oct 6, 2022)

@W1zzard 

How much better does XeSS work using native instructions? Any differences image-wise, also in motion/ghosting? Well, there shouldn't be, but you never know.


----------



## Shatun_Bear (Oct 6, 2022)

The A770 performs worse than the 6600 XT, draws the same power as the 6800, and costs $50 more than the former... awful. Should be $250 max, not $350.


----------



## trsttte (Oct 6, 2022)

Solid State Brain said:


> There's also this:



Not as bad as the RTX 2000 series, but close... yikes


----------



## r9 (Oct 6, 2022)

ToTTenTranz said:


> Looks like there's very good potential for competitive performance/power ratio when we look at properly optimized games like Metro Exodus with raytracing.
> A year from now, we could be looking at the A770 competing with the 3070 and 3070 Ti all around, which is pretty great considering Intel is a newcomer.
> 
> 
> ...


Because I'm talking about the cost of making those cards. Disabling part of the GPU doesn't make it cheaper to make.


----------



## pavle (Oct 6, 2022)

Solid State Brain said:


> The VRAM is running at 2 Ghz no matter the load. That's most probably where the high idle power comes from.
> Many other GPUs, mainly from AMD, have a similar problem, but only when using multiple display outputs or high refresh rate displays.


Indeed - I forgot to mention that, an important reason for the high power figures. Higher core clock was the low-hanging fruit, and yes, it's still quite high. Remember the nv GTX 470 had a 608 MHz 3D clock (with effectively 704 shaders at that clock); well, the Intel card idles at that clock.

By the way - does anyone know if Arc has memory (bandwidth) compression turned on, or just Z-compression? Would be interesting to measure. Quite often in the test results, despite having all those shaders, TMUs and ROPs, it just can't compete on par with the others. Kinda like how the Radeon HD 7970 never really could match the nimble GeForce GTX 680 (unless it was a shader-heavy load) until the HD 7970 got bandwidth compression (not Z - Z was turned on already) in later Windows 7 and above drivers ("FineWine" it was called, or something), which the GTX 680 had from the start.
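As a rough illustration of what bandwidth (delta color) compression buys, here is a toy Python sketch. Real GPU color compression schemes are far more sophisticated and per-vendor; this only shows why smooth gradients compress well while noisy data falls back to raw storage. All function names and the byte layout are invented for this sketch.

```python
# Illustrative delta-compression of an 8-pixel tile of 16-bit values.
# Not any real GPU's scheme; it only demonstrates the principle that
# slowly varying pixels need less memory bandwidth than raw storage.

def delta_compress(tile):
    """Store a 2-byte base value plus one signed byte per remaining
    pixel if every delta fits in a byte; otherwise fall back to raw."""
    base = tile[0]
    deltas = [p - base for p in tile[1:]]
    if all(-128 <= d <= 127 for d in deltas):
        return 2 + len(deltas)       # compressed size in bytes
    return 2 * len(tile)             # raw size in bytes

gradient = [1000, 1002, 1004, 1006, 1008, 1010, 1012, 1014]
noise    = [0, 5000, 100, 9000, 42, 7000, 3, 8000]

print(delta_compress(gradient))  # 9 bytes instead of 16
print(delta_compress(noise))     # 16 bytes (raw fallback)
```

On real hardware this kind of lossless compression reduces the bytes moved per frame, which is why enabling it in drivers (as with the HD 7970 example above) can lift performance without touching the silicon.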


----------



## RandallFlagg (Oct 6, 2022)

pavle said:


> Indeed - I forgot to mention that, an important reason for high power figures. Higher core clock was the low hanging fruit and yes, it's quite high still, remember nv GTX 470 having 608MHz 3D clock and effectively 704 shaders (at that clock), well the intel card idles at that clock.
> 
> By the way - does anyone know if the Arc has memory compression going/turned on, or just Z-compression?
> Would be interesting to measure. From all the test results quite a lot of times, despite having all those shaders and TMUs and ROPs, it just can't compete on par with the others, kinda like
> Radeon HD 7970 never really could match (unless it was a shader-heavy load) the nimble GeForce GTX 680 until HD 7970 had bandwidth (not Z, Z was turned on already) compression (in later Windows 7 and above drivers, finewine it was called or something), like the GTX 680 (from the start).



That's a good question, and I've been wondering if some of those major software/driver-driven performance features, introduced many years ago and mostly forgotten about by now, are lacking or absent. Another one - I can't recall the technical name - is a tech PowerVR had about 20 years ago that allowed the GPU to skip rendering parts of a 3D scene that will never be seen by the viewer. So if there is a trash can behind a person on the screen, and you wouldn't see the trash can from your angle, the trash can doesn't get rendered. I believe AMD and Nvidia incorporated that from their IP. Someone is going to need to do more in-depth testing to see what is up.

To look at a spec sheet, you would think that these cards would stomp all over things like the 3060 Ti or 6700XT, but we know they don't.

This is the A750 vs the 6700XT for example :


----------



## Easo (Oct 6, 2022)

64K said:


> It's not just that Intel can't provide a faster GPU. They can't even get the drivers to function properly.


You literally have a review saying that the drivers lack optimisation, not that they are not working, plus there is noticeable improvement in them - all of which is reasonable for the first actual dedicated GPU series. Have you forgotten the endless issues both other teams have had, and still do from time to time? It is literally what I said - unless Intel is 146% perfect, they shouldn't bother. You won't ever get a 3rd player that way.


----------



## RandallFlagg (Oct 6, 2022)

Easo said:


> You literally have a review saying that they lack optimisation, not that they are not working + there is noticable improvement in them - all of which reasonable for the first actual dedicated GPU series. Have you forgotten the endless issues both other teams have had and still do from time to time? It is literally what I said - unless the Intel is 146% perfect then they shouldn't bother. You won't ever get a 3rd player that way.



It's kind of funny; as you were posting that, I was looking at reviews of the 6650 XT and 6700 on Newegg.

Seems like objectivity is very subjective...

A few excerpts -

5 star review : 





Different 5 star :




4 star :



Another 5 star :


----------



## AnotherReader (Oct 6, 2022)

Thanks for the great review @W1zzard . For me, the idle power consumption is too great for it to be an alternative. However, the performance in some cases is promising, and given the progress Intel has made with the drivers since the A380 review, the future may see it competing with a 3060 Ti/6700 XT rather than a 3060/6600 XT. Still, given the high clocks, it should have performed between the RX 6800 and 6800 XT in at least some titles, and I haven't seen any where it does that. To summarize:

*Pros*

- Good ray tracing implementation, better than AMD and close to Nvidia
- Low price

*Cons*

- High idle power consumption
- Poor performance in most pre-DX12 API games
- Price too close to the 6700 XT, which is almost always faster

In its best titles, it performs very close to the 3060 Ti and, in some cases, even the 3070. Most encouragingly, this list includes one DX11 title too: The Witcher 3.


| Benchmark | Release Year | API | Resolution | A770 | 3060 | 6600 XT | Uplift over 3060 | Uplift over 6600 XT | 3060 Ti | 6700 XT | Uplift over 3060 Ti | Uplift over 6700 XT | 3070 | Loss vs 3070 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Assassin's Creed Valhalla | 2020 | DX12 | 1440p | 59.2 | 48.6 | 55.6 | 22% | 6% | 61.5 | 69.8 | -4% | -15% | 67.5 | 12% |
| Control | 2019 | DX12 | 1440p | 68.2 | 52 | 49.7 | 31% | 37% | 68.8 | 66.3 | -1% | 3% | 80.4 | 15% |
| Cyberpunk 2077 | 2020 | DX12 | 1080p | 74.2 | 59.6 | 68.5 | 24% | 8% | 77.7 | 83.9 | -5% | -12% | 88.1 | 16% |
| Days Gone | 2021 | DX12 | 1440p | 73.8 | 67.4 | 66.8 | 9% | 10% | 90.2 | 84.5 | -18% | -13% | 104.5 | 29% |
| Deathloop | 2021 | DX12 | 1440p | 79.7 | 67.7 | 63.9 | 18% | 25% | 88.9 | 88.4 | -10% | -10% | 102.1 | 22% |
| Doom Eternal | 2020 | Vulkan | 1440p | 167.3 | 134.1 | 151.3 | 25% | 11% | 155.4 | 194.5 | 8% | -14% | 194 | 14% |
| Dying Light 2 | 2022 | DX12 | 1440p | 68.8 | 48.7 | 54.9 | 41% | 25% | 65.5 | 71.8 | 5% | -4% | 75.2 | 9% |
| Elden Ring | 2022 | DX12 | 1440p | 63.1 | 53.1 | 55.1 | 19% | 15% | 68.8 | 64.4 | -8% | -2% | 75.8 | 17% |
| Far Cry 6 | 2021 | DX12 | 1440p | 75.7 | 70.3 | 79 | 8% | -4% | 90.3 | 95.2 | -16% | -20% | 100.2 | 24% |
| Forza Horizon 5 | 2021 | DX12 | 1440p | 64.6 | 60.1 | 69.1 | 7% | -7% | 78.4 | 92.3 | -18% | -30% | 87.6 | 26% |
| Hitman 3 | 2021 | DX12 | 1440p | 94.2 | 72.2 | 86.9 | 30% | 8% | 97.4 | 105.8 | -3% | -11% | 113.5 | 17% |
| Metro Exodus | 2019 | DX12 | 1440p | 113.7 | 75.7 | 79.6 | 50% | 43% | 98.5 | 101.2 | 15% | 12% | 113.9 | 0% |
| Red Dead Redemption 2 | 2019 | DX12 | 1440p | 65.3 | 46.9 | 55.6 | 39% | 17% | 59.5 | 72.4 | 10% | -10% | 65.4 | 0% |
| Resident Evil Village | 2021 | DX12 | 1440p | 125.9 | 87.7 | 112.1 | 44% | 12% | 115.4 | 145.6 | 9% | -14% | 133.3 | 6% |
| The Witcher 3 | 2015 | DX11 | 1440p | 113.0 | 86.9 | 97.5 | 30% | 16% | 117.9 | 118.1 | -4% | -4% | 135.7 | 17% |
| Watch Dogs Legion | 2020 | DX12 | 1440p | 58.7 | 46.9 | 54.7 | 25% | 7% | 62 | 70.7 | -5% | -17% | 70.3 | 17% |
| *GEOMEAN* | | | | 81.3 | 64.6 | 71.5 | *26%* | *14%* | 84.1 | 90.9 | *-3%* | *-11%* | 96.1 | *15%* |
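The GEOMEAN row can be reproduced directly from the per-game numbers; here is a quick check using the A770 and 3060 columns from the table (values copied as listed, games in table order):

```python
from statistics import geometric_mean

# Per-game FPS from the table above, in row order.
a770 = [59.2, 68.2, 74.2, 73.8, 79.7, 167.3, 68.8, 63.1,
        75.7, 64.6, 94.2, 113.7, 65.3, 125.9, 113.0, 58.7]
rtx3060 = [48.6, 52, 59.6, 67.4, 67.7, 134.1, 48.7, 53.1,
           70.3, 60.1, 72.2, 75.7, 46.9, 87.7, 86.9, 46.9]

g_a770 = geometric_mean(a770)             # ≈ 81.3
g_3060 = geometric_mean(rtx3060)          # ≈ 64.6
uplift = (g_a770 / g_3060 - 1) * 100      # ≈ 26%

print(round(g_a770, 1), round(g_3060, 1), round(uplift))
```

The geometric mean is used rather than the arithmetic mean so that one outlier title (e.g. Doom Eternal's 167 FPS) cannot dominate the summary.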


----------



## AusWolf (Oct 6, 2022)

Where are the AIB cards?


----------



## AnotherReader (Oct 6, 2022)

@W1zzard Given the CPU bottleneck apparent at lower resolutions for the A770 in many DX11 games, is it time for the GPU testbed to switch to the 12900k or the 7700X?


----------



## ARF (Oct 6, 2022)

Can anyone explain why on so many *8 GB cards* the performance hit is so large, while others with *8 GB* run just fine?


----------



## trsttte (Oct 6, 2022)

ARF said:


> Can anyone explain why with so many *8 GB cards* the performance hit is so large, while in others with *8 GB* it is running just fine?
> 
> View attachment 264442



The only 8 GB card that runs fine is from Intel; clearly they did something right and have some aces up their sleeves.

This example specifically is a test with ray tracing, and Intel has been showing some very good ray tracing performance, even beating Nvidia at their own game. I don't want to just say fine wine, but well - the cards clearly have much bigger silicon than the tier they are competing in, and a lot of driver handicaps, so yeah, maybe it will be a rather fine wine after it ages.


----------



## W1zzard (Oct 6, 2022)

AnotherReader said:


> @W1zzard Given the CPU bottleneck apparent at lower resolutions for the A770 in many DX11 games, is it time for the GPU testbed to switch to the 12900k or the 7700X?


I doubt it'll go away with a faster CPU, just move higher. Yes, I have plans to upgrade the CPU for the next rebench (early winter)



ARF said:


> Can anyone explain why with so many *8 GB cards* the performance hit is so large, while in others with *8 GB* it is running just fine?


Memory management. Looks like Intel is doing something right here


----------



## SOAREVERSOR (Oct 6, 2022)

I mean, Intel has been at ray tracing for a while and does know a bit about memory management. The card is interesting, and I'll pick one up and jam it in a NUC Extreme box for an HTPC in the living room.


----------



## OneMoar (Oct 6, 2022)

streaming arc gaming


----------



## The Von Matrices (Oct 6, 2022)

What is the maximum bandwidth of the HDMI 2.1 port? Since it's bridged from DisplayPort, I assume it can't be the full 48 Gbps.
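For context on why the 48 Gbps ceiling matters, the raw data rate a display mode needs can be estimated from pixels × refresh × bits per pixel. A back-of-envelope sketch (it deliberately ignores blanking intervals and FRL encoding overhead, so real link requirements are somewhat higher):

```python
def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel):
    """Approximate raw video data rate in Gbit/s, excluding
    blanking intervals and link-encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K @ 120 Hz with 10-bit RGB (30 bits per pixel)
print(round(uncompressed_gbps(3840, 2160, 120, 30), 1))  # ≈ 29.9
```

Full HDMI 2.1 FRL tops out at 48 Gbit/s; whether the A770's DisplayPort-to-HDMI bridge exposes the full rate is exactly the open question here.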


----------



## Zubasa (Oct 7, 2022)

Easo said:


> You literally have a review saying that they lack optimisation, not that they are not working + there is noticable improvement in them - all of which reasonable for the first actual dedicated GPU series. Have you forgotten the endless issues both other teams have had and still do from time to time? It is literally what I said - unless the Intel is 146% perfect then they shouldn't bother. You won't ever get a 3rd player that way.


This gen was known as DG2, aka this is actually the 2nd gen of dedicated graphics from Intel.
DG1 was so embarrassing that Intel only sold it to a few OEMs and swept it under the rug.

As for getting a 3rd player: from all the comments online about all the different GPU reviews, you see a common trend.
People don't even want a 2nd player, let alone a 3rd; all they want is nVidia, or cheaper nVidia cards.
Gamers actively mock and ridicule others with "knock-off" brand cards that are not GeForces.
Everyone says they want competition, yet nobody wants to support the competition when it comes down to an actual purchase.
You have influencers like LTT actively pushing people to buy a product in such a state, yet you almost never see them actually use an AMD card, let alone Intel.
Sure, for an editing/rendering machine it only makes sense to use a 3090/4090 due to CUDA/OptiX and the 24 GB of VRAM being very useful in those tasks.
On the other hand, you basically never see anything except a 3090/Ti or 4090 even in a gaming build.


----------



## AusWolf (Oct 7, 2022)

Zubasa said:


> This gen was known as DG2 aka this is actually the 2nd gen dedicated graphics from Intel.
> DG1 was so embarrassing that Intel only sold it to a few OEMs and shoved it under the rug.
> 
> As for getting a 3rd player, from all the comments online about all the different GPU reviews you see a common trend.
> ...


It's ironic - everybody wants cheaper Nvidia cards, but nobody lifts a damn brain cell to think about what could make Nvidia cards cheaper (competition).

I actually liked LTT's video asking gamers to consider the A750 and A770 to bring some competition into the game. On the other hand, I totally disliked Hardware Unboxed's and Gamers Nexus's videos. Even though Arc showed good performance for its price, they mocked it for doing "only" about 120 FPS in CS:GO. It's as if you buy flowers, cook and wash every day and take your girlfriend everywhere around the world, but she leaves you anyway because you left your pants on the sofa one time two years ago. I mean, wtf, really?

It's funny because I usually prefer HU or GN over LTT, but not this time.


----------



## Zubasa (Oct 7, 2022)

AusWolf said:


> It's ironic - everybody wants cheaper Nvidia cards, but nobody lifts a damn brain cell to think about what could make Nvidia cards cheaper (competition).
> 
> I actually liked LTT'S video asking gamers to consider the A750 and A770 to bring some competition into the game. On the other hand, I totally disliked Hardware Unboxed and Gamers' Nexus's videos. Even though Arc showed good performance for its price, they mocked it for doing "only" about 120 FPS in CS:GO. It's as if you buy flowers, cook, wash and take your girlfriend everywhere around the world, but she leaves you anyway because you left your pants on the sofa one time two years ago. I mean, wtf, really?
> 
> It's funny because I usually prefer HU or GN over LTT, but not this time.


HUB and GN are doing their due diligence as reviewers, pointing out all the caveats of a product so that buyers can make their own decisions.
TBH, HUB is being more lenient with its thumbnail saying it is "not terrible", which is actually more positive than most other youtubers with their click-bait titles.

One of the worst this round is Digital Foundry; they constantly try to downplay the 6600 XT by repeatedly stating it is the "most expensive",
despite the 6650 XT regularly being found under $300 and the 3060 being hard to find at MSRP.
It is the mix of some good data with his personal opinion that is most dangerous.


----------



## solarmystic (Oct 7, 2022)

Zubasa said:


> HUB and GN are doing their due diligence as reviewers to point out all the caveats of a product, so that buyer can make their own decision.
> TBH HUB is being more lenient in its thumbnail saying it is "not terrible" which is actually more positive than most other youtuber with click-bait titles.
> 
> One of the worst this round is DigitalFoundry, they constantly try to downplay the 6600XT by keep stating it is the "most expensive",
> ...



After a cursory example of a poorly optimized game for ARC (Ass Creed Unity on DX11), DF then proceeded to just use best case scenarios for all the subsequent benchmarks (DX12 and Vulkan games only), which would lead the viewer to believe that ARC is almost flawless and gives a false and misleading impression of the card. The 6600XT was also misrepresented, as you put it, since it can be found for cheaper than the A770 and will easily destroy the A770 in the thousands of DX11 and DX9 titles available right now.

When a majority of the most played games on Steam are still on DX11 and DX9, this is a huge shortcoming of the review.


----------



## BSim500 (Oct 7, 2022)

solarmystic said:


> After a cursory example of a poorly optimized game for ARC (Ass Creed Unity on DX11), DF then proceeded to just use best case scenarios for all the subsequent benchmarks (DX12 and Vulkan games only), which would lead the viewer to believe that ARC is almost flawless and gives a false and misleading impression of the card. The 6600XT was also misrepresented, as you put it, since it can be found for cheaper than the A770 and will easily destroy the A770 in the thousands of DX11 and DX9 titles available right now.
> 
> When a majority of the most played games on Steam are still on DX11 and DX9, this is a huge shortcoming of the review.


That's my gripe too. I'm more than happy to cut Intel some slack for a first attempt and openly welcome their competition, but I don't think I've ever seen such a gaping chasm between _"We picked as many of the 0.3% titles that were DX12-only games whilst excluding as many of the DX9-11 titles __most people are actually playing__ to benchmark"_ vs observable reality from certain tech sites. "Average fps across 12 games" charts in this case are utterly worthless when fps plummets outside of a "bubble" of a couple of dozen 'benchmark titles'. Also still waiting for even a single tech site to test emulated DX9 compatibility. As we've seen in the past with stuff like DgVoodoo2, API translation layers are not without issues (increased rendering errors, glitches, etc).


----------



## trsttte (Oct 7, 2022)

AusWolf said:


> It's as if you buy flowers, cook and wash every day and take your girlfriend everywhere around the world, but she leaves you anyway because you left your pants on the sofa one time two years ago. I mean, wtf, really?



I understand your point, but that's not really what's happening with Arc. There are still several problems, some more frequent, some less, and the product is only competitive against inflated Nvidia options, which Intel casually shrugs off (and Linus as well, for example - some RTX/AI features are more limited on AMD, but that's just Intel using the same selective benchmarks Nvidia has been using since it brought RTX hardware to market). GN compares the card against AMD, and the sales pitch from Intel gets much less appealing very fast (there's also the whole GN vibe of being overly critical of everything).

I hope RDNA3 gets better ray tracing performance and is able to set the record straight (and light a big fire under Nvidia's pants); this near-tradition of shrugging off AMD by the "big guys" (Nvidia and Intel - one on GPUs, and the other on CPUs but now also on GPUs, since it can't on CPUs) needs to end.


----------



## SOAREVERSOR (Oct 7, 2022)

Zubasa said:


> This gen was known as DG2 aka this is actually the 2nd gen dedicated graphics from Intel.
> DG1 was so embarrassing that Intel only sold it to a few OEMs and shoved it under the rug.
> 
> As for getting a 3rd player, from all the comments online about all the different GPU reviews you see a common trend.
> ...



That's easy to understand, though. People want other people to take the pain for them. The last AMD card I owned was back when it was ATi and I had a 9700 Pro. It was great at the time; nVidia had nothing that could compete with it. I built a box with a 3850 for a roommate at one point as well.

The issue is that nvidia's ecosystem is just better.


----------



## RandallFlagg (Oct 7, 2022)

Minus Infinity said:


> Honestly if this came out 12 months ago it might have made an impression. A770 is slower than a 2080! And we'll soon comparing it to RDNA3 and Lovelace. Competition is good, but it'll only make a dent at the lower end. I hope drivers can eek another 10-20% more performance and power consumption can be lowered. I still wonder if they will go all in on development of Battelmage though as that needs to be a massive upgrade and at least double performance because it'll be facing Blackwell and RDNA4.



Well, Intel's driver team was in Russia, and that got shut down when Russia invaded Ukraine. Intel had to rebuild it elsewhere.

Probably explains why Metro runs so well (it mostly ties with a 3070); that game was developed by a Ukrainian studio, so they likely had links to the Intel team in Russia (despite the war, there are many, many familial connections and so on between the two). It's probably a good example of driver optimization for a specific game.


----------



## Count von Schwalbe (Oct 7, 2022)

RandallFlagg said:


> Well, Intel's driver team was in Russia and that got shut down when Russia invaded Ukraine.  Intel had to rebuild it elsewhere.
> 
> Probably explains why Metro runs so well (mostly ties with a 3070), that game was developed by a Ukrainian studio so likely they had links to the Intel team in Russia (despite the war, there are many many familial connections and so on between the two).  It's probably a good example of optimization of the driver for a game.


Crumbs, I reckon Intel regretted that decision...

I think a lot of the hate Intel is getting over the apparent cherry-picking is based on this kind of thing. AMD/Nvidia have had years to optimize the performance of each and every major game release (when they released) but the ground-up development of Intel's drivers means that many are not optimized yet. Most likely each driver release will have a few titles with drastically improved performance as Intel works through the back-log of popular games.


----------



## GhostRyder (Oct 7, 2022)

Excellent review. I was excitedly waiting to see what your performance numbers would be for this card, as it's (at least to me) one of the most exciting releases in a long time (solely because it's Intel's highest current GPU that I'm aware of, and a newcomer in general). Unfortunately, with prices dropping on the equivalent cards in its performance category (which, based on the chart, is somewhere around the RTX 3060 and AMD RX 6600 at the different resolutions on average), this card is a very hard sell, since its performance is all over the place. I really want to support them and hopefully get a second generation that's even better, but it's not in the range I would want for my main rig (maybe my second rig or my wife's).

Still, it's a cool midrange card, and I will say Intel has already come a long way just from the first cards!


----------



## RandallFlagg (Oct 7, 2022)

Zubasa said:


> This gen was known as DG2 aka this is actually the 2nd gen dedicated graphics from Intel.
> DG1 was so embarrassing that Intel only sold it to a few OEMs and shoved it under the rug.
> 
> As for getting a 3rd player, from all the comments online about all the different GPU reviews you see a common trend.
> ...



DG1 really was never meant for games/consumers. It was likely just to iron out driver issues related to normal office productivity work. Just to give a hypothetical: for me, if a game crashes, that's not really a show-stopper that would make me toss a $300-$350 GPU. But if Teams or Office is unstable, it's gone.



Zubasa said:


> HUB and GN are doing their due diligence as reviewers to point out all the caveats of a product, so that buyer can make their own decision.
> TBH HUB is being more lenient in its thumbnail saying it is "not terrible" which is actually more positive than most other youtuber with click-bait titles.
> 
> One of the worst this round is DigitalFoundry, they constantly try to downplay the 6600XT by keep stating it is the "most expensive",
> ...



HUB was fair IMO.

On video, TechYES is, I think, one of the better channels for what GN seems to be trying to do, especially for value-seeking gamers. He talks about both the new cards and the deals you can get on used cards vs new. He's very much into what gives you the best bang for the buck, and yes, he's all over the 6600 XT / 6650 XT as best value.

GN - the more I look at their recent reviews, the more I think that guy is trash. He spends the first half of his reviews (not just this one) pontificating about his viewpoints and his opinion on some industry drama or other.

So now you're halfway through the video, and he proceeds to draw conclusions from benchmarking a whole 6-7 games. That should take like an hour, yet he leads you to believe they've been pulling 16-hour days to bring you that information. There are a whole lot of sites out there with far broader test data than what that guy provides, and more useful dialogue like 'console ports work great on this card, the Frostbite engine is horrible though'.

This kind of data is also out there for AMD vs Nvidia; they are not at all consistent with each other on which games run well on each platform.

Case in point, from TPU, this is part of a 6800 vs 3070 Ti comparison. I wonder which one GN would say was consistent:


----------



## AusWolf (Oct 7, 2022)

Zubasa said:


> HUB and GN are doing their due diligence as reviewers to point out all the caveats of a product, so that buyer can make their own decision.
> TBH HUB is being more lenient in its thumbnail saying it is "not terrible" which is actually more positive than most other youtuber with click-bait titles.


Maybe... or maybe they're trying to play it safe and not suggest that people buy a product that isn't 100% reliable. I keep forgetting that not only techies watch these channels.

There's also the clickbait factor: more people watch your video if you openly take a dump on the product, or even the industry. That's why I'm trying to limit the time I spend watching reviews instead of just reading TPU and calling it a day. TPU's game tests are the closest to the ones I play anyway.



trsttte said:


> I understand your point but that's not really what's happening with ARC, there's still several problems, some more frequent some less, and the product is only competitive againts inflated Nvidia options, which Intel casually shruggs off (and Linus as well for example - some rtx/ai features are more limited on AMD but that's just Intel using the same selective benchmarks as Nvidia as been using since it brough RTX hw to market). GN compares the card against AMD and the sale pitch from intel gets much less appealing very fast (there's also the hole GN vibe of being overly critical of anything)
> 
> I hope RDNA3 gets better rtx performance and is able to sets the record straight (and kicks a big fire under Nvidias pants), this almost tradition of shrugging off AMD by the "big guys" (nvidia and intel, one on gpus and the other on cpus but now also on gpus since it can't on cpus) needs to end.


I think JayzTwoCents' video is the best on this. He didn't do as many benchmarks as the others, but the ones he did showed a clear Nvidia win in some titles, a clear AMD win in others, and a huge Intel win in some. It tells me that any of the 3 can be a winner depending on your needs.

Personally, I don't have high hopes for RDNA 3, mainly because it hasn't even been announced yet. It'll be a good few months before we see any of them in stores, and when we do, it'll be the 7800 and 7900 series first, which compete in an entirely different price and performance range. Those are 4K cards, and I'm still on 1080p with no plans to upgrade any time soon. I prefer to keep the cost of the computational power I need low for now. The 3060 (Ti) / 6600 (XT) / A770 level is just right for me.

I agree that the tradition of shrugging off needs to end... so does the tradition of comparing everything to the competition. I mean, you (general you) know what games you play, then why not just look at review data on those games, and buy something that suits your price and performance needs? Why does everything need to be 5 FPS faster than the competition? You don't see any competition when you have one graphics card plugged into your system with nothing to compare it to. I also don't understand the purpose of averages. Nobody plays "Average Game" because it doesn't exist, and if you don't test for every single game on the planet, then your average value means nothing.


----------



## R-T-B (Oct 7, 2022)

GunShot said:


> Mantle?! Wtf! Mantle is like ~9-years-old and still only supports ~180 titles out of tens-of-thousands of titles. How is that even comparable to NVIDIAs tech features' prowess that supports over gazillions more on top of that?!
> 
> People like you need to understand, like others here, *today's* OBSERVATION / PERSONAL EXPERIENCE (not going back to the stone-age ) has nothing to do with fanaticism, etc.
> 
> Find another term because that doesn't apply here.


Mantle was influential in low-level APIs, and literally committed swaths of code to Vulkan, which is huge today.


----------



## efikkan (Oct 7, 2022)

ModEl4 said:


> Driver quality is killing it.
> Hardware is mostly OK, Software needs a lot of improvement.
> If they don't improve drastically the driver efficiency and also make it good  at CPU multithreading scaling, the problem (the % performance that they lose due to driver inefficiency) will be even bigger in Battlemage.


The discrepancy between synthetic performance and real-world performance is strong evidence of the exact opposite; the problem is in hardware scheduling, not software at all. These performance characteristics are very comparable to the performance issues of Polaris and Vega, only worse. If driver overhead in general were to blame, we should have seen a clearly growing overhead with the more powerful Arc cards. If specific API calls were to blame for slowing everything down, then they would have easily identified and fixed that by now. They've had the final GPUs in testing since early 2022, and engineering samples long before that. The issues with Alchemist are hardware-level, and no amount of driver patchwork can solve that.



bhappy said:


> Should be called Intel Arc A770 Beta Edition not Limited Edition


No, it's _Limited Performance Edition_. 



R-T-B said:


> You mean like how Mantle became Vulkan, and heavily influenced d3d12?





R-T-B said:


> Mantle was influential in low-level APIs, and *literally* committed swaths of code to Vulkan, which is huge today.


Either this is some kind of joke, or you (like most people) have no idea what the word _literally_ actually means.

The fact police are here:
There is zero Mantle code in Vulkan, because there is zero code in the core of Vulkan. It's an API spec, after all, not a code base; the real code is in the various drivers implementing the graphics APIs. This should be basic knowledge.

And to your earlier claim: no, Mantle did not become Vulkan at all. Vulkan is based on the SPIR-V architecture from OpenCL and the efforts of the OpenGL AZDO initiative. The final spec did get some influence from the work on DirectX 12 (which in turn was inspired by Mantle), but these are still principally different APIs based on different coding paradigms, and they therefore work structurally differently.


----------



## RandallFlagg (Oct 7, 2022)

efikkan said:


> The discrepancy between synthetic performance and real world performance is strong evidence of the exact opposite; the problem is in hardware scheduling, not software at all. These performance characteristics are very comparable to the performance issues of Polaris and Vega, only worse. If driver overhead in general was to blame, we should have seen a clear growing overhead with the more powerful ARC cards. If specific API calls were to blame for slowing everything down, then they would have easily identified and fixed that by now. They've had the final GPUs in testing since early 2022, and engineering samples long before that. The issues with Alchemist is hardware level, and no amount of driver patchwork can solve that.



No, actually they know exactly where in the API the major bottlenecks are.


----------



## OneMoar (Oct 7, 2022)

efikkan said:


> The discrepancy between synthetic performance and real world performance is strong evidence of the exact opposite; the problem is in hardware scheduling, not software at all. These performance characteristics are very comparable to the performance issues of Polaris and Vega, only worse. If driver overhead in general was to blame, we should have seen a clear growing overhead with the more powerful ARC cards. If specific API calls were to blame for slowing everything down, then they would have easily identified and fixed that by now. They've had the final GPUs in testing since early 2022, and engineering samples long before that. The issues with Alchemist is hardware level, and no amount of driver patchwork can solve that.
> 
> 
> No, it's _Limited Performance Edition_.
> ...


They can fix it at the driver level as far as DX9/11 performance is concerned, either through a modified D3D9On12 implementation, DXVK, or some custom solution. D3D9On12 is fantastically slow because you are taking D3D9 API calls, which are already numerous and slow (the draw calls are notoriously expensive without a hardware-backed scheduler specific to DX9/11), and then converting them into a string of DX12 commands.

As for raw Vulkan / pure DX12 perf, I don't know how much more they can squeeze out of it. They really need to find another 15-20% of raw FPS uplift in addition to fixing the problems with the D3D9On12 translation layer.
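The point about per-call translation cost can be sketched with a toy model (every number here is invented for illustration; real per-call costs vary wildly):

```python
# Back-of-the-envelope model of why a fixed translation cost added to
# every draw call hurts DX9-era games the most: they issue many cheap
# calls, so the added per-call overhead comes to dominate the frame.
# All costs and call counts below are hypothetical.

def fps(draw_calls, native_cost_us, translation_cost_us):
    """Frame rate if the CPU spends (native + translation) per call."""
    frame_time_us = draw_calls * (native_cost_us + translation_cost_us)
    return 1_000_000 / frame_time_us

# hypothetical: 5000 calls/frame, 2 us native submission cost per call
print(f"native DX9 driver: {fps(5000, 2.0, 0.0):6.1f} FPS")   # 100.0
print(f"+1 us translation: {fps(5000, 2.0, 1.0):6.1f} FPS")   # ~66.7
print(f"+3 us translation: {fps(5000, 2.0, 3.0):6.1f} FPS")   # 40.0
```

A modern DX12 title batching the same work into far fewer calls barely notices the same per-call tax, which is consistent with Arc looking much better in new APIs than in old ones.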


----------



## efikkan (Oct 7, 2022)

RandallFlagg said:


> No, actually they know exactly where in the API the major bottlenecks are.


Bandwidth, etc. are not API bottlenecks.



OneMoar said:


> They can fix it at the driver level as far as DX9/11 performance is concerned, either through a modified D3D9On12 implementation, DXVK, or some custom solution. D3D9On12 is fantastically slow because you are taking D3D9 API calls, which are already numerous and slow (the draw calls are notoriously expensive without a hardware-backed scheduler specific to DX9/11), and then converting them into a string of DX12 commands.


They can fix DirectX 9 performance by *actually* implementing DirectX 9 in their driver instead of relying on an abstraction layer. But DirectX 9 games are not normally part of GPU reviews, so this will not skew the benchmark results in any way.

D3D9On12 is an abstraction layer made by Microsoft. It's not a matter of "optimizing" it, as that will never be a top-quality solution; they should implement DirectX 9 in their graphics driver instead. And speed isn't the biggest concern here, but the fact that DirectX 9 and 12 have very different states, and there isn't a direct 1-to-1 translation between API calls, so the result will always be lousy compared to a proper API implementation.

DirectX 11 isn't going to be affected by D3D9On12, nor are driver optimizations likely to deliver a major uplift here.



OneMoar said:


> As for raw Vulkan / pure DX12 perf, I don't know how much more they can squeeze out of it. They really need to find another 15-20% of raw FPS uplift in addition to fixing the problems with the D3D9On12 translation layer.


And they're not going to find it. They've had the final hardware since early 2022, and if there was a major bottleneck in the drivers which could unleash 20% more performance across the board, they would have found it by now.

The problem is in the hardware, and requires a hardware fix.


----------



## R-T-B (Oct 8, 2022)

efikkan said:


> Either this is some kind of joke, or you (like most people) have no idea what the word _literally_ actually means.


I mean literally.  Do I need to dig up old news, or can you use Google?  Mantle contributed its complete codebase to help get Vulkan going.  It's very dead, but its influence lives on; that is the point.

Heck, it even says this right on the Wikipedia page for Mantle, in the opening paragraph:









Mantle (API) - Wikipedia (en.wikipedia.org)
				




Of course the API has stuff to commit.  Do you think an API lacks documentation, code examples, and dev tools?


----------



## OneMoar (Oct 8, 2022)

efikkan said:


> Bandwidth, etc. are not API bottlenecks.
> 
> 
> They can fix DirectX 9 performance by *actually* implementing DirectX 9 in their driver instead of relying on an abstraction layer. But DirectX 9 games are not normally part of GPU reviews, so this will not skew the benchmark results in any way.
> ...


Your entire argument hinges on "well, if there was 20% in the drivers to find, they would have found it by now".
Yeah, if they were looking, or cared, or were not busy or incompetent.
This is Intel we are talking about; we have seen the level of give-a-fuck out of their driver team, and it's not very high.


----------



## efikkan (Oct 8, 2022)

OneMoar said:


> Your entire argument hinges on "well, if there was 20% in the drivers to find, they would have found it by now".
> Yeah, if they were looking, or cared, or were not busy or incompetent.
> This is Intel we are talking about; we have seen the level of give-a-fuck out of their driver team, and it's not very high.


What a remarkable, fact-based and well-formed response! You know, with cursing you can defeat any kind of logical argument 

Technically speaking, it's not hard to analyze and detect overhead. There are profiling tools which can pinpoint timing and resource allocation pretty precisely. It's not like developers rely on the Ballmer peak and crystal balls to optimize code; contrary to popular opinion, development work is surprisingly methodical, rational and deductive in nature.

So, if there were major driver overhead, it would be easily detectable. Not only would GPU performance be severely bottlenecked by the CPU, but we would expect this bottleneck to grow with GPU power (assuming FPS increases, not detail levels), so we should expect an A770 to be significantly more bottlenecked by it than an A380. As I've said, Intel Arc performs poorly in real-world gaming compared to synthetic benchmarks, which points to hardware-level scheduling, not driver overhead.

I don't think you grasp how massive a driver overhead issue would have to be to hold back ~20% performance. Whether a whole API or just a few API calls were causing it, it would be very evident in a profiling tool. And remember, to unleash major gains this trend has to persist across "every" workload, so any such trend should be easy to find; especially when a few API calls eat too much of the frame time and leave the GPU undersaturated, this sort of thing stands out in profiling. And the fact is that they have been struggling since the early engineering samples last year to squeeze out even a tiny bit more performance, but they can't, because there isn't anything significant to gain on the driver side. So it's very unlikely that Intel will suddenly stumble across something that unleashes 20% more performance, and I'm not talking about a single edge case here, but 20% across the board, which is highly unlikely.
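The kind of evidence a profiler gives here can be sketched with a toy check (all numbers invented): if the GPU sits idle for a large slice of each frame, the CPU/driver side is the limiter; if it is busy nearly the whole frame, the hardware itself is.

```python
# Minimal sketch of the check a GPU profiler makes trivial: compare
# wall-clock frame time against how long the GPU was actually busy.
# A large idle gap means the GPU is being starved by the CPU/driver;
# a small gap means the bottleneck is in the GPU itself.
# The timings below are hypothetical, not measured on any card.

def classify(frame_ms, gpu_busy_ms, threshold=0.10):
    idle_fraction = (frame_ms - gpu_busy_ms) / frame_ms
    return "CPU/driver-bound" if idle_fraction > threshold else "GPU-bound"

print(classify(16.7, 16.2))  # GPU busy nearly the whole frame
print(classify(16.7, 11.0))  # GPU idle ~34% of the frame
```

The argument above is that Arc shows the first pattern in real games, which is why driver work alone is unlikely to recover 20%.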

Then lastly, there is history;
Those who remember the launches of Polaris and Vega will recall that not only forums but also some reviewers claimed that driver optimizations would make them age like "fine wine" and turn out to be better investments than their Nvidia counterparts. Some even suggested the RX 480 would compete in the GTX 1070/1080 range once the drivers matured after "a few months". Well, did it happen? Not yet, but I'm sure it will happen any day now!
And there are not many examples of driver "miracles". The biggest pretty much across-the-board driver optimization I can recall was done by Nvidia shortly after the release of DirectX 12, when they brought most of their DirectX 12-related driver improvements to their DirectX 9/10/11 and OpenGL implementations and achieved something like ~10% after a massive overhaul. And this was overhead they had been well aware of for years.
Another recent example is AMD's OpenGL implementation rewrite, which yielded some significant gains (and some regressions). And this was an issue OpenGL devs had known about since the early 2000s; AMD's (ATI's) OpenGL implementation was always buggy and underperforming, and it was simply not prioritized for over a decade.

So my point here is, we should stop making excuses about poorly performing hardware and blaming "immature" drivers. DirectX 10/11/12 are high priority APIs, so if there were major bottlenecks in their driver implementation, they would know, no matter how "stupid" you think Intel's engineers are.
And isn't it funny that for years "immature drivers" have been the excuse whenever AMD has released an underperforming product (and now Intel), but never Nvidia? I smell bias…



R-T-B said:


> I mean literally.  Do I need to dig up old news, or can you use google?  Mantle contributed the complete codebase of itself to help get Vulkan going.  It's very dead, but it's influence lives on, is the point.


Contributing to something and claiming A became B are not the same thing. And since you are twisting words, I'm going to use *your own words* against *you*:
- "You mean like how *Mantle became Vulkan*"
- "Mantle was influential in low level APIs, and literally committed swaths of *code to Vulkan*"
Both of these claims are untrue, no matter how you twist it or split hairs.
Khronos developed Vulkan based on input from numerous contributors, including AMD with their Mantle, and built it on top of their SPIR-V architecture and the groundwork done by the AZDO initiative for OpenGL. While there may be some surface-level similarities between Mantle and Vulkan, Vulkan is far more fully featured and has much more state management than Mantle ever had, so they are not the same, even though many in the tech press don't know the difference.


----------



## AusWolf (Oct 8, 2022)

efikkan said:


> What a remarkable, fact based and well formed response! You know, with cursing you defeat any kind of logical argument
> 
> Technically speaking, it's not hard to analyze and detect overhead. There are profiling tools which can pinpoint pretty precisely timing and resource allocation. It's not like developers rely on the Ballmer peak and glass balls to optimize code, contrary to popular opinion development skills are surprisingly methodical, rational and deductive in nature.
> 
> ...


The only question is: did anybody actually test for CPU usage (to detect overhead symptoms) with these cards?


----------



## efikkan (Oct 8, 2022)

AusWolf said:


> The only question is: did anybody actually test for CPU usage (to detect overhead symptoms) with these cards?


Testing "CPU usage" isn't the right way to do it, as pretty much any rendering thread, or any other thread waiting for something, will be pegged at 100%. This is something developers do whenever there are latency concerns, because if we don't, the OS scheduler might kick another random thread in there and cause up to milliseconds of latency, which can ultimately affect the game. This is why most games have 1-3 threads at 100%, even if they have many more threads with some load.

The way to test for CPU bottlenecks is to reduce the potential bottleneck, by either using a faster CPU or a slower (or slowed-down) GPU, and check how real-world performance is affected.


----------



## Night (Oct 8, 2022)

> Vulkan is derived from and built upon components of AMD's Mantle API, which was donated by AMD to Khronos with the intent of giving Khronos a foundation on which to begin developing a low-level API that they could standardize across the industry.


Keywords: 'derived' and 'foundation'.


----------



## DemonicRyzen666 (Oct 8, 2022)

As Steve of Gamers Nexus said: *"Gamers are not enthusiasts"*

However, this is a fundamental problem for Intel.

Enthusiasts are the people willing to try new things regardless of the problems. They like to tinker with new things; OK, cool, those people will buy Arc. However, they're a very small niche nowadays compared to the sheer volume of "gamers".

Gamers don't want to be testers for hardware; they don't want to try new things. They just want stuff that works on everything they use and play. This is where the problem is, because Intel won't get the sheer volume of different setups it needs.

I still don't understand why gamers keep complaining about the prices of Nvidia GPUs.

Gamers should know by now that there are enough of them to persuade the market in such a way that Nvidia would stop selling overpriced GPUs if they would just stop buying them.

I consider myself an enthusiast, but I'm looking into mGPU gaming on DX12 & Vulkan, so the top card I'd buy is anything from an RTX 2070 Super to an RTX 3090 Ti / 6400 XT to 6950 XT / maybe an A380, A750 or A770; there is no point for me to ever buy an RTX 4090. I've been doing research on mGPU games & games that support it. I don't have a reason to keep up with upcoming triple-A titles, which means I'll probably be finding older games that usually end up on sale. What I'm doing is actually very hard; heck, even just finding the games is.

Gamers would feel like what I'm doing is a very big waste of time & money, but it's what I want to do, not them. To me, many times on here it just seems like gamers have a "mob mentality": it's their way or nothing.


----------



## efikkan (Oct 8, 2022)

As I've said in earlier threads;
My opinion is that Intel could have turned this entire Arc debacle into something positive by labeling it "Beta" and selling it at a very significant discount, like ~$150 for A770, and with a crystal clear disclaimer that performance would be inconsistent and there could still be significant bugs.


----------



## wheresmycar (Oct 9, 2022)

DemonicRyzen666 said:


> As Steve of Gamers Nexus said: *"Gamers are not enthusiasts"*
> 
> However, this is a fundamental problem for Intel.
> 
> ...



I'm sure that was said in context and not necessarily branding all gamers as non-hardware-enthusiasts. In general, I can't see most gamers being hardware enthusiasts, which usually come in small numbers in any workload/profession/etc... most gamers just want sufficient hardware, or if the pockets are swell, overkill hardware, for a wholesome gaming experience. Many of these guys will opt for prebuilts, consoles and handhelds. Then you've got the gamer community who fancy getting their hands dirty going the DIY route for all sorts of reasons (aesthetics, uncompromising quality parts, freedom for hardware adaptation/upgrading, just making it your own per size/preference/spec/colour contrast/features/etc.). Some of these reasons spell out "interest"/"hobby"/"hands-on motivation", which is already a level up in "enthusiasm"... so how do we then determine who is a hardware enthusiast and who isn't? I'd like to see this enthusiast-measuring yardstick.

Keep in mind, most people don't have enough money to splash around and be, as you suggest, _"willing to try new things regardless of the problems"_, or as you put it: _"Gamers don't want to be testers for hardware, they don't want to try new things. They just want stuff that works, on everything they use & play"_. I'm certain the "enthusiast" has not been robbed of the intelligence or logic to blindly buy into anything and everything to fill some odd urge to nurse their enthusiasm... I'm sure the majority of enthusiasts are ordinary people who just want the best hardware at the right price (also considering temps/noise/power consumption) to fulfil a desired performance target (be it gaming or other). That's why we look to reviews/benchmarks/etc. for a more informed decision before parting with our hard-earned money.

I'm a gamer and definitely a hardware enthusiast. Even when i'm not buying i'm fixated on all things new and challenging. It gets so bad, even after upgrading to something more than adequate if not overkill, it doesn't take long for the upgrade-itch to kick in. Not because I "need" more performance but because the fixation of newer developments, the small insignificant details and negligible upticks in performance keeps the ball rolling - it's a battlefront for the enthused to keep up with the more relevant 3 year'ish (or 5 for a platform swap) upgrade plan. In short, I don't need to touch hardware on a regular basis nor invest a ton of money on various applications of hardware and yet i'm ENTHUSED!! 



DemonicRyzen666 said:


> I still don't understand why gamers keep complaining about prices of Nvidia GPU's?



If you still don't get it, I don't believe the train of cognition is going to arrive any time soon. It has probably already left the station.

A little clue though: the vast majority of gamers can't afford, or can't justify, throwing their hard-earned cash at $500+ graphics cards. I earn well and usually maintain a healthy savings account... even I highly dislike how the market dynamics have changed, and I simply don't like parting with chunks of my money to feed unregulated authoritarian greed. But what can you do... I'm a performance buff and I demand visual eye candy at its best... hence I'm compelled by the forces at play to take another ~$800 out of my savings to grab a next-gen high-performance card. Not just an enthusiast, but enthusiastically obsessed if you ask me!



DemonicRyzen666 said:


> Gamers would feel like what I'm doing is a very big waste of time & money, but it's what I want to do, not them. To me, many times on here it just seems like gamers have a "mob mentality": it's their way or nothing.



Honestly, I don't see this type of my-way-or-the-highway gamer mob mentality on my screen, especially on this site. It's likely that the majority of members here are gamers, hence seeing more game-relevant material shouldn't be too surprising. You should also appreciate the large "gamer" slice of the pie... it's one of the well-fed and teased instruments which heavily influences modern consumer tech.


----------



## Redwoodz (Oct 9, 2022)

Funny how, in this TPU review, the 6650 XT is presented at the beginning as a direct competitor to the A770, yet in every single graph used, you notice the 6650 XT is completely missing; only the 6600 XT is used instead. Seeing as how your very own review states the 6650 XT is overall 5% better than the 6600 XT, this omission is the single reason you can claim the Intel A770 has a price-performance lead... except it doesn't. Nice try.


----------



## Dirt Chip (Oct 9, 2022)

DemonicRyzen666 said:


> As Steve of gamers Nexus said *"Gamers are not Enthuiast"*


Generally speaking, I don't think you need expensive stuff in your computer to be an enthusiast (see my system), and vice versa: you are not an enthusiast by default if you have expensive stuff in your computer, so we very much agree on that.
Regarding Arc, I don't think buying one is an enthusiast thing, more like an 'early-adopter' thing. I'm not even considering buying one until Arc's second gen, but the subject interests me a lot. Reason: the drivers are too immature for my taste, but I can definitely understand someone buying one for exactly that reason.
So an enthusiast isn't an 'early adopter'; most of them aren't, imo.


----------



## AusWolf (Oct 9, 2022)

Dirt Chip said:


> Generally speaking, I don't think you need expensive stuff in your computer to be an enthusiast (see my system), and vice versa: you are not an enthusiast by default if you have expensive stuff in your computer, so we very much agree on that.
> Regarding Arc, I don't think buying one is an enthusiast thing, more like an 'early-adopter' thing. I'm not even considering buying one until Arc's second gen, but the subject interests me a lot. Reason: the drivers are too immature for my taste, but I can definitely understand someone buying one for exactly that reason.
> So an enthusiast isn't an 'early adopter'; most of them aren't, imo.


Not only that. I think using low-end and outdated hardware to create something that is still usable today is more of an enthusiast thing than buying the highest-end stuff and calling it a day. Everybody knows what a 3090 is, and everybody knows it'll play any game at basically any setting. There's no challenge in building a system around it. You only need money. I'm more proud of having built my two HTPCs (in my signature) than I was of any of my high-end systems in the past. "HTPC 2" is dead silent with a passively cooled CPU as well as GPU, yet it's still capable of some light gaming. Currently, I only have Nascar Heat 5 installed on it, which runs at around 40 FPS at 1080p which I think is impressive.


----------



## QuietBob (Oct 9, 2022)

The more I think about it, the less reason I see behind Intel's marketing and pricing of the A750/770. It's quite obvious that by betting on DX12/RT and AV1 encode they are targeting the younger demographic of "pro gamers" / Twitch streamers. These are typically the people who only play the newest or currently most popular games, and who would purchase any publicized title on launch day. But this group tends to be habitual Nvidia buyers, who are not going to change their affinity simply because a third player has entered the market. They are also very likely to already own a GPU which performs on par with Intel's top offering, or better.

And I find it funny how Intel seems to completely disregard AMD's 6600 XT/6650 XT as the competitor to the A750/770. Intel promotes their cards as a cheaper and faster alternative to Nvidia's RTX 3060, but they fail even by this metric. In TPU's review, the A750 ends up being marginally slower than the RTX 3060 at 1080p and only slightly faster at 1440p, while the A770 is a mere 4% and 12% faster. Price-wise, at least in Europe, the RTX 3060 can easily be found for under $300, and the 6600 XT even cheaper. Either of these cards offers better value for money, with more consistent performance, mature drivers, better compatibility in a wide range of games, and much better power efficiency.

Don't get me wrong, I really want Intel to succeed with their dGPUs. But I can't see that happening with the suggested MSRP of $300-350. When you are a newcomer in a highly competitive market, you don't win customers by offering them something similar at a higher price. You need to give them something better at a lower price.


----------



## AusWolf (Oct 9, 2022)

QuietBob said:


> The more I think about it, the less reason I see behind Intel's marketing and pricing of the A750/770. It's quite obvious that by betting on DX12/RT and AV1 encode they are targeting the younger demographic of "pro gamers" / Twitch streamers. These are typically the people who only play the newest or currently most popular games, and who would purchase any publicized title on launch day. But this group tends to be habitual Nvidia buyers, who are not going to change their affinity simply because a third player has entered the market. They are also very likely to already own a GPU which performs on par with Intel's top offering, or better.
> 
> And I find it funny how Intel seems to completely disregard AMD's 6600 XT/6650 XT as the competitor to the A750/770. Intel promotes their cards as a cheaper and faster alternative to Nvidia's RTX 3060, but they fail even by this metric. In TPU's review, the A750 ends up being marginally slower than the RTX 3060 at 1080p and only slightly faster at 1440p, while the A770 is a mere 4% and 12% faster. Price-wise, at least in Europe, the RTX 3060 can easily be found for under $300, and the 6600 XT even cheaper. Either of these cards offers better value for money, with more consistent performance, mature drivers, better compatibility in a wide range of games, and much better power efficiency.
> 
> Don't get me wrong, I really want Intel to succeed with their dGPUs. But I can't see that happening with the suggested MSRP of $300-350. When you are a newcomer in a highly competitive market, you don't win customers by offering them something similar at a higher price. You need to give them something better at a lower price.


I think targeting DX12 and RT is normal - that's where we need more performance. DX11 and older games already run well enough on basically anything (except for a couple titles on Arc).

I want Intel to succeed myself, and I really want to buy an A770 to play with it, but a couple of things hold me back:
1. Drivers.
2. Unexplained high idle power consumption. My PC is in idle / browsing most of its time, so it's really important.
3. Low overall performance relative to chip size, theoretical capabilities and power consumption. I mean, 4096 shaders with 128 ROPs should really perform better, especially with a 225 W TDP.
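A rough back-of-the-envelope for point 3, using the public shader counts and approximate boost clocks (treat the exact clocks as assumptions):

```python
# On paper the chip is not short of compute. Peak FP32 throughput is
# usually estimated as shaders x 2 ops (one FMA counts as two) x clock.
# Clocks below are approximate boost figures, not guaranteed values.

def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

print(f"A770:        ~{tflops(4096, 2.10):.1f} TFLOPS")  # ~17.2
print(f"RTX 3060 Ti: ~{tflops(4864, 1.67):.1f} TFLOPS")  # ~16.2
```

Comparable paper throughput, yet the 3060 Ti wins most real games, which is what makes the gap between theoretical capability and delivered performance so striking.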

Also, I found a really good deal on the reference 6750 XT at a local store, which made me question what I want. It's 100-120 quid cheaper than AIB 6750 XTs. I think I'll wait until the A770 appears in stores, see what its real price is like, and go from there.


----------



## Count von Schwalbe (Oct 9, 2022)

AusWolf said:


> Unexplained high idle power consumption.


VRAM locked to 2000MHz on idle?


----------



## AusWolf (Oct 10, 2022)

Count von Schwalbe said:


> VRAM locked to 2000MHz on idle?


Probably. The question is, is that due to the beta driver, or is it by design?


----------



## efikkan (Oct 10, 2022)

AusWolf said:


> I think targeting DX12 and RT is normal - that's where we need more performance. DX11 and older games already run well enough on basically anything (except for a couple titles on Arc).
> 
> I want Intel to succeed myself, and I really want to buy an A770 to play with it, but a couple of things hold me back:
> 1. Drivers.


Well, it's not necessarily wrong to think that performance only matters for the latest titles, and that the rest are fine as long as they reach a decent performance level (e.g. 120 FPS with good consistency). But if you think that ignoring everything but DirectX 12 titles is going to paint a different picture, you're mistaken. Let's go down the rabbit hole…

At 1440p, if you only count DirectX 12 titles, the RTX 3060 Ti is still 13% faster than the A770 (compared to 17% with all games); similarly, the RTX 3060 goes from being 10% slower to 14% slower with only DirectX 12 titles. So barely a significant difference, and not enough to bump it up a performance tier or two. If you look closely, there are very few games where the A770 can beat the RTX 3060 Ti, and the few times it does seem more like statistical outliers than anything. What's even more concerning, if you study the numbers, is that the performance characteristics in DirectX 12 games vary a lot more than in DirectX 11 games, which makes sense, as the quality of the game code affects performance more the more control the game has. If you remember the article about Intel's official performance figures for the A770, their DirectX 12 results showcased much "better" numbers vs. the RTX 3060 Ti, but as I pointed out in that thread, they used a lot of obscure Unreal titles which made up a large portion of the "favorable" games, so no doubt they had cherry-picked games to make the A770 look better for DirectX 12 games and "future" games.

There is an interesting historical parallel: the Radeon 200/300 series was supposed to be better than Nvidia's based on a few cherry-picked games, e.g. AotS. Even back then I pointed out these were statistical outliers, but people claimed they counted for more, as they somehow represented future games. This myth lived on with the 400/500 series and Vega, along with the magical driver optimizations which never arrived.

So, in conclusion, there is no technical or statistical basis to claim that Arc Alchemist is going to be viewed more favorably as DirectX 12 games become more dominant, and it's extremely unlikely that future games are going to bump it up a performance tier or two. Remember that based on the specs of this chip we should expect it to perform in the RTX 3080 range, but it doesn't come close, not because of lacking drivers or faulty games, but because of terribly performing hardware. No amount of driver tinkering and new games is going to paint a very different picture. And lastly, don't forget that the A770 only managed to hit >120 FPS in one of the DirectX 12 games, and assuming future games will be more demanding, the A770 is not even going to be regarded as a 1440p card any more.


----------



## big_glasses (Oct 10, 2022)

What's considered running "well enough"? (I hate that form of referring to perf, given everybody has different acceptable levels. which again might be different depending on games)

I'll assume for this card that's 1080p@60fps?




If it's 60fps, then it's a couple that is just about or under, GoW(dx11) being 60fps on the point (which probably means dips up and down), BL3(Dx11) is weird, but very low. and TW:WH3 is also below, but is strategy so more acceptable and presumably can reduce settings to boost fps (which I guess is applicable for all...)
BF V(dx11) with a very acceptable 112fps have almost a 40fps drop in the avg to 99th percentile fps

edit: I'd love it if a DX9 game or two were added to the test bench (and a Unity-engine game).


----------



## AusWolf (Oct 10, 2022)

efikkan said:


> Well, it's not necessarily wrong to think like performance only matters for the latest titles, and the rest are fine as long as they reach a decent performance level (e.g. 120 FPS, good consistency).


This is exactly what I mean. Older titles require fewer resources, so they tend to run well enough even without driver optimizations. In HuB's review, the A770 was deemed a failure because it only delivered 120 FPS in CS:GO while everything else in the price range did well above 300. I can't speak for others, but 120 FPS is way more than I'll ever need in any game. The "you paid for more performance than that" argument is understandable, except that I literally can't tell the difference between 120 and 300 FPS, especially not on my 60 Hz monitor.



efikkan said:


> At 1440p, if you only count DirectX 12 titles, the RTX 3060 Ti is still 13% faster than the A770 (compared to 17% across all games); similarly, the RTX 3060 goes from being 10% slower to 14% slower with only DirectX 12 titles. That's barely a significant difference, and certainly not enough to bump it up a performance tier or two. If you look closely, there are very few games where the A770 can beat the RTX 3060 Ti, and the few times it does look more like statistical outliers than anything. What's even more concerning, if you study the numbers, is that performance varies a lot more across DirectX 12 games than DirectX 11 games, which makes sense: the more control the game has, the more the quality of the game code affects performance. If you remember the article about Intel's official performance figures for the A770, their DirectX 12 results showed much "better" numbers vs. the RTX 3060 Ti, but as I pointed out in that thread, they used a lot of obscure Unreal titles which made up a large portion of the "favorable" games, so they no doubt cherry-picked games to make the A770 look better in DirectX 12 games and "future" games.


It's the same lesson that we're being taught at every launch: never believe internal marketing numbers.



efikkan said:


> So in conclusion, there is no technical or statistical basis to claim that Arc Alchemist will be viewed more favorably as DirectX 12 games become more dominant, and it's extremely unlikely that future games will bump it up a performance tier or two. Remember that based on this chip's specs we should expect it to perform in RTX 3080 territory, yet it doesn't come close, not because of lacking drivers or faulty games, but because of poorly performing hardware. No amount of driver tinkering or new games is going to paint a very different picture. And lastly, don't forget that the A770 only managed to exceed 120 FPS in one of the DirectX 12 games; assuming future games will be more demanding, the A770 isn't even going to be regarded as a 1440p card any more.


Maybe not in DX12 in general, but in RT perhaps - their hardware seems to be quite capable, keeping pace with the 2080 in Control and Cyberpunk, and even passing it in Metro: Exodus while Radeon cards just completely lose their sh*t when RT is on.


----------



## trsttte (Oct 10, 2022)

AusWolf said:


> Probably. The question is, is that due to the beta driver, or is it by design?



On the Intel Discord, someone mentioned that's caused by a lack of PCIe Active State Power Management (ASPM), but that should be a pretty common feature, so maybe it's some driver/firmware weirdness in Arc cards. Definitely not ideal; those idle and basic-use power draws are pretty brutal.
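For anyone on Linux who wants to check this themselves, here's a rough sketch (the sysfs path assumes a kernel built with PCIe ASPM support; it won't exist otherwise) of how to read the kernel-wide ASPM policy:

```shell
# Show the kernel-wide ASPM policy; the active choice is printed
# in [brackets], e.g. "default [performance] powersave".
policy_file=/sys/module/pcie_aspm/parameters/policy
if [ -r "$policy_file" ]; then
    cat "$policy_file"
else
    # ASPM compiled out of the kernel, or not a Linux system
    echo "ASPM policy not exposed by this kernel"
fi

# Per-device state: the "LnkCtl:" lines of `lspci -vv` (run as root)
# show whether ASPM L0s/L1 is actually enabled on each link, e.g.:
# lspci -vv | grep -E "LnkCtl:.*ASPM"
```

Whether the card's idle draw actually responds to these settings is a separate question, but at least it makes the "is ASPM even on?" part easy to rule out.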


----------



## AusWolf (Oct 10, 2022)

trsttte said:


> On the Intel Discord, someone mentioned that's caused by a lack of PCIe Active State Power Management (ASPM), but that should be a pretty common feature, so maybe it's some driver/firmware weirdness in Arc cards. Definitely not ideal; those idle and basic-use power draws are pretty brutal.


Yeah. That's what's keeping me from buying one right now (apart from availability). Part of me is giving that reference 6750 XT more and more thought.


----------



## RandallFlagg (Oct 10, 2022)

AusWolf said:


> Yeah. That's what's keeping me from buying one right now (apart from availability). Part of me is giving that reference 6750 XT more and more thought.



AMD has everyone beat in the price/perf category for GPUs right now.  Although, I noticed some price inflation on the 6700 XT in the last 2 weeks.  For a little bit there, you could get one for $379.  They seem to be back up to $420 now.

Intel has another ace up the sleeve though.  Bundled software.  I don't usually assign 'free' software any real value, because typically you're looking at either one game which I will never play, or a few months of a subscription service which I won't renew.

In this case though, it's pretty massive.  You get all of the games listed (though one is in-game content), and can select 3 of the image/video editing packages.  A couple of these media editing packages are subscriptions, but 3 of them are full packages, and after looking them up, they are not gimmicky cheap software.

This is apparently available not only for discrete but also for mobile Arc.  For anyone looking to get into content creation who doesn't want to get stuck with Adobe subscriptions, or maybe just wants to get off such a subscription, this is a pretty compelling option.






This is Gigapixel AI upscaling from a review on youtube.


----------



## AusWolf (Oct 10, 2022)

RandallFlagg said:


> AMD has everyone beat in the price/perf category for GPUs right now.  Although, I noticed some price inflation on the 6700 XT in the last 2 weeks.  For a little bit there, you could get one for $379.  They seem to be back up to $420 now.
> 
> Intel has another ace up the sleeve though.  Bundled software.  I don't usually assign 'free' software any real value, because typically you're looking at either one game which I will never play, or a few months of a subscription service which I won't renew.
> 
> ...


Nice. Although I'm on your side - bundled software doesn't interest me 99% of the time. The games that I want to play, I already have (mostly). My Steam wishlist consists mainly of slightly overpriced indie games, waiting for a discount. Those will never be bundled with anything.


----------



## RandallFlagg (Oct 10, 2022)

AusWolf said:


> Nice. Although I'm on your side - bundled software doesn't interest me 99% of the time. The games that I want to play, I already have (mostly). My Steam wishlist consists mainly of slightly overpriced indie games, waiting for a discount. Those will never be bundled with anything.



Yup, the games are of near zero value to me.  

However, the media software is different.  I think I could, for me, ascribe about $100 of value to that pretty easily.


----------



## efikkan (Oct 10, 2022)

AusWolf said:


> Maybe not in DX12 in general, but in RT perhaps - their hardware seems to be quite capable, keeping pace with the 2080 in Control and Cyberpunk, and even passing it in Metro: Exodus while Radeon cards just completely lose their sh*t when RT is on.


The trend in RT is very comparable to how the A750/A770 perform in gaming overall vs. their respective Nvidia counterparts. (AMD is a different story.)
So if you consider the A770 an "RTX 3060 Ti"-class card (at best), you'll see it also performs comparably to the RTX 3060 Ti in RT. There are only a few cases where it barely outperforms the RTX 3060 Ti in RT. And matching the RTX 2080 isn't impressive, as that card is weaker than the RTX 3060 Ti. So they may beat AMD in this area, but that still doesn't make them a good option.

One thing which may make the competition even tougher is Nvidia's impending RTX 3060/3060 Ti refreshes with faster memory, and then next year the RTX 4060 will probably make the A770 a hard sell.


----------



## QUANTUMPHYSICS (Oct 11, 2022)

If Intel was building its own gaming PC with its own parts, this would be awesome in a $1000 machine. 

Choosy moms choose 4090 tho...


----------



## AusWolf (Oct 11, 2022)

I'll just drop this here.









This is how Intel ARC graphics save electricity while idle - iGamesNews
igamesnews.com

> One of the most important points of a graphics card is power management. Since we're not interested in clock ramp-up periods that last longer than necessary.


----------



## OneMoar (Oct 11, 2022)

AusWolf said:


> I'll just drop this here.
> 
> 
> 
> ...


bruh, that's from MAY.
And ASPM is a PCI-E spec; literally every device supports it,
and it's something most people turn off because it can lead to screwy behavior.

I would expect nothing less from a site calling itself "igamesnews".
That's some quality tech journalism.


----------



## Solid State Brain (Oct 11, 2022)

I've read that "Native ASPM" needs to be enabled in the BIOS settings for Arc GPUs to reach good idle power consumption, but I'm skeptical that it would have any effect under the conditions tested in the TPU review.

On my PC it is necessary to enable it so that, on Windows, the discrete GPU can turn off completely when not in use in a hybrid graphics configuration (iGPU+dGPU), but I don't know whether on modern hardware that setting enables power savings with an active display output and just one discrete GPU in the system.


----------



## trsttte (Oct 11, 2022)

AusWolf said:


> I'll just drop this here.
> 
> 
> 
> ...



From the article:



> This is not an Intel-only feature, as many NVIDIA and AMD cards support this mode as it is a function of the PCI Express interface. However, it is usually not advertised. Of course, this has become one of the reasons why GPUs have become more efficient in recent years and have therefore been able to increase their performance per watt. Its support in Intel ARC should therefore not surprise us.



And only Intel cards seem to have this problem; I don't know what to make of it. On my system, HWiNFO lists ASPM with support for L1 as disabled, and I have no BIOS option for it.

I might have to contact MSI to see what's up, but either way, this seems like an Intel problem (even if they are the only ones following the spec; maybe first follow the established players who don't have this problem?)


----------



## kapone32 (Oct 11, 2022)

AusWolf said:


> Nice. Although I'm on your side - bundled software doesn't interest me 99% of the time. The games that I want to play, I already have (mostly). My Steam wishlist consists mainly of slightly overpriced indie games, waiting for a discount. Those will never be bundled with anything.


Gotham Knights looks pretty good though.



RandallFlagg said:


> AMD has everyone beat in the price/perf category for GPUs right now.  Although, I noticed some price inflation on the 6700 XT in the last 2 weeks.  For a little bit there, you could get one for $379.  They seem to be back up to $420 now.
> 
> Intel has another ace up the sleeve though.  Bundled software.  I don't usually assign 'free' software any real value, because typically you're looking at either one game which I will never play, or a few months of a subscription service which I won't renew.
> 
> ...


I might buy one of these out of curiosity, but Gotham Knights has added an incentive. I don't play COD, but I'm sure that will be sweet too. I just hope it's not a scenario where you have to have the card installed to gain access to the key. Gigapixel AI looks very interesting too.


----------



## Solid State Brain (Oct 11, 2022)

trsttte said:


> From the article:
> 
> 
> 
> ...



My 2016 Radeon RX 480 supports ASPM L1 mode; that's presumably what allows it to turn off completely in the hybrid graphics configuration on my system. Perhaps L0s is the one that could allow idle power savings on Intel Arc GPUs? But I'm not going to purchase one just to find out.






Active State Power Management - Wikipedia
en.wikipedia.org
				






> Currently, two low power modes are specified by the PCI Express 2.0 specification; L0s and L1 mode. L0s concerns setting low power mode for one direction of the serial link only, usually downstream of the PHY controller. L1 shuts off PCI Express link completely, including the reference clock signal, until a dedicated signal (CLKREQ#) is asserted, and results in greater power reductions though with the penalty of greater exit latency.




```
# lspci -vv

[...]
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Ellesmere [Radeon RX 470/480/570/570X/580/580X/590] (rev c7) (prog-if 00 [VGA controller])
        Subsystem: PC Partner Limited / Sapphire Technology Radeon RX 470/480
[...]
        Capabilities: [58] Express (v2) Legacy Endpoint, MSI 00
[...]
                LnkCap: Port #0, Speed 8GT/s, Width x16, ASPM L1, Exit Latency L1 <1us
                        ClockPM+ Surprise- LLActRep- BwNot- ASPMOptComp+
                LnkCtl: ASPM L1 Enabled; RCB 64 bytes, Disabled- CommClk+
                        ExtSynch- ClockPM+ AutWidDis- BWInt- AutBWInt-

[...]
        Capabilities: [370 v1] L1 PM Substates
                L1SubCap: PCI-PM_L1.2+ PCI-PM_L1.1+ ASPM_L1.2+ ASPM_L1.1+ L1_PM_Substates+
                          PortCommonModeRestoreTime=0us PortTPowerOnTime=170us
                L1SubCtl1: PCI-PM_L1.2- PCI-PM_L1.1- ASPM_L1.2- ASPM_L1.1-
                           T_CommonMode=0us LTR1.2_Threshold=184320ns
                L1SubCtl2: T_PwrOn=170us
[...]
```


----------



## kapone32 (Oct 11, 2022)

RandallFlagg said:


> AMD has everyone beat in the price/perf category for GPUs right now.  Although, I noticed some price inflation on the 6700 XT in the last 2 weeks.  For a little bit there, you could get one for $379.  They seem to be back up to $420 now.
> 
> Intel has another ace up the sleeve though.  Bundled software.  I don't usually assign 'free' software any real value, because typically you're looking at either one game which I will never play, or a few months of a subscription service which I won't renew.
> 
> ...


I just copied this from the Magix website.
"Thanks to our partnership with Intel, we can now offer optimized Intel® Hyper Encode technology alongside Video Pro X. This means both graphics cards are simultaneously enabled on Intel desktop and laptop systems equipped with an onboard GPU and an additional Intel GPU, such as the brand-new Intel® Arc™. Hyper Encode accelerates rendering, so you can export your movie twice as fast* as with just one GPU."

Read more: https://www.magix.com/ca/video-editor/video-pro-x/new-features/

This could actually sell these cards if it's compelling enough. There could be a world where you have an Intel system with an AMD/Nvidia card in the top x8 slot and an Arc card in the second slot (on some motherboards), using the Arc card for video processing and the first card for gaming.


----------



## Count von Schwalbe (Oct 11, 2022)

kapone32 said:


> I just copied this from the Magix website.
> "Thanks to our partnership with Intel, we can now offer optimized Intel® Hyper Encode technology alongside Video Pro X. This means both graphics cards are simultaneously enabled on Intel desktop and laptop systems equipped with an onboard GPU and an additional Intel GPU, such as the brand-new Intel® Arc™. Hyper Encode accelerates rendering, so you can export your movie twice as fast* as with just one GPU."
> 
> Read more: https://www.magix.com/ca/video-editor/video-pro-x/new-features/
> ...


Most boards have an x4 slot too, and I doubt it would need more than that. A 30(40?)90 in the top slot at full x16 and an A380 in the bottom slot at x4 should do it.

I wonder if AMD can do that with their APUs or 7000-series iGPUs.


----------



## AusWolf (Oct 11, 2022)

OneMoar said:


> brauh thats from MAY
> and ASPM is a PCI-E spec literally every device supports it
> and something most people turn off  because it can lead to screwy behavior
> 
> ...


Is this it? I've always had it enabled without any issue.


----------



## Solid State Brain (Oct 11, 2022)

On my MSI PRO Z690-A motherboard, if I don't enable "Native ASPM", the dGPU won't turn off on Windows (11) when not in use (again, I have a hybrid graphics configuration and the video outputs are normally on the iGPU). On Linux, however, that setting appears to be ignored; the dGPU turns off anyway.

In other BIOS settings I can also enable L1 or L0s states for each PCIe device.
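Incidentally, on Linux you can see whether the dGPU has really powered down straight from sysfs, no extra tools needed. This is a rough sketch assuming a standard sysfs layout (PCI class 0x0300xx is a VGA controller, 0x0302xx a 3D controller):

```shell
# List each PCI display device with the kernel's runtime-PM verdict:
# "suspended" means the device has been powered down, "active" means
# it is still drawing power.
for dev in /sys/bus/pci/devices/*; do
    class=$(cat "$dev/class" 2>/dev/null) || continue
    case "$class" in
        0x0300*|0x0302*)  # VGA controller / 3D controller
            status=$(cat "$dev/power/runtime_status" 2>/dev/null || echo "unknown")
            echo "${dev##*/}: $status"
            ;;
    esac
done
```

In a hybrid setup like the one described above, the iGPU should read "active" while the idle dGPU reads "suspended"; if the dGPU stays "active" at idle, runtime PM (or ASPM) isn't kicking in.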


----------



## solarmystic (Oct 11, 2022)

Thanks for the updated positioning boss! @W1zzard


----------



## W1zzard (Oct 11, 2022)

solarmystic said:


> Thanks for the updated positioning boss! @W1zzard


Yeah, I added the A750, A770, and RTX 4090. The Arcs have been on my list all week, but I couldn't find any time with the FE review. Made good progress on the 7 custom-design reviews today; all benchmarks and photos are done, just text to write tomorrow.


----------



## AusWolf (Oct 12, 2022)

The bloody thing finally showed up at one of the UK stores for £389, and the A750 for £329. I'm still in a dilemma over whether I should get one of these or the 6750 XT for £469.


----------



## RandallFlagg (Oct 13, 2022)

AusWolf said:


> The bloody thing finally showed up at one of the UK stores for £389, and the A750 for £329. I'm still in a dilemma over whether I should get one of these or the 6750 XT for £469.



Honestly, unless you want some of the software that comes with the Intel A770, or unless everything you want to play is DX12 or Vulkan, I'd say the 6750 XT.

I'd like to get the software; I have some use for the image and video editors.

I think that is only for the Intel-branded models though.  I haven't seen any of the AIB models except a couple of the ASRock ones, and those were both hard to find and sold out.

This is a very strange launch.  I get the impression it is being done like a big pre-launch beta test, like Intel doesn't want too many out there because they don't want to deal with a big volume of support calls.


----------



## kapone32 (Oct 13, 2022)

Looks like the software is only available in the US, as I see nothing about any free software here in Canada.


----------



## RandallFlagg (Oct 13, 2022)

kapone32 said:


> Looks like the software is only available in the US, as I see nothing about any free software here in Canada.



I am pretty sure you get it anywhere.

There is a catch though :









Intel to give away $370 worth of games and software with select Intel Alder Lake & Arc Alchemist systems - VideoCardz.com
videocardz.com

> Intel 12th Gen Core and Arc A7/A5 system purchases are eligible for free software. Intel announced a new Software Advantage Program targeted at prebuilt systems and laptops equipped with the latest hardware. Intel Arc Alchemist GPUs might not be here yet for desktops, at least not everywhere and...

Intel | Software Advantage Program
softwareoffer.intel.com


----------



## trsttte (Oct 18, 2022)

High Power Consumption when Intel® Arc™ Graphics Card is...
www.intel.sg

> Configuration required to enable an idle low power consumption profile for Intel Arc Graphics Desktop cards.

Intel has this guide about reducing idle power consumption; any chance you could take a look with one of the Intel GPUs, @W1zzard (if the X570 Dark doesn't have the option, maybe with a different board that does)?


----------



## Warrax (Nov 24, 2022)

Great review, W1zzard! Your dedication is incredible, as are all the in-depth benchmarks.


----------

