# Intel Arc A380



## W1zzard (Jul 26, 2022)

Intel Arc A380 is the first graphics card from the Blue Team this year. It's based on the new Xe "Alchemist" architecture, which includes support for hardware-accelerated ray tracing. In our in-depth review of the A380 we check rasterization performance, RT performance, power, heat, and noise, all on an AMD Ryzen platform.



----------



## ZetZet (Jul 26, 2022)

Very good, almost matches my 1060 3GB 4 years later for the same amount of money.


But seriously, who is this card for? It's shit value for gaming, it's too expensive to just be a display expansion card...


----------



## ZoneDymo (Jul 26, 2022)

That RT performance is honestly pretty impressive, looking forward to seeing what the A770 will be like


----------



## ZetZet (Jul 26, 2022)

ZoneDymo said:


> That RT performance is honestly pretty impressive, looking forward to seeing what the A770 will be like


I don't know if I would use the word impressive, it's either matching or losing to Nvidia 3xxx series cards. AMD supposedly got close to Nvidia with RDNA3 too, of course we will have to see. But it's just okay, especially since Intel had all the advantage of designing their cards around RT, instead of having to slap it on like AMD did without much time.


----------



## ZeppMan217 (Jul 26, 2022)

A beta version of a GPU. What a time to be not dead.


----------



## ZoneDymo (Jul 26, 2022)

ZetZet said:


> I don't know if I would use the word impressive, it's either matching or losing to Nvidia 3xxx series cards. AMD supposedly got close to Nvidia with RDNA3 too, of course we will have to see. But it's just okay, especially since Intel had all the advantage of designing their cards around RT, instead of having to slap it on like AMD did without much time.



Yeah, but this is also such a weak card that no gamer should get it. So when we get the A750 and A770, I'm expecting those to do relatively well there, which is good enough.


----------



## R0H1T (Jul 26, 2022)

Slightly impressed in some areas, like power consumption, temps/heat and noise especially, but a lot of others were a letdown, including that awful hit due to ReBAR!


ZoneDymo said:


> That RT performance is honestly pretty impressive, looking forward to seeing what the A770 will be like


You mean that performance only marginally better than an XP slideshow with RT on 


ZoneDymo said:


> Yeah, but this is also such a weak card that no gamer should get it. So when we get the A750 and A770, I'm expecting those to do relatively well there, which is good enough.


I'm still hoping RT dies some unnatural death & we get something much less taxing/efficient maybe with a hybrid RT+raster approach completely "revolutionizing" gaming.


----------



## Bruno_O (Jul 26, 2022)

HTPC card I guess... would rather get a 5600G for that actually


----------



## R0H1T (Jul 26, 2022)

It's definitely much better than that.


----------



## FreedomEclipse (Jul 26, 2022)

ZetZet said:


> But seriously, who is this card for? It's shit value for gaming, it's too expensive to just be a display expansion card...



I guess that's why they limited distribution to China only; it's their testing ground. Of course you'll eventually be able to buy one off eBay or AliExpress. Even if Intel banned sales to overseas customers, it would still make it overseas by other means eventually.


----------



## ir_cow (Jul 26, 2022)

ZetZet said:


> But seriously, who is this card for? It's shit value for gaming, it's too expensive to just be a display expansion card...


China Internet Café.

Add in AV1 and H.265 encoding and it's a nice little capture offloader for OBS.
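
For anyone eyeing it as an OBS/capture offloader: a quick, hedged way to check whether your own ffmpeg build even exposes Intel's QuickSync encoders. This sketch only probes ffmpeg's encoder list (names like `av1_qsv`/`hevc_qsv` depend on the build); it doesn't prove the hardware path works.

```python
# Probe the local ffmpeg build for Intel QuickSync (QSV) encoders.
# This only inspects ffmpeg's encoder list; actually using them still
# requires an Intel GPU and a QSV-enabled driver stack.
import shutil
import subprocess

def qsv_encoders():
    """Return the list of *_qsv encoder names, or None if ffmpeg is absent."""
    if shutil.which("ffmpeg") is None:
        return None
    out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                         capture_output=True, text=True).stdout
    return [line.split()[1] for line in out.splitlines()
            if len(line.split()) > 1 and line.split()[1].endswith("_qsv")]

print(qsv_encoders())
```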


----------



## Lew Zealand (Jul 26, 2022)

It's a 'good enough' first try for Intel in the modern GPU world, and hopefully more competition works as it's supposed to for all of us consumers.  If the A770 and its other big brothers can work similarly well and settle on a good price, maybe they'll sell once the drivers improve.

Or wait until "B"attlemage or "C"attlemage or whatever's next.


----------



## chrcoluk (Jul 26, 2022)

Does it beat 6400 on pcie3?


----------



## GreiverBlade (Jul 26, 2022)

Mmhh, color me impressed... if there were an LP variant (and if the consumption were lower and the performance gap larger), it would be a nice option over an RX 6400, also if it cost no more than 180 CHF (which is the price of a Sapphire RX 6400 4 GB).
RT @1080p is interesting on a budget GPU 

My current "HTPC" uses an i7-3770, so the iGPU is not the strongest (hence why I use my ridiculous GT 730 Silent 2 GB DDR3 for now).

Although... SERIOUSLY? It uses an 8-pin?  
Yep, quite impressed for a "first", but that power consumption and the 8-pin are... a bit odd, given the small performance margin over a 6500 4 GB @1080p.


----------



## Lew Zealand (Jul 26, 2022)

chrcoluk said:


> Does it beat 6400 on pcie3?



Not likely, as there's no ReBAR on PCIe 3 and the frametimes will be awful to deal with.  Which is too bad, as one target market for low-end GPUs is older computers.  The 6400 is at least still competent on PCIe 3, as it's less bottlenecked than the 6500 XT by simply being a lower-performance part on the same bus.  The 1650 is a better option if you can find it for a decent price.
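
As an aside, whether ReBAR is actually in effect shows up in the BAR sizes the OS reports for the GPU: with it active, one BAR covers the whole VRAM (6 GB on the A380, per the review) instead of the classic 256 MB window. On Linux this can be read from sysfs; a rough, Linux-only sketch (the paths and the `0x03` display-class filter are standard sysfs conventions, but treat this as illustrative):

```python
# List BAR sizes of display-class PCI devices from Linux sysfs.
# With Resizable BAR active, one BAR should roughly match VRAM size
# (e.g. ~6 GB on an A380) rather than the traditional 256 MB window.
import glob
import os

def bar_sizes(dev_path):
    """Parse <dev>/resource: each line is 'start end flags' in hex."""
    sizes = []
    with open(os.path.join(dev_path, "resource")) as f:
        for line in f:
            start, end, _flags = (int(x, 16) for x in line.split())
            if end > start:
                sizes.append(end - start + 1)
    return sizes

def display_devices():
    """PCI devices whose class code starts with 0x03 (display controllers)."""
    devs = []
    for dev in glob.glob("/sys/bus/pci/devices/*"):
        with open(os.path.join(dev, "class")) as f:
            if f.read().startswith("0x03"):
                devs.append(dev)
    return devs

for dev in display_devices():
    print(dev, [f"{s // 2**20} MiB" for s in bar_sizes(dev)])
```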


----------



## Fourstaff (Jul 27, 2022)

Good attempt, but plenty of work ahead of them. I wonder whether Intel has the stamina to fund tens of billions in order to close the gap with AMD/Nvidia, or whether they'll give up soon after.


----------



## trsttte (Jul 27, 2022)

@W1zzard on the second page you mention a typical A380 should have 2 DP + 2 HDMI ports, but this one has 3 DP + 1 HDMI; typo maybe? (From what I've seen, only Gigabyte sometimes does 2+2 instead of 3+1.) Also, regarding the price: Ryan Shrout and Tom Petersen mentioned on Gamers Nexus a price of $130 ($129 to $139, which Steve nuked with something like "$130 without marketing lingo"). 

Also, any chance you could do a couple of tests with DXVK on some DX11/DX9 games to see if it improves things?



chrcoluk said:


> Does it beat 6400 on pcie3?



Probably; this has 8 lanes and more RAM (both quantity and bandwidth).



GreiverBlade said:


> SERIOUSLY? it use a 8pin?



I think it's better than a 6-pin; most PSUs come with 6+2 connectors (leaving you with a 2-pin connector dangling if the GPU had a 6-pin connector), and I'd wager the GPU might even work with a regular 6-pin connector anyway, since it's such a low-power card.


----------



## asdkj1740 (Jul 27, 2022)

Remarkable review!

It really doesn't matter to us what the real MSRP is; the actual prices right now in China are simply way higher for an entry-level model, unless Intel is able to precisely "control" the market price later.


----------



## cellar door (Jul 27, 2022)

Linus said on the WAN Show that these Arc GPUs look like a complete dumpster fire, and now I'm disappointed, because I was hoping for them to bring some proper competition to the GPU market.


----------



## AusWolf (Jul 27, 2022)

ZoneDymo said:


> That RT performance is honestly pretty impressive, looking forward to seeing what the A770 will be like


Exactly my thoughts, considering that the card performs at 6400 levels but completely destroys even the 6500 XT in ray tracing (which isn't hard, but still).


----------



## ModEl4 (Jul 27, 2022)

Good overall showing.
It matched my optimistic performance scenario.
This means that a 2.5 GHz A770 will be at RTX 3060 Ti performance level, which is a very good thing.
We just need better cards (ASRock Phantom Gaming A770? ≤$399 and 3060 Ti performance class; not very suitable for the Taichi and Formula brands, I guess) and, more importantly, Intel needs to improve its drivers/software.
Ray tracing was a huge win for the Arc architecture: slightly slower than Ampere and much better than RDNA2 (let's hope XeSS is close to DLSS as well).


----------



## mplayerMuPDF (Jul 27, 2022)

ZetZet said:


> Very good, almost matches my 1060 3GB 4 years later for the same amount of money.
> 
> 
> But seriously, who is this card for? It's shit value for gaming, it's too expensive to just be a display expansion card...


It is for Linux users. But seriously, it will be amazing on Linux. Intel drivers for Linux are very good. Furthermore, it does not matter that the (Windows) driver has poor support for DX11 and below, because you will just use DXVK and Zink on Linux. GPGPU (OpenCL) support will probably be good on Linux too, unlike with AMD Radeon. Aside from that, it has AV1 decoding support, and Intel GPUs are well supported with VA-API hardware video acceleration on Linux. Now that hardware video acceleration finally works in browsers on Linux too, you will be able to watch YouTube in AV1 format without wasting electricity and hearing the fan in your laptop/desktop.


----------



## GoldenX (Jul 27, 2022)

RT performance is amazing considering it was going against 2x faster Nvidia GPUs with 3x the memory bandwidth.
Still, it's what I expected since this whole Arc thing started: Intel takes ages to optimize drivers.


----------



## Garrus (Jul 27, 2022)

Frankly, the performance is staggeringly bad. At $100, fine, but not at MSRP. We should easily be beating the 1660 Super in 2022... everything slower is terrible; I'd buy used.


----------



## joemama (Jul 27, 2022)

To be fair, though the performance is bad, it's quite impressive for a first-generation graphics card from Intel.


----------



## catulitechup (Jul 27, 2022)

mplayerMuPDF said:


> It is for Linux users  But seriously, it will be amazing on Linux. Intel drivers for Linux are very good. Furthermore, it does not matter that the (Windows) driver has poor support for DX11 and below because you will just use DXVK and Zink on Linux. GPGPU (OpenCL) support will probably be good on Linux too unlike with AMD Radeon. Aside from that it has AV1 decoding support and Intel GPUs are well-supported with VAAPI hardware video acceleration on Linux and now that hardware video acceleration works in browsers too finally on Linux, you will be able to watch YouTube in AV1 format without wasting electricity and hearing the fan in your laptop/desktop.



@W1zzard thanks for testing but as other said some linux tests will be very interesting


----------



## Vario (Jul 27, 2022)

I think it's hard to pull off a new release in a product category that is novel to them, so I will give them that.  Maybe some of the technology will also filter into CPU-integrated graphics.


----------



## Lew Zealand (Jul 27, 2022)

Garrus said:


> Frankly the performance is staggeringly bad. $100 fine, but not at MSRP. We should be easily beating the 1660 Super in 2022.... everything slower is terrible, I'd buy used.



Right now the 1660 Super and RX 6600 are about the same price new, $253 vs $260, and the 6600 is at least 38% faster at 1080p, so that's a decent performance uplift.  The real problem here is that nobody is making a less expensive video card with actually good performance.  The bottom is $250.  Even if the 3050 came down to MSRP, that'd _still_ be $250.

Apparently that's the price floor for decent performance and *profits* nowadays.  Maybe Intel's midrange GPUs can do something decent in the $200 range, we will see...
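
The value math behind those numbers can be sketched quickly (the prices and the 38% figure are just the ones quoted above, not fresh benchmarks):

```python
# Perf-per-dollar comparison using the street prices quoted above.
prices_usd = {"GTX 1660 Super": 253, "RX 6600": 260}
rel_perf_1080p = {"GTX 1660 Super": 1.00, "RX 6600": 1.38}  # 1660S = baseline

def perf_per_dollar(name):
    return rel_perf_1080p[name] / prices_usd[name]

advantage = perf_per_dollar("RX 6600") / perf_per_dollar("GTX 1660 Super") - 1
print(f"RX 6600: {advantage:.0%} more performance per dollar")  # → 34%
```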


----------



## beautyless (Jul 27, 2022)

> "it only works on Intel and only works with resizable BAR". So when the card came in, I kept an open mind, plopped it into my Zen 3 Ryzen test system right away and, surprise, it worked perfectly fine. Well, not _perfectly_ fine, because there are numerous layers of driver bugs to deal with, but the card performed as expected with Resizable BAR enabled on the AMD platform. No special tweaks, BIOS updates, or settings were necessary as I simply inserted the card, powered up the machine, and its BAR size was 6 GB—the exact same mechanics as with AMD and NVIDIA—it just works.


Resizable BAR is a must, but on Intel GPUs the feature works on both AMD and Intel platforms. Nice!

The A380 looks like it underperforms for its price, but that can change after a while.


----------



## mechtech (Jul 27, 2022)

Well, better than Larrabee 

Not bad considering it's their first kick at the can.

One more company using TSMC nodes...

@W1zzard you should have tested this against the best Iris integrated graphics


----------



## Dirt Chip (Jul 27, 2022)

Nice start, and good luck to the (very) early adopters out there...
The right thing to do is let Intel do its beta testing with "Alchemist" and hold off on conclusions about the Arc endeavor until the next gen, "Battlemage".
If one is 'disappointed' about the A380 or Arc because of this (excellent) review, then please check your expectation meter and parameters, as they are not aligned with reality.
This card is a beta GPU of a beta gen of the very first attempt in big-boys territory (let's put Larrabee aside, please).

It is very encouraging to see that the beta driver solved all of the bugs.
Driver support and its maturation process are the important things Alchemist should be measured by; power and FPS are the side dish, so too much debating about them is pretty moot at this (very) early point in time.


----------



## HD64G (Jul 27, 2022)

Exactly as expected: bad drivers. And their top GPU (A770) will not even reach the average performance of the 6700 XT/3060 Ti, methinks. Next gen cannot arrive soon enough for Intel, and they need (much?) time to fix their drivers.


----------



## Dirt Chip (Jul 27, 2022)

HD64G said:


> Exactly as expected. Bad drivers. And their top GPU (A770) will not even reach average performance of 6700XT/3060Ti me thinks. Next gen cannot arrive soon enough for Intel and they need (much?) time to fix their drivers.


From the conclusion:
"I then tried Intel's newest 101.3220 Beta driver and it magically solved all serious issues. *The card would now work and be 100% stable in all games*, including Witcher, DoS 2 and RDR2—looks like the driver team is actually fixing bugs, good job! Just to emphasize, *as soon as I switched to the beta drivers there was not a single crash during all my testing—a huge improvement.*"

So not so bad, and obviously getting better quite fast.

Also, from the cons:

"Immature drivers and software"
Immature is not so bad; it is just expected at this point in time. 
So if we see improvement on this (driver) front on a monthly basis, I'd call it better than expected.


----------



## Fatalfury (Jul 27, 2022)

Being an early adopter and BETA TESTER for Intel is not worth it ATM, no matter how you see it.

I'll think about buying an Intel GPU maybe in 2025, or at least by the 3rd gen, or even when the 5th gen of Intel cards comes out.

Until then: good to see a new player. Let the new ship keep sailing further into deep waters.


----------



## AusWolf (Jul 27, 2022)

Fatalfury said:


> Being an early adopter and BETA TESTER for Intel is not worth it ATM, no matter how you see it.
> 
> I'll think about buying an Intel GPU maybe in 2025, or at least by the 3rd gen, or even when the 5th gen of Intel cards comes out.
> 
> Until then: good to see a new player. Let the new ship keep sailing further into deep waters.


But we need some people to "beta test" it, right?

If everyone held off on buying an Arc GPU until 2025, then there would be no Arc GPU in 2025. It's not economically feasible to spend money on developing a product line that no one buys.

If you personally prefer a well-tested product and don't want to risk having issues with something new, that's fine. But you can't expect everyone to be of the same opinion, unless you want Arc to fail miserably - which I don't see why you would, as you said yourself that it's good to have a third player in the game.

So yes, "beta testing" is worth it. 

As for me, I'll buy an A770 if it comes at a fairly decent price because I'm curious. I don't even care if it doesn't reach 3060 Ti heights.


----------



## Dirt Chip (Jul 27, 2022)

AusWolf said:


> But we need some people to "beta test" it, right?
> 
> If everyone held out with buying an Arc GPU until 2025, then there would be no Arc GPU in 2025. It's not economically feasible to spend money on developing a product line that no one buys.
> 
> ...


You shouldn't beta test just to help Intel. You should beta test if you love the process of dealing with teething problems.

I advise everyone on this forum NOT to buy an A380 or A770 just because they "want to help the competition". It is as futile an argument as can be.

Buy it if you are curious, want to "play" with it, and love the feeling of holding something in its crude, unrefined form. Of course, be very well aware that you WILL be held back by immature drivers and will need much patience to deal with them, but that is the fun and beauty of being an early adopter of totally new tech.

Arc is uncharted territory at its best, and you can see it from the very beginning (the A380).  Its DNA is different from GeForce and Radeon, and that's a very good thing.


----------



## ZoneDymo (Jul 27, 2022)

R0H1T said:


> I'm still hoping RT dies some unnatural death & we get something much less taxing/efficient maybe with a hybrid RT+raster approach completely "revolutionizing" gaming.



Don't really get that mentality. RT has been the end-game goal for like... 3 decades now; we've always been able to do it, just not in real time.
Now we are approaching it, and you would rather have an in-between step?

That's like saying you hate electric cars and would rather have hybrids...


----------



## AusWolf (Jul 27, 2022)

Dirt Chip said:


> You shouldn't beta test just to help Intel. You should beta test if you love the process of dealing with teething problems.
> 
> I advise everyone on this forum NOT to buy an A380 or A770 just because they "want to help the competition". It is as futile an argument as can be.
> 
> ...


Of course. 

All I'm saying is that we need early adopters to let Intel iron out its issues for future generations, and make enough cash so that developing those future generations won't seem like a pointless idea.

Would I recommend that everyone buy an Arc GPU? Definitely not! You need a tech-savvy and adventurous mind to venture out, a hundred percent. But you can't say that it isn't worth it for anyone.



ZoneDymo said:


> Dont really get that mentality, RT has been the end game goal for like...3 decades now, we always been able to do it, just not in real time.
> Now we are approaching it and you would rather have an in-between step?
> 
> thats like saying you hate electric cars and would rather have hybrids...


I hate electric cars and would rather have petrol/gasoline hybrids.

As for RT, I agree.


----------



## W1zzard (Jul 27, 2022)

trsttte said:


> @W1zzard on the second page you mention typical A380 should have 2 DP + 2 HDMI ports but this one has 3 DP + 1 HDMI, typo maybe? (from what i've seen at least only gigabyte sometimes does the 2+2 instead of 3+1). Also regarding the price, Ryan Shrout and Tom Petersen mentioned on gamers nexus a price of 130$ (129$ to 139$ which Steve nuked with something like "130$ without marketing lingo").


Haven't seen it offered for $130 anywhere. What would you say if you were on camera as Intel Marketing and people were like, "uh, but it's listed for $190"?



Lew Zealand said:


> Not likely as no ReBAR on PCIe3 and the frametimes will be awful to deal with.  Which is too bad as one target market for low end GPUs is older computers.  The 6400 is at least still competent on PCIe3 as it's less bottlenecked than the 6500XT by simply being a lower performance part on the same bus.  The 1650 is a better option if you can find it for a decent price.


Great point, while technically it is easy to force Gen 3 on a modern board and get test results, actual platforms with Gen 3 limitation do not have the BIOS updates for ReBAR.



Dirt Chip said:


> it is just expected at this point in time.


For us tech nerds certainly, but a general consumer audience should not spend their money on a product in this state. But unfortunately it seems to have become the norm to accept sub-par software .. imagine the drama if they sold houses or cars that don't work



AusWolf said:


> But we need some people to "beta test" it, right?


Well .. some of those bugs make it seem like Intel has no QA and their programmers don't execute their own code to see if it works. (I know both those statements are false)


----------



## GreiverBlade (Jul 27, 2022)

trsttte said:


> I think it's better than a 6pin, most PSU come with 6+2 connectors (leaving you with a 2pin connector dangling if it was a 6pin connector on the gpu) and i'd wager the gpu might even work with a regular 6pin connector anyway since it's such a low power card


Actually, it would work without any connector if limited to a max power draw of 75 W, given its V-Sync 60 Hz consumption, if kept at 1080p (my PSU offers 6-pin and 6+2-pin, though I usually use sleeved extensions).

Well, it does have a max of 94 W with spikes at 101 W, which, given the small difference in performance, makes it a poor competitor to the 6400 4 GB in the end, even though it has 2 GB more and a PCIe 4.0 x8 link.
Although, if they had a 75 W-limited LP model at a lower price than the 6400, I would take one to give "the new player on the market" a go (and if ReBAR weren't "mandatory")
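
The 8-pin question above boils down to simple power budgeting: a PCIe slot alone is specified for 75 W, and the figures quoted for this card exceed that. A trivial sketch (the 94 W / 101 W numbers are the ones cited in the post; 75 W is the slot limit from the PCIe spec):

```python
# Why a sub-100 W card still needs an auxiliary power connector:
# a PCIe slot is only specified to deliver 75 W on its own.
SLOT_LIMIT_W = 75
MAX_SUSTAINED_W = 94   # measured maximum draw (figure quoted above)
SPIKE_W = 101          # measured transient spikes (figure quoted above)

needs_aux = max(MAX_SUSTAINED_W, SPIKE_W) > SLOT_LIMIT_W
over_budget_w = MAX_SUSTAINED_W - SLOT_LIMIT_W
print(f"aux connector needed: {needs_aux}; sustained draw is {over_budget_w} W over the slot budget")
```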


----------



## Dirt Chip (Jul 27, 2022)

AusWolf said:


> Of course.  - *Fix, thanks.*
> 
> All I'm saying is that we need early adopters to let Intel iron out its issues for future generations, and make enough cash so that developing those future generations won't seem like a pointless idea.
> 
> ...


Well, the nature of truly curious early adopters is that they don't need any encouragement to do it 
They are out there and will do their "thing" anyway.
And there are always those who will use an A380 without even knowing it (they just need a working PC; the spec doesn't matter).

Intel has a tremendous amount of cash to spend on Arc. Please don't buy one because you are worried about their income; it will actually just hurt the maturation process. If Intel is really serious about GPUs and Arc, as they say, the best thing one can do is buy their product IF and only IF it fits your needs and your budget, no matter if Arc fails completely in the process. If Intel means business with Arc (and I think they do), they need to prove that they can take a blow or two, get up, and better themselves with new, relevant products. If not, "not helping" them will just do good to us all, Intel included.

Buying a product that isn't right for you with the excuse that it "helps" and/or "supports" them does the exact opposite: you will just see more products that aren't what you need, and all you are left with is the "hope" that sometime, something will change. That's not how capitalism works...


----------



## Solid State Brain (Jul 27, 2022)

Idle power consumption is unusually high at 17 W; hopefully it can be sorted out through driver updates.

By the way, an obvious advantage compared to the RX 6400/6500 is that it has *4* display outputs. I think 3 is the bare minimum for power users, using for example 2 for standard displays and 1 for a head-mounted display/VR. And before anybody asks: if the RX 470/480 were "VR-Ready", so should these GPUs be.


----------



## AusWolf (Jul 27, 2022)

Dirt Chip said:


> Well, the nature of truly curious early adopters is that they don't need any encouragement to do it
> They are out there and will do their "thing" anyway.
> And there are always those who will use A380 without even knowing it (they just need a working PC, spec doesn't matter).
> 
> ...


I absolutely agree again. 

Buy an Arc if you're up for some adventure with something new. Don't buy one if you need a 100% reliable product with decades of history and testing. Everybody needs to decide which category they fall into. We can't say that one is more right than the other.

I didn't mean to correct you on the "of course". It was a genuine "of course" to show that I agree with you.  



W1zzard said:


> Well .. some of those bugs make it seem like Intel has no QA and their programmers don't execute their own code to see if it works. (I know both those statements are false)


That would be very sad. At least there are no issues with the latest beta driver, which is a positive sign.


----------



## Solid State Brain (Jul 27, 2022)

catulitechup said:


> @W1zzard thanks for testing but as other said some linux tests will be very interesting


I'm very curious too, since I use Linux as well, but it appears that kernel 5.20, _which isn't out yet_, is the one where complete support is expected. I think it should be ready by roughly the end of September/early October.







Linux 5.20 Looks To Be Primed For Usable DG2/Alchemist Desktop Graphics & ATS-M Support - Phoronix (www.phoronix.com)






> After many months and a lot of work by the open-source Intel Linux graphics driver developers, Linux 5.20 looks like it will be the base kernel where the DG2/Alchemist Arc Graphics desktop GPUs and Arctic Sound M (ATS-M) server graphics card support will be ready in usable shape. [...]


----------



## Dirt Chip (Jul 27, 2022)

W1zzard said:


> For us tech nerds certainly, but a general consumer audience should not spend their money on a product in this state. But unfortunately it seems to have become the norm to accept sub-par software .. imagine the drama if they sold houses or cars that don't work


Much agreed; it should be dealt with like toxic waste: don't touch it unless you have the tools and knowledge to deal with it.



W1zzard said:


> Well .. some of those bugs make it seem like Intel has no QA and their programmers don't execute their own code to see if it works. (I know both those statements are false)


I tend to believe they have a priority list, and that these bugs and glitches are known things they choose not to deal with because of more important ones. Time will tell, but for now I give them the 100-day "early grace" period and mark it as "understandable".


----------



## docnorth (Jul 27, 2022)

For me, the most promising thing is the stability improvement with the new beta driver. If this trend continues (and it should), after 1 or 2 months we could be talking only about a few driver bugs, well known already from the competition, cough, cough.
Or am I too optimistic...


----------



## rutra80 (Jul 27, 2022)

Doesn't look bad; looking forward to better cards and drivers.


----------



## ZoneDymo (Jul 27, 2022)

AusWolf said:


> I hate electric cars and would rather have petrol/gasoline hybrids.
> 
> As for RT, I agree.



Not to go too far off topic, but tell me, with a bit of detail and reasoning: why do you hate electric cars and would rather have petrol/gasoline hybrids?


----------



## chrcoluk (Jul 27, 2022)

Lew Zealand said:


> Not likely as no ReBAR on PCIe3 and the frametimes will be awful to deal with.  Which is too bad as one target market for low end GPUs is older computers.  The 6400 is at least still competent on PCIe3 as it's less bottlenecked than the 6500XT by simply being a lower performance part on the same bus.  The 1650 is a better option if you can find it for a decent price.


I checked the review again and it doesn't say ReBAR only works on PCIe 4, so where did you get that from?

For reference, my PCIe 3 board has ReBAR. So is it an Arc limitation?


----------



## kapone32 (Jul 27, 2022)

Now I understand why AMD is releasing an 8 GB version of the 6500 XT. The review was in-depth as always, and I would not personally get one of these cards. Unless the price gets much more attractive once the novelty wears off, the issues will make some people never buy another one.


----------



## PapaTaipei (Jul 27, 2022)

I'm not sure the public looking for low-end GPUs really uses RT. Btw, that pricing is extremely high.


----------



## uuee (Jul 27, 2022)

W1zzard said:


> Great point, while technically it is easy to force Gen 3 on a modern board and get test results, actual platforms with Gen 3 limitation do not have the BIOS updates for ReBAR.



Intel 10th gen has ReBAR with PCIe 3.0.


----------



## to001 (Jul 27, 2022)

Instead of writing down unclear graphics settings, why not just post screenshots of the graphics settings menu?


----------



## AusWolf (Jul 27, 2022)

ZoneDymo said:


> Not to go too far off topic, but tell me, with a bit of detail and reasoning: why do you hate electric cars and would rather have petrol/gasoline hybrids?


I'll PM you.



kapone32 said:


> Now I understand why AMD is releasing a 8GB Version of the 6500XT.


The what now?


----------



## RedBear (Jul 27, 2022)

W1zzard said:


> Great point, while technically it is easy to force Gen 3 on a modern board and get test results, actual platforms with Gen 3 limitation do not have the BIOS updates for ReBAR.


I have a fairly shitty Gen 3 motherboard, a Prime B450-Plus, but it did receive BIOS updates for ReBAR. I'm sure there are plenty of B450/X470 motherboards out there that can use ReBAR on PCIe Gen 3.


----------



## r9 (Jul 27, 2022)

What happened to the reports that by increasing the power limit the A380 can gain up to a 50% boost in fps?


----------



## W1zzard (Jul 27, 2022)

r9 said:


> What happened to the reports that by increasing the power limit A380 can gain up to 50% boost in fps ?


That happens when you let random internet people review your product



uuee said:


> Intel 10th gen has rebar with PCIe 3.0.





RedBear said:


> I have a fairly shitty Gen 3 motherboard, Prime B450 Plus, but it did receive BIOS updates for ReBAR. I'm sure that there are plenty of B450/X470 motherboards out there that can use ReBar on PCIe Gen 3.


Right .. thanks for clearing that up


----------



## Solid State Brain (Jul 27, 2022)

r9 said:


> What happened to the reports that by increasing the power limit A380 can gain up to 50% boost in fps ?



Perhaps that's game-specific? It would be useful to see whether OC performance is better in modern (DX12/Vulkan) games.

*EDIT*: I tried pulling up the video where that claim was made, and it seems that the A380 _without_ OC ran slower than the GTX 1650, whereas it was about on par with it or faster with OC. On the other hand, in the TPU review stock performance is already similar to the GTX 1650. So it could be that their sample had (driver-related?) issues with OC disabled.


----------



## trsttte (Jul 27, 2022)

GreiverBlade said:


> actually it would work without any connector and limited to a max powerdraw of 75w given its V-Sync 60hz consumption if kept at 1080p (my PSU offer 6pin and 6+2pin tho but i usually use sleeved extensions)



Maybe, but probably not. The sense pins on PCIe power connectors are there for a reason: to tell the card whether it has power or not. If the card has a power connector, at least 6 pins should be plugged in.



GreiverBlade said:


> although if they had a 75w limited LP model at a lower price than the 6400, i would take one for giving "the new player on the market" a go (and if there was no RE-BAR "mandatory"  )



From your bio you're using a 3600 and B550; don't you have ReBAR on that? It should be a BIOS update away.


----------



## kapone32 (Jul 27, 2022)

AusWolf said:


> The what now?


All of the negative hype around the 6500 XT had its town criers, and now, with this card's 6 GB frame buffer, 6 GB will be called the minimum. We have to look at the fact that this A380 card is only being held back by drivers from competing better with the rest of the low-end cards. Yes, it is in the range of the 1650 Super/6400, but those are both super mature in terms of drivers.


----------



## swirl09 (Jul 27, 2022)

Solid State Brain said:


> Perhaps that's possibly game-specific? It would be useful to see if OC performance is better in modern (DX12/Vulkan) games.


Umm, that's like a 45% perf jump with the OC... my 'X' key is glowing and calling to me.


The state of things is so messed up right now: 200 quid or so for something that's in the region of a quarter slower than a GTX 1060. I don't like this timeline.


----------



## wolf (Jul 27, 2022)

ZoneDymo said:


> That RT performance is honestly pretty impressive, looking forward to see what the A770 will be like


I agree, impressive as a first showing, very keen to see how that plays out through the rest of the stack.


R0H1T said:


> You mean that performance only marginally better than an XP slideshow with RT on


I think W1zzard summed it up pretty well, obviously the outright performance of this card makes RT on unplayable, but the relative performance against competitors of different architectures is an indication that their architecture may handle RT reasonably well.

From the conclusion; 


> It's still an interesting data point with promising results that bodes well for the architecture though, because Intel will be releasing higher-end graphics cards very soon.


----------



## GreiverBlade (Jul 27, 2022)

trsttte said:


> Maybe, but probably not. The sense pins on PCIe power connectors are there for a reason: to tell the card whether it has power or not. If the card has a power connector, at least the 6-pin should be plugged in.


I meant an LP board -without- a connector at all (I'm not that idiotic, no worries ... )


trsttte said:


> From your bio, you're using a 3600 and B550; don't you have ReBAR on that? It should be a BIOS update away.


From my system specs, why would I get an A380 to replace a PowerColor Red Devil RX 6700 XT 12 GB? (I suspect not even an A770 either ...  )
No, it would be LP only, for the "SFFHTPCARGH!" build listed just after, which uses an i7-3770 and a GT 730 2 GB ... PCIe 2.0, no ReBAR.
Thus, the RX 6400 4 GB is the better option in the end, as it consumes less and has performance in the same ballpark.


----------



## Forza.Milan (Jul 27, 2022)

not bad, not bad at all, just try again next time..


----------



## QuietBob (Jul 27, 2022)

_"into the *unknow*"_
What an apt blurb for Intel's debut GPU


----------



## defaultluser (Jul 27, 2022)

Slower performance than the 6400 (excluding RT, which isn't realistic on either card), higher price and twice the power -- if Intel thinks this will end up anywhere except the i740 discount bin, they've got another thing coming!


----------



## The red spirit (Jul 27, 2022)

ZeppMan217 said:


> A beta version of a GPU. What a time to be not dead.


Still better than the GTX 1630; that was an unsuccessful abortion.


----------



## 80-watt Hamster (Jul 27, 2022)

PapaTaipei said:


> I'm not sure the public looking for low end GPUs really use RT.



Nope, but as it's built into the architecture, Intel had two choices: disable it and deal with complaints about locked-out features, or leave it in and deal with complaints that it doesn't perform.  Personally, I'm glad they chose the latter.  If nothing else, it provides a window into the potential of higher-end chips.



PapaTaipei said:


> Btw that pricing is extremely high.



MSRP:  Pretty high, by 20-30 USD.  Street pricing?  Oh yeah; straight bananas.


----------



## chaosmassive (Jul 27, 2022)

Bruno_O said:


> HTPC card I guess... would rather get a 5600G for that actually


Never imagined I would read "HTPC card" in 2022. What is this, 2010?


----------



## Gundem (Jul 27, 2022)

Holy moly, it's real 8O


----------



## efikkan (Jul 27, 2022)

Intel's big mistakes are advertising this as a "gaming card" and the pricing, when it's really a low-end graphics card which is fine for most workloads other than gaming.



AusWolf said:


> But we need some people to "beta test" it, right?
> 
> If everyone held out with buying an Arc GPU until 2025, then there would be no Arc GPU in 2025. It's not economically feasible to spend money on developing a product line that no one buys.
> 
> ...


If I were one of the three GPU makers, I would create a "beta GPU" program, where the public could get their hands on a "locked down" and downclocked QS GPU to get more widespread game testing, either for free or a symbolic fee, and supply these a couple of months ahead of a new GPU architecture release. With hundreds of games (thousands if you count indie games), there is no way any of these three can cover enough test cases.



chaosmassive said:


> never imagined I would read HTPC card in 2022, what is this? 2010 ?


So no one is building media center PCs any more?


----------



## Dragokar (Jul 27, 2022)

That's also why they're sending out staff en masse to YouTubers to give them a cozy time.


----------



## chstamos (Jul 27, 2022)

Quite honestly, Intel needs to subsidize first gen in order to create a large (beta) user base, iron out the issues, and go on to move to competitive successors. I don't really get the "buy it even if it's substandard to help competition" argument.

Intel is not some company struggling to keep afloat and keep developing before closing shop. Intel is not 3dfx before rampage.

It's not going bankrupt anytime soon; it's got billions of dollars in cash on hand.

If they want to succeed, they need to price accordingly. Current pricing on this card is laughable, while the card itself could be an enticing proposition at 110-120 dollars. That's what it should cost, at most, in my opinion.


----------



## HD64G (Jul 27, 2022)

Dirt Chip said:


> From the conclusion:
> "I then tried Intel's newest 101.3220 Beta driver and it magically solved all serious issues. *The card would now work and be 100% stable in all games*, including Witcher, DoS 2 and RDR2—looks like the driver team is actually fixing bugs, good job! Just to emphasize, *as soon as I switched to the beta drivers there was not a single crash during all my testing—a huge improvement.*"
> 
> So not so bad and obviously getting better quite fast.
> ...


Bad drivers don't just mean instability, bugs, or a software suite not working as intended. Bad drivers also mean poor optimisation, heavy overhead that decreases performance, and stuttering in older graphics APIs. Especially with ReBAR off, the GPU is a bad experience in many games, with excessive stutters.


----------



## Valantar (Jul 27, 2022)

Thanks for another great review! Seems to mostly support GN's findings from a few weeks ago - roughly RX 6400 performance (though quite variable, and must have ReBAR) at 50% higher power consumption.

One question, @W1zzard: shouldn't the "average fps" and "relative performance" tables be identical, just with different numerical representations? 'Cause at least for 1080p the ordering is different, which seems strange to me.


----------



## waltc (Jul 27, 2022)

Good review--it's exactly what I thought it would be.  Can only point out that Intel, too, has spent billions of dollars in R&D on architecture scaling--likely considerably more than AMD, for instance, over the past decade.

I don't think anyone buying this card would be impressed with it or even keep it, for that matter...  Why should they?  So Intel threw a few billion $ and 5+ years at designing a GPU--but can only sell it in China for obvious reasons?  Nobody here would buy it.  Wake me when they have something they can ship into the West.

Sorry to sound negative, but I haven't seen anything to get excited about yet, truthfully. Extrapolating future performance from an over-priced, exceptionally low-performance GPU with tons of driver bugs that isn't good enough to ship into western markets doesn't seem very worthwhile to me... but that's just me. I remember the three i7xx GPUs I bought from Intel before they dropped out of the GPU business the first time (after buying Real3D, IIRC). Took 'em all back...

If they get the driver bugs ironed out (big if, for Intel), then selling something like this for ~$100 might give them a spot in the entry-level GPU market.  But not yet...


----------



## Imouto (Jul 27, 2022)

Kudos to Raja for not disappointing. The guy was promising a turd and delivered handsomely.


----------



## W1zzard (Jul 27, 2022)

Valantar said:


> shouldn't the "average fps" and "relative performance" tables be identical, just with different numerical representations? 'Cause at least for 1080p the ordering is different, which seems strange to me.


Yeah, it's a bit of an edge case here; I was wondering when someone would ask  As you can see from the data, in some games the card failed with the WHQL drivers. "Average FPS" uses 0 FPS for those games in the average calculation, while "relative performance" estimates a value based on the results from the other games, because using 0 in that calculation won't work: the other cards would all be relatively infinitely fast in that test.
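To illustrate the difference, here's a minimal hypothetical sketch of the two aggregation methods (not TPU's actual code; the game names and numbers are made up):

```python
# Hypothetical per-game FPS for a tested card and a baseline card.
# "Game B" is a title where the tested card crashed (0 FPS).
fps_card = {"Game A": 60.0, "Game B": 0.0, "Game C": 30.0}
fps_base = {"Game A": 50.0, "Game B": 40.0, "Game C": 25.0}

# "Average FPS": the failed game simply contributes 0 to the average,
# dragging the result down.
avg_fps = sum(fps_card.values()) / len(fps_card)

# "Relative performance": failed games are skipped, because a 0 in a
# ratio-based comparison would make every other card look infinitely
# faster in that test.
ratios = [fps_card[g] / fps_base[g] for g in fps_card if fps_card[g] > 0]
rel_perf = sum(ratios) / len(ratios)

print(avg_fps)   # 30.0
print(rel_perf)  # 1.2 (i.e. 120% of the baseline in the games that ran)
```

This is why the two tables can order cards differently: a card with a failed test sinks in the FPS average but not (as much) in relative performance.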


----------



## AusWolf (Jul 27, 2022)

chstamos said:


> Quite honestly, Intel needs to subsidize first gen in order to create a large (beta) user base, iron out the issues, and go on to move to competitive successors. I don't really get the "buy it even if it's substandard to help competition" argument.
> 
> Intel is not some company struggling to keep afloat and keep developing before closing shop. Intel is not 3dfx before rampage.
> 
> ...


Don't buy the GPU to help the competition. Buy it only if you want to experiment with something new. 

All I'm saying is that we need people with these GPUs in their PCs to submit lots of bug reports so that Intel can deal with the issues either in driver updates or in future GPU generations. If everybody waits for something stable, there will be no future generations (maybe apart from Battlemage as it's already in development).


----------



## Assimilator (Jul 27, 2022)

I don't understand why every review outfit, including the ones I trust to be objective like GN and TPU, is giving this GPU a pass. Because it's rubbish. Objectively rubbish. Intel/GUNNIR haven't even figured out fan hysteresis FFS!

Saying "RT performance is good compared to AMD" is pointless when we're talking about double or even single-digit RT framerates. And when the generation-old RTX 2060 is twice as fast.

Don't buy this rubbish. Don't pay Intel to be beta testers for them.


----------



## PapaTaipei (Jul 27, 2022)

Assimilator said:


> I don't understand why every review outfit, including the ones I trust to be objective like GN and TPU, is giving this GPU a pass. Because it's rubbish. Objectively rubbish. Intel/GUNNIR haven't even figured out fan hysteresis FFS!
> 
> Saying "RT performance is good compared to AMD" is pointless when we're talking about double or even single-digit RT framerates. And when the generation-old RTX 2060 is twice as fast.
> 
> Don't buy this rubbish. Don't pay Intel to be beta testers for them.


I don't think anyone is buying it; it's meant for prebuilt PCs, I guess. I don't know, maybe there is a market for that in developing countries?


----------



## InVasMani (Jul 27, 2022)

HTPC card and not much else.


----------



## QuietBob (Jul 27, 2022)

Big thanks to @W1zzard for this thorough review. I especially appreciated the minimum settings tests, since the card clearly isn't meant to handle full detail. Regardless, the 6500XT currently starts at $150 pre-tax here in Europe, making it a much better offer.

BTW, p. 37 has the wrong card model in the temperature table


----------



## GreiverBlade (Jul 28, 2022)

InVasMani said:


> HTPC card and not much else.


Yeah, if only it was less power hungry and performed a bit better ... While I am a bit surprised by the A380 (Intel's Xth attempt at a GPU; it remains to be seen whether they fail like previous attempts or not), the more I read the review in detail, the more I think that even at the same price, an RX 6400 (preferably LP, no additional power connector) would be better ...



QuietBob said:


> Big thanks to @W1zzard for this thorough review. I especially appreciated the minimum settings tests, since the card clearly isn't meant to handle full detail. Regardless, the 6500XT currently starts at $150 pre-tax here in Europe, making it a much better offer.
> 
> BTW, p. 37 has the wrong card model in the temperature table


Switzerland: the cheapest RX would be the 6400 at $170ish (still a slightly better offer than the A380).


----------



## Dirt Chip (Jul 28, 2022)

Assimilator said:


> I don't understand why every review outfit, including the ones I trust to be objective like GN and TPU, is giving this GPU a pass. Because it's rubbish. Objectively rubbish. Intel/GUNNIR haven't even figured out fan hysteresis FFS!
> 
> Saying "RT performance is good compared to AMD" is pointless when we're talking about double or even single-digit RT framerates. And when the generation-old RTX 2060 is twice as fast.
> 
> Don't buy this rubbish. Don't pay Intel to be beta testers for them.


From a narrow consumer point of view I agree: it is rubbish you're better off without.

But from a wider point of view, of a process at its very beginning, it is firstly very interesting and, specifically from this A380 review, quite encouraging. Certainly not a rubbish start.


----------



## W1zzard (Jul 28, 2022)

QuietBob said:


> BTW, p. 37 has the wrong card model in the temperature table


Fixed, thanks!


----------



## noel_fs (Jul 28, 2022)

absolute garbage


----------



## efikkan (Jul 28, 2022)

noel_fs said:


> absolute garbage


The card isn't garbage, but it is terrible value.

I think every article/review about low-end cards needs a disclaimer that these cards have many valid uses beyond gaming, especially for those wanting to add new capabilities to older machines which are still working fine.

If Intel priced this card below $100 and didn't market it as a gaming card, the reception would be fine.


----------



## Assimilator (Jul 28, 2022)

efikkan said:


> If Intel priced this card below $100 and didn't market it as a gaming card, the reception would be fine.


Except they didn't. What-ifs are about as useful to the GPU market as a new competitor that can't compete.

Ultimately Intel's greed is what has made Arc DOA. I disagree with Wizz that this card should be priced as low as $100 (although that would be a slam-dunk in terms of marketshare); realistically all Intel needed to do was price it lower than GTX 1630 i.e. $135. They simply didn't bother to do that because that would cause them to make a loss, and that's pure greed and pride and refusal to accept market reality.

It certainly doesn't help that Intel decided to overbuild an entry-level card, thus increasing cost, instead of going as simple and cheap as possible. It's like they have no concept of the fact that price is king at the low end of the market.


----------



## Valantar (Jul 28, 2022)

Assimilator said:


> Except they didn't. What-ifs are about as useful to the GPU market as a new competitor that can't compete.
> 
> Ultimately Intel's greed is what has made Arc DOA. I disagree with Wizz that this card should be priced as low as $100 (although that would be a slam-dunk in terms of marketshare); realistically all Intel needed to do was price it lower than GTX 1630 i.e. $135. They simply didn't bother to do that because that would cause them to make a loss, and that's pure greed and pride and refusal to accept market reality.
> 
> It certainly doesn't help that Intel decided to overbuild an entry-level card, thus increasing cost, instead of going as simple and cheap as possible. It's like they have no concept of the fact that price is king at the low end of the market.


It's worth noting that in one of the recent YouTube conversations with Ryan Shrout and another Intel GPU guy (I can't remember if it was one of their GN appearances or that thing they did on LTT), they highlighted that the US MSRP of this GPU would _not_ be $150, but rather something unspecified but lower (and that Chinese pricing was affected by local/regional factors like tax ++). Still, even at $130 this would be rather dubious in terms of value IMO. What they said was that Arc pricing would be aggressively competitive and based on what they call "tier three" game performance, i.e. performance in poorly optimized or unoptimized games running on older APIs, rather than well optimized (tier 2) or DX12/Vulkan (tier 1) games. I guess we'll see how that pans out though.


----------



## Assimilator (Jul 28, 2022)

Valantar said:


> and that Chinese pricing was affected by local/regional factors like tax ++


I guess neither AMD nor NVIDIA GPUs get taxed, huh? What a ridiculous non-argument from Intel.



Valantar said:


> What they said was that Arc pricing would be aggressively competitive


Except it's not, and Intel says a lot of things that are complete bullshit.


----------



## Dirt Chip (Jul 28, 2022)

Assimilator said:


> Except they didn't. What-ifs are about as useful to the GPU market as a new competitor that can't compete.
> 
> Ultimately Intel's greed is what has made Arc DOA. I disagree with Wizz that this card should be priced as low as $100 (although that would be a slam-dunk in terms of marketshare); realistically all Intel needed to do was price it lower than GTX 1630 i.e. $135. They simply didn't bother to do that because that would cause them to make a loss, and that's pure greed and pride and refusal to accept market reality.
> 
> It certainly doesn't help that Intel decided to overbuild an entry-level card, thus increasing cost, instead of going as simple and cheap as possible. It's like they have no concept of the fact that price is king at the low end of the market.


A little exaggerated of you, to declare Arc DOA on the basis of one review, about 3 seconds after launch...
Give it (or don't...) a few months to mature before you decide. See how it evolves, and then choose whether to buy it and recommend it, or not.

Maybe dial down the harsh words; there's so much frustration in you toward Intel.
You should also give yourself a 'reality check', because Intel isn't your friend, so they can't "betray" you. They are here, by definition and without any disguise, to make as much money as they can; they must. That's how business is done these days.
Using 'greed', 'pride' and so on looks like a childish, offended reaction (which I'm sure you're not).


----------



## KLiKzg (Jul 28, 2022)

Intel's first steps back into GPUs, as AI is in their sights.


----------



## eidairaman1 (Jul 28, 2022)

Failintel



Assimilator said:


> Except they didn't. What-ifs are about as useful to the GPU market as a new competitor that can't compete.
> 
> Ultimately Intel's greed is what has made Arc DOA. I disagree with Wizz that this card should be priced as low as $100 (although that would be a slam-dunk in terms of marketshare); realistically all Intel needed to do was price it lower than GTX 1630 i.e. $135. They simply didn't bother to do that because that would cause them to make a loss, and that's pure greed and pride and refusal to accept market reality.
> 
> It certainly doesn't help that Intel decided to overbuild an entry-level card, thus increasing cost, instead of going as simple and cheap as possible. It's like they have no concept of the fact that price is king at the low end of the market.



This should have been a half-height card at sub-$70; it's nothing but a display adapter.


----------



## Valantar (Jul 28, 2022)

Assimilator said:


> Except it's not, and Intel says a lot of things that are complete bullshit.


The product literally isn't launched outside of China yet, so your basis for that statement is ... guesswork?


----------



## Assimilator (Jul 28, 2022)

Valantar said:


> The product literally isn't launched outside of China yet, so your basis for that statement is ... guesswork?


The claim: "Arc pricing will be aggressively competitive".
The reality: In 1 of 1 areas Arc has launched in, Arc pricing has been the exact opposite.
Primary conclusion: 100% of the sample size directly contradicts the claim.
Secondary conclusion: Intel's GPU division is feeding you bullshit, again, still.

Unless you want to claim that Intel said it would be price-competitive everywhere _except China_?


----------



## Valantar (Jul 28, 2022)

Assimilator said:


> Unless you want to claim that Intel said it would be price-competitive everywhere _except China_?


That is pretty much exactly what they said - they were explicitly differentiating US/EU pricing from Chinese pricing, and describing different policies and strategies for the different markets.


----------



## AusWolf (Jul 28, 2022)

Assimilator said:


> The claim: "Arc pricing will be aggressively competitive".
> The reality: In 1 of 1 areas Arc has launched in, Arc pricing has been the exact opposite.
> Primary conclusion: 100% of the sample size directly contradicts the claim.
> Secondary conclusion: Intel's GPU division is feeding you bullshit, again, still.
> ...


100% of a sample size of 1 isn't much, is it? Let's wait for the whole product line to be released worldwide before we jump to conclusions.


----------



## efikkan (Jul 28, 2022)

Valantar said:


> It's worth noting that in one of the recent YouTube conversations with Ryan Shrout and another Intel GPU guy I can't remember <snip> What they said was that Arc pricing would be aggressively competitive and based on what they call "tier three" game performance, i.e. performance in poorly/unoptimized games running on older APIs, rather than well optimized (tier 2) or DX12/Vulkan (tier 1) games. I guess we'll see how that pans out though.


Yeah, those videos with Tom and Ryan were awkward; they struggled really hard to put the cards in a positive light. It was probably right before they claimed to have the most "design wins"!
I fear that claim of pricing competitively according to tiers leaves them with too much wiggle room; competitive according to which standard? Does "special features" play a role in valuation?
It kind of reminds me of pretty much every car brand out there who all apparently are "best in class"…


----------



## Valantar (Jul 28, 2022)

efikkan said:


> Yeah, those videos with Tom and Ryan were awkward, they struggled really hard to put the cards in a positive light. It was probably right before they claimed to have the most "design wins"!
> I fear that claim of pricing competitively according to tiers leaves them with too much wiggle room; competitive according to which standard? Does "special features" play a role in valuation?
> It kind of reminds me of pretty much every car brand out there who all apparently are "best in class"…


I mostly agree, though at least that part of the claim was pretty clear to me: competitive pricing (i.e. similar to or lower than competitors) when comparing games that are poorly optimized for Arc, with well optimized titles seeing the card punch well above its price class (at least from the basis that the difference between "poorly optimized" and "well optimized" in GN's testing was something like a 50% increase in performance compared to competitors). Of course this still leaves a lot of room for working on drivers before setting the MSRP, and judging by this review that work is definitely having effects of various kinds. For now, it seems they're running the usual "launch as a beta test in China" shtick while stalling/waiting it out in other markets.


----------



## Mussels (Jul 29, 2022)

It'd be a total failure from AMD or Nvidia, but for the first card from a new competitor its decent (with the beta drivers, at least)


It's not what you'd expect from Intel, though, who've had decades of GPU driver management for their iGPUs.



Assimilator said:


> I don't understand why every review outfit, including the ones I trust to be objective like GN and TPU, is giving this GPU a pass. Because it's rubbish. Objectively rubbish. Intel/GUNNIR haven't even figured out fan hysteresis FFS!
> 
> Saying "RT performance is good compared to AMD" is pointless when we're talking about double or even single-digit RT framerates. And when the generation-old RTX 2060 is twice as fast.
> 
> Don't buy this rubbish. Don't pay Intel to be beta testers for them.


It's not priced or marketed to compete with a 3090 Ti here; if performance is within the correct segment, it's doing well. The RT performance is unplayable here, but on the higher-performance models it won't be.






Oh and good to see a lot of detail given on the state of the drivers and software, because holy hell that'd be a shock to someone expecting 'intel quality'


----------



## Forza.Milan (Jul 29, 2022)

I really hope intel succeeds with their gpu planning, we need another source of gpu to tackle mining activities in the future..


----------



## R0H1T (Jul 29, 2022)

And some rumors are already suggesting that Intel may quit their dGPU gig early. A *brutal hit* to their numbers everywhere!








Intel Reports Second-Quarter 2022 Financial Results. First Net Loss in Decades

Intel Corporation today reported second-quarter 2022 financial results. "This quarter's results were below the standards we have set for the company and our shareholders. We must and will do better. The sudden and rapid decline in economic activity was the largest driver, but the shortfall also...

www.techpowerup.com


----------



## AusWolf (Jul 29, 2022)

R0H1T said:


> And some rumors already suggesting that Intel may quit their dGPU gig early, *brutal hit* to their numbers everywhere!
> 
> 
> 
> ...


I know my corporate blah-blah language skills are limited, but I don't see any mention of Intel's future dGPU plans in this article.


----------



## R0H1T (Jul 29, 2022)

The rumor about exiting the dGPU space isn't from here; I read it on reddit & AT. With Intel you never know; outside of CPUs they've never really excelled at anything else!

And things will get much worse before they get better for them. Basically, if they're looking to cut even more costs, (gaming?) dGPUs could be on the chopping block!


----------



## AusWolf (Jul 29, 2022)

R0H1T said:


> It's not from here, the rumor about exiting dGPU space, but read it on reddit & AT.


I don't trust reddit. If I read about it here, I'll believe it. 



R0H1T said:


> With Intel you never know, outside of CPU's they've never really excelled at anything else!


I don't know. Their network adapters are pretty decent.



R0H1T said:


> And things will get much worse before they get better for them, basically if they're looking to cut even more costs (gaming?) dGPU's could be on the chopping block!


I think it's too soon to talk about a business segment that they haven't really started yet. First, they have to release some products to have their investments returned. If Arc ends up being a huge flop, then they might think about quitting the dGPU field. But not before - it would be foolish.


----------



## R0H1T (Jul 29, 2022)

AusWolf said:


> I don't trust reddit. If I read about it here, I'll believe it.


Hence the part about it being a rumor; I trust AT as much as TPU, if not more.


AusWolf said:


> Their *network adapters* are pretty decent.


Probably because they're much simpler?


AusWolf said:


> First, they have to release some products to have their investments returned.


They have the enterprise market for that & much better margins there. The consumer market can be brutal if you've not planned it properly & don't intend to be in it for the long run! Intel probably spent 10x as much trying to fit in the mobile market & eventually left with literally *billions in losses per quarter* & that was back when *they could afford it*.


----------



## AusWolf (Jul 29, 2022)

R0H1T said:


> They have the enterprise market for that & much better margins there. The consumer market can be brutal if you've not planned it properly & don't intend to be in it for the long run! Intel probably spend 10x as much trying to fit in the mobile market & eventually left with literally *billions in losses per quarter* & that was back when they could afford it.


Yes, but they've tried. A-series Arc is basically done, and B-series is already in development. Making it all happen is not cheap, and leaving it all to rot without selling anything would be stupid. They might stop after B-series if the concept doesn't gain foothold, but not before. If I had an unsuccessful investment, I wouldn't shut it down straight away. I'd try to be on the market as long as I can to earn back at least some of my losses.


----------



## R0H1T (Jul 29, 2022)

Of course they wouldn't suddenly stop tomorrow or even next quarter, for that to happen things would have to get much much worse. But Intel's in a much worse shape than many others realize ~








Intel Arc Alchemist refresh reportedly in the works as next-gen Arc Battlemage supposedly gets delayed

Arc Alchemist is Intel's first foray into dedicated gaming hardware meant to compete directly with Nvidia and AMD. Rumor has it that Intel might be planning a refresh of the Arc Alchemist GPU lineup for notebooks. The refresh, if true, will be surprising since notebooks with Arc Alchemist dGPUs...

www.notebookcheck.net
				



From AT ~


> > Q: Reserves for Sapphire and timelines - can give more specific on SPR, not just early units but volume? How does this impact/push out EMR and GNR?
> 
> 
> 
> ...








Question - Intel Q2 Results - Terrible

https://d1io3yog0oux5.cloudfront.net/_0f37de80c52d83ac6338b4805dea84aa/intel/db/887/8856/earnings_release/Q2+22_EarningsRelease+%281%29.pdf  Revenue down 22%.   Edit: Oh and it gets better... Intel posted a loss!

forums.anandtech.com
				



Their latest & greatest, in servers, is delayed again!

Just to give you a hint, look at their margins ~

Now imagine AMD just around the Dozer era, except 10-15x as big!


----------



## AusWolf (Jul 29, 2022)

R0H1T said:


> Of course they wouldn't suddenly stop tomorrow or even next quarter, for that to happen things would have to get much much worse. But Intel's in a much worse shape than many others realize ~
> 
> 
> 
> ...


Sure, but this is still not an indication of their future dGPU plans.


----------



## R0H1T (Jul 29, 2022)

Like I said it's rumor so take it for what it's worth, or not.


----------



## Mussels (Jul 29, 2022)

R0H1T said:


> It's not from here, the rumor about exiting dGPU space, but read it on reddit & AT. With Intel you never know, outside of CPU's they've never really excelled at anything else!
> 
> And things will get much worse before they get better for them, basically if they're looking to cut even more costs (gaming?) dGPU's could be on the chopping block!


I mean, until this year almost every single computer company used intel hardware - my AMD motherboards have intel ethernet

SATA ports, Ethernet chipsets, USB chipsets, wireless chips
Intel networking hardware is overall considered the best there is - high performance, no driver issues (and on the wifi side, incredibly cheap)

Intel make a looooot more than just CPUs


----------



## Valantar (Jul 29, 2022)

Mussels said:


> I mean, until this year almost every single computer company used intel hardware - my AMD motherboards have intel ethernet
> 
> SATA ports, Ethernet chipsets, USB chipsets, wireless chips
> Intel networking hardware is overall considered the best there is - high performance, no driver issues (and on the wifi side, incredibly cheap)
> ...


Not to mention FPGAs, datacenter/server networking and fabric, server RAID controllers/HBAs, 5G-related ASICs, SSDs, non-volatile memory solutions ... they do _a ton_ of stuff. With varying success? Sure. But a ton of stuff still.


----------



## R0H1T (Jul 29, 2022)

Mussels said:


> Intel make a looooot more than just CPUs


And they fail at most of them; remember (5G) modems? Most of what you quoted is part of the (mother)board, or does Intel sell SATA ports now? And as I pointed out earlier, they're not that complex.



Valantar said:


> FPGAs


They acquired Altera, just like AMD did with Xilinx. Outside their core area of x86 CPU, besides "networking" they have very little to show!



Valantar said:


> SSDs


Sold that a while back!









Intel Sells SSD Business and Fab to SK Hynix, New 'Solidigm' Subsidiary Launched

Intel and SK hynix seal the first phase of the deal.

www.tomshardware.com


----------



## Valantar (Jul 29, 2022)

R0H1T said:


> They acquired Altera, just like AMD did with Xilinx. Outside their core area of x86 CPU, besides "networking" they have very little to show!


...and? That makes Altera a part of Intel, and what-used-to-be-Altera's business Intel's business. Also, saying "besides "networking"" seems to indicate a lack of understanding of just how large, profitable and fundamental to the industry that sector is. Fabric and networking is the backbone of modern server, HPC and cloud computing, and will only be growing more important - and Intel is among the longest-standing, highest performing, most respected vendors in that area. There's a reason Nvidia spent billions acquiring Mellanox - fabric is crucial. 


R0H1T said:


> Sold that a while back!


The main NAND unit, sure, but they still make Optane SSDs for enterprise users. As I said, not all of these ventures are very successful, but many of them are.


----------



## R0H1T (Jul 29, 2022)

Valantar said:


> The main NAND unit, sure, but they still make *Optane SSDs* for enterprise users.


Did you miss the front page news?


Valantar said:


> Also, saying "besides "networking"" seems to indicate a lack of understanding of just how large, profitable and fundamental to the industry that sector is.


Your post would've made some sense had they excelled in 5G, or even 4G, but that never came to pass! In a world where QC, Broadcom, Marvell, MediaTek exist Intel doing solid networking is not earth shattering news. They've been at it for what 30~40 years now IIRC?


Valantar said:


> ...and? That makes Altera a part of Intel, and what-used-to-be-Altera's business Intel's business.


And that makes what Altera developed eons back somehow Intel's innovation?


Valantar said:


> There's a reason Nvidia spent billions acquiring Mellanox - fabric is crucial.


It is & how does that make my statement less relevant? Should I post the list of their failures? I'm sure I can get more than what you posted back then.

There's a reason there's just 2 major x86 chipmakers in the world or 2 (major) dGPU makers & no it's nothing to do with just their patents! It's effin hard that's why.


----------



## Valantar (Jul 29, 2022)

R0H1T said:


> Did you miss the front page news?


Given that it was posted a couple of hours ago, yes.


R0H1T said:


> Your post would've made some sense had they excelled in 5G, or even 4G, but that never came to pass! In a world where QC, Broadcom, Marvell, MediaTek exist Intel doing solid networking is not earth shattering news. They've been at it for what 30~40 years now IIRC?


... and does "having been at it for 30~40 years" somehow mean that they're not successful or a major player in this segment? I don't follow your logic here, sorry. I never claimed that it was news, neither earth shattering nor otherwise. I'm simply pointing out that your view of what Intel does is myopic.


R0H1T said:


> And that makes what Altera developed eons back somehow Intel's innovation?


Have I claimed that it is? Again: Altera's current business is Intel's business, and it is relatively successful. As such, it is a field in which Intel is doing well. Is that success largely due to the work of people at what was at the time another company? Sure. But so what? Companies merge and split constantly, and Intel bought Altera in 2015, meaning that FPGAs developed fully under Intel have been on the market for a bit now.


R0H1T said:


> It is & how does that make my statement less relevant? Should I post the list of their failures? I'm sure I can get more than what you posted back then.


Well, given that your statement was


R0H1T said:


> outside of CPU's they've never really excelled at anything else!


then yes, it does indeed make it less relevant, as it is a major, crucial field in which Intel has been a leading actor for decades by your own admission. That would constitute "excelling" at that, no?


R0H1T said:


> There's a reason there's just 2 major x86 chipmakers in the world or 2 (major) dGPU makers & no it's nothing to do with just their patents! It's effin hard that's why.


It's hard, expensive, and even more expensive if you don't have the patents. And, crucially, literally not legally possible without access to licencing those patents and core technologies. VIA _could_ have done it, but never had the funding or market presence to get that funding (though their Chinese branches/subsidiaries/spinoffs seem to be doing quite a lot better now, thanks to the cash injections that arrangement has brought with it).


----------



## R0H1T (Jul 29, 2022)

Valantar said:


> That would constitute "excelling" at that, no?


No, it would have to be at or near class leading ~ which admittedly, due to lack of competition in the GPU space, they are, although Apple's beating them hard with the Mx gen SoCs.


Valantar said:


> It's hard, expensive, and even more expensive if you don't have the patents. And, crucially, literally not legally possible without access to licencing those patents and core technologies.


That doesn't apply to Intel in the dGPU space, they had the patent sharing (licensing?) agreement with Nvidia & then AMD. If they couldn't do it a major reason was that they were simply not good enough!

Strictly speaking, outside of probably ethernet, Intel hasn't been leading anywhere in the *networking* space AFAIK, & even when it comes to ethernet they're not winning any benchmarks, so I'm not sure how you'd show they're excelling there? I'll give Intel bonus points for reliability though.








Qualcomm sues Apple for making its modems look inferior to Intel's
The chipset company countersues after it was hit with numerous lawsuits for overcharging manufacturers.
www.gsmarena.com


----------



## Valantar (Jul 29, 2022)

R0H1T said:


> No it would have to be at or near class leading, which admittedly due to lack of competition in the GPU space ~ they are, although Apple's beating them hard with the Mx gen SoCs.


... so Intel is "at or near class leading" in the GPU space, but by some measure _not_ in the networking space? Sorry, but what planet are you on? Intel is _far_ closer to the front of the pack in networking than they are in GPUs. It's also a more diverse field and thus more difficult to judge, but Intel is way, way up there in terms of networking both for consumers, enterprise, and high end datacenter stuff. As for GPUs, they are ... doing okay for a first effort, I guess?


R0H1T said:


> That doesn't apply to Intel in the dGPU space, they had the patent sharing (licensing?) agreement with Nvidia & then AMD. If they couldn't do it a major reason was that they were simply not good enough!


Wait, I was responding to your argument about X86 CPUs here, this wasn't about GPUs, was it?


R0H1T said:


> Strictly speaking outside of probably ethernet Intel's not been leading anywhere in the *networking *space AFAIK & even when it comes to ethernet they're not winning any benchmarks, so I'm not sure how you'd show they're excelling there? I'll give Intel bonus points for reliability though.
> 
> 
> 
> ...


From the way you're speaking of this it seems your understanding of what goes into "networking" is rather limited, as you're speaking either of client controllers (ethernet/Wifi) or mobile data (which isn't generally considered networking in that sense, though of course there's an overlap in functionality). You're entirely glossing over the enterprise networking market, datacenters, HPC, server fabric solutions, +++. These are _huge_, high performance, high cost and high margin areas. Intel is by no means an out-and-out market leader in this space, but they've delivered among the largest, best supported and best performing solutions in many areas of these complex markets for years and years. Claiming they "haven't really excelled" in this space is pure nonsense.


----------



## R0H1T (Jul 29, 2022)

Valantar said:


> so Intel is "at or near class leading" in the GPU space, but by some measure _not_ in the networking space?


I mentioned *at least 4 other competitors* in the "networking" space, besides ethernet ~ which you probably can't prove they're leaders in based on benchmarks, otherwise you would've done it! Where have they excelled? They're kinda "leading" in the GPU space due to lack of competition, much like x86 CPUs.


Valantar said:


> Wait, I was responding to your argument about X86 CPUs here, this wasn't about GPUs, was it?


What? You need to get your story straight, I already covered this here ~


R0H1T said:


> 2 (major) dGPU makers & no it's nothing to do with just their patents!* It's effin hard that's why*.


I clearly mentioned both CPU & GPU; the reason Intel couldn't do it is because it's difficult (to compete) & complex!

Wanna go back to your favorite point about networking now?


Valantar said:


> You're entirely glossing over the enterprise networking market, datacenters, HPC, server fabric solutions, +++. These are _huge_, high performance, high cost and high margin areas. Intel is by no means an out-and-out market leader in this space, but *they've delivered among the largest, best supported and best performing solutions in many areas of these complex markets for years and years. *Claiming they *"haven't really excelled" in this space is pure nonsense.*


Let me give you a hint ~ it's because of that virtual server (& until recently x86) monopoly!








Apple Removes Remaining Intel Components from M2 MacBooks
Apple has removed the final remaining Intel components from its latest M2 MacBooks with the Intel JHL8040R USB4 timer chips being replaced with a pair of custom U09PY3 chips. This change was discovered by iFixIt during a recent teardown and documented by Twitter user SkyJuice with the exact...
www.techpowerup.com



You seem to excel in pedantry, how about you back your arguments with more than just "the world knows"? I gave you a real-world example of how Intel failed at 4G modems; they utterly failed at even getting a 5G one to work!


----------



## noel_fs (Jul 29, 2022)

efikkan said:


> The card isn't garbage, but it is terrible value.
> 
> I think every article/review about low-end cards needs a disclaimer that these cards have many valid uses beyond gaming, especially for those wanting to add new capabilities to older machines which are still working fine.
> 
> If Intel priced this card below $100 and didn't market it as a gaming card, the reception would be fine.


i absolutely agree

but since that's not the case, we gotta call it for what it is.


----------



## efikkan (Jul 29, 2022)

noel_fs said:


> i absolutely agree
> 
> but since that's not the case, we gotta call it for what it is.


Precisely how is a product with the _wrong_ price "garbage"?
I believe any reasonable mind would distinguish between an overpriced product and a garbage product.
And prices can change, you know


----------



## Valantar (Jul 30, 2022)

R0H1T said:


> I mentioned* at least 4 other competitors *in the "networking" space, besides ethernet ~ which probably you can't prove they're leaders in based on benchmarks otherwise you would've done it! Where have they excelled? They're kinda "leading" in the GPU space due to lack of competition, much like x86 CPU's.


But ... does the existence of competitors somehow constitute Intel not excelling in networking? I don't follow your logic here, sorry. To excel at something doesn't mean you have to be the best at all times, nor that you have to be the undisputed leader at something.


R0H1T said:


> What? You need to get your story straight, I already covered this here ~
> 
> I clearly mentioned both CPU & GPU, the reason Intel couldn't do it because it's difficult (to compete) & complex!


Yet they _are_ competing - with a sub-par solution, sure, but for a first effort GPU it's really not bad - and, crucially, you said "it has nothing to do with their patents" which is just plain untrue. It's not like having access to the patented tech makes you immediately produce a competent product, obviously, but without those you're screwed before you even start.

I know you mentioned both CPUs and GPUs, but I responded specifically to the CPU part, using VIA as an example. I could have been more explicit about that selection, sure, but it should be pretty clear from what I wrote that I was talking specifically about CPUs. And, crucially, patent licencing in GPUs is much, much more open than CPUs since X86 is locked into a mutually supporting duopoly, while there are many actors holding important GPU patents.


R0H1T said:


> Wanna go back to your favorite point about networking now?
> 
> Let me give you a hint ~ it's because of that virtual server (& until recently x86) monopoly!
> 
> ...


... so? Someone picking another option, or Intel failing, is not proof of them "never really excelling" at something. A current failure does not negate previous success; a failure in one field does not negate a success in another. You're making major logical leaps here with nothing much supporting them, all to back up an exceptionally overbroad statement rather than admitting that it was bombastic and overbroad to begin with.


R0H1T said:


> You seem to excel in pedantry, how about you back your arguments with more than just "the world knows"? I gave you real world example of how Intel's failed at 4G modem, they utterly failed in even getting a 5G one to work!


So, let me get this straight: you make an exceptionally overbroad statement of "outside of x86 CPUs, Intel has never really excelled at anything else", which I object to, bringing up multiple examples of product segments in which they have been or are currently very successful, and that amounts to _pedantry_ to you? I don't think you're quite grasping just how broad and bombastic your first claim was - I'm just trying to add some desperately needed nuance here. You seem to express yourself in bold capital letter-type statements, which means there is _a lot_ of detail lacking in both arguments and purportedly factual statements. Are x86 CPUs the main reason for Intel's success, and the core of their business throughout the past three decades or so? Yes, absolutely. Does that amount to them not excelling at anything else? Not even slightly.


----------



## Selaya (Jul 30, 2022)

i mean ... arent/werent intel ssds like, _par excellence_ (ignoring the 660p and other qlc memes) - the problem is just, the nand/ssd market is an open one where intel doesn't hold the duopoly, so they were forced to compete on price (-performance), and weren't able to just like _hey, this is what we've got, and this is how much we're charging for it - take it or leave it_ as they are w/ their cpus, basically so they eventually just gave up on that market


----------



## 80-watt Hamster (Jul 30, 2022)

Selaya said:


> i mean ... arent/werent intel ssds like, _par excellence_ (ignoring the p660 and other qlc memes) - the problem is just, the nand/ssd market is an open one where intel doesn't hold the duopoly, so they were forced to compete on price (-performance), and weren't able to just like _hey, this is what we've got, and this is how much we're charging for it - take it or leave it_ as they are w/ their cpus, basically so they eventually just gave up on that market



This seems to be an emerging theme (or more likely one I'm just now noticing) in highly competitive markets.  "Gaming" memory brands have been popping up like weeds, so Crucial axes Ballistix.  Chinese headphones are taking over the world, so Sennheiser exits the consumer market entirely (licensing the Sennheiser name to someone else, IIRC).  Kingston sells HyperX to HP.  Honeywell sold/spun off all of its consumer product lines decades ago, as did GE.  There are probably innumerable other examples.


----------



## Valantar (Jul 30, 2022)

80-watt Hamster said:


> This seems to be an emerging theme (or more likely one I'm just now noticing) in highly competitive markets.  "Gaming" memory brands have been popping up like weeds, so Crucial axes Ballistix.  Chinese headphones are taking over the world, so Sennheiser exits the consumer market entirely (licensing the Sennheiser name to someone else, IIRC).  Kingston sells HyperX to HP.  Honeywell sold/spun off all of its consumer product lines decades ago, as did GE.  There are probably innumerable other examples.


Large brands exiting highly competitive, low margin markets is hardly new, though not all the examples you mention are relevant. Headphones in particular are quite high margin, and Sennheiser left because company leadership wanted to focus on their professional product lines (aviation, recording industry, etc.) and not spread company efforts too thinly. That's how their consumer brand got turned into Epos, which AFAIK is entirely independent from Sennheiser at this point (but seems to maintain the right to use the name on existing product lines, unless those are made by the pro team now?).


----------



## trsttte (Jul 31, 2022)

Selaya said:


> i mean ... arent/werent intel ssds like, _par excellence_ (ignoring the p660 and other qlc memes) - the problem is just, the nand/ssd market is an open one where intel doesn't hold the duopoly, so they were forced to compete on price (-performance), and weren't able to just like _hey, this is what we've got, and this is how much we're charging for it - take it or leave it_ as they are w/ their cpus, basically so they eventually just gave up on that market



Worse than that, they were doing Optane and 3D XPoint in a joint venture with Micron, which pulled out of the project, and Intel doesn't have the dedicated capacity to produce NAND flash. The technology had very good performance but couldn't compete on price, and a few workstations for nerds is not enough to justify keeping it alive (servers will just throw more cheap stuff in parallel until it meets the target).



80-watt Hamster said:


> This seems to be an emerging theme (or more likely one I'm just now noticing) in highly competitive markets.  "Gaming" memory brands have been popping up like weeds, so Crucial axes Ballistix.  Chinese headphones are taking over the world, so Sennheiser exits the consumer market entirely (licensing the Sennheiser name to someone else, IIRC).  Kingston sells HyperX to HP.  Honeywell sold/spun off all of its consumer product lines decades ago, as did GE.  There are probably innumerable other examples.



The same old race to the bottom. How many storage brands are there vs number of flash vendors? About 10:1 maybe?


----------



## Mussels (Jul 31, 2022)

R0H1T said:


> And they fail at most of them, remember (5G) modems? Most of what you quoted is part of the (mother) board or does Intel sell SATA ports now? And as I pointed out earlier they're not that complex.
> 
> 
> They acquired Altera, just like AMD did with Xilinx. Outside their core area of x86 CPU, besides "networking" they have very little to show!
> ...


you can buy them separately, especially as RAID cards.

I get the feeling you truly didn't know as much on this topic as you think you did - Intel sell a LOT of different hardware, they want to charge you for every slice of the pie they can


----------



## AusWolf (Aug 4, 2022)

What about this?

(embedded video)

Did any of these issues occur during the review? Did the "stable" beta driver help?


----------



## W1zzard (Aug 4, 2022)

AusWolf said:


> What about this?


Don't have time to watch the review, can you TLDR it for me? Happy to answer specific questions


----------



## Valantar (Aug 4, 2022)

W1zzard said:


> Don't have time to watch the review, can you TLDR it for me? Happy to answer specific questions


Lots and lots of driver/software issues. The base driver seems to work okay (if at very variable levels of optimization), but Arc Control seems to be so buggy as to be unusable:
- frequently breaks itself in unfixable ways, requiring an OS reinstall or manual registry cleaning to make it work again
- frequently fails to open, even directly after an installation
- Their new anti-tearing sync thing causes major artefacting
- OC controls do nothing
- On some PCs intermittent black screen issues, with the gpu refusing to give off any picture at all even after rebooting (but working in other systems)
- configuration conflicts between Arc Control and the regular Intel display settings app

And a bunch more. Their conclusion is essentially that the GPU should have launched with just a bare metal driver and no additional software at all, just a promise that it's coming once it works.


----------



## InVasMani (Aug 4, 2022)

So in summary I'd have less problems getting a GPU w/o a display output.


----------



## W1zzard (Aug 4, 2022)

Valantar said:


> - frequently breaks itself in unfixable ways, requiring an OS reinstall or manual registry cleaning to make it work again


never happened to me



Valantar said:


> - frequently fails to open, even directly after an installation


works for me. if they are talking about arc control, you need to update it first through the integrated updater



Valantar said:


> - Their new anti-tearing sync thing causes major artefacting


havent tried



Valantar said:


> - OC controls do nothing


worked for me, you have to unselect the control after dragging, by clicking into the window. this is a very common mistake rookie programmers make



Valantar said:


> - On some PCs intermittent black screen issues, with the gpu refusing to give off any picture at all even after rebooting (but working in other systems)


happens mostly with DP, use HDMI or Remote Desktop to work around


----------



## Valantar (Aug 4, 2022)

W1zzard said:


> works for me. if they are talking about arc control, you need to update it first through the integrated updater


They did, and had significant issues with the updater as well (anything from it not updating to failing to launch after updating to corrupting the install as described above).


W1zzard said:


> worked for me, you have to unselect the control after dragging, by clicking into the window. this is a very common mistake rookie programmers make


Pretty sure their video showed them clicking elsewhere afterwards with no effect (leaving the app? Switching to another?) but I can't say I remember that accurately. Still hardly acceptable...


W1zzard said:


> happens mostly with DP, use HDMI or Remote Desktop to work around


Yeah, that is... not an acceptable level of functionality for a shipping retail product.


----------



## W1zzard (Aug 4, 2022)

Valantar said:


> Pretty sure their video showed them clicking elsewhere afterwards with no effect


No idea what they did, but I used Arc Control for overclocking in my review and achieved the results presented there

But yeah, if GN can't get it working, a vast majority of users won't either


----------



## AusWolf (Aug 4, 2022)

Valantar said:


> Pretty sure their video showed them clicking elsewhere afterwards with no effect (leaving the app? Switching to another?) but i can't say I remember that accurately. Still hardly acceptable...


I think it was only about overvolting... the slider goes from 0 to +0.2 mV or something (+0.0002 V), that's why it has no effect. They also complained about overclocking not helping real-world performance.

I remember another issue from the video: you have to install the "driver helper" (or something) app in order to install Arc Control.
1. The driver helper sometimes creates multiple instances in Task Manager, eating away your spare RAM,
2. You need to have an internet connection to install Arc Control. If you do a regular "unplug the net - install driver - plug it back in" thing, you won't have Arc Control installed, and you cannot even install it later.
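For scale, a quick back-of-the-envelope on the overvolt slider range mentioned above (assuming the +0.2 mV figure reported in the video is accurate):

```python
# GN reportedly found Arc Control's overvolt slider tops out at +0.2 mV.
slider_max_mv = 0.2

# Converted to volts, the maximum offset is tiny:
offset_v = slider_max_mv / 1000
print(f"{offset_v} V")  # 0.0002 V
```

Typical GPU voltage offsets are on the order of tens of millivolts, so a 0.2 mV ceiling would be far too small to measurably affect clocks or stability.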


----------



## W1zzard (Aug 4, 2022)

AusWolf said:


> I think it was only about overvolting... the slider goes from 0 to +0.2 mV or something, (+0.0002 V), that's why it has no effect


Yeah OV is broken



AusWolf said:


> They also complained about overclocking not helping real world performance.


I gained 5% real-world performance, the "+x%" scale in Arc Control is completely wrong, maybe that set wrong expectations


----------



## noel_fs (Aug 10, 2022)

efikkan said:


> Precisely how is a product with the _wrong_ price "garbage"?
> I believe any reasonable mind would distinguish between an overpriced product and a garbage product.
> And prices can change, you know


we're not in the stock market; products get judged at the price they actually sell for


----------



## kajson (Aug 17, 2022)

Valantar said:


> Lots and lots of driver/software issues. The base driver seems to work okay (if at very variable levels of optimization), but Arc Control seems to be so buggy as to be unusable:
> - frequently breaks itself in unfixable ways, requiring an OS reinstall or manual registry cleaning to make it work again
> - frequently fails to open, even directly after an installation
> - Their new anti-tearing sync thing causes major artefacting
> ...


Sounds like anything but a paper launch on August 22nd would be ludicrous. Surely they won't try to get this into OEMs anytime soon? The negative impact on the brand name surely isn't worth it to Intel. Personally I think it would be incredibly arrogant to do a non-paper launch, wasting many hours of every customer's time troubleshooting or worse, having them do your alpha testing for you (not even remotely close to beta in my eyes). They should not be allowed to sell it with the current drivers/software.


----------



## catulitechup (Aug 23, 2022)

And now it's possible to buy the Arc A380 in the US, courtesy of Newegg, at $139









ASRock Challenger Arc A380 Video Card A380 CLI 6G OC - Newegg.com
Buy ASRock Challenger Arc A380 6GB GDDR6 PCI Express 4.0 ITX Video Card A380 CLI 6G OC with fast shipping and top-rated customer service. Once you know, you Newegg!
www.newegg.com


----------



## Lew Zealand (Aug 23, 2022)

I can get it tomorrow.  I'm tempted out of pure curiosity, but not enough.

Mmmmmostly not enough...


----------



## R0H1T (Aug 24, 2022)

Mussels said:


> I get the feeling you truly didn't know as much on this topic as you think you did


I'm pretty sure I know a lot about Intel & their various product lines with lots of infamous bugs, though it looks like some of you would like to pretend that just *because it's Intel they are flawless* or doing great everywhere!








An Intel Atom C2000 bug is killing products from multiple manufacturers - ExtremeTech
Intel's Avoton and Rangeley server products appear to have a significant flaw that's killing hardware years before its expected end-of-life.
www.extremetech.com











Intel Atom C2000 AVR54 Bug Strikes STH Today
A rough evening and morning in the STH infrastructure after the power failed and several infrastructure pieces did not come back up. That includes our primary firewall due to the Intel Atom C2000 series AVR54 bug.
www.servethehome.com

https://www.reddit.com/r/intel/comments/gp2okm








I can find more but I'm sure everyone knows this useful thing called *Google*


----------



## Valantar (Aug 24, 2022)

R0H1T said:


> I'm pretty sure I know a lot about Intel & their various product lines with lots of infamous bugs, though looks like some of you would like to pretend that just *because it's Intel they are flawless* or they're doing great everywhere!
> I can find more but I'm sure everyone knows this useful thing called *Google  *


Has anyone here been denying that Intel has made a bunch of buggy products? Not that I can recall, at least. Can you please just accept that you made a silly, overbroad statement that simply does not match reality, had explained to you why it was wrong, and that this is perfectly fine? This just looks like even more goalpost-shifting on your part. Intel has excelled in a bunch of product segments throughout the decades. No amount of examples of flaws or bugs will change that reality, as those things are not contradictory whatsoever. The only way you could prove your point would be to show how Intel has in fact _never _been successful in each of the segments that have been pointed out to you - because that's how broad your initial statement was. So, unless you can actually prove that Intel hasn't been successful in _any_ segment outside of X86 CPUs throughout the lifetime of the company, maybe just ... accept that you said something wrong, and move on?


----------



## R0H1T (Aug 24, 2022)

Nope, I meant it mainly wrt their failed ventures, & hence you being pedantic about this point ~ *they've never really excelled at anything else*!

Then you countered my post with something like ~ "Not to mention FPGAs, datacenter/server networking and fabric, server RAID controllers/HBAs, 5G-related ASICs, SSDs, non-volatile memory solutions ... they do a ton of stuff. With varying success? Sure. But a ton of stuff still."

So tell me again, what was your point?


Valantar said:


> The only way you could prove your point would be to show how Intel has in fact _never _been successful in each of the segments that have been pointed out to you


What you & the others have been doing is your own "subjective" deep dive analysis of what I meant. I clarified a bit later that it's more to do with complex systems like CPUs, GPUs etc. Since this is an Arc GPU thread it was more about that product, & then you tried to dodge that by bringing in SSDs, Optane, networking & what not. And now you're ignoring these product failures, some of them being massive fails?

If you're going to be pedantic then why don't we deep dive into your definition of "successful" & do a thorough analysis of your claims, including all these major whoopsies by Intel 

Intel for 20-30 years was the only choice in the x86 space & for about 2 decades in the server space. *You think that natural monopoly didn't give them a distinct advantage in selling & setting up their ecosystem*?

And in the context of this thread you can define "excelled" however you'd like to. It's like *celebrating the second or third participant just for taking part in a 2/3 horse race*!


----------



## Valantar (Aug 24, 2022)

R0H1T said:


> Nope I meant it mainly wrt their failed ventures, & hence you being pedantic about this point ~ *they've never really excelled at anything else*!
> 
> Then you countered my post with something like ~ Not to mention FPGAs, datacenter/server networking and fabric, server RAID controllers/HBAs, 5G-related ASICs, SSDs, non-volatile memory solutions ... they do a ton of stuff. With varying success? Sure. But a ton of stuff still.
> 
> ...


You mean to say you made an overbroad statement, tried to walk it back when confronted with that fact, but the people responding to you refused to agree to your arbitrary delineations? 

And, to be clear: nobody has argued that Intel's dominant X86 position hasn't benefitted them elsewhere. That is quite clearly the case - but it doesn't change the fact that, for example, their network adapters have been considered the industry standard for a decade or more across many if not most forms of networking.

As for the whole "I wasn't talking about in general, I meant CPUs, GPUs, etc." thing: what's "etc" in this case? Is there any other product category that is comparable to those two in the computing industry? You might say accelerators, but most of those are GPUs, and non-GPU accelerators are rather niche and new. And, if that is the line you're drawing, doesn't that kind of make what you're saying meaningless? "Intel has never really excelled outside of CPUs" isn't quite the same as "Intel has never really excelled in GPUs", is it?


----------



## Mussels (Aug 29, 2022)

R0H1T said:


> I'm pretty sure I know a lot about Intel & their various product lines with lots of infamous bugs, though looks like some of you would like to pretend that just *because it's Intel they are flawless* or they're doing great everywhere!
> 
> 
> 
> ...


Yeah... nice topic change.
Not sorry I forgot to check this thread for updates.

You made a statement in error and just dug yourself deeper and deeper, until you decided to talk about something else. Fun.


----------



## HD64G (Sep 10, 2022)

And... that's it folks!

Sadly, discrete GPUs from Intel won't be made for long. The software struggles and some hardware problems demanded too much money from Intel to sustain that product segment until it became financially self-sustainable, and Intel execs decided to axe it. So, Battlemage will be out with a few models since its design is ready, but short-term software support and no future products after that will be another big fail for the big blue.

I would have liked another competitor to keep the duo in check price-wise, but it was too hard and too expensive even for Intel to build one out of the blue...

I wonder if people can appreciate AMD for keeping both its CPU and GPU segments close to their competitors for a few years now, and reaching the top lately.


----------



## AusWolf (Sep 10, 2022)

HD64G said:


> And... that's it folks!
> 
> Sadly, discrete GPUs from Intel won't be made for long. The software struggles and some hardware problems demanded too much money from Intel to sustain that product segment until it became financially self-sustainable, and Intel execs decided to axe it. So, Battlemage will be out with a few models since its design is ready, but short-term software support and no future products after that will be another big fail for the big blue.
> 
> ...


Moore's Law Is Dead is a rumour channel. Their content has no news value in my opinion.


----------



## TheoneandonlyMrK (Sep 10, 2022)

Well, a quick Google of Arc A380 brings up A380 model airplane kits and, interestingly, an Intel Arc GPU sticker, but NO A380 GPU.

Soooooooo, there's that.

Xe is dead; it's a shame, but it doesn't work, isn't out in time to be competitive with anything, and Intel knows it.

MLID has it right IMHO.


----------



## HD64G (Sep 10, 2022)

TheoneandonlyMrK said:


> Well, a quick Google of Arc A380 brings up A380 model airplane kits and, interestingly, an Intel Arc GPU sticker, but NO A380 GPU.
> 
> Soooooooo, there's that.
> 
> ...


His sources (at least 2 of those) about this matter are directly from Intel.


----------



## Ravenas (Sep 18, 2022)

At $139 it is well below MSRP, but in my opinion it is still too expensive. It's definitely not a gamer card, and it's rather overpriced for the HTPC world. I would probably just buy an APU like the 5600G instead.

ASRock Challenger Arc A380 Video Card A380 CLI 6G OC - Newegg.com


----------



## trsttte (Oct 1, 2022)

@W1zzard would it be possible to do some quick tests with the latest driver after the arc750/arc770 review to see if/how much of an improvement there is?


----------

