
Intel Arc B580

Joined
Jun 20, 2024
Messages
389 (2.19/day)
Wondering if the relative lack of performance in Unreal-based titles is something that will be overcome, or a fundamental architectural issue with how the Unreal engine and Xe work together.
Silent Hill 2 and Black Myth: Wukong, for example, both show the B580 slipping to noticeably slower than the RTX 4060 and more level with AMD's RX 7600 offerings.

Doesn't bode well for Stalker 2 or the upcoming Witcher 4. It would also impact Hellblade, the Deliver Us Moon/Mars titles, etc. Hoping Intel's driver team can figure that one out.

He's talking about hybrid PhysX...

To be fair most games now do multicore PhysX instead of GPU but there are some older ones that will still use GPU - in that case to be honest an older Kepler/Maxwell card would be more than enough to act as the PhysX card. Wouldn't even need to be a powerful one...
 
Joined
Sep 26, 2022
Messages
2,118 (2.61/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
Wondering if the relative lack of performance in Unreal-based titles is something that will be overcome, or a fundamental architectural issue with how the Unreal engine and Xe work together.
Silent Hill 2 and Black Myth: Wukong, for example, both show the B580 slipping to noticeably slower than the RTX 4060 and more level with AMD's RX 7600 offerings.

Doesn't bode well for Stalker 2 or the upcoming Witcher 4. It would also impact Hellblade, the Deliver Us Moon/Mars titles, etc. Hoping Intel's driver team can figure that one out.
UE5 at that. Seen some reviews and it runs UE4 games just fine.
 
Joined
Jun 20, 2024
Messages
389 (2.19/day)
UE5 at that. Seen some reviews and it runs UE4 games just fine.
Fair enough - that'll mean Hellblade (1) and the Deliver Us titles would be OK then, as would older UE games.
But UE5 games are actually fairly plentiful now - e.g. Robocop and the new Fortnite update...

Wonder if anyone has seen how well it fares on pre-DX11 titles, i.e. how Intel's compatibility layer performs (is it significantly better than on the Xe1 cards, are there any major bugs, and what silly FPS numbers can it pull out)...
Half-Life 2 has had a bit of a repackage - some people may be dusting off DX9 or older titles...
 
Joined
Dec 6, 2022
Messages
431 (0.58/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Artic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
I know most people say that RT is not relevant in this segment, but isn't it possible that more games like the new Indiana Jones will pop up in years to come? The game simply won't run without RT hardware. At this moment the B580 should be more future-proof than the competition in that regard. Or is this a moot point?
But I must ask, are there any visible improvements brought by this forced RT?

I have seen plenty of examples where someone has to point out if and where the damned RT effect even is, besides the performance hit.

Games demanding RT hardware this "early" sets a bad precedent, unless they have somehow managed to include it without tanking performance.
To be fair most games now do multicore PhysX instead of GPU but there are some older ones that will still use GPU - in that case to be honest an older Kepler/Maxwell card would be more than enough to act as the PhysX card.
Every now and then, I like to replay the Arkham games, and those are infected with PhysX to the point that only having a Ngreedia GPU in the system will enable the effects - and yes, some of us (and all consoles) only have AMD hardware and can't see all the eye candy, especially in Arkham Knight.

I tried once to recreate the old way of having a Ngreedia GPU just for PhysX and an AMD one as the main card (remember how, back in the day, Ngreedia went to the extreme of disabling your GPU if their drivers detected an AMD GPU in the same system? fun times), and it was as bad as it was back then. Couldn't do it, because the Ngreedia GPU would deactivate itself unless it was the primary GPU or had a dummy adapter connected to one of its ports.

Fine, it's only eye candy, which doesn't affect gameplay, but it sucks that it's simply locked behind a hardware paywall.

RT, so far, has proven to be even less significant, and the performance hit is simply not worth the minimal visual changes (that I have observed).

Granted, there are exceptions, like the Portal and Quake remakes, but the rest is not all that.
 
Joined
Dec 31, 2020
Messages
993 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
The 4060 is the 50-class card of the 40 series, 18 months old, and after June-July it will be obsolete with the 5060 in the picture. The B580 should be using N3B at ~140 mm². Currently the die is almost as big as the 4070 Ti's, with density more like 6 nm.
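For a rough sense of that density gap, here's a back-of-the-envelope comparison; the transistor counts and die areas below are approximate figures from public reporting, so treat them as assumptions rather than exact values:

// Rough transistor-density comparison; figures are approximate public numbers, not authoritative.
#include <cstdio>

int main() {
    struct Die { const char* name; double transistors_billion; double area_mm2; };
    const Die dies[] = {
        {"B580 (BMG-G21, TSMC N5)",    19.6, 272.0},
        {"4070 Ti (AD104, TSMC 4N)",   35.8, 294.5},
        {"RX 7600 (Navi 33, TSMC N6)", 13.3, 204.0},
    };
    for (const Die& d : dies)
        std::printf("%-28s ~%.0f MTr/mm^2\n", d.name,
                    d.transistors_billion * 1000.0 / d.area_mm2);
    // Prints roughly 72, 122 and 65 MTr/mm^2: BMG-G21 lands much closer to
    // 6 nm-class density than to the other N5/4N-class dies.
    return 0;
}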
 
Joined
Dec 6, 2022
Messages
431 (0.58/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Artic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
Maybe we will see another era of secondary GPUs in computers. Just like some 12-14 years ago, someone would buy a GPU just to compute PissX, while the primary GPU was used for everything else.
See response above.
 
Joined
Nov 13, 2024
Messages
61 (1.91/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
But I must ask, are there any visible improvements brought by this forced RT?

I have seen plenty of examples where someone has to point out if and where the damned RT effect even is, besides the performance hit.

Games demanding RT hardware this "early" sets a bad precedent, unless they have somehow managed to include it without tanking performance.
Maybe it cuts production time a bit if you don't have to generate/include/place shaders, and maybe download size too, because it's rendered in real time (though I'm honestly unsure if that's true).

Also forced RT sounds ass
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,918 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
but isn't it possible that more games like the new Indiana Jones will pop up in years to come
Possible, yes, but looking at the flop sales numbers for Indiana Jones I think a lot of publishers will wonder if RT-only is the right strategy. Why block the vast majority of the userbase from buying your game?
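For context on what "requires RT hardware" means at the API level, here's a minimal sketch of the kind of capability check a D3D12 title can make at startup (Indiana Jones itself is a Vulkan title, so this is purely illustrative, not its actual code):

// Minimal sketch: gate a game on hardware ray tracing support under D3D12.
// Illustrative only; error handling and adapter selection are omitted.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
        return 1;  // no D3D12-capable adapter at all

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));

    if (opts5.RaytracingTier < D3D12_RAYTRACING_TIER_1_0) {
        std::printf("No DXR hardware tier: an RT-only title would refuse to run here.\n");
        return 1;
    }
    std::printf("DXR supported (tier enum value %d).\n", (int)opts5.RaytracingTier);
    return 0;
}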
 
Joined
Jun 20, 2024
Messages
389 (2.19/day)
I tried once to recreate the old way of having a Ngreedia GPU just for PhysX and an AMD one as the main card (remember how, back in the day, Ngreedia went to the extreme of disabling your GPU if their drivers detected an AMD GPU in the same system? fun times), and it was as bad as it was back then. Couldn't do it, because the Ngreedia GPU would deactivate itself unless it was the primary GPU or had a dummy adapter connected to one of its ports.
Supposedly you don't have to hack it anymore - it's no longer blocked by Nvidia's drivers (you still need a 'display'/dummy connected for the adapter to be active). I suspect that's more to do with the fact that most laptops (even at the higher end) may still use the CPU's iGPU as the primary display controller and then pass the dGPU through for gaming. It would be hypocritical for Nvidia to design for / support such a hardware implementation and then block certain software from functioning in that same supported configuration. Also, by the mid-2010s most games were not bothering with GPU PhysX, as none of the consoles were using Nvidia for anything and devs were happy to leverage multi-core CPU implementations.
 
Joined
Dec 12, 2016
Messages
1,928 (0.66/day)
Possible, yes, but looking at the flop sales numbers for Indiana Jones I think a lot of publishers will wonder if RT-only is the right strategy. Why block the vast majority of the userbase from buying your game?
Welcome to the great Nvidia RT con. By the way, why can't we get RT vs. raster comparisons for AAA games that ship with RT options, the same way you compare DLSS vs. XeSS vs. FSR? If we are all supposed to care so much about RT, shouldn't we have side-by-side screenshot comparisons for different games as they are released, with RT on and off? You are very good at these kinds of comparisons, so it would be really nice to know how much better a particular game looks with and without RT, and to get an idea of whether our game enjoyment will increase (thus justifying the extra premium we pay for GPUs nowadays).
 
Joined
Jan 3, 2021
Messages
3,575 (2.48/day)
Location
Slovenia
Processor i5-6600K
Motherboard Asus Z170A
Cooling some cheap Cooler Master Hyper 103 or similar
Memory 16GB DDR4-2400
Video Card(s) IGP
Storage Samsung 850 EVO 250GB
Display(s) 2x Oldell 24" 1920x1200
Case Bitfenix Nova white windowless non-mesh
Audio Device(s) E-mu 1212m PCI
Power Supply Seasonic G-360
Mouse Logitech Marble trackball, never had a mouse
Keyboard Key Tronic KT2000, no Win key because 1994
Software Oldwin
Joined
Jan 18, 2020
Messages
830 (0.46/day)
Very similar to a 6700 XT, which you can get for slightly less money but without the warranty. No OC on this card though, by the look of it. If this pulls prices down for AMD and Nvidia, then it's a good product.
 
Joined
Jul 24, 2024
Messages
274 (1.90/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
See response above.
I meant it the other way. Nvidia can't restrict RT capabilities, as it's part of DX12. Maybe, to see enough improvement in graphics, game devs will need to implement RT on a much wider scale - a situation that will be much more demanding on RT cores. So maybe ... primary GPU for rendering, secondary for RT computing? Just like with PissX back then.
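For what it's worth, DX12 does expose every adapter to an application (explicit multi-adapter), so nothing at the API level forbids a second GPU doing extra work; the catch is that each engine would have to split the frame and copy results between GPUs itself, which is why nobody ships it. A minimal enumeration sketch (vendor-agnostic, not taken from any shipping engine):

// Sketch: list every GPU in the system and report which exposes hardware DXR.
// Splitting rendering vs. RT across them would still be up to the engine
// (explicit multi-adapter), including per-frame copies of the results.
#include <dxgi.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP software rasterizer

        ComPtr<ID3D12Device> dev;
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&dev))))
            dev->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));

        std::wprintf(L"%s | %llu MB VRAM | hardware DXR: %s\n", desc.Description,
                     (unsigned long long)(desc.DedicatedVideoMemory >> 20),
                     opts5.RaytracingTier ? L"yes" : L"no");
    }
    return 0;
}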
 
Joined
Dec 28, 2012
Messages
3,927 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
It's a good card if it's sold for a lower price than its direct competitors, but it's an entry-level card for 1080p only, nothing more, competing with the RTX 4060/4060 Ti and Radeon 7600 XT.
But it's very likely that when Radeon RDNA4 comes out (launching in early 2025), Intel's new VGAs will lose their cost-benefit advantage once again.


Which compute applications would you like to see tests with?
AMD's track record would suggest otherwise. Their $300-and-below GPUs have consistently been disappointing lately, aside from the 6650 XT.

If RDNA4 is as much of an improvement as RDNA3 was, Intel doesn't have a lot to worry about, unless AMD is willing to take a loss.

Possible, yes, but looking at the flop sales numbers for Indiana Jones I think a lot of publishers will wonder if RT-only is the right strategy. Why block the vast majority of the userbase from buying your game?
I doubt the flop sales are from RT, much more likely it's from Disney's culture war that has utterly obliterated any fanbase for Marvel or Lucasfilm properties. As more of the writing was revealed to be modern slop, enthusiasm for the title quickly dissipated.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,918 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Very similar to a 6700 XT, which you can get for slightly less money
I checked and the 6700 XT goes for $320 to $360 on eBay. What pricing do you see?
 
Joined
Dec 10, 2014
Messages
1,332 (0.36/day)
Location
Nowy Warsaw
System Name SYBARIS
Processor AMD Ryzen 5 3600
Motherboard MSI Arsenal Gaming B450 Tomahawk
Cooling Cryorig H7 Quad Lumi
Memory Team T-Force Delta RGB 2x8GB 3200CL16
Video Card(s) Colorful GeForce RTX 2060 6GV2
Storage Crucial MX500 500GB | WD Black WD1003FZEX 1TB | Seagate ST1000LM024 1TB | WD My Passport Slim 1TB
Display(s) AOC 24G2 24" 144hz IPS
Case Montech Air ARGB
Audio Device(s) Massdrop + Sennheiser PC37X | Koss KSC75
Power Supply Corsair CX650-F
Mouse Razer Viper Mini | Cooler Master MM711 | Logitech G102 | Logitech G402
Keyboard Drop + The Lord of the Rings Dwarvish
Software Windows 11 Education 24H2 x64
Looks like Intel's bringing the heat to AMD. AMD can't catch up to Nvidia at the high end, and now Intel's hot on their heels at the low end. Good riddance. Maybe Lisa'll finally spare some R&D budget for Radeon. Haven't been this excited in a while.


Since the B580 goes up in the rankings at higher resolutions, does that mean driver overhead or some such at lower resolutions? Too tired to peruse 9 pages of comments to see if anybody else has asked this already.


If so, is it finewinin' time?


 
Joined
Jan 27, 2015
Messages
1,741 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
This begs the question: What did LTT do differently?

Different and fewer games, also used Vulkan in some of them.

Since the B580 goes up in the rankings at higher resolutions, does that mean driver overhead or some such at lower resolutions? Too tired to peruse 9 pages of comments to see if anybody else has asked this already.

I think it's primarily the hardware, wider memory bus and more VRAM, that makes it go up in ranking at 1440p/4K.

I also think 4K and 1440p are more important than people posting here realize. I pretty much run everything at 4K now with a 6700 XT, for example. I never run at 1080p.

So, taken from that vantage point, the B580 suddenly becomes a direct competitor to the 4060 Ti at $150 less.
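A quick back-of-the-envelope on why the wider bus and extra VRAM matter more as resolution climbs; the memory specs below are approximate published figures, so treat them as assumptions:

// Back-of-the-envelope memory bandwidth: bus width (bits) x data rate (Gbps) / 8.
// Specs are approximate published figures, not measurements.
#include <cstdio>

int main() {
    struct Card { const char* name; int bus_bits; double gbps; int vram_gb; };
    const Card cards[] = {
        {"Arc B580",    192, 19.0, 12},
        {"RTX 4060",    128, 17.0,  8},
        {"RTX 4060 Ti", 128, 18.0,  8},
        {"RX 6700 XT",  192, 16.0, 12},  // plus 96 MB Infinity Cache
    };
    for (const Card& c : cards)
        std::printf("%-12s %d-bit x %.0f Gbps = %4.0f GB/s, %2d GB VRAM\n",
                    c.name, c.bus_bits, c.gbps, c.bus_bits * c.gbps / 8.0, c.vram_gb);
    // ~456 GB/s + 12 GB for the B580 vs ~272-288 GB/s + 8 GB for the 4060/4060 Ti.
    // At 1440p/4K, frames lean harder on bandwidth and VRAM, which fits the B580
    // climbing the charts as resolution goes up (the 4060 Ti's large L2 cache only
    // partially compensates).
    return 0;
}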

4K overall results: (overall performance chart attached)
 
Joined
Aug 21, 2015
Messages
1,742 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
I really wish people would stop with the asinine name-calling.
If it weren't guaranteed to get me LQ'd to hell and back, I'd be sorely tempted to respond to any instance of "M$" or "Ngreedia", etc. with a clip of the How I Met Your Mother cast making fart noises.
 
Joined
Apr 30, 2011
Messages
2,712 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Great VFM. If they manage to have reliable drivers and fewer games that drop performance like a rock, it will sell well. Thing is, they cannot make many of those at that price without having problems with their profit margins. I hope I am wrong and they make millions of them, which would help push AMD and nVidia to drop their prices too.
 
Joined
Jan 18, 2020
Messages
830 (0.46/day)
Joined
Aug 21, 2015
Messages
1,742 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Great VFM. If they manage to have reliable drivers and fewer games that drop performance like a rock, it will sell well. Thing is, they cannot make many of those at that price without having problems with their profit margins. I hope I am wrong and they make millions of them, which would help push AMD and nVidia to drop their prices too.

I'm not convinced profit margins on ARC matter that much to Intel at present. PC graphics looks for all the world like bonus money for all three players, who see HPC as the profit generator for GPU compute. Nvidia may basically own the space, but it's a space that's growing. Where there's growth, there's opportunity to snag share. ARC may simply be a way to salvage some revenue while Intel gets the architecture and ecosystem sorted for the real work: large-scale AI.
 
Joined
Dec 6, 2022
Messages
431 (0.58/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Artic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
I meant it the other way. Nvidia can't restrict RT capabilities, as it's part of DX12. Maybe, to see enough improvement in graphics, game devs will need to implement RT on a much wider scale - a situation that will be much more demanding on RT cores. So maybe ... primary GPU for rendering, secondary for RT computing? Just like with PissX back then.
I misunderstood your original comment.

I recall reading that such a scenario is not possible because of how RT is rendered into the final image, but I am open to it if it ever becomes a possibility.
I really wish people would stop with the asinine name-calling.
Funnily enough, that seems to trigger people who are fans of the brand/corporation in question.

You must accept that others have good reasons to hate what you perceive as a favorite/loved/preferred brand.

Perhaps ask why such people hate them and whether their reasons are valid, instead of getting upset about it.

And don't take this as an attack - personally, I think that you in particular provide lots of good input on this forum.

Simply bringing another possible point of view.

I'm not convinced profit margins on ARC matter that much to Intel at present.
My curiosity is: how low can a company go, price-wise, before it's considered illegal price dumping?

Don't get me wrong, I am down for cheaper GPUs.
 
Joined
Jan 27, 2015
Messages
1,741 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I'm not convinced profit margins on ARC matter that much to Intel at present. PC graphics looks for all the world like bonus money for all three players, who see HPC as the profit generator for GPU compute. Nvidia may basically own the space, but it's a space that's growing. Where there's growth, there's opportunity to snag share. ARC may simply be a way to salvage some revenue while Intel gets the architecture and ecosystem sorted for the real work: large-scale AI.

The entire dGPU market was $11.7B in 2023. By contrast, Musk reportedly spent ~$6B on the initial AI rollout for the xAI Colossus data center in Memphis, TN.
 