
Intel Arc B580

I'm a bit confused here, maybe I misunderstood W1zzard's message. But why does it matter who owns the tech/software now/implemented it as a standard? AMD found a way to introduce something cheap that almost did the same thing. Why argue about something this pointless? It's a win for AMD (and us) in my book, at least.

This was already mentioned by me in #262 (FreeSync).
Well then, I will use AMD's software. It is so good that Nvidia has copied it, at an alleged 15% performance tax.
 
That depends on what you mean by "Standby". If you're talking about low-power sleep mode during an active OS session, then maybe. If you're talking about when the display is not receiving an active signal because the connected system is powered off, then no. In "Powered Off" mode, the display will draw less than 2.5 W (IIRC) as per US regulations, which became mandatory in the US a number of years ago and were adopted worldwide soon thereafter.
Thus. Read, learn.

Good grief! And people call ME condescending?
There are norms, standards, and regulations, but despite all that, shit indeed sometimes happens. Btw, US energy regulations are a joke; the EU requires less than 1 W, and I think 0.5 W, to be precise, is the newest norm. JayzTwoCents found that the G-Sync module continued to consume 14 W when the monitor reached the state where the display and its illumination were not operating due to a missing input signal. Now it's up to you to do your own magic and process this statement in whatever way your own thinking desires.

Also, this was interesting:

I'm a bit confused here, maybe I misunderstood W1zzard's message. But why does it matter who owns the tech/software now/implemented it as a standard? AMD found a way to introduce something cheap that almost did the same thing. Why argue about something this pointless? It's a win for AMD (and us) in my book, at least.
It was to point out that W1zzard's statements, regarding how AMD came up with the VRR standard and then handed it to VESA to standardize, were not based on the truth.
Adaptive-Sync was specified by VESA more than a year before any AMD FreeSync happened. And even VESA's own statements refute his claim. You can't f*ck with history, after all.
 
I have a theory about why the card crashes randomly with the voltage slider maxed or near max.
When I have HWiNFO or GPU-Z open on my second monitor and the card crashes, it freezes for 5-10 seconds before the PC reboots.
Every time, the last frame in my GPU-Z window shows that the GPU hit the power limit very hard and dropped only the voltage, but not the core clock.

Example:
Stock = 1045 mV @ 2850 MHz

+50% voltage = 1100 mV @ 3000 MHz
+100% voltage = 1195 mV @ 3150 MHz

When the card goes over 215 W, it drops the clocks down to ~2950 MHz but keeps the voltage at 1.195 V. But when it locks up and hard-crashes, the last frame shows 3150 MHz at 1.095 V instead.
It looks like the throttling behaviour is broken. (Rough sketch of that failure mode below.)
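A minimal way to express that broken state, as a sketch: on a sane V/F curve, the applied voltage should never fall below the minimum the current clock needs. The V/F points below are just the slider numbers from my example above; the card's real V/F table and stability margins are unknown assumptions on my part.

```python
# Sketch: sanity-check a DVFS state against an assumed V/F curve.
# The curve points are the slider examples from this post, not the
# card's real table, and real stability margins are unknown.

vf_curve = {          # core clock (MHz) -> assumed minimum stable voltage (V)
    2850: 1.045,
    3000: 1.100,
    3150: 1.195,
}

def is_stable(clock_mhz: int, voltage_v: float) -> bool:
    """True if the applied voltage meets the assumed minimum for this clock."""
    return voltage_v >= vf_curve[clock_mhz]

# Normal power-limit throttle: clock and voltage drop together -> fine.
print(is_stable(2850, 1.045))   # True

# The crash frame from GPU-Z: the clock stayed at 3150 MHz while the
# voltage dropped to 1.095 V, well under the ~1.195 V the curve calls for.
print(is_stable(3150, 1.095))   # False -> a plausible hard-crash condition
```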
 
I only see NVIDIA advancing technology in major ways
And ignoring their main intention: to keep you locked into their hardware, hence limiting your options as a consumer.
I don't know how old you are, but I am old enough to remember when reviewers would advise consumers to steer away from lock-in tech, and savvy consumers would heed such calls.

Hell, PhysX was their first push for this BS; they were called out, and the industry fought it with proper open options.

I personally don't care for upscaling, but if I had no choice but to use it, given the current options, I would always use the one that works for everyone instead of the one that takes away my options, even if that option is not the absolute best.
I'm a bit confused here, maybe I misunderstood W1zzard's message.

But why does it matter who owns the tech/software now/implemented it as a standard?
It matters because W1zzard is trashing AMD's efforts, ignoring their pro-consumer impact and, as is now typical of the current tech media, praising all the anti-consumer crap that Ngreedia pulls.
 
I don't know how old you are
Considering that I've been running this site for 20 years and have been in this industry for close to 25 years, I'm at least 25. But I don't think age matters. If you have to resort to personal attacks just because I don't say what you want to hear, I think the issue is somewhere else.

Trust me, I hope that AMD can do better in the future, they must, or we'll be paying 1000 bucks for a RTX 7050 in a few years and I can close the site because nobody will care about GPUs anymore. Well, actually I'd have to rebrand to AIPowerUp and test enterprise stuff. Comments will be friendlier, I'm sure
 
Considering that I've been running this site for 20 years and have been in this industry for close to 25 years, I'm at least 25.
So I am way older than you, and congrats on being able to keep a tech site alive for so long. Not an easy task.
But I don't think age matters.
In this case, it does; I was using it as a preface to the old adage "Those who forget history tend to repeat it."
If you have to resort to personal attacks
That was not a personal attack; it was simply part of a sentence, establishing a parameter that related the point I made to age and time.
just because I don't say what you want to hear, I think the issue is somewhere else
Sorry, but in this case it's not about what I want to hear; it's about what you decided to ignore.
Trust me, I hope that AMD can do better in the future,
Same here, and I went one step further by purchasing their GPUs, regardless of all the negative press they keep getting.
And I have zero complaints about my previous 6900 XT or my current 7900 XTX.
they must, or we'll be paying 1000 bucks for a RTX 7050 in a few years
Now I will need royalty payments on that, because I said (and trademarked) that a while ago. :D
Comments will be friendlier, I'm sure
Nah, count on people to continue bitching and attacking others to defend their favorite mega corporation that has shown zero care for your needs.
 
Nah, count on people to continue bitching and attacking others
Irony!

There are norms, standards, and regulations, but despite all that, shit indeed sometimes happens. Btw, US energy regulations are a joke; the EU requires less than 1 W, and I think 0.5 W, to be precise, is the newest norm. JayzTwoCents found that the G-Sync module continued to consume 14 W when the monitor reached the state where the display and its illumination were not operating due to a missing input signal. Now it's up to you to do your own magic and process this statement in whatever way your own thinking desires.

Also, this was interesting:
None of that supports your conjecture. Just throwing it out there.
 
And ignoring their main intention: to keep you locked into their hardware, hence limiting your options as a consumer.
I don't know how old you are, but I am old enough to remember when reviewers would advise consumers to steer away from lock-in tech, and savvy consumers would heed such calls.

Hell, PhysX was their first push for this BS; they were called out, and the industry fought it with proper open options.

I personally don't care for upscaling, but if I had no choice but to use it, given the current options, I would always use the one that works for everyone instead of the one that takes away my options, even if that option is not the absolute best.



It matters because W1zzard is trashing AMD's efforts, ignoring their pro-consumer impact and, as is now typical of the current tech media, praising all the anti-consumer crap that Ngreedia pulls.
But Nvidia wins now, since they can force devs to prioritize DLSS over FSR, like in Dragon's Dogma 2: you get DLSS 3 but FSR is still stuck on version 2. There are also many devs that only optimized for AMD, and that's why the 4070 wins over the 7800 XT with 30% less raw power.
Now Nvidia is confirmed to use PCIe 5.0, which we have already had for a few years, while AMD still sticks to higher-latency PCIe 4.0 yet still makes x8 cards. Intel has already made cheap $100-150 PCIe 5.0 boards.
What really kills a chip is a 128-bit memory bus plus an x8 PCIe link if it's 4.0 (rough numbers below).
Nvidia is also not stingy with L2 cache, unlike the RX 7000 series, which reduced L3 to almost the same size as Nvidia's L2 on mid/low-tier cards.
They all sell at the same price here, so there's no benefit to choosing AMD. Meanwhile, I have been waiting for Super SIMD so that it can be a true UDNA (formerly planned for RDNA5, but everything can change).
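To put rough numbers on the bus-width/PCIe point, here's a back-of-the-envelope sketch. The 20 Gbps GDDR6 data rate is an illustrative assumption, not any specific card's spec, and the PCIe figures ignore encoding overhead:

```python
# Back-of-the-envelope bandwidth math for the bus-width / PCIe complaint.
# The 20 Gbps GDDR6 data rate is an illustrative assumption.

def vram_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak VRAM bandwidth in GB/s: (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(vram_bandwidth_gbs(128, 20.0))  # 320.0 GB/s on a 128-bit bus
print(vram_bandwidth_gbs(192, 20.0))  # 480.0 GB/s on a 192-bit bus

# PCIe throughput per direction: roughly 0.25 GB/s per lane at Gen 1,
# doubling each generation (encoding overhead ignored).
def pcie_gbs(gen: int, lanes: int) -> float:
    return 0.25 * 2 ** (gen - 1) * lanes

print(pcie_gbs(4, 8))    # ~16 GB/s: a Gen4 x8 link
print(pcie_gbs(5, 8))    # ~32 GB/s: the same x8 width on Gen5 doubles it
print(pcie_gbs(4, 16))   # ~32 GB/s: a full Gen4 x16 slot matches Gen5 x8
```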
 
Lower idle draw for Intel cards sounds great; however, I was wondering about your power figures. I saw you measure them on the DC side, but I'm scratching my head a bit, because Cybenetics results tell me my power supply should be 70~85% efficient at 25~50 W loads. My 12400 system without a dGPU idles at 21 W, but having an RTX 3070 dGPU plugged in bumps it up to 57 W... so in a really rough sense, (57 W - 21 W)/9 W = 4... i.e., my dGPU should idle at 9 W, but my PSU has an efficiency of 25%? I didn't see any decrease when enabling the ASPM feature in my BIOS, so perhaps someone could help clear up this mystery? (Arithmetic spelled out below.)

MSI PRO B660-A, Phanteks AMP 650, ASUS TUF RTX 3070, and a Kill A Watt meter
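Spelling my arithmetic out, in case someone spots the flaw (my wall readings plus the ~9 W DC-side idle figure from the review):

```python
# The implied-efficiency puzzle, spelled out.

ac_without_gpu = 21.0   # W at the wall, no dGPU installed (Kill A Watt)
ac_with_gpu    = 57.0   # W at the wall, RTX 3070 installed
dc_gpu_idle    = 9.0    # W, DC-side card-only idle (reviewer figure)

ac_delta = ac_with_gpu - ac_without_gpu        # 36.0 W extra at the wall
implied_efficiency = dc_gpu_idle / ac_delta    # 0.25 -> "25% efficient"?

print(f"{ac_delta} W at the wall for {dc_gpu_idle} W DC "
      f"-> implied efficiency {implied_efficiency:.0%}")

# Possible catch (an assumption on my part): the extra wall draw isn't only
# the card's DC power divided by PSU efficiency. Adding a dGPU can also
# raise platform power (PCIe link states, board/chipset, shallower CPU
# idle states), so dividing the two deltas doesn't isolate the PSU at all.
```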
 
That is a fair point. Does that work on Battlemage? I thought PTOD was dependent on feature support on the GPU?
It actually runs. There are two new videos that test this now; on YouTube, search "arc b580 cyberpunk path tracing" and filter by last week:
  1. The 1st vid shows in 1440p, native/no upscaling/Resolution Scaling OFF, no frame generation, Path Tracing ON, High-Ultra: 10 FPS (other settings are also tested).
  2. The 2nd vid shows in 1080p, native/no upscaling/Resolution Scaling OFF, no frame generation, Path Tracing ON, High-Ultra: 16 FPS (other settings are also tested).
But neither includes a proper comparison with other GPUs on the same system/settings.
 
  1. The 1st vid shows in 1440p, native/no upscaling/Resolution Scaling OFF, no frame generation, Path Tracing ON, High-Ultra: 10 FPS (other settings are also tested).
  2. The 2nd vid shows in 1080p, native/no upscaling/Resolution Scaling OFF, no frame generation, Path Tracing ON, High-Ultra: 16 FPS (other settings are also tested).
This begs the question: how would the performance be if the settings were not on Ultra/High but instead more balanced/reasonable?
 
Well..
"Arc B580 Overhead Issue, Ryzen 5 3600, 5600, R7 5700X3D & R5 7600: CPU-Limited Testing"

This begs the question: how would the performance be if the settings were not on Ultra/High but instead more balanced/reasonable?
idk, "High" is the lowest I would go. The 10-16 FPS is not an issue in the sense that I think a 4060(Ti) performs about the same?, which implies that a B780 (or what the name would be) might be an alternative to a NV GPU for same/60+ FPS using Path-tracing + upscaling?

[..]

I've been absolutely annoyed by the HDMI Forum for the last 2 years.

That's one of the main reasons I switched to NV (I use a Samsung OLED TV, and there are only HDMI ports; adapters may have issues with VRR and/or other features. I'd have no issue with AMD using binary blobs to make it work). The other was the power-scaling issue with RDNA3 (maybe due to the chiplet design, though I wonder how the non-chiplet RDNA3 RX 7600 (XT) scales its power).
 
Looks like Intel's bringing the heat to AMD. AMD can't catch up to Nvidia at the high end, and now Intel's hot on their heels at the low end. Good riddance. Maybe Lisa'll finally spare some R&D budget for Radeon. Haven't been this excited in a while.


Since the B580 goes up in the rankings at higher resolutions, does that mean driver overhead or some such at lower resolutions (toy model of the idea below)? Too tired to peruse 9 pages of comments to see if anybody has asked this already.


If so, is it finewinin' time?
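My rough mental model of how that would work, if it is driver overhead; all the numbers below are invented for illustration, not measurements:

```python
# Toy model: why driver overhead only shows up at lower resolutions.
# All fps numbers are invented for illustration.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The delivered framerate is bounded by the slower of the two sides."""
    return min(cpu_fps, gpu_fps)

cpu_side_light_driver = 200.0   # CPU + lean driver can prepare 200 frames/s
cpu_side_heavy_driver = 120.0   # heavier driver: only 120 frames/s

for res, gpu_fps in [("1080p", 180.0), ("1440p", 110.0), ("4K", 55.0)]:
    light = delivered_fps(cpu_side_light_driver, gpu_fps)
    heavy = delivered_fps(cpu_side_heavy_driver, gpu_fps)
    print(f"{res}: light driver {light} fps, heavy driver {heavy} fps")

# 1080p: 180 vs 120 -> the overhead costs a third of the framerate.
# 1440p: 110 vs 110 -> overhead hidden; the GPU is the limit on both.
# 4K:     55 vs  55 -> identical, so a card with heavy driver overhead
# "climbs the charts" at higher resolutions simply because its CPU-side
# ceiling stops mattering.
```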


In light of Hardware Unboxed's recent testing, very unfortunate. Didn't AMD's 6000-series cards also have this issue a few years ago?
 
Well..
"Arc B580 Overhead Issue, Ryzen 5 3600, 5600, R7 5700X3D & R5 7600: CPU-Limited Testing"


idk, "High" is the lowest I would go. The 10-16 FPS is not an issue in the sense that I think a 4060(Ti) performs about the same?, which implies that a B780 (or what the name would be) might be an alternative to a NV GPU for same/60+ FPS using Path-tracing + upscaling?


That's one of the main reasons I switched to NV (I use a Samsung OLED TV, and there are only HDMI ports; adapters may have issues with VRR and/or other features. I'd have no issue with AMD using binary blobs to make it work). The other was the power-scaling issue with RDNA3 (maybe due to the chiplet design, though I wonder how the non-chiplet RDNA3 RX 7600 (XT) scales its power).

Nvidia doesn't work with gamescope; it's a waste of time and money to try. They are slowly but surely marching toward an open-source driver anyway.
 
In light of Hardware Unboxed's recent testing, very unfortunate. Didn't AMD's 6000-series cards also have this issue a few years ago?

Not that I remember, though I didn't get my Radeon 6xxx cards until after the crypto craze was over and prices came back down to normal, so that's like a year later or more.

On topic: I got my B580 yesterday and tested a few games, most of which are older, so they don't get tested as much as current games by reviewers.

These run fine at 1440p with no CPU-overhead shenanigans using a 5700X3D:

Minecraft w/ VulkanMod at 32 (max) render distance (350-1400 fps)
Cyberpunk 2077 (~85 fps w/ HUB quality settings)
Shadow of the Tomb Raider (~105 fps w/ HUB quality settings)
Rocket League at 144 Hz VSync is perfect, lol. Didn't try uncapped; will do that next.

Has some CPU overhead issues at 1440p:

Horizon Zero Dawn (DF quality settings) in the benchmark at Meridian, where there are many dozens of NPCs: ~74 fps with an occasional stutter, which I never see on NV or AMD cards. My save game is currently not in Meridian and plays at 1440p 60 Hz VSync perfectly, but I usually play this game on other cards at 80 fps. I'll try 80 next and head to Meridian in-game to see if it's any different, as the benchmark moves through the city faster than you can run, which may exacerbate the problem.
 
Not that I remember, though I didn't get my Radeon 6xxx cards until after the crypto craze was over and prices came back down to normal, so that's like a year later or more.

On topic: I got my B580 yesterday and tested a few games, most of which are older, so they don't get tested as much as current games by reviewers.

These run fine at 1440p with no CPU-overhead shenanigans using a 5700X3D:

Minecraft w/ VulkanMod at 32 (max) render distance (350-1400 fps)
Cyberpunk 2077 (~85 fps w/ HUB quality settings)
Shadow of the Tomb Raider (~105 fps w/ HUB quality settings)
Rocket League at 144 Hz VSync is perfect, lol. Didn't try uncapped; will do that next.

Has some CPU overhead issues at 1440p:

Horizon Zero Dawn (DF quality settings) in the benchmark at Meridian, where there are many dozens of NPCs: ~74 fps with an occasional stutter, which I never see on NV or AMD cards. My save game is currently not in Meridian and plays at 1440p 60 Hz VSync perfectly, but I usually play this game on other cards at 80 fps. I'll try 80 next and head to Meridian in-game to see if it's any different, as the benchmark moves through the city faster than you can run, which may exacerbate the problem.
Honesty is still available on TPU. Interesting results. I wish AMD and Nvidia would lower the prices of the 4070 and 7600 XT.
 
Has some CPU overhead issues at 1440p:

Horizon Zero Dawn (DF quality settings) in the benchmark at Meridian, where there are many dozens of NPCs: ~74 fps with an occasional stutter, which I never see on NV or AMD cards. My save game is currently not in Meridian and plays at 1440p 60 Hz VSync perfectly, but I usually play this game on other cards at 80 fps. I'll try 80 next and head to Meridian in-game to see if it's any different, as the benchmark moves through the city faster than you can run, which may exacerbate the problem.

OK, played some HZD and headed back to Meridian at uncapped fps, as the B580 can't do 80 fps locked at 1440p. I never play uncapped, as it's always smoother to VSync to 60, 80, 120, or 144 Hz, depending. That said, outside the city the game plays perfectly at 68-100 fps, scene-dependent, with no stutters even with 15+ enemies in the action. As tested before, 60 fps VSync is also perfect.

Entering Meridian was interesting: as you pass Cut Cliffs and descend into the city, CPU use goes up from 40-45% to 70-90%. That's pretty far outside Meridian, so the game loads and schedules all those NPCs from quite a ways away. And FPS and GPU use drop into the 50s with ~85% GPU use, both fluctuating. This is the driver-overhead bottleneck. The worst was inside the city just past the market (now following the benchmark path), which on rare occasions dropped into the mid-40s fps with 65% GPU use, though it was usually over 60 fps with ~85% usage. Of course the presentation wasn't smooth, though it's much smoother than games with traversal stutter, so I guess it could be worse. Could be a lot better, though.

OK, doing the same run locked to 60 fps VSync works much better: drops from 60 are notably less frequent, only going to 54 fps, with a smoother presentation. CPU use goes up to 92% but usually stays in the 70s. There is one area of notably higher GPU use in the city, the public prayer area with lots of smoke, where the GPU hits 100% at 60 fps. Most of the time in the city it was at 58-60 fps, so this seems a good way to run the game on the B580.

I hope Intel can address this driver overhead in future updates. It's not stuttery, which would suggest "broken" to me; instead it's simply too heavy on the CPU and competes with the game for resources in CPU-heavy parts of some games. Trying SotTR in Paititi next, but based on the benchmark, it'll likely work fine.

Edit: SotTR works smoothly with no issues at 1440p 80 fps (HUB quality settings) running around the NPC-heavy Paititi and Kuwaq Yaku areas.
 
Well..
"Arc B580 Overhead Issue, Ryzen 5 3600, 5600, R7 5700X3D & R5 7600: CPU-Limited Testing"
I can't take anything from HU seriously. Got a better source of info?

idk, "High" is the lowest I would go. The 10-16 FPS is not an issue in the sense that I think a 4060(Ti) performs about the same?, which implies that a B780 (or what the name would be) might be an alternative to a NV GPU for same/60+ FPS using Path-tracing + upscaling?
That's personal preference. You'd be surprised just how good a mix of medium/high settings looks. Finding the right balance takes some tweaking but it's worth it to get the most from the experience.
 
I can't take anything from HU seriously. Got a better source of info?

Their data looks good and backs up what HardwareCanucks originally reported. What's wrong with their methodology?
 
Their data looks good and backs up what HardwareCanucks originally reported. What's wrong with their methodology?
Never trust their info by itself. They have a habit of skewing data/info in favor of whatever agenda they have going at the time. Always seek out comparative data.
 
Never trust their info by itself. They have a habit of skewing data/info in favor of whatever agenda they have going at the time. Always seek out comparative data.
Hardware Canucks are not in the same league as HUB for what you describe.
 