
Intel Arc B580

TechPowerUp made a complete mess of the Space Marine 2 test. The 4060 Ti 16 GB ended up behind the 8 GB version in the overall average across 25 games, even though it outperforms it in several games by a decent margin. Just to highlight the absurdity: the 4060 Ti 8 GB "scored" 103 FPS compared to a ridiculous 56.5 FPS for the 16 GB version. And the 4060? It came out way ahead of the 4060 Ti 8 GB, hitting over 70 FPS. lol, that alone wrecked the entire average. There are other nonsensical results too, like the 3070 being 50% ahead of the 3060 Ti. The worst part? They haven't fixed anything yet. Please correct these Space Marine 2 results; they make no logical sense.
 
Look again... you'll find it. And I wonder why suddenly everybody is doing so much video encoding... and why they couldn't do it before?
I am converting all of my physical media to AV1, which wasn't available on AMD or my (now retired) Nvidia card. My A310 rips through encoding and only cost me $88.
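For anyone wondering what that workflow looks like in practice, here's a rough sketch of a batch re-encode driven from Python. It assumes an FFmpeg build with Intel QSV (oneVPL) support that exposes the av1_qsv encoder; the folder names and quality settings are placeholders, not recommendations:

```python
# Rough sketch of a batch AV1 re-encode on an Arc card's hardware encoder.
# Assumes an FFmpeg build with Intel QSV (oneVPL) support exposing av1_qsv;
# folder names and quality settings are placeholders, not recommendations.
import subprocess
from pathlib import Path

SRC = Path("rips")   # source files (placeholder)
DST = Path("av1")    # output folder (placeholder)
DST.mkdir(exist_ok=True)

for src in sorted(SRC.glob("*.mkv")):
    out = DST / (src.stem + ".av1.mkv")
    subprocess.run([
        "ffmpeg", "-y",
        "-i", str(src),
        "-c:v", "av1_qsv",        # Intel hardware AV1 encoder
        "-global_quality", "28",  # quality target, tune to taste
        "-preset", "slow",
        "-c:a", "copy",           # keep the original audio as-is
        str(out),
    ], check=True)
```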

I would recommend checking out the Phoronix review to see what the B580 can do in productivity workloads.
 
I meant it the other way: Nvidia can't restrict RT capabilities, as it's part of DX12. Maybe to see enough improvement in graphics, game devs will need to implement RT on a much wider scale. That would be a situation much more demanding on RT cores. So maybe... a primary GPU for rendering and a secondary one for RT computing? Just like with PissX back then.
I meant it another way: people can't stop crypto, it's part of the monetary system now. Maybe to see enough improvement in how we divide power, we need to truly implement DeFi on a much wider scale. It is a situation which will be much more demanding on our systems. So maybe... a primary wallet for Bitcoin, and a secondary for your fiat money? Just like nobody in their right mind ever did?

FTFY.
 
Looks like a much better offering than the A series. I'd give this a really hard look if my rig supported ReBAR, but it doesn't, and I'm in no mood to mess with firmware. The best I can hope for is that this will really drive down prices on the x600 and x700 Radeons.
 
TL;DR: The INTEL Arc B580 is a success

* Going by price and what I've seen so far, the INTEL Arc B580 is a success for the gaming consumer (time will tell if there are any major/unfixable bugs). More competition is good for the consumer (obviously, whether one likes INTEL or not). The Arc B580 is also something like 55% (mixed resolutions?) to 67% (at 1440p, which is the new 1080p anyway) more power efficient than the previous-gen Arc series, which I almost didn't expect, but it was needed. According to some reviews, it even manages to be 4-7% more power efficient than RDNA3 at times. Performance is similar to an RX 6700 XT/RTX 4060 (Ti) (depending on the review), but much cheaper.
* The RTX 4060 Ti 16GB is 17% faster, but it's much more expensive, costing 80% more (450 vs 250). The RTX 4060 non-Ti is 8% slower, its 8GB of obsolete VRAM is really small and I wouldn't buy it, and again, it's also much more expensive, at 60% more (400 vs 250).
* Comparing cards with the same memory interface (192-bit, 12GB): the Arc B580 is similar in performance to the RDNA2 RX 6700 XT; the RDNA3 RX 7700 XT is 31% faster, but its TDP is also 29% higher (245W vs 190W), and the RTX 4070 is 45% faster, but again, both are also much more expensive (quick price/perf math below).
* Also, props to INTEL for not releasing an obsolete 8GB VRAM DOA card.
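Purely to make those comparisons concrete, here's a throwaway calculation using the numbers quoted above (relative performance, prices and TDPs as stated in this post; they're approximate and will shift with street pricing and with whichever review you trust):

```python
# Normalised perf-per-dollar and perf-per-watt using the figures quoted above.
# These are this post's own approximate numbers, not review data; None means
# the figure wasn't quoted here.
cards = {
    #  name              (rel. perf vs B580, price $, TDP W)
    "Arc B580":          (1.00, 250, 190),
    "RTX 4060":          (0.92, 400, None),
    "RTX 4060 Ti 16GB":  (1.17, 450, None),
    "RX 7700 XT":        (1.31, None, 245),
}

b_perf, b_price, b_tdp = cards["Arc B580"]
for name, (perf, price, tdp) in cards.items():
    if price is not None:
        ppd = (perf / price) / (b_perf / b_price)   # B580 = 1.00x
        print(f"{name:17s} perf/$ vs B580: {ppd:.2f}x")
    if tdp is not None:
        ppw = (perf / tdp) / (b_perf / b_tdp)       # B580 = 1.00x
        print(f"{name:17s} perf/W vs B580: {ppw:.2f}x")
```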

I wonder how the Arc B580 power-scales.

I want a 24GB VRAM clamshell version of Arc (just like the 4060 Ti 16GB is) for AI inference and, if there's going to be an Arc B700 series later, a B750/B770 32GB VRAM clamshell version as well, because 24GB of VRAM is kinda not enough to fit bigger LLMs for fast inference.
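For a rough feel of why 24GB gets tight: the weights alone cost roughly parameters × bytes per weight, before you add KV cache, activations and runtime overhead. The model sizes and quantisation levels below are just illustrative examples:

```python
# Back-of-the-envelope VRAM needed for LLM weights alone:
# params * bytes-per-weight, ignoring KV cache, activations and overhead
# (which all add more on top). Model sizes here are illustrative examples.
def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1024**3

for params in (7, 13, 34, 70):
    for bits in (16, 8, 4):
        print(f"{params:>3}B @ {bits:>2}-bit ~ {weight_vram_gb(params, bits):6.1f} GB")
```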

Computational and memory capacity often have to scale together, otherwise you simply cannot efficiently make use of either one of them. Since 2013, computing power has gone up by roughly 16x, but memory is nowhere near that factor, and this is problematic.

This is just classic nvidia fanboy PTSD Stockholm syndrome: "uhm, actually it's good that GPUs don't have a lot of VRAM". Most games run fine with 8GB of VRAM because that's what developers are forced to work with, you genius; ask any programmer whether they would rather work with less or more memory. It's nearly impossible to find a computing problem where execution time, memory and problem size don't all scale together. You are forced to work with a serious handicap if the problem size keeps going up but you are constrained by the same memory limitation over and over; this isn't a game development thing, this is a programming thing in general. Stop pretending like you know anything about this subject.

People keep asking for more detailed and complex worlds in games, which, among many other things, requires more resources, a.k.a. memory, but then you have people like this guy who think you can just magically "optimize" games ad infinitum.
Hahaha, "PTSD Stockholm syndrome". Indeed, texture sizes, the number of textures, the resolution and other things keep increasing. There's only so much a given amount of VRAM can hold and only so much that can be optimized (a lot is already optimized; maybe some things could be optimized even further, but then development would take longer and the game might cost more, and who would want that? This is not a low-level computer science optimization competition). Just as 1GB, 2GB and 4GB did before it, 8GB is becoming not enough, yes, even at 1080p.
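To put a rough number on "only so much a given amount of VRAM can hold": a block-compressed 4K texture is about a byte per texel plus roughly a third extra for its mip chain, so a set of 4K material layers adds up fast. The texel sizes and material count below are just rule-of-thumb examples:

```python
# Rough texture memory estimate: width * height * bytes-per-texel, plus ~1/3
# extra for the full mip chain. BC7/BC5-style block compression is roughly
# 1 byte per texel at 4K; uncompressed RGBA8 would be 4. Counts are examples.
def texture_mb(size_px: int, bytes_per_texel: float, mips: bool = True) -> float:
    base = size_px * size_px * bytes_per_texel
    return base * (4 / 3 if mips else 1.0) / 1024**2

material = 3 * texture_mb(4096, 1.0)  # albedo + normal + roughness/metal maps
print(f"one 4K material        ~ {material:.0f} MB")
print(f"100 materials resident ~ {100 * material / 1024:.1f} GB")  # before geometry, render targets, etc.
```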
I know most people say that RT is not relevant in this segment, but isn't it possible that more games like the new Indiana Jones will pop up in years to come? The game simply won't run without RT hardware. At this moment the B580 should be more future-proof than the competition in that regard. Or is this a moot point?
Good point. Upscaling is almost required already, and soon RT will be too (supposedly the new Indiana Jones game requires hardware RT) (and even if that game isn't popular(?), RT hardware may still become a requirement in other games later). I guess another thing: a minimum of 10GB of VRAM for 1080p at any game settings, not only for a stutter-free, non-low-res-textures experience, but for the game to even start.
 
Looks like a much better offering than the A series. I'd give this a really hard look if my rig supported ReBAR, but it doesn't, and I'm in no mood to mess with firmware. The best I can hope for is that this will really drive down prices on the x600 and x700 Radeons.

At some point, you have to let old technology go. ReBAR has been around since Intel 10th Gen/Zen+.

A Zen 3 solution (5700X/B550/16GB RAM) is less than $250.
An Intel 12th gen solution (i5 12400F/B760/32GB of RAM) is also less than $250; going with an i5 12400F/H610/16GB combo is $124.
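For anyone unsure whether a given board/GPU combo actually has it active: on Linux you can infer it from the GPU's BAR sizes (with Resizable BAR enabled, the prefetchable BAR typically spans the whole VRAM instead of 256 MB), and GPU-Z reports the same thing directly on Windows. A rough heuristic sketch with a placeholder PCI address:

```python
# Heuristic Resizable BAR check on Linux: with ReBAR active, the GPU's
# prefetchable BAR usually spans the whole VRAM (multi-GB) instead of 256 MB.
# The PCI address is a placeholder - find yours with `lspci | grep -i vga`.
from pathlib import Path

PCI_ADDR = "0000:03:00.0"  # placeholder, replace with your GPU's address

lines = Path(f"/sys/bus/pci/devices/{PCI_ADDR}/resource").read_text().splitlines()
for i, line in enumerate(lines):
    start, end, _flags = (int(x, 16) for x in line.split())
    if end:  # unused entries are all zeros
        print(f"resource {i}: {(end - start + 1) / 1024**2:,.0f} MB")
```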
 
At some point, you have to let old technology go. ReBAR has been around since Intel 10th Gen/Zen+.

A Zen 3 solution (5700X/B550/16GB RAM) is less than $250.
An Intel 12th gen solution (i5 12400F/B760/32GB of RAM) is also less than $250; going with an i5 12400F/H610/16GB combo is $124.
That might be true, but if just a GPU upgrade gets me where I want, then I’m fine sticking with what I have. I know that’s a tall order, but honestly on such an old system, the Intel card is probably a bit of a gamble anyway.
 
Is it? Maybe you're on 4K/+.

It looks pants on our 1080p and 1440p gaming rigs.

Most of these cards are perfectly fine at 4K if you turn a few settings down.

i.e. this is how the 3070 and 6700 XT perform at 4K / Medium settings, and the B580 is right there in the mix based on the 4K review benchmarks at Ultra:


[attached chart: 3070 / 6700 XT at 4K, Medium settings]
 
You tell me:
I will be honest, no, I can't see it.

I did notice some differences, but only because I was looking for them, since the video tells you whether RT is on or off.

So my opinion about it stays: useless gimmick.

Having a problem with a company does not justify such childish behaviour.
And I find it infantile to get offended by such actions, so that's that.

So let's move on.
 
So my opinion about it stays: useless gimmick.

And the best part is that the 2080 Ti is capped at 60% due to the RT core bottleneck. An RT-off option is nowhere to be seen. Talk about planned obsolescence. I wonder what the GPU usage is on the B580.

Circle.jpg
 
You tell me:
WTF! At 1:25, is that a bird? Is that a jet? No! It's TAA trails. How dafuq is a console better than a PC in this regard? Same at 2:07 with the dragonflies, although less noticeable.
 
Most of these cards are perfectly fine at 4K if you turn a few settings down.
This! It's been true since the GTX 1000 series cards.

And I find it infantile to get offended by such actions, so that's that.
There's a difference between being offended and being irritated & annoyed. Being "offended" means something was taken personally. As the silly comment in question wasn't aimed at me personally we can dismiss such a notion easily. Being irritated and annoyed means that such a thing is, well, irritating and annoying, kinda like pedantic comments that make assumptions...

So my opinion about it stays: useless gimmick.
You do you. Everyone else will do what they like.
 
There's a difference between being offended and being irritated & annoyed. Being "offended" means something was taken personally. As the silly comment in question wasn't aimed at me personally we can dismiss such a notion easily. Being irritated and annoyed means that such a thing is, well, irritating and annoying, kinda like pedantic comments that make assumptions...
Just moving the goalposts, but the message is the same.
You do you. Everyone else will do what they like.
That's why I said "my opinion". I never stated, ordered, nor expected anyone to do as I said. :D
 
@W1zzard
Cyberpunk 2077 pathtracing "Overdrive" benchmarks of the B580 would be interesting (there are raytracing Ultra benchmarks, but PT Overdrive transforms visuals the most and is the future).
 
So let's move on.
Hmm.. :rolleyes:

That's why I said "my opinion". I never stated, ordered, nor expected anyone to do as I said. :D
Yes, but your opinion is an attempt at needless negativity clearly aimed at downplaying a feature that is both very visually striking and widely accepted. News flash: Just because not all devs are doing it well or getting it right, does NOT mean that said feature is, as you put it, a "useless gimmick".

(there are raytracing Ultra benchmarks, but PT Overdrive transforms visuals the most and is the future)
That is a fair point. Does that work on Battlemage? I thought PT Overdrive was dependent on feature support on the GPU?
 
Hmm.. :rolleyes:


Yes, but your opinion is an attempt at needless negativity clearly aimed at downplaying a feature that is both very visually striking and widely accepted. News flash: Just because not all devs are doing it well or getting it right, does NOT mean that said feature is, as you put it, a "useless gimmick".


That is a fair point. Does that work on Battlemage? I thought PT Overdrive was dependent on feature support on the GPU?
I came back to the forums after, what, a year or more of inactivity. I think your avatar was G-Man back then. Glad to see you're still as insufferable as ever. Love ya bud.
 
Has anyone compared image quality? Especially with ray tracing.
(Like the old days, when reviews compared anisotropic filtering.)

In Doom Eternal, the RT penalty is much smaller on the B580 than on any NV or AMD card.
https://tpucdn.com/review/intel-arc-b580/images/rt-doom-eternal-3840-2160.png

And speaking of RT in general, curiously, in Resident Evil 4 at 4K there's near-zero RT penalty on most cards, and the NV 3000-series cards are even a bit faster?!
https://tpucdn.com/review/intel-arc-b580/images/rt-resident-evil-4-3840-2160.png
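For clarity, the "RT penalty" being compared here is just the relative FPS drop when ray tracing is switched on; a trivial illustration with placeholder numbers, not figures from the review charts:

```python
# "RT penalty" as used above: the relative FPS drop with ray tracing enabled.
# Example numbers are placeholders, not figures from the review charts.
def rt_penalty(fps_rt_off: float, fps_rt_on: float) -> float:
    return 1.0 - fps_rt_on / fps_rt_off

print(f"{rt_penalty(100, 82):.0%} penalty")  # 100 -> 82 FPS = an 18% drop
```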
 
The Witcher 3 results still don't line up with any other reviewer's and give the appearance that this card is faster than it actually is.
 
Ok be nice now. Others actually want to talk about the review and results.
 
I will be honest, no, I can't see it.

I did notice some differences, but only because I was looking for them, since the video tells you whether RT is on or off.

So my opinion about it stays: useless gimmick.
TBH, visually, games have barely gotten better looking over the past 4 years while becoming noticeably heavier in compute requirements. We've reached a point where more detail and higher poly counts are also becoming gimmicky, because you only notice the difference when you stop and stare.
 