
B580 tanks performance with low end CPUs

"I'm not rich enough to buy cheap things" - Henry David Thoreau
 
But performance would be even worse.

If the drivers were optimized for a 256 MB aperture, it'd probably work well enough; I mean, the 7700 XT and 4060 Ti generally do work well enough on X99 platforms.
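If you want to see what aperture your own card actually exposes, GPU-Z on Windows shows whether Resizable BAR is active; on Linux the BAR sizes can be read straight from sysfs. Below is a minimal Python sketch under that assumption - the PCI address is a placeholder, so swap in your own GPU's address from lspci:

```python
# Minimal sketch (Linux): print the size of each PCI BAR a GPU exposes, so you
# can see whether you're stuck with a small 256 MB aperture or a full-size
# resizable BAR. The PCI address below is a placeholder; replace it with your
# own card's address from `lspci`.
from pathlib import Path

GPU = Path("/sys/bus/pci/devices/0000:03:00.0")  # hypothetical address

def bar_sizes(device: Path):
    """Parse the sysfs 'resource' file: one 'start end flags' line per BAR."""
    sizes = []
    for idx, line in enumerate((device / "resource").read_text().splitlines()):
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:  # unused BARs are all zeros
            sizes.append((idx, end - start + 1))
    return sizes

for idx, size in bar_sizes(GPU):
    print(f"BAR {idx}: {size / 2**20:,.0f} MiB")
```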
 
Apparently the otherwise good B580 has a big problem with "non-ultrafast" CPUs (I mean, I wouldn't necessarily call the 7600 slow).

In short: that video just shows that dated hardware shouldn't be paired with certain hardware.

I have a Ryzen 7600X. (I always look for the stuff I currently own in different articles and such.)
I think we all agree that a Ryzen 7500F / 7600 / 7600X are basically similar. Buy the cheapest of those three.

The 7600X is entry class. In the past the similar part was the Ryzen 5800X, which was maybe middle class. Not high end. (If you disagree, let's talk about that entry-class classification again in two years, please.)

Common sense, common knowledge: a graphics card is usually slower with a slower, "low-end" processor. Feel free to search for the Nvidia 4090 testing with different processors (I think it was computerbase.de).
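To put rough numbers on that: the delivered frame rate is roughly capped by whichever side takes longer per frame, the CPU (game logic plus driver overhead) or the GPU. A toy sketch with made-up numbers, purely for illustration:

```python
# Toy model of a CPU bottleneck: effective fps is limited by the slower of the
# CPU and GPU per-frame times. All numbers below are invented for illustration.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=6.0, gpu_ms=8.0))    # ~125 fps: GPU-bound, CPU has headroom
print(fps(cpu_ms=12.0, gpu_ms=8.0))   # ~83 fps: slower CPU, now CPU-bound
print(fps(cpu_ms=12.0, gpu_ms=5.0))   # still ~83 fps: a faster GPU changes nothing
```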

I think the YouTube video belongs to Hardware Unboxed. I don't like their testing approach, which is why I usually don't watch them anymore. Their statement about ReBAR years ago really annoyed me, and there were other things that didn't make sense. I would be careful with certain YouTube channels; they are more entertainment than accuracy.
 
I wonder why he didn't test a single Intel CPU @.@
Yeah, I was asking myself the exact same question, especially since I could use a backup GPU. I can't imagine a 14700KF would have too much of a penalty, but if a 7700X has some (like I think I saw in a graph, though I could be mistaken), then I would really like to know. And I'd really like to know how many games are affected. Is it just a select handful? Is it 10%? 50%? I guess that's kind of unreasonable to ask at this point, but that's what's going through my head right now. Hardware Unboxed has even done TWO videos now, neither with Intel CPUs. Also, if it plays nice with a 12400F, that's a really cheap budget combo. Would be really nice to know if that one would work out.
Hey, seems like nobody is talking about this "little" problem? o_O
Really? To me it seems like absolutely everybody is talking about it, at least judging by the YouTube videos that keep popping up, to the point where in some cases the headlines literally went from "Battlemage is the best" to "Battlemage is the worst".

Anyway, I don't really fully understand the problem at this point; it seems to have to do with older/lower-end CPU power/overhead affecting FPS more than it should. But I sure hope it's being blown out of proportion and doesn't turn public sentiment sour on Battlemage. Of course I want people to make the right choice for their setups, but Intel's window for getting these sold could be closing, and they haven't figured out their stocking problems yet, at least where I live (can probably partially blame scalpers for that...). If demand evaporates once the AMD and Nvidia GPUs come out, especially if people have a bitter taste in their mouth for Battlemage, it might be over for dedicated Intel GPUs.

I really think this might be a make-or-break moment. They'll continue in SoCs of course, but I know that's not what most of us here want; it is, however, what most regular people are buying: laptops. The DIY market is small.

Guess it's gonna depend on the scale of the problem, whether Intel can do anything to fix it, and how the narrative develops.

But it would also be nice to know if it maybe plays nicer with Intel's CPUs. Everywhere the examples seem to be AMD or very old Intel (like 9th and 10th gen), which I guess kind of makes sense since gaming DIY has largely moved to AMD, with good reason. But there are still lower-end Intel options, so if it played nice with something like a 12400F, that's something that could be used in a super-budget build with good results (maybe?), with an upgrade path to a 13600K or something. With the BIOS updates and the 5-year warranty it's not that risky, and at that point you're opened up to all kinds of GPUs. And I kinda just wanna know for myself; it seems like a good cheap backup to have.
 
Then it's pretty bad value for people in the market for a ~$250 card. Though I have to add that in Europe it costs more than the cheapest 4060s and sometimes more than the 7600 XT 16 GB, making it pointless anyway.
Makes you wonder why the 7600 and the 4060 are regarded by the media as pieces of turd while the B580 is the next best thing since the wheel was invented. Hmph...
 
Makes you wonder why the 7600 and the 4060 are regarded by the media as pieces of turd while the B580 is the next best thing since the wheel was invented. Hmph...
It doesn't, really. The 7600 and 4060 are both regarded poorly because they not only didn't move the needle compared to the previous generation, but cost MORE than their predecessors and are stuck with obsolete 8 GB memory buffers.

The B580, OTOH, showed a major generational leap, since it was slightly faster than the A770 despite having only 20 cores compared to the A770's 32. It's not hard to understand why this would be exciting to the tech media when we've had NOTHING like this for years.
 
It doesn't, really. The 7600 and 4060 are both regarded poorly because they not only didn't move the needle compared to the previous generation, but cost MORE than their predecessors and are stuck with obsolete 8 GB memory buffers.

The B580, OTOH, showed a major generational leap, since it was slightly faster than the A770 despite having only 20 cores compared to the A770's 32. It's not hard to understand why this would be exciting to the tech media when we've had NOTHING like this for years.
True. Though actually the 4060 was slightly cheaper than the 3060, if memory serves. I mean... less bandwidth... less memory... and more money... for the poorest people? Maybe Nvidia saw sense for a brief moment and realized they just can't charge more for that one. But I agree with the rest of what you said, even though I'm not totally understanding the scale of this CPU overhead problem yet.
 
In short: that video just shows that dated hardware shouldn't be paired with certain hardware.

I have a Ryzen 7600X. (I always look for the stuff I currently own in different articles and such.)
I think we all agree that a Ryzen 7500F / 7600 / 7600X are basically similar. Buy the cheapest of those three.

The 7600X is entry class. In the past the similar part was the Ryzen 5800X, which was maybe middle class. Not high end. (If you disagree, let's talk about that entry-class classification again in two years, please.)

Common sense, common knowledge: a graphics card is usually slower with a slower, "low-end" processor. Feel free to search for the Nvidia 4090 testing with different processors (I think it was computerbase.de).

I would totally count a Zen+ system as entry level and a modern GPU shouldn't be held back this much by an older CPU.

As for where the 7600X sits: mid-range. This is because the AM4 platform still exists and is significantly cheaper than any AM5 platform; there is no entry-level AM5 system, IMO. Why we should talk about it in two years I do not know. By then the 7600 had better be entry level, but today it is not.
 
It doesn't, really. The 7600 and 4060 are both regarded poorly because they not only didn't move the needle compared to the previous generation, but cost MORE than their predecessors and are stuck with obsolete 8 GB memory buffers.

The B580, OTOH, showed a major generational leap, since it was slightly faster than the A770 despite having only 20 cores compared to the A770's 32. It's not hard to understand why this would be exciting to the tech media when we've had NOTHING like this for years.
You're not wrong, but still, the B580 offers very little to no improvement over the 7600 / 4060, costs more and isn't available. To me, this is a big nothing burger.
 
That not even one Intel CPU was tested is quite strange.
But it's a useful warning sign anyway; I would avoid those cards.

I believe it's because he was looking more at budget PC gamers, who usually go for AMD CPUs. I've been AMD since the Duron days because, price-to-performance-wise, it was the better option for me. Sure, it didn't get the absolute best performance compared to Intel, but it was good enough for what I needed.

The B580 being priced as it is screams "budget" GPU, but it requires a non-budget CPU in order to perform well and at least in line with a 4060. Problem is, the 4060 performs much better on the budget CPUs tested compared to the B580. People looking at a $250 GPU are those who don't spend on more expensive components. It's why, when I built my PC (Ryzen 5 3600) in 2020, I got the GTX 1660 at $239. It could handle most games released that year fairly well, it has lasted me 5 years, and it still performs fairly well depending on the game. Of course, with more games on Unreal Engine 5, and now many starting to require hardware ray tracing, the B580 looked like a nice option, but not so now. I would have to upgrade my motherboard (a B450 that does have ReBAR and Above 4G support but is PCIe 3.0 x16), my CPU, and my RAM (currently 32 GB dual-channel DDR4-3200) to get the performance I need. That's well above the $250 I'm willing to spend to get a slight improvement or even worse performance.
 
The B580 being priced as it is screams "budget" GPU, but it requires a non-budget CPU in order to perform well
Well, this is the exact reason it would have been great to test the 12400F, because it's only $100. If the 5600 doesn't play nice with it, that would be the first thing I'd think of as a replacement.
 
I'd like to see this tested with mid-range Intel chips (i5-13500, 13600, etc.).
 
In short: that video just shows that dated hardware shouldn't be paired with certain hardware...
After seeing this new development, we should just admit what was obvious when the A770 came out and the Resizable BAR requirement was made known, but is painfully obvious now - it is a bad GPU architecture, plain and simple; avoid buying it. There goes the third player - again - nothing new, really. :shadedshu:
 
Well, this is the exact reason it would have been great to test the 12400F, because it's only $100. If the 5600 doesn't play nice with it, that would be the first thing I'd think of as a replacement.
Check the 12700KF. The thing cost under 250 EUR new last time I checked, a very good price for an 8/16 CPU. It can be paired with your old DDR4 without much of a performance hit too. But HUB tends to snub Intel.
 
That not even one Intel CPU was tested is quite strange.
But it's a useful warning sign anyway; I would avoid those cards.
I find it funny that he tested so many different CPUs and yet we still don't know what the issue is. Is it single-thread performance that causes this? Is it just the lack of cores / MT performance on those R5 CPUs? Say I have a 5950X: is it going to be better / worse / the same? How can you spend that much time testing and... not answer any of that? :D
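If anyone wants to narrow it down on their own system instead of waiting for a third video, one option is to capture frame-time logs of the same scene (PresentMon, CapFrameX or similar) on different CPUs and compare averages and 1% lows. A minimal sketch of the comparison step, assuming a CSV with an MsBetweenPresents-style column and hypothetical file names - adjust both to whatever your capture tool actually writes:

```python
# Minimal sketch: summarize average fps and 1% lows from frame-time CSV logs.
# The column name and the file names are assumptions - adjust them to whatever
# your capture tool actually writes.
import csv
import statistics

def summarize(path: str, column: str = "MsBetweenPresents"):
    with open(path, newline="") as f:
        frame_ms = sorted(float(row[column]) for row in csv.DictReader(f))
    avg_fps = 1000.0 / statistics.mean(frame_ms)
    slowest_1pct = frame_ms[int(len(frame_ms) * 0.99):]  # worst 1% of frames
    low_fps = 1000.0 / statistics.mean(slowest_1pct)
    return avg_fps, low_fps

for log in ("b580_5600.csv", "b580_7600.csv"):  # hypothetical log files
    avg, low = summarize(log)
    print(f"{log}: avg {avg:.1f} fps, 1% low {low:.1f} fps")
```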
 
Newegg has some strange pricing.

Acer Nitro out of stock at $279
GUNNIR priced as high as $429
Another GUNNIR at $379
ASRock order for Jan 9th, no price.

I'd grab one for testing with a 14100F entry-level CPU, but I was told $250 and as fast as a 4060 Ti.

Why the F is the pricing all over the place?
 
I find it funny that he tested so many different CPUs and yet we still don't know what the issue is. Is it single-thread performance that causes this? Is it just the lack of cores / MT performance on those R5 CPUs? Say I have a 5950X: is it going to be better / worse / the same? How can you spend that much time testing and... not answer any of that? :D
Looks IPC/latency related to me, seeing how much better the 7600 is over the 5600. The stranger thing is not seeing any older Intel CPUs (8th/9th/10th gen) tested, given their low latency thanks to the ring bus. Maybe they did test them and just never included the results. I mean, testing 6 generations of Ryzen was okay, but not a single Intel CPU?
 
Looks IPC/latency related to me, seeing how much better the 7600 is over the 5600. The stranger thing is not seeing any older Intel CPUs (8th/9th/10th gen) tested, given their low latency thanks to the ring bus. Maybe they did test them and just never included the results. I mean, testing 6 generations of Ryzen was okay, but not a single Intel CPU?
It's not latency; it might be core counts, but until he tests something with more cores of the same generation, we won't know.
 
Looks IPC/latency related to me, seeing how much better the 7600 is over the 5600. The stranger thing is not seeing any older Intel CPUs (8th/9th/10th gen) tested, given their low latency thanks to the ring bus. Maybe they did test them and just never included the results. I mean, testing 6 generations of Ryzen was okay, but not a single Intel CPU?
Exactly. It's claimed they had no time or ran out of time, but they made the effort to test a top Ryzen, the 9800X3D, so why not test a 9th/10th/11th/12th/13th/14th gen part? Same as with the AMD parts: same motherboard makers, same RAM, same SSDs, same displays, mouse and keyboard - make them clones with two of the exact same GPUs, and test SAM/ReBAR off, then on.
 
I find it funny that he tested so many different CPUs and yet we still don't know what the issue is. Is it single-thread performance that causes this? Is it just the lack of cores / MT performance on those R5 CPUs? Say I have a 5950X: is it going to be better / worse / the same? How can you spend that much time testing and... not answer any of that? :D

In HUB's testing the 6-core 7600 is consistently faster than the 8-core 5700X3D, so nah, it's not MT performance related.

Would still be interesting to see how the B580 performs on Intel 12th/13th/14th gen CPUs, though.
 