Well, AMD states on the Athlon 300G page that it runs esports games at 720p well. Ryzen 5000Gs are expected to run anything, AAA games included. I guess what they call "enthusiast level performance" covers AAA games.
Do you mean 3000G? Or 300GE?
Either way, this is what AMD's landing page for "Athlon processors with Vega Graphics" tells us:
With AMD Radeon™ Graphics built right in, you’ll enjoy every pixel as you edit family photos, stream your favorite shows in up to 4K HDR, and play the most popular esports games in high-definition 720p. Fueled by AMD advanced 7nm processor core technology, AMD Athlon™ 3000 Series is ready to harness the power of graphics card upgrades for smooth HD+ 1080p gaming – so gamers looking for the flexibility for adding future upgrades like discrete graphics cards will enjoy an easy upgrade path.
(my emphasis)
That latter claim is rather weird as there are only 12nm and 14nm chips in the 3000G series (so far - might be foreshadowing some future launch, or one that has been cancelled due to 7nm shortages I guess?), but they're pretty clear about the scope of performance for these: 720p esports, dGPU upgrades for anything else. (And their use of "HD+ 1080p" there also indicates they're not selling this as the basis of a high end gaming platform.)
I remember it ran Battlefield at 1080p, so I thought they were capable, but still below an HD 7750 obviously.
"Battlefield".
Which one? You said you had a 6000-series APU, which came out in 2013. According to that list, there were eight main series Battlefield games out at that point. I can find some BF4 videos of that APU, but none detailing the resolution or with a framerate counter, and judging either from an overcompressed youtube video is a fool's errand.
Your A8 is not an A10; A8s had lower CPU clock speeds, maybe less cache too, and certainly far fewer GPU cores.
... so you didn't look at the sources I linked then? The A8-7600 is in that AnandTech review as well. It's slower than the A10-7850K, yes, but not by a huge margin at all. They're both squarely in the same performance class overall.
Still, they are marketed as smooth 1080p gaming solutions by themselves and they are yet to match RX 550 properly.
Where? Can you show me an example of it being marketed as a smooth 1080p gaming solution that, as you've been harping on, somehow implies that this applies for AAA gaming?
I strongly disagree. Even before I was an enthusiast (whatever that means), I tried to make games run smoothly, and by that I mean targeting 45-60 fps. 30 fps is piss and is clearly laggy or unresponsive to me - not just in fast-paced games, but in anything at all. It's not acceptable even for Age of Mythology. My standards were like that on an Athlon 64 3200+/FX 5200 machine, even if it meant much worse visual quality to achieve it. I only make an exception for Far Cry, as it had a surprisingly consistent framerate and didn't feel unresponsive when aiming, but that's just one exception. And to be honest, I never had "enthusiast level" gear. A downclocked RX 580 (which is slower than an RX 480) was the best I ever had, and you can take my word that I wouldn't want to go back to a 650 Ti 1GB for daily gaming. I remember it being somewhat of a potato in Battlefield and almost insufferable in GTA 5.
The fact that you were aware of framerates at all places you squarely in the enthusiast group. Seriously, most gamers don't know what framerate is or what it indicates. They can feel the difference between something being smooth and not, and might start looking into it if it bothers them too much, but most still have no idea.
I'm certainly not getting too conditioned to fps; it was just plainly obvious even back then that fps matters. I never bought into the nonsense that 30 fps is the minimum playable framerate. For me that would be 40.
40? On a 60Hz panel? That's a juddery (or teary) mess. A steady 30fps feels *far* smoother than some in-between range.
Perhaps your perspective on that is different, since you own a 6900 XT and perhaps a higher refresh rate monitor, or at least one with Freesync. I don't, and have never even seen one in person.
Well, I do have a Freesync monitor, that is true. It's my secondary 75Hz 1080p monitor that I only really use for office work (rotating it into landscape and setting it as the main monitor in Windows is too much of a hassle for some occasional 75fps gaming). My main monitor is a decade old Dell U2711, at 60Hz 1440p. So, no, sorry, not applicable. Or, I guess you could count the 2160p120 TV, but I only rarely game on that, and I've so far not bothered lugging my main PC into the living room to test that out. I'm planning to, as it will no doubt be great, but it's not something I'm used to, no.
You can tell me about "stuff that isn't strictly necessary" when I was getting piss frames in GTA 5 on anything that wasn't 1080p low, or when I had to run Far Cry 5 on an RX 560 at less than 1600x900 to get 40-50 fps, and even then it wasn't too stable. I know full well that an RX 580 is what I need. There's no merit in low end junk; it just sucks up your money and doesn't deliver. That's exactly what a 320 EUR 5600G is.
But that's the thing: you knew enough to identify what was bothering you. Again, that places you in the enthusiast class. Beyond that, you clearly have strong preferences for higher resolutions as well - remember, both the PS4 and XO render at somewhere between 900p and 720p in the vast majority of games, and that's what the vast majority of gamers are used to. Most games on those consoles are 30fps as well.
Also, as you point out, unsteady framerates exacerbate poor gameplay smoothness. You'd likely have been better off at a locked 30fps than that unstable 40-50.
That doesn't change anything about them being grossly overpriced and overadvertised. Obviously, I will stick to my defined minimum spec, 5600G doesn't deliver. Doesn't matter to me if it's close or not, if it doesn't even meet a spec that I define as playable. Besides that, it's already this poor today, so it doesn't have any longevity in it.
Why overpriced? You get a near-5600X CPU with a moderately capable GPU built in for a lower price than the 5600X. Intel has launched some very competitive offerings since, but their iGPUs are still trash, so they lose out there. And as I've shown, you're not getting an equally fast CPU + equally fast GPU for the same price that way.
At that budget, it's a no-brainer to avoid garbage like the 5600G. An i3 with a GTX 960 is the way to go. Or a new Quadro T600.
If you can find a used 960 for a decent price? And one that isn't run into the ground, being 5+ years old? Also, while TPU doesn't allow for a direct comparison, the 960 isn't *that* much faster either. The 1060 in the 5600G review delivers 242% of the 5600G's performance at 1080p - in other words, the 5600G delivers 41% of the 1060's. In TPU's database the 960 is 58% of a 1060 6GB. That makes the 960 clearly faster, but it's not a staggering difference. Definitely enough to make games playable on the 960 that aren't on the APU, sure, but for that you have to step down significantly in CPU performance and ease of upgradeability, as that i3 is going to start being a bottleneck long before the 5600G's CPU does. Everything has tradeoffs.
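For clarity, the relative-performance chain above works out like this - a quick sketch where only the 242% and 58% figures come from TPU, and the rest is just arithmetic:

```python
# TPU figures: a GTX 1060 = 242% of the 5600G's iGPU at 1080p,
# and a GTX 960 = 58% of a 1060 6GB. Normalize everything to the 1060.
gtx_1060 = 1.0
apu_5600g = gtx_1060 / 2.42   # ~0.41, i.e. the 5600G is ~41% of a 1060
gtx_960 = 0.58 * gtx_1060     # 58% of a 1060

# How much faster is the 960 than the 5600G's iGPU?
ratio = gtx_960 / apu_5600g
print(f"GTX 960 is ~{ratio:.0%} of the 5600G iGPU")  # ~140%
```

So the 960 comes out roughly 40% ahead of the APU - clearly faster, but not a different class of performance.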
Wait, weren't you just recently claiming that Threadripper makes no sense because Zen 2 and its IPC matter more? Oh my, how the tables have turned. The 3950X is pretty much a Threadripper on AM4.
... sigh. Seriously? Yes, I did make that argument. And I also made the argument that there is a significant upgrade path from a low-end Athlon on an A320 board even if it's limited to "only" 3000-series CPUs. You see how those two statements *in no way whatsoever* contradict each other or even conflict with each other, right? One is in the context of "someone wants to maximize performance for a high end PC, which parts are smart to choose", while the other is in the context of "can we speak of a viable upgrade path for an Athlon 300GE on an A320 motherboard". The scenarios are *wildly* different. If you can't see that, we literally can't have a conversation.
I'm really not convinced, unless it has some overclocking wall, but that shouldn't be the case. Or maybe it's just DDR5.
DDR5? 5000-series APUs use DDR4. Also, "overclock wall"? Yes, 3000-series APUs have the same limits to overclocking as all 14/12nm Zen/Zen+ CPUs have - they don't go much above 4GHz (and the iGPUs might reach 1700MHz if you're lucky). Meanwhile the 4000 and 5000 series chips have significantly higher IPC (~+15% and ~+35% respectively) and clock their iGPUs *much* higher even at stock (my 4650G runs 1900MHz, and I got a bit of a dud that only OCs to 2100; 2400 is relatively common). They also have much better memory controllers, which of course help, but ... well, that's part of what makes them better. Yes. They are better. That is the core of the argument here.
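To give a rough feel for how those numbers compound - treating performance as roughly proportional to IPC × clock, with the specific clock figures below being illustrative assumptions rather than measured values:

```python
def perf_uplift(ipc_gain, old_clock_mhz, new_clock_mhz):
    """Relative performance gain, assuming perf ~ IPC * clock."""
    return (1.0 + ipc_gain) * (new_clock_mhz / old_clock_mhz) - 1.0

# CPU side: a Zen+ APU topping out near 4000MHz vs a Zen 3 APU
# around 4400MHz with ~+35% IPC (clocks here are assumptions)
cpu_gain = perf_uplift(0.35, 4000, 4400)
print(f"CPU: roughly +{cpu_gain:.0%}")

# iGPU side, from clocks alone: ~1700MHz ceiling vs 1900MHz stock
igpu_gain = perf_uplift(0.0, 1700, 1900)
print(f"iGPU: roughly +{igpu_gain:.0%} before any per-clock gains")
```

Even ignoring the better memory controllers, the CPU side works out to nearly half again the per-core performance, which is the point being made.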
Still better than pouring millions into yet another chip IPO scam.
Has anyone here suggested doing so?
I still disagree. In the service sector you also have people like barbers, who barely use any goods (unless you go to a hair salon, but that's an entirely different thing).
Uh ... razors? Shaving cream? Lotions and all that stuff? The equipment they use? The furnishings in the barbershop? Literally everything they need to do their jobs is dependent on the flows of global capital, and the large-scale exploitation of natural resources and labor in poorer parts of the world.
There is the finance sector, which again barely uses any resources.
But it also does literally *zero* of any worth. They shuffle numbers around to make them look bigger, and organize gambling circles for the ultra-rich. Oh, and they spend *massive* amounts on computers and technology, creating a lot of e-waste.
What about government? What about education, which recently proved that it can be done with just the internet available?
"Can be done" is a stretch. That it can be done at a significantly reduced quality by massively overworked staff in a crisis situation is ... well, not proof of anything. Also, what about the stuff you need to teach and learn remotely? Are computers or the internet outside of the flows of global capital? Obviously not. Government is obviously not either. We live in a neoliberal world. The flows of global capital run through *everything*. Unless you live off of subsistence farming and make your own tools, there is no way for any person to avoid this in our current world.
Oh shit, I meant left ideologies. Sorry for the snafu, I'm not yet too familiar with the formal terminology of political parties. By left I mean those that have strong welfare, reject laissez-faire capitalism, and often reject private property.
No problem, we all mix up words from time to time.
Oh well, in my region those were available, but seemingly there was nearly no demand for them. Athlon 3000G was sold out nearly instantly and I don't see it anywhere to buy anymore.
Hm, that's odd. Probably down to some weird dynamic of distribution. I don't think the Pentium Gold series has been reliably in stock at all since it launched in the markets I've paid attention to.
But the 5700 XT was a mid-range product, not a high-profile one. It took on the 2060 Super, not really the 2070, and certainly not the 2080.
Oh, absolutely. It's just that it was the highest end they had at the time, and thus their focus, as they desperately needed to rebuild the image of their graphics division after five years of not really competing.
I can bet that we won't see normalcy for at least 5 years. Normalcy is dead, and so are 200 EUR/USD GPUs. We would be doing well if after 5 years we could start going back to that, but the current situation is still a mess, and we still have a rampant pandemic that kills people every day, with no real supplies to tame it. Supply chains might get even more borked if some companies go bankrupt. Intel or AMD won't build fabs instantly either. And the world economy is still in turmoil, at the mercy of how we handle the pandemic, not really in the hands of people otherwise doing strong business. Countries can still enter a lockdown rather easily, which puts them in inescapable debt. Debt not only means that you are taking others' money, but also that you pay interest. The poorer you are, the worse the interest is for you, and the further down the slippery slope you end up. We are still deep in shit and just getting deeper; we are not even close to coming out of it.
Yeah, I don't think you're necessarily wrong here. At least it's a good thing that we're seeing pushes for more localized chip production, as the centralization of the industry that we see today has made it - as we're currently experiencing - extremely precarious. Another built-in function of neoliberal thinking: if overhead is seen as bad and detrimental to profits, you start building things *exactly* to your projected future needs, and if those projections are wrong, you're suddenly in a situation where it takes several years of scrambling to correct for the simple fact that predicting the future accurately is impossible. The chip industry didn't just put all their eggs into ever fewer baskets, they also made sure those baskets were *just* big enough, so that when crisis struck and we suddenly needed more eggs in more places there was no way to make this happen. The short-sighted, profit-oriented thinking of global neoliberal capital is, when you look at it in certain ways, impressively dumb. Though it's easy enough to think that this is a feature rather than a bug, as those in power are never the ones hurt by these events.
What's the point of these back and forth walls of text besides making the thread unreadable for everyone else?
Hey, you're not wrong, but there doesn't seem to have been any interest in actually discussing the 12900K since this kicked off, so ... meh.