
Intel Core i9-12900K

Joined
Jun 5, 2021
Messages
284 (0.22/day)
There seems to be a (potentially very cool, or very janky, depending on perspective) BIOS-level fix for this. Don't know if it's on current BIOSes or upcoming, but they've announced a toggle ("Legacy game compatibility mode" or something) that when enabled lets you press Scroll Lock to park and effectively disable the E cores on the fly, avoiding Denuvo issues and the like. I'm a bit shocked that this doesn't hard crash the system, but I guess it automatically migrates all threads off of the cores before shutting them down?
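That's basically what OS-level CPU hotplug does, so as a loose analogy (Intel hasn't detailed what the firmware actually implements): on Linux you can park cores by hand through sysfs, and the kernel migrates runnable threads off each core before taking it down. A minimal sketch, assuming the E-cores sit at CPU indices 16-23 as on a typical 12900K:

```python
# Loose analogy to the BIOS toggle: offline a set of cores through
# Linux's CPU hotplug interface. The kernel migrates all runnable
# threads off a core before it goes down -- presumably why the
# firmware toggle doesn't crash anything either.
# Assumption: E-cores are CPUs 16-23, as on a typical 12900K layout.
import pathlib

E_CORES = range(16, 24)

def set_cores_online(cores, online: bool) -> None:
    for cpu in cores:
        path = pathlib.Path(f"/sys/devices/system/cpu/cpu{cpu}/online")
        path.write_text("1" if online else "0")  # requires root

# set_cores_online(E_CORES, False)  # "park" the E-cores
# set_cores_online(E_CORES, True)   # bring them back
```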

The underdog effect has a lot to do with AMD's current success, that's for sure. And yes, AMD has historically been everywhere from competitive to clearly ahead to clearly behind, but they have always been a much smaller company than Intel, with all the competitive drawbacks of that - including (but not limited to) various illegal anticompetitive behaviour on Intel's part. That "Dell is the best friend money can buy" quote from internal Intel communications that came to light in one of the antitrust cases summarizes things pretty eloquently IMO. There's one major factor that hasn't been mentioned though: the size of the total PC market and how this affects things. Intel's late-2000s performance advantage with Core, combined with the tail end of their various dubiously legal exclusivity deals, arrived at the point where a) laptops became properly viable for everyday use, and b) the internet had become a properly commonplace thing. So Intel came into a rapidly expanding market (lower prices per unit, but volume increased dramatically) while also rapidly becoming the only viable server alternative. They had some damn good luck with their timing there, and IMO that contributed quite a bit to their dominance in the 2010s - not just that Bulldozer and its ilk were bad. Now, Bulldozer was pretty bad, but in a less drastically unequal competitive situation that could have evened out in some way. Instead AMD nearly went bankrupt, staying alive thanks to their GPU division still churning out great products. As such, I guess we can thank ATI, and AMD's decision to buy them, for the competitive and interesting CPU market we have today. And I don't think we would have gotten the underdog effect without the GPUs either - their ability to deliver good value GPUs likely contributed significantly to that.


Paragraphs, man. Paragraphs!

You're right about that, but my point was based on current reality, which is that both major vendors have had IMCs for at least two generations that can reliably handle XMP settings at 3600 or higher. I don't believe I've ever heard of a recent-ish Intel or Zen2/3 CPU that can't handle memory at those speeds. Zen and Zen+ on the other hand ... nah. My 1600X couldn't run my 3200c16 kit at XMP, and saw intermittent crashes at anything above 2933. But AMD IMCs have improved massively since then. So while you're right on paper, the reality is that any current CPU is >99% likely to handle XMP just fine (given that, as I've said all along, you stick to somewhat reasonable speeds).

Yeah, that makes sense. Zen and Zen+ IMCs were pretty bad, and 2dpc definitely didn't help. If those DIMMs had been dual rank as well, you might not have been able to boot at all. Thankfully AMD stepped up their IMC game drastically with Zen2 and beyond, and hopefully they're able to keep that up for DDR5.

I don't doubt it, but that's definitely niche territory :)

Yes, obviously, but my point was that compression/decompression workloads tend to be intermittent and relatively short-lived (unless you're working on something that requires on-the-fly decompression of some huge dataset, in which case you're looking at a much more complex workload to begin with). They might be tens or even hundreds of GB, but that doesn't take all that much time with even a mid-range CPU.
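A quick back-of-the-envelope makes the point; the throughput figure below is a rough assumption for a multi-threaded zstd-class codec on a mid-range CPU, not a benchmark:

```python
# Back-of-envelope: decompressing a large archive.
# throughput_gb_s is an assumed ballpark, not a measured number.
dataset_gb = 100
throughput_gb_s = 1.5
seconds = dataset_gb / throughput_gb_s
print(f"{dataset_gb} GB at {throughput_gb_s} GB/s ~ {seconds / 60:.1f} minutes")
# -> roughly a minute: a short burst of load, not a sustained workload
```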

I said "roughly even", didn't I? Slower in one, faster in the other, that's roughly even to me.

Yeah, they were good if you had a low budget and workloads that were specifically MT-heavy and integer-only (with that shared FP unit per two cores making it effectively half the core count in FP loads). So, yes, they were great value for very specific workloads, but they were quite poor overall - and while this gave them a good niche to live in, the lack of flexibility is part of why Intel's perceived advantage grew so huge (in addition to the inefficiency and thus lack of scaling, of course).

Hm, thanks for linking those. Seems the board partners have been even worse than I thought - and, of course, Intel deserves some flak for not ensuring that they meet the requirements of the platform. This is simply not acceptable IMO.

In theory you're right, but if you're buying a high end CPU for a heavy, sustained workload, you really should also buy something with a VRM to match. If not, you're making a poor choice, whether that is due to ignorance or poor judgement. That could still be a H510 board of course, it's just that nobody makes a H510 board with those kinds of VRMs. That doesn't take away the responsibility of Intel and board manufacturers to ensure they can meet spec - i.e. every board should be able to handle PL1 power draws indefinitely for the most power hungry part on the platform, and should also handle PL2 reasonably well as long as conditions aren't terrible (i.e. there is some ventilation cooling the VRMs and ambient temperatures are below 30°C or thereabouts). The issue, and why I said you shouldn't buy that combo, is that all the H510 boards are bargain-basement boards that generally can't be expected to do this, and will almost guaranteed leave performance on the table - which undermines the "I only need CPU perf, not I/O" argument for going that route in the first place.
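For a sense of scale of what "handling PL1 indefinitely" asks of a board, here's a rough sketch of the heat the VRM itself has to shed; the ~90% conversion efficiency is an assumed, typical figure:

```python
# Heat dissipated in the VRM itself while feeding the CPU at PL1 vs PL2.
# The 90% conversion efficiency is an assumption (typical-ish for a
# decent board), not a measured value.
def vrm_loss_watts(cpu_power_w: float, efficiency: float = 0.90) -> float:
    # power drawn from the 12 V input = cpu_power / efficiency;
    # the difference ends up as heat in the power stages and inductors
    return cpu_power_w / efficiency - cpu_power_w

for label, watts in (("PL1, 125 W", 125), ("PL2, 250 W", 250)):
    print(f"{label}: ~{vrm_loss_watts(watts):.0f} W of heat in the VRM")
# ~14 W at PL1 and ~28 W at PL2 -- trivial for a heatsinked VRM with
# some airflow, rough for a bare bargain-basement design without any.
```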

*raises hand* I used my C2Q 9450 from 2008 till 2017 ;)
You're entirely right that these chips lasted a long time. However, a huge part of that is that MSDT platforms went quite rapidly from 1 to 4 cores, and then stopped for a decade, meaning software wasn't made to truly make use of them. HEDT platforms had more cores, but were clocked lower, meaning you had to OC them to match the ST perf of MSDT parts - and still topped out at 6 or 8 cores. Now that we've had 8+ core MSDT for four years we're seeing a lot more software make use of it, which is great as it actually makes things faster, rather than the <10% perf/year ST improvements we got for most of the 2010s.

There's definitely a point there, but the shrinking of the HEDT market shows us that the most important thing for most of these customers has been CPU performance, with I/O being secondary. And of course PCIe 4.0 and current I/O-rich chipsets alleviate that somewhat, as you can now get a lot of fast storage and accelerators even on a high-end MSDT platform. X570 gives you 20 native lanes of PCIe 4.0 plus a bunch off the chipset, and there are very few people who need more than a single GPU/accelerator + 3-4 SSDs. They exist, but there aren't many of them - and at that point you're likely considering going for server hardware anyhow.

That's quite different from what I'm used to seeing across these and other forums + in my home markets. Here, Celerons and Pentiums have been nearly nonexistent since ... I want to say 2017, but it might have been 2018, when Intel's supply crunch started. Ryzen 3000G chips have come and gone in waves, being out of stock for months at a time, particularly the 3000G and 3200G, and the Ryzen 3100, 3300 and 3500 have been nearly unobtainable for their entire life cycle. These chips seem to have been distributed almost entirely to OEMs from what I've seen, but clearly there are regional differences.

Oh, absolutely. But that just underscores what a massive turnaround this has been. They've always had a small core of fans, but that suddenly turning into a dominant group is downright shocking.

Hm, that sounds ... yes, mind-boggling. At least from that description it sounds like a society with a significant fixation on wealth and prestige - not that that's uncommon. Still, in Norway and Sweden too, we've gone from a flagship phone at 4-5,000 NOK being considered expensive a decade ago to 10-15,000 NOK flagships selling like hotcakes these days. Of course people's usage patterns and dependency on their phones have changed massively, so it makes some sense for phones to be valued higher and for people to be more willing to spend on them, but that's still a staggering amount of money IMO. PC parts haven't had quite the same rise, possibly due to being more of an established market + more of a niche. There have always been enthusiasts buying crazy expensive stuff, but there's a reason why the GTX 1060 is still the most used GPU globally by a country mile - cheaper stuff that performs well is still a much easier sell overall.

I think the promise of Intel HEDT-like performance (largely including the ST performance, though of course you could OC those HEDT chips) was much of the reason here, as well as the 5+ year lack of real competition making an opening for a compelling story of an up-and-coming company. If nothing else, Zen was exciting, as it was new, different, and competitive. You're mistaken about the power consumption though: Zen was very efficient from the start. That was perhaps the biggest shock - that AMD went from (discontinued) 250+W 8c CPUs that barely kept up with an i5 to ~100W 8c CPUs that duked it out with more power hungry Intel HEDT parts, and even competed on efficiency with Intel MSDT (though their ST perf was significantly behind).

I also think the surprise at "Intel being good again" is mainly due to Intel first not delivering a meaningful performance improvement for a solid 4 years (incremental clock boosts don't do much), and failing to get their new nodes working well. Rocket Lake underscored that with an IPC regression in some workloads - though that was predictable from the disadvantages of backporting a 10nm design to 14nm. Still, this is the first time in more than half a decade that Intel has delivered a true improvement in the desktop space. Their mobile parts have been far better (Ice Lake was pretty good, Tiger Lake rather impressive), but they've had clear scaling issues going above 50-ish watts and beyond the very short boost durations of laptops. They've finally demonstrated that they haven't lost what made them great previously - it just took a long time.
AMD did well catching up to and beating Intel with Zen 3... but Raptor Lake and Meteor Lake will dominate in single-core performance
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I frankly still don't like Ryzen. I think the K10 era was the best, because AMD had tons of SKUs. Want a single core? Buy a Sempron. Want a triple core? Phenom II X3 it is. Want power saving chips? Buy the "e" CPUs. Want a 100 EUR quad core? The Athlon II X4 is there. Want an excellent gaming chip? Get a Phenom II X4. Want a crunching monster? Get the Phenom II X6 1100T. Want the same but cheaper? Get the 1055T. Don't even have an AM3 board? AM2+ is fine. Got an FX chip with an AM3+ board? You can still use all the Athlon IIs, Phenom IIs and Semprons, and maybe K10 Opties too. I loved that diversity, and it was fun to find the less common models. Too bad I was too young to get into computers then, but if that weren't the case, AMD was the ultimate value provider. Now they only have a few SKUs for each person, and AMD sure as hell doesn't offer super cheap but downclocked 16 core chips. If you are cheap, your only options are Ryzen 3s, and even then they are overpriced. Athlons now suck. And every single chip is focused on performance, meaning it's clocked as high as it can go. There is no real "e" chip availability. And now the equivalent of those old Athlon IIs would be octa-core chips with maybe cut down L3 cache (Athlon IIs had no L3 cache; that was a Phenom II exclusive feature). There's no way now that AMD would sell an 8C/16T chip at 100 EUR/USD, or one with low heat output. There just aren't deals like that anymore; value AMD has been dead since FX got discontinued. Ryzen is the antithesis of value. First gen Ryzen was kinda cool, but nothing beyond that.
I think you angered quite a few forum members with the first sentence. :D As for me, I can see where you're coming from. I wouldn't say I don't like Ryzen. The proper phrase would be Ryzen doesn't fit my use case with their current lineup, even though they're excellent CPUs. My issues are:
  • The deal breaker: They run hot. I know, everyone says otherwise, but They. Run. Hot. Period. Of course you can slap a triple-fan AIO or a Noctua NH-D15 on anything and call it a day, but that doesn't mean easy cooling. Not for me at least. In my terms, easy cooling means building your system in a small form factor case, using a small or medium-sized down-draft cooler like my Shadow Rock LP, and staying within safe temperatures without modifying your power limits or volt-modding. Except for my R3 3100 (which I love), no other Ryzen chip can do that as of now. Something's wrong with the heat dissipation of the 7 nm chiplets, or the IHS design, or something else (I don't know and I don't care until it's fixed in a future generation).
  • And a few minor ones:
    • Chipset software, Ryzen Master and a Ryzen-specific power plan needed for them to run properly. I mean, a software-controlled CPU? What the F? Thankfully, this doesn't apply to Zen 3.
    • High idle power consumption. I just can't not mention this. Efficiency under load is great, but how much of its active time does one's PC spend in idle? Mine is around 80-90%.
    • Bonkers clock readings. OK, I know about "effective clocks" in HWiNFO, but come on. Do I really need HWiNFO just to get a simple clock reading?
The AMD CPU that I loved the most is the Athlon 64 3000+. It destroyed 3 GHz Pentium 4s while only running at 2 GHz (some at only 1.8, but mine was 2 GHz on Socket 754), all while being a lot cooler. Though I have to admit that ever since the introduction of Core, Intel has been a smooth, stable and hassle-free experience on all fronts. I've owned, or at least tried, nearly every generation, and didn't really dislike anything about any of them. I even love my 11700 and I don't understand why it's so fashionable to hate 10/11th gen. I would also love to build an SFF rig around Alder Lake, but I don't have any spare money to play with right now. :ohwell:
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
You're disregarding some pretty major factors here though: those old chips were made on relatively cheap production nodes. Per-wafer costs for recent nodes are easily 2x what they were just one or two generations back, and many times the cost of older nodes (yes, the older nodes were likely more expensive when new than the numbers cited in that article, but nowhere near current cutting-edge nodes). And while these nodes are also much denser, transistor counts are also much higher for current CPUs. The new CCDs are just 80-something mm² compared to over 300 for those 45nm Phenom II X6s, but those were made on the relatively cheap 45nm node, and of course the Ryzens also have a relatively large IOD as well, on a slightly cheaper node. Substrates and interconnects are also much more expensive due to the high speed I/O in these chips. The older stuff was cutting-edge for its time, obviously, but it was still much, much cheaper to produce. Double, triple and quadruple patterning, more lithographic masks, more time in the machines, and the introduction of EUV are all driving costs up. A $100 downclocked 16-core Ryzen wouldn't be even remotely feasible today.
The Phenom II X6 was 200 USD/EUR at the low end; it was the Athlon II X4 chips that were dirt cheap at 100 USD/EUR, with 4 K10 cores but chopped L3 cache and a locked multiplier. But even if new chips are more sophisticated and some costs (that you mentioned) are now higher, isn't there anything that could be done to make them cheaper? Like backporting 5000 series Ryzen chips to an older node to make them cheaper. Somehow Intel still managed to make the i5 10400F cheap, or at least relatively cheap.
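To put the die-size argument in crude numbers, here's a dies-per-wafer sketch; the wafer prices are illustrative guesses (real foundry pricing is confidential), and the count ignores edge loss and scribe lines:

```python
import math

# Crude dies-per-wafer estimate; treat the counts as optimistic upper
# bounds. Wafer prices are illustrative assumptions, not foundry quotes.
def dies_per_wafer(die_mm2: float, wafer_diameter_mm: float = 300) -> int:
    return int(math.pi * (wafer_diameter_mm / 2) ** 2 / die_mm2)

for name, die_mm2, wafer_usd in (
    ("~300 mm² 45 nm die (Phenom II class)", 300, 3000),
    ("~80 mm² 7 nm CCD (Zen 3)", 80, 9000),
):
    n = dies_per_wafer(die_mm2)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_usd / n:.0f} per die")
# The small CCD keeps per-die cost in check even on a pricey node, but
# a full Ryzen also needs an IOD, a substrate and packaging on top.
```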


You're also misrepresenting things a bit IMO. The Zen Athlons have been great value and great bang for the buck as long as they have existed.
Or are they? They are just barely faster than the K10 Athlon IIs and are only as fast as the FX Athlons. Also, overclocking: an Athlon X4 760K could clock to 5 GHz on a modest board and cooling, and at that point it may actually beat the 200GE in MT stuff.

E and GE parts are available, but mainly for OEMs, as they generally sell in very low numbers on the DIY market, making it not worth the distribution cost - this is the same for Intel's T SKUs, though Intel of course has the advantage of ~80% market share and thus ~5x more product availability.
At least you had an option to have those. I haven't seen Intel T chips available after Haswell.

You're also missing that literally every single Ryzen chip has lower power modes built in, which can be toggled through the BIOS - at least 65W for the 105W SKUs and 45/35W for the 65W SKUs, though I've seen screenshots of 45W modes for 105W SKUs as well. You can buy a 5950X and make it a "5950XE" with a single BIOS toggle.
On the FM2+ platform, the low power mode didn't quite work. I'm still recovering my trust in AMD after their complete disregard of low power features and their TDP lies.

The current Zen APUs are also extremely power efficient, and can run very low power if you toggle that mode or hand tune them. You're right that they are currently focusing exclusively on what I would call upper midrange and upwards, which is a crying shame, as lower core count Zen3 CPUs or APUs would be a fantastic value proposition if priced well (there have been some glowing reviews of the 5300G despite it only being available through OEMs). But AMD is sadly following the logic of capitalism: in a constrained market, you focus on what is the most profitable. Which means you bin and sell parts as the highest possible SKU, rather than dividing them between a full range of products. This will likely continue until we are no longer supply constrained, which will be a while. After all, with current laws for publicly traded companies, they could (and would) be sued by shareholders if they did anything but that, as the legislation essentially mandates a focus on maximizing profits above all else, on pain of significant monetary losses from lawsuits. And while I have no doubt AMD's executives wholeheartedly buy into this and think it's a good thing to maximize profits overall, they are also conditioned into this through rules, regulations, systems and societies that all adhere to this logic.
And the cynic in me just thinks that AMD and NV want to make the PC DIY market premium and to destroy the cheaper parts market, as those are super low margin parts. And let's be honest, the DIY PC market is in a bit of a pinch: phones/tablets are getting cheaper and more powerful (and they've successfully abandoned most of the low end market), Apple has their own chips, Sony/MS are selling consoles at an incredible rate and people perceive them as value gaming machines (instead of PCs), and importantly, people are now conditioned to pay a lot more for tech than they were.

I mentioned how Ryzen is perceived, and the same goes for flagship phones: unlike in the past, nobody is going to laugh at you for attempting to sell a 500 USD phone (like Ballmer did when the iPhone 2G launched). The media did a lot of work to normalize typically luxury hardware, and that's obvious even here. Many people have BFGPUs, as Jensen called them, despite most of them likely being middle class. They are no longer pushing for lower cost, but for more performance at any price. The era of the Radeon HD 4870, a 200 USD flagship killer, is dead, and so is the era of the HD 6950 (a 300 USD flagship killer). All these shortages also showed how impatient people are and that many are willing to pay the price of scarcity. And despite Intel launching the i5 10400F, it didn't have stellar sales until the media stopped shitting on it for being cheaper. The i3 10100F, I think, never sold particularly well. Meanwhile, poor value parts like the 10600K sold well, and obviously other Intel parts along with Ryzens. Motherboard makers are selling cheaper boards, but if you don't pay the gaming tax for proper VRMs, you will only get herpes from them. Water cooling has become normal too, despite still mostly failing to provide anything more than air coolers do. People are paying for expensive shit and companies have no reason to stop feeding them. As sad as it is to say, I think there's just too little demand for lower end or cheaper parts, despite many strides made by LowSpecGamer, TechYesCity, RandomGamingInHD and others. And despite there seemingly being new technological challenges, the elephant in the room isn't that, but the public perception of budget hardware. That's kinda the same reason why super low end cars disappeared altogether (I'm talking about the Toyota Aygo, VW Polo, Citroen C1, Ford Puma/Ka). Buying a new car switched from being normal to a rich man's endeavor. And beyond normal cars, cheap RWD and FWD coupes also died.

All those reasons are why I think we won't get a Ryzen 7 5800GE for 200 EUR, or a modern equivalent of it. Despite the Alder Lake launch and a small price cut from AMD, I think that's all we will get, while most chips continue to get more and more expensive. I think that Zen 4 or Zen 3+ (whatever AMD launches next) will generally cost more and deliver more performance, but the price/performance ratio will continue to sink.

I think you angered quite a few forum members with the first sentence. :D
:cool:


As for me, I can see where you're coming from. I wouldn't say I don't like Ryzen. The proper phrase would be Ryzen doesn't fit my use case with their current lineup, even though they're excellent CPUs.
I'm not sure if I would even like them if AMD made them dirt cheap. There was a ton of braindead AMD marketing, braindead AMD fanboys, and braindead media hyping it up to the moon. Many events have led to me feeling like I've stepped in poop every single time I see Ryzen. To this day, I still think the Ryzen name is stupid, infantile and egoistic. FX meant nothing, and Phenom was awesome. Athlon was between FX and Phenom. I probably care too much about such things tbh.


Chipset software, Ryzen Master and a Ryzen-specific power plan needed for them to run properly. I mean, a software-controlled CPU? What the F? Thankfully, this doesn't apply to Zen 3.
Not the first time AMD did this. The Athlon 64 needed the Cool'n'Quiet software for it to work with XP. K10-era stuff needed ATi Catalyst (lol, yes, that Catalyst) for drivers and some adjustments. In the late K10 era and early FX era, AMD was pushing AMD OverDrive. It was a bit poop, and thankfully optional. But yeah, Ryzen is the worst offender with its various software. It's even worse if you have an AMD card and an AMD CPU in the same machine: that's just asking for various bugs and glitches after every single driver update.

High idle power consumption. I just can't not mention this. Efficiency under load is great, but how much of its active time does one's PC spend in idle? Mine is around 80-90%.
That depends on the motherboard too. On the Intel side, many boards don't have functional C-states beyond C3 and will randomly lock up or black-screen if those are left enabled. Computers also consume quite a lot of power at idle even when the chip itself is very efficient: fans can consume quite a bit of power, so does the chipset, and the wrong choice of power supply can also mean significantly higher power use at idle.

Bonkers clock readings. OK, I know about "effective clocks" in HWiNFO, but come on. Do I really need HWiNFO just to get a simple clock reading?
That's the same on Intel, since it's very hard to avoid the observer effect when monitoring a chip that can rapidly enter various C-states.


I even love my 11700 and I don't understand why it's so fashionable to hate 10/11th gen.
The media doesn't care about locked chips, and Intel got a lot of flak for the various limitations of Comet Lake non-Z platforms, with 2666 MHz being the highest supported RAM speed on locked Core i parts and an even lower 2400 MHz on Celeron/Pentium. Oh, and the hype for Ryzen finally matching Intel (somehow many people perceived that as destroying Intel). Rocket Lake had a terrible flagship, and its locked parts were more expensive than Comet Lake ones despite tiny performance gains. And on top of that, there's that infamous HWUB B560 video, which certainly did no favors in recovering the public perception of Intel.


I would also love to build an SFF rig around Alder Lake, but I don't have any spare money to play with right now. :ohwell:
That 11700 is still cool.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
AMD did well catching up to and beating Intel with Zen 3... but Raptor Lake and Meteor Lake will dominate in single-core performance
We'll see - they're promising major ST gains just from the V-cache on the updated Zen3 in a few months, and Zen4 will no doubt focus further on ST gains (as increasing core counts for MSDT markets has zero purpose at this point). Given that Intel just barely caught up with Zen3's IPC (a bit ahead or a bit behind depending on whether you're using DDR4 or DDR5 and the specific combination of workloads), it'll be very interesting to see how these things shake out going forward. Intel still manages to clock consistently higher, but at the cost of huge power draws, and their cores are surprisingly large compared to even Zen3 - so both have clear advantages and disadvantages. The next few generations will be really interesting.
I think you angered quite a few forum members with the first sentence. :D As for me, I can see where you're coming from. I wouldn't say I don't like Ryzen. The proper phrase would be Ryzen doesn't fit my use case with their current lineup, even though they're excellent CPUs. My issues are:
  • The deal breaker: They run hot. I know, everyone says otherwise, but They. Run. Hot. Period. Of course you can slap a triple-fan AIO or a Noctua NH-D15 on anything and call it a day, but that doesn't mean easy cooling. Not for me at least. In my terms, easy cooling means building your system in a small form factor case, using a small or medium-sized down-draft cooler like my Shadow Rock LP, and staying within safe temperatures without modifying your power limits or volt-modding. Except for my R3 3100 (which I love), no other Ryzen chip can do that as of now. Something's wrong with the heat dissipation of the 7 nm chiplets, or the IHS design, or something else (I don't know and I don't care until it's fixed in a future generation).
  • And a few minor ones:
    • Chipset software, Ryzen Master and a Ryzen-specific power plan needed for them to run properly. I mean, a software-controlled CPU? What the F? Thankfully, this doesn't apply to Zen 3.
    • High idle power consumption. I just can't not mention this. Efficiency under load is great, but how much of its active time does one's PC spend in idle? Mine is around 80-90%.
    • Bonkers clock readings. OK, I know about "effective clocks" in HWiNFO, but come on. Do I really need HWiNFO just to get a simple clock reading?
The AMD CPU that I loved the most is the Athlon 64 3000+. It destroyed 3 GHz Pentium 4s while only running at 2 GHz (some at only 1.8, but mine was 2 GHz on Socket 754), all while being a lot cooler. Though I have to admit that ever since the introduction of Core, Intel has been a smooth, stable and hassle-free experience on all fronts. I've owned, or at least tried, nearly every generation, and didn't really dislike anything about any of them. I even love my 11700 and I don't understand why it's so fashionable to hate 10/11th gen. I would also love to build an SFF rig around Alder Lake, but I don't have any spare money to play with right now. :ohwell:
I definitely see where you're coming from, and what you're saying here makes a lot of sense. I have a few comments though:
  • The current impression of Intel chips running hot is a) down to their choice of clocking their top SKUs crazy high and disregarding power consumption to look good in benchmarks, and b) because they generally only (or at least mostly) seed review samples for high end chips. You'll probably find 10x or more 11900K reviews compared to 11400 reviews. Non-K high end SKUs or T SKUs are essentially never provided for review, despite selling massively in OEM systems (and this might be precisely to avoid unfavorable comparisons to poorly/cheaply configured OEM systems). Public perceptions are always unreliable and flaky, but Intel has done themselves zero favors in this regard.
  • Zen power ratings are a bit weird - Ryzen Master consistently reports 15-ish watts lower power consumption (cores+SoC) compared to package power readings in HWiNFO and the like. I frankly don't know what to trust. At least in TPU's reviews, total system idle power is so close as to not make a difference.
  • As you say, the need for a specific power plan is rather weird, and IMO speaks to some kind of poor cooperation with MS. Thankfully that's long gone, and I doubt we'll have to deal with anything like it going forward. But having software control the CPU? Isn't that what modern OSes do? Sure, you can argue for a difference between the OS and third party software and drivers, but the main difference here is that it's visible to you as a user, not whether the software is there or not. It should auto-install through Windows Update though, that's a minimum requirement IMO. But I also think good software-OS-hardware integration is key for getting the most out of our hardware going forward.
  • For the clock readings, I think that's an unavoidable consequence of an increasing number of separate clock-gated parts of the CPU, an increasing number of clock planes, different speeds across cores, different speeds according to the load, etc. When the clock speed is controlled per-core in tiny increments at a staggering pace by bespoke hardware, there's no way human-readable software will be able to keep up. I agree that it's pretty weird to see an "effective clock" for a core at 132 MHz when the core clock is reporting 3.9 GHz, but ... I don't see how it ultimately matters. At least in my case, the only real variance between the two is at idle; under load they equalize (within ~25 MHz). The fact that effective clocks can differ between main and SMT threads per core tells me there's something in play here that's well beyond my understanding. In other words, it's not that the data reported isn't accurate, but rather that computers are getting so complex my understanding is falling behind. And to be honest, that's a very good thing.
None of that takes away from the fact that lower clocked 10th and 11th gen Intel CPUs are great (and even quite efficient), and your point about Zen2 and Zen3 being difficult to cool is also quite relevant. I don't mind, but then I'm not using one of these chips with an LP air cooler. That my 5800X settles in the mid-to-high 60s at 120W running OCCT with a 280mm rad, the P14s spinning at 1600rpm, and my pump at 3400? Even accounting for the Aquanaut being a mediocre-at-best CPU block, that's pretty high overall considering the thermal load. This might be an advantage Intel gets from their larger core designs, even on the new 7 process. Though I also think us PC builders will need to get used to higher core temperatures in the future, as thermal densities are only increasing. I hope we can avoid laptop-like "everything spends any time under load bouncing off tJmax" scenarios though.
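On the effective clock point above: as far as I understand it, tools like HWiNFO derive that value from per-core activity counters (APERF) averaged over wall time, so halted (sleeping) time counts as 0 Hz. A minimal sketch of the idea, with made-up numbers:

```python
# Why "effective clock" can read ~132 MHz while the core clock reports
# 3.9 GHz: the effective value averages over wall time, counting halted
# (C-state) time as zero. Numbers here are made up for illustration.
def effective_clock_mhz(active_cycles: float, wall_time_s: float) -> float:
    return active_cycles / wall_time_s / 1e6

wall_time_s = 1.0
awake_fraction = 0.034                  # assumption: core mostly asleep
active_cycles = 3.9e9 * awake_fraction * wall_time_s
print(f"~{effective_clock_mhz(active_cycles, wall_time_s):.0f} MHz effective")
# ~133 MHz. Under sustained load awake_fraction approaches 1.0 and the
# two readings converge, matching the ~25 MHz gap seen under load.
```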
The Phenom II X6 was 200 USD/EUR at the low end; it was the Athlon II X4 chips that were dirt cheap at 100 USD/EUR, with 4 K10 cores but chopped L3 cache and a locked multiplier. But even if new chips are more sophisticated and some costs (that you mentioned) are now higher, isn't there anything that could be done to make them cheaper? Like backporting 5000 series Ryzen chips to an older node to make them cheaper. Somehow Intel still managed to make the i5 10400F cheap, or at least relatively cheap.
The 10400F is 14nm though, and Intel has a lot more scale than AMD to cut costs, at ~4x the market share. Backporting might work, but the Zen3 core is designed for the density and physical properties of the 7nm node, so a backport would perform worse and likely run quite hot (just look at Rocket Lake and its latency regressions). Sadly, the only real option is to wait for the supply crunch to ease and for economies of scale to overtake the margin-hiking AMD is currently riding on.
Or are they? They are just barely faster than the K10 Athlon IIs and are only as fast as the FX Athlons. Also, overclocking: an Athlon X4 760K could clock to 5 GHz on a modest board and cooling, and at that point it may actually beat the 200GE in MT stuff.
At, what, 7-8x the power and on a platform that barely supports USB3.0, let alone any modern I/O? Yeah, sure, they might tag along with a bottom-end chip like that still, but even for free the value proposition there would be pretty terrible given the effort needed for OCing (especially for a beginner), the more expensive cooler, the noise, and the lack of upgrade path (buy a 200GE and you can move to a used 5950X in time!). Bargain-basement parts always compete with used parts in value, but they have inherent advantages due to being new.
At least you had an option to have those. I haven't seen Intel T chips available after Haswell.
They're found in tons of AIOs and SFF OEM PCs, but they are really rare at retail. SFF aficionados tend to hunt them down from time to time, but you often need grey-market sources like eBay.
On the FM2+ platform, the low power mode didn't quite work. I'm still recovering my trust in AMD after their complete disregard of low power features and their TDP lies.
I haven't heard of any issues with these on AM4 - but then all they do is adjust PPT, TDC and EDC limits for the chip, just like you can do manually, and which is a core method of PBO overclocking or underclocking/volting. I'm currently running my 5800X at a 120W PPT, and it never goes above that, ever (and it still boosts higher than stock).
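For reference, the arithmetic behind those limits: on AM4, PPT works out to roughly 1.35x the rated TDP. A small sketch (values quoted from memory, so verify against your own BIOS or Ryzen Master; TDC/EDC omitted):

```python
# Stock AM4 relationship: PPT (package power limit) ~= 1.35 x rated TDP.
# Quoted from memory -- verify in your own BIOS / Ryzen Master.
def ppt_for_tdp(tdp_w: float) -> int:
    return round(tdp_w * 1.35)

for tdp_w in (105, 65, 45):
    print(f"{tdp_w} W TDP -> ~{ppt_for_tdp(tdp_w)} W PPT")
# 105 -> 142, 65 -> 88, 45 -> 61. The BIOS power modes just swap in the
# lower tier's PPT/TDC/EDC set -- the same knobs as my manual 120 W PPT.
```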
And the cynic in me just thinks that AMD and NV want to make the PC DIY market premium and to destroy the cheaper parts market, as those are super low margin parts. And let's be honest, the DIY PC market is in a bit of a pinch: phones/tablets are getting cheaper and more powerful (and they've successfully abandoned most of the low end market), Apple has their own chips, Sony/MS are selling consoles at an incredible rate and people perceive them as value gaming machines (instead of PCs), and importantly, people are now conditioned to pay a lot more for tech than they were.
You're onto something, but I don't think it's down to anyone wanting to destroy anything, it's more down to the fact that entry-level hardware is getting sufficiently powerful to take the place of what used to be low-end gaming hardware. Smartphones and APU laptops suddenly providing decent gaming experiences has all but killed the low-end GPU market. But the real kicker is that this has coincided with the current supply crunch which has completely erased lower midrange and midrange parts as well, pushing prices to astronomical levels. No doubt a lot of execs would love for this to stick around, as it's making them filthy rich. I just hope people don't get used to the idea of $1000 being normal for a GPU. If that happens, PC gaming will transform radically, and not for the better.
I mentioned how Ryzen is perceived, and the same goes for flagship phones: unlike in the past, nobody is going to laugh at you for attempting to sell a 500 USD phone (like Ballmer did when the iPhone 2G launched). The media did a lot of work to normalize typically luxury hardware, and that's obvious even here. Many people have BFGPUs, as Jensen called them, despite most of them likely being middle class. They are no longer pushing for lower cost, but for more performance at any price. The era of the Radeon HD 4870, a 200 USD flagship killer, is dead, and so is the era of the HD 6950 (a 300 USD flagship killer). All these shortages also showed how impatient people are and that many are willing to pay the price of scarcity. And despite Intel launching the i5 10400F, it didn't have stellar sales until the media stopped shitting on it for being cheaper. The i3 10100F, I think, never sold particularly well. Meanwhile, poor value parts like the 10600K sold well, and obviously other Intel parts along with Ryzens. Motherboard makers are selling cheaper boards, but if you don't pay the gaming tax for proper VRMs, you will only get herpes from them. Water cooling has become normal too, despite still mostly failing to provide anything more than air coolers do. People are paying for expensive shit and companies have no reason to stop feeding them. As sad as it is to say, I think there's just too little demand for lower end or cheaper parts, despite many strides made by LowSpecGamer, TechYesCity, RandomGamingInHD and others. And despite there seemingly being new technological challenges, the elephant in the room isn't that, but the public perception of budget hardware. That's kinda the same reason why super low end cars disappeared altogether (I'm talking about the Toyota Aygo, VW Polo, Citroen C1, Ford Puma/Ka). Buying a new car switched from being normal to a rich man's endeavor. And beyond normal cars, cheap RWD and FWD coupes also died.
You're mostly right, but it's not down to a lack of demand IMO - remember, the 1060 is still the most used GPU out there by a massive margin. It's mainly down to a lot of those people having working, okay hardware already, and holding off until something new comes along at an acceptable price. All the while, enthusiasts are going crazy and paying 2-3-4-5x as much for a GPU as most of them would have been willing to do five years ago. There is absolutely a general move in business towards luxury products with higher margins, especially as markets get more commoditized (just look at TridentZ Royal RAM) - it's a way for them to get massively larger profits for essentially the same product. But in most markets this opens the door for new budget brands - this is hard in the PC space with no more X86 licences though. Still, it might happen - or AMD and Intel might realize that they are leaving a massive market unaddressed and ripe for smartphone/console makers, and start catering to the millions if not billions who can't afford a $500 GPU yet still want to game.
All those reasons are why I think we won't get a Ryzen 7 5800GE for 200 EUR, or a modern equivalent of it. Despite the Alder Lake launch and a small price cut from AMD, I think that's all we will get, while most chips continue to get more and more expensive. I think that Zen 4 or Zen 3+ (whatever AMD launches next) will generally cost more and deliver more performance, but the price/performance ratio will continue to sink.
There's another factor here: if AMD has a piece of silicon that qualifies for the speed and power of a $400 5800X, they will sell it as that rather than a potential $200 5800E, simply because it's the same silicon, so the same production cost, and thus the increased price is just more profit. As long as they have a limited supply of silicon and high yields (i.e. a low rate of chips that fail to meet 5800X levels of power and speed), they will stick to the more profitable chip. As I said above, if they don't do this they will literally be sued by their shareholders, and likely for hundreds of millions of dollars if not billions. Current US business laws essentially force them into doing so, and the only way out of prioritizing high margin products is if you can argue that you sell more volume with higher prices - an argument that doesn't work when you're supply constrained and selling out constantly.
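The incentive is trivial to put in numbers (prices as above, volume hypothetical):

```python
# Why a supply-constrained vendor sells every qualifying die at the top
# price: when dies are the bottleneck, revenue scales with price alone.
# The volume is hypothetical; prices are the ones discussed above.
good_dies = 100_000
all_top_sku = good_dies * 400                         # all $400 5800X
split = good_dies // 2 * 400 + good_dies // 2 * 200   # half as a $200 "5800E"
print(f"all 5800X: ${all_top_sku:,} vs split: ${split:,}")
# $40,000,000 vs $30,000,000 from the same wafers -- the cheap SKU only
# makes sense once dies stop being the limiting factor.
```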
I'm not sure if I would even like them if AMD made them dirt cheap. There was a ton of braindead AMD marketing, braindead AMD fanboys, and braindead media hyping it up to the moon. Many events have led to me feeling like I've stepped in poop every single time I see Ryzen. To this day, I still think the Ryzen name is stupid, infantile and egoistic. FX meant nothing, and Phenom was awesome. Athlon was between FX and Phenom. I probably care too much about such things tbh.
Hehe, I can understand that. There's lots of stupid marketing all around. I tend to filter most of it out, as best I can at least. Ryzen marketing has nothing on previous AMD GPU marketing though ... *shudder*
 
The 10400F is 14nm though, and Intel has a lot more scale than AMD to cut costs, at ~4x the market share. Backporting might work, but the Zen3 core is designed for the density and physical properties of the 7nm node, so a backport would perform worse and likely run quite hot (just look at Rocket Lake and its latency regressions). Sadly, the only real option is to wait for the supply crunch to ease and for economies of scale to overtake the margin-hiking AMD is currently riding on.
Intel likely could backport the Celeron-to-i3 SKUs to 14nm. It could be cheaper to make bigger chips on a cheaper node. But I guess it's not going to be worth it for them financially. Who knows, maybe it actually is. Maybe there's an odd demand for low end chips.


At, what, 7-8x the power and on a platform that barely supports USB3.0, let alone any modern I/O? Yeah, sure, they might tag along with a bottom-end chip like that still, but even for free the value proposition there would be pretty terrible given the effort needed for OCing (especially for a beginner), the more expensive cooler, the noise, and the lack of upgrade path (buy a 200GE and you can move to a used 5950X in time!). Bargain-basement parts always compete with used parts in value, but they have inherent advantages due to being new.
Athlon II X4 chips go for nothing on fleabay, and they were very efficient. You shouldn't buy them now, though, as the missing L3 cache and lack of modern instructions just murder their performance. But FM2+ stuff is a whole other beast. There is proper USB 3 support unless you buy bottom-of-the-barrel boards, and 760Ks are quite inexpensive. The 760K is a very cool FX-based chip and can achieve a significant overclock with the stock AMD cooler. Richland cores were exceptional overclockers, as they ran very cool; you may achieve a 600 MHz OC with the stock cooler, if not more. It's not worth getting one now, but the people who bought one new still have little reason to upgrade to a new Athlon. Significant upgrades for them start at the i3/Ryzen 3 level.

They're found in tons of AIOs and SFF OEM PCs, but they are really rare at retail. SFF aficionados tend to hunt them down from time to time, but you often need grey-market sources like eBay.
Sucks for them, but I want to mention that the S series is also gone.

I haven't heard of any issues with these on AM4 - but then all they do is adjust PPT, TDC and EDC limits for the chip, just like you can do manually, and which is a core method of PBO overclocking or underclocking/volting. I'm currently running my 5800X at a 120W PPT, and it never goes above that, ever (and it still boosts higher than stock).
When I tried to use a lower cTDP, my board made the chip downclock to the lowest P-state (800 MHz), causing stuttering and poor performance, while still boosting to over 4 GHz. It was complete trash. I tested that on two boards, same behaviour. I must also mention that this was only available on rare FM2(+) chips, and my particular chip was the Athlon X4 845 - a rare, unicorn chip on an even rarer architecture, Carrizo. Most FM platform chips likely didn't have anything like a cTDP-down mode; I used an A4 6300 and an A6 7400K, and those had nothing like that. Carrizo chips were harvested laptop dies and in general behaved differently from most other Athlons. Fun fact: the 845 beats the 870K despite having a lower model number - in both performance and power usage.

You're onto something, but I don't think it's down to anyone wanting to destroy anything, it's more down to the fact that entry-level hardware is getting sufficiently powerful to take the place of what used to be low-end gaming hardware. Smartphones and APU laptops suddenly providing decent gaming experiences has all but killed the low-end GPU market.
I would strongly disagree with that. APUs are only usable at really old, low resolutions like 720p or 900p; they just can't handle much more. Only top-tier APUs can handle 1080p with bearable FPS, but the whole idea of an APU is to have it cheap, not to pay a premium for a better model. And that's today - they won't last much longer.

I know I will get a lot of flak for saying that APUs are not great, but I personally couldn't imagine living with one. Going to resolutions below 1080p (and, to be honest, to 1080p on a non-1080p monitor) is rough. And it doesn't end there: you are effectively forced to play at low/medium settings, you share RAM with the CPU, and all you get is a 40-50 fps average in return. Some titles are unplayable even at 720p low, and it sucks. You might as well buy a 100 EUR Haswell machine with an i5 or i7, slap in some random fleabay card, and be better off than with an APU. And APUs, as I said, haven't been available where I live at all, minus the ridiculous 5700G. RX 580s are still 180-220 EUR where I live. APUs are only great in the sense that maybe you expected just a display adapter and they can run a game or two somewhat well, but buying one intentionally for gaming is taking things too far imo.

But the real kicker is that this has coincided with the current supply crunch which has completely erased lower midrange and midrange parts as well, pushing prices to astronomical levels. No doubt a lot of execs would love for this to stick around, as it's making them filthy rich. I just hope people don't get used to the idea of $1000 being normal for a GPU. If that happens, PC gaming will transform radically, and not for the better.
And from all I gather, it seems that people are fine with that. The disturbing thing is that the typical low end techtubers have sort of disappeared. BudgetBuildsOfficial only recently came back. LowSpecGamer is on some odd hiatus. Oz Talks HW has very low video output. TechYesCity has obviously been talking more about higher end stuff. It's good that Green Ham Gaming posted a video, but it doesn't seem like he will actually return. Steve from RGinHD is likely still doing what he always does, but I don't watch him anymore. Kryzzp seems to be rising. So despite some modest successes, it seems that low end PC media is in a malaise and has been for a whole year. And some techtubers like Greg Salazar have completely quit budget builds, and Dawid Does Tech Stuff is very similar.


You're mostly right, but it's not down to a lack of demand IMO - remember, the 1060 is still the most used GPU out there by a massive margin. It's mainly down to a lot of those people having working, okay hardware already, and holding off until something new comes along at an acceptable price. All the while, enthusiasts are going crazy and paying 2-3-4-5x as much for a GPU as most of them would have been willing to do five years ago. There is absolutely a general move in business towards luxury products with higher margins, especially as markets get more commoditized (just look at TridentZ Royal RAM) - it's a way for them to get massively larger profits for essentially the same product. But in most markets this opens the door for new budget brands - this is hard in the PC space with no more X86 licenses though. Still, it might happen - or AMD and Intel might realize that they are leaving a massive market unaddressed and ripe for smartphone/console makers, and start catering to the millions if not billions who can't afford a $500 GPU yet still want to game.
To be honest, there's VIA and Zhaoxin with x86 licenses. Most likely they can't produce anything truly good, but they exist and may one day rise like a phoenix. Considering how fast China is growing and its Made in China programme for domestic manufacturing, Zhaoxin has a slim chance to rise. They sort of had plans to take on Ryzen in one old presentation; how that turned out, I don't know. It would be cool if they made something decent.

Wikipedia says the latest Zhaoxin models are at Ryzen level, but information is really scarce.

There's another factor here: if AMD has a piece of silicon that qualifies for the speed and power of a $400 5800X, they will sell it as that rather than a potential $200 5800E, simply because it's the same silicon, so the same production cost, and thus the increased price is just more profit. As long as they have a limited supply of silicon and high yields (i.e. a low rate of chips that fail to meet 5800X levels of power and speed), they will stick to the more profitable chip. As I said above, if they don't do this they will literally be sued by their shareholders, and likely for hundreds of millions of dollars if not billions. Current US business laws essentially force them into doing so, and the only way out of prioritizing high margin products is if you can argue that you sell more volume with higher prices - an argument that doesn't work when you're supply constrained and selling out constantly.
That honestly sounds like some really backwards legislation, but I guess the US has had enough of IPO trolls already.

Hehe, I can understand that. There's lots of stupid marketing all around. I tend to filter most of it out, as best I can at least. Ryzen marketing has nothing on previous AMD GPU marketing though ... *shudder*
Oh, that's another poop I try to avoid stepping into. Vega was awful, and anything beyond it was also not very well marketed. There were some stupid releases of cards that no one asked for (Radeon VII, RX 590). The last well-marketed card was the RX 480; Polaris in general was well marketed, and to be honest Polaris was one of the best GPU launches AMD has had since the Radeon HD series. And if we talk about crappy AMD marketing, did you know that AMD tried to sell SSDs and RAM? They called them Radeon for some reason. It couldn't get any more confusing. Imagine that era: buying a Radeon card, installing Crimson drivers, installing Catalyst motherboard drivers, setting up a Radeon SSD and RAM, dealing with the 990FX chipset. That was the worst mismatch of legacy and new marketing names and product lines. In 2013-2014, who even knew what Catalyst was (excluding enthusiasts)? Oh, and OverDrive, a leftover from the Phenom II days that was ported to work with pre-Vishera FX and was on life support with Vishera FX chips. Sometimes I really wonder whether AMD would be better off without marketing at all.
 
Joined
Jul 5, 2013
Messages
28,260 (6.75/day)
There has been so much drama about my awards, so now I'm trying to be a bit more explicit where I suspect people will be like "wtf, why is this getting an award?"
For the record, and I hope this doesn't seem like ass-kissing, you didn't miss a beat. I think you anticipated the complaints (and whining) that people would have about the testing methodology and presented your results accordingly.
Breaking down every single award recommendation seems overkill though and would essentially replace the conclusion text
Thinking about it, you really don't need to. While a short general explanation might be useful, anyone with a decade's worth of experience or more is going to understand intuitively. The awards were clearly based on a logical, consumer-perspective ethic and were spot on.
 
  • The current impression of Intel chips running hot is a) down to their choice of clocking their top SKUs crazy high and disregarding power consumption to look good in benchmarks, and b) because they generally only (or at least mostly) seed review samples for high end chips. You'll probably find 10x or more 11900K reviews compared to 11400 reviews. Non-K high end SKUs or T SKUs are essentially never provided for review, despite selling massively in OEM systems (and this might be precisely to avoid unfavorable comparisons to poorly/cheaply configured OEM systems). Public perceptions are always unreliable and flaky, but Intel has done themselves zero favors in this regard.
I see what you mean. Intel wants to look good with their CPUs' performance, but ends up looking bad with their (unlocked) power consumption. I think they could have saved the reputation of Comet Lake and Rocket Lake by enforcing stricter default values on motherboards. Since the performance crown went to AMD in those generations anyway, Intel could have clawed back some ground on efficiency and unlocking potential. By that I mean you can look at 10/11th gen chips the way the media did - as power hogs that can be downgraded to sip less power and be slower - or you can look at them the way I do: as 65/125 W locked chips with some hidden potential that you can unlock, provided you have a good motherboard and a proper cooling setup. They're extremely versatile in this sense, and that's why I love them.

  • As you say, the need for a specific power plan is rather weird, and IMO speaks to some kind of poor cooperation with MS. Thankfully that's long gone, and I doubt we'll have to deal with anything like it going forward. But having software control the CPU? Isn't that what modern OSes do? Sure, you can argue for a difference between the OS and third party software and drivers, but the main difference here is that it's visible to you as a user, not whether the software is there or not. It should auto-install through Windows Update though, that's a minimum requirement IMO. But I also think good software-OS-hardware integration is key for getting the most out of our hardware going forward.
It isn't visible to the user anyway, unless you install Ryzen Master, which never really worked for me for some reason. Whenever I tried to install it separately, it said it was already installed, but I couldn't find it anywhere. If I'd ever managed to run it, I might have considered using it, but in this state, it's a piece of rubbish. I prefer controlling CPU-related things in the BIOS anyway - that way I can make sure everything works the same even after an OS reinstall, or if I ever get drunk enough to try a magical new Linux distro.

  • For the clock readings, I think that's an unavoidable consequence of an increasing number of separate clock-gated parts of the CPU, an increasing number of clock planes, different speeds across cores, different speeds according to the load, etc. When the clock speed is controlled per-core in tiny increments at a staggering pace by bespoke hardware, there's no way human-readable software will be able to keep up. I agree that it's pretty weird to see an "effective clock" for a core at 132MHz when the core clock is reporting 3.9GHz, but ... I don't see how it ultimately matters. At least in my case, there is only any real variance between the two at idle; under load they equalize (within ~25MHz). The fact that effective clocks can differ between main and SMT threads per core tells me there's something in play here that's well beyond my understanding. In other words, it's not that the data reported isn't accurate, but rather that computers are getting so complex my understanding is falling behind. And to be honest, that's a very good thing.
I know, but Intel somehow still manages to report relatively normal idle clocks. On Ryzen, it's also a matter of power plan, I think. When I switched to "maximum energy savings", my Ryzen CPUs did what I would call a proper idle - that is, sub-20 W power consumption and clocks in or near the 1 GHz range. The problem was that whenever I fired up something that needed power, the speed ramp-up was so gradual that a few seconds had to pass before the CPU got up to full speed. It was enough to screw with your Cinebench score or game loading times. I've never seen anything like this before. In "balanced" and "maximum performance" mode, the switch between speed states is instantaneous (as it should be), but idle clocks are reportedly all over the place, with power consumption in the 25-30 W range. Considering my PC's idle time, neither is ideal, imo. I just couldn't find the perfect balance, no matter how much I tried.

None of that takes away from the fact that lower clocked 10th and 11th gen Intel CPUs are great (and even quite efficient), and your point about Zen2 and Zen3 being difficult to cool is also quite relevant. I don't mind, but then I'm not using one of these chips with an LP air cooler. That my 5800X settles in the mid-to-high 60s at 120W running OCCT with a 280mm rad, the P14s spinning at 1600rpm, and my pump at 3400? Even accounting for the Aquanaut being a mediocre-at-best CPU block, that's pretty high overall considering the thermal load. This might be an advantage Intel gets from their larger core designs, even on the new 7 process. Though I also think us PC builders will need to get used to higher core temperatures in the future, as thermal densities are only increasing. I hope we can avoid laptop-like "everything spends any time under load bouncing off tJmax" scenarios though.
I agree, and I hope so too. :) I love small form factor builds, and I don't want to give up building them just because IHS design can't keep up with chip manufacturing technologies. :(

High 60s temp on a 5800X tuned to 120 W sounds quite good, but considering that you need a 280 mm rad to achieve that... ugh. :confused: My 11700 set to 125 W runs at 75 °C max in Cinebench with a Shadow Rock LP. In games, it's in the high 60s as well.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Intel could likely backport Celeron-to-i3 SKUs to 14 nm - it could be cheaper to make bigger chips on a cheaper node. But I guess it's not going to be worth it for them financially. Then again, who knows, maybe it actually is. Maybe there's an odd demand for low-end chips.
Demand for low-end chips is pretty stable and high, but cutting production pushes a lot of those purchases (especially business/OEM ones) into a higher tier rather than not making a purchase at all - which means chipmakers make more money. IMO, they'd rather keep making low-end chips on older nodes and architectures, as that would be cheaper and easier than any backporting, and might perform just as well. I wouldn't be surprised if we saw different architectures across the low end and high end within a couple of generations. (Of course, AMD is already doing that in mobile, mixing updated Zen2 and Zen3 across their 5000U series.) Intel has also historically had the scale to produce 2-3 dice across their MSDT range though, so we might just see them prioritize some really small dice for lower-end parts.
Sucks for them, but I want to mention that the S series is also gone.
S was 65W though, so it has been subsumed into the non-lettered series as TDPs for lower end chips have dropped.
When I tried to use a lower cTDP, my board made the chip downclock to the lowest P-state (800 MHz), causing stuttering and poor performance, while still boosting to over 4 GHz. It was complete trash. I tested that on two boards, same behaviour. I must also mention that this was only available on rare FM2(+) chips, and my particular chip was the Athlon X4 845 - a rare, unicorn chip on an even rarer architecture, Carrizo. Most FM-platform chips likely didn't have anything like a cTDP-down mode. I used an A4 6300 and an A6 7400K and those had nothing like that. Carrizo chips were harvested laptop dies and in general behaved differently from most other Athlons. Fun fact: the 845 beats the 870K despite having a lower model number - in both performance and power usage.
Hm, that's weird. 45W mode was perfectly fine on my old A8-7600 that used to run my HTPC. Never had any problems with it, but ultimately I went back to 65W mode as that PC ran relatively cool anyhow and the CPU was slow enough that I wanted all the performance I could get.
I would strongly disagree with that. APUs are only usable at really old, low resolutions like 720p or 900p; they just can't handle much more. Only top-tier APUs can handle 1080p with bearable FPS, but the whole idea of an APU is to have it cheap, not to pay a premium for a better model. And that's today - they won't last much longer.
Well, that depends on your expectations. I'm perfectly fine with playing games at a mix of 900p and 1080p at mid-low settings on my 4650G HTPC. It's not as good as playing them on my main PC, obviously, but that depends a lot on the game as well. Rocket League at 900p ~90fps on the TV plays great with FreeSync, and is more enjoyable than on my main PC, even if that runs a much higher frame rate, detail level and resolution.
I know I will get a lot of flak for saying that APUs are not great, but I personally couldn't imagine living with one. Going to resolutions below 1080p (and, to be honest, to 1080p on a non-1080p monitor) is rough. And it doesn't end there: you are effectively forced to play at low/medium settings, you share RAM with the CPU, and all you get is a 40-50 fps average in return. Some titles are unplayable even at 720p low, and it sucks. Might as well buy a 100 EUR Haswell machine with an i5 or i7, slap in some random fleabay card, and be better off than with an APU. And as I said, APUs haven't been available where I live at all, apart from the ridiculous 5700G. RX 580s are still 180-220 EUR where I live. APUs are only great in the sense that maybe you expected a display adapter and they can run a game or two somewhat well, but buying one intentionally for gaming is, imo, taking things too far.
Again: depends on your expectations. And not all regions have well functioning used components markets. That also depends what you're looking for, of course. And you keep making value comparisons between new parts and used ones, which ... well, used parts will always win. That's a given. You don't buy a brand-new APU setup if value is your only priority. You don't buy anything brand new if value is your only priority.
And from all I gather, it seems that people are fine with that. The disturbing thing is that the typical low-end techtubers have sort of disappeared. BudgetBuildsOfficial only recently came back. Low Spec Gamer is on some odd hiatus. Oz Talks HW has very low video output. TechYesCity has obviously been talking more about higher-end stuff. Good thing that Green Ham Gaming posted a video, but it doesn't seem like he will actually return. Steve from RGinHD is likely still doing like always, but I don't watch him anymore. Kryzzp seems to be rising. So despite some modest successes, it seems that low-end PC media is in malaise and has been for a whole year. And some techtubers like Greg Salazar have completely quit budget builds, and Dawid Does Tech Stuff is very similar.
It's not much of a surprise given the price hikes and how the used markets across the world look right now - there isn't much use in making videos promoting buying cheap used gear to game on when that gear is either not cheap or not available any longer.
To be honest, there's VIA and Zhaoxin with x86 licenses. Most likely they can't produce anything truly good, but they exist and may one day rise like a phoenix. Considering how fast China is growing, and their made-in-China programme to manufacture things domestically, Zhaoxin has a slim chance to rise. They sort of had plans to take on Ryzen in one old presentation; how that turned out, I don't know. It would be cool if they made something decent.

Wikipedia says that the latest Zhaoxin models are Ryzen-level, but information is really scarce.
Those Zhaoxin CPUs are ... well, they're a lot better than what VIA has had before, but they're not great, with low clock speeds and ST performance behind an AMD A10-9700. They could absolutely get some massive cash infusion and suddenly turn into a viable third x86 vendor, but that's highly unlikely - especially now that Intel is buying large parts of VIA's x86 CPU design team.
That honestly sounds like some really retarded legislation, but I guess US had enough of IPO trolls already.
This is not new. IIRC it was made into law in the early 90s, and was a dream of neoliberal economists since the Nixon era. And this does absolutely nothing to combat IPO trolls, it just ensures a downward spiral of ever more predatory and destructive business practices further enriching the wealthy - but then that's the whole point.
Oh, that's another poop I try to avoid stepping into. Vega was awful, and anything beyond it was also not very well marketed. There were some stupid releases of cards that no one asked for (Radeon VII, RX 590). The last well-marketed card was the RX 480. Polaris in general was well marketed, and to be honest, Polaris was one of the best GPU launches AMD has had since the Radeon HD series. And speaking of crappy AMD marketing, did you know that AMD tried to sell SSDs and RAM? They called them Radeon, for some reason. It couldn't get any more confusing. Imagine that era: buying a Radeon card, installing Crimson drivers, installing Catalyst motherboard drivers, setting up a Radeon SSD and RAM, dealing with the 990FX chipset. That was the worst mismatch of legacy and new marketing phrases and product lines. In 2013-2014, who even knew what Catalyst was (excluding enthusiasts)? Oh, and Overdrive, a leftover from Phenom II, which was ported to work with pre-Vishera FX and was on life support with Vishera FX chips. Sometimes I really wonder if AMD wouldn't be better off without marketing at all.
I think the RAM and SSD stuff was just an (ill-fated) attempt at getting some traction for the Radeon brand, which was overall seen as decently good at the time. They would have needed to put a lot more effort into it for it to be even remotely successful though, as it mostly just came off as "hey, we put some stickers on these things, now they're a bit more expensive", which ... yeah. Vega was pretty good in principle, the problem - as with the 12900K! - was that they pushed clocks way too high. A lower clocked Vega 64 or 56 could come pretty close to the efficiency of contemporary Nvidia GPUs, though at lower performance tiers of course. The main issue was the 64CU architectural limit of GCN, which put AMD into a corner until RDNA was ready, with literally the only way of increasing performance being pushing clocks crazy high. What made this issue much, much worse was the harebrained marketing, the "poor Volta" stuff and all that. If they had sold Vega as an upper midrange alternative? It would have kicked butt, but it would have left them without a high end product. Which, of course, they soon realized that they couldn't make anyhow, so they could have saved themselves the embarrassment to begin with.

But that's exactly what we're seeing now, and what @AusWolf and I were talking about above: if Intel had only limited their chips to lower power levels at stock, they would have come off looking better, at least IMO. "Not as fast, but good, and pretty efficient" is a better look than "can keep up (and even win at times), but at 2x the power and impossible to cool". Vega and the recent generations of Intel CPUs are impressively analogous in that way.
I see what you mean. Intel wants to look good with their CPUs' performance, but ends up looking bad with their (unlocked) power consumption. I think they could have saved the reputation of Comet Lake and Rocket Lake by enforcing stricter default values on motherboards. Since the performance crown went to AMD in those generations anyway, Intel could have clawed back some goodwill on efficiency and unlocking potential. By that I mean, you can look at 10th/11th gen chips the way the media did: as power hogs that can be downgraded to sip less power and be slower - or you can look at them the way I do: as 65/125 W locked chips with some hidden potential that you can unlock, provided you have a good motherboard and a proper cooling setup. They're extremely versatile in this sense, and that's why I love them.
Yeah, that's a good approach, and I really wish Intel went that route. I guess they don't know how to be a good underdog :p (Which, to be fair, took AMD the best part of a decade to learn as well.) Recent Intel chips at lower power, including laptops, are pretty great. It's when you let them run free, as Intel does with their "hey, we have a power spec, but you know, just ignore it if you want to" policies, that things go down the crapper.
It isn't visible to the user anyway, unless you install Ryzen Master, which never really worked for me for some reason. Whenever I tried to install it separately, it said it's already installed, but I couldn't find it anywhere. If I'd ever managed to run it, I would have considered using it, but in this state, it's a piece of rubbish. I prefer controlling CPU-related things in the BIOS anyway - that way I can make sure that everything works the same way even after an OS reinstall, or if I ever get drunk enough to try a magical new Linux distro.
That sounds weird. Ryzen Master doesn't work on my 4650G (expected, as it isn't supported), but I've had no problems other than that. I also prefer BIOS control, but I'm learning more and more to enjoy letting the CPU control itself. The new Ryzen tweaking methods - PBO tuning, Curve Optimizer, etc. - are a really great compromise: the CPU controls itself with a granularity and adaptability that a human could never achieve, yet you can still tweak and optimize things past stock.
I know, but Intel somehow still manages to report relatively normal idle clocks. On Ryzen, it's also a matter of power plan, I think. When I switched to "maximum energy savings", my Ryzen CPUs did what I would call a proper idle - that is, sub-20 W power consumption and clocks in or near the 1 GHz range. The problem was that whenever I fired up something that needed power, the speed ramp-up was so gradual that a few seconds had to pass before the CPU got up to full speed. It was enough to screw with your Cinebench score or game loading times. I've never seen anything like this before. In "balanced" and "maximum performance" mode, the switch between speed states is instantaneous (as it should be), but idle clocks are reportedly all over the place, with power consumption in the 25-30 W range. Considering my PC's idle time, neither is ideal, imo. I just couldn't find the perfect balance, no matter how much I tried.
Intel's boost systems are much, much simpler than AMD's though. Slower, less dynamic, less opportunistic, and with less granularity. So, coming from my relatively limited understanding of these things, that seems logical to me. I've never really paid much attention to my idle power beyond noticing that it reads much higher in HWinfo and similar tools than in AMD's official tools, which ... well, it's weird and sub-optimal, but ultimately just tells me that if I want an idle power measurement I need to get my power meter out :p
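For anyone curious where those diverging numbers come from: both vendors expose the architectural APERF/MPERF counters, and the two ratios you can build from them are basically the two "clocks" that disagree at idle. A rough Linux-only sketch - it assumes root, the msr kernel module loaded (modprobe msr), and that you substitute your own CPU's base clock:

```python
import struct, time

MPERF, APERF = 0xE7, 0xE8  # architectural MSRs on both Intel and AMD

def read_msr(cpu: int, reg: int) -> int:
    # Requires root and the 'msr' kernel module
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        f.seek(reg)
        return struct.unpack("<Q", f.read(8))[0]

def sample(cpu: int = 0, interval: float = 1.0, base_mhz: float = 3800.0):
    a0, m0 = read_msr(cpu, APERF), read_msr(cpu, MPERF)
    t0 = time.time()
    time.sleep(interval)
    a1, m1 = read_msr(cpu, APERF), read_msr(cpu, MPERF)
    dt = time.time() - t0
    da, dm = a1 - a0, m1 - m0
    # Average clock while the core was actually awake (in C0):
    busy_mhz = base_mhz * da / dm
    # "Effective" clock: cycles executed per wall-clock second, counting
    # halted (idle) time as zero - this is the number that collapses to
    # three digits on an idle core.
    eff_mhz = da / dt / 1e6
    print(f"cpu{cpu}: busy ~{busy_mhz:.0f} MHz, effective ~{eff_mhz:.0f} MHz")

if __name__ == "__main__":
    sample()
```

On a loaded core the two numbers converge; on an idle one they diverge wildly, which is exactly the 132MHz-vs-3.9GHz discrepancy - both readings are "correct", they just answer different questions.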
I agree, and I hope so too. :) I love small form factor builds, and I don't want to give up building them just because IHS design can't keep up with chip manufacturing technologies. :(
Same. Given how easy to cool AMD's APUs are I still have hope, though it'll sure be interesting to see how more exotic packaging like 3D cache affects these things.
High 60s temp on a 5800X tuned to 120 W sounds quite good, but considering that you need a 280 mm rad to achieve that... ugh. :confused: My 11700 set to 125 W runs at 75 °C max in Cinebench with a Shadow Rock LP. In games, it's in the high 60s as well.
It bears repeating that the Aquanaut is a mediocre CPU block at best though. It has a DDC pump, so flow is decent, but it has a reverse flow layout (i.e. sucks water through the microfins rather than pushing it down and through them), which generally performs worse, and doesn't have the best flow characteristics either. I could probably drop another 10 degrees with a top-of-the-line water block, but then I'd need somewhere to mount my pump. So for me it's fine :p Gaming temperatures are definitely better, in the 40s or 50s depending on how much heat the GPU is dumping into the loop, 60s at most under heavy loads. I would really love to see someone test a bunch of different air coolers on MCM Ryzen with those offset mounting kits you can buy, just to see how much the off-centered die placement affects these things.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Well, that depends on your expectations. I'm perfectly fine with playing games at a mix of 900p and 1080p at mid-low settings on my 4650G HTPC. It's not as good as playing them on my main PC, obviously, but that depends a lot on the game as well. Rocket League at 900p ~90fps on the TV plays great with FreeSync, and is more enjoyable than on my main PC, even if that runs a much higher frame rate, detail level and resolution.
I meant games more like AAA titles - something like GTA 5 or RDR2. It would perform well in GTA 5 (it's slower than the GTX 650 Ti, which I'm familiar with), but in RDR2 it wouldn't. Maybe at 720p low-medium. And it's much easier to speak well of APUs if you don't use one as your only system - they look quite good until you use one as your daily system, at which point you'll soon change your opinion about them.

Again: depends on your expectations. And not all regions have well functioning used components markets. That also depends what you're looking for, of course. And you keep making value comparisons between new parts and used ones, which ... well, used parts will always win. That's a given. You don't buy a brand-new APU setup if value is your only priority. You don't buy anything brand new if value is your only priority.
I disagree - if you buy cheap, long-lasting hardware, buying new makes sense. Also, never forget the reliability of computers: they tend to become increasingly unreliable after their first decade, and there are many risks with buying used hardware. But if it means avoiding something like a Ryzen APU, eh, it may be a reasonable choice.

It's not much of a surprise given the price hikes and how the used markets across the world look right now - there isn't much use in making videos promoting buying cheap used gear to game on when that gear is either not cheap or not available any longer.
I expected a lot more out of them. They could have made a killing during these times if they'd started survival guides and taught how to be thrifty and hunt for deals in unusual places. And there were some deals: the RX 550 was mostly unaffected by C19 for a very long time, and it wasn't a completely unbearable survival card. There was also the GT 1030. They could have just posted more content about low-spec-friendly games, and that would have been a lot better than a hiatus. And at the ultra-low-end, nothing has changed, so they could have made some videos about that too. They are really not that badly affected by all this compared to more middle-class channels, which lost their lifeblood. Low-end gamers are far more resilient.


Those Zhaoxin CPUs are ... well, they're a lot better than what VIA has had before, but they're not great, with low clock speeds and ST performance behind an AMD A10-9700. They could absolutely get some massive cash infusion and suddenly turn into a viable third x86 vendor, but that's highly unlikely - especially now that Intel is buying large parts of VIA's x86 CPU design team.
Oh well, I kinda wanted a third player in this industry. Zhaoxin could have been a good replacement for Cyrix.


This is not new. IIRC it was made into law in the early 90s, and was a dream of neoliberal economists since the Nixon era. And this does absolutely nothing to combat IPO trolls, it just ensures a downward spiral of ever more predatory and destructive business practices further enriching the wealthy - but then that's the whole point.
But doesn't it hurt companies' long-term stability and long-term profitability?


I think the RAM and SSD stuff was just an (ill-fated) attempt at getting some traction for the Radeon brand, which was overall seen as decently good at the time. They would have needed to put a lot more effort into it for it to be even remotely successful though, as it mostly just came off as "hey, we put some stickers on these things, now they're a bit more expensive", which ... yeah. Vega was pretty good in principle, the problem - as with the 12900K! - was that they pushed clocks way too high. A lower clocked Vega 64 or 56 could come pretty close to the efficiency of contemporary Nvidia GPUs, though at lower performance tiers of course. The main issue was the 64CU architectural limit of GCN, which put AMD into a corner until RDNA was ready, with literally the only way of increasing performance being pushing clocks crazy high. What made this issue much, much worse was the harebrained marketing, the "poor Volta" stuff and all that. If they had sold Vega as an upper midrange alternative? It would have kicked butt, but it would have left them without a high end product. Which, of course, they soon realized that they couldn't make anyhow, so they could have saved themselves the embarrassment to begin with.
Vega was fine, but the marketing was awful. And it wasn't just consumer Vega that got burned by awful PR, but also the Vega Frontier Edition and Pro SSG. Ever since AMD got rid of the FirePro branding, their enterprise cards have been a mess. Their marketing is awful or nonexistent.


But that's exactly what we're seeing now, and what @AusWolf and I were talking about above: if Intel had only limited their chips to lower power levels at stock, they would have come off looking better, at least IMO. "Not as fast, but good, and pretty efficient" is a better look than "can keep up (and even win at times), but at 2x the power and impossible to cool". Vega and the recent generations of Intel CPUs are impressively analogous in that way.
Maybe. The main problem with Vega was that it was meant to compete with an earlier Nvidia generation, which it would have done well, but after tons of delays it arrived as a patched-up abortion and had to compete with much better cards than it was designed for. To make matters worse, the 1080 Ti was also unusually fast.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I meant games more like AAA titles - something like GTA 5 or RDR2. It would perform well in GTA 5 (it's slower than the GTX 650 Ti, which I'm familiar with), but in RDR2 it wouldn't. Maybe at 720p low-medium. And it's much easier to speak well of APUs if you don't use one as your only system - they look quite good until you use one as your daily system, at which point you'll soon change your opinion about them.
Again: depends on your expectations. If you are expecting to run an APU and play the newest AAA games at anything but rock-bottom settings, then yes, you are going to be disappointed. But a moderately priced APU like the 5600G still outpaces a "cheap" dGPU like the GT 1030 across a wide range of games, both in 1080p and 720p. It can absolutely still provide a good gaming experience - but clearly not a high resolution, high FPS recent AAA gaming experience. But nothing in that price range can.
I disagree - if you buy cheap, long-lasting hardware, buying new makes sense. Also, never forget the reliability of computers: they tend to become increasingly unreliable after their first decade, and there are many risks with buying used hardware. But if it means avoiding something like a Ryzen APU, eh, it may be a reasonable choice.
Of course there are. But now you are bringing a third variable into play - reliability. Between price, performance and reliability, you typically get to pick two. There's always a balance. And, of course, an APU gives you a really fast CPU and a great basis for future upgrades. Obviously it depends what you're looking for, but they really aren't bad.
I expected a lot more out of them. They could have made a killing during these times if they'd started survival guides and taught how to be thrifty and hunt for deals in unusual places. And there were some deals: the RX 550 was mostly unaffected by C19 for a very long time, and it wasn't a completely unbearable survival card. There was also the GT 1030. They could have just posted more content about low-spec-friendly games, and that would have been a lot better than a hiatus. And at the ultra-low-end, nothing has changed, so they could have made some videos about that too. They are really not that badly affected by all this compared to more middle-class channels, which lost their lifeblood. Low-end gamers are far more resilient.
Depends on their skill sets though. Also, how much useful advice is there to give when the used market is being scoured by scalpers and desperate GPU buyers anyhow? How many videos can you make saying "be patient, be on the lookout"? 'Cause that's about all they can do.
Oh well, I kinda wanted a third player in this industry. Zhaoxin could have been a good replacement for Cyrix.
It would have been cool, but the cash infusion required to be truly competitive is likely too large for anyone to take a chance on.
But doesn't it hurt companies' long-term stability and long-term profitability?
Not if you buy into the harebrained idea of infinite economic growth (and the closely related idea of trickle-down economics), which underpins neoliberal economic thinking. Reality strongly contradicts this, but that sadly doesn't have an effect on people when there are short-term economic gains to be had.
Maybe. The main problem with Vega was that it was meant to compete with an earlier Nvidia generation, which it would have done well, but after tons of delays it arrived as a patched-up abortion and had to compete with much better cards than it was designed for. To make matters worse, the 1080 Ti was also unusually fast.
Yep, they ended up competing against a particularly good Nvidia generation, but I don't think the delays hurt them that much - it wasn't that late. And the 1080 launched a full year before the Vega 64, after all. The main issue was that they insisted on competing in the high end with a GPU that simply didn't have the necessary hardware resources. The Fury X with 64 CUs @ 1.05GHz competed against the 2816 CUDA core @ 1.1GHz 980 Ti. Then, in the next generation, the still-64 CU Vega 64 was supposed to compete with the slightly lower core count (2560), but much higher clocked @1.7GHz 1080. The Vega arch just didn't scale that well for frequency, which put them in the dumb position of burning tons of power for at best similar performance, though typically a tad worse. If they had backed off the clocks by 200MHz or so (and thus left some OC potential for those who wanted it too), they could have had a killer alternative in between the 1070 and 1080 instead. Sure, it would have had to sell cheaper, but it would have looked so much better. But I think just the idea of "only" being able to compete with the 1080 (and not the Ti) was too much of a loss of face for the RTG to stomach stepping down even further. Which just goes to show that a short-term "good" plan can be pretty bad in the long term.
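Just to put some rough numbers on that 64 CU wall - assuming GCN's 64 shaders per CU and 2 FLOPs per shader per clock (FMA), with the same convention for Nvidia's CUDA cores and approximate typical clocks:

```python
# Back-of-the-envelope FP32 throughput: shaders x 2 FLOPs/clock x clock.
def tflops(shaders: int, ghz: float) -> float:
    return shaders * 2 * ghz / 1000

cards = {
    "Fury X (64 CU @ ~1.05 GHz)":   tflops(64 * 64, 1.05),  # ~8.6
    "980 Ti (2816 @ ~1.1 GHz)":     tflops(2816, 1.1),      # ~6.2
    "Vega 64 (64 CU @ ~1.6 GHz)":   tflops(64 * 64, 1.6),   # ~13.1
    "GTX 1080 (2560 @ ~1.73 GHz)":  tflops(2560, 1.73),     # ~8.9
}
for name, tf in cards.items():
    print(f"{name:30s} ~{tf:4.1f} TFLOPS")
# With the CU count pinned at 64, the only lever left was clock speed -
# exactly the corner GCN was painted into.
```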


Oh, for anyone curious about E-core gaming performance, der8auer did a short test. He has the E-cores OC'd to 4.1GHz (so apparently early reports of E-cores not being OCable were wrong), but overall, performance is surprisingly decent. Still much slower than the P cores, obviously, but definitely workable. No doubt they're being hurt by their higher core-to-core and core-to-L3/DRAM latencies, but for what they are and the power draws seen, this really isn't bad.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Again: depends on your expectations. If you are expecting to run an APU and play the newest AAA games at anything but rock-bottom settings, then yes, you are going to be disappointed. But a moderately priced APU like the 5600G still outpaces a "cheap" dGPU like the GT 1030 across a wide range of games, both in 1080p and 720p. It can absolutely still provide a good gaming experience - but clearly not a high resolution, high FPS recent AAA gaming experience. But nothing in that price range can.
My problem is that people call APUs a value purchase, but I never really saw much value in them. How much value could there possibly be in a rock-bottom GPU which is just barely acceptable today and will be useless in a year or two? For me that is not value, but the epitome of buying cheap shit, having to buy it again more often, and just wasting cash. For you it works, so I guess you get reasonable value, but much of the media claims an entirely different thing. A value combo was the i3 10100F + RX 580. At 300 EUR it can run anything at 1080p 60 fps, usually at high settings. I personally find that the RX 580 can run anything at 1440p, but I'm willing to trade settings for that. Anyway, this is a value purchase, since for a modest budget you get a combo that plays games well today and will do so for 4 years; after those 4 years you can even attempt to scrape by on lower resolutions and get even more value out of the combo. Meanwhile, there is the 5600G, which costs about the same and is already nearly at the scraping-by phase. You may be able to use it for 3 years by cutting resolution, settings and fps expectations, but after that it just doesn't have any potential. So it's very obvious that the i3 and RX 580 combo is much better value. You can't buy it new for that kind of money now, but even before 'rona, the media overhyped APUs a lot. Apparently, Linus once tried to offer some wisdom for budget buyers:

But their community wasn't having any of it and, after dealing with the consequences, he did one toned-down video later; now he has more or less started to say that cheap shit may be good (cheap shit = cheap cards and APUs). Low Spec Gamer advised buying the 200GE, but his advice is more for those fucked-up markets, and it's actually adequate: the 200GE sucks, but if you can live with it for a while, it's so cheap to replace with something also cheap later, and in fucked-up countries it's a better deal than paying a tithe to some scalper for an overpriced Core 2 Duo. Or better yet, just use what you have with low-spec hacks. If you pay nothing and get something out of it, your value is technically infinite.

But if you are in a country with a half-decent economy and second-hand market, those APUs don't really make much sense to get, except for an HTPC, strictly 720p gaming, or retro gaming.

BTW I'm not buying that GT 1030 is now dethroned. Vega 11 historically was slower than GT 1030. That review doesn't state which GT 1030 they used, so it could have been that e-waste DDR4 version.
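The DDR4-vs-GDDR5 distinction matters more than people think, since the two variants share the GPU core and differ almost only in memory. A quick back-of-the-envelope using the standard specs (6000 MT/s GDDR5 vs 2100 MT/s DDR4, both on a 64-bit bus):

```python
# Peak memory bandwidth = transfer rate (MT/s) x bus width (bits) / 8
def gb_per_s(mtps: float, bus_bits: int) -> float:
    return mtps * bus_bits / 8 / 1000

print(f"GT 1030 GDDR5: {gb_per_s(6000, 64):.0f} GB/s")  # ~48 GB/s
print(f"GT 1030 DDR4:  {gb_per_s(2100, 64):.0f} GB/s")  # ~17 GB/s
# The DDR4 card has roughly a third of the bandwidth with the same core
# configuration, which is why the variant a review used matters so much.
```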


Of course there are. But now you are bringing a third variable into play - reliability. Between price, performance and reliability, you typically get to pick two. There's always a balance. And, of course, an APU gives you a really fast CPU and a great basis for future upgrades. Obviously it depends what you're looking for, but they really aren't bad.
I'm not sure if an APU gives you a good CPU. Didn't they have gimped PCIe lanes to make the iGPU work? If that's true, then they kinda fail at being good CPUs for a later graphics card upgrade, especially when there are gimped cards like the RX 6600. And then add into the mix that you may have a PCIe 3 board, and you will be looking at some massive bottlenecking.


Depends on their skill sets though. Also, how much useful advice is there to give when the used market is being scoured by scalpers and desperate GPU buyers anyhow? How many videos can you make saying "be patient, be on the lookout"? 'Cause that's about all they can do.
I think that there was potential and competition was really low. The big thing would have been low-spec-friendly games and finding ways to enjoy older, cheaper and weaker hardware. Looking for deals on various markets would be secondary. It's really not that tough; that's more or less what Green Ham Gaming used to be big on.


It would have been cool, but the cash infusion required to be truly competitive is likely too large for anyone to take a chance on.
What about commie government? Don't they fund various projects?


Not if you buy into the harebrained idea of infinite economic growth (and the closely related idea of trickle-down economics), which underpins neoliberal economic thinking. Reality strongly contradicts this, but that sadly doesn't have an effect on people when there are short-term economic gains to be had.
But infinite growth is kinda real, just complicated and volatile. If you look at real GDP, that seems to be true; it's just that people have no idea what nominal and real GDP are or, worse yet, use something really stupid to evaluate the economy, like stock markets. Also, past some periods rich in growth, after a while it ends, and "infinite" growth is just close to 1% with some fluctuations. And beyond real GDP, I think that buying power, median wage and some life-quality indexes + HDI should be taken into account. I haven't looked at actual data, but HDI seems to have been slowly rising in first-world countries, and so has life quality, but wages grow slowly, and the actual purchasing power of people + real GDP seem to be either growing at a minuscule rate or slowly eroding - and they are certainly volatile too. But I guess those people (I have no idea who they are, so I'm just guessing that they are some retarded liberals) that claim things about infinite growth don't really talk about that, do they?

On that note, I looked at AMD's stock. It seems to perform really well, but if you look at their profitability graphs, it seems that AMD made an absolute killing in 2020 - times more profit than in 2019 and 2018 combined. Somehow I don't really think that they are overpricing their stuff just because of tough times; I think that they are ripping us off while they can, and Ryzen prices should be 20% lower, if not more.

Yep, they ended up competing against a particularly good Nvidia generation, but I don't think the delays hurt them that much - it wasn't that late. And the 1080 launched a full year before the Vega 64, after all. The main issue was that they insisted on competing in the high end with a GPU that simply didn't have the necessary hardware resources. The Fury X with 64 CUs @ 1.05GHz competed against the 2816 CUDA core @ 1.1GHz 980 Ti. Then, in the next generation, the still-64 CU Vega 64 was supposed to compete with the slightly lower core count (2560), but much higher clocked @1.7GHz 1080. The Vega arch just didn't scale that well for frequency, which put them in the dumb position of burning tons of power for at best similar performance, though typically a tad worse. If they had backed off the clocks by 200MHz or so (and thus left some OC potential for those who wanted it too), they could have had a killer alternative in between the 1070 and 1080 instead. Sure, it would have had to sell cheaper, but it would have looked so much better. But I think just the idea of "only" being able to compete with the 1080 (and not the Ti) was too much of a loss of face for the RTG to stomach stepping down even further. Which just goes to show that a short-term "good" plan can be pretty bad in the long term.
The more time goes on, the more I think that RTG is just retarded at doing business. Since the HD 7000 and Rx 2xx series, they just keep overhyping incompetent products and hoping that they sell. Old-school ATi seemed to do business much better and certainly didn't try to pull "poor Volta" kind of crap. It seems that RTG inherited some of ATi's problems, like the still-poor reliability of drivers and software and sometimes outright zero quality control (RX 5000 series), plus AMD's bad qualities, namely overhyping various crap even when it doesn't deliver, and generally awful PR. They are a rich tech company, but their actions are no better than a moody teen's, trying to make his homework seem better than it is.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
My problem is that people call APUs a value purchase, but I never really saw much value in them. How much value could there possibly be in a rock-bottom GPU which is just barely acceptable today and will be useless in a year or two? For me that is not value, but the epitome of buying cheap shit, having to buy it again more often, and just wasting cash.
APUs aren't primarily meant for AAA gaming. With a modern APU, you essentially get a CPU and a GPU that has all the video decoding capabilities you need in an HTPC, usually for the price of a CPU alone. How is that not awesome value? ;)

AMD's APUs are great if you include some casual/oldschool gaming too. The only things that kept me from going that route with my HTPC are the fact that I had a spare R3 3100 lying around, and that the 5300G isn't commercially available. But oh boy, I'd love to have one! :rolleyes:

On Intel's side, basically everything is an APU except for the F SKUs, which are a terrible value, imo. For £10 more, you get an iGPU that is great for diagnostics, or just having something to hook your monitor up to while you're waiting for your new GPU, or for extra display outputs - that's what I use mine for.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
APUs aren't primarily meant for AAA gaming. With a modern APU, you essentially get a CPU and a GPU that has all the video decoding capabilities you need in an HTPC, usually for the price of a CPU alone. How is that not awesome value? ;)
Then why don't all of them just have Vega 3, instead of 7 and 11?

AMD's APUs are great if you include some casual/oldschool gaming too. The only things that kept me from going that route with my HTPC are the fact that I had a spare R3 3100 lying around, and that the 5300G isn't commercially available. But oh boy, I'd love to have one! :rolleyes:
Or are they actually? By great, I guess you mean solid 60 fps, at least 1080p, and at least high settings. I can say that the GTX 650 Ti could actually run most old games well, but some it couldn't. And then there's Crysis, which the GTX 650 Ti only properly ran at 1024x768 medium. By solid I mean not dipping below 40 fps. Cards like the GT 730 GDDR5 can't do that - the GT 730 struggles in Far Cry. The GT 1030 might be the lowest-end card that is actually great at old-school games. And that's Vega 11 territory.

The GTX 650 Ti also managed to play old-school games with Nvidia control panel enhancements, but only in non-demanding games did it achieve a solid 60 fps with some supersampling. The RX 580 can do some supersampling (or SSAA) in Colin McRae 2005 (up to 4x at 1280x1024); it can't provide perfect performance in Unreal Tournament 2004 that way, but it could run it well with adaptive 8x MSAA.

I will lower the bar somewhat and say that old games running at 1080p, solid 60 fps and at least high settings (minus odd stuff like PhysX, which still destroys performance) is a good experience, and that a card or iGPU that can do it is great. But I really doubt that Vega 7 or 11 can pull that off. Solid 60 fps to me is 0.1% lows of no less than 45 fps and 1% lows of no less than 55 fps. Games like F.E.A.R. are old, but still demanding on lower-end hardware. I can say that Vega 7 most likely can't run it well at high settings at 1080p. Far Cry 2 most likely wouldn't run well on Vega 11. UT3 may not even run well at medium settings. Crysis wouldn't be playable at anything above low and 1024x768, and Mafia II also likely wouldn't run well at anything above 1080p low.

All in all, iGPUs most likely aren't great for older games. They are not great, and their potential is limited. Instead of saying that they are great, say that old or undemanding games are simply all that they can do without lagging badly. With old games there might also be some odd bugs and glitches on modern hardware, which also keeps them from being great. Another thing is that something like Vega 7 might actually be beaten by a GT 1010. You might say that it's pointless to compare a free iGPU with a dedicated card, but back in the socket FM2 days, APUs had no problem wiping the floor with GT x10 GDDR5 cards. A cheap A8 APU performed times better than the GT 710 GDDR5; even an A6 or A4 APU may have been faster. But even then they weren't really worth it.

The budget banger was the cheapest Athlon X4 and a Radeon HD 7750 - that meant the Athlon X4 740 (MSRP of 71 USD) and then 109 USD for the card. That's 180 USD for both, and for a bit less you could get the 6800K, which was weaker. If you added 20-30 USD to the budget, you got the 7770, which had around 100 more stream cores - an easy 20% performance improvement. Or for 10 USD more, you could get the Athlon X4 750K and overclock it by 600 MHz with the stock cooler, which beats the 760K. Unlike an APU, the Radeon 7750 can still run some games at 1080p low-medium and 60 fps; APUs were already struggling with that in 2014. And that's the good scenario - all dual-core APUs were e-waste from the day they were made, as their CPU portion was simply too slow to play games well, and their GPU portion was usually half of an A10's or A8's. I personally own an A4 6300 and an A6 7400K and have benched them extensively. Neither was great or really acceptable in the games of their era, and they are really slow in day-to-day stuff - they aged like milk. And there's no argument to make about adding a dedicated card later either. There was some value in the cheapest quad-core APUs as cheap daily drivers for work that could run old games, but if you wanted to play games in any capacity, the Radeon HD 7750 was a complete no-brainer. Along with the 7750, there was also the GTX 650.

All in all, I stand by my argument that APU value was always questionable for gamers, even cheap gamers. There wasn't much point in pinching pennies that badly, since even a modest library of games would have made those hardware savings look silly and insignificant, while making your gaming experience noticeably worse, if not outright garbage. That is, unless you pirate all your games - then maybe penny-pinching makes sense. But then again, the 7750 and 7770 were selling with codes for several legit AAA games. So, despite saving a few dollars on hardware at the cost of gaming experience, APUs still made little sense if you wanted to stay legal.


On Intel's side, basically everything is an APU except for the F SKUs, which are a terrible value, imo. For £10 more, you get an iGPU that is great for diagnostics, or just having something to hook your monitor up to while you're waiting for your new GPU, or for extra display outputs - that's what I use mine for.
Good for you, chips with integrated graphics in Lithuania are scalped too. They can easily cost 40% more than F variants and therefore barely make sense, but during these times, it seems that having a display adapter is getting expensive. I guess that beats buying brand new GT 710.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
My problem is that people call APUs a value purchase, but I never really saw much value in them. How much value could there possibly be in a rock-bottom GPU which is just barely acceptable today and will be useless in a year or two? For me that is not value, but the epitome of buying cheap shit, having to buy it again more often, and just wasting cash. For you it works, so I guess you get reasonable value, but much of the media claims an entirely different thing. A value combo was the i3 10100F + RX 580. At 300 EUR it can run anything at 1080p 60 fps, usually at high settings. I personally find that the RX 580 can run anything at 1440p, but I'm willing to trade settings for that. Anyway, this is a value purchase, since for a modest budget you get a combo that plays games well today and will do so for 4 years; after those 4 years you can even attempt to scrape by on lower resolutions and get even more value out of the combo. Meanwhile, there is the 5600G, which costs about the same and is already nearly at the scraping-by phase. You may be able to use it for 3 years by cutting resolution, settings and fps expectations, but after that it just doesn't have any potential. So it's very obvious that the i3 and RX 580 combo is much better value. You can't buy it new for that kind of money now, but even before 'rona, the media overhyped APUs a lot. Apparently, Linus once tried to offer some wisdom for budget buyers:
You're making some weird combinations here, a 2020 CPU with a 2017 GPU? Those have AFAIK never been sold new at the same time. The $200 GPU pairing for a 10100 would be an RX 5500 or GTX 1650, neither of which are particularly impressive. You're right that they don't specify the type of 1030, and I've seen some weird stuff from Tom's throughout the years, but I still expect them to be sufficiently competent to use the G5 version of the 1030. But even if it is, that would at best mean that the G5 1030 performs about on par with this. Good luck getting a 6c12t CPU+a 1030G5 for $260.

So, again, either you're arguing for buying used (those 580s you keep bringing up), which is of course a valid argument but entirely different value proposition still, or you have to admit that currently there are no alternatives even remotely close in price and performance. Of course that's partly down to the current market situation, but that affects everything equally.
But their community wasn't having any of it and, after dealing with the consequences, he did one toned-down video later; now he has more or less started to say that cheap shit may be good (cheap shit = cheap cards and APUs). Low Spec Gamer advised buying the 200GE, but his advice is more for those fucked-up markets, and it's actually adequate: the 200GE sucks, but if you can live with it for a while, it's so cheap to replace with something also cheap later, and in fucked-up countries it's a better deal than paying a tithe to some scalper for an overpriced Core 2 Duo. Or better yet, just use what you have with low-spec hacks. If you pay nothing and get something out of it, your value is technically infinite.
But the 200GE was never meant as anything but bargain-basement. It literally launched at $60. It was never a good CPU. It paired well with bargain-basement budgets, allowing you to build a PC for next to nothing yet have a warranty. That's about what you can expect for $60.
But if you are in a country with a half-decent economy and second-hand market, those APUs don't really make much sense to get, except for an HTPC, strictly 720p gaming, or retro gaming.
Unless you want a warranty and/or an upgrade path.
BTW I'm not buying that GT 1030 is now dethroned. Vega 11 historically was slower than GT 1030. That review doesn't state which GT 1030 they used, so it could have been that e-waste DDR4 version.
Vega 11 was paired with old, slow Zen cores, and was clocked much lower than 4000 and 5000-series APUs. This is not an equal comparison, as the newer generations are considerably faster.
I'm not sure if an APU gives you a good CPU. Didn't they have gimped PCIe lanes to make the iGPU work? If that's true, then they kinda fail at being good CPUs for a later graphics card upgrade, especially when there are gimped cards like the RX 6600. And then add into the mix that you may have a PCIe 3 board, and you will be looking at some massive bottlenecking.
You really should stop commenting on things you're not up to date on. APUs after the 3000-series have 16 PCIe 3.0 lanes for the GPU, and both 4000 and 5000-series (Zen2 and Zen3, but both with less L3 cache than desktop CPUs) perform very well. The 5600G is slower than a 5600X, but not by much at all. So for those $250 you're getting a kick-ass CPU with an entry-level GPU included. Also, gimped? A 1-2% performance loss is not noticeable.
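To put rough numbers on the lane question - per-direction PCIe bandwidth is lanes x per-lane rate (8 GT/s with 128b/130b encoding on gen3, double that on gen4), so even the worst case here isn't dramatic:

```python
# Usable per-direction PCIe bandwidth in GB/s.
def pcie_gbs(lanes: int, gen: int) -> float:
    per_lane = 8 * (128 / 130) / 8 * 2 ** (gen - 3)  # GB/s per gen3+ lane
    return lanes * per_lane

for label, lanes, gen in [("gen3 x16", 16, 3), ("gen3 x8", 8, 3),
                          ("gen4 x8 (RX 6600)", 8, 4)]:
    print(f"{label:18s} ~{pcie_gbs(lanes, gen):5.1f} GB/s")
# An x8 gen4 card in a gen3 board falls back to gen3 x8 (~7.9 GB/s) -
# which is where the small-but-measurable losses come from.
```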
I think that there was potential and competition was really low. The big thing would have been low-spec-friendly games and finding ways to enjoy older, cheaper and weaker hardware. Looking for deals on various markets would be secondary. It's really not that tough; that's more or less what Green Ham Gaming used to be big on.
They would have been drowned out by the YouTube algorithm for making repetitive content. IMO this was likely completely out of their hands. Not to mention just how soul-crushing that content would likely be to make, week after week of the same stuff. Good on them for finding better things to do.
What about commie government? Don't they fund various projects?
I doubt they have any interest in spending billions on financing a CPU architecture where they have to pay licensing fees to a US company, so no. Besides, they have their own efforts.
But infinite growth is kinda real, just complicated and volatile. If you look at real GDP, that seems to be true; it's just that people have no idea what nominal and real GDP are or, worse yet, use something really stupid to evaluate the economy, like stock markets.
Oh dear god no. Besides the blindingly obvious point that infinite economic growth in a world of finite physical resources is literally, logically and physically impossible, the idea that we're seeing "infinite" growth in current economies is an out-and-out lie. Firstly, our global economic system is predicated upon rich countries exploiting the poor through labor and material extraction. The "growth" of the past four-five centuries is largely extracting resources in poor areas and processing and selling them in rich ones. Which, in case it isn't obvious, means the poor areas get poorer - just that nobody wants to count this, so it generally isn't talked about in these contexts. The problem is that eventually, some poor areas manage to rid themselves of their oppressors/colonial regimes/corporate overlords and gain some autonomy, which moves the exploitation elsewhere. Recently we've seen this in how Chinese workers are refusing to work under the horrible conditions they have endured for the past couple of decades, which forces a lot of manufacturing to move to new countries where the workforce is as of yet not sufficiently worn down and pissed off. But what happens when this cycle repeats again? At some point you run out of new poor countries to move manufacturing to, just as you run out of poor areas to strip of their resources.

Also, GDP is deeply tied into both colonialist and imperialist trade practices as well as financial "industries" that "create" value largely by shuffling numbers around until they look bigger. And, of course, it is possibly the most actively manipulated metric in the world, given how much of a country's future is dependent upon it - unless it increases by ~7% a year you're looking at massive job losses and a rapid collapse of the economy, which forces the never-ending invention of new ways of making the numbers look bigger. GDP is a deeply problematic measure in many ways.

This is of course all looking past the boom-bust-cycle that's built into our current economic systems.
Also, past some periods rich in growth, after a while it ends, and "infinite" growth is just close to 1% with some fluctuations. And beyond real GDP, I think that buying power, median wage and some life-quality indexes + HDI should be taken into account. I haven't looked at actual data, but HDI seems to have been slowly rising in first-world countries, and so has life quality, but wages grow slowly, and the actual purchasing power of people + real GDP seem to be either growing at a minuscule rate or slowly eroding - and they are certainly volatile too. But I guess those people (I have no idea who they are, so I'm just guessing that they are some retarded liberals) that claim things about infinite growth don't really talk about that, do they?
Nope, because talking about those things undermines the ideology they are selling to voters, with ideas like "low taxes are good for you (never mind that the infrastructure you depend upon is falling apart)" and "cutting taxes for the rich creates jobs (despite literally all evidence contradicting this statement)". The wage stagnation we've seen in most of the Global North in the past 4-5 decades, plus the ever-expanding wealth gap, is the product of a conscious and willed process. And sadly people keep voting for these idiots.
On that note, I looked at AMD's stock. It seems to perform really well, but if you look at their profitability graphs, AMD made an absolute killing in 2020 - several times more profit than in 2019 and 2018 combined. Somehow I don't really think that they are overpricing their stuff just because of tough times; I think that they are ripping us off while they can, and Ryzen prices should be at least 20% lower.
Yeah, as I've said, they're actively focusing on "high ASP" (as in: expensive) components during the combined supply crunch and demand spike. Which is exactly why they're making a killing. There's absolutely a truth to them being supply limited, but they are also making conscious choices to only put out the most expensive and profitable products from what supply they have (as are Intel, as are Nvidia, etc., etc.).
The more time goes on, the more I think that RTG is just retarded at doing business. Ever since the HD 7000 and Rx 2xx series, they just keep overhyping incompetent products and hoping that they sell. Old-school ATi seemed to do business much better and certainly didn't try to pull "poor Volta" kind of crap. It seems that RTG inherited some of ATi's problems, like the still-poor reliability of drivers and software and sometimes outright zero quality control (RX 5000 series), plus AMD's bad qualities, that is, overhyping various crap even if it doesn't deliver and generally awful PR. They are a rich tech company, but their actions are no better than a moody teen's trying to make his homework look better than it is.
RTG is shut down though, they were integrated after Raja left AFAIK. And they do seem to have turned a corner since RDNA - yes, the 5700 XT was buggy and problematic, but they're making great architectures with a ton of potential, and their marketing has gone from "overpromise and underdeliver" to "pretty accurate for PR". I just hope they keep developing in this direction, but also that they don't suddenly start expecting these price levels to stay normal going forward. If that's the case, I'm just going full console gamer after this PC gets too slow.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
You're making some weird combinations here - a 2020 CPU with a 2017 GPU? Those have AFAIK never been sold new at the same time. The $200 GPU pairing for a 10100 would be an RX 5500 or GTX 1650, neither of which is particularly impressive.
The RX 580 is still selling new almost everywhere and is superior to the 1650 due to 8 GB of VRAM and a raw power advantage. The 1650 Super is closer to an RX 580 equivalent, but still lacks VRAM. The RX 5500 8GB is similar to the RX 580 in performance and VRAM.

You're right that they don't specify the type of 1030, and I've seen some weird stuff from Tom's throughout the years, but I still expect them to be sufficiently competent to use the G5 version of the 1030. But even if it is the G5, that would at best mean that the G5 1030 performs about on par with this.
"Just buy it" -Tom's hardware, 2018.


Good luck getting a 6c12t CPU+a 1030G5 for $260.
That's not really impossible. The i5 10400F in Lithuania is 143.99 EUR and the cheapest GT 1030 G5 sells at 91.99 EUR, so the total is 235.98 EUR. After currency conversion to USD (https://www.xe.com/currencyconverter/convert/?Amount=235.98&From=EUR&To=USD), that's 270.05 USD - so close to your specified budget. If you downgrade the CPU to a Ryzen 1600 AF, then the CPU cost is just 131.99 EUR, and with the GT 1030 the USD total is just 256.31. So, um, I fit into the budget and have two CPU options; the i5 is a bit faster than the 1600 AF. I'm pretty sure that in the USA there are more stores and potentially more deals to be found - you might even be able to fit a 3600 into that budget. And of course, there is eBay and AliExpress with all those sweet, sweet decommissioned Xeons selling for a pittance. I found an E5-2698 V3 for 149.88 USD (+19.39 USD shipping to LTU). That's cheaper than the i5 and the 1600 AF, and you are getting 16 Haswell cores with HT - unfortunately with a base clock of just 2.3 GHz and a boost clock of just 3.6 GHz. Still, that's quite an amazing chip that fits into an APU budget.
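For anyone who wants to redo that math with their own local prices, here's a minimal sketch of the same calculation (Python; the EUR→USD rate is the approximate one implied by the xe.com conversion above, and it will obviously drift over time):

```python
# Budget check for the CPU+GPU combos above; the exchange rate is an
# assumption taken from the xe.com conversion at the time of posting.
EUR_TO_USD = 1.1444
BUDGET_USD = 260.0

builds = {
    "i5 10400F + GT 1030 G5": 143.99 + 91.99,
    "Ryzen 1600 AF + GT 1030 G5": 131.99 + 91.99,
}

for name, total_eur in builds.items():
    total_usd = total_eur * EUR_TO_USD
    delta = total_usd - BUDGET_USD
    print(f"{name}: {total_eur:.2f} EUR ~= {total_usd:.2f} USD ({delta:+.2f} vs budget)")
```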


So, again, either you're arguing for buying used (those 580s you keep bringing up), which is of course a valid argument but entirely different value proposition still, or you have to admit that currently there are no alternatives even remotely close in price and performance. Of course that's partly down to the current market situation, but that affects everything equally.
An i5 + 1030 G5 is close, but depending on how many lanes the 5600G/5700G has, maybe that's a better deal. Or maybe it's just worth going YOLO with an Ali Xeon and hoping to overclock it a bit. If I found a more sensible 8C/16T Haswell Xeon, then I could hunt for deals at Caseking.de. They had some sensibly priced RX 550s, 1050 Tis and 1650s. An Ali Xeon and a 1050 Ti seems like quite a doable idea; the only problem is finding a motherboard. But for a 1050 Ti, it might be worth gambling.


But the 200GE was never meant as anything but bargain-basement. It literally launched at $60. It was never a good CPU. It paired well with bargain-basement budgets, allowing you to build a PC for next to nothing yet have a warranty. That's about what you can expect for $60.
To be honest, I bought an Athlon X4 870K for 40 EUR. It's probably not as good as the Athlon 200GE, but so damn close. I once saw a Ryzen 1200 going for 60 EUR - new, not refurbished, and with the cooler included. BTW, a new Pentium 4 630 goes for 10.04 EUR right now, what an awesome deal :D. No cooler or old-school sticker though.


Unless you want a warranty and/or an upgrade path.
Let's be honest here: if you bought an Athlon 200GE when it was new, survived with it this long and most likely have an A320 board, then you are capped at Ryzen 3000 (unless your board has extended support). Is it really worth investing in a now dead-end platform that is artificially capped to older chips? It might be a better idea to just buy a new platform entirely.


Vega 11 was paired with old, slow Zen cores, and was clocked much lower than 4000 and 5000-series APUs. This is not an equal comparison, as the newer generations are considerably faster.
You can overclock the old Vega 11, and for Vega 11, those Zen cores are fast enough.


You really should stop commenting on things you're not up to date on. APUs after the 3000-series have 16 PCIe 3.0 lanes for the GPU, and both 4000 and 5000-series (Zen2 and Zen3, but both with less L3 cache than desktop CPUs) perform very well. The 5600G is slower than a 5600X, but not by much at all. So for those $250 you're getting a kick-ass CPU with an entry-level GPU included. Also, gimped? A 1-2% performance loss is not noticeable.

Only x8 PCIe, and older PCIe gen 3 if you have a lower-end or older board. Now tell me, how much performance is lost by using a less-than-ideal PCIe gen and then, on top of that, cutting the lanes in half?
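To put rough numbers on that, here's a quick sketch of theoretical link bandwidth per configuration (Python). Keep in mind these are raw link-rate numbers; real-world gaming tests of PCIe scaling usually show only a few percent difference, far smaller than the raw bandwidth gap would suggest.

```python
# Theoretical PCIe bandwidth: per-lane rate (GB/s, after 128b/130b encoding)
# times lane count. Real-game losses are much smaller than these gaps suggest.
PER_LANE_GBS = {3: 0.985, 4: 1.969}  # PCIe 3.0 / 4.0, GB/s per lane

for gen, lanes in [(3, 8), (3, 16), (4, 8), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{PER_LANE_GBS[gen] * lanes:.1f} GB/s")
```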

They would have been drowned out by the YouTube algorithm for making repetitive content. IMO this was likely completely out of their hands. Not to mention just how soul-crushing that content would likely be to make, week after week of the same stuff. Good on them for finding better things to do.
Or did they actually find anything better? Budget Builds Official seems happy to be back. I'm pretty sure they may have had C19-related difficulties, or just didn't figure out how to make content.

I doubt they have any interest in spending billions on financing a CPU architecture where they have to pay licensing fees to a US company, so no. Besides, they have their own efforts.
I'm sticking with Zhaoxin; they have a good chance of success. There are many IPO scams regarding CPUs in China, and Zhaoxin is not a scam.


If I had at least half a working brain, I would just pour that money into Zhaoxin, who have been making x86 chips for years and aren't terribly behind AMD and Intel. They have some experience, albeit in a limited market, but that's still better than just giving it to scammers. And you are suggesting that a whole new architecture with no software, hardware or ecosystem is a better deal. It's clearly not: better to invest in Zhaoxin and then claim that it is a fully Chinese arch. If we are looking at a different arch, then ARM is fine as an alternative if x86 becomes unviable. Huawei can make some top-tier chips. There's also RISC-V if the worst comes.


Oh dear god no. Besides the blindingly obvious point that infinite economic growth in a world of finite physical resources is literally, logically and physically impossible, the idea that we're seeing "infinite" growth in current economies is an out-and-out lie. Firstly, our global economic system is predicated upon rich countries exploiting the poor through labor and material extraction. The "growth" of the past four-five centuries is largely extracting resources in poor areas and processing and selling them in rich ones. Which, in case it isn't obvious, means the poor areas get poorer - just that nobody wants to count this, so it generally isn't talked about in these contexts.

The problem is that eventually, some poor areas manage to rid themselves of their oppressors/colonial regimes/corporate overlords and gain some autonomy, which moves the exploitation elsewhere. Recently we've seen this in how Chinese workers are refusing to work under the horrible conditions they have endured for the past couple of decades, which forces a lot of manufacturing to move to new countries where the workforce is as of yet not sufficiently worn down and pissed off. But what happens when this cycle repeats again? At some point you run out of new poor countries to move manufacturing to, just as you run out of poor areas to strip of their resources.
You don't seem to mention the service sector at all, which may not use many (if any) physical resources - and it's a huge sector in advanced countries. As long as we have money and appetite for services, they can serve us, and we very obviously have more wants than money. It may not be infinite exactly, but its potential is currently unknown.

Also, GDP is deeply tied into both colonialist and imperialist trade practices as well as financial "industries" that "create" value largely by shuffling numbers around until they look bigger. And, of course, it is possibly the most actively manipulated metric in the world, given how much of a country's future is dependent upon it - unless it increases by ~7% a year you're looking at massive job losses and a rapid collapse of the economy, which forces the never-ending invention of new ways of making the numbers look bigger. GDP is a deeply problematic measure in many ways.
What could an alternative be? HDI-adjusted GDP?

This is of course all looking past the boom-bust cycle that's built into our current economic systems.
Or is it really? Couldn't we just have perfect stagnation?


Nope, because talking about those things undermines the ideology they are selling to voters, with ideas like "low taxes are good for you (never mind that the infrastructure you depend upon is falling apart)" and "cutting taxes for the rich creates jobs (despite literally all evidence contradicting this statement)". The wage stagnation we've seen in most of the Global North in the past 4-5 decades, plus the ever-expanding wealth gap, is the product of a conscious and willed process. And sadly people keep voting for these idiots.
It seems that you think socialism or some similar ideology would be the right way forward.

Yeah, as I've said, they're actively focusing on "high ASP" (as in: expensive) components during the combined supply crunch and demand spike. Which is exactly why they're making a killing. There's absolutely a truth to them being supply limited, but they are also making conscious choices to only put out the most expensive and profitable products from what supply they have (as are Intel, as are Nvidia, etc., etc.).
I disagree with you about Intel. Unlike AMD, Intel produces far more chips, and they seem to have Pentiums and Celerons available - if you look for those, you can find them at reasonable prices. And nVidia is selling GT 710s and GT 730s new; not great, but at least better than literally nothing from AMD.


RTG is shut down though, they were integrated after Raja left AFAIK. And they do seem to have turned a corner since RDNA - yes, the 5700 XT was buggy and problematic, but they're making great architectures with a ton of potential, and their marketing has gone from "overpromise and underdeliver" to "pretty accurate for PR". I just hope they keep developing in this direction, but also that they don't suddenly start expecting these price levels to stay normal going forward. If that's the case, I'm just going full console gamer after this PC gets too slow.
The 5700 XT had stock voltage issues and many cards were affected by that - maybe as much as 10% of them, if not more. I'm honestly not sure about RTG (I'm gonna call them that, beats calling them AMD's graphics card division). Beyond the 5000 series, I haven't seen anything with even a tiny bit of appeal. I'm completely priced out of the market currently (it didn't take much - once prices climbed past 300 EUR, I was done; 200 EUR is my ideal graphics card budget) and I'm getting more and more cynical. The 5000 series was already weak on budget cards: they only made the 5500 XT, which was an underwhelming card and didn't gain any traction. I have to admit that I quite liked the 5500 XT for some reason, but my RX 580 serves me well.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Then why don't all of them have just Vega 3 instead of 7 and 11?
As long as the price difference compared to regular CPUs is negligible, I'm okay with having a bigger iGPU in them. ;)

Or are they actually? By great, I guess you mean a solid 60 fps, at least 1080p, and at least high settings. I can say that the GTX 650 Ti could actually run most old games well, but some it couldn't. And then there is Crysis, which the GTX 650 Ti only properly ran at 1024x768 medium. By solid I mean not dipping below 40 fps. Cards like the GT 730 GDDR5 can't do that - the GT 730 struggles in Far Cry. The GT 1030 might be the lowest-end card that is actually great at old-school games. And that's Vega 11 territory.

The GTX 650 Ti also managed to play old-school games with nVidia control panel enhancements, but only in non-demanding games did it achieve a solid 60 fps with some supersampling. The RX 580 can do some supersampling (or SSAA) in Colin McRae 2005 (up to 4x at 1280x1024), but it can't provide perfect performance in Unreal Tournament 2004 - though it could run it well with adaptive 8x MSAA.

I will lower the bar somewhat and say that old games running at 1080p, a solid 60 fps and at least high settings (minus odd stuff like PhysX, which still destroys performance) is a good experience, and a card or iGPU that can do that is great. But I really doubt that Vega 7 or 11 can pull that off. Solid 60 fps to me is 0.1% lows of no less than 45 fps and 1% lows of no less than 55 fps. Games like F.E.A.R. are old, but still demanding on lower-end hardware. I can say that Vega 7 most likely can't run it well at high settings at 1080p. Far Cry 2 most likely wouldn't run well on Vega 11. UT3 may not even run well at medium settings. And Crysis wouldn't be playable at anything above low and 1024x768. Mafia II also likely wouldn't run well at anything above 1080p low.

All in all, iGPUs most likely aren't great for older games. They are not great, and their potential is limited. Instead of saying that they are great, old or undemanding games are just all that they can do without lagging badly. Regarding old games, there might also be some odd bugs and glitches on modern hardware, which further keeps them from being great. Another thing is that something like Vega 7 might actually be beaten by a GT 1010.

You might say that it's pointless to compare a free iGPU with a dedicated card, but back in socket FM2 days, APUs had no problem wiping the floor with GT x10 GDDR5 cards. A cheap A8 APU performed several times better than a GT 710 GDDR5. Even an A6 or A4 APU may have been faster than the GT 710 GDDR5. But even then they weren't really worth it. The budget banger was the cheapest Athlon X4 and a Radeon HD 7750 - meaning an Athlon X4 740 (MSRP of 71 USD) and then 109 USD for the card. That's 180 USD for both, and for a bit less you could get the 6800K, which was weaker. If you added 20-30 USD to the budget, you got a 7770, which had like 100 more stream cores - an easy 20% performance improvement. Or for 10 USD more, you could get the Athlon X4 750K and overclock it by 600 MHz with the stock cooler, which beats the 760K. Unlike the APU, the Radeon 7750 can still run some games at 1080p low-medium and 60 fps; the APU was already struggling with that in 2014. And that's a good scenario.

All dual-core APUs were e-waste from the day they were made, as their CPU portion was simply too slow to play games well, and their GPU portion was usually half of an A10's or A8's. I personally own an A4 6300 and an A6 7400K and have benched them extensively. Neither is great or even really acceptable in the games of their era, and they are really slow in day-to-day stuff - they aged like milk. And there's no argument to make about adding a dedicated card later either. There was some value in the cheapest quad-core APUs, as they made cheap daily drivers for work and could run old games, but if you wanted to play games in any capacity, the Radeon HD 7750 was a complete no-brainer. Along with the 7750, there was also the GTX 650.
Let the TPU review of the 5600G speak for itself. Its iGPU matches, or even beats, the GT 1030 in almost every game, while it currently costs exactly the same as the 5600X in the UK. You might argue that the 5600X has a bigger L3 cache, but considering that a GT 1030 costs between £80-100, you can't argue against the 5600G offering great value. You just simply can't.

Let's not bring the GT 710 and 730 into the picture, either. The GT 1030 is an entirely different class of GPU. I know, I've owned all of these.

Same with the Radeon HD 7000 series, which doesn't even have driver support anymore.

Good for you; chips with integrated graphics are scalped in Lithuania too. They can easily cost 40% more than the F variants and therefore barely make sense, but during these times, it seems that having a display adapter is getting expensive. I guess that beats buying a brand new GT 710.
Exactly.

All in all, I stand by my argument that APU value was always questionable for gamers, even cheap gamers. And there wasn't much point in pinching pennies that badly, since even a modest library of games would have made those savings on HW look silly and insignificant, and yet they would have made your gaming experience noticeably worse, if not outright garbage. That is, unless you pirate all your games - then maybe penny-pinching makes sense. But then again, the 7750 and 7770 were selling with codes for several legit AAA games. So despite saving a few dollars on HW at the cost of gaming experience, APUs still made little sense if you wanted to stay legal.
Just read what I said please:
APUs aren't primarily meant for AAA gaming. With a modern APU, you essentially get a CPU and a GPU that has all the video decoding capabilities you need in a HTPC, usually for the price of a CPU alone. How is that not awesome value? ;)
The ability to play games at modest settings on an AMD APU is an extra feature.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
As long as the price difference compared to regular CPUs is negligible, I'm okay with having a bigger iGPU in them. ;)
But that is clearly not the case past Vega 7/8. The Ryzen 2400G was overpriced.

Let the TPU review of the 5600G speak for itself. Its iGPU matches, or even beats, the GT 1030 in almost every game, while it currently costs exactly the same as the 5600X in the UK.
Well, that's actually cool and surprising. I don't wanna be a dick again, but that's another review where the GT 1030 model is not specified. I have looked at YT and it seems that the GT 1030 is matched by the new APUs (Vega 7/8), but FFS, why don't reviewers state which model they have? The DDR4 and GDDR5 versions are not even close. Here's a video of both:

To be honest, they are pretty much the same - the GT 1030 has an edge in GTA, but then the 5600G has an edge in AC or Hitman. It seems that TPU tested the GDDR5 version.


You might argue that the 5600X has a bigger L3 cache, but considering that a GT 1030 costs between £80-100, you can't argue against the 5600G offering great value. You just simply can't.
I guess I can't. But if you say that just getting a display adapter with a good chip is all you need, then the 11400 exists. If you are on an extreme budget and want to play games, then an Ali Xeon + whatever you can find on eBay for 150 EUR - maybe a GTX 960. Caseking had the RX 550 2GB for 110-130 EUR, and I saw some RX 550s for 86 EUR locally, new. And if you only need a display adapter with just a reasonable chip, then there's the i3 10100. So I'm not so sure the 5600G is such a great deal. It may be for some people, but there are other options at a similar or even lower budget. Let's not forget that the iGPU also steals some RAM to use as VRAM, 16GB is the bare minimum, and the iGPU prefers fast RAM. That's something to consider if your RAM market is poor.


Let's not bring the GT 710 and 730 into the picture, either. The GT 1030 is an entirely different class of GPU. I know, I've owned all of these.
As display adapters they all work.

Same with the Radeon HD 7000 series, which doesn't even have driver support anymore.
So what? They have drivers; they're simply not updated anymore. That really isn't a big deal. But performance might be - here is the 7750:

It's certainly not as much e-waste as one would have thought, and they're probably not expensive at this point.


Just read what I said please:

The ability to play games at modest settings on an AMD APU is an extra feature.
If you need a chip just for videos, then it's an awful deal - 300+ EUR is way too much for that. Just get a Celeron and be done with it. AMD A4 chips used to cost 40 EUR in the past, and they can still run some games. Far Cry at the intro ran at 1080p Ultra at 54 fps, Dirt 3 ran at 900p medium-high at 50 fps, Duke Nukem Forever ran at 1080p high (no AA or post-processing) at 34 fps, and UT 2004 at Ultra and 1080p ran at 103 fps. The A4 6300 beat the ATi X800 XT Platinum Edition, so if you look at older titles, even a modest APU can most likely run them. The A4 6300 was good up to 2008 games, but it could run WRC 4 at 800x600 at an unstable 40 fps, so it was survivable until 2014. The experience isn't great, and neither would it be on a Ryzen APU today. That's why sticking a big iGPU in is mostly pointless, and if you want something actually respectable - something that can actually run games and keep doing so for a while - then getting a graphics card is a no-brainer. I don't see much point in Vega 11 existing. Maybe it's reasonable for modest heterogeneous computing or video transcoding tasks, but as a display adapter it simply costs way too much. Well, it's better than buying a Ryzen and a GT 710, which is a meme card at this point and to some extent fails to be a proper display adapter, but is that all that the 5600G/5700G can beat in value?

Even committing the sin of buying one part earlier and another later:

That is pointless with the 5600G/5700G - just buy a 10400/10500/11400/11500 and save some cash for that GPU. I just don't see a point in these two Ryzen APUs; they underdeliver and cost way too much. C19 changes nothing about their horrendous value proposition. At least where I live, the 10500 costs 208 EUR and the 5600G costs 313 EUR, and there is the 10100 for 125 EUR. And that 313 EUR is an exceptional deal for the Ryzen - usually it costs 320 EUR. You can even find the 11500 for 200 EUR. So where's the value in those Ryzens?
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I guess I can't. But if you say that just getting a display adapter with a good chip is all you need, then the 11400 exists. If you are on an extreme budget and want to play games, then an Ali Xeon + whatever you can find on eBay for 150 EUR - maybe a GTX 960. Caseking had the RX 550 2GB for 110-130 EUR, and I saw some RX 550s for 86 EUR locally, new. And if you only need a display adapter with just a reasonable chip, then there's the i3 10100. So I'm not so sure the 5600G is such a great deal. It may be for some people, but there are other options at a similar or even lower budget. Let's not forget that the iGPU also steals some RAM to use as VRAM, 16GB is the bare minimum, and the iGPU prefers fast RAM. That's something to consider if your RAM market is poor.
Yes, the 11400 is a great HTPC chip as well. I agree that AMD's offerings are a bit overpriced for such use cases, but for what you're getting for the money, they are great products. 6-core Zen 3 with a GT 1030 level iGPU for less than £300 is good. What you use it for is a different story. The 5300G would be a sweet product - it's a shame it's unavailable through retail.

eBay/AliExpress are totally not descriptive of current (new) market conditions. Everyone knows that the value and quality of anything used varies greatly among individual items, so arguments based on cherry-picked deals on used stuff are irrelevant and not helpful. In context: I bought my RTX 2070 on eBay, but I wouldn't recommend that everyone do the same.

As display adapters they all work.
The 3090 works as a display adapter as well. Why is it so expensive, then?

Seriously, comparing the GT 710 and 730 to the 1030 is pointless. Different generation, different video decoder, different performance class. Anno 2021, I can't recommend a Kepler GPU for home theatre purposes. It lacks a 4k 60 Hz output, and it lacks support for decoding increasingly widespread codecs like H.265. Such videos stutter on a GT 710 even at 1080p.

So what? They have drivers; they're simply not updated anymore. That really isn't a big deal. But performance might be - here is the 7750:

It's certainly not as much e-waste as one would have thought, and they're probably not expensive at this point.
Same again: outdated HDMI/DP standards, lack of support for modern video codecs. Maybe OK for oldschool gaming, but useless as a modern HTPC card.

Also, see my comment on used stuff above.

If you need a chip just for videos, then it's an awful deal - 300+ EUR is way too much for that. Just get a Celeron and be done with it. AMD A4 chips used to cost 40 EUR in the past, and they can still run some games. Far Cry at the intro ran at 1080p Ultra at 54 fps, Dirt 3 ran at 900p medium-high at 50 fps, Duke Nukem Forever ran at 1080p high (no AA or post-processing) at 34 fps, and UT 2004 at Ultra and 1080p ran at 103 fps. The A4 6300 beat the ATi X800 XT Platinum Edition, so if you look at older titles, even a modest APU can most likely run them. The A4 6300 was good up to 2008 games, but it could run WRC 4 at 800x600 at an unstable 40 fps, so it was survivable until 2014. The experience isn't great, and neither would it be on a Ryzen APU today. That's why sticking a big iGPU in is mostly pointless, and if you want something actually respectable - something that can actually run games and keep doing so for a while - then getting a graphics card is a no-brainer. I don't see much point in Vega 11 existing. Maybe it's reasonable for modest heterogeneous computing or video transcoding tasks, but as a display adapter it simply costs way too much. Well, it's better than buying a Ryzen and a GT 710, which is a meme card at this point and to some extent fails to be a proper display adapter, but is that all that the 5600G/5700G can beat in value?

Even committing the sin of buying one part earlier and another later:

That is pointless with the 5600G/5700G - just buy a 10400/10500/11400/11500 and save some cash for that GPU. I just don't see a point in these two Ryzen APUs; they underdeliver and cost way too much. C19 changes nothing about their horrendous value proposition. At least where I live, the 10500 costs 208 EUR and the 5600G costs 313 EUR, and there is the 10100 for 125 EUR. And that 313 EUR is an exceptional deal for the Ryzen - usually it costs 320 EUR. You can even find the 11500 for 200 EUR. So where's the value in those Ryzens?
So you're saying that an old A4 6300 is good because it can run UT 2004, but the new Ryzen G-series is rubbish because... why exactly? You're not making much sense. :wtf:

I agree that a modern Celeron is perfect for watching movies. I still think AMD's APUs offer good value if you want to include some casual low-spec gaming as well.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Yes, the 11400 is a great HTPC chip as well. I agree that AMD's offerings are a bit overpriced for such use cases, but for what you're getting for the money, they are great products. 6-core Zen 3 with a GT 1030 level iGPU for less than £300 is good. What you use it for is a different story. The 5300G would be a sweet product - it's a shame it's unavailable through retail.
The 5600G would be cool at 200 EUR, not 300 EUR.


eBay/AliExpress are totally not descriptive of current (new) market conditions. Everyone knows that the value and quality of anything used varies greatly among individual items, so arguments based on cherry-picked deals on used stuff are irrelevant and not helpful. In context: I bought my RTX 2070 on eBay, but I wouldn't recommend that everyone do the same.
But you would totally buy a cheap Xeon. There are 12-core or better Xeons for a bit over 100 EUR. That's just a deal that's hard to pass up. They are Haswell too, so not completely ancient. Boards are a problem to get, though, and definitely the sketchy part of building a Xeon rig.


The 3090 works as a display adapter as well. Why is it so expensive, then?
Maybe it was never meant to be one?


Seriously, comparing the GT 710 and 730 to the 1030 is pointless. Different generation, different video decoder, different performance class. Anno 2021, I can't recommend a Kepler GPU for home theatre purposes. It lacks a 4k 60 Hz output, and it lacks support for decoding increasingly widespread codecs like H.265. Such videos stutter on a GT 710 even at 1080p.
It doesn't matter whether you can recommend them or not - they are still sold new, as display adapters. They are supposed to do the same stuff. That they suck at it is a different matter.

Same again: outdated HDMI/DP standards, lack of support for modern video codecs. Maybe OK for oldschool gaming, but useless as a modern HTPC card.
I found the 7750 port specs: HDMI is 1.4a and supports 1440p, and DisplayPort supports 4K. The Radeon HD 7750 supports UVD 4.0 and VCE 1.0, which lets you decode H.264 without problems, but H.265, VP9 and AV1 are not decodable. Only H.265 is really a problem. Polaris doesn't decode AV1 or VP9 either, which are used on YouTube. I have been using an RX 580 for a while and everything works quite okay - I can't play 4K60 videos, but 4K videos work fine. So despite that being a very concrete spec, it doesn't seem to be entirely correct either. If the worst comes, there's a browser extension called h264ify, which forces YT to use the H.264 codec. That means less video quality, the same bandwidth requirements and severe resolution limitations - topping out at 1080p or 720p. I remember using a GTX 650 Ti with Windows 7 last year, and it could do 1440p well, but 4K was sketchy and 4K60 was just a nope. With Linux (Xubuntu maybe?), the RX 580 could run 8K video; 8K60 was just not runnable at all. So you would need to get a 7750 first to see what it can actually do. The GTX 650 Ti doesn't support H.265 or VP9, but it worked well with YouTube.
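By the way, if you want to know in advance whether a file will hit one of those undecodable codecs, ffprobe (bundled with FFmpeg) can tell you. A minimal sketch - the file name is just a placeholder:

```python
import subprocess

def video_codec(path):
    """Return the codec name of the first video stream, e.g. 'h264', 'hevc', 'vp9', 'av1'."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# 'h264' will hardware-decode on UVD-era cards; 'hevc'/'vp9'/'av1' won't.
print(video_codec("sample.mp4"))  # hypothetical file name
```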


So you're saying that an old A4 6300 is good because it can run UT 2004, but the new Ryzen G-series is rubbish because... why exactly? You're not making much sense. :wtf:
You said that "running games is a bonus"; I'm just putting into perspective what that means. Anything can run games as long as you are willing to accept sacrifices, limit game recency and limit graphics. It's not really a unique selling point that a computer can run games. Ryzen APUs just happen to be faster at it, but the experience with them is often closer to suffering. That's what you get with the A4 6300 too, but cheaper. Sure, the A4 6300 is much worse, but if they both can't properly run modern games, then does it really matter which of them is faster?

My bare minimum settings are 900p low with a 45 fps average and 1% lows not dipping below 20 fps. 720p just looks awful and is certainly not acceptable anymore; I would rather use 1280x1024 than 720p. Anyway, in the most demanding games the 5600G fails to achieve my minimum spec. The card that meets my minimum spec is the RX 560 4GB:

I own it myself and have upgraded from it, because it no longer ran games at 1440p (and yes, it used to do that just fine up until 2017 or 2018). I could still use it if I needed to, but it didn't feel great. Going to a 5600G would mean some very real sacrifices - going from a so-so experience to an unpleasant one. I would rather get an i3 10100F and buy a used RX 580 for 200 EUR.
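Since 1% and 0.1% lows keep coming up, here's a minimal sketch of how such figures are commonly computed from a frametime log (Python; exact definitions vary between reviewers - this version averages the worst N% of frames, and the sample data is made up for illustration):

```python
def percentile_low_fps(frametimes_ms, percent):
    """Average the worst `percent`% of frametimes and convert to FPS."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * percent / 100))
    return 1000.0 / (sum(worst[:n]) / n)

# Made-up capture: mostly ~60 fps frames with a few heavy stutters.
frames = [16.7] * 990 + [33.3] * 9 + [50.0]

print(f"Average:  {1000 * len(frames) / sum(frames):.1f} fps")
print(f"1% low:   {percentile_low_fps(frames, 1):.1f} fps")
print(f"0.1% low: {percentile_low_fps(frames, 0.1):.1f} fps")
```

The point being: an average can look fine while the lows reveal the stutter.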

I agree that a modern Celeron is perfect for watching movies. I still think AMD's APUs offer good value if you want to include some casual low-spec gaming as well.
Not sure what casual low-spec gaming means. Even Intel GMA is good for Minesweeper; for me, casual gaming still means something 3D, usually with similar HW requirements to non-casual games. Firewatch may work on the 5600G, but it's only 2-3 hours long. Plenty of cheap or free indie games barely run, or don't run, on a GTX 650 Ti. Something like Genshin Impact might run at 900p medium on the 5600G, but it's hardly a casual game - you need it to run well to properly land attacks. Therefore my minimum performance spec barely works for a game like this.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The 5600G would be cool at 200 EUR, not 300 EUR.
Link. If it's 200 EUR in Lithuania, all the better. :)

But you would totally buy a cheap Xeon. There are 12-core or better Xeons for a bit over 100 EUR. That's just a deal that's hard to pass up. They are Haswell too, so not completely ancient. Boards are a problem to get, though, and definitely the sketchy part of building a Xeon rig.
No, I would not buy a cheap Xeon. Server chips are usually clocked significantly lower than consumer ones, and there's no way I could use 12 weak cores.

It doesn't matter whether you can recommend them or not - they are still sold new, as display adapters. They are supposed to do the same stuff. That they suck at it is a different matter.
Are you saying that I should recommend them for HTPC purposes because they are still sold new as display adapters? Are you aware that we're talking about two entirely different things?

I found the 7750 port specs: HDMI is 1.4a and supports 1440p, and DisplayPort supports 4K. The Radeon HD 7750 supports UVD 4.0 and VCE 1.0, which lets you decode H.264 without problems, but H.265, VP9 and AV1 are not decodable. Only H.265 is really a problem. Polaris doesn't decode AV1 or VP9 either, which are used on YouTube. I have been using an RX 580 for a while and everything works quite okay - I can't play 4K60 videos, but 4K videos work fine. So despite that being a very concrete spec, it doesn't seem to be entirely correct either. If the worst comes, there's a browser extension called h264ify, which forces YT to use the H.264 codec. That means less video quality, the same bandwidth requirements and severe resolution limitations - topping out at 1080p or 720p. I remember using a GTX 650 Ti with Windows 7 last year, and it could do 1440p well, but 4K was sketchy and 4K60 was just a nope. With Linux (Xubuntu maybe?), the RX 580 could run 8K video; 8K60 was just not runnable at all. So you would need to get a 7750 first to see what it can actually do. The GTX 650 Ti doesn't support H.265 or VP9, but it worked well with YouTube.
So according to you, buying a used GPU on ebay and being forced to employ tricks for smooth video playback is better than buying an APU with a half-decent iGPU that can play all your videos natively for the price of a CPU alone?

You said that "running games is a bonus"; I'm just putting into perspective what that means. Anything can run games as long as you are willing to accept sacrifices, limit game recency and limit graphics. It's not really a unique selling point that a computer can run games. Ryzen APUs just happen to be faster at it, but the experience with them is often closer to suffering. That's what you get with the A4 6300 too, but cheaper. Sure, the A4 6300 is much worse, but if they both can't properly run modern games, then does it really matter which of them is faster?
How can you even compare an APU from 2013 to the 5600G? The A4 6300 has a weak CPU, an even weaker iGPU and, again, no support for the latest codecs and display standards. Even if you found one on eBay, buying it for a modern HTPC would be a total waste.

It's better to spend 200-300 EUR on something useful than 100 EUR on rubbish.

My bare minimum settings are 900p low with a 45 fps average and 1% lows not dipping below 20 fps. 720p just looks awful and is certainly not acceptable anymore; I would rather use 1280x1024 than 720p. Anyway, in the most demanding games the 5600G fails to achieve my minimum spec. The card that meets my minimum spec is the RX 560 4GB:

I own it myself and have upgraded from it, because it no longer ran games at 1440p (and yes, it used to do that just fine up until 2017 or 2018). I could still use it if I needed to, but it didn't feel great. Going to a 5600G would mean some very real sacrifices - going from a so-so experience to an unpleasant one. I would rather get an i3 10100F and buy a used RX 580 for 200 EUR.
That is your minimum spec. I used to play The Witcher 3 at 1080p medium-low on a GT 1030 without issues. If the 5600G's iGPU is anywhere near that level (according to reviews it is), it's fine. Not great, but fine. I would certainly not try to do the same on an A4 6300.

Not sure what casual low-spec gaming means. Even Intel GMA is good for Minesweeper; for me, casual gaming still means something 3D, usually with similar HW requirements to non-casual games. Firewatch may work on the 5600G, but it's only 2-3 hours long. Plenty of cheap or free indie games barely run, or don't run, on a GTX 650 Ti. Something like Genshin Impact might run at 900p medium on the 5600G, but it's hardly a casual game - you need it to run well to properly land attacks. Therefore my minimum performance spec barely works for a game like this.
Are you under the assumption that the 5600G can't run 3D games and applications? Maybe you should read some reviews and watch some youtube videos on the topic before we continue this conversation.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Link. If it's 200 EUR in Lithuania, all the better. :)
I was saying that it would be good as a 200 EUR chip, but not as a 300 EUR one.

No, I would not buy a cheap Xeon. Server chips are usually clocked significantly lower than consumer ones, and there's no way I could use 12 weak cores.
Hmmmm....

Either Ali Xeon or i3 10100F.

Are you saying that I should recommend them for HTPC purposes because they are still sold new as display adapters? Are you aware that we're talking about two entirely different things?
Yes I am. The GT 710 still works as a 1080p display adapter.


So according to you, buying a used GPU on ebay and being forced to employ tricks for smooth video playback is better than buying an APU with a half-decent iGPU that can play all your videos natively for the price of a CPU alone?
Yes, but you miss the point. I managed to find a CPU and GPU for the price of just the APU. You can buy an i5 10400F and have 140 EUR left over. That budget is enough for an R9 370 or GTX 960 from eBay. In that quote, I was talking about dirt-cheap cards; you are asking about a different thing here. Anyway, I think it's worth getting the i5 and either of those eBay cards. Better yet, you can look for regional deals - I managed to find a GTX 1050 Ti for the same 140 EUR budget. You employ tricks (well, no tricks, if you reread that previous post) if you buy a used card for next to nothing, but if you spend a reasonable budget, you don't need to. And you still miss the point: the 7750 is nearly as good as the iGPU, while the 1050 Ti can actually play games at acceptable settings and framerates, unlike the iGPU.


How can you even compare an APU from 2013 to the 5600G? The A4 6300 has a weak CPU, an even weaker iGPU and, again, no support for the latest codecs and display standards. Even if you found one on eBay, buying it for a modern HTPC would be a total waste.

It's better to spend 200-300 EUR on something useful than 100 EUR on rubbish.
You miss the point again. This reply was to "being able to run modern games". In that one aspect, the A4 and the 5600G aren't too different.


That is your minimum spec. I used to play The Witcher 3 at 1080p medium-low on a GT 1030 without issues. If the 5600G's iGPU is anywhere near that level (according to reviews it is), it's fine. Not great, but fine. I would certainly not try to do the same on an A4 6300.
My minimum spec is 900p low, so that's significantly worse than your GT 1030 experience in Witcher.


Are you under the assumption that the 5600G can't run 3D games and applications? Maybe you should read some reviews and watch some youtube videos on the topic before we continue this conversation.
All of them at my defined minimum spec? Well, it certainly fails to do that. 1600x900 at low settings with a 45 fps average wasn't much to ask in 2009; it certainly isn't much to ask in 2021.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I was saying that it would be good as a 200 EUR chip, but not as a 300 EUR one.
I only agree with this because Intel's 6-core CPUs are selling at that price point. It's not only APUs that are overpriced on AMD's side at the moment, but everything.

Hmmmm....

Either Ali Xeon or i3 10100F.
My money is on the i3 in this matter. I was talking about video playback, you were talking about games. Both of these tasks make better use of fewer but faster cores.

Yes I am. The GT 710 still works as a 1080p display adapter.
You're talking about a display adapter. I'm talking about a graphics card for a HTPC. I believe I've already explained how these are two totally different things:
Anno 2021, I can't recommend a Kepler GPU for home theatre purposes. It lacks a 4k 60 Hz output, and it lacks support for decoding increasingly widespread codecs like H.265. Such videos stutter on a GT 710 even at 1080p.
Being able to send out a 1080p signal on Windows desktop is not enough. Let's leave it at that.

Yes, but you miss the point. I managed to find a CPU and GPU for the price of just the APU. You can buy an i5 10400F and have 140 EUR left over. That budget is enough for an R9 370 or GTX 960 from eBay. In that quote, I was talking about dirt-cheap cards; you are asking about a different thing here. Anyway, I think it's worth getting the i5 and either of those eBay cards. Better yet, you can look for regional deals - I managed to find a GTX 1050 Ti for the same 140 EUR budget. You employ tricks (well, no tricks, if you reread that previous post) if you buy a used card for next to nothing, but if you spend a reasonable budget, you don't need to. And you still miss the point: the 7750 is nearly as good as the iGPU, while the 1050 Ti can actually play games at acceptable settings and framerates, unlike the iGPU.
Again, you're talking about ebay. I'm talking about new. You can't say that AMD/Intel/nvidia's pricing of current gen products is bad just because you found something old that you like on ebay.

With this, I will stop responding to comments about ebay, as they are totally irrelevant.

This reply was to "being able to run modern games". In that one aspect, the A4 and the 5600G aren't too different.
Yes they are.

My minimum spec is 900p low, so that's significantly worse than your GT 1030 experience in Witcher.
Then you should have no problem playing it with a GT 1030 or a 5600G.

All of them at my defined minimum spec? Well, it certainly fails to do that. 1600x900 at low settings with a 45 fps average wasn't much to ask in 2009; it certainly isn't much to ask in 2021.
That depends on the game. Also, just because a chip doesn't meet your desired specs in X game, it doesn't mean that it's a failed product altogether.
 
Joined
Dec 14, 2019
Messages
1,210 (0.66/day)
Location
Loose in space
System Name "The black one in the dining room" / "The Latest One"
Processor Intel Xeon E5 2699 V4 22c/44t / i7 14700K @5.8GHz
Motherboard Asus X99 Deluxe / ASRock Z790 Taichi
Cooling Arctic Liquid Freezer II 240 w/4 Silverstone FM121 fans / Arctic LF II 280 w Silverstone FHP141's
Memory 64GB G.Skill Ripjaws V DDR4 2400 (8x8) / 96GB G.Skill Trident Z5 DDR5 6400
Video Card(s) EVGA GTX 1080 Ti FTW3 / Asus Tuff OC 4090 24GB
Storage Samsung 970 Evo Plus, 1TB Samsung 860, 4 Western Digital 2TB / 2TB Solidigm P44 Pro & more.
Display(s) 43" Samsung 8000 series 4K / 65" Hisense U8N 4K
Case Modded Corsair Carbide 500R / Modded Corsair Graphite 780 T
Audio Device(s) Asus Xonar Essence STX/ Asus Xonar Essence STX II
Power Supply Corsair AX1200i / Seasonic Prime GX-1300
Mouse Logitech Performance MX, Microsoft Intellimouse Optical 3.0
Keyboard Logitech K750 Solar, Logitech K800
Software Win 10 Enterprise LTSC 2021 IoT / Win 11 Enterprise IoT LTSC 24H2
Benchmark Scores https://www.passmark.com/baselines/V11/display.php?id=202122048229
A close friend lives near a Micro Center and wants me to fix a Toyota Prius he fux0red the electrical system on next week. I'm having him pick me up a 12700K as partial payment for the labor. As soon as Newegg gets the ASRock Z690 Steel Legend WiFi 6E back in stock, I'm getting one of those to put the 12700K in. Combined with the 64GB of G.Skill TridentZ 3600 CAS 16 B-die I already have, along with the rest of the parts needed to build another new rig, I'll be set for a while at a very low cost, since all I have to buy is the mobo ($269), and that uses DDR4. The 12700K has the same number of P cores as the 12900K but 4 fewer E cores, meaning it sucks less power with close to the same performance. I'll be using an Arctic Liquid Freezer II 360 on it, so cooling won't be an issue either.

 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Again, you're talking about ebay. I'm talking about new. You can't say that AMD/Intel/nvidia's pricing of current gen products is bad just because you found something old that you like on ebay.

With this, I will stop responding to comments about ebay, as they are totally irrelevant.
Don't like eBay and worry you'll catch polio from used deals? Well, there is one last-resort deal where I live:

The Quadro T600 and i3 10100F - the last resort for modest 1080p gaming. The Quadro T600 is a cut-down GTX 1650, but to compensate, it got faster VRAM. It has 4GB of VRAM, which is pretty much the minimum for gaming today, and it can play any video. So it is similar to an APU in that the setup is very power efficient. You pay with CPU power to fit into the budget, but hey, that's nearly a 1650. It's actually a really cool card during these times, and it's selling at pretty much its MSRP.
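The "faster VRAM" part is easy to sanity-check, since memory bandwidth is just bus width times per-pin data rate. A quick sketch (Python; the spec numbers below are from memory, so treat them as approximate):

```python
# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# Card specs below are quoted from memory and worth double-checking.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(f"GTX 1650 (128-bit GDDR5 @ 8 Gbps):     {bandwidth_gbs(128, 8):.0f} GB/s")
print(f"Quadro T600 (128-bit GDDR6 @ 10 Gbps): {bandwidth_gbs(128, 10):.0f} GB/s")
```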

And very importantly, that combo is still 3 EUR cheaper than the Ryzen APU:

And if you complete a whole build, it's actually very cheap overall - and unlike with the APU, your RAM isn't hijacked by the iGPU, so the full 16GB is left to the CPU.


Then you should have no problem playing it with a GT 1030 or a 5600G.
No, not really. Any card or APU should do that in modern AAA games. The 5600G falls apart in Cyberpunk, for example, and there are some games that aren't playable at those settings. The T600 has a much better chance.


That depends on the game. Also, just because a chip doesn't meet your desired specs in X game, it doesn't mean that it's a failed product altogether.
My initial point was that APUs never really made much sense for gaming. You are always better off with something dedicated, even if lower end.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
A close friend lives near a Micro Center and wants me to fix a Toyota Prius he fux0red the electrical system on next week. I'm having him pick me up a 12700K as partial payment for the labor. As soon as Newegg gets the ASRock Z690 Steel Legend WiFi 6E back in stock, I'm getting one of those to put the 12700K in. Combined with the 64GB of G.Skill TridentZ 3600 CAS 16 B-die I already have, along with the rest of the parts needed to build another new rig, I'll be set for a while at a very low cost, since all I have to buy is the mobo ($269), and that uses DDR4. The 12700K has the same number of P cores as the 12900K but 4 fewer E cores, meaning it sucks less power with close to the same performance. I'll be using an Arctic Liquid Freezer II 360 on it, so cooling won't be an issue either.
Congrats! :) Post your experiences, please! :D

Personally, I have nothing against ebay. The reason I wouldn't recommend anyone (especially new PC builders) to buy their stuff there is this:
1. Deals are temporary. What you posted may be there today, but may disappear by tomorrow. As such, it does not represent the actual value of the product in general. It only represents the value of that particular piece at that particular moment. You cannot say that the Quadro T600 is a good deal in general just because you found one on a local second-hand selling site.
2. No warranty.
3. You don't know what you get until you get it. Maybe it's good as new. Maybe it's f***ed. Maybe it's dirty as hell and you spend a whole day cleaning it. Maybe the person you're recommending it to doesn't know how to take a graphics card apart for cleaning. Then again, it's not a good deal for that person.
4. You're buying from a person instead of a registered company. Sure, you have buyer's protection on ebay, but you never have 100% protection against scammers.

All in all, if you have a friend who knows his/her stuff about computers, sure, recommend him/her a good deal on ebay. But even then, that's one good deal, and not a picture of the current PC market. Here on TPU, we all have varying levels of understanding about PCs. Some of us have been building them for decades. Some of us are just getting into it now. Recommending ebay deals to people in general is a bad idea.

Like I said, I bought my RTX 2070 on ebay. Would I recommend you to do the same? Maybe, as you know a thing or two about PCs. Would I recommend it in general? Hell no.

Prices of new hardware and the second-hand market are two entirely different things.

No, not really. Any card or APU should do that in modern AAA games. The 5600G falls apart in Cyberpunk, for example, and there are some games that aren't playable at those settings. The T600 has a much better chance.
Why should a £280 APU play any modern game?

My initial point was that APUs never really made much sense for gaming. You are always better off with something dedicated, even if lower end.
You really like it when I repeat myself, don't you? ;)
APUs aren't primarily meant for AAA gaming. With a modern APU, you essentially get a CPU and a GPU that has all the video decoding capabilities you need in a HTPC, usually for the price of a CPU alone. How is that not awesome value? ;)
The ability to play games at modest settings on an AMD APU is an extra feature.
 