
AMD Ryzen 7 5700G

Nice review as usual @W1zzard! And that product isn't bad at all when sold at a slight discount vs its MSRP. For anyone not using a top-notch GPU (>$500 MSRP), it loses on average close to 5% of gaming performance vs the best CPU while using much less wattage and being much easier to cool.
 
Nice review as usual @W1zzard! And that product isn't bad at all when sold at a slight discount vs its MSRP. For anyone not using a top-notch GPU (>$500 MSRP), it loses on average close to 5% of gaming performance vs the best CPU while using much less wattage and being much easier to cool.
Given that the 5800X and Intel's 10th- and 11th-gen 8-cores can now be found at a discount regularly, these might be discounted soon enough.
 
I disagree. By the time you cut Vega down to 6 CUs, the performance is closer to a GT 1030, and if you run any kind of emulation, four cores is the bare minimum for doing anything more modern than PS2/GameCube.
Read my post again, please. :) I was talking about HTPC, not emulation and gaming.
 
Excellent.
We can use Fluid Motion, which is very power-efficient but still gives you extremely smooth video.
Bluesky Frame Rate Converter and this APU will keep me entertained for a long, long time!
 
Nice review as usual @W1zzard! And that product isn't bad at all when sold at a slight discount vs its MSRP. For anyone not using a top-notch GPU (>$500 MSRP), it loses on average close to 5% of gaming performance vs the best CPU while using much less wattage and being much easier to cool.
Unpopular opinion, I know, but I've actually found Intel's 11th gen easier to cool than Zen 2 (I'm not sure how Zen 3 compares, as I've only tried the 5950X with a 240 mm AIO). The 3600 I tried once nearly overheated in the small office case (in my signature) even with a be quiet! Shadow Rock LP cooler while maxing out its 88 W power target; I had to manually lock its PPT to 65 W to be anywhere near usable. The 11700, by comparison, could easily run at around a 100 W PL1 in the same setup. Not to mention, Intel's 14 nm inefficiency only shows when you're using the whole CPU. I've found power consumption during gaming to be quite modest (60-80 W maximum during Cyberpunk 2077 with around 50% usage and 4.1-4.3 GHz).

I don't disagree that Zen 3 is awesome, but 10th- and 11th-gen Intel isn't as bad as people tend to believe.
 
Unpopular opinion, I know, but I've actually found Intel's 11th gen easier to cool than Zen 2 (I'm not sure how Zen 3 compares, as I've only tried the 5950X with a 240 mm AIO). The 3600 I tried once nearly overheated in the small office case (in my signature) even with a be quiet! Shadow Rock LP cooler while maxing out its 88 W power target; I had to manually lock its PPT to 65 W to be anywhere near usable. The 11700, by comparison, could easily run at around a 100 W PL1 in the same setup. Not to mention, Intel's 14 nm inefficiency only shows when you're using the whole CPU. I've found power consumption during gaming to be quite modest (60-80 W maximum during Cyberpunk 2077 with around 50% usage and 4.1-4.3 GHz).

I don't disagree that Zen 3 is awesome, but 10th- and 11th-gen Intel isn't as bad as people tend to believe.
GN recently reviewed the OEM-only 5800 non-X, one of the most efficient CPUs right now.
The problem with a lot of desktop boards is that they want to shove as much power into the CPU as possible so that their boards bench higher.
Some boards back at Zen 2's launch even under-reported the actual power draw of the CPU to trick it into drawing more power.
 
Good review and interesting to see the results.

The power consumption numbers are a little odd: 150 W vs 126 W for the 5600X in Cinebench, though both are 65 W parts, but 107 W vs 134 W for the 5600X in Prime95.

Maybe the 5600X is not using its full 65 W in Cinebench, but I'd expect both to be at their respective power limits in P95?

 
I'm a bit confused, on the iGPU performance page, it says the following:

"That's also the reason why we include an additional data point, DDR4-3200, to get a feel for how dropping memory speed from DDR4-3800 to the more affordable DDR4-3200 impacts the FPS rates. "

Maybe I'm wrong, but wouldn't the use of the word "additional" mean that the benchmarks at the 3200 MHz RAM speed are IN ADDITION to the 3800 MHz RAM speed benchmarks? So if that's the case, where are the 3800 MHz iGPU benchmarks? I only see 3200 MHz. I really wanted to see if going up to 3600 MHz+ would result in better performance; does anyone know of another review that does this?
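For reference, the raw bandwidth gap is easy to ballpark. A quick Python sketch of the theoretical peak numbers (dual-channel DDR4, 64-bit channels; real-world iGPU scaling also depends on timings and latency):

[CODE=python]
# Theoretical peak bandwidth for dual-channel DDR4 (two 64-bit channels).
# Illustrative only -- actual iGPU scaling also depends on timings/latency.
def ddr4_bandwidth_gbs(mt_per_s, channels=2, bus_bits=64):
    return mt_per_s * channels * (bus_bits / 8) / 1000  # GB/s

for speed in (2666, 3200, 3600, 3800):
    print(f"DDR4-{speed}: {ddr4_bandwidth_gbs(speed):.1f} GB/s")
# DDR4-3200 -> 51.2 GB/s, DDR4-3800 -> 60.8 GB/s, i.e. ~19% more for the iGPU
[/CODE]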
 
GN recently reviewed the OEM-only 5800 non-X, one of the most efficient CPUs right now.
The problem with a lot of desktop boards is that they want to shove as much power into the CPU as possible so that their boards bench higher.
Some boards back at Zen 2's launch even under-reported the actual power draw of the CPU to trick it into drawing more power.
I know, I accounted for deviations - though my Asus TUF B550M-Plus WiFi (and B560M-Plus WiFi) is pretty great with power targets and reporting accuracy.

The problem is, AMD has an 88 W power target on their 65 W TDP CPUs, which is too much for the small chiplets when your airflow is restricted. Intel's larger monolithic dies spread the heat more evenly, resulting in a CPU that's much easier to cool. Of course, reviews only look at CPUs on an open test bench with proper cooling, but the data generated this way isn't representative of SFF cases. Just because a tower cooler can run a CPU doesn't automatically mean that said CPU is good enough for an SFF build with lesser cooling. I learned this the hard way.

Another problem is that AMD's most efficient CPUs are not available for DIY for some reason.
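To put rough numbers on why 88 W is harder than it sounds in a tight case, here's a crude steady-state sketch. The theta_ca value is an assumed example for a low-profile cooler, and a small chiplet's hotspot density makes reality worse than this simple model suggests:

[CODE=python]
# Crude steady-state estimate: T_die ~ T_ambient + P * theta_ca.
# theta_ca (degC per watt, heatsink-to-ambient) is an assumed example value;
# restricted SFF airflow raises the effective internal ambient further.
def die_temp(power_w, t_ambient=35.0, theta_ca=0.45):
    return t_ambient + power_w * theta_ca

for watts in (65, 88, 100):
    print(f"{watts} W -> ~{die_temp(watts):.0f} degC steady state")
[/CODE]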

Good review and interesting to see the results.

The power consumption numbers are a little odd: 150 W vs 126 W for the 5600X in Cinebench, though both are 65 W parts, but 107 W vs 134 W for the 5600X in Prime95.

Maybe the 5600X is not using its full 65 W in Cinebench, but I'd expect both to be at their respective power limits in P95?

AMD's TDP, unlike Intel's, doesn't reflect power consumption. Most 65 W TDP AMD parts actually ship with an 88 W default power limit (PPT).
 
AMD's TDP, unlike Intel's, doesn't reflect power consumption. Most 65 W TDP AMD parts actually ship with an 88 W default power limit (PPT).
PPT is similar to Intel's PL2 in that there is a boost duration for OEM systems.
DIY motherboards tend to just ignore it and let the CPU boost perpetually, which is an issue a lot of SFF builds run into.
This behavior was introduced with Zen 2, after Intel allowed motherboard makers to run MCE.
Zen 1 CPUs at stock strictly obeyed their power rating.
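For reference, AMD has publicly stated that the stock socket limit follows a simple rule of thumb, PPT ≈ 1.35 × TDP (the TDC/EDC current limits are omitted in this sketch):

[CODE=python]
# AMD's stated stock relationship between TDP and the socket power limit:
# PPT ~ 1.35 x TDP (65 W TDP -> 88 W PPT, 105 W TDP -> 142 W PPT).
def stock_ppt(tdp_w):
    return round(tdp_w * 1.35)

for tdp in (65, 105):
    print(f"{tdp} W TDP -> {stock_ppt(tdp)} W PPT")
[/CODE]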
 
AMD's TDP, unlike Intel's, doesn't reflect power consumption. Most 65 W TDP AMD parts actually ship with an 88 W default power limit (PPT).
My point was that both have the same power limit. So why would the 5700G use more power in Cinebench, but then less power than the 5600X in P95 (and also less power than it uses in Cinebench)?
 
My point was that both have the same power limit. So why would the 5700G use more power in Cinebench, but then less power than the 5600X in P95 (and also less power than it uses in Cinebench)?
Sorry, my bad. This is actually an interesting observation.

PPT is similar to Intel's PL2 in that there is a boost duration for OEM systems.
DIY motherboards tend to just ignore it and let the CPU boost perpetually, which is an issue a lot of SFF builds run into.
This behavior was introduced with Zen 2, after Intel allowed motherboard makers to run MCE.
Zen 1 CPUs at stock strictly obeyed their power rating.
I'm not sure about that. AMD doesn't even state what the power target is, as their TDP doesn't have power in the formula. It would be nice to test an OEM system for CPU power consumption and see how it compares to the 88 W PPT every 65 W DIY system gets.
 
^ Yeah, this has always been the problem with 'premium'-priced APUs. $360 is almost 3-4x the price that the 3200G / 3400G were and makes little sense for budget gamers vs buying a cheaper CPU + GPU, unless they really want to pay through the nose for a niche slim ITX build in a case like the InWin Chopin. I bought an i5-10400F for £124 and a GTX 1660 for £159 (total £283) with 4-5x the performance, and certainly wouldn't spend anywhere near the same money on "between GT 1030 and GTX 1050" class performance (which is what this APU has) vs simply buying a cheap 1050 Ti / 1060 on eBay. MOAR CORES doesn't do a thing for low-end gaming with such strong GPU bottlenecks; you just sit there with impressively low CPU usage to match the impressively low frame rates (20-50 fps at 1080p in most games here). And budget gamers tend not to have 3800-speed RAM lying around, so either more money on top for a possible RAM upgrade (or lower performance with typical budget 2666-3200 modules) needs to be factored in too.

What would have been interesting is if AMD had released a cheap 5300G for the same price as the 3200G (£79 at one point) during the worst of the GPU shortages, but they're refusing to sell that even now (outside of OEM), so even after 2-3 years there's still no real "upgrade" to the 3200G / 3400G at anywhere near the same price point. People who can't afford £80 CPUs + £150 GPUs tend not to buy £300+ APUs with half the performance...


Yes, that's exactly how it works. The iGPU "memory size" is just the "window" the game sees. If a game needs 2 GB VRAM but you have it set to 512 MB VRAM, then it will use the extra ~1.5 GB from regular RAM instead of VRAM (which for iGPUs is the same thing). I.e., if a game uses say 2 GB VRAM and 3 GB system RAM, and you lower the APU "VRAM" size from 2 GB to 512 MB in the BIOS, it will appear to use only 512 MB VRAM in MSI Afterburner, etc., but the 3 GB system RAM will increase to 4.5 GB as more system RAM gets used as an "overflow".
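If you want to sanity-check that accounting, here's a toy sketch of the overflow math, using the hypothetical numbers from the example above:

[CODE=python]
# The BIOS "VRAM" size is just a carve-out window; anything a game needs
# beyond it spills into regular system RAM (same physical memory on an APU).
def uma_usage(game_vram_gb, game_ram_gb, carveout_gb):
    spill = max(0.0, game_vram_gb - carveout_gb)
    reported_vram = min(game_vram_gb, carveout_gb)
    return reported_vram, game_ram_gb + spill

vram, ram = uma_usage(game_vram_gb=2.0, game_ram_gb=3.0, carveout_gb=0.5)
print(f"Reported VRAM: {vram} GB, system RAM in use: {ram} GB")  # 0.5 / 4.5
[/CODE]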
You bought a 1660 for £149? When and where, this year?!
While I agree with you on the premise that it's too dear, in this market it probably isn't.
 
Maybe I'm wrong, but wouldn't the use of the word "additional" mean that the benchmarks at the 3200 MHz RAM speed are IN ADDITION to the 3800 MHz RAM speed benchmarks? So if that's the case, where are the 3800 MHz iGPU benchmarks? I only see 3200 MHz. I really wanted to see if going up to 3600 MHz+ would result in better performance; does anyone know of another review that does this?
3800 MHz = green
3200 MHz = brown

reworded the text slightly to help with that
 
Vega is definitely dated, but given the constraints of DDR4, integrated RDNA2+ is best paired with DDR5/LPDDR5. I look forward to AM5 APUs and FP7+ laptops with integrated RDNA2.

You still managed to eke out a 7% increase at a 2.4 GHz gfx clock, though that also improves whole-architecture performance (caches, raster, pixel, geometry engines) before going out to DDR4. Pixel-engine caching in L2 does work in Vega, but it pales in comparison to RDNA2's cache subsystem and overall improvements.

iGPU performance is also heavily impacted by single-core performance at 720-1080p in gaming. I'd be interested in seeing 4.9 GHz+ single-core (like a 5900X), but I don't know if it's doable.

My point was that both have the same power limit. So why would the 5700G use more power in Cinebench, but then less power than the 5600X in P95 (and also less power than it uses in Cinebench)?

If Prime95 is hammering memory, note that the 5600X is an MCM and its IOD is one hop away. The extra power consumption on the 5600X can be explained by the IF PHYs being active between the CCD and the IOD, whereas the 5700G is a monolithic design.
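As a toy illustration only (every wattage split below is invented, just to show how a fixed fabric/IOD cost eats into the core budget at the same limit):

[CODE=python]
# Toy model with made-up numbers: at the same package limit, a chiplet part
# spends a fixed chunk on IF PHYs and a separate IOD, leaving fewer watts
# for the cores than a monolithic die like the 5700G in memory-heavy loads.
PPT_W = 88.0
fabric_overhead_w = {
    "5600X (CCD + IOD over IF)": 20.0,  # assumed value
    "5700G (monolithic)": 12.0,         # assumed value
}

for part, overhead in fabric_overhead_w.items():
    print(f"{part}: ~{PPT_W - overhead:.0f} W left for the cores")
[/CODE]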
 
Have I read this correctly? All 20 PCI Express lanes from the CPU are available (albeit at Gen 3), unlike the 3400G, which only had 12 available? And does that mean that x4/x4/x4/x4 bifurcation is available for the primary PCI Express x16 slot?

doubt it
Care to test this? Just plop a quad-M.2 NVMe adapter card on it and check if all drives show up.
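On Linux, a quick way to check would be counting how many NVMe controllers enumerate with the quad-M.2 riser installed (a sketch assuming the standard /sys/class/nvme layout):

[CODE=python]
# With working x4/x4/x4/x4 bifurcation and a quad-M.2 riser, all four
# controllers should enumerate; without it, typically only the first does.
from pathlib import Path

nvme_class = Path("/sys/class/nvme")
names = sorted(p.name for p in nvme_class.iterdir()) if nvme_class.exists() else []
print(f"{len(names)} NVMe controller(s): {', '.join(names) or 'none'}")
[/CODE]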
 
Care to test this? Just plop a quad-M.2 NVMe adapter card on it and check if all drives show up.
Doesn't that require a special motherboard too?
 
This is great performance for an APU, but really too expensive for what the iGPU offers.
For cheap builds, the high cost just makes no sense, as you can buy a cheaper CPU and put the money saved toward a cheap GPU that's still much faster than the iGPU. For stuff like office PCs, this is plainly overkill anyway.
 
Something you could do on this APU paired with discrete graphics is upscale the desktop resolution from the active signal resolution with AMD Radeon's Virtual Super Resolution, giving you something like an SSAA upscale or DSR rendered on the iGPU. The iGPU hardware would otherwise go untapped, so it's a good way to harness it and make active use of it. It would be a bit like FSR, but with no overhead on the discrete GPU, because the iGPU would take the cloned desktop output from the discrete graphics and then upscale it on the iGPU.

Basically a more powerful mClassic, since I'm sure the APU has much more upscaling processing power than that solution. It's not trying to render the scene, but rather taking individual frames and upscaling them in real time, so you could basically perform AA on the iGPU and save the discrete card the performance cost of doing so.
 
I have a 4650G in an ASRock X300 DeskMini case, so the 5700G isn't enough of an upgrade for me, but these are great APUs. I think people who consider them as a cheap gaming option are missing the point a little; that is what the 5300G is for. The 5700G is for small form factors that can't physically fit a dGPU but need a lot of processing punch. I love that my X300 is the size of my hand.

And if you consider new parts rather than second-hand (which is an unfair comparison; some people or businesses care about warranty), $360 for a pseudo-5700X plus a 1050 equivalent is a bargain. Even a 1030 is still going for $120-130 these days, which is ridiculous. That means you are getting a 5700X equivalent for ~$240 and can use form factors you can't use for an 11400F plus GT 1030 build.
 
Price? Meh.
Product? Oh hell yeah.

The IGP really needs an upgrade (coming from the guy with a 3090...), but there's a LOT of people who need a good CPU with an average IGP, and this fits the bill really well for them.
 
The point of the 5700G is its IGP, but Vega 8 is so dated and shit that it's pointless.

I bought a budget 2500U with dual-channel DDR4-2400 for Christmas in 2017, and almost four years later, all AMD can be bothered to put in their top-end APU is the same shit clocked about 60% higher. The only difference is that faster DDR4 is now available and a desktop socket gets more power budget to play with.

RDNA APUs already exist in AMD's lineup, but we DIY consumers get the leftovers and scrapings!
While I agree, it still doubles Intel's. Also, it's a pretty small IGP for the tier of chip. For the price and 8-core config, I think most users will be using a dedicated card with this anyway.

So yeah, it would have been nicer/better with RDNA/RDNA2, but I don't think it matters too much in regards to the market segment, i.e., compared to Intel.

Now, however, for something that would be an IGP/APU-only system, I would like something like a 3300X CPU with an RDNA2 IGP with 20 CUs (1,280 shaders) for $200-ish. I think that would be nice, especially for HTPCs, barebones gaming machines, etc.
 
I have been running the 4750G in an A300 for a year now to host some VMs.
These really are the best CPUs right now for size-constrained tiny PCs.
 
Doesn't that require a special motherboard too?
Not really; even some X370 boards support bifurcation.
I know the ASRock X370 board I had has that option, and so does my current X399.
 