# AMD Ryzen 7 5700G



## W1zzard (Aug 3, 2021)

With the Ryzen 7 5700G, AMD is finally bringing their most powerful APU to the retail DIY channel. With 512 graphics cores based on the Vega architecture, the IGP is over twice as fast as Intel's Rocket Lake graphics. Thanks to the Zen 3 architecture, the eight CPU cores are blazing fast, too.



----------



## somebodys_kid (Aug 3, 2021)

Have I read this correctly? All 20 PCI Express lanes from the CPU are available (albeit Gen 3), unlike the 3400G, which only had 12 available?  And does that mean that 4x4x4x4 bifurcation is available for the primary PCI Express x16 slot?


----------



## W1zzard (Aug 3, 2021)

somebodys_kid said:


> bifurcation


doubt it


----------



## Salvo39 (Aug 3, 2021)

Was there only 512MB addressable to the iGPU? Do you think iGPU performance will change if more memory is allocated to it?


----------



## W1zzard (Aug 3, 2021)

Salvo39 said:


> Was there only 512MB addressable to the iGPU? Do you think iGPU performance will change if more memory is allocated to it?


Great idea, let me test that


----------



## damric (Aug 3, 2021)

Also, the slide says DDR4-4266 is possible on the new memory controller. I'd love to see you test that if you have any kits available, to see how IGP performance scales.


----------



## londiste (Aug 3, 2021)

@W1zzard, any plans for doing a clock-for-clock comparison between 5700G and 5800X - or maybe 5600G and 5600X since these are more closely matched out of the box?



Salvo39 said:


> Was there only 512MB addressable to the iGPU? Do you think iGPU performance will change if more memory is allocated to it?


Quite sure more is addressable to the iGPU, and the amount of RAM allocated to it does not affect performance (other than scenarios where you run out of it, of course). At least that's how this worked with the old 2400G, and the 5700G should not be that much different when it comes to the iGPU.


----------



## W1zzard (Aug 3, 2021)

londiste said:


> Quite sure more is addressable to the iGPU, and the amount of RAM allocated to it does not affect performance (other than scenarios where you run out of it, of course). At least that's how this worked with the old 2400G, and the 5700G should not be that much different when it comes to the iGPU.


Yup, I think so too, still something worth verifying


----------



## dir_d (Aug 3, 2021)

Great review, seems like a solid product just need to get rid of the Vega iGPU and move to RDNA.


----------



## BSim500 (Aug 3, 2021)

> "The problem is that for 1080p gaming, the integrated graphics are simply not powerful enough, not even at the lowest possible setting. For pure gaming, you'll be better off with a several year old graphics card (that supports DirectX 12), paired with a value-champ CPU like Core i5-11400F, Ryzen 3 3300X, or 10400F, in that order. These CPUs go for around $170, which frees up $200 for a graphics card."


^ Yeah, this has always been the problem with 'premium'-priced APUs. $360 is almost 3-4x the price that the 3200G / 3400G were, and makes little sense for budget gamers vs. buying a cheaper CPU + GPU, unless they really want to pay through the nose for a niche slim ITX build in a case like the InWin Chopin. I bought an i5-10400F for £124 and a GTX 1660 for £159 (total £283) with 4-5x the performance, and certainly wouldn't spend anywhere near the same money on "between GT 1030 and GTX 1050" class performance (which is what this APU has) vs. simply buying a cheap 1050 Ti / 1060 on eBay.

MOAR CORES doesn't do a thing for low-end gaming with such strong GPU bottlenecks; you just sit there with impressively low CPU usage to match the impressively low frame rates (20-50 fps at 1080p in most games here). And budget gamers tend not to have 3800-speed RAM lying around, so either more money on top for a possible RAM upgrade (or lower performance from typical budget 2666-3200 modules) needs to be factored in too.

What would have been interesting is if AMD had released a cheap 5300G for the same price as the 3200G (£79 at one point) during the worst of the GPU shortages, but they're refusing to sell that even now (outside of OEM). So even after 2-3 years there's still no real "upgrade" to the 3200G / 3400G at anywhere near the same price point. People who can't afford £80 CPUs + £150 GPUs tend not to buy £300+ APUs with half the performance...



londiste said:


> Quite sure more is addressable to the iGPU, and the amount of RAM allocated to it does not affect performance (other than scenarios where you run out of it, of course). At least that's how this worked with the old 2400G, and the 5700G should not be that much different when it comes to the iGPU.


Yes, that's exactly how it works. The iGPU "memory size" is just the "window" the game sees. If a game needs 2GB of VRAM but you have it set to 512MB, it will use the remaining 1.5GB from regular RAM instead of "VRAM" (which for iGPUs is the same thing anyway). I.e., if a game uses say 2GB VRAM and 3GB system RAM, and you lower the APU "VRAM" size from 2GB to 512MB in the BIOS, it will appear to use only 512MB of VRAM in MSI Afterburner etc., but the 3GB of system RAM will increase to 4.5GB as more system RAM gets used as an "overflow".
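That accounting can be written down as a toy model (the function and its numbers are purely illustrative, not measured from any tool):

```python
def igpu_memory_split(bios_carve_gb: float, vram_needed_gb: float,
                      sys_ram_needed_gb: float) -> tuple[float, float]:
    """Toy model of shared-memory accounting on an APU.

    The BIOS 'VRAM size' is only the window reported as dedicated VRAM;
    anything the game needs beyond it spills into regular system RAM,
    which on an iGPU is physically the same memory either way.
    Returns (reported_vram_gb, reported_system_ram_gb).
    """
    reported_vram = min(bios_carve_gb, vram_needed_gb)
    spill = max(0.0, vram_needed_gb - bios_carve_gb)
    reported_sys_ram = sys_ram_needed_gb + spill
    return reported_vram, reported_sys_ram

# The example above: game wants 2 GB VRAM + 3 GB RAM, carve-out set to 512 MB
print(igpu_memory_split(0.5, 2.0, 3.0))  # -> (0.5, 4.5)
```

Either way the game touches the same total amount of DDR4; only the reported split between "VRAM" and "RAM" moves.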


----------



## AusWolf (Aug 3, 2021)

_"As long as you have any half-decent CPU cooler, then you'll have no problems keeping the 5700G cool. This makes AMD's new APUs an excellent choice for compact small form factor ITX systems."_

I have a feeling that I'll have to test that claim sometime in the future.  So far, my experience with Ryzen (except the 3100) in SFF builds with limited airflow has been pretty poor - hence my build log, link in my signature. Even the 3600 I tried once nearly overheated in my Aerocool CS-101 case with the factory cooler. Slapping a be quiet! Shadow Rock LP helped a little, but not much. Ryzen CPUs are great *if* you have that half-decent cooler *and* a case with good airflow. I'm really curious about the APUs in similar situations. My theory is that the monolithic die should help with heat distribution a little, but as I said: I'll probably have to test it.  (just don't know when, as I don't really have spare 400 bucks at the moment)


----------



## neblogai (Aug 3, 2021)

Talking about energy efficiency: if the integrated GPU is all you are going to use- consider getting 5600G/5700G with an ASRock Deskmini X300. Those systems run the APU as a SoC, without a chipset- thus have stupidly low power consumption: just over 10W idle, compared to ~50W as tested here.


----------



## W1zzard (Aug 3, 2021)

Benchmark results at 2 GB for IGP have been added, no change


----------



## mechtech (Aug 3, 2021)

Salvo39 said:


> Was there only 512MB addressable to the iGPU? Do you think iGPU performance will change if more memory is allocated to it?


Yes.  I have a 3400G on a B550 motherboard and I have it set to 8GB.  (32GB system ram)


----------



## Salvo39 (Aug 3, 2021)

W1zzard said:


> Benchmark results at 2 GB for IGP have been added, no change


You are awesome. Thank you for re-running the tests.


----------



## mechtech (Aug 3, 2021)

Nice review W1zz.  The comparison with the GT1030 was good.  I think the only thing that would have been better would be to compare igp to 3400G.


----------



## Chrispy_ (Aug 3, 2021)

The point of the 5700G is the IGP, but Vega 8 is so dated and shit that it's pointless.

I bought a budget 2500U with dual-channel DDR4-2400 for Christmas in 2017, and almost 4 years later, all AMD can be bothered to put in their top-end APU is the same shit clocked about 60% higher. The only difference is that faster DDR4 is now available and a desktop socket gets more power budget to play with.

RDNA APUs already exist in AMD's lineup, but us DIY consumers get the left-overs and scrapings!


----------



## defaultluser (Aug 3, 2021)

Having nearly GTX 1050 performance in an IGP is an impressive achievement.  If my GTX 960-powered HTPC suddenly died on me, I would just replace it with the 5600G.

The cut cache does affect some games, but you still get a noticeable memory controller latency improvement over Zen 2 (i.e. all games are faster than on the 3800XT).


----------



## AusWolf (Aug 3, 2021)

defaultluser said:


> Having nearly GTX 1050 performance in an IGP is an impressive achievement.  If my GTX 960-powered HTPC suddenly died on me, I would just replace it with the 5600G.
> 
> The cut cache does affect some games, but you still get a noticeable memory controller latency improvement over Zen 2 (i.e. all games are faster than on the 3800XT).


Agreed. Shame that the 5300G isn't coming to DIY. That would be the ideal HTPC part. Both the 5600G and 5700G are overpowered and too expensive for such use.


----------



## defaultluser (Aug 3, 2021)

neblogai said:


> Talking about energy efficiency: if the integrated GPU is all you are going to use- consider getting 5600G/5700G with an ASRock Deskmini X300. Those systems run the APU as a SoC, without a chipset- thus have stupidly low power consumption: just over 10W idle, compared to ~50W as tested here.




But that can't run Zen 3 (so you're stuck going fishing for a not-really-existing Zen 2).  Asrock cant be bothered to update the chipset to A520



AusWolf said:


> Agreed. Shame that the 5300G isn't coming to DIY. That would be the ideal HTPC part. Both the 5600G and 5700G are overpowered and too expensive for such use.




I disagree. By the time you cut Vega down to 6 cores, the performance is closer to a 1030. And if you run any kind of emulation, four cores is the bare minimum for doing anything more modern than PS2/GameCube.


----------



## RedelZaVedno (Aug 3, 2021)

$360 for the 5700G is just too much if you're building a gaming PC. You can opt for a GTX 1650 + 10400F or a second-hand RX 570 4GB + 10700F combo at around the same budget and have a much better gaming experience.

AMD needs to offer Zen3 + 12 or 16 CU RDNA1/2 APU to make me interested.


----------



## neblogai (Aug 3, 2021)

defaultluser said:


> But that can't run Zen 3 (so you're stuck going fishing for a not-really-existing Zen 2).  ASRock can't be bothered to update the chipset to A520



You are probably confusing this with the A300? The newer X300 has the 5700G/5600G on its official supported CPU list. 
Also, A520 is an actual chipset (a physical chip), which these systems do not have; they run everything off the APU. And the ~10W power use mentioned would no longer be there with A520, B550, etc. chipsets.


----------



## Shatun_Bear (Aug 3, 2021)

With that superb power efficiency, this is the ITX build king.



RedelZaVedno said:


> $360 for the 5700G is just too much if you're building a gaming PC. You can opt for a GTX 1650 + 10400F or a second-hand RX 570 4GB + 10700F combo at around the same budget and have a much better gaming experience.
> 
> AMD needs to offer Zen3 + 12 or 16 CU RDNA1/2 APU to make me interested.



I don't think people buy these only for serious gaming. For a multimedia or productivity machine, yes, or esports maybe. In which case a huge 1650 in your ITX case is unnecessary.


----------



## RedelZaVedno (Aug 3, 2021)

Shatun_Bear said:


> I don't think people buy these only for serious gaming. For a multimedia or productivity machine, yes, or esports maybe. In which case a huge 1650 in your ITX case is unnecessary.


I agree. This APU is great for something like an HTPC. But I hoped AMD would offer us 12 and 16 CU APUs by now. Let's hope we get them with a Zen 4 + RDNA2/3 + DDR5 APU combo. Such APUs should rival 1060/580-level performance and make APUs a truly viable option for 1080p/60Hz gamers without many compromises.


----------



## Zubasa (Aug 3, 2021)

RedelZaVedno said:


> I agree. This APU is great for something like an HTPC. But I hoped AMD would offer us 12 and 16 CU APUs by now. Let's hope we get them with a Zen 4 + RDNA2/3 + DDR5 APU combo. Such APUs should rival 1060/580-level performance and make APUs a truly viable option for 1080p/60Hz gamers without many compromises.


It is not as simple as putting more CUs in the iGPU. Desktop DDR4 is just not fast enough to feed them.
This is why AMD went from 11 CUs down to 8, as overclockers have shown that the extra 3 CUs offer almost no extra performance.
RDNA2 requires a big cache, and the APU just doesn't have enough free die space for it; they even cut down the L3 cache compared to the Zen 3 desktop CPUs.
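For context on why bandwidth is the limit, here's a quick back-of-the-envelope comparison (the data rates and bus widths are standard published specs; the helper function is just illustrative):

```python
def memory_bandwidth_gbs(mt_per_s: int, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s: transfers/s x bytes per transfer."""
    return mt_per_s * (bus_width_bits / 8) / 1000

# Dual-channel DDR4-3200 (2 x 64-bit), shared between CPU cores and iGPU:
print(memory_bandwidth_gbs(3200, 128))  # 51.2 GB/s
# GDDR5 on a GTX 1050 (7000 MT/s effective, 128-bit), dedicated to the GPU:
print(memory_bandwidth_gbs(7000, 128))  # 112.0 GB/s
```

So even before the CPU cores take their share, the APU has less than half the bandwidth a low-end discrete card gets to itself, which is why extra CUs run out of things to chew on.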


----------



## HD64G (Aug 3, 2021)

Nice review as usual @W1zzard ! And the product isn't bad at all when sold at a slight discount vs. its MSRP. For anyone not using a top-notch GPU (>$500 MSRP), it loses on average close to 5% of gaming performance vs. the best CPU while using much less power and being much easier to cool.


----------



## Zubasa (Aug 3, 2021)

HD64G said:


> Nice review as usual @W1zzard ! And the product isn't bad at all when sold at a slight discount vs. its MSRP. For anyone not using a top-notch GPU (>$500 MSRP), it loses on average close to 5% of gaming performance vs. the best CPU while using much less power and being much easier to cool.


Given that the 5800X and Intel's 10th and 11th gen 8-cores can now be found on discount regularly, these might be on discount soon enough.


----------



## AusWolf (Aug 3, 2021)

defaultluser said:


> I disagree. By the time you cut Vega down to 6 cores, the performance is closer to a 1030, and also if you run any kind of emulation, 4-cores is the bare-minimum for doing anything more modern than ps2/Gamecube.


Read my post again, please.  I was talking about HTPC, not emulation and gaming.


----------



## minami (Aug 3, 2021)

Excellent. 
We can use Fluid Motion, which is very power efficient but still gives you extremely smooth video.
Bluesky Frame Rate Converter and this APU will keep me entertained for a long, long time!


----------



## AusWolf (Aug 3, 2021)

HD64G said:


> Nice review as usual @W1zzard ! And the product isn't bad at all when sold at a slight discount vs. its MSRP. For anyone not using a top-notch GPU (>$500 MSRP), it loses on average close to 5% of gaming performance vs. the best CPU while using much less power and being much easier to cool.


Unpopular opinion, I know, but I've actually found Intel's 11th gen easier to cool than Zen 2 (I'm not sure how Zen 3 compares, as I've only tried the 5950X with a 240 mm AIO). The 3600 I tried once nearly overheated in the small office case (in my signature) even with a be quiet! Shadow Rock LP cooler while maxing out its 88 W power target. I had to manually lock its PPT to 65 W to be anywhere near usable. The 11700 as a comparison could easily run up to around 100 W PL1 in the same setup. Not to mention, Intel's 14 nm inefficiency only shows when you're using the whole CPU. I've found power consumption during gaming to be quite modest (60-80 W maximum during Cyberpunk 2077 with around 50% usage and 4.1-4.3 GHz).

I don't disagree that Zen 3 is awesome, but 10 and 11th gen Intel isn't as bad as people tend to believe.


----------



## Zubasa (Aug 3, 2021)

AusWolf said:


> Unpopular opinion, I know, but I've actually found Intel's 11th gen easier to cool than Zen 2 (I'm not sure how Zen 3 compares, as I've only tried the 5950X with a 240 mm AIO). The 3600 I tried once nearly overheated in the small office case (in my signature) even with a be quiet! Shadow Rock LP cooler while maxing out its 88 W power target. I had to manually lock its PPT to 65 W to be anywhere near usable. The 11700 as a comparison could easily run up to around 100 W PL1 in the same setup. Not to mention, Intel's 14 nm inefficiency only shows when you're using the whole CPU. I've found power consumption during gaming to be quite modest (60-80 W maximum during Cyberpunk 2077 with around 50% usage and 4.1-4.3 GHz).
> 
> I don't disagree that Zen 3 is awesome, but 10 and 11th gen Intel isn't as bad as people tend to believe.


GN recently reviewed the OEM-only 5800 non-X, one of the most efficient CPUs right now.
The problem with a lot of desktop boards is that they want to shove as much power into the CPU as possible so that their board benches higher.
Some boards back at the Zen 2 launch even under-reported the actual power draw of the CPU to trick it into drawing more power.


----------



## blu3dragon (Aug 3, 2021)

Good review and interesting to see the results.

The power consumption numbers are a little odd: 150W vs. 126W for the 5600X in Cinebench, but both are 65W parts. Yet 107W vs. 134W for the 5600X in Prime95.

Maybe the 5600X is not using its full 65W in Cinebench, but I'd expect both to be at their respective power limits in P95?


----------



## AnarchoPrimitiv (Aug 3, 2021)

I'm a bit confused; on the iGPU performance page, it says the following:

"That's also the reason why we include an *additional *data point, DDR4-3200, to get a feel for how dropping memory speed from DDR4-3800 to the more affordable DDR4-3200 impacts the FPS rates. "

Maybe I'm wrong, but wouldn't the word "additional" mean that the benchmarks at 3200 MHz RAM speed are IN ADDITION to the 3800 MHz ones?  So if that's the case, where are the 3800 MHz iGPU benchmarks?  I only see 3200 MHz.  I really wanted to see if going up to 3600 MHz+ would result in better performance; anyone know of another review that does this?


----------



## AusWolf (Aug 3, 2021)

Zubasa said:


> GN recently reviewed the OEM-only 5800 non-X, the most efficient CPU right now.
> The problem with a lot of desktop boards is that they want to shove as much power into the CPU as possible so that their board benches higher.
> Some boards back at the Zen 2 launch even under-reported the actual power draw of the CPU to trick it into drawing more power.


I know, and I accounted for deviations; my Asus TUF B550M-Plus WiFi (and B560M-Plus WiFi) is pretty great with power targets and reporting accuracy.

The problem is that AMD sets an 88 W power target on their 65 W TDP CPUs, which is too much for the small chiplets when your airflow is restricted. Intel's larger monolithic dies spread the heat more evenly, resulting in a CPU that's much easier to cool. Of course, reviews only look at CPUs on an open test bench with proper cooling, but the data generated this way isn't representative of SFF cases. Just because a tower cooler can handle a CPU doesn't automatically mean that said CPU is good enough for an SFF build with lesser cooling. I learned this the hard way.

Another problem is that AMD's most efficient CPUs are not available for DIY for some reason.



blu3dragon said:


> Good review and interesting to see the results.
> 
> The power consumption numbers are a little odd: 150W vs. 126W for the 5600X in Cinebench, but both are 65W parts. Yet 107W vs. 134W for the 5600X in Prime95.
> 
> ...


AMD's TDP, unlike Intel's, doesn't reflect power consumption. Most 65 W TDP AMD parts are actually set up with an 88 W default power limit (PPT).
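For reference, AMD's stock socket power limit follows a fixed ratio (PPT = 1.35 × TDP), which is where the 88 W figure comes from. A quick sanity check (the helper function is just illustrative):

```python
def amd_default_ppt(tdp_watts: float) -> int:
    """Default socket power limit (PPT) for stock AMD Ryzen desktop parts.

    AMD's published rule of thumb is PPT = 1.35 x TDP, so a '65 W' part
    may draw up to 88 W at the socket before it stops boosting.
    """
    return round(tdp_watts * 1.35)

print(amd_default_ppt(65))   # 88
print(amd_default_ppt(105))  # 142
```

The same ratio gives the familiar 142 W PPT on 105 W TDP parts like the 5900X/5950X.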


----------



## newtekie1 (Aug 3, 2021)

AnarchoPrimitiv said:


> So if that's the case, where are the 3800 MHz iGPU benchmarks? I only see 3200 MHz.


The green results are the 3800 MHz benchmarks.


----------



## Zubasa (Aug 3, 2021)

AusWolf said:


> AMD's TDP unlike Intel's, doesn't reflect power consumption. Most 65 W TDP AMD parts are actually set up with an 88 W default power limit (PPT).


PPT is similar to Intel's PL2 in that there is a boost duration on OEM systems.
DIY motherboards tend to just ignore it and let the CPU boost perpetually, which is an issue a lot of SFF builds run into.
This behavior was introduced with Zen 2, after Intel allowed motherboard makers to run MCE.
Zen 1 CPUs at stock strictly obeyed their power rating.


----------



## blu3dragon (Aug 3, 2021)

AusWolf said:


> AMD's TDP unlike Intel's, doesn't reflect power consumption. Most 65 W TDP AMD parts are actually set up with an 88 W default power limit (PPT).


My point was that both have the same power limit.  So why would the 5700G use more power in cinebench, but then less power than the 5600x in p95 (and also less power than it uses in cinebench as well).


----------



## AusWolf (Aug 3, 2021)

blu3dragon said:


> My point was that both have the same power limit.  So why would the 5700G use more power in cinebench, but then less power than the 5600x in p95 (and also less power than it uses in cinebench as well).


Sorry, my bad. This is actually an interesting observation.



Zubasa said:


> PPT is similar to Intel's PL2 in that there is a boost duration on OEM systems.
> DIY motherboards tend to just ignore it and let the CPU boost perpetually, which is an issue a lot of SFF builds run into.
> This behavior was introduced with Zen 2, after Intel allowed motherboard makers to run MCE.
> Zen 1 CPUs at stock strictly obeyed their power rating.


I'm not sure about that. AMD doesn't even state what the power target is, as their TDP doesn't have power in the formula. It would be nice to test an OEM system for CPU power consumption, and see how it does compared to every 65 W DIY system's 88 W PPT.


----------



## TheoneandonlyMrK (Aug 3, 2021)

BSim500 said:


> ^ Yeah this has always been the problem with 'premium' priced APU's. $360 is almost 3-4x the price that the 3200G / 3400G were and makes little sense for budget gamers vs buying a cheaper CPU + GPU unless they really want to pay through the nose for a niche slim ITX build in a case like the Inwin Chopin. I bought an i5-10400F for £124 and GTX1660 for £159 (total £283) with 4-5x the performance and certainly wouldn't spend anywhere near the same money on "between GT1030 and GTX1050" class performance (which is what this APU has) vs simply buying a cheap 1050Ti / 1060 on Ebay. MOAR CORES doesn't do a thing for low-end gaming with such strong GPU bottlenecks, you just sit there with impressively low CPU usage to match the impressively low frame-rates (20-50fps at 1080p in most games here). And budget gamers tend to not have 3800 speed RAM lying around so either more money on top for a possible RAM upgrade (or lower performance for typical budget 2666-3200 modules) needs to be factored too. What would have been interesting is if AMD had released a cheap 5300G for the same price as the 3200G (£79 at one point) during the worst of the GPU shortages, but they're refusing to sell that even now (outside of OEM), so even after 2-3 years there's still no real "upgrade" to the 3200G / 3400G at anywhere near the same price point. People who can't afford £80 CPU's + £150 GPU's tend to not buy +£300 APU's with half the performance...
> 
> 
> Yes that's exactly how it works. The iGPU "memory size" is just the "window" the game sees. If a game needs 2GB VRAM but you have it set to 512MB VRAM, then it will use +1.5GB more from regular RAM instead of VRAM (which for iGPU's is the same thing). ie, if a game uses say 2GB VRAM and 3GB system RAM, and you lower APU "VRAM" size from 2GB to 512MB in the BIOS, it will appear to use only 512MB VRAM in MSI Afterburner, etc, but the 3GB system RAM will increase to 4.5GB RAM as more system RAM gets used as an "overflow".


You bought a 1660 for £159? When and where, this year?!
While I agree with you on the premise that it's too dear, in this market it probably isn't.


----------



## W1zzard (Aug 3, 2021)

AnarchoPrimitiv said:


> Maybe I'm wrong, but wouldn't the word "additional" mean that the benchmarks at 3200 MHz RAM speed are IN ADDITION to the 3800 MHz ones? So if that's the case, where are the 3800 MHz iGPU benchmarks? I only see 3200 MHz. I really wanted to see if going up to 3600 MHz+ would result in better performance; anyone know of another review that does this?


3800 MHz = green
3200 MHz = brown

reworded the text slightly to help with that


----------



## Alpha_Lyrae (Aug 3, 2021)

Vega is definitely dated, but given the constraints of DDR4, integrated RDNA2+ is best paired with DDR5/LPDDR5. I look forward to AM5 APUs and FP7+ laptops with integrated RDNA2.

You still managed to eke out a 7% increase at 2.4GHz gfx clock, though that also improves whole architecture performance (caches, raster, pixel, geometry engines) before going out to DDR4. Pixel engine caching in L2 does work in Vega, but pales in comparison to RDNA2's cache subsystem and overall improvements.

iGPU is also heavily impacted by single-core performance at 720-1080p in gaming. I'd be interested in seeing 4.9GHz+ single core (like a 5900X), but I don't know if it's doable.



blu3dragon said:


> My point was that both have the same power limit.  So why would the 5700G use more power in cinebench, but then less power than the 5600x in p95 (and also less power than it uses in cinebench as well).



If Prime95 is hammering memory, remember the 5600X is an MCM and its IOD is one hop away. The extra power consumption on the 5600X can be explained by the IF PHYs being active between the CCD and IOD, whereas the 5700G is a monolithic design.


----------



## Dredi (Aug 3, 2021)

W1zzard said:


> somebodys_kid said:
> 
> 
> > Have I read this correctly? All 20 PCI Express lanes from the CPU are available (albeit in GEN 3), unlike the 3400G which only had 12 available?  And does that mean that 4x4x4x4 bifurcation is available for the primary PCI Express x16 slot?
> ...


Care to test this? Just plop a 4x4 nvme adapter card on it and check if all drives are available.


----------



## W1zzard (Aug 3, 2021)

Dredi said:


> Care to test this? Just plop a 4x4 nvme adapter card on it and check if all drives are available.


Doesn't that require a special motherboard too?


----------



## persondb (Aug 3, 2021)

This is great performance for an APU, but it's really too expensive for what the iGPU offers.
For cheap builds, the high cost just makes no sense, as you can buy a cheaper CPU and put the savings toward a cheap GPU that's still much faster than the iGPU. And for stuff like office PCs, this is plainly overkill anyway.


----------



## InVasMani (Aug 3, 2021)

Something you could do on this APU paired with discrete graphics is upscale the desktop output from the active signal resolution with AMD's Virtual Super Resolution, giving you a bit of an SSAA- or DSR-style upscale rendered on the iGPU. The iGPU hardware would otherwise sit untapped, so it's a good way to harness it and make active use of it. It would be a bit like FSR, but with no overhead on the discrete GPU, because the iGPU would take the cloned desktop output from the discrete card and do the upscaling itself.

Basically a more powerful mClassic, since I'm sure the APU has much more upscaling horsepower than that solution. It's not trying to render the scene, but rather taking individual frames and upscaling them in real time, so you could perform AA on the iGPU and save the discrete card from having to do it.


----------



## Tom Yum (Aug 3, 2021)

I have a 4650G in an ASRock DeskMini X300, so the 5700G isn't enough of an upgrade for me, but these are great APUs. I think people who consider them as a cheap gaming option are missing the point a little; that is what the 5300G is for. The 5700G is for small form factors that can't physically fit a dGPU but need a lot of processing punch. I love that my X300 is the size of my hand. 

And if you consider new parts rather than second-hand (which is an unfair comparison; some people and businesses care about warranty), $360 for a pseudo-5700X plus 1050 equivalent is a bargain. Even a 1030 still goes for $120-130 these days, which is ridiculous. That means you are getting a 5700X equivalent for ~$240, and you can use form factors you can't use for an 11400F plus GT 1030 build.


----------



## Mussels (Aug 3, 2021)

Price? meh
product? oh hell yeah

The IGP really needs an upgrade (coming from the guy with a 3090...), but there's a LOT of people who need a good CPU with an average IGP, and this fits the bill really well for them


----------



## mechtech (Aug 4, 2021)

Chrispy_ said:


> The point of the 5700G is the IGP, but Vega 8 is so dated and shit that it's pointless.
> 
> I bought a budget 2500U with dual-channel DDR4-2400 for Christmas in 2017, and almost 4 years later, all AMD can be bothered to put in their top-end APU is the same shit clocked about 60% higher. The only difference is that faster DDR4 is now available and a desktop socket gets more power budget to play with.
> 
> RDNA APUs already exist in AMD's lineup, but us DIY consumers get the left-overs and scrapings!


While I agree, it still doubles Intel's. Also, it's a pretty small iGPU for the tier of chip. For the price and 8-core config, I think most users will be using a dedicated card with this anyway.

So yeah, it would have been nicer/better with RDNA/RDNA2, but I don't think it matters too much in regards to the market segment, i.e. compared to Intel.

Now, for something that would only ever be an iGPU/APU system, I would like something like a 3300X CPU with an RDNA2 iGPU with 20 CUs (1280 shaders) for $200-ish. I think that would be nice, especially for HTPCs, barebones gaming machines, etc.


----------



## Crackong (Aug 4, 2021)

I've been running the 4750G in an A300 for a year now to host some VMs.
These really are the best CPUs right now for size-constrained tiny PCs.


----------



## Zubasa (Aug 4, 2021)

W1zzard said:


> Doesn't that require a special motherboard too?


Not really; even some X370 boards support bifurcation.
I know the ASRock X370 board I had has that option, and so does my current X399.


----------



## ixi (Aug 4, 2021)

Thanks for the review! Just wondering, does it really get better fps with the 2400 MHz kit than with 3200 MHz and 3800?

I'm surprised that adding 1.5GB to the iGPU doesn't help. I guess RAM, or Vega, is the bottleneck.


----------



## Dredi (Aug 4, 2021)

W1zzard said:


> Doesn't that require a special motherboard too?


No. If you still have some of the Threadripper boards on hand, some came with such an adapter card that houses 4 NVMe drives at x4 speed each, with no extra chips in play. Multi-drive support on these cards is done by the processor bifurcating the x16 slot twice. If they work, a 4x4-slot PCIe motherboard would be possible to manufacture (it probably doesn't exist), but you can also just buy a mini-ITX board and a slot-splitting riser. 

Something like this, for example: https://www.asus.com/Motherboards-Components/Motherboards/Accessories/HYPER-M-2-X16-CARD-V2/

Gigabyte has bundled such cards with their Threadripper boards as well.
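One way to sanity-check a test like this from Linux is to count how many NVMe controllers enumerate on the PCI bus. A minimal sketch that parses `lspci`-style text (the sample output below is made up for illustration; on a real system you'd feed it the actual `lspci` output):

```python
def count_nvme_controllers(lspci_output: str) -> int:
    """Count NVMe controllers in `lspci` text output.

    If x4/x4/x4/x4 bifurcation works, all four drives on a
    Hyper M.2-style carrier card should show up here.
    """
    return sum("Non-Volatile memory controller" in line
               for line in lspci_output.splitlines())

# Hypothetical output from a system where all four drives enumerated:
sample = """\
01:00.0 Non-Volatile memory controller: Samsung Electronics Co Ltd NVMe SSD
02:00.0 Non-Volatile memory controller: Samsung Electronics Co Ltd NVMe SSD
03:00.0 Non-Volatile memory controller: Samsung Electronics Co Ltd NVMe SSD
04:00.0 Non-Volatile memory controller: Samsung Electronics Co Ltd NVMe SSD
05:00.0 VGA compatible controller: Advanced Micro Devices, Inc.
"""
print(count_nvme_controllers(sample))  # 4
```

If bifurcation isn't supported, only the drive in the first x4 group typically enumerates, so the count would come back as 1.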


----------



## W1zzard (Aug 4, 2021)

ixi said:


> does it really have better fps with 2400MHz kit than with 3200MHz and 3800?


It has the best FPS with 3800, 6% lower with 3200, even lower with 2400 (not sure exactly how much)



ixi said:


> I'm surprised that adding 1.5GB to iGPu doesnt help


The way this works is that you can either allocate one fixed chunk of memory entirely to the IGP, which then becomes inaccessible to the rest of the system, or use a small initial chunk and let the driver dynamically reserve more memory as required


----------



## neblogai (Aug 4, 2021)

ixi said:


> Thanks for review! Just wondering, does it really have better fps with 2400MHz kit than with 3200MHz and 3800?



It looks a bit confusing in the charts, but that is with DDR4-3800 and the iGPU OC'd to 2400 MHz, not DDR4 at 2400.


----------



## W1zzard (Aug 4, 2021)

neblogai said:


> It looks a bit confusing in the charts- but that is with DDR4 3800, and iGPU OC at 2400MHz, not DDR4 at 2400.


Ooooooh, now I understand, any suggestions how to improve the chart bar labels?


----------



## LTUGamer (Aug 4, 2021)

What is the point of putting the "high"-performance IGP (core config: 512:32:8, 2000 MHz) in a Ryzen 7 and the low-performance IGP (core config: 384:24:8, 1700 MHz) in a Ryzen 3? The weaker graphics would suit Ryzen 7 owners who don't care about iGPU performance at all, while the stronger config (512:32:8, 2000 MHz) paired with a Ryzen 3 could be useful for budget gamers, or for those who plan to upgrade to discrete graphics later


----------



## Bomby569 (Aug 4, 2021)

Sure, it may be helpful to a lot of people, especially now. But I just want them to put half an RDNA core in all CPUs, to be able to troubleshoot my PC


----------



## Mussels (Aug 4, 2021)

Bomby569 said:


> Sure, it may be helpful to a lot of people, especially now. But I just want them to put half an RDNA core in all CPUs, to be able to troubleshoot my PC


I think they plan to for AM5, and this might well be their practice run for how to implement it and get feedback

IMO the regular series could do with a really basic, super low TDP GPU for say... 4K 2D output and video playback at 60Hz.
Then have the G series with lower overall TDP and a better IGP, for the ITX/AIO niche


----------



## Bomby569 (Aug 4, 2021)

Zubasa said:


> GN recently reviewed the OEM only 5800 non-X, one of the most efficient CPU right now.
> The problem with a lot of desktop boards is they want to shove as much power into the CPU as possible so that their board bench higher.
> Some boards back at Zen2 launch even under reported the actual power draw of the CPU and try to trick the CPU in to drawing more power.


If you are that concerned about power draw (the difference is negligible, especially if you're wasting power on RGB), you really shouldn't spend this much money on a CPU anyway; there's lots of power in old Xeons for dirt cheap.


----------



## THU31 (Aug 4, 2021)

This is a weird APU. On one hand, the CPU is almost as fast as the 5800X, but it costs less and uses less power.

On the other hand, the GPU is good pretty much only for desktop use or video playback.

Maybe DDR5 and the 3D cache will allow them to put more CUs in. 4C/8T with 16 CUs would be really nice with enough memory bandwidth.


----------



## Dredi (Aug 4, 2021)

THU31 said:


> This is a weird APU. On one hand, the CPU is almost as fast as the 5800X, but it costs less and uses less power.
> 
> On the other hand, the GPU is good pretty much only for desktop use or video playback.
> 
> Maybe DDR5 and the 3D cache will allow them to put more CUs in. 4C/8T with 16 CUs would be really nice with enough memory bandwidth.


Van Gogh, presumably found in the Steam Deck, is almost exactly what you wrote. We will probably see some devices featuring it early next year.


----------



## Chrispy_ (Aug 4, 2021)

mechtech said:


> While I agree, it still doubles Intel's.  Also it's a pretty small IGP for the tier of chip.  For the price and 8-core config, I think most users will be using a dedicated card with this anyway.
> 
> So yeah, it would have been nicer/better with RDNA/RDNA2, but I don't think it matters too much in regards to the market segment............i.e. compared to Intel.
> 
> Now..................however, for something that would be only an IGP/APU system, I would like something like a 3300X CPU with an RDNA2 IGP with 20 CUs (1280 shaders) for $200-ish.  I think that would be nice, especially for HTPC, barebones gaming machines, etc.


IMO yes, the CPU/IGP balance is always wrong on APUs. They have far more CPU power than a casual user needs, and far too little silicon real estate for the IGP.
Even at double Intel's IGP performance, a lot of the 5700G's gaming results are sub-30 fps at the lowest settings. That's bad enough that you may as well just not bother.

A sensible eSports APU would, for example, be a modest 4C/8T solution with perhaps 16 CUs (1024 unified shaders) of RDNA/Navi architecture. That would also make for an excellent 25W general-purpose mobile part for thin-and-light gaming laptops. 1080p60 medium in AAA titles? Yes please!


----------



## Rebe1 (Aug 4, 2021)

WTF just happened to the CS:GO test??? It is nearly impossible to achieve FPS that high at 1080p with low settings on an iGPU (Vega 8), even with RAM at 3800 MHz 1:1 and after any OC of the integrated graphics...

@W1zzard could you please describe how you tested this game? Hardware Unboxed and PurePC have also recently tested the R5 5600G and they have much, MUCH lower results in CS:GO...

200+ FPS at 1080p on Vega 8 would be an outstanding result, basically good enough for competitive MM!


----------



## newtekie1 (Aug 4, 2021)

Rebe1 said:


> @W1zzard could you please describe how did you test this game? Hardware Unboxed and PurePC has also recently tested R5 5600G and they have much, MUCH lower results in csgo...


Hardware Unboxed used medium settings, PurePC used high settings, it makes a pretty big difference.


----------



## Shatun_Bear (Aug 4, 2021)

persondb said:


> ...and use the money saved on a *cheap GPU that's still much faster* than the iGPU.



Anything over GT 1030-level performance has been seriously overpriced for the last several months, so I don't think there is any good value in the dGPU market right now.

GTX 1650s are selling for £200 used here, which is outrageous. That level of performance would be a worthwhile step up from this. If you try to go really old, like the RX 580, the 8 GB models go for around the same.


----------



## Rebe1 (Aug 4, 2021)

newtekie1 said:


> Hardware Unboxed used medium settings, PurePC used high settings, it makes a pretty big difference.


But not 240 vs 70 avg FPS... I tested a few months back on my old GTX 960, and at 1080p, high vs low was a ~30% FPS difference, not 200%


----------



## Dredi (Aug 4, 2021)

Rebe1 said:


> But not 240 vs 70 avg FPS... I tested a few months back on my old GTX 960, and at 1080p, high vs low was a ~30% FPS difference, not 200%


HU uses, to my knowledge, some recorded pro games as input. Many sites use a synthetic CS:GO demo sequence. I have no idea what TPU or the other mentioned sites use.


----------



## mechtech (Aug 5, 2021)

Chrispy_ said:


> IMO yes, the CPU/IGP balance is always wrong on APUs. They have far more CPU power than a casual user needs, and far too little silicon real estate for the IGP.
> Even at double Intel's IGP performance, a lot of the 5700G's gaming results are sub-30 fps at the lowest settings. That's bad enough that you may as well just not bother.
> 
> A sensible eSports APU would, for example, be a modest 4C/8T solution with perhaps 16 CUs (1024 unified shaders) of RDNA/Navi architecture. That would also make for an excellent 25W general-purpose mobile part for thin-and-light gaming laptops. 1080p60 medium in AAA titles? Yes please!


Agreed.  Especially since all the 8-20 CU cards don't exist anymore!!   They should have something to replace them with; if they won't make dedicated cards in that size, the IGP should fill the gap.

I would be willing to bet that if they put out a 14-16 CU RDNA2 card with 4 GB of GDDR5 on a 128-bit bus on GF 12nm, with no ray tracing or anything, at a price of $130 US, they would probably sell tons of them.

It could replace all the RX 460/560 cards and probably the 470/570s as well.


----------



## AThomas (Aug 5, 2021)

Well, I have read most of the reviews. The 3000 series was viable before the pandemic; now it's not, and it's overpriced where you can find one. The 4000-series APUs are basically non-existent.

This fits the bill for exactly what I want. I am not a gamer on PCs, never have been. That said, I have wanted to play Space Engineers and none of my machines will run it. One machine is still running 32-bit Windows 7 (the HTPC), so I can't increase its RAM without reinstalling 64-bit Windows 7 or 8.1, something I didn't want to do as it wouldn't perform well on that board anyway.

The rest of my PCs are Intel, and too thin for a discrete GPU.

In any event, I can finally build some powerful machines, take advantage of multi-threaded performance, and play some games, mainly emulation. And rebuild the media server to do much more than it's ever done, without having to split duties off to another machine.

The problem now is that these, when paired with fast memory, don't cost any less than pre-builts (once you include all the hardware needed for a working PC), which in turn don't let you take advantage of overclocking.


----------



## W1zzard (Aug 5, 2021)

Rebe1 said:


> But not 240 vs 70 avg FPS... I tested a few months back on my old GTX 960, and at 1080p, high vs low was a ~30% FPS difference, not 200%












just standing on dust2, around 200 fps


----------



## pexxie (Aug 5, 2021)

@W1zzard Any clue what the max pixel clock or max digital resolution is on that IGP? AMD doesn't seem to provide any of this. For work purposes, I'm wondering if it can handle dual 4K 60 Hz displays, which would mean a max digital resolution of at least 7680x2160 @ 60 Hz, or 2x 540 MHz (or 1x 1080 MHz?) pixel clocks. I guess it depends on the motherboard's display output limits too, but I'm not sure to what extent.

Thank you in advance for any insights.

_EDIT: It could actually be one big single pixel clock, like 1080 MHz - I'm not yet familiar with how this aspect works exactly._
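For what it's worth, a pixel clock is just total pixels per frame (active plus blanking) times the refresh rate, and each display head normally runs its own clock rather than one combined one. A rough back-of-the-envelope sketch, assuming the standard CTA-861 4K60 timing (4400x2250 total including blanking):

```python
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz: total pixels per frame (incl. blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 3840x2160 active; CTA-861 timing totals 4400x2250 with blanking
per_display = pixel_clock_mhz(4400, 2250, 60)
print(f"{per_display:.1f} MHz per 4K60 display")  # 594.0 MHz
```

So dual 4K60 needs roughly 594 MHz per head rather than one ~1080 MHz clock, which is why the per-output limit matters more than the sum.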


----------



## W1zzard (Aug 5, 2021)

pexxie said:


> Any clue what the max pixel clock or max digital resolution is on that IGP?


No idea, but since it's based on Vega I'd say 2x 4K60 at least, probably much higher.

https://www.techpowerup.com/gpu-specs/docs/amd-vega-architecture.pdf page 13


----------



## mechtech (Aug 5, 2021)

Same price as the 5800X in Canada, since the 5800X is on sale. Also, the 5700G's $360 US is about $440 CAD, not the $500 Newegg lists.









AMD Ryzen 7 5800X - Ryzen 7 5000 Series Vermeer (Zen 3) 8-Core 3.8 GHz Socket AM4 105W Desktop Processor - 100-100000063WOF (www.newegg.ca)

AMD Ryzen 7 5700G - Ryzen 7 5000 G-Series Cezanne (Zen 3) 8-Core 3.8 GHz Socket AM4 65W AMD Radeon Graphics Desktop Processor - 100-100000263BOX (www.newegg.ca)


----------



## Xuper (Aug 6, 2021)

This CPU doesn't support AV1 decode. Apple, Intel, and Nvidia support AV1, yet on AMD's side RDNA2 is the only GPU line that supports it. WTF is this? AMD should explain. Someone buys this APU, and five or more years later the buyer will notice that the 5700G doesn't support AV1. What's the point of buying an APU if the buyer wants to enjoy it?


----------



## TheGuruStud (Aug 6, 2021)

Xuper said:


> This CPU doesn't support AV1 decode. Apple, Intel, and Nvidia support AV1, yet on AMD's side RDNA2 is the only GPU line that supports it. WTF is this? AMD should explain. Someone buys this APU, and five or more years later the buyer will notice that the 5700G doesn't support AV1. What's the point of buying an APU if the buyer wants to enjoy it?


I've never used av1 in my life, don't plan to and it doesn't matter when the CPU is beastly. 
It's an oem cpu that's available for enthusiasts, b/c supply is good, now.


----------



## londiste (Aug 6, 2021)

TheGuruStud said:


> I've never used av1 in my life, don't plan to and it doesn't matter when the CPU is beastly.
> It's an oem cpu that's available for enthusiasts, b/c supply is good, now.


Are you sure? Netflix, YouTube and Facebook should all be using AV1 where supported...
A hardware decoder is definitely better than going to the CPU for it: if not faster, then much, much more efficient.

However, it is only the latest generations of GPUs that have hardware AV1 decoding: in addition to RDNA2, Nvidia's Ampere and Intel's Xe. We have known for a while that the 5000G is still Vega, so it's not that big of a surprise.


----------



## TheGuruStud (Aug 6, 2021)

londiste said:


> Are you sure? Netflix, YouTube and Facebook should all be using AV1 where supported...
> A hardware decoder is definitely better than going to the CPU for it: if not faster, then much, much more efficient.
> 
> However, it is only the latest generations of GPUs that have hardware AV1 decoding: in addition to RDNA2, Nvidia's Ampere and Intel's Xe. We have known for a while that the 5000G is still Vega, so it's not that big of a surprise.


YouTube still serves AVC/H.264 or VP9 if AV1 is unsupported (it's also resolution-dependent, I think). You can also force H.264.

These new codecs are useless to me. The bitrate is cut down too far using the BS hype of "50% more efficient" or whatever they make up, so quality just gets worse. You can pry x264 from my cold, dead hands!


----------



## Mussels (Aug 6, 2021)

Xuper said:


> This CPU doesn't support AV1 decode. Apple, Intel, and Nvidia support AV1, yet on AMD's side RDNA2 is the only GPU line that supports it. WTF is this? AMD should explain. Someone buys this APU, and five or more years later the buyer will notice that the 5700G doesn't support AV1. What's the point of buying an APU if the buyer wants to enjoy it?


It's gonna be years before this is needed... and also, are you really comparing Nvidia's stupidly priced 30-series cards against an APU, over a niche feature like AV1 support?
These CPUs can smash it out in software mode without an issue.

Are you really gonna get a budget APU for an 8K video playback HTPC, for content that doesn't exist yet?


----------



## londiste (Aug 6, 2021)

Mussels said:


> and also, are you really comparing the Nvidia 30 series stupidly priced cards vs an APU, with a niche feature like AV1 support?


How about Intel CPUs with Xe iGPU that are competing directly to these APUs?


----------



## Chrispy_ (Aug 6, 2021)

londiste said:


> How about Intel CPUs with Xe iGPU that are competing directly to these APUs?


Xe has it, but chances are good Xe will be superseded by something newer before AV1 is needed for mainstream video playback.

Given that you can play back AV1 in software on a Pentium Silver dual-core, I don't think any of these 8C/16T Zen 3 chips are going to burn more than 10 W playing back AV1, at which point, is there really any benefit to spending silicon area on it?

For an ultraportable using a fanless 7.5W CPU then sure, efficiency is everything but these are desktop chips with a 65W TDP connected to mains power at all times, and RDNA-based updated APUs are coming next gen so it's not like this will be a problem going forwards. For the product lifespan of the 5700G, AV1 hardware decode is unnecessary.


----------



## mechtech (Aug 6, 2021)

The only other things I wish W1zz had put in would be Witcher 3, just because it goes back a ways and gives something for older comparisons, and perhaps a first-gen Zen chip like the 1700 or something.


----------



## InVasMani (Aug 6, 2021)

DSC 1.2a over DP 1.4 on one of these APUs, using AMD Virtual Super Resolution on the APU for the desktop resolution, with desktop cloning from a discrete GPU, would be neat. You'd upscale the discrete GPU's active signal resolution, passed over to the APU via desktop cloning, through a desktop-resolution upscale via Virtual Super Resolution. In fact, you could turn on Radeon Boost for the discrete GPU on something like an RX 6600 while the APU performs Virtual Super Resolution, which then offsets it in real time. I think it would be quite interesting in practice, and I'd be quite keen to see it done.


----------



## Liquid Cool (Aug 9, 2021)

Thanks for the comprehensive review W1z.  Some of the guys on the forum are big fans of the APU's...and I am definitely one of them.  Very Appreciative....

The Ryzen 5 5600G is the first processor I've purchased directly from AMD.  I usually just grab one from Newegg, but they were sold out at the time and I wasn't taking any chances.  Before I started this post I did check Newegg's inventory and they are back in stock now.  The 5700G is in stock as well.

To be honest, I'm happy with the purchase, but I was also very happy with my Ryzen 3 3200G.  Although, considering the premium the 3200G's are selling at...the $100 upgrade to the Ryzen 5 5600G seemed like a no brainer. 




I sold mine a couple of weeks ago...they might still be going for this, or a little higher or lower...I don't know I haven't looked, but mine was only up for a few minutes before it sold. 

Considering I have a B450 series motherboard, this will be probably be the last processor I purchase for quite some time. 

Best Regards,

Liquid Cool


----------



## londiste (Aug 9, 2021)

Liquid Cool said:


> Considering I have a B450 series motherboard, this will be probably be the last processor I purchase for quite some time.


I was about to ask whether your B450 board supported 5600G but checking the compatibility list, it actually does. Then checked my previous B450 board and it also supports 5600G which was a nice surprise. Both Ryzen 3000 and Ryzen 5000 launches were a mess (and Ryzen 4000G was largely irrelevant with APU limitations and availability) at least when it comes to boards I had - support for new processors was spotty, late and occasionally problematic. AMD has really done well with support this time around.


----------



## Zubasa (Aug 9, 2021)

londiste said:


> I was about to ask whether your B450 board supported 5600G but checking the compatibility list, it actually does. Then checked my previous B450 board and it also supports 5600G which was a nice surprise. Both Ryzen 3000 and Ryzen 5000 launches were a mess (and Ryzen 4000G was largely irrelevant with APU limitations and availability) at least when it comes to boards I had - support for new processors was spotty, late and occasionally problematic. AMD has really done well with support this time around.


Also, since these APUs only support PCI-E 3.0, it is logical to use a B450 board in a budget build.


----------



## Chrispy_ (Aug 9, 2021)

Zubasa said:


> Also, since these APUs only support PCI-E 3.0, it is logical to use a B450 board in a budget build.


Not only that, but there are almost no PCIe 4.0 SSDs that make financial sense for an APU, coupled with the near-imperceptible benefit that PCIe 4.0 SSDs bring compared to PCIe 3.0 for general consumer workloads.

It's the difference between an app taking 4.9 s instead of 5.0 s to load, or an autosave taking 850 ms instead of 900 ms. No single user is going to care.


----------



## AlB80 (Aug 14, 2021)

> Integrated Radeon Vega Graphics
> AMD updated a few things.


What about the "Scan Converters" and "Packers"? Or did the author want to leave us in the dark?


----------



## Tom Sunday (Aug 24, 2021)

I don't really care about either AMD or INTEL coming out with more powerful or refreshed desktop CPU chips. They have no value for me. Per Gartner desktop PC shipments have been dropping steadily from 157 million shipped worldwide in 2010 to just 79 million in 2020. Any PC growth at all in 2020 was strictly based on the robust mobile PC market, with "mobile PCs" showing a 49% growth.

NVIDIA, it seems, puts no real value on desktop PC GPUs either, since it's less than 1% of its market share. There's more money in overwhelmingly producing automotive, cellular, consumer appliance, router, and laptop chips, which have a market share of over 90%. This is what people really need and can't do without. For the better part of a decade, for me it has always been an Alienware laptop all of the time, or my company laptop at the same time. Then occasionally I started working from the desktop PC that had been gathering dust in my home office. No thank you.

Then recently I purchased a premium 4K Chromebook for easy use in my kitchen, in the garden, and during prolonged waiting at the DMV. I am by nature a very lazy person and want instant convenience, 100% portability, and availability. I am confident that mobiles will continue to be the preferred computing method for the masses. Between working from a physical office and working from home (it's the new norm), desktop PCs will lose out even further in the marketplace. I guess it's called reality, and it's not good news for the dinosaur the desktop PC has already become, or is surely heading toward becoming!


----------



## Bomby569 (Aug 25, 2021)

Tom Sunday said:


> I don't really care about either AMD or INTEL coming out with more powerful or refreshed desktop CPU chips. They have no value for me. Per Gartner desktop PC shipments have been dropping steadily from 157 million shipped worldwide in 2010 to just 79 million in 2020. Any PC growth at all in 2020 was strictly based on the robust mobile PC market, with "mobile PCs" showing a 49% growth.
> 
> NVIDIA, it seems, puts no real value on desktop PC GPUs either, since it's less than 1% of its market share. There's more money in overwhelmingly producing automotive, cellular, consumer appliance, router, and laptop chips, which have a market share of over 90%. This is what people really need and can't do without. For the better part of a decade, for me it has always been an Alienware laptop all of the time, or my company laptop at the same time. Then occasionally I started working from the desktop PC that had been gathering dust in my home office. No thank you.
> 
> Then recently I purchased a premium 4K Chromebook for easy use in my kitchen, in the garden, and during prolonged waiting at the DMV. I am by nature a very lazy person and want instant convenience, 100% portability, and availability. I am confident that mobiles will continue to be the preferred computing method for the masses. Between working from a physical office and working from home (it's the new norm), desktop PCs will lose out even further in the marketplace. I guess it's called reality, and it's not good news for the dinosaur the desktop PC has already become, or is surely heading toward becoming!



You couldn't have worse timing, man. At any other time in recent history this would have been true; not now, not anymore. Desktop has never been this strong, never.


----------



## danc (Aug 30, 2021)

Hey W1zzard, I'd like to add something about dedicated VRAM. My finding is that it does affect game performance, provided the games tested don't already use heavy gobs of VRAM.

A good test is Unigine Heaven 4.0, Extreme preset at 1600x900; it only uses about 1 GB+ of VRAM.

Going from auto (512 MB) to dedicated (2 GB), the minimum FPS went from 6.5 to 15. The final score went from 560 to 590.

So yes, you can lose performance in old engines, and I assume emulators, if you leave the VRAM on auto.


----------



## Condelio (Nov 22, 2021)

RedelZaVedno said:


> $360 for 5700G is just too much if you're building gaming PC. You can opt for GTX 1650 + 10400F or 2nd hand 570 4gb + 10700F combo around the same budget and have much better gaming experience.
> 
> AMD needs to offer Zen3 + 12 or 16 CU RDNA1/2 APU to make me interested.


It's interesting to see how value propositions vary so wildly depending on where you live. In my country, a 5700G (which costs the same as a 12600K) costs more or less half as much as a used RX 570 4GB alone.

A new 5600G (depending on stock, or the phase of the moon) sometimes costs the same as a 5700G, sometimes 4/5 of its cost.


----------



## msroadkill612 (Nov 29, 2021)

somebodys_kid said:


> Have I read this correctly? All 20 PCI Express lanes from the CPU are available (albeit in GEN 3), unlike the 3400G which only had 12 available?  And does that mean that 4x4x4x4 bifurcation is available for the primary PCI Express x16 slot?


No, you haven't, or it's wrong - the APU loses 8 lanes compared to the desktop Ryzens.

Your mobo may still let you bifurcate the 8-lane slot to x4x4, though.


----------



## jeremyshaw (Nov 29, 2021)

msroadkill612 said:


> No, you haven't, or it's wrong - the APU loses 8 lanes compared to the desktop Ryzens.
> 
> Your mobo may still let you bifurcate the 8-lane slot to x4x4, though.


4000G and 5000G series seem to have 16+4+4 PCIe (main PCIe slot, usually; main M.2 slot, usually; chipset) lanes available for use, same as all mainstream/enthusiast Zen/Zen+/Zen2/Zen3 desktop CPUs on AM4. 2000G/3000G only has 8+4+4.

Laptop APU variants may be more limited, but that is another issue.


----------



## Mussels (Nov 29, 2021)

jeremyshaw said:


> 4000G and 5000G series seem to have 16+4+4 PCIe (main PCIe slot, usually; main M.2 slot, usually; chipset) lanes available for use, same as all mainstream/enthusiast Zen/Zen+/Zen2/Zen3 desktop CPUs on AM4. 2000G/3000G only has 8+4+4.
> 
> Laptop APU variants may be more limited, but that is another issue.


^ This is correct, 4000G and 5000G series have the full x16, just at gen 3.0


----------

