
Ryzen Owners Zen Garden

Curious what people's experience with the iGPU on the 7800X3D is like.

Other than the fact that I can't work out how to disable it in the BIOS (only deactivate it in Windows), I was wondering what use I'd even have for it being active (I have a 6800 XT in the system).

I considered re-enabling it for my 4K second screen (only 60 Hz) and using the onboard output of the motherboard, but I feel like that would create some unforeseen crazy.

I game on my primary (1080p 144 Hz) and put all my streaming stuff on the 4K screen (SLOBS, Ground Control, Discord, chat window, a YT feed of non-copyrighted music).

Anyone got any input?
 
110 W?
The 5800X3D has a 142 W limit.


That's how mine behaves. I'm confused by GuruStud's claims, as it's not matching how I know the 5800X3D to behave.
Right you are. I had 110 stuck in my head (probably my last OC CPU was 110 max).
Me has no good memory and it's been a while.
You're also forgetting the Asucks board.
I did a lot of testing and this yields the highest MT scores. Regular PBO +200 was pretty meh at 4.4.
 
I never saw a Ryzen 5800X3D over 105 W for gaming use, anyway.

PPT: 65 W is nice :)
 
Curious what people's experience with the iGPU on the 7800X3D is like.

Other than the fact that I can't work out how to disable it in the BIOS (only deactivate it in Windows), I was wondering what use I'd even have for it being active (I have a 6800 XT in the system).

I considered re-enabling it for my 4K second screen (only 60 Hz) and using the onboard output of the motherboard, but I feel like that would create some unforeseen crazy.

I game on my primary (1080p 144 Hz) and put all my streaming stuff on the 4K screen (SLOBS, Ground Control, Discord, chat window, a YT feed of non-copyrighted music).

Anyone got any input?
It's handy for fault-finding if nothing else. As you have an AMD graphics card, nothing crazy should happen if you enable it.
Not sure why you can't disable it in the UEFI; my motherboard at least has such an option.
Reading the manual, you have to go into AMD CBS under Advanced, then into NBIO Common Options, and find iGPU Configuration, where you should be able to disable it.
 
Curious what people's experience with the iGPU on the 7800X3D is like.

Other than the fact that I can't work out how to disable it in the BIOS (only deactivate it in Windows), I was wondering what use I'd even have for it being active (I have a 6800 XT in the system).

I considered re-enabling it for my 4K second screen (only 60 Hz) and using the onboard output of the motherboard, but I feel like that would create some unforeseen crazy.

I game on my primary (1080p 144 Hz) and put all my streaming stuff on the 4K screen (SLOBS, Ground Control, Discord, chat window, a YT feed of non-copyrighted music).

Anyone got any input?
I use it to connect my secondary display (a 7" 1024x600 touchscreen), which lowers the idle power consumption of my 7800 XT considerably (from 35-45 W to 11-12 W).

The only "unforeseen crazy" it creates is that the "Streaming" tab with all its options disappears from the Adrenalin driver menu, which is a known bug that AMD is working on. But I don't stream, so whatevs. :)
 
Curious what people's experience with the iGPU on the 7800X3D is like.

Other than the fact that I can't work out how to disable it in the BIOS (only deactivate it in Windows), I was wondering what use I'd even have for it being active (I have a 6800 XT in the system).

I considered re-enabling it for my 4K second screen (only 60 Hz) and using the onboard output of the motherboard, but I feel like that would create some unforeseen crazy.

I game on my primary (1080p 144 Hz) and put all my streaming stuff on the 4K screen (SLOBS, Ground Control, Discord, chat window, a YT feed of non-copyrighted music).

Anyone got any input?

I don't think you need to think about disabling it at all. If you don't use it by plugging a display into it, it doesn't draw power or rob you of performance. Ryzen has gotten extremely good at power-gating unused parts of the die, whether it's the iGPU or cores.

The more meaningful difference is on the monolithic APUs, where having a dGPU and not using the iGPU will improve Fabric and UMC speeds. But even there, there's no need to manually disable the iGPU, as it's completely shut down when not in use - just plug in a dGPU.

On chiplet CPUs, the I/O die and interconnects claim virtually all of your idle power (an idle CCD consumes something like 0.1 W), but the RDNA2 iGPU is a tiny portion of the IOD and most likely already aggressively power-gated, so trying to squeeze power savings out of it is a kinda far-fetched idea. Better savings can be had by just cutting VSOC.

The 2 CU iGPU is not a particularly impressive unit, but it'll still handle 2D work on a single screen just fine, such as the uses you listed. It potentially helps all dGPUs lower their memory clock (and therefore idle power) as well, Radeons a bit more, depending on your monitor setup.

As to BIOS options, I can't really remember anything in AGESA that explicitly offers an iGPU disable function either. Maybe there is one on AM5, maybe not.
 
I don't think you need to think about disabling it at all. If you don't use it by plugging a display into it, it doesn't draw power or rob you of performance. Ryzen has gotten extremely good at power-gating unused parts of the die, whether it's the iGPU or cores.
Not to mention that the 7800X3D doesn't come anywhere near its default PPT under any load, so saving a watt or two on the iGPU is pretty much pointless.

The 2 CU iGPU is not a particularly impressive unit, but it'll still handle 2D work on a single screen just fine, such as the uses you listed. It potentially helps all dGPUs lower their memory clock (and therefore idle power) as well, Radeons a bit more, depending on your monitor setup.
The only thing I wouldn't recommend is connecting your primary display to the iGPU and using the graphics card for 3D work / games. It works flawlessly on Nvidia, but it's kinda bugged on AMD.

But connecting your primary display to the dGPU and the secondary to the iGPU is fine. :)

As to BIOS options, I can't really remember anything in AGESA that explicitly offers an iGPU disable function either. Maybe there is one on AM5, maybe not.
There is one (at least on my MSI board). It's hidden somewhere in the Advanced Settings section.
 
As to BIOS options, I can't really remember anything in AGESA that explicitly offers an iGPU disable function either. Maybe there is one on AM5, maybe not.
I can check for you, but it's usually just down to the choice of which display is the primary (PEG 1/2 or IGP).

Inside the OS, they power down based on your chosen settings: the High Performance power plan keeps the PCIe lanes at full speed, while Balanced will let them drop to PCIe 1.1 speeds to save power, etc.
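If anyone wants to poke at that setting from a script rather than the Control Panel, here's a rough Python sketch (assuming Windows; `SCHEME_CURRENT`, `SUB_PCIEXPRESS` and `ASPM` are built-in `powercfg` aliases on current Windows 10/11):

```python
# Rough sketch: inspect the PCIe Link State Power Management setting
# of the active Windows power plan via powercfg. Value 0 = Off (links
# stay at full speed, as in the High Performance plan), 1 = Moderate,
# 2 = Maximum power savings.
import subprocess

# List the PCI Express subgroup of the current plan, which includes
# the "Link State Power Management" (ASPM) setting.
out = subprocess.run(
    ["powercfg", "/query", "SCHEME_CURRENT", "SUB_PCIEXPRESS"],
    capture_output=True, text=True, check=True,
).stdout
print(out)

# To pin the links at full speed on AC power, one could then run:
#   powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
#   powercfg /setactive SCHEME_CURRENT
```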
 
Yeah, thanks ppls.

I was curious, figuring that 'something' might have been happening on the iGPU, since Windows could at least see it as a device (which I disabled). I am going to put my 4K display on the iGPU output, play some video and YouTube, try it with streaming running on my gaming screen, and see what kind of performance / use case I can derive out of it.

I'll report back when I have some answers (maybe metrics, depending on what I can ascertain).
 
Yeah, thanks ppls.

I was curious, figuring that 'something' might have been happening on the iGPU, since Windows could at least see it as a device (which I disabled). I am going to put my 4K display on the iGPU output, play some video and YouTube, try it with streaming running on my gaming screen, and see what kind of performance / use case I can derive out of it.

I'll report back when I have some answers (maybe metrics, depending on what I can ascertain).

You can also just monitor HWiNFO to see whether your Package Power and SoC Power change with the iGPU enabled and in use / enabled and unused / disabled. I doubt there will be more than a negligible difference between the latter two scenarios. Just don't pay attention to the power metrics under the HWiNFO iGPU block ("AMD Radeon"); AMD iGPU figures have never been calculated in a useful way in HWiNFO.
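If you'd rather have numbers than eyeball the sensors, HWiNFO can log to CSV, and a few lines of Python will average the relevant columns across a run. A sketch, assuming one log file per scenario; the column names below are guesses, so match them to the header row your HWiNFO build actually writes:

```python
# Sketch: average selected power columns from an HWiNFO CSV sensor log.
# Column names are assumptions -- check your own log's header row.
import csv
from statistics import mean

COLUMNS = ["CPU Package Power [W]", "CPU SoC Power (SVI3 TFN) [W]"]

def average_power(path):
    samples = {c: [] for c in COLUMNS}
    # HWiNFO logs are often not UTF-8; latin-1 never raises on decode.
    with open(path, newline="", encoding="latin-1") as f:
        for row in csv.DictReader(f):
            for c in COLUMNS:
                try:
                    samples[c].append(float(row[c]))
                except (KeyError, ValueError):
                    pass  # column missing or cell not numeric
    return {c: mean(v) for c, v in samples.items() if v}

# Log once per scenario, then compare, e.g.:
#   print(average_power("igpu_enabled_in_use.csv"))
#   print(average_power("igpu_enabled_unused.csv"))
#   print(average_power("igpu_disabled.csv"))
```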
 
Yeah, thanks ppls.

I was curious, figuring that 'something' might have been happening on the iGPU, since Windows could at least see it as a device (which I disabled). I am going to put my 4K display on the iGPU output, play some video and YouTube, try it with streaming running on my gaming screen, and see what kind of performance / use case I can derive out of it.

I'll report back when I have some answers (maybe metrics, depending on what I can ascertain).
Anything you run on the IGP saves a bit of dGPU performance - it may be tiny, but it's something.
 
My 5950X does alright compared to the crappy 5800X I had. It will boost to 5.150 GHz here and there, 4.975 often, so no complaints here.
 
My 5950X does alright compared to the crappy 5800X I had. It will boost to 5.150 GHz here and there, 4.975 often, so no complaints here.
The 5950X is definitely a beast, the snappiest AM4 CPU you can buy. The 5900X is smooth as butter, but the 5950X is like a GT1 car vs a GT2. Well, maybe the Silver Arrow vs the Panoz.
 
Ah, ok. I see. Thinking I missed a conversation? To be fair though, a stock 5800X is an excellent CPU.
It is, I run my 5800X stock all the time and just let it boost automatically when needed.
 
I can see their point though. The "X" versions of the Ryzen CPUs are supposed to be the unlocked, "better" versions, and not getting a solid OC out of one would be a disappointment and a letdown.
True...
Though the 5800X is a "special" case due to its single CCD dissipating a lot of heat compared to the rest of the 5000 lineup.
If heat transfer is not happening fast enough, CO tweaking won't have as much effect, and maybe this can be interpreted as a bad bin.
 
True...
Though the 5800X is a "special" case due to its single CCD dissipating a lot of heat compared to the rest of the 5000 lineup.
If heat transfer is not happening fast enough, CO tweaking won't have as much effect, and maybe this can be interpreted as a bad bin.
So that's why the 5800X runs hotter than other Ryzen CPUs. Good to know. When I benchmark the CPU, for example, and use HWMonitor, I notice that out of the 8 cores, 2 of them seem really hot, up to +90 °C, while the others hover around 40 °C to 70 °C.

Regarding what you mentioned about the single CCD, take a look at this.

 
So that's why the 5800X runs hotter than other Ryzen CPUs. Good to know. When I benchmark the CPU, for example, and use HWMonitor, I notice that out of the 8 cores, 2 of them seem really hot, up to +90 °C, while the others hover around 40 °C to 70 °C.

Regarding what you mentioned about the single CCD, take a look at this.

We've seen occasional 2-CCD 6-cores and 8-cores around for a long while, even on TPU. They aren't any different in terms of thermals or performance - all the cores are still on CCD0; CCD1 is there, but it's dud dark silicon.

If CCD1 were functional, it would be very obvious and big news, because gaming (and general) performance on those CPUs would essentially return to Zen 2 levels and go in the absolute shitter, having to split their 6 or 8 cores between 2 CCDs.

A 30-50 °C delta between cores just means that you weren't loading all the cores - it wasn't an all-core workload. A 10-20 °C delta on one CPU is plausible but less common on 6-8 core parts; it just means bad luck of the draw with IHS contact, unfortunately.
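For anyone who wants to check their own spread properly: run an actual all-core load (Cinebench MT or similar), log it with HWiNFO, and compare peak core temperatures. A quick Python sketch, with guessed column names and a hypothetical log file name:

```python
# Sketch: hottest-vs-coolest core spread from an HWiNFO CSV log taken
# during an all-core load. Column names are assumptions -- check the
# header row of your own log.
import csv

CORE_COLS = [f"Core {i} [°C]" for i in range(8)]  # 8-core 5800X

peaks = {}
with open("allcore_load.csv", newline="", encoding="latin-1") as f:
    for row in csv.DictReader(f):
        for col in CORE_COLS:
            try:
                peaks[col] = max(peaks.get(col, float("-inf")), float(row[col]))
            except (KeyError, ValueError):
                pass  # column missing or cell not numeric

if peaks:
    spread = max(peaks.values()) - min(peaks.values())
    print(f"peak-to-peak core spread: {spread:.1f} °C")
```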
 
Yeah, thanks ppls.

I was curious, figuring that 'something' might have been happening on the iGPU, since Windows could at least see it as a device (which I disabled). I am going to put my 4K display on the iGPU output, play some video and YouTube, try it with streaming running on my gaming screen, and see what kind of performance / use case I can derive out of it.

I'll report back when I have some answers (maybe metrics, depending on what I can ascertain).
The only noticeable difference I'd expect is some power saved on your gaming GPU at idle (assuming it's connected to a high-resolution and/or high refresh rate display). The iGPU consumes peanuts in power, which I doubt you can see in any monitoring program, and since the 7800X3D never comes close to its power limit with or without iGPU use, no performance is wasted there, either.
 
Though the 5800X is a "special" case due to its single CCD dissipating a lot of heat compared to the rest of the 5000 lineup.
I honestly do not know if I can agree with that. I don't have the X, but I have the X3D and it is super easy to cool.. easier than a maxed out 5600X :D
 
I honestly do not know if I can agree with that. I don't have the X, but I have the X3D and it is super easy to cool.. easier than a maxed out 5600X :D
If it's anything like the 7800X3D vs the 7700X, then I'd say that's because of its lower clock speeds and much lower operating voltage.
 
If it's anything like the 7800X3D vs the 7700X, then I'd say that's because of its lower clock speeds and much lower operating voltage.
But.. but.. what about that layer of cache everyone goes on about :D
 