
Rumor: AMD Ryzen 7000 (Raphael) to Introduce Integrated GPU in Full Processor Lineup

Raevenlord

News Editor
Joined
Aug 12, 2016
The rumor mill keeps churning away; in this case, regarding AMD's plans for their next-generation Zen designs. Various users have shared pieces of the same AMD roadmap, which apparently places AMD in an APU-focused landscape come the Ryzen 7000 series. We are currently on AMD's Ryzen 5000 series; Ryzen 6000 is expected to materialize as a Zen 3+ design, with improved performance per watt derived from refinements to the current Zen 3 family. The Ryzen 7000 series, however, is expected to debut on AMD's next-gen platform (let's call it AM5), which is also expected to introduce DDR5 support to AMD's mainstream computing platform. And now the leaked, alleged roadmaps paint a Zen 4 + Navi 2 APU series in the works for AMD's Zen 4 debut with Raphael, roadmapped for manufacturing on the 5 nm process.

The inclusion of an iGPU chiplet with AMD's mainstream processors may signal a move by AMD to produce chiplets for all of its products and then integrate them in the final package. Think of it this way: AMD could "easily" pair one of the eight-core chiplets from the current Ryzen 7 5800X, for example, with an I/O die (which would likely still be fabricated at GlobalFoundries) and an additional Navi 2 GPU chiplet. It makes sense for AMD to start fabricating GPUs as chiplets as well; AMD's research on MCM (Multi-Chip Module) GPUs is pretty well known at this point, and is a given for future development. It means that AMD need only develop one CPU chiplet and one GPU chiplet, which it can then scale on-package by adding more of the svelte pieces of silicon - something Intel still doesn't do, which is why that company's dies remain monolithic.





That this APU integration occurs throughout the whole of AMD's lineup makes sense. For one, the integrated GPU can be used to further accelerate certain tasks on your PC. Add to that the reputation AMD has garnered over its generations of Zen, and you'll see how manufacturers (OEMs especially) might now look amenably at integrating AMD solutions into their mainstream, high-volume products. The full-APU road also makes sense on the AM5 platform: access to DDR5 should improve the performance of the integrated GPU chiplets, and it stands to reason that the new chiplet pairing requires a new platform and pin layout to work.

View at TechPowerUp Main Site
 
The poll has no option for a backup GPU, for when you don't have a dedicated video card or just want to troubleshoot something.
 
It looks like an optional feature, considering the dashed lines in the original leak, i.e. some SKUs might have it.
 
Having a CPU with an iGPU is always great, as it allows you to swap discrete GPUs any time you want; you can also withdraw from gaming temporarily (which is sometimes quite difficult) by selling your discrete GPU.
 
It's mostly useless unless they shrink the PS5 to 5 nm (~150 mm²), add 16 GB of HBM2e and 2 TB of flash under the heatspreader. Yeah.
 
Next up is replacement of the FPU - remember, AMD APUs can write to the same address space the CPU is using. It would bring so much power to the mix, since the integer and floating-point units would be split up and workloads wouldn't creep on each other's power constraints.
 
In any case, it is still a couple of generations away; by the time it actually ships, GPU supplies might have returned to normal.
 
In any case, it is still a couple of generations away; by the time it actually ships, GPU supplies might have returned to normal.
After a couple more generations, maybe there will be no more generations. :D
 
It looks like an optional feature, considering the dashed lines in the original leak, i.e. some SKUs might have it.
That is my thinking as well. If Zen 4 doubles core counts, it likely means that Ryzen 3 and Ryzen 5 will have an iGPU, assuming AMD keeps using one I/O die and one core/cache chiplet and adds one iGPU chiplet. On Ryzen 7 and Ryzen 9, which will likely use two core chiplets, there would not be enough physical space left to fit an iGPU chiplet.

Unless of course AMD does what Intel will do with LGA 1700 and makes the AM5 socket bigger than AM4. We will see, I guess.
 
It's hard to understand why AMD didn't include at least a modest 2D graphics processor on every client I/O die. Did the marketing guys and gals conclude that it's better to have nothing than to have something inferior to a Celeron's iGPU?
 
It's hard to understand why AMD didn't include at least a modest 2D graphics processor on every client I/O die. Did the marketing guys and gals conclude that it's better to have nothing than to have something inferior to a Celeron's iGPU?
Because what you call modest is not that easy to do. Not only do you need the 2D GPU, you also need to add the required I/O bits, which take up pins, which means a larger socket/CPU. Sure, it might only be a couple dozen pins (for DP and HDMI), but that's still quite a few. Also, once you have a "simple" 2D GPU, people are going to want a video decoder, which takes up even more die space; what's the point of having a 2D GPU if you can't use it to watch videos?
So it might seem like a small ask, but it makes everything more complex.

This is kind of old, but it gives you an idea of how many pins Intel dedicates to graphics.
 
It looks like an optional feature, considering the dashed lines in the original leak, i.e. som SKUs might have it.
I think the dashed lines were a moiré pattern (an optical illusion), not actually part of the picture. It's quite difficult to tell, though.

I expect AMD to do something similar to what Intel does with their "F" CPUs - selling the dies with defective GPUs as CPUs without integrated graphics.
 
Having a CPU with an iGPU is always great as it allows you to swap discrete GPUs any time you want and also you can withdraw from gaming temporarily (which is sometimes quite difficult) by selling your discrete GPU.
I have old GPUs that are no longer worth anything that cover this for me; no need for an iGPU here.

Still got a Radeon HD 4890 and an EVGA GeForce GTX 650 Superclocked in a box in the closet for this purpose.
 
I think the dashed lines were a moiré pattern (optical illusion), and not actually part of the picture. It's quite difficult to tell though.

I expect AMD to do something similar to what Intel does with their "F" CPUs - selling the dies with defective GPUs as CPUs without integrated graphics.
Looks pretty dashed to me...

AMD-Raphael-Zen4-Roadmap.jpg
 
This whole thing appears to be pure speculation. It sounds more like an idea that's been put in a memo and dropped on the floor in the hope it gains some traction.
 
One thing seems certain, though: AM5 will have more pins, maybe many more, because of the extra cores and probably an iGPU. And with what core architecture will AMD handle AI?
 
Because what you call modest is not that easy to do. Not only do you need the 2D GPU, you also need to add the required I/O bits, which take up pins, which means a larger socket/CPU. Sure, it might only be a couple dozen pins (for DP and HDMI), but that's still quite a few. Also, once you have a "simple" 2D GPU, people are going to want a video decoder, which takes up even more die space; what's the point of having a 2D GPU if you can't use it to watch videos?
So it might seem like a small ask, but it makes everything more complex.

This is kind of old, but it gives you an idea of how many pins Intel dedicates to graphics.
Is it that hard? They have all the required IP, and the AM4 socket already includes all the required I/O bits. It would be somewhat trivial (in the context of the engineering resources available to a big company like AMD) to adapt GPU features equivalent to Kabini, for example (128 GPU cores, an updated VCE, dual display output, audio output, etc.), to the 14 nm I/O die, at a modest die-size increase. There are probably non-technical reasons for this, because the technical ones are mostly clear. Maybe AMD prefers, from a marketing standpoint, to provide no IGP rather than a mockery of an IGP and look incompetent; maybe they want more differentiation between products to avoid cannibalization; etc.
 
Next up is replacement of the FPU - remember, AMD APUs can write to the same address space the CPU is using. It would bring so much power to the mix, since the integer and floating-point units would be split up and workloads wouldn't creep on each other's power constraints.
Why would someone do that? You can't seriously expect every floating-point operation to be offloaded to the GPU, causing the CPU pipeline to stall for 1,000+ cycles. The FPU in a CPU is there for low-latency operations within the out-of-order pipeline; the GPU is not relevant in this context. Even something like AVX still has a strong case against the GPU: it has extremely low latency and can be scheduled inside the CPU pipeline.

GPGPU also has excellent INT32 throughput. Why hasn't anyone proposed offloading the CPU's integer units to the GPU? Well, the reason is obvious.
 
Because what you claim is modest, is not that easy to do. Not only do you need the 2D GPU, but you also need to add the I/O bits required, which takes up pins, which means a larger socket/CPU. Sure, it might only be a couple of dozens of pins (for DP and HDMI), but that's still quite a few. Also, once you have a "simple" 2D GPU, people are going to want a video decoder, which takes up even more die space, as what's the point of having a 2D GPU if you can't use it to watch videos?
So it might seem like a small ask, but it makes everything more complex.

This is kind of old, but it gives you an idea of how many pins Intel dedicates to graphics.
The I/O pins are already there - otherwise, APUs on AM4 wouldn't be able to output video. All B550 boards have an HDMI output, many have DP too, and even many X570 boards have one or both.
1080p decoding or 720p encoding for videoconferencing would put some load on the CPU - I have no idea how much; I'm just guessing that one core dedicated to video should be enough. That used to be a problem in 2007; it shouldn't be a problem now.

Managing customers' expectations would be the hardest part. Yes, you can have a nice Ryzen-based office PC with two displays and no dGPU. But no, you can't do absolutely everything without a dGPU.
 
It's mostly useless unless they shrink the PS5 to 5 nm (~150 mm²), add 16 GB of HBM2e and 2 TB of flash under the heatspreader. Yeah.
Yeah, I would love a powerhouse APU: something with at least 8 GB of HBM2, 8-12 CPU cores, and at least the power of 40 RDNA2 CUs. I wouldn't care if it required a socket the size of Threadripper's; the most epic mini-ITX builds could be made with an APU like that. But apparently those who want such a thing are a niche market; most people don't want their CPU to be locked to a specific GPU forever, or at least that's what I've been told. I could see such an APU being popular in the OEM market, though: basically small gaming machines with the power of an Xbox Series X. Or can you imagine what kind of GPU power could be put in the package of an EPYC CPU? 16 CPU cores with 60-72 RDNA2 CUs (basically a 5950X plus a 6800/6800 XT) with 12-16 GB of HBM2e. I imagine it would be hard to get some people on board with a $1,500 APU, but such an APU could make for some interesting high-density server applications.
 
The FPU in a CPU is there for low-latency operations
I don't see how that is true, since they have separate schedulers for it. It would be zero cost for AMD to bunch them together. Note that you might be missing the +1 GHz this could bring to the table.
Even something like AVX still has a strong case against the GPU
AVX is stupid. You cannot expand the bus interface. This isn't a clown CPU; you aren't impressing anybody. That will deplete the muxes' power all too randomly.
GPGPU also has excellent INT32 throughput
GPUs are supposed to run floating point. AMD can handle short and simple integer scheduling on its CPUs.
 
Next up is replacement of the FPU - remember, AMD APUs can write to the same address space the CPU is using. It would bring so much power to the mix, since the integer and floating-point units would be split up and workloads wouldn't creep on each other's power constraints.
FPU? What do you mean by that? Electric brain substitute? (Excuse me for trying to do away with common buzzwords; E.B.S. is just an AI!DL!NPU! unit.) That would make sense, as it would execute its own program code - like a GPU does - not part of the CPU's code.
 
A refresh of Zen 3 on 6 nm as the ultimate, final option for AM4 makes sense: it provides either more efficiency or more performance, keeps AMD in the spotlight against Intel, and closes out AM4 nicely while also bringing the numbering up to match the Radeon series (inb4 System 69 builds: 6900X + 6900 XT). That would unfortunately leave the APUs once again a half-step behind, since they're only now releasing and there are no plans to refresh them until the shift to DDR5 and PCIe 5.0.
 
The server market is demanding an iGPU, and that is probably why they are doing it. EPYC will come with an iGPU as surely as the sun rises in the east and sets in the west.

I think it's great to have a highly efficient yet very capable iGPU to complement the discrete GPU. And let's face it, integrated RDNA2 should be pretty darn good and will probably handle 1080p gaming at decent settings, so the laptop market will love this.
 
It's mostly useless unless they shrink the PS5 to 5 nm (~150 mm²), add 16 GB of HBM2e and 2 TB of flash under the heatspreader. Yeah.
Sure, they can customise that for you, since that is what AMD does best. The question is: how much are you willing to pay for it?

When they design a chip, there are a lot of factors to consider. It is not just slapping everything on people's wish lists onto it and calling it a day. That's why, while some of these ideas are nice to have, I know they won't be practical, and also not cheap.

To me, even a simple iGPU would be quite helpful when it comes to troubleshooting, and in times of GPU shortage. At least you can still get a display output working.
 