Friday, December 31st 2021

AMD's Upcoming Mobile Rembrandt APU Makes an Early Appearance

If you've been waiting for more details about AMD's next mobile platform, then you're in luck, as the motherlode has dropped today, with lots of details about the new Rembrandt APUs that are launching next year. Not only has a picture of the first motherboard, with accompanying laptop, shown up, but we also have a mostly complete block diagram and a list of expected SKUs, even though not all SKU models have been revealed as yet.

AMD's Rembrandt APU will be its first APU with PCIe 4.0 support, which in itself might not be worth the wait, but paired with the right GPU, it might help increase performance somewhat compared to the previous generation of APUs from AMD. The bigger news is USB4 support, plus a new integrated GPU that we so far don't know much about, though it's speculated that it'll be called the Radeon RX 680M and should offer 12 compute units. DDR5 memory support is also expected, which means Rembrandt has a new memory controller, since the APU is still based on the Zen 3 architecture.
It looks like we can expect as many as 24 different SKUs, with a TDP of either 28 or 45 watts for now, split between two different packaging types, FP7 and FP7r2. The first three models that have been revealed are the Ryzen 7 6800H with a 4.7 GHz boost clock, the Ryzen 9 6900HX with a 4.9 GHz boost clock and the Ryzen 9 6980HX with a 5 GHz boost clock. All three CPUs are said to have eight cores and 16 threads. The leaked notebook is Dell's Alienware m17 R5, which is said to sport up to a Radeon RX 6850M XT GPU with 12 GB of GDDR6 memory in addition to its Ryzen 9 6980HX APU. The motherboard in the picture is said to carry a Radeon RX 6700M with 10 GB of GDDR6, as it's from a lower-end model. It looks like we have some competition to look forward to next year between AMD and Intel in the mobile space, that is if the current shortages of everything don't ruin it.
Sources: VideoCardz, @ExecuFix
Add your own comment

29 Comments on AMD's Upcoming Mobile Rembrandt APU Makes an Early Appearance

#26
Chrispy_
AusWolf: Thank you! :toast:

Some people consider it sacrilege to say anything bad about Ampere as it is nvidia's most advanced architecture, and 8 nm Samsung is just sooo goood... but hey, people, let's look at the facts! The desktop 3060 eats about the same power as a 2070 and performs at the same level as the 2070 - not to mention that the 2070 isn't more than a 1080 with raytracing and DLSS support. Ergo, performance/watt hasn't changed since Pascal. All the higher tier chips throw efficiency out of the window. Ergo, Ampere isn't as advanced as it's advertised to be. It's a slightly reworked Turing built by Samsung instead of TSMC, nothing more.

Chips with such high power and cooling requirements should have no presence in a laptop in my opinion.
I'm genuinely curious to see what Nvidia does with the MX500-series; Raytracing performance at under 100W is barely viable on Ampere even using DLSS as a crutch - it adds die area, cost, and spreads the logic out on the die making clocks lower for any given operating voltage. This is NOT good for 25-50W laptop GPUs.

I read (here, or linked from here at least) about the MX500-series being evolutions of the Turing 16-series architecture, but refined for efficiency and using the slightly more efficient Samsung 8nm process. Again, we don't have a like-for-like comparison of TSMC12FF vs Samsung8 but I don't believe there's much efficiency to be gained from the move. Either way, any improvement is better than none and presumably Samsung8 is less constrained than anything from TSMC right now, so that should also help availability a bit too.

An MX-series chip at, say 28W isn't going to get anyone stoked, but at the same time it may match a 1650 Max-Q at 40W, and realistically that GPU can do everything any current game demands of it. Sure, you're going to be turning down settings just to get a stable 60fps but it's less about how good a 28W GPU looks and more about what you can actually do at all. If the 1650 is the lowest viable dGPU for current gaming, then getting that at 28W instead of 40W is a huge win for anyone trying to run a game on battery, or trying not to cook their testicles on the sofa. 1080p60 at low-medium settings really isn't too bad, especially when you're only seeing it on a 14" or 15" screen instead of a 27" or 32" gaming monitor.

I've said it a hundred times or more, pretty graphics are nice but they don't change the gameplay or game design. As long as the framerate is decent and the graphics settings required to meet that framerate don't look too compromised, then the experience is largely the same, just without needing to spend $2000+ and have something that's either hot/noisy/both.
Posted on Reply
#27
AusWolf
Chrispy_: Again, we don't have a like-for-like comparison of TSMC12FF vs Samsung8 but I don't believe there's much efficiency to be gained from the move.
We actually do. ;) 3060 = 2070 = 1080 in both performance and power consumption. Sure, they're different designs, but does that actually matter? The former two add ray tracing and DLSS, and the 3060 has 4 GB of extra VRAM - essentially, six RAM chips instead of eight on a narrower bus, which should theoretically need less power, but anyway...
Chrispy_: I'm genuinely curious to see what Nvidia does with the MX500-series; Raytracing performance at under 100W is barely viable on Ampere even using DLSS as a crutch - it adds die area, cost, and spreads the logic out on the die making clocks lower for any given operating voltage. This is NOT good for 25-50W laptop GPUs.

...

An MX-series chip at, say 28W isn't going to get anyone stoked, but at the same time it may match a 1650 Max-Q at 40W, and realistically that GPU can do everything any current game demands of it. Sure, you're going to be turning down settings just to get a stable 60fps but it's less about how good a 28W GPU looks and more about what you can actually do at all. If the 1650 is the lowest viable dGPU for current gaming, then getting that at 28W instead of 40W is a huge win for anyone trying to run a game on battery, or trying not to cook their testicles on the sofa. 1080p60 at low-medium settings really isn't too bad, especially when you're only seeing it on a 14" or 15" screen instead of a 27" or 32" gaming monitor.

I've said it a hundred times or more, pretty graphics are nice but they don't change the gameplay or game design. As long as the framerate is decent and the graphics settings required to meet that framerate don't look too compromised, then the experience is largely the same, just without needing to spend $2000+ and have something that's either hot/noisy/both.
I agree, especially since modern games don't offer much extra on "ultra" graphics settings when compared to "high" or "medium". I remember the times when "low" was unplayably ugly, but those times are over.

I also agree that the 1650 is about the lowest viable gaming dGPU nowadays (though the 1050 and the 960 4 GB might still have a few last words), but that's about as high as I would go in a laptop in terms of power consumption. When we're talking about 5+ kg and a requirement to be constantly plugged in, we're not really talking about a laptop anymore. You might as well build a small form factor gaming PC at that level. A laptop should be all about mobility. Heavy and power-hungry components defeat its purpose.

On the other hand, big chips offer some level of configurability. When I drag the power slider down to 71% (125 W) on my 2070, I only lose about 5-7% in performance. So who knows, nvidia might be onto something. Personally, I value APUs much more highly than dGPUs on the mobile front, but we'll see.
#28
Vayra86
R0H1T: 24 isn't much actually, though I hope it doesn't swell to 48 by the end of the year :slap:


Mobile APUs :wtf:
Yeah... all those gamers left without GPUs are most certainly going to be looking at mobile APUs with more graphical grunt and support for new stuff. The laptop market is the next thing to see higher demand; really, it already has. Except laptops shall remain shit and overpriced for gaming, most likely.

But also consider, if this is possible, the option to use mobile APUs in non-mobile systems / SFF / AIO and maybe even the DIY channel.
#29
Chrispy_
Chrispy_: I read (here, or linked from here at least) about the MX500-series being evolutions of the Turing 16-series architecture, but refined for efficiency and using the slightly more efficient Samsung 8nm process.
I went and looked it up - and it's bad news: there will be no TU117 sans RT hardware on Samsung 8nm.

The MX550 is just a tweak of the useless and inefficient MX450.
The MX570 is yet another bottom-of-the-barrel die-harvest of the worst, most defective 3050Ti cores (GA107).

Neither of those is worth talking about. Rembrandt will likely perform better than the MX450 without needing a dGPU and its associated cooling/power budget, making the MX550 a write-off. The MX570 is Ampere, which we already know from the 3050 and 3050Ti scales down pretty poorly in terms of power efficiency. If the 3050Ti isn't worth talking about from a performance/watt perspective, then something that performs far worse in a similar power envelope isn't going to get anyone excited :(