
Samsung RDNA2-based Exynos 2200 GPU Performance Significantly Worse than Snapdragon 8 Gen1, Both Power Galaxy S22 Ultra

Architectures designed for scaling up, like RDNA, which is built with massive chips like the 5,120-shader RX 6900 XT in mind, don't typically scale down well at all.

Scalability and efficiency are almost mutually exclusive: if you're NOT scaling up, the compromises you have to make to enable scalability prevent you from min-maxing a compact, zero-waste design that works well at the lowest end of the spectrum.
 
OK, I understand this is AMD's first effort in the very-low-power space, BUT what is the point of launching a GPU that isn't even faster than the previous generation, and touting it so much?

It's not their first effort: Qualcomm's Adreno GPUs used to be AMD (the Adreno 200 is the AMD Z430), part of the Imageon line.
AMD sold that GPU division to Qualcomm.
 
It's not their first effort: Qualcomm's Adreno GPUs used to be AMD (the Adreno 200 is the AMD Z430), part of the Imageon line.
AMD sold that GPU division to Qualcomm.
And they've done mobile graphics in the form of IGPs since then.

Just as well; the only Samsung mobile devices I'd touch are tablets anyway. Probably not even those, now that they've moved the A series over to Unisoc :shadedshu:
 
It's not their first effort: Qualcomm's Adreno GPUs used to be AMD (the Adreno 200 is the AMD Z430), part of the Imageon line.
AMD sold that GPU division to Qualcomm.
That's unrelated to what they're doing now.

OK, I understand this is AMD's first effort in the very-low-power space, BUT what is the point of launching a GPU that isn't even faster than the previous generation, and touting it so much?
The same can be said about Ryzen 2000 (~3%), Intel Skylake Core 6000 (?%), the Radeon RX 6500 XT.....
 
It's not their first effort: Qualcomm's Adreno GPUs used to be AMD (the Adreno 200 is the AMD Z430), part of the Imageon line.
AMD sold that GPU division to Qualcomm.

Adreno -> unscramble -> Radeon

But yeah, AMD hasn't gone for super-low-power GPUs in a long time.
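The anagram above is easy to verify with two lines of Python:

```python
# "Adreno" is an anagram of "Radeon": same letters, different order.
assert sorted("adreno") == sorted("radeon")
print(sorted("adreno"))  # -> ['a', 'd', 'e', 'n', 'o', 'r']
```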
 
This has always been the case. There was maybe one generation when Exynos was on par with Qualcomm, but otherwise Exynos has been worse in every respect. And the sneaky thing is that Samsung sends the Snapdragon variants to reviewers, even in regions where the Snapdragon versions aren't sold.
 
It's not their first effort: Qualcomm's Adreno GPUs used to be AMD (the Adreno 200 is the AMD Z430), part of the Imageon line.
AMD sold that GPU division to Qualcomm.
This time the AMD people themselves can shout "Adios My Dineros!" :)
 
Don't tell me they used AMD's drivers instead of Mesa's :slap:
 
Architectures designed for scaling up, like RDNA, which is built with massive chips like the 5,120-shader RX 6900 XT in mind, don't typically scale down well at all.

Scalability and efficiency are almost mutually exclusive: if you're NOT scaling up, the compromises you have to make to enable scalability prevent you from min-maxing a compact, zero-waste design that works well at the lowest end of the spectrum.

I'm thinking of something much simpler... do the RDNA/RDNA2 drivers do tile-based rendering at all?

Mobile GPUs are optimized for tile-based deferred rendering, while desktop GPUs use a different style of rendering/rasterization. Maybe these mobile-GPU benchmarks are largely tests that favor tile-based deferred rendering?
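For anyone unfamiliar with the distinction: a toy sketch of the binning step that tile-based GPUs perform (heavily simplified bounding-box test, not any vendor's actual pipeline). The tiler first sorts triangles into screen-space tiles, then shades one tile at a time from fast on-chip memory, which is why mobile GPUs save so much DRAM bandwidth:

```python
# Toy tile-based binning sketch (assumption: simplified 2D AABB overlap
# test; real tilers clip triangles and use hierarchical structures).
TILE = 32  # tile size in pixels; real mobile tilers use ~16-32 px

def tiles_touched(tri, width, height):
    """Return the set of (tx, ty) tiles a triangle's bounding box overlaps."""
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    x0, x1 = max(min(xs), 0), min(max(xs), width - 1)
    y0, y1 = max(min(ys), 0), min(max(ys), height - 1)
    return {(tx, ty)
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1)
            for ty in range(int(y0) // TILE, int(y1) // TILE + 1)}

def bin_triangles(tris, width, height):
    """Build per-tile triangle lists; the GPU then renders tile by tile."""
    bins = {}
    for i, tri in enumerate(tris):
        for tile in tiles_touched(tri, width, height):
            bins.setdefault(tile, []).append(i)
    return bins

tris = [((5, 5), (60, 10), (10, 60)),          # spans a 2x2 block of tiles
        ((100, 100), (110, 100), (100, 110))]  # fits entirely in one tile
print(len(bin_triangles(tris, 128, 128)))      # -> 5 distinct tiles touched
```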
 
I can't see thermals in any of those benchies, including the German one.

Basically, who says the two SoCs are clocked the same? Without that you can't estimate per-clock performance.
 
I do know the Adreno GPU team stems from AMD, but it's a bit too much of a stretch to still call it AMD.
The Adreno GPU architecture has changed so many times that I doubt it has much in common with the last GPU the team built under AMD.

For a first effort it's good, BUT Samsung works with AMD to get an upper hand on the competition in this space, not to console themselves that "something better will come".
This isn't a GPU made in a year; they've been working on this project for at least a couple of years, so this is quite bad given their initial ambitions.
 
Meanwhile, I thought the Exynos 2200 would actually make a difference. Lol, bye bye RDNA2-powered breakthrough....
 
I'm curious what the issues are. Is RDNA2 not efficient enough for such a low-power device? A poor job on Samsung's part with integration and/or manufacturing? Software problems: drivers, firmware, etc.?
 
Eh, processing power for an average user like me, who uses Netflix and that's about it, is, I mean, well, pointless.

Personally, I find iOS finicky to use: whenever I tried to hit "next episode" on Netflix in the bottom-right corner on an iPad Mini, it would sometimes mess up and send me back into the episode list all over again. It was really annoying; I've never had a single issue with "next episode" on Android. So Android it is for me, because I'm lazy and don't care and just want Netflix in the background at work.
The problem here is that the same phone model is powered by two different SoCs, and if you happen to live in the EU, you get considerably less GPU performance than the US counterpart, just because. That's not good.


EU buyers usually already pay more than the US for the same product, but now it also comes with objectively worse hardware. I personally wouldn't mind if it were priced accordingly, but you know, things don't work like that.
 
Nothing will touch the performance of Apple's A-series chips in the handheld space for the foreseeable future. There was some hope that the RDNA2-based Xclipse would start to catch up; sadly, that doesn't look to be the case. Android users will have to wait longer to get the kind of performance numbers iOS users have been enjoying for a few years now.

It doesn't matter how fast you're going when you're going in the wrong direction. When Apple is forced to open the App Store that might start to change; until then, the extra performance from A-series chips only serves for e-peen gratification.

The problem here is that the same phone model is powered by two different SoCs, and if you happen to live in the EU, you get considerably less GPU performance than the US counterpart, just because. That's not good.


EU buyers usually already pay more than the US for the same product, but now it also comes with objectively worse hardware. I personally wouldn't mind if it were priced accordingly, but you know, things don't work like that.

The price difference is because of taxes and tariffs; with the trade-war tariffs still in place, the prices are pretty similar. US prices are also regularly quoted without sales tax because every state is different, while in the EU we're used to seeing VAT already applied.

But it's not all bad: the Snapdragon versions are usually locked down pretty tight and see no custom ROM development, contrary to Exynos.
 
But it's not all bad: the Snapdragon versions are usually locked down pretty tight and see no custom ROM development, contrary to Exynos.
I used to tinker with custom firmware on my phones, but unless you go with popular handsets, you're pretty much stuck with the official ROMs or unofficial, most likely abandoned, ports of proper custom firmware, which makes the scene pretty irrelevant. I've used Sony's XZ Premium and Xperia 5 for the last four years and it's just a desert for me; at least the official ROMs are decent and quite easy to debloat even without rooting.
 
A GPU architecture not designed for smartphones performs poorly in smartphones. Why exactly is this news?
 
I ordered an S22 Ultra 256 GB, and it will be the Exynos version as I live in Germany. :laugh:
I think it will be a good upgrade in any case, as I'm coming from an S10+.
Gaming is not my thing on the phone, so GPU performance isn't the most important factor for me as long as I can still watch YouTube videos. :rolleyes:
I use the camera more often, though.

For gaming I have this special AMD machine: :D

I was also first thinking about the iPhone 13 Pro Max, but I've been a long-term Samsung user since the first Galaxy S in 2010, and I'm more used to Android.
I have nothing against Apple; I now own an iPad Pro 11 3rd generation (5G, 256 GB).
 
Embarrassing for both Scamsung and AMD, IMO. The final nail in the Exynos coffin, surely. Why they persisted with this garbage is laughable. The 2100 was halfway decent compared to the prior crap, but I'd buy a MediaTek SoC over Crapnyos any day. Apple must be cacking themselves stupid. Australia only gets the Crapnyos versions, so the S21 will be my last Samsung unless they move solely to Qualcomm across all markets.
 
Here is also a video that I found quite interesting.
This guy is talking about the S22 Ultra version with the Snapdragon 8 Gen 1.

 
I ordered an S22 Ultra 256 GB, and it will be the Exynos version as I live in Germany. :laugh:
I think it will be a good upgrade in any case, as I'm coming from an S10+.
Gaming is not my thing on the phone, so GPU performance isn't the most important factor for me as long as I can still watch YouTube videos. :rolleyes:
I use the camera more often, though.
Consider that, coming from an S21 Ultra, you would actually get less battery life...
Not to mention the reduced RAM configuration sold at the same price points as the previous model (the S21 Ultra topped out at 16 GB, the S22 Ultra tops out at 12 GB): another great "upgrade" that Samsung shamelessly offers its customers.
Honestly, the only redeeming point of the S22 is the pledge of four OS updates and five years of security support; in other regards it's just a waste of money.
 
I ordered an S22 Ultra 256 GB, and it will be the Exynos version as I live in Germany. :laugh:
I think it will be a good upgrade in any case, as I'm coming from an S10+.
Gaming is not my thing on the phone, so GPU performance isn't the most important factor for me as long as I can still watch YouTube videos. :rolleyes:
I use the camera more often, though.

For gaming I have this special AMD machine: :D

I was also first thinking about the iPhone 13 Pro Max, but I've been a long-term Samsung user since the first Galaxy S in 2010, and I'm more used to Android.
I have nothing against Apple; I now own an iPad Pro 11 3rd generation (5G, 256 GB).

From the camera comparisons I've seen, the S22 Ultra is still better than the iPhone Pro Max. At the end of the day these cameras come down to taste, though; both take amazing pictures, obviously, but in the comparison shots I preferred the S22 Ultra's camera.

Now the real question is: are you going to make memories with friends and loved ones, and be able to save those memories in high quality? At the end of the day, that's all a phone is, a tool to enhance life and remember it more clearly. IMO, anyway.
 
From the camera comparisons I've seen, the S22 Ultra is still better than the iPhone Pro Max. At the end of the day these cameras come down to taste, though; both take amazing pictures, obviously, but in the comparison shots I preferred the S22 Ultra's camera.

Now the real question is: are you going to make memories with friends and loved ones, and be able to save those memories in high quality? At the end of the day, that's all a phone is, a tool to enhance life and remember it more clearly. IMO, anyway.
Yes, the camera is one of my most frequent use cases, as the phone is in my pocket all the time, and when you're out with the kids and see a nice moment, you can quickly capture it. :)
I haven't carried my SLR camera in a long while, except on some special occasions.

I regularly save the original photos and videos from the phone to my own home Synology server.
If, for example, the grandparents want to print a nice picture of the kids, I usually send it as a zip file via email; otherwise it loses quality if you share it through WhatsApp.
 
My guess is that RDNA2 thrives on high clock speeds; none of the RDNA2 GPUs run below 2 GHz. But high clock speeds carry a power penalty, which also means a steep increase in heat output. The combination of Samsung's node, which seems to suffer from high power consumption, and thermal throttling may result in poor performance. There are rumors that Samsung had to lower the clock speed significantly to keep the GPU running stably. And if reducing the clock speed still results in significant throttling, based on some feedback I've seen, I would expect performance to tank. At least judging by the previous generation of Snapdragon and Samsung SoCs, Samsung's node seems to be the root cause of significant thermal throttling. Not that there was no throttling when Qualcomm was using TSMC, but not to such an extent.
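The "power penalty" intuition above follows from the standard dynamic-power relation P ≈ C·V²·f: since voltage also has to rise with frequency, power grows much faster than linearly with clock speed. A back-of-the-envelope sketch (the voltage/frequency pairs below are purely illustrative, not measured silicon data):

```python
# Dynamic power scales as P ~ C * V^2 * f. Because V must rise with f,
# doubling the clock can more than triple the power.
def rel_power(f_ghz, v_at_f):
    """Power relative to a hypothetical 1.0 GHz / 0.70 V baseline."""
    base = 0.70 ** 2 * 1.0
    return (v_at_f ** 2 * f_ghz) / base

# Illustrative (made-up) operating points on a V/f curve:
for f, v in [(1.0, 0.70), (1.5, 0.80), (2.0, 0.95), (2.5, 1.15)]:
    print(f"{f:.1f} GHz: {rel_power(f, v):.2f}x power")
# 2.0 GHz at 0.95 V already costs ~3.7x the baseline power
```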

FYI, RDNA2 works great at moderate speeds and is very efficient, but of course that's at desktop (and theoretically laptop) power levels, not the ~5 W that a phone uses. I was clocking down my 6600 XT in Afterburner recently to see the effects, and got this in Unigine Valley:

2668 MHz core, 2160 MHz memory, +20% power: 145-155 W
2360 MHz core, 2160 MHz memory: 95-105 W
2040 MHz core, 2000 MHz memory: 70-80 W
1740 MHz core, 2000 MHz memory: 55-60 W
1350 MHz core, 2000 MHz memory: 40-45 W

FYI, games actually seem to load the GPU less than Valley, so the 2360/2160 setting at 95-105 W was actually drawing 85-90 W in SotTR, HZD, and Minecraft with shaders. BTW, I was only getting 10% lower FPS in SotTR for that 33% power reduction, showing that RDNA2 can be much more efficient at saner GPU speeds.

Not sure how much more you'd need to cut below 1350/2000 to get to something you could stick in a phone at a little more than a tenth of the power, though. Certainly far fewer cores and much less memory bandwidth, but you can see the power savings still scaling faster than linearly, even from 2040 to 1350.
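The perf-per-watt gain implied by those numbers is easy to put into figures (using only the 10% FPS loss and 33% power cut quoted above):

```python
# Perf/W from the quoted 6600XT downclock: ~10% lower FPS for ~33% less power.
fps_ratio = 0.90         # kept 90% of the frame rate
power_ratio = 1 - 0.33   # running at 67% of the power
efficiency_gain = fps_ratio / power_ratio
print(f"{(efficiency_gain - 1) * 100:.0f}% better perf/W")  # -> 34% better perf/W
```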
 