Monday, January 17th 2022

Samsung Introduces Game-Changing Exynos 2200 Processor with Xclipse GPU Powered by AMD RDNA 2 Architecture

Samsung Electronics Co., Ltd., a world leader in advanced semiconductor technology, today announced its new premium mobile processor, the Exynos 2200. The Exynos 2200 is a freshly designed mobile processor with a powerful Samsung Xclipse graphics processing unit (GPU) based on the AMD RDNA 2 architecture. With the most cutting-edge Arm-based CPU cores available on the market today and an upgraded neural processing unit (NPU), the Exynos 2200 will enable the ultimate mobile gaming experience, as well as enhance the overall experience in social media apps and photography.

"Built on the most advanced 4-nanometer (nm) EUV (extreme ultraviolet lithography) process, and combined with cutting-edge mobile, GPU, and NPU technology, Samsung has crafted the Exynos 2200 to provide the finest experience for smartphone users. With the Xclipse, our new mobile GPU built with RDNA 2 graphics technology from the industry leader AMD, the Exynos 2200 will redefine mobile gaming experience, aided by enhanced graphics and AI performance," said Yongin Park, President of System LSI Business at Samsung Electronics. "As well as bringing the best mobile experience to the users, Samsung will continue its efforts to lead the journey in logic chip innovation."
The Industry's First Hardware-accelerated Ray Tracing on Mobile for the Ultimate Gaming Experience
The Xclipse GPU is a one-of-a-kind hybrid graphics processor positioned between console and mobile graphics processors. Xclipse combines 'X', representing Exynos, with the word 'eclipse'. Like an eclipse, the Xclipse GPU will bring an end to the old era of mobile gaming and mark the start of an exciting new chapter.

With the high-performance AMD RDNA 2 architecture as its backbone, the Xclipse inherits advanced graphics features such as hardware-accelerated ray tracing (RT) and variable rate shading (VRS) that were previously available only on PCs, laptops and consoles.

Ray tracing is a revolutionary technology that closely simulates how light physically behaves in the real world. By calculating the movement and color characteristics of light rays as they bounce off surfaces, ray tracing produces realistic lighting effects for rendered scenes. To offer the most immersive graphics and user experiences even on mobile, Samsung has collaborated with AMD to realize the industry's first hardware-accelerated ray tracing on a mobile GPU.
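Conceptually, the heart of ray tracing is an intersection test between a ray and scene geometry, followed by shading at the hit point. The sketch below is a purely illustrative, minimal software version in Python (it bears no relation to Samsung's or AMD's actual hardware implementation): it traces one ray against a sphere and computes simple Lambertian brightness.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

def lambert_shade(hit_point, center, light_dir):
    """Diffuse term: surface normal dotted with the direction toward the light."""
    normal = [p - c for p, c in zip(hit_point, center)]
    length = math.sqrt(sum(n * n for n in normal))
    normal = [n / length for n in normal]
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# Trace a single ray from the origin straight down the z-axis at a unit sphere.
origin, direction = (0, 0, 0), (0, 0, 1)
center, radius = (0, 0, 5), 1.0
t = ray_sphere_hit(origin, direction, center, radius)
if t is not None:
    hit = [o + t * d for o, d in zip(origin, direction)]
    brightness = lambert_shade(hit, center, (0, 0, -1))
```

A real ray tracer fires millions of such rays per frame and bounces them recursively off surfaces, which is exactly the workload dedicated RT hardware exists to accelerate.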

Variable rate shading is a technique that optimizes the GPU workload by allowing developers to apply a lower shading rate in areas where overall quality will not be affected. This gives the GPU more room to work on the areas that matter most to gamers and improves the frame rate for smoother gameplay.
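The savings come from running one fragment-shader invocation per pixel block rather than per pixel. The sketch below counts shader invocations for a hypothetical tiled screen (the tile size and rate assignments are invented for illustration, not taken from any real VRS implementation):

```python
# Hypothetical VRS illustration: each screen tile is assigned a shading rate
# (1x1 = full quality, 2x2 / 4x4 = coarser), and the GPU runs one fragment
# shader invocation per rate-sized pixel block.

TILE = 16  # pixels per tile side (arbitrary for this sketch)

def shader_invocations(rate_map):
    """Count fragment-shader invocations for a 2D grid of tiles.

    rate_map holds a (w, h) shading rate per tile; e.g. (2, 2) means one
    shader invocation covers a 2x2 block of pixels in that tile.
    """
    total = 0
    for row in rate_map:
        for (w, h) in row:
            total += (TILE // w) * (TILE // h)
    return total

# A 2x2-tile screen: full rate where detail matters, coarse in the periphery.
uniform = [[(1, 1), (1, 1)], [(1, 1), (1, 1)]]   # everything at full quality
adaptive = [[(1, 1), (2, 2)], [(2, 2), (4, 4)]]  # lower rate at the edges

full = shader_invocations(uniform)   # 4 tiles x 256 invocations = 1024
vrs = shader_invocations(adaptive)   # 256 + 64 + 64 + 16 = 400
savings = 1 - vrs / full             # roughly 61% fewer shader invocations
```

Even this toy rate map cuts shading work by more than half, which is why VRS translates directly into higher frame rates when the coarse regions are chosen carefully.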

In addition, the Xclipse GPU comes with technologies such as the advanced multi-IP governor (AMIGO), which enhances overall performance and efficiency.

"AMD RDNA 2 graphics architecture extends power-efficient, advanced graphics solutions to PCs, laptops, consoles, automobiles and now to mobile phones. Samsung's Xclipse GPU is the first result of multiple planned generations of AMD RDNA graphics in Exynos SoCs," said David Wang, Senior Vice President of Radeon Technologies Group at AMD. "We can't wait for mobile phone customers to experience the great gaming experiences based on our technology collaboration."

Enhanced 5G Connectivity and Ironclad Security Features
The Exynos 2200 is one of the first processors on the market to integrate Arm's latest Armv9 CPU cores, which offer a substantial improvement over Armv8 in security and performance, two areas that are becoming critically important in today's mobile communications devices.

The octa-core CPU of the Exynos 2200 is designed in a tri-cluster structure made up of a single powerful Arm Cortex-X2 flagship core, three performance- and efficiency-balanced Cortex-A710 big cores, and four power-efficient Cortex-A510 little cores.

"The digital experiences of tomorrow require new levels of performance, security and efficiency," said Rene Haas, President of IP Products Group (IPG) at Arm. "As one of the first processors to incorporate the new Arm v9 CPU cores, Samsung's Exynos 2200 takes advantage of Arm's Total Compute strategy and key security features, like Memory Tagging Extension (MTE), to deliver the purpose-built compute and specialized processing needed to power future mobile experiences."

The Exynos 2200 offers more powerful on-device artificial intelligence (AI) with an upgraded NPU. The NPU's performance has doubled compared to its predecessor, allowing more calculations in parallel and enhancing AI performance. The NPU now offers much higher precision with FP16 (16-bit floating point) support, in addition to power-efficient INT8 (8-bit integer) and INT16.
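The trade-off between those number formats is easy to see with a round trip through 8-bit integers. The sketch below (plain Python, not Samsung's NPU pipeline; the weight values are invented) shows the standard symmetric INT8 quantization scheme: store an 8-bit integer plus one scale factor, and accept a bounded rounding error in exchange for smaller, cheaper arithmetic.

```python
# Illustrative INT8 quantization: values are stored as 8-bit integers with a
# shared scale factor, trading precision for power and memory bandwidth;
# FP16 keeps far more precision per value at higher cost.

def quantize_int8(values):
    """Map floats onto int8 range [-127, 127] with one symmetric scale."""
    scale = max(abs(v) for v in values) / 127.0
    q = [round(v / scale) for v in values]
    return q, scale

def dequantize(q, scale):
    return [x * scale for x in q]

weights = [0.02, -1.5, 0.73, 0.001]  # made-up example values
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(w - r) for w, r in zip(weights, restored))

# The round trip is lossy, but the worst-case error is bounded by scale/2.
assert max_err <= scale / 2
```

This is why INT8 suits power-efficient inference while FP16 support matters for workloads where that rounding error is unacceptable.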

Also, the Exynos 2200 integrates a fast 3GPP Release 16 5G modem supporting both sub-6 GHz and mmWave (millimeter wave) spectrum bands. With E-UTRAN New Radio Dual Connectivity (EN-DC), which utilizes both 4G LTE and 5G NR signals, the modem can reach speeds of up to 10 Gbps.
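Under EN-DC the device moves data over its LTE and NR carriers simultaneously, so the headline figure is roughly the sum of both legs. A back-of-the-envelope check (the per-carrier figures below are hypothetical, chosen only to show the arithmetic):

```python
# Hypothetical EN-DC aggregation: peak throughput is approximately the sum
# of the LTE and 5G NR component-carrier throughputs used at the same time.

lte_carriers_mbps = [400, 400]               # two LTE component carriers
nr_carriers_mbps = [2400, 2400, 2400, 2400]  # four mmWave NR carriers

peak_gbps = (sum(lte_carriers_mbps) + sum(nr_carriers_mbps)) / 1000
# 0.8 + 9.6 = 10.4 Gbps of raw aggregate, in line with an "up to 10 Gbps" claim
```

Real-world throughput is of course well below this kind of ideal sum, which is why such figures are always quoted as "up to".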

For security, the Exynos 2200 includes an integrated secure element (iSE) that stores private cryptographic keys and acts as a root of trust (RoT). In addition, inline encryption hardware for UFS storage and DRAM has been reinforced so that encrypted user data is shared only within the secure domain.

Providing Enhanced Visual Experience and Professional-level Quality Images
The Exynos 2200's image signal processor (ISP) architecture has been redesigned to support the latest image sensors at ultra-high resolutions of up to 200 megapixels (MP). At 30 frames per second (fps), the ISP supports up to 108 MP in single-camera mode and 64+36 MP in dual-camera mode. It can also connect up to seven individual image sensors and drive four concurrently for advanced multi-camera setups. For video recording, the ISP supports up to 4K HDR (or 8K) resolution.

Together with the NPU, the ISP utilizes an advanced content-aware AI camera for more refined and realistic results. When taking a photograph, the machine learning based AI camera recognizes multiple objects, the environment, and faces within the scene. It then applies optimal settings for color, white balance, exposure, dynamic range, and more to produce professional-level quality images.

With 8K resolution support, the Exynos 2200's advanced multi-format codec (MFC) makes videos truly come to life. It decodes video at up to 4K at 240 fps or 8K at 60 fps, and encodes at up to 4K at 120 fps or 8K at 30 fps. In addition, the MFC integrates a power-efficient AV1 decoder, enabling longer playback time. The advanced display solution features HDR10+, adding more dynamic range and depth to the picture, and offers refresh rates of up to 144 Hz for a smoother, more responsive experience when scrolling or playing games.
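Those decode and encode limits are consistent with a fixed pixel-rate budget in the codec hardware: 4K at 240 fps and 8K at 60 fps are exactly the same number of pixels per second. A quick sanity check of the quoted figures:

```python
# Sanity-checking the MFC limits quoted above: the 4K and 8K modes work out
# to identical pixel throughput, as a fixed-rate hardware codec would imply.

def pixel_rate(width, height, fps):
    """Pixels processed per second for a given resolution and frame rate."""
    return width * height * fps

dec_4k = pixel_rate(3840, 2160, 240)  # 4K decode at 240 fps
dec_8k = pixel_rate(7680, 4320, 60)   # 8K decode at 60 fps
assert dec_4k == dec_8k               # both ~1.99 billion pixels per second

enc_4k = pixel_rate(3840, 2160, 120)  # 4K encode at 120 fps
enc_8k = pixel_rate(7680, 4320, 30)   # 8K encode at 30 fps
assert enc_4k == enc_8k
assert dec_4k == 2 * enc_4k           # encode tops out at half the decode rate
```

In other words, the spec sheet is one throughput number expressed at two resolutions, with encoding budgeted at half the decoding rate.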

The Exynos 2200 is currently in mass production.

30 Comments on Samsung Introduces Game-Changing Exynos 2200 Processor with Xclipse GPU Powered by AMD RDNA 2 Architecture

#1
AusWolf
What's all this AI crap every tech company seems to be on about? How much AI do I need to read my emails?

I'm really curious about the graphics performance, though - not that it's any more needed in a phone than AI.
Posted on Reply
#2
wolf
Performance Enthusiast
I really want to see this pitted against the Snapdragon 8 gen1. I would like to pick up the S22 Ultra this year, and am very excited to mess around with mobile RDNA2, but not if it's going to be a shitshow like the S20 series and other models where the Exynos was outright worse compared to the SD. At least in the S21 series they traded blows and the SD victory was narrow, but I won't want a product that's worse in every way.
Posted on Reply
#3
TechLurker
AusWolfWhat's all this AI crap every tech company seems to be on about? How much AI do I need to read my emails?
I'd like something like a "dumb" Halo AI that could at least read through all my emails, organize them, and even summarize them on my behalf. Maybe even compose some responses to important ones that I can review before hitting send.

But that's still some years away and I have to still manage my e-mails like I manage my physical mail; manually sorting through them and responding where appropriate.
Posted on Reply
#4
Fouquin
btarunrThe Industry's First Hardware-accelerated Ray Tracing on Mobile for the Ultimate Gaming Experience
Blatant fucking arrogance. They're not the first, and they're six years late. PowerVR GR6500 and its contemporary models have been out long enough to be EOL. Exynos 2200 is hardly "the first" hardware RT mobile chip.
Posted on Reply
#5
watzupken
wolfI really want to see this pitted against the Snapdragon 8 gen1. I would like to pick up the S22 Ultra this year, and am very excited to mess around with mobile RDNA2, but not if it's going to be a shitshow like the S20 series and other models where the Exynos was outright worse compared to the SD. At least in the S21 series they traded blows and the SD victory was narrow, but I won't want a product that's worse in every way.
Previously, when Samsung was using their custom ARM cores, performance on the CPU side of things was very poor. I think they do not have the expertise to go into custom CPUs, at least that's been proven so far. Now that they are using stock ARM cores and pairing them with an alternative GPU (RDNA2), I think performance can't be that poor. The only thing that will limit its performance is the node itself. Samsung's fab is generally less efficient compared to TSMC, at least we have seen this many times in the past. The saving grace for Samsung is that Qualcomm was also using Samsung's fab last year, so that puts them somewhat on parity. I am unclear if Qualcomm has moved back to TSMC for the SD 8 Gen 1, but if they are still on Samsung's fab, then chances are that even the Qualcomm chip will not perform as well, just like the SD 888.
Posted on Reply
#6
WhoDecidedThat
watzupkenThe saving grace for Samsung is that Qualcomm is also using Samsung fab last year, so that puts them somewhat on parity.
Not for long. Qualcomm is switching to TSMC for their mid year 2022 refresh (SD 8 Gen 1+). Reason? Poor performance by Samsung's node.
Posted on Reply
#7
Ralfies
I said "Oh no..." out loud and face palmed when I read "Xclipse." Maybe I'm missing something really clever...

I'm curious to see how the new little cores perform.
Posted on Reply
#8
dyonoctis
AusWolfWhat's all this AI crap every tech company seems to be on about? How much AI do I need to read my emails?

I'm really curious about the graphics performance, though - not that it's any more needed in a phone than AI.
It's mainly used for photos/videos. Since there's only so much that you can do with a sensor so small, they have to use other means to enhance the photos. It's the software that really defines if a phone is good or bad at photography nowadays.

There are also a few A.I tasks that they want to do locally rather than using the cloud. But it's stuff that quietly happens in the background (classification of data, facial/text/object/voice recognition for live translation, better AR...). "A.I" stuff isn't new, but stuff like the neural engine allows the phone to execute those tasks with a lower power draw.

It's still somewhat funny to see a tech enthusiast considering that phones peaked as soon as reading emails became possible, and that progress should be halted :D Gen Z doesn't only use their phones as a calling/reading tool/web browser, it's a whole entertainment device. What you might find unnecessary is a treasure for others.
Posted on Reply
#9
Garrus
FouquinBlatant fucking arrogance. They're not the first, and they're six years late. PowerVR GR6500 and its contemporary models have been out long enough to be EOL. Exynos 2200 is hardly "the first" hardware RT mobile chip.
The RT was only in desktop development cards containing the GPU design that never went into mass production, right? So come on, it is the first working SHIPPING silicon within a mobile envelope. Calm the language, they are correct. None of PowerVR's licensees picked up the hardware, right?
PowerVR Ray Tracing project: our GR6500 GPU is now working at full speed on a 28nm chip integrated on a PCIe evaluation board.
Posted on Reply
#10
TheLostSwede
News Editor
AusWolfWhat's all this AI crap every tech company seems to be on about? How much AI do I need to read my emails?

I'm really curious about the graphics performance, though - not that it's any more needed in a phone than AI.
In phones it's mostly used for computational photography, but none of it is really AI, it's just additional, non-standard processing blocks that have been added, but everyone loves to call it AI. It's a bit like a next-generation DSP with more advanced processing for specific tasks.
wolfI really want to see this pitted against the Snapdragon 8 gen1. I would like to pick up the S22 Ultra this year, and am very excited to mess around with mobile RDNA2, but not if it's going to be a shitshow like the S20 series and other models where the Exynos was outright worse compared to the SD. At least in the S21 series they traded blows and the SD victory was narrow, but I won't want a product that's worse in every way.
Most people don't have a choice, since Samsung only ships the Qualcomm phones to the US, whereas everyone else gets the Exynos phones.
GarrusThe RT was only in desktop development cards containing the GPU design that never went in to mass production, right? So come on, it is the first working SHIPPING silicon within a mobile envelope. Calm the language, they are correct. None of POWERVR's licensees picked up the hardware right?
No it wasn't, that was Caustic Graphics, which seemingly never shipped to customers and was a company IMG bought. They have been shipping RT capable GPU designs for several generations now, but since most of their customers only seem to care about cost...
That said, it depends on what level of RT we're talking about as well, since until last year, IMG has only offered partial hardware RT designs, but we obviously don't know what Samsung and AMD have cooked up either.
Posted on Reply
#11
watzupken
blanarahulNot for long. Qualcomm is switching to TSMC for their mid year 2022 refresh (SD 8 Gen 1+). Reason? Poor performance by Samsung's node.
It is a strange decision to do a mid-cycle switch, because that means they need to work with both Samsung and TSMC to develop basically the same chip. Generally the mid-cycle refresh brings higher clock speeds and nothing new from a hardware perspective, based on what I recall. And in doing so, they are deterring people from getting the early chips based on Samsung's fab. I suspect it will also cost more to use TSMC's fab, so I'm not sure if it will result in higher pricing.
Posted on Reply
#12
bug
wolfI really want to see this pitted against the Snapdragon 8 gen1. I would like to pick up the S22 Ultra this year, and am very excited to mess around with mobile RDNA2, but not if it's going to be a shitshow like the S20 series and other models where the Exynos was outright worse compared to the SD. At least in the S21 series they traded blows and the SD victory was narrow, but I won't want a product that's worse in every way.
Rumor has it it won't fare well in the graphics department. The GPU is said to be slower than the latest Mali, for example. (Not sure where I read that, I'll link it here if I can find it again.)
Posted on Reply
#13
watzupken
My opinion is that RDNA2 tends to fare well at high clock speeds. I believe AMD's SoC graphics solution may be the same as what we see on desktop, where they try to use clock speed to give performance a boost. And if you look at recent chips produced by Samsung's fabs, they can't run at high clock speeds without burning through a tonne of power and producing a lot of heat. So by marrying RDNA2 with Samsung's node, the clock speed advantage is effectively nullified. Ironically, I think Samsung LSI may need to switch to TSMC if they want to produce something competitive, since I feel their own fab is letting them down.
Posted on Reply
#14
z1n0x
Xclipse. It rolls off the tongue so well and it's so catchy. /s

"Xclipse is the combination of 'X' that represents Exynos, and the word 'eclipse'. Like an eclipse," Big brain marketing right there. Someone is actually getting paid for shit like this.:laugh:
Posted on Reply
#15
AusWolf
dyonoctisIt's mainly used for photos/videos. Since there's only so much that you can do with a captor so small, they have to use another means to enhance the photos. It's the software that really define if a phone is good or bad at photography nowadays.
Sounds fancy, but I guess I can live without it. I'm fine with my nearly 10 year-old Sony compact camera. :)
dyonoctisThere's also a few a.i task that they want to do locally rather than using the cloud. But it's stuff that quietely happens in the background (classifications of data, facial/text/object/voice recognition, for live translation, better AR...)
Also background surveillance, telemetry and targeted ads, perhaps? AI sounds cool, but no one said that you need it.
dyonoctisIf still somewhat funny to see a tech enthusiast considering that phones have peaked as soon as reading emails became possible, and that progress should be halted :D gen Z doesn't only use their phones as a calling/reading tool/web broswer, it's a whole entertainment device. What you might find unnecessary is a treasure for other
I'm not gen Z, but gen Y in its strictest sense. As such, I fail to understand how anything equipped with a 5-7" screen, tiny speakers and no tactile control can serve as an entertainment device. If I want to watch a film, I've got a 55" UHD TV. If I want to play a game, I've got my PC with a 24" monitor, a keyboard and a mouse. If I'm on the go, I've got a laptop. Music = my Sony MP3 player.

I guess I'm getting old. Slowly but surely. :ohwell:
TheLostSwedeIn phones it's mostly used for computational photography, but none of it is really AI, it's just additional, non-standard processing blocks that have been added, bit everyone loves to call it AI. It's a bit like a next generation DSP with more advanced processing for specific tasks.
Ah I see. The only difference is that calling it DSP would actually make it sellable for people like me (even though I don't take photos with my phone very often). :D "AI" makes me cringe.
Posted on Reply
#16
TheLostSwede
News Editor
z1n0xXclipse. It roll off of the tongue so well and it's so catchy.:laugh: /s
Especially as most Mandarin speakers can't pronounce X...
watzupkenIt is a strange decision to do a mid cycle switch because that means they need to work with both Samsung and TSMC to develop basically the same chip. Generally the mid cycle refresh brings about higher clock speed, and nothing new from a hardware perspective based on what I recall. And in doing so, they are deterring people to get the early chips based on Samsung's fab. And I suspect it will also cost more when using TSMC's fab, so not sure if it will result in higher pricing.
I don't think it's a mid cycle switch, but rather for the Gen 2 chip which is set to launch later this year.
bugRumor has it it won't fare well in the graphic department. The GPU is said to be slower than the latest Mali, for example. (Not sure where I read that, I'll link it here ifI can find it again.)
There has been a week of rumours about it, but it's obviously not something anyone outside of Samsung really knows much about.
It's not that the GPU is slower as such from my understanding, but the fact that Samsung LSI has had some issues ramping the clock speeds without making a really hot chip.
Posted on Reply
#17
AusWolf
TheLostSwedeEspecially as most mandarin speakers can't pronounce X...
You mean Xspecially, right? :p
Posted on Reply
#18
TheLostSwede
News Editor
AusWolfAh I see. The only difference is that calling it DSP would actually make it sellable for people like me (even though I don't take photos with my phone very often). :D "AI" makes me cringe.
Well, we can call it a smarter way of processing certain data, since things like object recognition etc. are handled automagically by that bit of silicon, something that would otherwise rely on the main processor(s) in the SoC to do a lot of the heavy lifting, whereas the NPU, or whatever they want to call it, can do it faster and more power-efficiently.
It's not the kind of AI that we see in movies for sure, it's really just a terrible marketing term.
AusWolfYou mean Xspecially, right? :p
There's a reason why the Xbox hasn't done well in many Asian countries...
Posted on Reply
#19
z1n0x
No frequency and performance claims. Gee, I wonder why that is? Samsung SoC on Samsung process.

Samsung should let AMD design them the entire SoC. It might turn out better, actually.
Posted on Reply
#20
AusWolf
TheLostSwedeWell, we can call it a smarter way of processing certain data, since things like object recognition etc. is handled automagically by that bit of silicon, something that would otherwise rely on the main processor(s) in the SoC to do a lot of the heavy lifting, whereas the NPU or whatever they want to call it, can do it faster and more power efficient.
It's not the kind of AI that we see in movies for sure, it's really just a terrible marketing term.
That's good to know. I would hate having to teach my phone what not to do while I'm not using it. Oh wait... :wtf:
TheLostSwedeThere's a reason why the Xbox hasn't done well in many Asian countries...
Wow, I didn't know. Interesting. :) I guess shopkeepers can't quite understand what customers are asking for when trying to buy one, so they just give them a PS instead. :roll:

It's also kind of sad considering the never-before-seen amount of X-es in basically every product name.
Posted on Reply
#21
bug
TheLostSwedeThere has been a week of rumours about it, but it's obviously not something anyone outside of Samsung really knows much about.
It's not that the GPU is slower as such from my understanding, but the fact that Samsung LSI has had some issues ramping the clock speeds without making a really hot chip.
I'm not reading too much into it. Just adjusting expectations, that's all.
Posted on Reply
#22
TheLostSwede
News Editor
bugI'm not reading too much into it. Just adjusting expectations, that's all.
I mean, there's no proof to back any of that up, so we're just going to have to wait and see.
Posted on Reply
#23
Wirko
z1n0xXclipse. It roll off of the tongue so well and it's so catchy. /s
Remember, an X at the beginning of a word is supposed to be pronounced as 'Z' in English.
Posted on Reply
#24
dyonoctis
AusWolfSounds fancy, but I guess I can live without it. I'm fine with my nearly 10 year-old Sony compact camera. :)


Also background surveillance, telemetrics and targeted ads, perhaps? AI sounds cool, bot no one said that you need it.


I'm not gen Z, but gen Y in its strictest sense. As such, I fail to understand how anything equipped with a 5-7" screen, tiny speakers and no tactile control can serve as an entertainment device. If I want to watch a film, I've got a 55" UHD TV. If I want to play a game, I've got my PC with a 24" monitor, a keyboard and a mouse. If I'm on the go, I've got a laptop. Music = my Sony MP3 player.

I guess I'm getting old. Slowly but surely. :ohwell:
Not that I'm defending the current state of mobile gaming (90% of them are trash), but I have fond memories of the games I played on the Game Boy. The screens were under 3", but they still managed to entertain a whole generation :D. Since I've played the likes of Golden Sun, Zelda, Metroid... current mobile gaming isn't great for me, but the youngsters don't know what mobile gaming used to mean.

On-device A.I would actually mean less data going out. As LostSwede said, "A.I" is used as a marketing term because it's simpler for the masses to understand, but a lot of what's happening is simple machine learning and the automation of tasks that used to be a drag for the user. You can instantly find a picture that has a cat or a lake in it since the phone adds those tags by itself, you can copy text from a picture, call a number from a picture, translate a poster written in a foreign language... It's only called "A.I" because you used to have to do those tasks manually (and anything related to language/speech processing WILL be called A.I when it's marketed to a mainstream audience).

You can be mad at the marketing, but it's definitely a handy thing, and it's making its way into design tools as well. If you've ever done hair masking, you know that designers don't do it for fun :D.
Posted on Reply
#25
Wirko
AusWolfAh I see. The only difference is that calling it DSP would actually make it sellable for people like me (even though I don't take photos with my phone very often). :D "AI" makes me cringe.
But maybe it's not AI. Or maybe it is AI. While there is no universal definition, this one is good enough:
ProgrammerHumor/comments/a07d0u
Posted on Reply