Wednesday, September 13th 2023
Nintendo Switch 2 to Feature NVIDIA Ampere GPU with DLSS
The rumors of Nintendo's next-generation Switch handheld gaming console have been piling up ever since competition in the handheld console market intensified. Since the release of the original Switch, Valve has released the Steam Deck, ASUS has launched the ROG Ally, and others are also exploring the market. Meanwhile, the next-generation Nintendo Switch 2 is drawing closer, as we now have information about the chipset that will power the device. Thanks to Kepler_L2 on Twitter/X, we have the codenames of the upcoming processors. The first-generation Switch came with NVIDIA's Tegra X1 SoC built on a 20 nm node. Later, NVIDIA supplied Nintendo with a Tegra X1+ SoC made on a 16 nm node; there were no performance increases recorded, just improved power efficiency. Both of them used four Cortex-A57 and four Cortex-A53 cores with a GM20B Maxwell GPU.
For the Nintendo Switch 2, NVIDIA is said to be supplying a customized variant of the NVIDIA Jetson Orin SoC, originally designed for automotive and embedded applications. The reference Orin SoC carries the codename T234, while this alleged adaptation is codenamed T239, most likely a version optimized for power efficiency. The reference Orin design is a considerable uplift compared to the Tegra X1, as it boasts 12 Cortex-A78AE cores and LPDDR5 memory, along with the Ampere GPU microarchitecture. Built on Samsung's 8 nm node, the improved efficiency would likely yield better battery life and position the second-generation Switch well in the now much broader handheld gaming console market. Including the Ampere architecture would also bring technologies like DLSS, which would benefit the low-power SoC.
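To illustrate why an upscaler matters for a power-constrained chip, here is a rough back-of-the-envelope calculation. The internal render resolutions assume DLSS's commonly cited Quality and Performance scale factors, and the output resolutions are illustrative only; Nintendo has not confirmed any targets for the Switch 2.

```python
# Rough arithmetic: how many pixels DLSS could save on a handheld SoC.
# Assumes the commonly cited DLSS scale factors (Quality = 2/3 per axis,
# Performance = 1/2 per axis); target resolutions are illustrative only.

TARGETS = {"720p handheld": (1280, 720), "1080p docked": (1920, 1080)}
MODES = {"Quality": 2 / 3, "Performance": 1 / 2}

for name, (w, h) in TARGETS.items():
    native = w * h
    for mode, scale in MODES.items():
        rw, rh = int(w * scale), int(h * scale)
        saving = 1 - (rw * rh) / native
        print(f"{name}, DLSS {mode}: render {rw}x{rh} "
              f"({saving:.0%} fewer shaded pixels than native)")
```

Even in Quality mode the GPU shades roughly half the pixels of a native frame, which is where the battery-life and clock-headroom benefit for a low-power SoC would come from.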
Sources:
@Kepler_L2, GitHub, via Tom's Hardware
118 Comments on Nintendo Switch 2 to Feature NVIDIA Ampere GPU with DLSS
I wish Nintendo would finally make a powerful non-mobile console again on the performance level of the PS5 / Xbox Series X. Keep the Switch and offer something for everyone.
And yes, Nintendo is the awesome one, just like Fred Rogers.
The Switch is a nice thing, I use one myself, and it is fun, especially with titles like Zelda and of course the Mario games.
Been waiting for the Switch 2, since I am getting one.
A bit disappointed to see Nintendo's new console starting off behind the times (again) when the XSX and PS5 launched with bleeding-edge tech.
And going with ARM on future consoles is unlikely for compatibility reasons. It would completely break backward compatibility, make porting new games to PC harder, and render older games unusable. Also, thus far there is no proper x86-to-ARM emulator. Raster is still the king whether you like it or not. Even games that incorporate some level of RT are still hybrids of raster and RT. Also, the Switch 2 is going Ampere, not Ada. Get your facts straight. They improve RT every generation like Nvidia. Nvidia only has an advantage because they started with RT one generation before AMD.
Also, how is Radeon related to consoles? AMD's semi-custom group uses Radeon IP in the SoCs, but the consoles are not Radeon branded. I doubt it will be forgotten, as both AMD and Intel have integrated it, unlike PhysX, which remained Nvidia exclusive.
But companies have shown that it's possible to produce comparable software-based systems like UE's Lumen that look nearly the same.
Also, I would argue that modern games already have good fake lighting systems. Only reflections are very hard to fake with raster.
Shadows, not so much, as raster is already very good there, and adding RT shadows is mostly useless.
Global Illumination could be another advantage for RT.
Personally, I want games to start focusing on physics and AI instead of graphics. The best, most photorealistic graphics fall apart as soon as a dumb AI that walks like a robot enters the scene and opens their mouth with bad lip syncing, in an environment where you throw a grenade in a coffee cup and the coffee cup gets a small black stain that disappears after a few seconds - no damage done. You can't even take down trees like in Crysis.
It's marketing-driven, definitely not user-driven like, say, DLSS, which gets modded in everywhere.
There IS tooling to RT-ify all the things, mind.
But I already knew this the moment it got announced. We're going to brute-force something that we used to do much more efficiently, for mediocre to very low IQ gains, in a time when resources get scarce and the climate is banging on the door? And when we see that Moore's Law is becoming ever harder to keep up with, with the end of silicon on the horizon? Good luck with that. Even if it were going to be a success, reality will kill it sooner or later. But frankly, it's a completely pointless exercise, as there are still games coming out that combine baked and dynamic lighting just fine to get near-equal results. They have the economic advantage, because they can make better-looking games run on a far broader range of hardware. The supposed cost or time-to-market advantage for developers is thus far an unproven marketing blurb as well, and even so, dev cost is really never a show stopper; it's all about sales. To me this is common sense, honestly. Economic laws never lie.
I honestly hope AMD stays the course wrt RT. Integration at a low cost/die space, sure. Integration at the cost of raw perf? Pretty shitty. Ah right, they're not just clocking GPUs as high as possible because that's the most profitable bottom line anymore? Interesting, the world changed overnight!
Both Ada and RDNA3 clock about equally, and neither is efficient at the top of the V/F curve - this has been the case for virtually every GPU generation past 32 nm. It doesn't say a thing about how good the node is; it mostly tells us how greedy chip makers are.
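A toy back-of-the-envelope sketch of why that happens, using the usual dynamic-power relation P ≈ C·V²·f. The V/F points below are made up for illustration, not measurements of Ada, RDNA3, or any real GPU:

```python
# Toy model: perf/W falls toward the top of a voltage/frequency curve.
# Dynamic power scales roughly with C * V^2 * f; the V/F points below are
# hypothetical, chosen only to show the trend.

CAPACITANCE = 1.0  # arbitrary units, held constant for the comparison

vf_points = [        # (clock in GHz, core voltage in V) - hypothetical
    (1.8, 0.80),
    (2.3, 0.95),
    (2.8, 1.10),     # "top of the curve"
]

for freq, volt in vf_points:
    power = CAPACITANCE * volt**2 * freq   # relative dynamic power
    perf_per_watt = freq / power           # performance assumed ~ clock
    print(f"{freq:.1f} GHz @ {volt:.2f} V: power {power:.2f}x, "
          f"perf/W {perf_per_watt:.2f}")
```

Since performance scales roughly with clock but power scales with clock times voltage squared, perf/W is essentially 1/V², so every extra volt needed to hold the top clocks eats efficiency regardless of how good the node is.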
I'm not sure how you devised this story. Samsung's 8 nm is plagued by issues, it's not efficient, and Ampere suffers from transient spiking. 320 W for an x80-class card is not unheard of? Okay. Also, I'm not entirely sure why AMD is in that comparison; RDNA2 isn't even remotely the subject here.
Sure, they can clock lower for Nintendo, but that still doesn't make it a good node, especially not for a low-power device. It's yesteryear's tech, so it's mostly just cheap.
Modern AAA releases that are too heavy for the Switch are streamed from the cloud. Nintendo doesn't "hinder advancements in game engines". If anything, back when Nintendo made "PS/Xbox clones", they were still losing support from third-party devs. The GameCube was more powerful than the PS2 but didn't have nearly as much third-party developer support. Nintendo choosing to make a niche for themselves saved them from sharing the same fate as Sega.
I've heard many things, but Nintendo living in its own niche somehow ruining the PC gaming master race is a first. There are reasons to dislike Nintendo, but that's not one of them.
To clarify my reasoning further: in the case of a mainstream engine like Unity, those engines also need to accommodate use cases that are not related to gaming. I studied VR/AR in college, and we had professionals from the field talking with us; there's a need for an engine that can scale down, because you can't just assume that the user will be using a state-of-the-art device.
Also, are you comparing ray tracing to PhysX? As if RT were a proprietary Nvidia technology? It is not, and the "comparable" is just writing their own RT code, or simply using a mix of pre-baked and dynamic lighting.
Besides, real-time RT has great benefits for developers, not just nice graphics for players; it cuts down on a lot of the development time that would be dedicated to faking the lighting or baking the lighting into a map, since they can set it up and forget about it. So in the future, when RT is less costly, this will be great! But in the present day, when devs do this, we get Immortals of Aveum. We clearly need more time for this technology to mature.
When did we lose sight of pre-baked environments for static games? Look at CS2: it runs great, looks great, and we don't have to sacrifice so much.
Your complaints fall completely flat, as you don't understand the market. If the hardware will keep you from buying a Nintendo, then you were not going to buy a Nintendo anyway, because they are marketed for the games and not the hardware.
And don't even get me started with your anti-consumer nonsense; you're mad that Nintendo won't give away its IP when that is the only reason they are still in business.
*There is so much wrong with this comment I have to make an edit.
Sure, the Nintendo Switch is selling like hot cakes and all that, but it's still behind the best seller, the PS2, so it has some ways to go (20 million to go, but that's easy for them), and the PS4's lifetime sales are 117 million consoles. So if we just combine the PS4 and PS5 alone, the Switch is not outselling like that, and people own more Xbox One, Xbox One S, Xbox One X, PS4, and PS5 consoles than Nintendo Switches, so stop that noise.
PlayStation has been making great games just like Nintendo, ESPECIALLY first-party games, AND at 1080p60 or higher, with the luxury of having third-party games without the need for them to be released as cloud variants. There are a lot of great games out there, and the Switch can't play the majority of them unless they are indie games or older games from the PS3 and Xbox 360 era, which would still be impressive... 3 to 4 years ago. And Microsoft has Game Pass, which is enough said compared to Nintendo's alternative.
Also, I know it is the fashion among tech nerds to hate Apple (I don't follow dumb trends like that), but to sit here and praise Nintendo as the good guy between Apple and Nintendo is absolutely ludicrous, especially since Nintendo HATES its customers, and that's a fact within the gaming industry. They sent one of their fans to jail for 3 years and then slapped a 10 million dollar fine on top of that. People who steal iPhones don't get that harsh a punishment, but since it's Nintendo, it's OK and the fan "got what he deserved". It's also the same company that does not listen to its fans when it comes to beloved games, and the same company that tries to stop people from emulating its games after it has decided to stop selling them, refuses to keep the old consoles up to date, and makes its fans go to third-party companies to fix their consoles. Don't do Fred Rogers like that. He's way more respectable than Nintendon't.
In the future full RT should cost like 5~10%, assuming nVidia and AMD care about gaming by then.
However my point stands with regards to anti-consumer practices: en.wikipedia.org/wiki/Nintendo#Intellectual_property_protection
It's also not limited to software but extends to hardware, with Nintendo not acknowledging Joy-Con drift initially. Only after massive backlash and the threat of class-action lawsuits did they cave. Also, the whole ROM business would not even exist if it were not for Nintendo's own draconian rules.
They call it protecting their IP. I call it failing to capitalize on their IP by utilizing and licensing it.
Most companies license their IP and are very much still in business. So it's a complete baloney argument that Nintendo would somehow cease to exist if they allowed their IP on other platforms. Yeah, totally sounds like a company I want to give my money to.
AMD is bringing FSR3, and they're going to basically steamroll anything other than a Nintendo-backed console - and that mostly because of the popularity of first-party Nintendo games. This next gen should be really interesting, because AMD is probably on par with anything Nvidia brings to the table and much more efficient with Zen cores overall. You'll also have to remember that a lot of mobile games could cross over, and RDNA is also featuring in Exynos, though that also holds true for the Switch.
Not in this decade, that's for sure!
If you are reading something someplace that references nVidia TensorRT, the "RT" there stands for runtime, not ray tracing.
NVIDIA TensorRT is a runtime library and optimizer for deep learning inference that delivers lower latency and higher throughput across NVIDIA GPU products. TensorRT enables customers to parse a trained model and maximize the throughput by quantizing models to INT8, optimizing use of the GPU memory and bandwidth by fusing nodes in a kernel, and selecting the best data layers and algorithms based on the target GPU.
www.nvidia.com/content/dam/en-zz/Solutions/gtcf21/jetson-orin/nvidia-jetson-agx-orin-technical-brief.pdf
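For context, here is a minimal sketch of how TensorRT is typically used from Python to do what that description says: parse a trained model, then build an optimized inference engine. The ONNX file name is a placeholder, and a real INT8 build would also need a calibrator (or pre-set dynamic ranges), which is omitted here for brevity.

```python
# Minimal sketch: build a TensorRT engine from an ONNX model.
# "model.onnx" is a placeholder; a production INT8 build also needs an
# INT8 calibrator (config.int8_calibrator), omitted for brevity.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError("failed to parse ONNX model")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)   # allow reduced-precision kernels
config.set_flag(trt.BuilderFlag.INT8)   # enable INT8 quantization

# TensorRT fuses layers and selects per-layer kernels for the target GPU here.
engine_bytes = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:
    f.write(engine_bytes)
```

The saved engine is specific to the GPU it was built on, which is exactly why NVIDIA positions TensorRT for deployment targets like Jetson Orin rather than as a graphics feature.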