• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

Nintendo Switch 2 to Feature NVIDIA Ampere GPU with DLSS

Soupsammich

New Member
Joined
Nov 18, 2021
Messages
29 (0.03/day)
That was a very impressive response... and missed the point entirely.
My guy, your sub-100 GFLOP GPU is not going to be actually rendering anything from the past decade-plus at 4k 60 fps. It can do video, it can stream games running on another system at 4k 60 fps, but there is no way it is actually rendering anything close to the OG Switch at 4k 60 fps, no sweat.

So, since actually running games at 4k 60 fps is what any normal person would assume you mean when you say it can run games '4k 60fps no sweat', especially when you responded to a post talking about render times/DLSS time ratios: why don't you lay out, in detail, your reasoning for what you meant when you said 'hell the pi 5 can do 4k 60 without breaking a sweat'? Since only you know what this coded message means, that way I can actually respond to that secret coded message.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.93/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
My guy, your sub-100 GFLOP GPU is not going to be actually rendering anything from the past decade-plus at 4k 60 fps. It can do video, it can stream games running on another system at 4k 60 fps, but there is no way it is actually rendering anything close to the OG Switch at 4k 60 fps, no sweat.

So, since actually running games at 4k 60 fps is what any normal person would assume you mean when you say it can run games '4k 60fps no sweat', especially when you responded to a post talking about render times/DLSS time ratios: why don't you lay out, in detail, your reasoning for what you meant when you said 'hell the pi 5 can do 4k 60 without breaking a sweat'? Since only you know what this coded message means, that way I can actually respond to that secret coded message.
I slapped him on ignore as I got tired of responding to his posts like this; it's nice to see someone else put the effort in to try and get logic involved.
 
Joined
Apr 9, 2013
Messages
289 (0.07/day)
Location
Chippenham, UK
System Name Hulk
Processor 7800X3D
Motherboard Asus ROG Strix X670E-F Gaming Wi-Fi
Cooling Custom water
Memory 32GB 3600 CL18
Video Card(s) 4090
Display(s) LG 42C2 + Gigabyte Aorus FI32U 32" 4k 120Hz IPS
Case Corsair 750D
Power Supply beQuiet Dark Power Pro 1200W
Mouse SteelSeries Rival 700
Keyboard Logitech G815 GL-Tactile
VR HMD Quest 2
That was a very impressive response... and missed the point entirely.
I've got to ask, what was your point? It's clear the Switch successor will not be playing its AAA games at 4k60 without some hefty upscaling. Is your point that you'll be able to play basic 2D games at 4k60 on it? I agree with that, but that's not exactly impressive.
 
Joined
Jul 5, 2013
Messages
27,923 (6.69/day)
'hell the pi 5 can do 4k 60 without breaking a sweat' Since only you know what this coded message means, so that I can actually respond to that secret coded message.
Secret coded message? Seriously? 4k isn't that difficult anymore. It was being done by midrange GPUs 6 years ago. Current ARM SoCs can do it well, within certain reasonable limits. NVIDIA's premium SoCs have a much easier time of it.

I slapped him on ignore as i got tired of responding to his posts like this, it's nice to see someone else put the effort in to try and get logic involved.
Good move. :thumbs_up:

I've got to ask, what was your point? It's clear the Switch successor will not be playing its AAA games at 4k60 without some hefty upscaling. Is your point that you'll be able to play basic 2D games at 4k60 on it? I agree with that, but that's not exactly impressive.
The NVidia SOC that Nintendo is rumored to be interested in & working on is capable of much better compute than the Tegra SOC currently in the Switch, which is over 8 years old at this time. My point is that 4k60 is not out of the scope of what is possible for the newer SOC, and far more than 2D.

You folks seem to think it takes a GeForce 4090 to do 4k well. It does not. 4k was being done, very well, on a GeForce GTX 1070 years ago. The current NVIDIA ARM SoCs are more than capable of it.
 

Soupsammich

New Member
Joined
Nov 18, 2021
Messages
29 (0.03/day)
Secret coded message? Seriously? 4k isn't that difficult anymore. It was being done by midrange GPUs 6 years ago. Current ARM SoCs can do it well, within certain reasonable limits. NVIDIA's premium SoCs have a much easier time of it.


Good move. :thumbs_up:

The NVidia SOC that Nintendo is rumored to be interested in & working on is capable of much better compute than the Tegra SOC currently in the Switch, which is over 8 years old at this time. My point is that 4k60 is not out of the scope of what is possible for the newer SOC, and far more than 2D.

You folks seem to think it takes a Geforce 4090 to do 4k well. It does not. 4k was being done, very well, on a Geforce GTX1070 years ago. The current NVidia ARM SOCs are more than capable of it.

You specifically stated a 76.8 GFLOP GPU, about 1/3rd of an Xbox 360 GPU, could do '4k gaming no sweat', and are now bringing up 6 TFLOP GPUs as if they were the exact same thing with the exact same range of capabilities. You have demonstrated you have no concept of what's going on.

You also, somehow, don't seem to understand that those 6/7 year old GPUs were running what are now 6-to-10-year-old games at the time they came out, and new games have actually kept coming out since then. Those GPUs will NOT run modern games at 4k like they ran decade-old games at 4k. The Switch 2 will be a modern system focusing on MODERN games. Nobody is going to be impressed by it running games from 2010-2016 at 4k. Nobody cares, that's expected; people would be shocked if it couldn't. Also, the 1070 didn't do 4k "very well", it typically did 30-40 fps, which is.... serviceable. It did NOT do 4k 60 as the standard, as you are trying to imply.

A GTX 1070 is a 6 TFLOP machine (12 tera-ops total, counting 6 TFLOPS FP32 and 6 TOPS INT32) that took 150 watts to power. That's twice the CUDA compute of the GA10F in the T239 in the Switch 2 at its likely 1 GHz clock speed and 10 watt or less power draw.

Obviously, the T239 is waaaaaaaaaaayyy more powerful per watt than the GTX 1070, but it doesn't get to use anywhere NEAR the same amount of power draw as the GTX 1070's 150 watts, because it's a Tegra, a mobile design. So in the end it's only half as powerful in CUDA compute, yet it can run for hours on a battery, in a small contained enclosure, without overheating, which the 1070 could never dream of.

So then, if the statements "the Switch 2 only has half the CUDA compute of the GTX 1070" AND "the Switch 2 will be running (not all, but it will run) more advanced games than the GTX 1070 could run at 4k, because it has waaaaaaayyy more compute" are both true (they are)....

What's the proprietary, NVIDIA-only hardware feature that satisfies both statements, that did not exist back then for GP architecture like the 1070?

If you figure out what that is, you're back at my very first post you responded to, which lays out that exact math in detail, and you'll have figured out why you should never have posted anything you did to begin with.
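For anyone who wants to check the "half the CUDA compute" claim, it's a one-liner. A back-of-envelope sketch (the 1070's 1920 cores and 1.683 GHz boost are its reference specs; the 1536-core GA10F at 1 GHz is this thread's speculation, not a confirmed spec):

```python
# Peak FP32 throughput: CUDA cores x 2 FLOPs/clock (one FMA) x clock (GHz).
def tflops_fp32(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000

gtx_1070 = tflops_fp32(1920, 1.683)  # GTX 1070 at reference boost, ~6.5 TFLOPS
t239 = tflops_fp32(1536, 1.0)        # GA10F at the speculated 1 GHz, ~3.1 TFLOPS

print(f"GTX 1070: {gtx_1070:.2f} TFLOPS")
print(f"T239:     {t239:.2f} TFLOPS ({t239 / gtx_1070:.0%} of the 1070)")
```

The ratio lands just under 50%, which is where the "half the CUDA compute" figure in this thread comes from.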
 
Joined
Apr 9, 2013
Messages
289 (0.07/day)
Current ARM SOCs can do it well, within certain reasonable limits.
Could you point me to some evidence for this? Given the iPhone 15 Pro Max can't run Genshin Impact at a stable 60fps at its native 2796x1290, I don't see what hope an older Nvidia SoC has at 4k.
 
Joined
Jul 5, 2013
Messages
27,923 (6.69/day)
You specifically stated a 76.8 GFLOP GPU, about 1/3rd of an Xbox 360 GPU, could do '4k gaming no sweat', and are now bringing up 6 TFLOP GPUs as if they were the exact same thing with the exact same range of capabilities. You have demonstrated you have no concept of what's going on.

You also, somehow, don't seem to understand that those 6/7 year old GPUs were running what are now 6-to-10-year-old games at the time they came out, and new games have actually kept coming out since then. Those GPUs will NOT run modern games at 4k like they ran decade-old games at 4k. The Switch 2 will be a modern system focusing on MODERN games. Nobody is going to be impressed by it running games from 2010-2016 at 4k. Nobody cares, that's expected; people would be shocked if it couldn't. Also, the 1070 didn't do 4k "very well", it typically did 30-40 fps, which is.... serviceable. It did NOT do 4k 60 as the standard, as you are trying to imply.

A GTX 1070 is a 6 TFLOP machine (12 tera-ops total, counting 6 TFLOPS FP32 and 6 TOPS INT32) that took 150 watts to power. That's twice the CUDA compute of the GA10F in the T239 in the Switch 2 at its likely 1 GHz clock speed and 10 watt or less power draw.

Obviously, the T239 is waaaaaaaaaaayyy more powerful per watt than the GTX 1070, but it doesn't get to use anywhere NEAR the same amount of power draw as the GTX 1070's 150 watts, because it's a Tegra, a mobile design. So in the end it's only half as powerful in CUDA compute, yet it can run for hours on a battery, in a small contained enclosure, without overheating, which the 1070 could never dream of.

So then, if the statements "the Switch 2 only has half the CUDA compute of the GTX 1070" AND "the Switch 2 will be running (not all, but it will run) more advanced games than the GTX 1070 could run at 4k, because it has waaaaaaayyy more compute" are both true (they are)....

What's the proprietary, NVIDIA-only hardware feature that satisfies both statements, that did not exist back then for GP architecture like the 1070?

If you figure out what that is, you're back at my very first post you responded to, which lays out that exact math in detail, and you'll have figured out why you should never have posted anything you did to begin with.
You're really trying hard, eh? More power to you. I say it's possible, you say it's not. Next year we'll find out who's right, who's wrong, or whether it's somewhere in between.

Could you point me to some evidence for this? Given the iPhone 15 Pro Max can't run Genshin Impact at a stable 60fps at its native 2796x1290, I don't see what hope an older Nvidia SoC has at 4k.
Seriously? You're going to mention one game? Don't bother with any more; I don't care to debate this further. As was said above: I say it's possible, you say it's not. Next year we'll find out who's right, who's wrong, or whether it's somewhere in between.

Ok, we're done.
 

Soupsammich

New Member
Joined
Nov 18, 2021
Messages
29 (0.03/day)
We actually have some pretty interesting benches to extrapolate from, to see what kind of options switch 2 devs will be looking at.

We have NVIDIA-provided 'ballpark' benchmarks from NVIDIA's DLSS development guides for DLSS 2 and 3.1.blah.blah

And now we have a 4090 benchmark for dlss 3.5 through Nsight.

This shows a pretty interesting improvement scale in the DLSS fixed execution time: about 20-30% improvement jumps for each bench. (Jumps generally get bigger the higher the input res, some close to 40%.)

I'm going to bring this up now, because I don't think the person this setup was intended for would ever actually make it to the point where I could use it, but that 0.2 ms execution time for DLSS 3.5 wasn't actually quite for 4k; it was for 1440 ultrawide, i.e. 3440x1440. That's more pixels than standard 1440p, but about 40% fewer pixels than 4k.

So it's an easy math problem to solve with acceptable accuracy: 0.2 ms × 1.6 ≈ 0.32 ms. Which falls neatly in line with the DLSS improvement pattern, with the previous bench (3.1.blah) being 0.51 ms.

Comparing benches for Ampere's 3090, we have 4k on 2.blah at 1.028 ms, and on 3.1.blah at 0.79 ms.

The pattern fits well enough.

So now we can start nailing down some associations.

The 4090 has 16384 CUDA cores, the 3090 has 10496, a difference of 1.561x. The 4090's boost clock (used for benches) is 2.52 GHz, the 3090's is 1.695, a difference of 1.4867x, for a total difference of about 2.321x. (These are all rounded results.)

Let's ballpark test:
CUDA 3090 TFLOPS = 35.58; × 2.321 = 82.58.
CUDA 4090 TFLOPS = 82.58. Dang good match.

Tensor sparse FP16:
3090 = 284.64; × 2.321 = 660.65.
4090 = 660.64; ÷ 2.321 = 284.64.

Damn good. Peak theoretical is a good match.
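That ballpark test is easy to reproduce; here's the same arithmetic as a quick script (all figures are the rounded published specs quoted in this post):

```python
cores_3090, clk_3090 = 10496, 1.695   # RTX 3090 cores, boost GHz
cores_4090, clk_4090 = 16384, 2.52    # RTX 4090 cores, boost GHz

# Combined scale factor: core-count ratio x clock ratio
scale = (cores_4090 / cores_3090) * (clk_4090 / clk_3090)

fp32_3090 = cores_3090 * 2 * clk_3090 / 1000   # peak FP32 TFLOPS
fp32_4090_est = fp32_3090 * scale              # should land near the 4090 spec

print(f"scale: {scale:.3f}")                     # ~2.321
print(f"3090 FP32: {fp32_3090:.2f} TFLOPS")      # ~35.58
print(f"4090 est.: {fp32_4090_est:.2f} TFLOPS")  # ~82.6, vs 82.58 spec
```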

Testing on DLSS performance results, with known 3090 and 4090 benches provided by NVIDIA for DLSS 3.1.blah, let's see how peak theoretical stacks up to real-world performance.

DLSS 3.1.blah:
4090 4k = 0.51 ms
3090 4k = 0.79 ms

Only a 1.55x difference instead of that 2.321x difference. Huh. Peak theoretical over-promising at the high end, as expected. Had peak theoretical been real-world accurate, the 3090 would take 1.18 ms to execute DLSS: a 1.49x offset in favor of the weaker hardware.

8k: 4090 = 1.97 ms vs 3090 = 2.98 ms. Only a 1.51x real-world performance difference. Peak theoretical over-promising again. I should probably do this for each resolution and take an average, but whatever.

The ratio between peak theoretical and real use reminds me of the difference between the "double FP32" Ampere (and on) CUDA peak theoretical and real-world application with typical 30% integer use, leaving only 1.7x FP32. This bodes well for less powerful hardware; it shows it won't get hit as hard in real-world applications as the peak theoretical would make it seem. Although, unlike the CUDA core example, this is likely DLSS on tensor cores being less beneficial the faster the CUDA cores are.
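If anyone wants to redo the "over-promise" ratio at both resolutions in one go (the DLSS times are the NVIDIA-provided figures quoted above; the 2.321x scale is the cores-times-clock factor worked out earlier):

```python
theoretical_scale = 2.321                 # 4090 vs 3090, cores x clock
dlss_ms = {
    "4k": {"4090": 0.51, "3090": 0.79},   # DLSS 3.1.x execution time, ms
    "8k": {"4090": 1.97, "3090": 2.98},
}

over_promise = {}
for res, t in dlss_ms.items():
    measured = t["3090"] / t["4090"]                  # real-world gap
    over_promise[res] = theoretical_scale / measured  # how much theory overshoots
    print(f"{res}: measured {measured:.2f}x, over-promise {over_promise[res]:.2f}x")
```

Both resolutions land the over-promise in the same ~1.5x neighborhood, which is why a single correction factor is used below.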

Both of these are using barely a fraction of the render time to perform DLSS, leaving well over 90% of the render time to the CUDA cores. The tensor cores are barely utilized. This is great news for the question of "can a system with only 48 tensor cores clocked at 1 GHz use DLSS well".

So now let's apply this to the t239 ga10f gpu in the switch 2 at 1 GHz:

GA10F = 1,536 CUDA cores and 48 tensor cores, for 3.072 TFLOPS dense FP32 and 24.576 TFLOPS sparse FP16 @ 1 GHz.
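That 24.576 TFLOPS figure checks out against the 3090's published tensor spec if you back out the per-core, per-clock throughput (the 3090's 328 tensor cores and 284.64 sparse-FP16 TFLOPS are its public specs; the 48-core GA10F config at 1 GHz is the leak/speculation):

```python
# Sparse FP16 throughput per tensor core per clock, implied by the 3090's spec:
# 284.64 TFLOPS -> GFLOPS, divided by (tensor cores x boost GHz) = FLOPs/clock.
flops_per_tc_clk = 284.64e3 / (328 * 1.695)

# Apply the same per-core rate to the GA10F: 48 tensor cores at 1 GHz.
ga10f_sparse_fp16 = 48 * flops_per_tc_clk * 1.0 / 1000   # TFLOPS

print(f"per tensor core: {flops_per_tc_clk:.0f} FLOPs/clock")
print(f"GA10F sparse FP16: {ga10f_sparse_fp16:.3f} TFLOPS")
```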

Grabbing our metrics from before:
3090 = 10496 cuda cores @ 1.695 Ghz
4090 = 16384 cuda cores @ 2.52 Ghz
GA10F=1536 Cuda cores @ 1 Ghz. (No boost for you, extra conservative speculation)

3090 has 6.83 X the cuda cores, and 1.695x the clock, for a total of 11.577X the ga10f.

4090 has 10.65 X the cuda cores and 2.52 X the clock, for 26.838 X the GA10F.

Let's test:

3090: 35.58 TFLOPS ÷ 11.577 = 3.073.
GA10F: 3.072 TFLOPS × 11.577 = 35.56. Good match.

4090: 82.58 TFLOPS ÷ 26.838 = 3.077. Another good match for peak theoretical.

So let's extrapolate dlss 4k execution times.

3090 was 0.79 ms: 0.79 ms × 11.577 = 9.146 ms.

4090 was 0.51 ms: 0.51 ms × 26.838 = 13.687 ms. But what about that 1.49x "over-promise"? 13.687 ÷ 1.49 = 9.18 ms. Yeah, now that's a good match.

Does the 3090 also have an 'over promise ratio'? Maybe, but in order to get the data I need to confirm that, I would need to bench the ga10f dlss execution time, and if I could do that, I wouldn't be doing this.

So here we are. We are looking at roughly 9.2 ms to DLSS from 1080p to 4k for DLSS 3.1.blah.

We have a real-world bench for a 4090 on DLSS 3.5 of 0.2 ms (actually between 0.1 and 0.2) at 1440p ultrawide, and extrapolated to 4k from that with 0.32 ms. So 0.32 × 26.838 = 8.588 ms. Speculate the "over-promise" ratio? 8.588 ÷ 1.49 = 5.76 ms?
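Putting the whole extrapolation in one place (the GA10F config, the 1 GHz clock, and the 1.49x correction are all this post's assumptions, not measured numbers):

```python
ga10f_cores, ga10f_clk = 1536, 1.0   # speculated Switch 2 GPU config

def scale_vs_ga10f(cores: int, clk: float) -> float:
    """Cores x clock advantage of a desktop GPU over the GA10F."""
    return (cores / ga10f_cores) * (clk / ga10f_clk)

s3090 = scale_vs_ga10f(10496, 1.695)   # ~11.58
s4090 = scale_vs_ga10f(16384, 2.52)    # ~26.88

# Scale each measured DLSS time up by the compute gap (worst case, no correction),
# then apply the ~1.49x "over-promise" correction where the post does.
print(f"from 3090, DLSS 3.1 (0.79 ms):        {0.79 * s3090:.2f} ms")
print(f"from 4090, DLSS 3.1 (0.51 ms, /1.49): {0.51 * s4090 / 1.49:.2f} ms")
print(f"from 4090, DLSS 3.5 (0.32 ms, /1.49): {0.32 * s4090 / 1.49:.2f} ms")
```

The first two land within a few hundredths of a millisecond of each other, which is the cross-check the post leans on.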

Either one of those is very promising. Most of DLSS can be run concurrently with the CUDA cores, with few dependencies.

1440p also has roughly half the execution-time cost and an even smaller input resolution, meaning even more graphical bells and whistles can be stuffed into the frame, with better performance on top.

Because the tensor cores have been so underutilized on high-end PC hardware, it looks like the Switch 2 will actually have plenty of frame time to render as high-fidelity a frame as it can with 1,536 CUDA cores, and plenty of frame time to perform DLSS with the tensor cores.

I'm feeling 1440p is going to be the majority sweet spot for Switch 2 docked, which is a great upscale fit for 4k TVs.
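As a frame-budget sanity check (this treats the extrapolated DLSS cost as fully serial, so the real picture is better to the extent the tensor/CUDA work overlaps as described above):

```python
budget_ms = {}
for fps in (30, 60):
    frame_ms = 1000 / fps
    for dlss_ms in (5.76, 9.2):   # the two extrapolated DLSS-to-4k costs above
        left = frame_ms - dlss_ms
        budget_ms[(fps, dlss_ms)] = left
        print(f"{fps} fps ({frame_ms:.2f} ms frame): "
              f"{left:.2f} ms left after {dlss_ms} ms of DLSS")
```

At 60 fps with the 9.2 ms figure only ~7.5 ms of serial render time remains, which is exactly why the concurrency point and the 1440p-output sweet spot matter.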

Switch 'ports'/upgrades (as in not just playing through BC, but the bigger, non-Switch version of the game), and Wii U ports (will they do that again? Xenoblade X at last?), and PS4 ports will likely be 4k, and even 4k 60.

And there will of course be those standout 4k 60fps modern games, likely with a smart hybrid forward renderer.

Thinking about the other systems on the market like the ps5, brings up an interesting picture as well.

At 10 TFLOPS, the PS5 is a bit more than 3x as powerful as the Switch 2. But with DLSS Performance, the Switch 2 only needs to render at 1/4th the resolution. Something interesting to look forward to, for sure.

There will be no 4k 120fps games with anything resembling a modern game. Anyone who says anything like that should be laughed at.
 
Joined
Jul 5, 2013
Messages
27,923 (6.69/day)
There will be no 4k 120fps games with anything resembling a modern game. Anyone who says anything like that should be laughed at.
Funny you should say that. Because a few years ago, there were a few nitwits that "laughed at" my suggestion that the 3000 series of RTX cards could do RayTracing@4k... Yeah. Go ahead, laugh. I'm not the one who's going to look like a fool.
 

Soupsammich

New Member
Joined
Nov 18, 2021
Messages
29 (0.03/day)
Funny you should say that. Because a few years ago, there were a few nitwits that "laughed at" my suggestion that the 3000 series of RTX cards could do RayTracing@4k... Yeah. Go ahead, laugh. I'm not the one who's going to look like a fool.
Wow.

You guessed that gpu's with hardware dedicated to hardware accelerated ray tracing would be good at ray tracing.

That's so good..... for you.
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
It's not me that counts. It's the marketing departments, tech press and youtubers that will promote RT as "life-like visuals" and anything competing as inferior. This could explain why people were buying RTX 3050 instead of RX 6600 cards. Nvidia is using RT Overdrive in CP 2077 as a tech demo and, at the same time, as a way to make even the lowest Nvidia card look superior to anything from competitors. You can be certain that many will swallow that marketing and ignore everything without an Nvidia logo on it. And yes, they will buy an RTX 3060 or an RTX 4060 thinking that they can enable Overdrive and enjoy smooth graphics.

As for upscaling, others hate those and would never think of them as important. But even there we get videos with 10x magnification and a 10x slow-down in frames, where it is proven that the superiority of DLSS is so great that using anything else will ruin gaming. And the fun part: the same thing is said now of DLSS 3.5 with ray reconstruction against plain DLSS 3. What was good yesterday (DLSS 3.0) is subpar today because of the newer version.

It's not me that overplays anything. It's the marketing departments, the press, youtubers and individuals who overplay those, and many consumers will believe that, yeah, DLSS specifically and ray tracing are everything today.
You're talking about the PC market now in a Switch 2 thread.

DLSS enables higher and more stable frame rates plus better-quality upscaling; that's far more meaningful for console players and will likely be utilised on every Nintendo-developed game on the Switch 2.

Also, just because the odd game markets it, it's not representative of the market as a whole. Sony and Microsoft barely mention it, and I expect Nintendo won't mention it at all, as their consoles are not about cutting-edge visuals. The main talked-about features at the launch of the PS5 and Xbox Series were their support for VRR and higher frame rates. For the Switch it was the portability, the detachable controllers, its dock etc.

You can see the compromises that had to be made on the Zelda Switch games. DLSS will open things up for Nintendo.

For reference, I got the 3080 instead of AMD primarily for two reasons, SGSSAA (drivers) and cost. This was in the middle of the price gouging; Nvidia was selling FE cards in the UK at MSRP and AMD wasn't. If AMD adds SGSSAA my next GPU is an AMD; I take extra VRAM over dedicated RT any day of the week.
 
Joined
Apr 9, 2013
Messages
289 (0.07/day)
Seriously? You're going to mention one game? Don't bother with any more. I don't care to debate this further. As was said above; I say it's possible, you say it's not. Next year we'll find out who's right, wrong or whether it's somewhere inbetween.
I'm sorry if I came off as inflammatory. I assumed you had some counter-examples in mind that give some evidence for your stance, and I was hoping throwing out one example that goes against your point would get you to give me a counter-example.
 
Joined
Sep 6, 2013
Messages
3,358 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
You're talking about the PC market now in a Switch 2 thread.
That's NOT an argument. That's just an attempt to downplay one. I don't have much hope when someone starts a reply by trying to downplay someone's points; it shows that he doesn't have a reply. But you did post more, so let's see it.
DLSS enables higher and more stable frame rates plus better-quality upscaling; that's far more meaningful for console players and will likely be utilised on every Nintendo-developed game on the Switch 2.
I first talked about marketing and now you throw at me the marketing I talked about. Nice.
Also, just because the odd game markets it, it's not representative of the market as a whole. Sony and Microsoft barely mention it, and I expect Nintendo won't mention it at all, as their consoles are not about cutting-edge visuals. The main talked-about features at the launch of the PS5 and Xbox Series were their support for VRR and higher frame rates. For the Switch it was the portability, the detachable controllers, its dock etc.
Nintendo was never about visuals, but there are game ports where the graphics are bad enough to actually hurt. Portability and such were the key points of the Switch, but buyers of that device might try to play something more demanding and just get disappointed.
You can see the compromises that had to be made on the Zelda Switch games. DLSS will open things up for Nintendo.
If they need to make compromises on Zelda, imagine more demanding titles.
For reference, I got the 3080 instead of AMD primarily for two reasons, SGSSAA (drivers) and cost. This was in the middle of the price gouging; Nvidia was selling FE cards in the UK at MSRP and AMD wasn't. If AMD adds SGSSAA my next GPU is an AMD; I take extra VRAM over dedicated RT any day of the week.
I guess now you represent the "whole market". And (JMO) no, you will again choose Nvidia no matter what AMD does.

4k isn't that difficult anymore. It was being done by midrange GPUs 6 years ago.
Looking at the RTX 3050, which is probably twice that Switch iGPU, with extra memory bandwidth, no limitations on how that bandwidth is split between the GPU and the CPU parts of the SoC and, more importantly, none of the power limitations the Switch will have: it will still be difficult, even with the advantage of games tailored to the Switch 2's specific hardware and capabilities. Graphics will be low-to-mid settings at best, and DLSS Performance will probably be used at 4K. Of course some games will have simpler graphics and lower needs by design; those will play nicely.
 
Joined
Apr 9, 2013
Messages
289 (0.07/day)
Looking at RTX 3050 that is probably twice that Switch iGPU
But that's a 130W card, and we're expecting a 10W, maybe 15W TDP for the Switch 2. So if it's the same architecture but has 1/10th of the power available, how can it get anywhere near half the performance of the 3050?

From a 3050 TPU review I can see the 3050 gets an average of 30-35 fps at 4k highest settings in a wide range of AAA games from the last 5 years or so. Assuming you can double the fps with lower settings and then get another 50% from DLSS, that puts the 3050 in the 90 fps range. So even with the ambitious assumption that the Switch 2 will have half the performance of the 3050, we're still only talking the 45 fps range at 4k with lowered settings.

Are any of my assumptions or rough maths majorly flawed anywhere here?
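Spelling out that chain of assumptions (every number here is a ballpark from this post: the TPU 3050 average, the 2x settings gain, the 1.5x DLSS gain, and the "half a 3050" guess; none are measured Switch 2 figures):

```python
fps_3050_4k_max = 32                     # ~30-35 fps avg, 4k highest settings (TPU review)
fps_low_settings = fps_3050_4k_max * 2   # assumption: 2x from lowered settings
fps_with_dlss = fps_low_settings * 1.5   # assumption: +50% from DLSS
fps_switch2 = fps_with_dlss / 2          # assumption: Switch 2 = half a 3050

print(f"3050, low settings + DLSS: ~{fps_with_dlss:.0f} fps")
print(f"hypothetical Switch 2:     ~{fps_switch2:.0f} fps")
```

The chain lands at roughly 48 fps, which is where the "45 fps range" estimate above comes from.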
 

Soupsammich

New Member
Joined
Nov 18, 2021
Messages
29 (0.03/day)
Looking at the RTX 3050, which is probably twice that Switch iGPU, with extra memory bandwidth, no limitations on how that bandwidth is split between the GPU and the CPU parts of the SoC and, more importantly, none of the power limitations the Switch will have: it will still be difficult, even with the advantage of games tailored to the Switch 2's specific hardware and capabilities. Graphics will be low-to-mid settings at best, and DLSS Performance will probably be used at 4K. Of course some games will have simpler graphics and lower needs by design; those will play nicely.

I'll put them up for you.... I think you were a bit generous.

GA10F:
1 12-SM Ampere RTX GPC: 6 TPCs,
6 Polymorph engines,
1,536 CUDA cores,
48 tensor cores,
48 TMUs,
16 ROPs,
12 ray-tracing cores.
(Lapsu$ ransom attack, NVN2 graphics API dump)

2 LPDDR5 RAM blocks, dual channel, 128-bit bus, standard 102 GB/s.
(Commercial shipping manifest product description for the T239, being shipped to NVIDIA India)
Capacity not confirmed; rumored to be 12 GB for the retail unit and 16 GB for the dev SDK.

Likely downclocked to 1-1.3 GHz docked, 500-650 MHz portable. (Lapsu$ ransom attack NVN2 clock-speed profiles. There was one higher, but that's likely just a stress test.)

CPU: ARM, 1 cluster, 8 cores. (NVIDIA employee updating public Linux for Tegra.)
Almost certainly an A78C. Probably 2 or 2.5 GHz.

Are any of my assumptions or rough maths majorly flawed anywhere here?

That's about the long and short of it. There will be the standard console boons: closed environment, no PC overhead. Horizon is a very lean OS.

Nintendo will almost assuredly strip PTX from the shader compiles again, keeping just the cubins, for a nice little performance boost.

And NVN, or I guess NVN2 now, is a very low-level API, which gains another nice little performance boost.
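For reference, the leaked config and clock profiles translate into these peak-FP32 ranges (the clocks are the rumoured profiles above and entirely unconfirmed):

```python
CUDA_CORES = 1536   # GA10F, per the leaked config above

def tflops(clock_ghz: float) -> float:
    # Peak FP32: cores x 2 FLOPs/clock (FMA) x clock
    return CUDA_CORES * 2 * clock_ghz / 1000

ranges = {"docked": (1.0, 1.3), "portable": (0.5, 0.65)}   # GHz, rumoured
peak = {mode: tuple(tflops(c) for c in clocks) for mode, clocks in ranges.items()}

for mode, (lo, hi) in peak.items():
    print(f"{mode}: {lo:.2f}-{hi:.2f} TFLOPS FP32")
```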
 
Joined
Sep 6, 2013
Messages
3,358 (0.82/day)
But that's a 130W card, we're expecting a 10W, maybe 15W TDP for the Switch 2? So if it's the same architecture but has 1/10th of the power available, how can that get anywhere near half the performance of the 3050?

From a 3050 TPU review I can see the 3050 gets an average of 30-35fps at 4k highest settings in a wide range of AAA games from the last 5 years or so. Assuming you can double the fps with lower settings & then get another 50% with DLSS that puts the 3050 in the 90fps range. So even with the ambitious assumption that the Switch 2 will have half the performance of the 3050, we're still only talking in the 45fps range at 4k with lowered settings.

Are any of my assumptions or rough maths majorly flawed anywhere here?
MY GOD WHY DO YOU CUT 5 WORDS FROM MY POST AND REPOST THE REST OF IT IN DIFFERENT WORDS AS YOUR REPLY?
Are you high right now, or is this just some strange psychological need? Just asking. Politely.

I think you were a bit generous.
Being generous makes the argument more apparent.
That being said, I should have used the mobile RTX 3050 Max-Q, but I thought of that model later and chose not to change my post.
 
Joined
Apr 9, 2013
Messages
289 (0.07/day)
Location
Chippenham, UK
System Name Hulk
Processor 7800X3D
Motherboard Asus ROG Strix X670E-F Gaming Wi-Fi
Cooling Custom water
Memory 32GB 3600 CL18
Video Card(s) 4090
Display(s) LG 42C2 + Gigabyte Aorus FI32U 32" 4k 120Hz IPS
Case Corsair 750D
Power Supply beQuiet Dark Power Pro 1200W
Mouse SteelSeries Rival 700
Keyboard Logitech G815 GL-Tactile
VR HMD Quest 2
MY GOD WHY DO YOU CUT 5 WORDS FROM MY POST AND REPOST THE REST OF IT IN DIFFERENT WORDS AS YOUR REPLY?
Are you high right now, or is this just some strange psychological need? Just asking. Politely.
Sorry, in my haste to reply whilst trying to get my daughter out the house I originally missed that you did mention the power limitation so my post was framed all wrong & the questions to you weren't needed. We do generally agree on the rough performance, I was just using some numbers to explore it & came to the same conclusion as you.

Good thinking on the 3050 Max-Q, that's actually the closest thing we've got for comparison in the 30-series, I wonder if there's some 4k benchmarks of that.
 
Joined
Sep 6, 2013
Messages
3,358 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Sorry, in my haste to reply whilst trying to get my daughter out the house I originally missed that you did mention the power limitation so my post was framed all wrong & the questions to you weren't needed. We do generally agree on the rough performance, I was just using some numbers to explore it & came to the same conclusion as you.

Good thinking on the 3050 Max-Q, that's actually the closest thing we've got for comparison in the 30-series, I wonder if there's some 4k benchmarks of that.
Don't post when you have REAL work to do, especially family work. The other person's posts weren't going anywhere.

NVIDIA GeForce RTX 3050 Laptop GPU - Benchmarks and Specs - NotebookCheck.net Tech
 
Joined
Apr 9, 2013
Messages
289 (0.07/day)
Location
Chippenham, UK
Don't post when you have REAL work to do, especially family work. The other person's posts weren't going anywhere.

NVIDIA GeForce RTX 3050 Laptop GPU - Benchmarks and Specs - NotebookCheck.net Tech
We were having a standoff at the time anyway so there wasn't much to do except try & keep my temper with her! 4 going on 13 at the moment :ohwell:

Yeah there's a limited number of 4k benchmarks with it though, which makes sense tbh because it is clearly not up to 4k AAA gaming! They're kind of all over the place so it's hard to make any conclusions really. I guess just using 3DMark results & comparing to GPUs with lots of 4k testing is probably good enough though.
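The idea floated above, using 3DMark results to scale from a GPU with plenty of 4K testing, can be sketched like this. The scores and fps below are made-up placeholders purely for illustration, not real benchmark numbers:

```python
# Hypothetical sketch: estimate a GPU's 4K frame rate by scaling a
# well-benchmarked reference GPU's 4K results by the ratio of their
# 3DMark scores. All numbers here are placeholders.

def estimate_fps(target_3dmark: float, ref_3dmark: float, ref_fps: float) -> float:
    """Scale the reference GPU's 4K fps by the 3DMark score ratio."""
    return ref_fps * (target_3dmark / ref_3dmark)

# e.g. if a laptop 3050 scored 60% of a desktop 3050 in 3DMark,
# and the desktop card averaged 32 fps at 4K:
print(round(estimate_fps(4800, 8000, 32.0), 1))  # 19.2
```

Linear scaling from a synthetic score is crude (memory bandwidth and power limits don't scale the same way as shader throughput), but it's probably good enough for a ballpark figure.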
 
Joined
Sep 6, 2013
Messages
3,358 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Yeah there's a limited number of 4k benchmarks with it though, which makes sense tbh because it is clearly not up to 4k AAA gaming! They're kind of all over the place so it's hard to make any conclusions really. I guess just using 3DMark results & comparing to GPUs with lots of 4k testing is probably good enough though.
Different laptops, different power limits, different cooling solutions, no wonder the numbers are all over the place. Also, most laptops with an RTX 3050 come with a 1080p screen, and probably no reviewer will connect an external 4K monitor to test this mobile GPU at that resolution. So, few 4K results.
 

Soupsammich

Really wish NVIDIA wouldn't hide Nsight's pro RT core profiling behind forced NDAs.

Have a strong hunch it's the denoisers that have been kicking the butts of smaller RTX cards, not the actual ray-tracing calculations.
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
That's NOT an argument. That's just an attempt to downplay an argument. I don't have much hope when someone starts a reply by trying to downplay someone's points. It shows they don't have a reply. But you did post more, so let's look at those points.

I first talked about marketing and now you throw at me the marketing I talked about. Nice.

Nintendo was never about visuals, but there are game ports where the graphics are so bad that it actually matters. Portability and such were the key points of the Switch, but buyers of that device might try to play something more demanding and just get disappointed.

If they need to make compromises on Zelda, imagine more demanding titles.

I guess now you represent the "whole market". And (JMO) no, you will again choose NVIDIA no matter what AMD does.


Looking at the RTX 3050, which is probably twice the Switch 2 iGPU, with extra memory bandwidth, no limitations on how that bandwidth is split between the GPU and CPU parts of the SoC and, more importantly, none of the power limitations the Switch will have, it will be difficult, even with the advantage of games tailored to the Switch 2's specific hardware and capabilities. Graphics will be low to mid settings at best, and DLSS Performance will probably be used at 4K. Of course, some games will have simpler graphics and lower needs by design; those will play nicely.
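For context on what "DLSS Performance at 4K" means in practice: DLSS renders internally at a fraction of the output resolution per axis and upscales. The per-axis scale factors below (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333) are the commonly cited approximations for DLSS 2:

```python
# Approximate per-axis render-scale factors for DLSS 2 quality modes.
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5, "ultra_performance": 1 / 3}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution DLSS upscales from for a given output."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Performance renders internally at 1080p:
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

So a Switch 2 doing "4K with DLSS Performance" would really be pushing roughly a 1080p render load plus the upscaling cost, which is what makes the idea plausible at all on a low-power SoC.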
I think you've just explained very well that it's nothing to do with marketing; instead, you have a very strong opinion driven by your liking of RT. Telling me that the PC enthusiast market is totally relevant to the Nintendo console market was the clear indicator of that. You're just clutching at straws at this point, and of course there's no debate to be had here now; you have your opinion and we'll leave it at that.
 
Joined
Sep 6, 2013
Messages
3,358 (0.82/day)
Location
Athens, Greece
I think you've just explained very well that it's nothing to do with marketing; instead, you have a very strong opinion driven by your liking of RT. Telling me that the PC enthusiast market is totally relevant to the Nintendo console market was the clear indicator of that. You're just clutching at straws at this point, and of course there's no debate to be had here now; you have your opinion and we'll leave it at that.
Guess what. It's not "my opinion against reality", it's just "my opinion" and "your opinion". Don't flatter yourself by saying that's me "clutching at straws" when you had no arguments at all. The only thing you presented was your opinion of why I was wrong, with no argument to back it up.
And yes everything is marketing.
 