Saturday, December 18th 2021

NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

NVIDIA has just announced three new mobile GPUs, although the question is how new any of them really are, as the model names suggest they're anything but. First up is the GeForce RTX 2050, which should be based on the Turing architecture. The other two GPUs are the GeForce MX550 and MX570, both presumably based on the Ampere architecture, although NVIDIA hasn't confirmed the specifics.

The GeForce RTX 2050 features 2048 CUDA cores, which is more than the mobile RTX 2060, but it has lower clock speeds and a vastly lower power draw of 30-45 Watts, depending on the notebook design and cooling. It's also limited to 4 GB of 64-bit GDDR6 memory, which puts it in GeForce MX territory when it comes to memory bandwidth, as NVIDIA quotes a maximum memory bandwidth of a mere 112 GB/s.
The two GeForce MX parts also support GDDR6 memory, but beyond that, NVIDIA hasn't released anything tangible in terms of specs. NVIDIA mentions that the GeForce MX550 will replace the MX450, while stating that both GPUs are intended to boost performance for video and photo editing on the move, with the MX570 also being suitable for gaming. All three GPUs are said to ship in laptops sometime this coming spring.

Update: According to a tweet by ComputerBase, which has confirmed the information with NVIDIA, the RTX 2050 and the MX570 are based on Ampere and the GA107 GPU, with the MX550 being based on Turing and the TU117 GPU. The MX570 is also said to support DLSS and "limited" RTX features, whatever that means.
Sources: Nvidia, Nvidia 2, @ComputerBase

33 Comments on NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

#26
Raven Rampkin
ARFNvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This is due to lower-quality fonts, and damaged, fake, dull colours and lower texture resolution in 3D.

I would stay away from their products, I don't buy anything Nvidia..
Happy to learn about your personal AMD preferences under an Nvidia article.
Posted on Reply
#27
trsttte
AusWolfWhat's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:
What's the point of a very weak discrete GPU in a laptop when iGPUs are getting almost as fast anyway?
TheLostSwedeUpdated the article accordingly. There weren't enough details available at the time I wrote it, and other sites hinted at what I originally wrote.

Also:
There were recent headlines of the upcoming AMD Rembrandt APUs scoring 2700 in Time Spy, so what's the point of having a slightly better dGPU as well?
Posted on Reply
#28
Ruru
S.T.A.R.S.
trsttteWhat's the point of a very weak discrete GPU in a laptop when iGPUs are getting almost as fast anyway?



There were recent headlines of the upcoming AMD Rembrandt APUs scoring 2700 in Time Spy, so what's the point of having a slightly better dGPU as well?
For Intel-based laptops? :confused:
Posted on Reply
#29
AusWolf
Chrispy_Oh, forget "preserving your settings", this affects anyone installing an Nvidia GPU on an HDMI display; if you are unlucky and have a "winning" combo that auto-detects as limited dynamic range, it will be wrong automatically from the beginning and it'll go wrong again after every driver update without fail. You have to be aware of it, know to change it, and know that it'll always need resetting after each and every update.

I guess this is why so many videos still exist in shitty limited RGB with grey blacks and poor contrast. It's even spawned several AMD vs Nvidia comparison videos where the AMD video clearly has higher contrast than the Nvidia video and the ignorant vlogger is just presenting the two as "differences" when they simply need to set their drivers to output at full dynamic range.

Yes, they're ignorant, but no this is not their fault. It's Nvidia's fault and their drivers have been a trainwreck for years. I kind of wish I was running an AMD GPU right now because the Nvidia control panel makes me want to rant about negligence and bad UI every time I have to actually use it (which is mercifully almost never).
Like I said: Don't be lazy, check your settings. ;) Don't assume that any driver or program gets installed with your preferred settings by default - nvidia, AMD or otherwise.

As for the UI, I have no issues with it. It has been the same for the last 15 years, which to me, is just convenient. I hate re-learning to navigate menus just because some random UI engineer decided that I should. AMD's new drivers look nice, but their menu structure is overcomplicated, imo. It's like nvidia's Control Panel and GeForce Experience under one app. At least nvidia gives you the option to install and use them separately.
Posted on Reply
#30
Ruru
S.T.A.R.S.
AusWolfAs for the UI, I have no issues with it. It has been the same for the last 15 years, which to me, is just convenient. I hate re-learning to navigate menus just because some random UI engineer decided that I should. AMD's new drivers look nice, but their menu structure is overcomplicated, imo. It's like nvidia's Control Panel and GeForce Experience under one app. At least nvidia gives you the option to install and use them separately.
I hadn't even realized that, or time simply goes way too fast. I remember being like "man, that new Nvidia panel sucks" when it came out back then.
Posted on Reply
#31
AusWolf
MaenadI hadn't even realized that, or time simply goes way too fast. I remember being like "man, that new Nvidia panel sucks" when it came out back then.
I remember it well; the Control Panel looked exactly the same with my 7800 GS on Windows XP as it does nowadays. Some people might consider that a negative, but I quite like its simplicity and function-orientedness. :)
Posted on Reply
#32
Ruru
S.T.A.R.S.
AusWolfI remember, Control Panel looked exactly the same with my 7800 GS on Windows XP as it does nowadays. Some people might consider it negative, but I quite like its simplicity and function-orientedness. :)
"Don't fix it if it ain't broken" :)
Posted on Reply
#33
trsttte
MaenadFor Intel-based laptops? :confused:
Tiger Lake is already scoring between 1500 and 1800 in Time Spy, and Alder Lake with DDR5/LPDDR5 and a more mature architecture should give it a nice boost (let's ballpark it at between 2000 and 2300). An extra chip to power and cool isn't worth it for ~25% to 40% more performance, imo.
Posted on Reply