Thursday, July 25th 2024

NVIDIA Plans RTX 3050 A with Ada Lovelace AD106 Silicon

NVIDIA may be working on a new RTX 3050 A laptop GPU using an AD106 (Ada Lovelace) die, moving away from the Ampere chips used in other RTX 30-series GPUs. While not officially announced, the GPU is included in NVIDIA's latest driver release and the PCI ID database as "GeForce RTX 3050 A Laptop GPU". The choice of the AD106 die is notable, as it has more transistors and CUDA cores than the GA107 in current RTX 3050 laptops and the AD107 in RTX 4050 laptops. The AD106, used in the desktop RTX 4060 Ti and the RTX 4070 laptop GPU, packs 22.9 billion transistors and 4,608 CUDA cores, compared to GA107's 8.7 billion transistors and 2,560 CUDA cores, and AD107's 18.9 billion transistors and 3,072 CUDA cores.
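For readers who want to check the listing themselves, a device like this surfaces as a new NVIDIA (vendor ID 10de) entry in the PCI ID repository. Below is a minimal sketch that searches a local copy of the pci.ids database by name; the "pci.ids" path and the "3050 A" query string are illustrative, since the device ID itself hasn't been published in this article.

```python
# Minimal sketch: search a local copy of the PCI ID repository (pci.ids,
# as shipped with pciutils or downloadable from https://pci-ids.ucw.cz/)
# for NVIDIA device entries whose marketing name matches a substring.
# The "pci.ids" path and "3050 A" query are illustrative assumptions.

NVIDIA_VENDOR_ID = "10de"  # NVIDIA's PCI vendor ID

def find_nvidia_devices(pci_ids_path: str, name_fragment: str) -> list[tuple[str, str]]:
    """Return (device_id, device_name) pairs for NVIDIA entries matching name_fragment."""
    matches = []
    in_nvidia = False
    with open(pci_ids_path, encoding="utf-8") as f:
        for line in f:
            if line.startswith("#") or not line.strip():
                continue  # skip comments and blank lines
            if not line.startswith("\t"):
                # vendor line at column 0, e.g. "10de  NVIDIA Corporation"
                in_nvidia = line.startswith(NVIDIA_VENDOR_ID)
            elif in_nvidia and not line.startswith("\t\t"):
                # device line, one tab deep: "<device_id>  <device name>"
                device_id, _, device_name = line.strip().partition("  ")
                if name_fragment.lower() in device_name.lower():
                    matches.append((device_id, device_name))
    return matches

if __name__ == "__main__":
    for device_id, name in find_nvidia_devices("pci.ids", "3050 A"):
        print(f"{NVIDIA_VENDOR_ID}:{device_id}  {name}")
```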

While this could improve performance, NVIDIA will likely use a cut-down version of the AD106 chip for the RTX 3050 A. The exact specifications and features, such as support for DLSS 3, remain unknown. The use of TSMC's 4N node in AD106, instead of the Samsung 8N node used for Ampere, could improve power efficiency and battery life. How the RTX 3050 A performs against existing RTX 3050 and RTX 4050 laptops remains to be seen; however, it will likely perform similarly to existing Ampere-based parts, as NVIDIA tends to use similar names for comparable performance levels. It's unclear if NVIDIA will bring this GPU to market, but adding new SKUs late in a product's lifespan isn't unprecedented.
Source: Videocardz

16 Comments on NVIDIA Plans RTX 3050 A with Ada Lovelace AD106 Silicon

#1
ARF
Doesn't make any sense. Why not 4550, or 4050S, or 5040, 5050, 5030, 5010 or something?
Posted on Reply
#2
sLowEnd
ARF: Doesn't make any sense. Why not 4550, or 4050S, or 5040, 5050, 5030, 5010 or something?
It's probably meant to succeed the RTX 2050, which is Ampere-based despite the 2000-series branding
Posted on Reply
#3
ARF
sLowEnd: It's probably meant to succeed the RTX 2050, which is Ampere-based despite the 2000-series branding
Probably psychological: a special branch of marketing theory that leads them to think some users will stick with something they consider "legendary" or "iconic".
Upgrading an old product with new features in order to push its shelf life forward a bit...
Posted on Reply
#4
Nomad76
News Editor
ARF: Probably psychological: a special branch of marketing theory that leads them to think some users will stick with something they consider "legendary" or "iconic".
Upgrading an old product with new features in order to push its shelf life forward a bit...
I doubt these are "new" AD106 chips; more likely there are enough leftovers in NVIDIA's stocks.
Posted on Reply
#5
Lew Zealand
sLowEnd: It's probably meant to succeed the RTX 2050, which is Ampere-based despite the 2000-series branding
And like the 2050 when compared to the regular 3050, this 3050 A will probably have half the bus width and a similar core count to the 4050, giving significantly lower performance.

Hence giving it the last-gen name so as not to sully the "good name" of current-gen products with its very low performance.
ARF: Probably psychological: a special branch of marketing theory that leads them to think some users will stick with something they consider "legendary" or "iconic".
Upgrading an old product with new features in order to push its shelf life forward a bit...
Just the opposite; see above.
Posted on Reply
#6
Sabotaged_Enigma
sLowEnd: It's probably meant to succeed the RTX 2050, which is Ampere-based despite the 2000-series branding
The problem is that the 2050 is just a cut-down version of the 3050 using Ampere, which also makes for confusing naming
Posted on Reply
#7
mikesg
A 3050 with GDDR7 and the clocks turned down (40 W) would be similar to a desktop 3050... hard to beat for value.
Posted on Reply
#9
mrnagant
Seems really silly when there is no 4050. At least it won't be as bad as AMD, which used to be really bad with this during the TeraScale and GCN days; the Radeon 200 series had four different generations.

Maybe it'll be an OEM-only card, where maybe the 3050 is still selling pretty well. I'll be curious whether the drivers allow it to support DLSS 3/3.5.
Posted on Reply
#10
Minus Infinity
Wow, 4 GB, 1,768 CUDA cores and a 64-bit bus.
AMD cheers loudly as the RX 6400 is knocked off its throne as worst video card ever.
Posted on Reply
#11
kapone32
What people forget is that the 3050 replaced the 3060, which gave you 2 GB more VRAM. They even had the nerve to charge more for it too: the 3050 and then the 4050 went for about $200 more than 3060-based laptops from 2021.
Posted on Reply
#12
sLowEnd
Minus Infinity: Wow, 4 GB, 1,768 CUDA cores and a 64-bit bus.
AMD cheers loudly as the RX 6400 is knocked off its throne as worst video card ever.
The RX 6400 never had that throne. It doesn't need a power connector and at least has low-profile options.

The GTX 1630... exists
Posted on Reply
#13
mikesg
There's no way it has GDDR6 RAM...
Posted on Reply
#14
Lew Zealand
mikesg: There's no way it has GDDR6 RAM...
100% chance it has GDDR6; even the craptacular GTX 1630 has GDDR6. Go ahead and name the last dGPU NVIDIA released without GDDR.
Posted on Reply
#15
mikesg
I meant it should have GDDR7.

It suits the 64-bit bus better, as it's the bandwidth equivalent of 128-bit GDDR6.
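For context, peak memory bandwidth is (bus width ÷ 8) × per-pin data rate. A quick sketch below, assuming typical per-pin speeds of 16 Gbps for GDDR6 and 32 Gbps for GDDR7 (no memory speed for a 3050 A has been confirmed), shows why 64-bit GDDR7 lands at the same bandwidth as 128-bit GDDR6:

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8 bits per byte)
# multiplied by the per-pin data rate in Gbps. The per-pin rates below
# are typical figures for each memory type, not confirmed 3050 A specs.

def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * pin_rate_gbps

print(bandwidth_gb_s(64, 32.0))   # 64-bit GDDR7  -> 256.0 GB/s
print(bandwidth_gb_s(128, 16.0))  # 128-bit GDDR6 -> 256.0 GB/s
print(bandwidth_gb_s(64, 16.0))   # 64-bit GDDR6  -> 128.0 GB/s
```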
Posted on Reply
#16
chrcoluk
I feel that with 4 GB it should be a 3030 card, as 4 GB nowadays is the same as having 2 GB back when the 1030 was released.
Posted on Reply