Sunday, February 13th 2022

NVIDIA GA107-based GeForce RTX 3050 is Real, Comes with 11% Lower TDP, Same Specs

When NVIDIA launched the GeForce RTX 3050 "Ampere" based on the "GA106" silicon, with specifications that could just as well be met by the smaller "GA107," we knew the company could eventually start making RTX 3050 boards with the smaller chip, and now it has. Igor's Lab reports that RTX 3050 cards based on GA107 come with a typical board power of 115 W, about 11 percent lower than that of the GA106-based cards (130 W).
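
The reduction follows directly from the two quoted board-power figures; here is a minimal sketch of the arithmetic (variable names are ours, not from Igor's Lab report):

```python
# Sanity check of the quoted ~11% board-power reduction.
ga106_tbp_w = 130  # typical board power of the launch GA106-based RTX 3050, in watts
ga107_tbp_w = 115  # typical board power of the GA107-based variant, in watts

reduction = (ga106_tbp_w - ga107_tbp_w) / ga106_tbp_w
print(f"Board power reduction: {reduction:.1%}")  # prints: Board power reduction: 11.5%
```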

There's no difference in specifications between the two cards. Both feature 2,560 CUDA cores across 20 streaming multiprocessors, 80 Tensor cores, 20 RT cores, and a 128-bit wide GDDR6 memory interface holding 8 GB of memory that ticks at a 14 Gbps data rate (224 GB/s bandwidth). The GA106 and GA107 ASICs share a common fiberglass substrate and are hence pin-compatible, which is convenient for board partners. Since GA107 is merely a smaller die on the same package, any cooling solution designed for the launch-day RTX 3050 should work perfectly fine on cards based on it.
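
The quoted 224 GB/s figure likewise follows from the bus width and data rate; a small sketch of that calculation (variable names are ours):

```python
# Peak memory bandwidth = bus width (bits) * data rate (Gbps) / 8 bits per byte.
bus_width_bits = 128  # 128-bit GDDR6 memory interface
data_rate_gbps = 14   # 14 Gbps per pin

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # prints: Peak memory bandwidth: 224 GB/s
```
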
Sources: Igor's Lab, VideoCardz

13 Comments on NVIDIA GA107-based GeForce RTX 3050 is Real, Comes with 11% Lower TDP, Same Specs

#1
Ruru
S.T.A.R.S.
Probably won't be reflected in the pricing -.-
#2
cvaldes
MaenadFIN: Probably won't be reflected in the pricing -.-
Actually the crypto miners will favor this variant over the original model since it will reduce electricity costs. Shaving 11 percent off the TDP might reduce the break-even point by 3-5 weeks (depending on local electricity prices).

I expect a lower-TDP 3050 to command more on the street than the higher-TDP model.
#3
defaultluser
cvaldes: Actually the crypto miners will favor this variant over the original model since it will reduce electricity costs. Shaving 11 percent off the TDP might reduce the break-even point by 3-5 weeks (depending on local electricity prices).

I expect a lower-TDP 3050 to command more on the street than the higher-TDP model.
How does this even happen? You're assuming there's going to be some official card rev?

The swaps will be invisible to end users (and just in case NVIDIA has trouble sourcing enough GA107, they will likely keep the higher-TDP coolers for the next 6 months).
#4
Pumper
cvaldes: Actually the crypto miners will favor this variant over the original model since it will reduce electricity costs.
Will it though? Crypto mining does not load the GPU cores, only memory, so if memory is identical, mining wattage won't change whatsoever.
#5
Cutechri
NVIDIA just can't stop the segmentation.
#6
TheDeeGee
As long as triple MSRP is a thing, none of this is interesting for the average Joe.
#7
Unregistered
cvaldes: Actually the crypto miners will favor this variant over the original model since it will reduce electricity costs. Shaving 11 percent off the TDP might reduce the break-even point by 3-5 weeks (depending on local electricity prices).

I expect a lower-TDP 3050 to command more on the street than the higher-TDP model.
There is something weird: AMD's cards are much better for mining now, but for some reason nVidia's cards are more expensive. Either gamers or miners are dumb.
#8
Chrispy_
Pumper: Will it though? Crypto mining does not load the GPU cores, only memory, so if memory is identical, mining wattage won't change whatsoever.
This is somewhat accurate.

The cores are still active, just at greatly reduced voltages and clocks. If GA107 cards have an 11% lower TDP, that applies to full core+memory workloads like intensive gaming at higher resolutions. It'll still probably translate to ~5% lower power consumption when mining ETH. Nothing significant, but any power reduction is welcome.

Other algorithms like ERGO, Autolykos, and Ravencoin all use more GPU core than ETH does, so arguably anyone mining those will benefit more.
Xex360: There is something weird: AMD's cards are much better for mining now, but for some reason nVidia's cards are more expensive. Either gamers or miners are dumb.
I think at this point it comes down to scalpers.
#9
AusWolf
I still don't understand why they couldn't make a 3050 under 75 W TDP with no external power connector (and low profile possibilities), and a 3050 Ti with the specs of the real-world 3050.
#10
defaultluser
AusWolf: I still don't understand why they couldn't make a 3050 under 75 W TDP with no external power connector (and low profile possibilities), and a 3050 Ti with the specs of the real-world 3050.
They can - they are just saving those for the notebook cores.

With the 6500 XT, AMD is trying to sell a part with the same power consumption while offering half the 4K performance (I think the power consumption is fine!)
#11
AusWolf
defaultluser: They can - they are just saving those for the notebook cores.

With the 6500 XT, AMD is trying to sell a part with the same power consumption while offering half the 4K performance (I think the power consumption is fine!)
AMD took the wrong route by clocking a PCIe x4 notebook chip through the roof and trying to sell it as a desktop GPU for gaming.

Power consumption is not fine: 1. considering the performance level, and 2. for people looking for low-profile cards for compact systems. I understand that it's a niche market, but still, there's no need to squeeze the hell out of an otherwise crap GPU for 5% extra performance.
#12
lexluthermiester
btarunr: There's no difference in specifications between the two cards.
So in other words, this new version of the 3050 will be the better choice, if you can get one...

What I want to know is: where's the 3050 Ti? Or did I miss it?
#13
Mussels
Freshwater Moderator
I really hate subtle revisions and hardware changes, but at least they kept the core counts and clocks the same.