Tuesday, December 19th 2023

NVIDIA GeForce RTX 3050 6GB to Get a Formal Release in February 2024

The GeForce RTX 3050 has been around since January 2022, forming the entry level of the company's RTX 30-series. It had its moment in the Sun during the crypto GPU shortage as a 1080p gaming option that sold around the $300 mark. With the advent of the RTX 40-series, NVIDIA finds itself lacking an entry-level discrete GPU that it can push in high volumes. Enter the RTX 3050 6 GB. Cut down from the original RTX 3050, this SKU has 6 GB of memory across a narrower 96-bit GDDR6 memory interface, and fewer shaders. Based on the tiny GA107 "Ampere" silicon, it gets 2,048 CUDA cores compared to the 2,560 of the RTX 3050, a configuration NVIDIA refers to as GA107-325. The card has a tiny typical graphics power (TGP) of just 70 W, so we should see graphics cards without additional power connectors. The company plans to give the RTX 3050 6 GB a formal retail channel launch in February 2024, at a starting price of $179.
Source: VideoCardz
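
To put the narrower bus in perspective, here is a minimal back-of-the-envelope sketch (Python), assuming the 6 GB card keeps the 14 Gbps GDDR6 of the original RTX 3050 - an assumption, since NVIDIA has not published final memory clocks:

# Peak GDDR6 bandwidth: bus width (bits) x data rate (Gbps) / 8.
# The 14 Gbps figure is confirmed only for the original RTX 3050.
def gddr6_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

original = gddr6_bandwidth_gb_s(128, 14.0)  # 224.0 GB/s
cut_down = gddr6_bandwidth_gb_s(96, 14.0)   # 168.0 GB/s
print(f"RTX 3050 8 GB: {original:.0f} GB/s")
print(f"RTX 3050 6 GB: {cut_down:.0f} GB/s ({cut_down / original:.0%} of the original)")

Combined with the 20% shader cut (2,048 vs. 2,560 CUDA cores), the 25% bandwidth reduction suggests this SKU will land well below the original card.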

63 Comments on NVIDIA GeForce RTX 3050 6GB to Get a Formal Release in February 2024

#26
mechtech
robert3892: Right now AMD doesn't have a similar new video card to compete in that price range.
The 3050 isn't really new either.

An RX 6600 with 8 GB is cheaper.
#27
Ownedtbh
It has RTX cores for just $179.
Maybe I'll build a NIDS out of them.
#28
Denver
sLowEnd: Are we living on the same planet? As far as I'm aware, the 6500 XT was/is widely derided for missing AV1 decode and a HW encoder entirely, and for not moving performance forward at its price tier relative to the RX 580/RX 5500.
Ignore it; the RX 6500 and RX 6400 received negative reviews and criticism on practically every site. It's worth pointing out that Nvidia still managed to top this by launching the legendary GTX 1630 for $150: NVIDIA GeForce GTX 1630 Specs | TechPowerUp GPU Database
#29
Lew Zealand
Ownedtbh: It has RTX cores for just $179.
Maybe I'll build a NIDS out of them.
Useless RTX cores. This is heavily cut down from the full-fat 3050's already abysmal RT performance. The real feature it has is DLSS, which it will need wherever available just to keep up at raster.
#30
Minus Infinity
This should be $99 at most. At $179 they are smoking crack and inhaling magic mushroom vapours.
#31
Squared
Ownedtbh: It has RTX cores for just $179.
Intel Core Ultra 100 series and AMD 6000 and 7000 series APUs all have ray tracing. I think Nvidia is a little late to the budget gaming party with this.
#32
Dr. Dro
Denver: Ignore it; the RX 6500 and RX 6400 received negative reviews and criticism on practically every site. It's worth pointing out that Nvidia still managed to top this by launching the legendary GTX 1630 for $150: NVIDIA GeForce GTX 1630 Specs | TechPowerUp GPU Database
www.techpowerup.com/forums/threads/asus-radeon-rx-6500-xt-tuf-gaming.291024/

MSRP was $200 (so $20 more for a card that had even less memory); the price at the time of review, mid-bubble, was $350. It was being actively defended: "a proper 5500 XT successor," "might be my go-to card if it's ever at MSRP," and "it's 2% faster than the 5500 XT and wasn't price hiked, it's not so bad!" are some of the first comments in that review's thread. Eventually, faced with reality, they just started calling it a bad card all around instead of going on expressive tirades about how AMD is actually dastardly evil.

I'm just saying...
#33
chstamos
This is one piece of shit card, the sheer shittiness of which is rivaled only by AMD's braindead RX 6400, which needed PCIe 4.0 for its full (pathetic) performance.

Value for money has completely vanished at the low end. It was never the sweet spot, granted, but never since the S3 ViRGE have we seen such garbage infesting the entry-level GPU range. And no, it wasn't the i740. The i740, for its time, and at the price it ended up selling for, was much, much better than this festering dungpile of pus.
#34
Macro Device
Dr. Dro: www.techpowerup.com/forums/threads/asus-radeon-rx-6500-xt-tuf-gaming.291024/

MSRP was $200 (so $20 more for a card that had even less memory); the price at the time of review, mid-bubble, was $350. It was being actively defended: "a proper 5500 XT successor," "might be my go-to card if it's ever at MSRP," and "it's 2% faster than the 5500 XT and wasn't price hiked, it's not so bad!" are some of the first comments in that review's thread. Eventually, faced with reality, they just started calling it a bad card all around instead of going on expressive tirades about how AMD is actually dastardly evil.

I'm just saying...
I remember seeing 4 GB cards for >100 USD and asking the most rude and inappropriate "what in the [censored]" kind of question.

Buying GPUs for leisure never looked more like throwing your gold away at the casino than it did back then.
#35
Dr. Dro
Beginner Micro Device: I remember seeing 4 GB cards for >100 USD and asking the most rude and inappropriate "what in the [censored]" kind of question.

Buying GPUs for leisure never looked more like throwing your gold away at the casino than it did back then.
It's not that I disagree with you on this; my sole point is that no one raises their pitchforks at the other camp with the same ease. $180 is steep for this thing, I agree, especially since you can more or less find the full-fat one for that much, but it has a niche it can service that currently goes uncontested: the 6400 is a worse performer and isn't adequate for a media PC.
#36
cvaldes
Squared: Intel Core Ultra 100 series and AMD 6000 and 7000 series APUs all have ray tracing. I think Nvidia is a little late to the budget gaming party with this.
My belief is that this card's primary reason for existence isn't budget gaming but corporate use, where driving 3+ displays is a requirement.

Which is why they are doing this on low-end Ampere rather than the more recent Ada Lovelace.
#37
80-watt Hamster
Minus Infinity: This should be $99 at most. At $179 they are smoking crack and inhaling magic mushroom vapours.
You'd think that, but the full-fat 3050 is still mostly listing for $250 and up.
#38
Lew Zealand
chstamos: This is one piece of shit card, the sheer shittiness of which is rivaled only by AMD's braindead RX 6400, which needed PCIe 4.0 for its full (pathetic) performance.

Value for money has completely vanished at the low end. It was never the sweet spot, granted, but never since the S3 ViRGE have we seen such garbage infesting the entry-level GPU range. And no, it wasn't the i740. The i740, for its time, and at the price it ended up selling for, was much, much better than this festering dungpile of pus.
The difference being that the 6400 is available for $120 while this is unreleased at $180. At least $120 is somewhat accessible, and the 6400 is a drop-in for practically any PC with a PCIe slot, even one with a weaker PSU, as it's only 51 W max.
#39
tussinman
Minus Infinity: This should be $99 at most. At $179 they are smoking crack and inhaling magic mushroom vapours.
I remember that 2 years ago the 3050 was a laughing stock: $300 USD for a card that could barely outperform a 2016-era GTX 1070. In 2022/2023 Nvidia actually started reselling the 2060, which was cheaper and faster than the 3050.

Shocked that it's coming back in 2024; it has to be a joke? (Might as well just keep making/selling the 2060.)
#40
Ruined Mind
Some people have very old computers with simple power supplies, so the card must not use more than 75 watts. Some of those people replay only old games in their free time, and those games can almost reach 1920x1080 at 60 FPS with the highest settings on an RX 6400, even over a PCIe 3.0 x4 connection, so this 3050 6 GB card can definitely reach the full 1920x1080 60 FPS target, because the RX 6400 almost can. (Some old games can't use FreeSync because their motion is tied to the frame rate.)

Plus, my favorite game is an OpenGL game from 2003 which, on an RX 6400, reaches 30 FPS in the spot that's most difficult to render, and I remember it reaching 60 FPS in that same spot on a GT 1030 (while the GT 1030 had a PCIe 2.0 x4 connection). Yes, the RX 6400 is worse than the GT 1030 in an OpenGL game, even though the 1030 had a halved PCIe connection. Nvidia cards have better OpenGL support, but the RX 6400 is a vast improvement in all other games I've tried. (People with far better Radeon cards have mentioned the same problems with that specific game.)

Plus, are we sure that Nvidia will reduce the core count of the 3050 6 GB from 2,560 to 2,048? The laptop version has 2,560 cores, and it already doesn't use more than 75 watts. This article from TechPowerUp links to an article from VideoCardz, which links to a website called Board Channels, a foreign site that we cannot read without logging in. But the same story on WCCFTECH cites ITHome as the source, which is also a foreign website, but one that doesn't require a login, and running it through a translation site shows that ITHome claims the card will have 2,560 cores.

Click here to see TechPowerUp's page about the 3050 6 GB that uses 75 watts and still has 2560 cores.

Click here to see the "ITHome" website that mentions 2560 cores.

Who should we trust? Perhaps Nvidia hasn't decided whether it wants to give 2,560 cores to the bad people (like me). Plus, some rumors have suggested a range from 70 to 100 watts. Could the truth be that, regardless of whether Nvidia chooses 2,560 or 2,048 cores, this will be the final product Nvidia ever provides to the bad people for whom a slot-powered card is the cheapest route to more power, and that once the stock is depleted, Nvidia will sell the true 4050 targeted at 90 to 100 watts, which is what caused some confused rumor sources to mention the 70-to-100-watt range?

This bad 3050 may be the final mercy shown to the bad people, provided only because of old stock of the regular 3050 that either went unsold after the pandemic or has defective components that must be disabled, so it cannot reach the full quality of the regular 3050. Without that situation, they might never have decided to release a slot-powered card again.

I haven't decided whether I will buy it, but I will tell you this: even if AMD releases a better card that still doesn't require more than 75 watts, and it's cheaper, I still won't choose it, because one of the few games I'll ever play more than once uses OpenGL, and AMD doesn't care about that. (And imagine the horror of Intel's drivers with OpenGL.)
#41
R0H1T
80-watt Hamster: but the full-fat 3050 is still mostly listing for $250 and up.
At least around here I've never seen it listed under $250, and it's generally above $300 (even on sale), although B/M stores might sell it slightly cheaper.
#42
TumbleGeorge
R0H1T: At least around here I've never seen it listed under $250, and it's generally above $300 (even on sale), although B/M stores might sell it slightly cheaper.
In the USA region: [price listing screenshot]

And for this month: [price listing screenshot]
#43
R0H1T
Right & I'm not from the US, if that wasn't already clear.
#44
TumbleGeorge
In my country (Bulgaria), the price today decreased to $250 (the same price in one other shop, and a little more expensive in several more). This price includes 20% VAT and various border fees.
#45
Dr. Dro
3050 pricing here is... not good; the difference to the 4060 is negligible right now. Interestingly enough, the 4090's price did not increase here like it did in the US.
#46
kapone32
Wow, this has worse specs than the laptop 3060. It has the same amount of VRAM but half the bus and fewer cores. This card may struggle to play newer titles at 1080p running native.
#47
Dr. Dro
kapone32: Wow, this has worse specs than the laptop 3060. It has the same amount of VRAM but half the bus and fewer cores. This card may struggle to play newer titles at 1080p running native.
Upon digging a little, it looks like it has similarities to the "3050 Laptop GPU Refresh" that some 2023 entry-level laptops are shipping with, which is different from the original model used in 2021 and 2022 laptops such as my Dell G15 5515. It looks like they have re-released the "3050 Ti Laptop GPU" (with all 20 SMs enabled) with a narrower bus and now sell it under the "RTX 3050 Laptop GPU" brand, this desktop card being a bizarre design that pairs the original 3050M's shader configuration (16 out of 20 SMs) with the refreshed 3050M's reduced memory bandwidth - basically the worst of both worlds.

I surmise these newer 6 GB mobile 3050s are slower than the originals unless the VRAM budget is really badly blown, because of the reduction in memory bandwidth. For what it's worth, if my laptop's performance (original 128-bit 3050 overclocked to 2 GHz with 13 Gbps memory, 80-watt version) is anything to go by, it should perform acceptably - but I have concerns. I think it'll be slower than the equivalent-power (70 and 80 watt) versions of the original 3050M and especially the 3050M Ti.
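
To make the "worst of both worlds" point concrete, here is a rough sketch (Python) of the configurations discussed above; the SM counts and bus widths come from the discussion, while the 14 Gbps data rate is an assumption for illustration:

# GA10x "Ampere" SMs carry 128 FP32 CUDA cores each.
# Data rates are assumed for illustration, not confirmed specs.
configs = {
    "3050 Laptop (original)": (16, 128, 14.0),  # SMs, bus bits, Gbps
    "3050 Ti Laptop":         (20, 128, 14.0),
    "3050 Laptop (refresh)":  (20,  96, 14.0),
    "3050 6 GB desktop":      (16,  96, 14.0),
}
for name, (sms, bus_bits, rate) in configs.items():
    cores = sms * 128
    bandwidth = bus_bits * rate / 8  # peak GB/s
    print(f"{name:24s} {cores:4d} cores, {bandwidth:5.1f} GB/s")

The desktop 6 GB card ends up with both the lowest core count and the lowest bandwidth in the table, which is exactly the pairing being criticized.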

#48
loracle706
Useless card; better to lower RTX 3060/4060 prices!!
#49
kapone32
Dr. Dro: Upon digging a little, it looks like it has similarities to the "3050 Laptop GPU Refresh" that some 2023 entry-level laptops are shipping with, which is different from the original model used in 2021 and 2022 laptops such as my Dell G15 5515. It looks like they have re-released the "3050 Ti Laptop GPU" (with all 20 SMs enabled) with a narrower bus and now sell it under the "RTX 3050 Laptop GPU" brand, this desktop card being a bizarre design that pairs the original 3050M's shader configuration (16 out of 20 SMs) with the refreshed 3050M's reduced memory bandwidth - basically the worst of both worlds.

I surmise these newer 6 GB mobile 3050s are slower than the originals unless the VRAM budget is really badly blown, because of the reduction in memory bandwidth. For what it's worth, if my laptop's performance (original 128-bit 3050 overclocked to 2 GHz with 13 Gbps memory, 80-watt version) is anything to go by, it should perform acceptably - but I have concerns. I think it'll be slower than the equivalent-power (70 and 80 watt) versions of the original 3050M and especially the 3050M Ti.

That sounds like when I bought my 5800/3060 Strix laptop and they released the 4050 as a replacement with less hardware. Nvidia never changes their shady practices.
#50
Squared
cvaldes: My belief is that this card's primary reason for existence isn't budget gaming but corporate use, where driving 3+ displays is a requirement.

Which is why they are doing this on low-end Ampere rather than the more recent Ada Lovelace.
Good point. For all of those possible markets, the price seems a little tall considering that many new APUs get the same performance. (And reviewers always seem to ooh and aah over the latest iGPU, but 10 years ago iGPUs were limited by dual-channel CPU memory, and today they still are, so I don't think they've improved at all beyond what silicon chips in general have. So why has the entry-level GPU market nearly gone extinct? It's not because iGPUs are good; it's because desktops have almost gone extinct outside of the gaming world.)
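
The dual-channel bottleneck is easy to put numbers on. A minimal sketch (Python), using common DDR4/DDR5 speeds as illustrative assumptions rather than any specific APU's spec:

# Peak bandwidth of dual-channel system memory vs. this card's GDDR6.
# DDR speeds are example configurations; the GPU figure assumes
# 14 Gbps GDDR6 on the 96-bit bus.
def dual_channel_gb_s(mt_per_s, channel_bits=64):
    return 2 * mt_per_s * channel_bits / 8 / 1000

print(f"Dual-channel DDR4-3200: {dual_channel_gb_s(3200):5.1f} GB/s")  # 51.2
print(f"Dual-channel DDR5-5600: {dual_channel_gb_s(5600):5.1f} GB/s")  # 89.6
print(f"RTX 3050 6 GB (96-bit @ 14 Gbps): {96 * 14 / 8:5.1f} GB/s")    # 168.0

And an iGPU has to share that system-memory bandwidth with the CPU, so its effective share is lower still.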