Wednesday, March 29th 2023

NVIDIA GeForce RTX 4050 "Ada" Launches This June

NVIDIA's mainstream GeForce RTX 4050 "Ada" graphics card, which succeeds the RTX 3050, reportedly launches in June 2023. This could end up being the highest-volume SKU in the RTX 40-series desktop lineup. Team green is planning to launch a new desktop SKU every month leading up to summer. April will see the company launch the performance-segment RTX 4070, followed by the RTX 4060 Ti and RTX 4060 in May, and now we hear about the RTX 4050 in June.

The GeForce RTX 4050 is likely based on a heavily cut-down version of the 5 nm "AD107" silicon that also powers the RTX 4060 in its maxed-out configuration. The AD107, much like the AD106, features a 128-bit wide GDDR6 memory interface. For the RTX 4050, NVIDIA could narrow this down to 96-bit and give it 6 GB of GDDR6 memory, which is 25% less than the 8 GB of 128-bit GDDR6 memory that's standard for the current-generation RTX 3050. NVIDIA will have worked out the performance numbers, and the RTX 4050 could still end up generationally faster than the RTX 3050 despite the narrower bus and smaller memory.
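The bandwidth cost of a narrower bus is simple arithmetic. Here's a minimal sketch, assuming (hypothetically) the same 18 Gbps GDDR6 data rate on both bus widths; NVIDIA has not confirmed the RTX 4050's memory clocks:

```python
# Peak GDDR6 bandwidth: pins x per-pin data rate / 8 bits per byte.
# The 18 Gbps data rate is an assumption for illustration only.

def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

for label, bus in [("128-bit (RTX 3050-class)", 128),
                   ("96-bit (rumored RTX 4050)", 96)]:
    print(f"{label}: {gddr6_bandwidth_gbs(bus, 18.0):.0f} GB/s")
# 128-bit -> 288 GB/s, 96-bit -> 216 GB/s: a 25% bandwidth cut,
# mirroring the 8 GB -> 6 GB capacity reduction.
```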
Sources: MEGAsizeGPU (Twitter), VideoCardz

81 Comments on NVIDIA GeForce RTX 4050 "Ada" Launches This June

#51
enb141
Berfs11050 Ti supports H.264 decoding and H.265 4:2:0 decoding. Source
Yeah, but it's slower than the 6400, and the 1050 Ti also doesn't support VRR.
AusWolfI don't use Kodi, but all H.264 and H.265 videos run butter smoothly in VLC and Youtube (in Chrome) with no CPU usage. Even 4K ones.
Yes, but still no VRR or 10-bit on smart TVs.
#52
mechtech
enb141Yeah, but it's slower than the 6400, and the 1050 Ti also doesn't support VRR.

Yes, but still no VRR or 10-bit on smart TVs.
Do smart TVs even do 10-bit? I thought HDMI TVs were 8-bit, or does it depend on the version/year?
#53
ReallyBigMistake
Jensen
please release a 4050 Ti
just so I can have it in my collection
#54
enb141
mechtechDo smart TVs even do 10-bit? I thought HDMI TVs were 8-bit, or does it depend on the version/year?
Yes. With video cards, you need either HDMI 2.1 (only newer AMD cards have it) or a DisplayPort-to-HDMI connection.

The smart TV needs to have HDMI 2.1 as well, so only modern smart TVs have 10-bit and even 12-bit.
#55
AusWolf
enb141Yes. With video cards, you need either HDMI 2.1 (only newer AMD cards have it) or a DisplayPort-to-HDMI connection.

The smart TV needs to have HDMI 2.1 as well, so only modern smart TVs have 10-bit and even 12-bit.
Then you'll need to wait for lower-end RX 7000-series cards, unless you want a 7900 XT in your HTPC. Nothing else has HDMI 2.1 at the moment.
#56
enb141
AusWolfThen you'll need to wait for lower-end RX 7000-series cards, unless you want a 7900 XT in your HTPC. Nothing else has HDMI 2.1 at the moment.
Never ever. I'm done with AMD: their drivers suck, there's no 10-bit and/or VRR on smart TVs, and now they've broken hardware playback in Kodi.
#57
AusWolf
enb141Never ever. I'm done with AMD: their drivers suck, there's no 10-bit and/or VRR on smart TVs, and now they've broken hardware playback in Kodi.
You're done with AMD, the only company that has a card with HDMI 2.1 that you admitted you need. I'm not sure I understand, but good luck, I guess?
#58
enb141
AusWolfYou're done with AMD, the only company that has a card with HDMI 2.1 that you admitted you need. I'm not sure I understand, but good luck, I guess?
You don't get it, do you? You can get HDMI 2.1 via DisplayPort 1.4a, and all video cards have that port.
#59
AusWolf
enb141You don't get it, do you? You can get HDMI 2.1 via DisplayPort 1.4a, and all video cards have that port.
DP to HDMI conversion isn't flawless, and I personally wouldn't rely on it. If you manage to make it work, good on you. I just wouldn't say it's the card's fault when it doesn't work, or when you don't get the signal that you wanted, as it wasn't the card's intended purpose in the first place.
#60
enb141
AusWolfDP to HDMI conversion isn't flawless, and I personally wouldn't rely on it. If you manage to make it work, good on you. I just wouldn't say it's the card's fault when it doesn't work, or when you don't get the signal that you wanted, as it wasn't the card's intended purpose in the first place.
Yeah, that's why my old 1050 Ti worked way better than the new 6400. With that card, DisplayPort to HDMI let me get 10-bit, and hardware decoding in Kodi worked; no VRR, because the 10-series didn't have it.

Keep your AMD; I'll be back to NVIDIA as soon as they release their 4050.
#61
Ripcord
MaMooAMD thought it was wise to release the RX 6500 XT with only 4 GB. I don't think it worked out sales-wise.
It's not the 4 GB of memory that hurts the 6500 XT; it's the limited PCIe bandwidth. An x4 link just doesn't cut it on PCIe 3.0 or below.
#62
ExcuseMeWtf
RipcordIt's not the 4 GB of memory that hurts the 6500 XT; it's the limited PCIe bandwidth. An x4 link just doesn't cut it on PCIe 3.0 or below.
Both, actually.
If it had more VRAM, it wouldn't access PCIe so much.
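For rough numbers behind both points, here's a minimal sketch of the link arithmetic, using the standard per-lane rates and the 128b/130b encoding of PCIe 3.0 and newer:

```python
# Approximate usable PCIe throughput per link configuration.
GT_PER_LANE = {3: 8.0, 4: 16.0}  # transfer rate in GT/s per lane

def pcie_gb_s(gen: int, lanes: int) -> float:
    """Usable GB/s after 128b/130b encoding overhead."""
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

print(f"PCIe 3.0 x4:  {pcie_gb_s(3, 4):.1f} GB/s")   # ~3.9 GB/s
print(f"PCIe 4.0 x4:  {pcie_gb_s(4, 4):.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 3.0 x16: {pcie_gb_s(3, 16):.1f} GB/s")  # ~15.8 GB/s
# Once assets spill past 4 GB of VRAM, they stream over this link,
# and a Gen 3 x4 slot offers half the headroom of Gen 4 x4.
```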
#63
TheinsanegamerN
ExcuseMeWtfBoth, actually.
If it had more VRAM, it wouldn't access PCIe so much.
It's also horribly bandwidth-starved, even when VRAM utilization is below 4 GB.

That card really should have had 6 GB on a 96-bit bus or 8 GB on a 128-bit bus, with an x8 connection.
#64
Jeelo
Gonna be overpriced for sure
#65
AusWolf
TheinsanegamerNIt's also horribly bandwidth-starved, even when VRAM utilization is below 4 GB.
Is there a way to measure that? It has about half the GPU horsepower of the 6650 XT (which is a generally well-received card), and also half the memory bandwidth, which sounds about right to me.

I agree on the x8 connection, though.
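One quick way to eyeball it is from the spec sheets. A sketch using approximate published figures (treat them as ballpark numbers, not exact):

```python
# Approximate spec-sheet figures: (FP32 TFLOPS, memory bandwidth GB/s).
cards = {
    "RX 6500 XT": (5.8, 144.0),   # 64-bit GDDR6 @ 18 Gbps
    "RX 6650 XT": (10.8, 280.0),  # 128-bit GDDR6 @ 17.5 Gbps
}

tf_s, bw_s = cards["RX 6500 XT"]
tf_b, bw_b = cards["RX 6650 XT"]
print(f"compute ratio:   {tf_s / tf_b:.2f}")   # ~0.54
print(f"bandwidth ratio: {bw_s / bw_b:.2f}")   # ~0.51
# Both ratios land near 0.5, so the compute-to-bandwidth balance is
# comparable; the x4 link and 4 GB capacity are the bigger handicaps.
```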
#66
mechtech
enb141Never ever. I'm done with AMD: their drivers suck, there's no 10-bit and/or VRR on smart TVs, and now they've broken hardware playback in Kodi.
Really? My old RX 480 did 10-bit colour, at least I'm pretty sure it did. Although it didn't matter, because I only had/have an 8-bit monitor, and the Windows desktop is only 8-bit anyway.
#67
CyberCT
When I had my 2080 Ti (HDMI 2.0), it was capable of doing 10-bit color on the TV (HDMI 2.1). It just couldn't do it at 2160p 120 Hz. But 1080p 120 Hz 4:4:4 chroma 10-bit color worked fine. This was on my 2021 Samsung 9-series QLED TV. Their higher-end QLED TVs are all G-Sync compatible too, but you can't find Samsung documentation about it anywhere except for some random articles online, which is ridiculous IMO. This would be a great selling point for their TVs. But if you call Samsung and ask, they'll tell you it is G-Sync compatible. And it is, because I've been using it on 30-series GPUs lol.
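Those observations line up with simple bandwidth arithmetic. A sketch that ignores blanking overhead, so real requirements run slightly higher:

```python
# Uncompressed RGB/4:4:4 data rate vs. HDMI 2.0's usable bandwidth
# (18 Gbps raw minus 8b/10b encoding overhead = ~14.4 Gbps).
HDMI_2_0_USABLE = 14.4  # Gbps

def mode_gbps(w: int, h: int, hz: int, bits_per_channel: int) -> float:
    return w * h * hz * bits_per_channel * 3 / 1e9

for name, mode in [("2160p120 10-bit", (3840, 2160, 120, 10)),
                   ("2160p60 10-bit",  (3840, 2160, 60, 10)),
                   ("1080p120 10-bit", (1920, 1080, 120, 10))]:
    need = mode_gbps(*mode)
    verdict = "fits" if need <= HDMI_2_0_USABLE else "needs HDMI 2.1 (or chroma subsampling)"
    print(f"{name}: {need:4.1f} Gbps -> {verdict}")
# 1080p120 10-bit 4:4:4 squeezes into HDMI 2.0 easily; the 2160p modes
# at 10-bit 4:4:4 do not, which is why 4:2:0 gets used for 4K60 HDR.
```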
#68
enb141
mechtechReally? My old RX 480 did 10-bit colour, at least I'm pretty sure it did. Although it didn't matter, because I only had/have an 8-bit monitor, and the Windows desktop is only 8-bit anyway.
I said with smart TVs; with monitors it doesn't matter (if you have drivers for your monitor). The problem with smart TVs is that they don't have Windows drivers, but NVIDIA cards manage to make it work.
#69
Chrispy_
It had better be faster and offer better performance/$ than the 3050, BECAUSE THE 3050 IS TERRIBLE.

Anyone with a modicum of sense bought a Radeon 6600, which is a hands-down winner in every possible way; it's like 60% faster at the same price point and gives the more expensive 3060 12 GB reason to sweat.

If you hate team red for whatever reason, you could have thrown your 3050 money at the used market and picked up a 2060S for less. Not only does it have waaaaay higher performance, but you'd also likely save enough cash to buy one of those giant foam middle fingers used at stadium events and mail it to Jensen's home address. The 4 GB variant of the 3050 isn't even worth mocking; it's not PC to make fun of stillborn infants.
#70
Berfs1
enb141Yeah, but it's slower than the 6400, and the 1050 Ti also doesn't support VRR.
The RX 6400 came out over five years later. VRR is stupid in my opinion; it always causes capture cards to run out of sync and just causes so many unneeded problems. Just use a fixed refresh rate.
AusWolfThat's a full-height card with a power connector.

The RX 6400 does the same without a power connector, also in low-profile flavours, while being a lot faster for essentially the same price.
Yeah, I unfortunately realized that as soon as I got the card, which baffles me, because an 8-pin means the card can draw up to 225 W, but this GPU's default power limit of 55 W is about a quarter of that; it did not need that 8-pin. It's okay tho, I found a way to hide the cable on such a short card lol

As for the RX 6400: it came out in 2022, while the GTX 1050 Ti came out in late 2016. Congrats, AMD made a card over five years later that can do everything the 1050 Ti can, and it still can't encode (because it's derived from an ultrabook dGPU). So it's a pretty useless graphics card if you ask me.
#71
enb141
Berfs1The RX 6400 came out over five years later. VRR is stupid in my opinion; it always causes capture cards to run out of sync and just causes so many unneeded problems. Just use a fixed refresh rate.
Stupid or not, it's useful for video playback: some videos are at 24 fps, others at 50 fps, and in those cases VRR helps.

Even at fixed refresh rates, those videos judder.
#72
Berfs1
enb141some videos are at 24 fps, others at 50 fps, and in those cases VRR helps.

Even at fixed refresh rates, those videos judder.
VRR has a minimum refresh rate; I don't think a lot of monitors can go below 40 Hz. Though if you set your monitor to 120 or 240 Hz, it will sync properly with 24, 30, and 60 fps video! I still don't understand why YouTube allows such different FPS numbers though...
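The fixed-refresh case comes down to whether the refresh rate is an integer multiple of the content's frame rate. A quick sketch:

```python
# Judder check: does each video frame get an equal number of refreshes?
def repeats(refresh_hz: int, fps: int) -> float:
    return refresh_hz / fps

for refresh in (60, 120):
    for fps in (24, 30, 60):
        r = repeats(refresh, fps)
        verdict = "smooth" if r == int(r) else "judder (uneven cadence)"
        print(f"{fps:>2} fps on {refresh} Hz: {r:g}x per frame -> {verdict}")
# 24 fps on 60 Hz repeats frames 2.5x on average (3:2 pulldown); on
# 120 Hz it's exactly 5x. Note 50 fps content still needs 50/100 Hz
# modes or VRR, since 120/50 = 2.4.
```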
#73
enb141
Berfs1VRR has a minimum refresh rate; I don't think a lot of monitors can go below 40 Hz. Though if you set your monitor to 120 or 240 Hz, it will sync properly with 24, 30, and 60 fps video! I still don't understand why YouTube allows such different FPS numbers though...
Who's talking about monitors? I'm talking about VRR on smart TVs, not FreeSync or G-Sync, which some smart TVs have.

My smart TV works at 30 and 24 Hz; at least in Kodi, choosing to use the real framerate allowed me to set those refresh rates.
#74
AusWolf
Berfs1As for the RX 6400: it came out in 2022, while the GTX 1050 Ti came out in late 2016. Congrats, AMD made a card over five years later that can do everything the 1050 Ti can, and it still can't encode (because it's derived from an ultrabook dGPU). So it's a pretty useless graphics card if you ask me.
Except that it's a hell of a lot faster, it needs less power (50-ish W instead of 75), and it has HDMI 2.0, which can drive 4K 60 Hz displays. I'm using mine as an HTPC card, so for me, it's quite useful (well, it's actually a 6500 XT, but the point remains).
#75
enb141
AusWolfExcept that it's a hell of a lot faster, it needs less power (50-ish W instead of 75), and it has HDMI 2.0, which can drive 4K 60 Hz displays. I'm using mine as an HTPC card, so for me, it's quite useful (well, it's actually a 6500 XT, but the point remains).
I can't use my 6400 for HTPC, and I can't recommend it for HTPC either, because you don't get VRR and you're limited to 8-bit with smart TVs. I'll be back to NVIDIA as soon as they release their 4050.