Wednesday, April 24th 2019
![NVIDIA](https://tpucdn.com/images/news/nvidia-v1739475473466.png)
NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine
NVIDIA's GeForce GTX 1650 has a significantly watered-down multimedia feature-set compared to the other GeForce GTX 16-series GPUs. The card launched this Tuesday (23 April) without any meaningful technical documentation for reviewers, which caused many, including us, to assume that NVIDIA carried over the "Turing" NVENC encoder, giving you a feature-rich HTPC or streaming card at $150. Apparently that is not the case. According to the full specifications NVIDIA put up on its website product page hours after launch, the GTX 1650 (and the TU117 silicon it is based on) features a multimedia engine carried over from the older "Volta" architecture.
Turing's NVENC is known to offer around a 15 percent performance uplift over Volta's, which means the GTX 1650 will have worse game livestreaming performance than expected. The GTX 1650 has sufficient muscle for playing e-Sports titles such as PUBG at 1080p, and with an up-to-date accelerated encoder it would have pulled droves of amateur streamers into the mainstream on Twitch and YouTube Gaming. Alas, the $220 GTX 1660 would be your ticket to that.
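For context, "game livestreaming" here means feeding rendered frames to the GPU's NVENC block and pushing the result to an ingest server. Below is a minimal sketch of that workflow driven from Python through FFmpeg; the ingest URL, stream key, and capture input are placeholders, and it assumes an FFmpeg build with h264_nvenc enabled (the GTX 1650's Volta-class NVENC accepts the same command, just with the older encoder block doing the work).

```python
# Minimal sketch: push a desktop capture to an RTMP ingest using NVENC via FFmpeg.
# Assumptions: ffmpeg is on PATH and built with NVENC support; the ingest URL and
# stream key are placeholders; audio is omitted for brevity.
import subprocess

INGEST = "rtmp://live.twitch.tv/app/YOUR_STREAM_KEY"  # placeholder

cmd = [
    "ffmpeg",
    "-f", "gdigrab", "-framerate", "60", "-i", "desktop",  # Windows capture; use x11grab on Linux
    "-c:v", "h264_nvenc",                 # hardware H.264 encode on the NVENC block
    "-b:v", "6M", "-maxrate", "6M", "-bufsize", "12M",
    "-g", "120",                          # keyframe every 2 seconds at 60 fps
    "-pix_fmt", "yuv420p",
    "-f", "flv", INGEST,
]
subprocess.run(cmd, check=True)
```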
Sources:
Quickshot_Gaming (Reddit), NVIDIA
92 Comments on NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine
There are some other options - like LAN streaming - but they'll always use your CPU in some way. That said, they may use less of it than encoding itself. :)
You can always try the encoder in your Radeon. It will work. It's just not as polished as the competition's.
Some people noticed problems with reserving GPU resources, i.e. when your GPU loads get near 100%, AMD cards often prioritize gaming. Basically, instead of losing 10-20% of fps on your monitor, you'll lose 95% of fps in the stream. This could have been fixed, though.
AMD encoding adds the most lag, so it's not recommended for scenarios where you play on the receiver (e.g. I stream to a TV). You'll get less lag encoding with Intel QuickSync, even though the signal has to move over PCIe first. Yup, 720p is not an issue for sure. Ryzen will do that easily. :)
1080p should work as well. Give it a go.
It gets problematic when you start thinking about 4K. :p
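If anyone wants to try the stream-to-a-TV scenario with QuickSync, here's a rough sketch using FFmpeg over plain UDP/MPEG-TS. The receiver address is a placeholder, h264_qsv needs an Intel iGPU with its media driver installed, and you can swap in h264_nvenc or h264_amf to compare the other vendors.

```python
# Sketch: a LAN stream to a TV/receiver using Intel QuickSync via FFmpeg.
# Assumptions: ffmpeg with h264_qsv support; a receiver (e.g. VLC or Kodi)
# listening at the placeholder address below.
import subprocess

RECEIVER = "udp://192.168.1.50:5000"  # placeholder LAN address

cmd = [
    "ffmpeg",
    "-f", "gdigrab", "-framerate", "60", "-i", "desktop",  # capture the game screen (Windows)
    "-c:v", "h264_qsv", "-b:v", "10M",   # encode on the iGPU, leaving the dGPU to the game
    "-f", "mpegts", RECEIVER,            # MPEG-TS over UDP keeps buffering minimal
]
subprocess.run(cmd, check=True)
```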
But thanks for the tips! I do need a capture card for some console gameplay, though - what's the best way to capture component video (PS2)? A PCI-E HDMI input card would handle the PS3.
I have the same feeling. Maybe Turing's NVENC shares features with Volta's, and they just disabled those?
Maybe B frame support is power hungry? It is actually quite complex: you're not just compressing a single frame, but also comparing it to neighbouring frames.
They had a 75 W budget. Idle draw + fan is easily 10 W already. You're left with 50-55 W for game rendering. Even if B frame support sucks 2 W, it would make a difference.
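Just to make that back-of-the-envelope arithmetic explicit - the figures below are the rough estimates from the post above, not measured numbers:

```python
# Back-of-the-envelope power budget for a 75 W, slot-powered card.
# All figures are rough estimates taken from the discussion above.
board_limit   = 75   # W available from the PCIe slot alone
idle_and_fan  = 10   # W for fan, VRM losses and idle logic
misc_overhead = 10   # W for memory and the rest of the board (guess)
nvenc_bframes = 2    # W hypothetically attributed to a beefier NVENC block

for_rendering = board_limit - idle_and_fan - misc_overhead - nvenc_bframes
print(f"Left for the GPU core: ~{for_rendering} W")  # lands in the 50-55 W range quoted above
```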
It would be nice if a Turing owner could test it. :-)
100% disagree. This is a simple card put into mainstream OEM PCs or bought by people that aren't really spending a lot on computers (either money or time).
The only thing a buyer should be forced to "research" is whether it can run the game he wants.
Which brings us to the issue of... whether lack of B frame support is really an issue for the target consumer. It's not, right?
B frame support simply produces smaller files at the same quality.
People use 1050Ti for esports and they can use 1650 just as well. It has the same NVENC encoding capabilities.
B frame support matters when you need to lower the bandwidth use. I don't think that's a huge problem for 1650 owners. It's not like they'll be streaming 4K@60fps. ;-)
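For anyone who'd rather measure than argue: here's a rough sketch that encodes the same clip with and without B frames and compares file sizes. It uses h264_nvenc in constant-quality mode so the size difference reflects the B-frame savings; the input file is a placeholder, and whether (and how many) B frames are honoured depends on your NVENC generation and FFmpeg build.

```python
# Sketch: encode the same clip with and without B frames and compare file sizes.
# Assumptions: ffmpeg with h264_nvenc; a local test file "clip.mp4" (placeholder).
# Constant-QP mode is used so quality stays fixed and only the size changes.
import os
import subprocess

SRC = "clip.mp4"  # placeholder input

for bframes in (0, 3):
    out = f"out_bf{bframes}.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC,
         "-c:v", "h264_nvenc", "-rc", "constqp", "-qp", "23",
         "-bf", str(bframes),   # 0 = no B frames, 3 = up to 3 consecutive B frames
         "-an", out],           # drop audio so sizes reflect video only
        check=True,
    )
    print(f"{out}: {os.path.getsize(out) / 1e6:.1f} MB")
```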
This is interesting, though. In the Maxwell days, it was the 960 and 950 that had the better encoding and decoding compared to the 970 and 980 Ti. Now we have the exact opposite. It always did rub me the wrong way that my 980 was in some ways inferior to the cheaper and less performant product. I prefer this situation more.
Also, there must be some expansion of the NVENC block for Turing that made this decision make sense in the first place. They must have been trying to use as little silicon as possible, otherwise why not put your best foot forward?
Why does Nvidia, which has the best hardware, best drivers and highest market share, have to do stupid shit like this all the time?
bta is posting unsourced inflammatory clickbait articles left right and centre, and the same braindead trolls are posting "Intel/NVIDIA/MSI/<insert-company-name-here> sucks" comments in response.
Be careful with arguments like that. ;-)
But seriously, quite a lot. In fact most of us stream. :)
Keep in mind all kinds of remote desktops are basically streaming (RDP, Citrix, VMWare, Teamviewer and so on).
Have you ever used the "Project" function in Windows? How do you think that works? :p
We also stream video between devices we own (NAS to TV, PC/Mac to phone etc) - we just don't think about it. We click "show on TV" and it just magically works - like computers should :).
And yeah, the game streaming phenomenon is also quite significant.
There was also a time when people actually kept movies on drives and consciously encoded them all the time for their phones etc. I think this is a dying skill at the moment - it moved to the cloud mostly. Most people stream unconsciously. Computer chooses the technology for them.
Now, the actual support for NVENC is another matter. By default everything looks for a hardware encoder in the CPU/SoC.
But Citrix and VMware remote desktops already support NVENC (datacenter software - makes sense). Really, you'll criticize Nvidia because they've introduced a very advanced encoding feature and it's not available in the cheapest GPU. Oh my...
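As a quick way to see what your own stack would pick up, you can ask FFmpeg which hardware encoders your build exposes - a sketch, assuming ffmpeg is installed and on PATH:

```python
# Sketch: list the hardware video encoders your FFmpeg build exposes.
# Assumes ffmpeg is installed and on PATH; the substrings below match the usual
# vendor blocks (NVIDIA NVENC, Intel QuickSync, AMD AMF/VCE).
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if any(tag in line for tag in ("nvenc", "qsv", "amf")):
        print(line.strip())
```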
Why did the GT 1030 have NVENC removed when even the Nintendo Switch's small SoC has the power to use it? Why does the 1050's successor get a gimped version of its NVENC, when the 1050 had the full feature set at the time?
There is zero cost difference in enabling features that are already on the hardware, and it gives you good marketing for free.
What about AMD? Are you sure they're offering the same encoding features in all their GPUs?
BTW: do you even know what features are supported? :-D
I have no idea how to check this. AMD website is a mess. I think it's not there.
I looked at Wikipedia, I've asked Google. Nothing. It seems like all of humanity's knowledge about VCE comes from slides and leaks. :-D
en.wikipedia.org/wiki/Video_Coding_Engine
Not much in general. Nothing about VCE 4.0. And there's even VCE 4.1 now! :-D
Even the name is problematic. Most of the web seems to think the "C" stands for either "Coding" or "Codec". AMD says it's "Code" (so why haven't they fixed the wiki article?).
The only thing most of the web agrees on is that VCE is rubbish.
Defending forced market segmentation on hardware that is perfectly capable of having a full feature set...
The only way to read this is: "Oh, you want to stream with a lower bitrate? Go get a 1660 to 2080 Ti, even though our low-end hardware is more than capable of doing it."
Have you checked them all personally? Has anyone, ever?
Do you even know what features are supported by your 270X?
According to wikipedia, your GPU has VCE 1.0. I have to base this on wiki, because - obviously - AMD website doesn't mention it. In fact it doesn't mention VCE at all:
www.amd.com/en/support/graphics/amd-radeon-r9-series/amd-radeon-r9-200-series/amd-radeon-r9-270x
Anyway, it came out at roughly the same time Nvidia launched Maxwell.
So thanks to this lovely chart: developer.nvidia.com/video-encode-decode-gpu-support-matrix, I know Maxwell introduced lossless H.264.
Does your GPU support lossless H.264?
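If a Maxwell-or-newer owner wants to answer that empirically, here's a sketch. Note the preset name differs between FFmpeg versions (older builds take -preset lossless, newer ones -tune lossless), so check `ffmpeg -h encoder=h264_nvenc` and adjust.

```python
# Sketch: attempt a lossless H.264 encode on NVENC (Maxwell or newer per NVIDIA's matrix).
# Assumptions: ffmpeg with h264_nvenc and a local test file "clip.mp4" (placeholder).
# Older ffmpeg builds use "-preset lossless", newer ones "-tune lossless".
import subprocess

result = subprocess.run(
    ["ffmpeg", "-y", "-i", "clip.mp4",
     "-c:v", "h264_nvenc", "-preset", "lossless",
     "-an", "lossless.mp4"],
    capture_output=True, text=True,
)
if result.returncode == 0:
    print("Lossless encode succeeded - the hardware accepted it.")
else:
    print("Encoder rejected the settings:")
    print(result.stderr[-500:])  # last chunk of FFmpeg's error output
```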
I can get an AMD CPU, any of them, and I get AVX2. Good luck getting AVX (1, 2, 512) and FMA on any low-end Intel CPU. I can get any AMD or Intel GPU and encode; good luck with that on a 1030 (you get the free, useless GeForce Experience on them instead).
After some time, when B frames are standard, guess who will hate having a 1650?
Yes, Nvidia is the first to deploy it. There is absolutely no need to limit it to the most expensive hardware "just because".
Can you say, being 100% sure, whether your GPU supports the feature I mentioned or doesn't?
Because if you can't, how can you be sure that all AMD GPUs in a generation have the same functionality? Where do you get this knowledge from?
As long as the VCE version you are referring to is the same, there are no differences, as it should be. Same with Intel.
Now, if you want official information, too bad, AMD is the worst on that front. Never try their profiler, it doesn't even work.
Oh, look, AMD has had B frames for H.264 since VCE 2.0, and that's from 2013.
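Since the official documentation is so thin, one way to check what the AMD encoder actually exposes on a given setup is to ask FFmpeg itself - a sketch, assuming a build with AMF support:

```python
# Sketch: dump the options FFmpeg's AMD AMF H.264 encoder exposes on this system.
# Assumes an ffmpeg build with AMF support; builds without it simply report an
# unknown encoder, which is an answer in itself.
import subprocess

info = subprocess.run(
    ["ffmpeg", "-hide_banner", "-h", "encoder=h264_amf"],
    capture_output=True, text=True,
).stdout
print(info)
```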