Wednesday, April 24th 2019

NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine
The NVIDIA GeForce GTX 1650 has a significantly watered-down multimedia feature-set compared to the other GeForce GTX 16-series GPUs. The card launched this Tuesday (23 April) without any meaningful technical documentation for reviewers, which caused many, including us, to assume that NVIDIA carried over the "Turing" NVENC encoder, giving you a feature-rich HTPC or streaming card at $150. Apparently that is not the case. According to the full specifications NVIDIA published on the card's product page, which went up hours after launch, the GTX 1650 (and the TU117 silicon) features a multimedia engine carried over from the older "Volta" architecture.
Turing's NVENC is known to offer around a 15 percent performance uplift over Volta's, which means the GTX 1650 will have worse game livestreaming performance than expected. The GTX 1650 has sufficient muscle for playing e-Sports titles such as PUBG at 1080p, and with an up-to-date accelerated encoder it would have pulled droves of amateur streamers into the mainstream on Twitch and YouTube Gaming. Alas, the $220 GTX 1660 would be your ticket to that.
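For a sense of how streamers actually tap NVENC, below is a minimal Python sketch that drives the encoder through FFmpeg. It assumes an FFmpeg build with h264_nvenc enabled; the input file name and the 6 Mbps target are placeholders, not values from this article.

import subprocess

def nvenc_encode(src, dst, bitrate="6M"):
    # Hand the H.264 encode to the GPU's NVENC block; audio is passed through untouched.
    subprocess.run(
        ["ffmpeg", "-y",
         "-i", src,              # captured gameplay footage (placeholder name below)
         "-c:v", "h264_nvenc",   # NVIDIA hardware H.264 encoder exposed by FFmpeg
         "-b:v", bitrate,        # bitrate in the ballpark streamers use for 1080p
         "-c:a", "copy",
         dst],
        check=True,
    )

nvenc_encode("gameplay.mkv", "stream_ready.mp4")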
Sources:
Quickshot_Gaming (Reddit), NVIDIA
92 Comments on NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine
You're into hardware. You follow these things. A normal gamer shouldn't have to. For him, the R9 370X and R9 380X are cards from the same generation. That's why AMD named them like that, isn't it? :)
Of course Nvidia does the same thing, but they also have good documentation, so you can find the actual specs pretty easily.
I asked @GoldenX whether his card offers a fairly basic encoding feature (lossless encoding). All he could tell me is that it offers the same features as cards built on the same GCN... Game streamers are a niche group. Streaming (we could call it "real-time encoding/decoding") is not a niche activity. Many people do it. They will still benefit from a hardware encoder/decoder.
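For what it's worth, a quick way to see which hardware encoders a machine exposes is to ask FFmpeg itself. This is only a sketch: it shows what FFmpeg was built with, not which features (such as a lossless preset) a given GPU generation actually supports, and the encoder-name filters are my own assumption.

import subprocess

def hardware_encoders():
    # Parse "ffmpeg -encoders" output and keep anything that looks like a hardware encoder.
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    tags = ("nvenc", "amf", "qsv", "vaapi")   # NVIDIA, AMD, Intel, generic Linux backends
    names = []
    for line in out.splitlines():
        parts = line.split()
        if len(parts) >= 2 and any(tag in parts[1] for tag in tags):
            names.append(parts[1])
    return names

print(hardware_encoders())   # e.g. ['h264_nvenc', 'hevc_nvenc'] depending on the build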
The same way you're working for free now, I guess. That's only for 4K encoding; 1080p encoding is done the same way.
Playing games at 4K is out of the question on this card. You are kidding, no? There is no lossless encoding in any commercial video card. You are confusing it with audio encoding.
Or... we could just let the free market sort this out. IDK.
Bit rates at crf=0 would be too high; at crf=12-15 there is no visible quality difference, especially when games are the video source.
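To put rough numbers on that, here is a Python sketch that encodes the same clip with libx264 at crf=0 (mathematically lossless) and crf=15, then compares the output sizes; "clip.mkv" is a placeholder and the size gap will vary with the source.

import os
import subprocess

def encode_crf(src, dst, crf):
    # Software (libx264) encode at the requested CRF; -an drops audio so only video size is compared.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-crf", str(crf), "-an", dst],
        check=True,
    )
    return os.path.getsize(dst)

lossless = encode_crf("clip.mkv", "clip_crf0.mkv", 0)       # crf=0: mathematically lossless
transparent = encode_crf("clip.mkv", "clip_crf15.mkv", 15)  # crf=15: visually transparent for most game footage
print(f"crf=0: {lossless / 1e6:.1f} MB, crf=15: {transparent / 1e6:.1f} MB")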
This is ridiculous.
My point is a second graphics card wouldn't help any, because the load on the graphics card is in a different place. Again, I've actually used this extensively, including trying it with a second card in the computer.
Frankly, I actually use game streaming quite a lot (both up and down) - quite likely more than many forum members. I just don't think about it very much.
In general, lossless encoding is very important - both for saving and sharing lossless content. Remember that hardware encoders aren't used just for games. NVENC, as a hardware encoder, takes the encoding load off your CPU (that's the whole point) - provided, of course, that it supports the whole process (it may be missing something).
But the computer still uses a lot of other resources just to move data around, validate etc. This is what CPUs are for.
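One rough way to see both points at once is to measure how much CPU time a software encode burns versus an NVENC encode of the same clip. This sketch assumes a Unix-like OS and an FFmpeg build with both libx264 and h264_nvenc; the clip name is a placeholder. The NVENC run should report far less CPU time, but not zero, since demuxing and data shuffling still happen on the CPU.

import os
import subprocess

def child_cpu_seconds(cmd):
    # CPU time accumulated by child processes (the ffmpeg run) during the call.
    before = os.times()
    subprocess.run(cmd, check=True)
    after = os.times()
    return (after.children_user + after.children_system) - (before.children_user + before.children_system)

sw = child_cpu_seconds(["ffmpeg", "-y", "-i", "clip.mkv", "-c:v", "libx264", "-an", "sw.mp4"])
hw = child_cpu_seconds(["ffmpeg", "-y", "-i", "clip.mkv", "-c:v", "h264_nvenc", "-an", "hw.mp4"])
print(f"libx264 CPU time: {sw:.1f} s, h264_nvenc CPU time: {hw:.1f} s")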
You know... GPUs were invented to take the gaming load off the CPU. The CPU isn't doing the actual rendering, but it's still pretty well utilized. ;-)