Wednesday, April 24th 2019

NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine

NVIDIA's GeForce GTX 1650 has a significantly watered-down multimedia feature-set compared to the other GeForce GTX 16-series GPUs. The card was launched this Tuesday (23 April) without any meaningful technical documentation for reviewers, which caused many, including us, to assume that NVIDIA had carried over the "Turing" NVENC encoder, giving you a feature-rich HTPC or streaming card at $150. Apparently that is not the case. According to the full specifications on NVIDIA's product page, which went up hours after launch, the GTX 1650 (and the TU117 silicon) features a multimedia engine carried over from the older "Volta" architecture.

Turing's NVENC is known to deliver around a 15 percent performance uplift over Volta's, which means the GTX 1650 will have worse game-livestreaming performance than expected. The GTX 1650 has sufficient muscle for playing e-sports titles such as PUBG at 1080p, and with an up-to-date hardware encoder it would have drawn droves of amateur streamers to the mainstream on Twitch and YouTube Gaming. Alas, the $220 GTX 1660 is your ticket to that.
Sources: Quickshot_Gaming (Reddit), NVIDIA

92 Comments on NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine

#76
notb
R0H1TWould you like to know the GCN version corresponding to each of these cards? If you don't, here's a good site for starters ~ GPU Database
I know VCE features correspond to GCN. But why should a consumer have to know this?
You're into hardware. You follow these things. A normal gamer shouldn't have to. For him, the R9 370X and R9 380X are cards from the same generation. That's why AMD named them like that, isn't it? :)

Of course Nvidia does the same thing, but they also have good documentation, so you can find the actual specs pretty easily.
I asked @GoldenX whether his card offers a fairly basic encoding feature (lossless encoding). All he could tell me is that it offers the same features as cards built on the same GCN version...
lexluthermiesterRight, and streamers are that niche group. I think we're agreeing, just from two different pages in the same chapter/book.
Game streamers are a niche group. Streaming (we could call it "real-time encoding/decoding") is not a niche activity. Many people do it. They will still benefit from a hardware encoder/decoder.
Posted on Reply
#77
newtekie1
Semi-Retired Folder
lexluthermiesterRight, and streamers are that niche group. I think we're agreeing, just from two different pages in the same chapter/book.
Streamers aren't using a second card on a regular basis. Using a second card is a niche use case. It also doesn't make any sense when you really consider it, which is why almost no one does it.
Posted on Reply
#78
lexluthermiester
newtekie1Streamers aren't using a second card on a regular basis. Using a second card is a niche use case. It also doesn't make any sense when you really consider it, which is why almost no one does it.
Fair point, which goes back to my point about buying a more powerful card if you're gaming and streaming at the same time.
notbGame streamers are a niche group. Streaming (we could call it "real-time encoding/decoding") is not a niche activity. Many people do it. They will still benefit from a hardware encoder/decoder.
I think you missed my point, but no worries.
Posted on Reply
#79
GoldenX
lexluthermiesterFair point, which goes back to my point about buying a more powerful card if you're gaming and streaming at the same time.
Unless you want to encode at 4K and high framerates; then you need a nice 2080 Ti AND a dedicated encoding card that may cost more than the 2080 Ti.
Posted on Reply
#80
SoNic67
stimpy88Nasty, petty minded company, with an open contempt for its own customers.
How dare they not give their product away for free?
The same way you're working for free right now, I guess.
R0H1TYes, so solid that even the youtube reviews are giving it a meh/miss at that price :rolleyes:
Also what about the up to 15% better efficiency via Turing NVENC?
That's only for 4K encoding; 1080p encoding performs the same.
Playing 4K video games is out of the question on this card.
notboffers a fairly basic encoding feature (lossless encoding)
You are kidding, right? There is no lossless encoding on any commercial video card. You are confusing it with audio encoding.
Posted on Reply
#81
lexluthermiester
SoNic67How dare they not give their product away for free?
Right? How dare they sell an inferior-performing GPU for more money...
Posted on Reply
#82
Assimilator
lexluthermiesterRight? How dare they sell an inferior performing GPU for more money..
OH NOES a company is selling a product for a price I don't agree with! Quick, call the Internet police!

Or... we could just let the free market sort this out. IDK.
Posted on Reply
#83
lexluthermiester
AssimilatorOr... we could just let the free market sort this out. IDK.
That's exactly what I was hinting at.
Posted on Reply
#84
jabbadap
SoNic67How dare they not give their product away for free?
The same way you're working for free right now, I guess.

That's only for 4K encoding; 1080p encoding performs the same.
Playing 4K video games is out of the question on this card.

You are kidding, right? There is no lossless encoding on any commercial video card. You are confusing it with audio encoding.
Uhm, there is; in ffmpeg, at least, you can encode lossless video with NVENC. And that NVENC/NVDEC support matrix lists lossless as supported on all the NVENC chips.
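A quick sketch of what that ffmpeg path looks like. This assumes an ffmpeg build with NVENC enabled and an NVENC-capable NVIDIA GPU; the exact preset names have changed across ffmpeg versions, and `input.mp4` is a placeholder:

```shell
# Lossless H.264 encode on the GPU's NVENC block via ffmpeg.
# "-preset lossless" maps to NVENC's lossless rate-control mode;
# on newer ffmpeg builds the equivalent is "-preset p7 -tune lossless".
ffmpeg -i input.mp4 -c:v h264_nvenc -preset lossless -an out_lossless.mp4

# Per-frame MD5 hashes of the decoded output, which can be compared
# against hashes of the source to confirm the encode was truly lossless.
ffmpeg -i out_lossless.mp4 -f framemd5 -
```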
Posted on Reply
#85
newtekie1
Semi-Retired Folder
GoldenXUnless you want to encode at 4K and high framerates; then you need a nice 2080 Ti AND a dedicated encoding card that may cost more than the 2080 Ti.
No you don't. Even encoding 4K doesn't really affect gaming performance when done on the same card. Again, you are still acting like the encoding happens on the same part of the GPU as game rendering, it doesn't. There is dedicated hardware built into the GPU that will encode up to 8K with next to no gaming performance loss.
Posted on Reply
#86
GoldenX
newtekie1No you don't. Even encoding 4K doesn't really affect gaming performance when done on the same card. Again, you are still acting like the encoding happens on the same part of the GPU as game rendering, it doesn't. There is dedicated hardware built into the GPU that will encode up to 8K with next to no gaming performance loss.
There is always a performance loss; you are using up a lot of bandwidth. Good luck encoding while using RTX.
Posted on Reply
#87
SoNic67
jabbadapffmpeg you can encode lossless video with nvenc.
Actually, all commercial video encoding is based on lossy compression. There might be cases in H.264 and H.265 where you could have a lossless region (or something like a still picture), but it was never meant for whole videos.
Bit rates at crf=0 would be far too high; at crf=12-15 there is no visible quality difference. Especially when the video source is a game.
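For reference, the bitrate trade-off being described can be seen with ffmpeg's CPU encoder, libx264, where crf=0 is lossless (for 8-bit input). Filenames here are placeholders, and actual sizes depend entirely on the source material:

```shell
# crf=0 is mathematically lossless with libx264 -> enormous files.
ffmpeg -i gameplay.mp4 -c:v libx264 -crf 0  -an crf00.mkv

# crf=15 is visually near-transparent at a fraction of the bitrate.
ffmpeg -i gameplay.mp4 -c:v libx264 -crf 15 -an crf15.mkv

# Compare the resulting file sizes.
ls -lh crf00.mkv crf15.mkv
```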
Posted on Reply
#88
GoldenX
notbI know VCE features correspond to GCN. But why should a consumer have to know this?
You're into hardware. You follow these things. A normal gamer shouldn't have to. For him, the R9 370X and R9 380X are cards from the same generation. That's why AMD named them like that, isn't it? :)

Of course Nvidia does the same thing, but they also have good documentation, so you can find the actual specs pretty easily.
I asked @GoldenX whether his card offers a fairly basic encoding feature (lossless encoding). All he could tell me is that it offers the same features as cards built on the same GCN version...


Game streamers are a niche group. Streaming (we could call it "real-time encoding/decoding") is not a niche activity. Many people do it. They will still benefit from a hardware encoder/decoder.
Lossless video is useless unless you want a 50TB file.
Posted on Reply
#89
notb
GoldenXLossless video is useless unless you want a 50TB file.
Lossless encoding is used by many processes and software. It doesn't mean you have to save a lossless file.
SoNic67You are kidding, right? There is no lossless encoding on any commercial video card. You are confusing it with audio encoding.
I've linked NVENC information earlier. Lossless encoding is supported.
Posted on Reply
#90
SoNic67
When your source is a game, and especially when you are posting it on the net, lossless is useless. You need a professional camera to get input quality where lossless would matter. And even then, you could only distribute it on an HDD.
This is ridiculous.
Posted on Reply
#91
newtekie1
Semi-Retired Folder
GoldenXThere is always a performance loss, you are wasting a lot of bandwidth. Good luck with encoding while using RTX.
Lower framerates don't tell us why. NVENC still uses a pretty big portion of the CPU. I'm using it to encode a 1080p video right now as I write this post, and on my 8700K it's loading the CPU to 30%.

My point is that a second graphics card wouldn't help any, because the load on the graphics card is in a different place. Again, I've actually used this extensively, including trying it with a second card in the computer.
Posted on Reply
#92
notb
SoNic67When your source is a game, and especially when you are posting it on the net, lossless is useless. You need a professional camera to get input quality where lossless would matter. And even then, you could only distribute it on an HDD.
This is ridiculous.
My knowledge about game streaming is limited, so I'd rather not focus on that.
Frankly, I actually use game streaming quite a lot (both up and down) - quite likely more than many forum members. I just don't think about it very much.

In general, lossless encoding is very important - both for saving and sharing lossless content. Remember that hardware encoders aren't used just for games.
newtekie1Lower framerates doesn't tell us why. The NVENC still uses a pretty big portion of the CPU.
NVENC, as a hardware encoder, takes the encoding load off your CPU (that's the whole point), provided it supports the whole process (it may be missing something).
But the computer still uses a lot of other resources just to move data around, validate etc. This is what CPUs are for.

You know... GPUs were invented to take gaming load off the CPU. The CPU isn't doing the actual rendering, but it's still pretty well utilized. ;-)
Posted on Reply