Sunday, April 30th 2017

NVIDIA to Support 4K Netflix on GTX 10 Series Cards

Until now, only users with Intel's latest Kaby Lake processors could enjoy 4K Netflix, due to strict DRM requirements. Now, NVIDIA is enabling 4K Netflix (and chill) on its GTX 10 series graphics cards (absent the GTX 1050, at least for now). You do have to jump through a few hoops to get there, but none of them should pose a problem.

The requirements to enable Netflix UHD playback, as per NVIDIA, are as follows (a quick way to sanity-check the GPU-side items is sketched below):
  • NVIDIA driver version exclusively provided via the Microsoft Windows Insider Program (currently 381.74).
  • No other GeForce driver will support this functionality at this time.
  • If you are not currently registered for WIP, follow this link for instructions to join: insider.windows.com/
  • NVIDIA Pascal-based GPU, GeForce GTX 1050 or greater, with a minimum of 3 GB of memory.
  • HDCP 2.2 capable monitor(s). Please see the additional section below if you are using multiple monitors and/or multiple GPUs.
  • Microsoft Edge browser or the Netflix app from the Windows Store.
  • An internet connection of approximately 25 Mbps or faster.
Single- or multi-GPU, multi-monitor configurations
In a multi-monitor configuration on a single GPU, or on multiple GPUs that are not linked together in SLI/LDA mode, 4K UHD streaming will only work if all active monitors are HDCP 2.2 capable. If any active monitor is not HDCP 2.2 capable, the quality is downgraded to FHD.
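As promised above, here is a rough, unofficial sketch for checking the GPU-side requirements (driver version, card name, VRAM) before enrolling in the Insider Program. It assumes nvidia-smi is on your PATH; the helper name and the crude Pascal name check are ours, not NVIDIA's.

```python
# Rough, unofficial sketch: read driver version, card name and VRAM via nvidia-smi.
# The 381.74 and 3 GB figures come from NVIDIA's list; the Pascal check is a
# crude name heuristic for illustration only.
import subprocess

def check_gpu_requirements(min_driver="381.74", min_vram_mib=3 * 1024):
    query = ["nvidia-smi",
             "--query-gpu=name,driver_version,memory.total",
             "--format=csv,noheader,nounits"]
    out = subprocess.run(query, capture_output=True, text=True, check=True).stdout
    for line in out.strip().splitlines():
        name, driver, vram = (field.strip() for field in line.split(","))
        driver_ok = tuple(map(int, driver.split("."))) >= tuple(map(int, min_driver.split(".")))
        vram_ok = int(vram) >= min_vram_mib
        pascal_ok = "GTX 10" in name or "TITAN X" in name  # crude Pascal heuristic
        print(f"{name}: driver {driver} ({'ok' if driver_ok else 'too old'}), "
              f"{vram} MiB VRAM ({'ok' if vram_ok else 'below 3 GB'}), "
              f"{'Pascal-class' if pascal_ok else 'not a GTX 10 series card'}")

if __name__ == "__main__":
    check_gpu_requirements()
```

The HDCP 2.2 and Edge/Windows Store requirements can't be verified this way; those are enforced by the driver and PlayReady at playback time.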
What do you think? Is this enough to win you over to the green camp? Do you use Netflix on your PC?
Sources: NVIDIA Customer Help Portal, Eteknix

59 Comments on NVIDIA to Support 4K Netflix on GTX 10 Series Cards

#26
Prima.Vera
evernessinceVideo don't use textures. They are just a series of compressed pictures encoded in a certain manner.
Yes and no. For plain playback, no, but for processing like 10-bit HDR and effects it uses a lot, plus GPU power if decoding is hardware-based. Just look at your GPU/VRAM usage in Afterburner or similar when playing 4K videos with DXVA2 enabled. ;)
DXVA2 implementations come in two variants: native and copy-back.

With native implementation, the decoded video stays in GPU memory until it has been displayed. The video decoder must be connected to the video renderer with no intermediary processing filter. The video renderer must also support DXVA, which gives less freedom in the choice of renderers.

With copy-back implementation, the decoded video is copied from GPU memory back to the CPU's memory. This implementation doesn't have the limitations mentioned above and acts similarly to a normal software decoder; however, video stuttering will occur if the GPU is not fast enough to copy its memory back to the CPU's memory.

Native mode is advantageous unless there is a need for customized processing, as the additional copy-back operations will increase GPU memory load.[7]

GPUs that should be fast enough are:
  • AMD: Radeon HD 6xxx and newer
  • Nvidia: Nvidia GeForce 500 Series and newer
  • Intel: Intel HD Graphics 2000 and newer
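A rough way to see the copy-back path in practice: ffmpeg's dxva2 hwaccel decodes on the GPU and, unless told otherwise, hands decoded frames back to system memory, which is broadly the copy-back behaviour described above. A minimal timing sketch, assuming ffmpeg is installed; the clip name is a placeholder:

```python
# Minimal timing sketch: compare plain software decoding against the dxva2
# hwaccel (GPU decode with frames copied back to system memory by default).
import subprocess
import time

def time_decode(input_file, hwaccel=None):
    cmd = ["ffmpeg", "-v", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]              # e.g. "dxva2" on Windows
    cmd += ["-i", input_file, "-f", "null", "-"]  # decode only, discard frames
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

if __name__ == "__main__":
    clip = "sample_4k_hevc.mkv"                   # placeholder clip
    print(f"software decode : {time_decode(clip):.1f} s")
    print(f"dxva2 copy-back : {time_decode(clip, 'dxva2'):.1f} s")
```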
Posted on Reply
#27
RejZoR
jmcslobI hate to say it but UHD is worth putting up with it... You're right if they did this on FHD I would cancel.
Not on PC, but under Widevine DRM they have security levels... Most devices only get 720p HD and very, very few devices get full access... Actually only high-end Sony, Samsung and Vizio TVs get full access... Other than that you have the Nvidia Shield TV, Chromecast Ultra and the Mi Box...
And under FairPlay I think only the newest Apple TV and the iPhone 6 get full UHD

Iron Fist, Marco Polo, Sense8, Travelers etc... The Netflix shows are literally better in UHD....
I hate the DRM but it's there so they can afford to make more shows in UHD thus worth it.

Edit:
Forgot my point..
I'm glad Nvidia is doing this..
The reason I went to Android TV over my HTPC was because I didn't have the option for 4K....
I'm not going to upgrade my PC for this alone since Android is obviously the better cheaper way to go these days but it's nice to have the option.
Dunno, I have a 4K TV (granted, it's just a 42-incher) and the FHD video via Netflix looks amazingly sharp. I really don't even feel the need for more. I probably would if my screen were a 65-incher or more, though...
Posted on Reply
#28
hat
Enthusiast
I dunno, well-encoded DVD rips look good on our 52-inch. 1080p looks really good... so good I resize to 720p when I do Blu-ray rips.
Posted on Reply
#29
Mussels
Freshwater Moderator
As much as I hate this kind of thing, I'm glad I have a 10x0 card and a 4K TV now (9 Mb DSL makes it useless; getting 100/40 in 4 months :P)
Posted on Reply
#30
bug
HotobuProblem is that Netflix 4k is kinda trash. I have a 4K TV and cancelled 4K service because the improvement was marginal at best.
It's not Netflix, it's physics: www.tftcentral.co.uk/articles/visual_acuity.htm
That said, you can still easily spot HDR, so...
Posted on Reply
#31
jabbadap
Prima.VeraMost likely is using the video as a constant flux of 4K textures needed to be offloaded first in the frame buffer, therefore the VRAM reqs. Probably this is how CUDA algorithms works....
Khrm, there are no CUDA algorithms in use; CUVID is deprecated. Nvidia's PureVideo decoder hardware, nowadays called NVDEC, handles video decoding. So it does not use CUDA shaders for video decoding.
Posted on Reply
#32
Unregistered
RejZoRDunno, I have a 4K TV (granted, it's just a 42-incher) and the FHD video via Netflix looks amazingly sharp. I really don't even feel the need for more. I probably would if my screen were a 65-incher or more, though...
Just being honest...
I very much doubt you can see the difference between 720p, 1080p and 4K resolutions from 10' away... You do, however, notice those nice features that come with a newer 4K TV, such as HDR and the better color that comes with a full color gamut and full-array local dimming zones.

So far I've only seen HDR10 in action, which is the lowest of the HDR standards, and it's fantastic...

I originally watched Sense8 and Marco Polo in regular 4K and then again in 4K HDR, and yes, there is a dramatic difference...
In my honest opinion 4K is utterly useless unless it comes along with HDR and you have the ability to stream at 30 Mb/s...
I know it says 25 Mb/s but it actually doesn't do a constant 25 Mb/s... It goes back and forth from 18 Mb/s to 30 Mb/s...

And yes, Netflix 4K UHD is a bit better than Vudu, Amazon and Google Play Movies.
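For reference, the data use behind those bitrates is easy to work out; a quick back-of-the-envelope calculation using the figures quoted above:

```python
# Back-of-the-envelope data use for the bitrates quoted above (Netflix says
# ~25 Mb/s for UHD; the stream reportedly swings between roughly 18 and 30 Mb/s).
def gb_per_hour(mbps):
    return mbps * 3600 / 8 / 1000   # megabits per second -> gigabytes per hour

for rate in (18, 25, 30):
    print(f"{rate} Mb/s -> {gb_per_hour(rate):.1f} GB per hour")
# 18 Mb/s -> 8.1 GB, 25 Mb/s -> 11.2 GB, 30 Mb/s -> 13.5 GB per hour
```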
#33
RejZoR
Then again, why not also provide 1080p with HDR? While broadband is widely used these days, there are still people who can't get ridiculous speeds. I was one of those till recently, only being able to get 20/10 max. Watching HDTV channels already proved problematic because it was eating into my connection too much, making gaming impossible while someone watched TV. Luckily, they've unlocked 100/10 now.

Like you said, I'd appreciate better colors, but I wouldn't really want 4K to waste my bandwidth for no real reason. Then again, my LCD isn't HDR yet, so that's not much of an issue. It does have micro dimming, so black stuff is really pitch black, which is nice.
Posted on Reply
#35
yogurt_21
It seems like a lot of hoops and hardware for something that came by default with my $400 television... 4K Amazon as well...
Posted on Reply
#36
bug
Franzen4RealThanks for sharing that article, very interesting!
That was relevant to the topic, but most (all?) articles over there are well worth a read. Not to mention their reviews, but they only do like one monitor every month or so.
Posted on Reply
#37
danbert2000
Meh, this would have been useful to me at some point, but I bought an Xbox One S for the UHD Blu-ray playback and now I use the Xbox for all streaming. It does a good job, and it's nice to be able to use a remote with a pause button and remote-designed menus. The only issue with Netflix on the Xbox is that the current Netflix app forces HDR10 mode even when playing 1080p content. I don't think there's any degradation happening by forcing 8-bit content to be upsampled to 10-bit, but before the latest firmware updates on my Vizio P50-C1 4K TV, the transition to HDR mode was long and fraught with weird bugs.

I'll probably try this out on my computer a couple times and then go back to using the Xbox.
RejZoRThen again, why not also provide 1080p with HDR? While broadband is widely used these days, there are still people who can't get ridiculous speeds. I was one of those till recently, only being able to get 20/10 max.
HDR does work at 1080p. Your TV must support HDR, which means it's going to be a 4K TV anyway, but if network conditions drop below the 4K range, the stream will be 1080p HDR or even as low as 480p HDR. Netflix will send you any number of different bitrates and resolutions; the issue is that someone with a 1080p TV, or a 4K TV with an old Chromecast or Apple TV, won't have the hardware in place to decode HDR HEVC streams and make any use of the extra dynamic range or color gamut.
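To illustrate that fallback behaviour, here is a toy sketch of a player picking the highest rung of a bitrate ladder that fits the measured throughput, so an HDR session degrades to 1080p (or lower) HDR instead of dropping HDR entirely. The ladder values are invented for illustration, not Netflix's actual encode rungs.

```python
# Toy adaptive-streaming sketch: choose the best rung that fits current bandwidth.
LADDER = [            # (label, required Mb/s), best first; values are made up
    ("2160p HDR", 25.0),
    ("1080p HDR", 10.0),
    ("720p HDR", 5.0),
    ("480p HDR", 2.0),
]

def pick_rung(measured_mbps, headroom=0.8):
    usable = measured_mbps * headroom       # keep margin for throughput swings
    for label, needed in LADDER:
        if usable >= needed:
            return label
    return LADDER[-1][0]                    # worst case: lowest rung

for bandwidth in (35, 12, 4):
    print(f"{bandwidth} Mb/s measured -> {pick_rung(bandwidth)}")
```

A real client re-evaluates this continuously, segment by segment, rather than once per session.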
Posted on Reply
#38
Dippyskoodlez
Also of note: eGPUs like the Razer Core count as a relay and disable this.

Another fun fact: Optimus allows 4K playback via Kaby Lake while the discrete NVIDIA GPU accelerates another application, provided you can meet the HDCP compliance to play at all.
Posted on Reply
#39
TheoneandonlyMrK
Prima.VeraMost likely is using the video as a constant flux of 4K textures needed to be offloaded first in the frame buffer, therefore the VRAM reqs. Probably this is how CUDA algorithms works....
I don't think my LG telly has a 3 GB buffer or an Nvidia GPU in it, and I have definitely played 4K on it; Interstellar was pretty good. Not on Netflix, though I noticed I could.
Posted on Reply
#40
Hotobu
bugIt's not Netflix, it's physics: www.tftcentral.co.uk/articles/visual_acuity.htm
That said, you can still easily spot HDR, so...
The problem is a bad codec/poor compression. When I can easily see a dramatic improvement looking at YouTube/Amazon 4K vs. 1080p, and when a 1080p Blu-ray makes Netflix 4K look pedestrian, then it's a problem. It shouldn't take HDR to bring it up to standard.
Posted on Reply
#41
Mussels
Freshwater Moderator
My guess is that 4K Netflix needs specific hardware decoding support (H.265/HEVC?), so to ensure it's working properly they're just using a whitelist of certified devices and charging for support.
Posted on Reply
#42
jabbadap
MusselsMy guess is that 4K Netflix needs specific hardware decoding support (H.265/HEVC?), so to ensure it's working properly they're just using a whitelist of certified devices and charging for support.
It's using Micro$aft DRM called PlayReady 3.0; it has little to do with decoding support. All you need from the decoder is HEVC 10-bit (Nvidia since GM206, AMD since Polaris) and of course an HDCP 2.2 capable monitor (and no other monitors connected). And yeah, Windows 10 only...
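Taking those claims at face value, the gating logic boils down to something like the simplified lookup below. The table only mirrors the GPU generations named in this thread and is not exhaustive; PlayReady 3.0 / Windows 10 handling is left to the OS and driver.

```python
# Simplified gate based on the claims in this thread: fixed-function HEVC Main10
# decode arrived with NVIDIA's GM206 (GTX 960/950) and AMD's Polaris.
HEVC_MAIN10_DECODE = {
    "Kepler": False,
    "Maxwell (GM204)": False,
    "Maxwell (GM206)": True,
    "Pascal": True,
    "Polaris": True,
    "Vega": True,
}

def can_stream_uhd(architecture, hdcp22_on_all_monitors):
    decoder_ok = HEVC_MAIN10_DECODE.get(architecture, False)
    return decoder_ok and hdcp22_on_all_monitors

print(can_stream_uhd("Pascal", hdcp22_on_all_monitors=True))           # True
print(can_stream_uhd("Maxwell (GM204)", hdcp22_on_all_monitors=True))  # False
```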
Posted on Reply
#43
bug
jabbadapIt's using Micro$aft DRM called PlayReady 3.0; it has little to do with decoding support. All you need from the decoder is HEVC 10-bit (Nvidia since GM206, AMD since Polaris) and of course an HDCP 2.2 capable monitor (and no other monitors connected). And yeah, Windows 10 only...
Actually, I don't need anything. It's Netflix (and Hollywood) that need to sell me their merchandise ;)
Posted on Reply
#44
Prima.Vera
RejZoRThen again, why not also provide 1080p with HDR?
That's an interesting question. Maybe because there are no HD TVs out there with 10-bit color space and HDR?
I think this is a very new thing for 4K TVs only, just to justify their still-steep prices.
Posted on Reply
#45
Mussels
Freshwater Moderator
Prima.VeraThat's an interesting question. Maybe because there are no HD TVs out there with 10-bit color space and HDR?
I think this is a very new thing for 4K TVs only, just to justify their still-steep prices.
My 2011 Sony did 12-bit at 1080p, full RGB.

It might not have the full contrast range, but as far as the color input/HDMI support goes, it's definitely been around for a while.
Posted on Reply
#46
Aquinus
Resident Wat-man
Prima.VeraMaybe because there are no HD TVs out there with 10-bit color space and HDR?
MusselsMy 2011 Sony did 12-bit at 1080p, full RGB.

It might not have the full contrast range, but as far as the color input/HDMI support goes, it's definitely been around for a while.
My late plasma, which I bought in 2010, was able to do at least 10-bit color. HDR hasn't been around nearly as long as deep color on consumer products.
Posted on Reply
#47
test1test2
Hate to break it to you, but Nvidia is screwing you. You're not getting 10-bit from your gaming cards; you have to be using a Quadro. For Adobe you have to enable 30-bit display in the preferences and advanced settings, and the control panel driver setting has to be set to 10-bit. Then there's the fact that your 10-bit monitor is probably 8-bit+FRC, which is fake.

I wonder how they'll handle this scam they've been running when people try to play HDR content on the $2,000 HDR monitors that go on sale this month. On the desktop your gaming cards only do 8-bit; they only do 10-bit in DX games.

EDIT: And as for the poster above with the 12-bit TV, I highly doubt it. 12-bit panels today cost over $40,000 and hardly any studio has them, plus until recently there was no software that even let you edit video/photos in 12-bit, nor video cards besides the Quadro. CS6 only recently added support for 30-bit displays and 10-bit color editing. You can find a good article on Samsung, I believe, lying about their panel bit depth being 10 when it was really 8. They didn't know. And both Sony and Samsung refuse to tell customers the specs on their panels.

Dell and ASUS or Acer are only now, this month and over the summer, coming out with their HDR 4K PC monitors. 1,000 nits and quantum dot (not the Dell), 100% Adobe RGB and 95-97.7% DCI-P3. That is off the charts for monitors and better than anything the pros have. They should be TRUE 10-bit panels, not that FRC crap I mentioned above. And again, you need a Quadro card for 10-bit output.
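For context on the 8-bit vs 10-bit vs 12-bit back-and-forth, the raw numbers are simple to compute; an 8-bit+FRC panel temporally dithers an 8-bit panel to approximate the 10-bit row below.

```python
# Levels per channel and total representable colours for common bit depths.
for bits in (8, 10, 12):
    levels = 2 ** bits
    colours = levels ** 3
    print(f"{bits:2d}-bit: {levels:5d} levels/channel, {colours:,} colours")
# 8-bit:   256 levels/channel,     16,777,216 colours (~16.8 million)
# 10-bit: 1024 levels/channel,  1,073,741,824 colours (~1.07 billion)
# 12-bit: 4096 levels/channel, 68,719,476,736 colours (~68.7 billion)
```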
Posted on Reply
#49
jabbadap
test1test2I will edit my 1st post with this URL. I don't like the damn software that says I can't edit my own post too fast and calls me a spammer. This explains Nvidia and the Quadro and 30-bit.
www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf
That file is from 2009; you should update your reference. If my memory serves me right, Nvidia supports 10-bit/12-bit output for movies and full-screen gaming nowadays. Professional programs, which still use OpenGL 10-bit, are Quadro only.
Posted on Reply
#50
bug
test1test2Hate to break it to you, but Nvidia is screwing you. You're not getting 10-bit from your gaming cards; you have to be using a Quadro. For Adobe you have to enable 30-bit display in the preferences and advanced settings, and the control panel driver setting has to be set to 10-bit. Then there's the fact that your 10-bit monitor is probably 8-bit+FRC, which is fake.

I wonder how they'll handle this scam they've been running when people try to play HDR content on the $2,000 HDR monitors that go on sale this month. On the desktop your gaming cards only do 8-bit; they only do 10-bit in DX games.
Can you make up your mind?
The only thing that requires a Quadro is 10-bit for OpenGL (which kinda sucks for Linux, but Linux has more pressing issues anyway).
And yes, 10-bit requires an end-to-end solution to work; that's in no way particular to Nvidia.
Posted on Reply