
NVIDIA to Support 4K Netflix on GTX 10 Series Cards

Raevenlord

News Editor
Up to now, only users with Intel's latest Kaby Lake architecture processors could enjoy 4K Netflix due to some strict DRM requirements. Now, NVIDIA is taking it upon itself to allow users with one of its GTX 10 series graphics cards (absent the 1050, at least for now) to enjoy some 4K Netflix and chillin'. Though you really do seem to have to jump through some hoops to get there, none of them should pose a problem.

The requirements to enable Netflix UHD playback, as per NVIDIA, are as follows (a quick self-check sketch follows the list):
  • NVIDIA Driver version exclusively provided via Microsoft Windows Insider Program (currently 381.74).
  • No other GeForce driver will support this functionality at this time
  • If you are not currently registered for WIP, follow this link for instructions to join: https://insider.windows.com/
  • NVIDIA Pascal based GPU, GeForce GTX 1050 or greater with minimum 3GB memory
  • HDCP 2.2 capable monitor(s). Please see the additional section below if you are using multiple monitors and/or multiple GPUs.
  • Microsoft Edge browser or Netflix app from the Windows Store
  • Approximately 25Mbps (or faster) internet connection.
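
For anyone who wants a rough self-check before jumping through those hoops, here's a minimal sketch (Python, assuming nvidia-smi is on the PATH) that reads the GPU name, driver version and VRAM and compares them against the list above. It's purely illustrative; it can't verify the HDCP 2.2 chain, the Edge/Store app side, or your actual connection speed.

```python
# Minimal sketch: compare the local GPU against NVIDIA's listed requirements.
# Assumes nvidia-smi is installed and on the PATH; everything else (HDCP 2.2,
# Edge/Netflix app, Insider enrolment) still has to be checked by hand.
import subprocess

def gpu_info():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=name,driver_version,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    # First GPU only; multi-GPU boxes would get one line per adapter.
    name, driver, mem_mib = [f.strip() for f in out.splitlines()[0].split(",")]
    return name, driver, int(mem_mib)

name, driver, mem_mib = gpu_info()
print(f"GPU: {name}, driver {driver}, {mem_mib} MiB VRAM")
print("Pascal GTX 10 series? ", "GTX 10" in name)           # crude name check
print("At least 3 GB VRAM?   ", mem_mib >= 3 * 1024)
print("Insider driver 381.74?", driver.startswith("381.74"))
```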





Single or multi GPU, multi monitor configuration
In the case of a multi-monitor configuration on a single GPU, or on multiple GPUs that are not linked together in SLI/LDA mode, 4K UHD streaming will happen only if all of the active monitors are HDCP 2.2 capable. If any of the active monitors is not HDCP 2.2 capable, the quality will be downgraded to FHD.
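
Put another way: one non-compliant display caps the whole session. A trivial sketch of the stated rule (the monitor lists here are hypothetical, just to make the behaviour concrete):

```python
def playback_quality(monitors_hdcp22):
    """NVIDIA's stated rule: 4K UHD only if every active monitor is HDCP 2.2 capable."""
    return "4K UHD" if monitors_hdcp22 and all(monitors_hdcp22) else "FHD"

# Hypothetical setups: an HDCP 2.2 TV alone vs. the same TV plus an older second monitor.
print(playback_quality([True]))         # -> 4K UHD
print(playback_quality([True, False]))  # -> FHD (the whole desktop is downgraded)
```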



What do you think? Is this enough to bring you over to the green camp? Do you use Netflix on your PC?

View at TechPowerUp Main Site
 
I'm not a fan of any manufacturer bending over to Hollywood's requests.

At the same time, I'm really curious about what's going on here, because Intel has claimed we need Kaby Lake because of some hardware feature absent from other platforms, yet Nvidia seems to be implementing this in software. Or maybe there's some hardware in Pascal GPUs that is now being taken advantage of?
 
Do they really need more Beta testers that badly?
 
I'm not a fan of any manufacturer bending over to Hollywood's requests.

At the same time, I'm really curious about what's going on here, because Intel has claimed we need Kaby Lake because of some hardware feature absent from other platforms, yet Nvidia seems to be implementing this in software. Or maybe there's some hardware in Pascal GPUs that is now being taken advantage of?
It's a completely artificial limitation; there's no reason a Core 2 Duo and a GTX 660 couldn't run 4K Netflix.
Edit: originally said 8600GT, but that can't output 4K, derp. My point still stands :)
 
Last edited:
A Xiaomi MiBox costs $69 and is the cheapest device that does Netflix 4K. It does it with a Mali-450 GPU.

It doesn't take much to do 4K as long as you have HDMI 2.0 with HDCP 2.2... the problem is getting Widevine certified... Pretty sure NVIDIA would have to submit for permission every single time it updated its drivers...
 
I'm not a fan of any manufacturer bending over to Hollywood's requests.

At the same time, I'm really curious about what's going on here, because Intel has claimed we need Kaby Lake because of some hardware feature absent from other platforms, yet Nvidia seems to be implementing this in software. Or maybe there's some hardware in Pascal GPUs that is now being taken advantage of?

It's a completely artificial limitation; there's no reason a Core 2 Duo and a GTX 660 couldn't run 4K Netflix.
Edit: originally said 8600GT, but that can't output 4K, derp. My point still stands :)
The thing is that Netflix wants hardware accelerated 10-bit 4K HEVC decoding, which only Pascal and Kaby Lake support.

Polaris is also capable, but is not yet included in PlayReady 3.0 (the DRM).
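
If you're curious whether your own box even exposes a hardware HEVC decoder (the capability being gated on, separate from the PlayReady/HDCP side), one rough way is to ask FFmpeg what it was built with. This is just an availability check, assuming an ffmpeg binary is installed; it says nothing about Main10 profiles or the DRM path:

```python
# Rough check of the hardware-decode plumbing via FFmpeg (assumes ffmpeg is installed).
# "hevc_cuvid" is FFmpeg's NVDEC-backed HEVC decoder; its presence only means the
# build exposes it, not that Netflix's PlayReady path will actually use it.
import subprocess

def ffmpeg_lines(*args):
    out = subprocess.check_output(
        ["ffmpeg", "-hide_banner", *args],
        text=True, stderr=subprocess.STDOUT,
    )
    return out.splitlines()

# Hardware acceleration back-ends (cuda, dxva2, d3d11va, ...); first line is a header.
hwaccels = [l.strip() for l in ffmpeg_lines("-hwaccels")[1:] if l.strip()]
hevc_decoders = [l.strip() for l in ffmpeg_lines("-decoders") if "hevc" in l]

print("hwaccels:", ", ".join(hwaccels))
print("HEVC decoders:")
for line in hevc_decoders:
    print(" ", line)
```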
 
I'm not normally one for conspiracy theories, but it seems like money might have changed hands from intel/nv to netflix for that "requirement" to be established. Software decoding would have worked fine imo.
 
There's no reason devices older than Kaby Lake or GTX10 series can't do 4k hevc decode.

I really hate HDCP.
 
I have to wonder if nVidia will eventually extend this to UHD 4K Blu-Ray?
 
Stagnation: the business plan
 
Is there a way to tell what version of HDCP you have? I see in NVIDIA's control panel that I do have HDCP through my GPU and monitor, but it doesn't say what version.

Is it provided by the GPU? Do I have the latest from the GPU, or does the monitor determine what version it runs at?
 
It is like they are deliberately trying to make it difficult for people to pay for their product.

Maybe Netflix's DRM team has been infiltrated by Piratebay?
 
Playready, Fairplay and Widevine...
Too many DRM services.... By law they should only be allowed one single standard...

I don't care how many platforms there are... 1 standard.
 
Good thing it's only for UHD, if anything. If they enforced this shit on FHD and HD streams, I'd cancel my subscription to Netflix. Though I still think DRM is garbage and shouldn't exist. Anywhere. It's always an annoyance for legit users, while pirates simply enjoy their content anywhere they want.
 
The thing is that Netflix wants hardware accelerated 10-bit 4K HEVC decoding, which only Pascal and Kaby Lake support.

Polaris is also capable, but is not yet included in PlayReady 3.0 (the DRM).

If my memory is not deceiving me, GM206 should have HEVC 10-bit decoding capabilities too (plus HDMI 2.0 and HDCP 2.2). I wonder if NVIDIA will update their driver later to work with those too.

That 3 GB memory minimum is of what, exactly: VRAM or main memory?
 
A Xiaomi MiBox costs $69 and is the cheapest device that does Netflix 4K. It does it with a Mali-450 GPU.

It doesn't take much to do 4K as long as you have HDMI 2.0 with HDCP 2.2... the problem is getting Widevine certified... Pretty sure NVIDIA would have to submit for permission every single time it updated its drivers...

It's only gaming at 4K, 5K, and 8K that brings GPUs to their knees.

My Dell Dimension XPS Gen 1 laptop supported 1920x1080 before it was popular. It ran movies just fine at that resolution; it was only in games that it had to be toned back, and that was with a Gallatin-core P4 and an ATi Radeon 9800 256 MB.
 
Good thing it's only for UHD, if anything. If they enforced this shit on FHD and HD streams, I'd cancel my subscription to Netflix. Though I still think DRM is garbage and shouldn't exist. Anywhere. It's always an annoyance for legit users, while pirates simply enjoy their content anywhere they want.
I hate to say it, but UHD is worth putting up with it for... You're right, if they did this on FHD I would cancel.
Not on PC, but with the Widevine DRM they have security levels... Most devices only get 720p HD and very, very few devices get full access... Actually, only high-end Sony, Samsung and Vizio TVs get full access. Other than that you have the NVIDIA Shield TV, Chromecast Ultra and the MiBox.
And under FairPlay I think only the newest Apple TV and the iPhone 6 get full UHD.

Iron Fist, Marco Polo, Sense8, Travelers, etc... The Netflix shows are literally better in UHD...
I hate the DRM, but it's there so they can afford to make more shows in UHD, thus worth it.

Edit:
Forgot my point..
I'm glad Nvidia is doing this..
The reason I went to Android TV over my HTPC was because I didn't have the option for 4K....
I'm not going to upgrade my PC for this alone, since Android is obviously the better, cheaper way to go these days, but it's nice to have the option.
 
Last edited by a moderator:
It's a completely artificial limitation; there's no reason a Core 2 Duo and a GTX 660 couldn't run 4K Netflix.
Edit: originally said 8600GT, but that can't output 4K, derp. My point still stands :)
I'm not normally one for conspiracy theories, but it seems like money might have changed hands from intel/nv to netflix for that "requirement" to be established. Software decoding would have worked fine imo.
This 100% has the stench of Hollywood all over it, forcing this onto them. It doesn't benefit Netflix to put these restrictions in place unless they have no choice.
 
Hate to say it, but no... Netflix is pushing the DRM... It's not Hollywood...
Almost everything 4K on Netflix is owned by Netflix.
 
Problem is that Netflix 4k is kinda trash. I have a 4K TV and cancelled 4K service because the improvement was marginal at best.
 
My ARM-based TV is more than capable of playing back UHD content from Netflix, Amazon, and YouTube.
Problem is that Netflix 4k is kinda trash. I have a 4K TV and cancelled 4K service because the improvement was marginal at best.
I've found Amazon's 4k content to be pretty good in terms of quality over 1080p. I agree though that I haven't found Netflix to really offer any real tangible improvement with 4k. YouTube and Amazon seem to be the only half-decent options for 4k content, which is a little disheartening.
 
Bah, the BS some were complaining about with this HDMI crap, yes I am on about HDCP 2.2. Let's face it, it's costing people a fortune, and just to get 4K.
 
Problem is that Netflix 4k is kinda trash. I have a 4K TV and cancelled 4K service because the improvement was marginal at best.
I beg to differ...
Just 4K... eh, I kind of see what you mean, but I can tell you that 4K HDR is very much worth it...
Iron Fist... the ground pound scene is completely different in HDR... The fireworks scene in Sense8... All those fantastic night fight scenes in Marco Polo... All way better in HDR... Completely worth the extra cost.
 
3GB vram minimum to watch a movie? What kind of artificial limitation is that, lol.
 
3GB vram minimum to watch a movie? What kind of artificial limitation is that, lol.
Most likely it's treating the video as a constant flux of 4K textures that need to be loaded into the frame buffer first, hence the VRAM requirement. That's probably how the CUDA algorithms work...
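
Some back-of-the-envelope numbers, for what it's worth: a decoded 4K 10-bit 4:2:0 frame (P010-style, roughly 3 bytes per pixel) is about 25 MB, and the decoder keeps a pool of reference frames plus presentation buffers around. The buffer counts below are assumptions for illustration only, not anything NVIDIA has documented:

```python
# Rough VRAM estimate for buffered 4K 10-bit HEVC frames (illustrative assumptions only).
width, height = 3840, 2160
bytes_per_pixel = 3           # P010-ish: 2 bytes luma + ~1 byte chroma per pixel (4:2:0)
frame_mb = width * height * bytes_per_pixel / 1e6

reference_frames = 8          # assumed decoded-picture-buffer depth
presentation_buffers = 4      # assumed swap-chain / post-processing buffers

total_mb = frame_mb * (reference_frames + presentation_buffers)
print(f"one decoded frame: {frame_mb:.1f} MB")
print(f"~{reference_frames + presentation_buffers} buffered frames: {total_mb:.0f} MB")
# ~25 MB per frame, ~300 MB buffered: well under 3 GB, so the stated minimum
# looks like a conservative line rather than a hard decode requirement.
```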
 