Sunday, April 30th 2017

NVIDIA to Support 4K Netflix on GTX 10 Series Cards

Up to now, only users with Intel's latest Kaby Lake processors could enjoy 4K Netflix, due to some strict DRM requirements. Now, NVIDIA is taking it upon itself to allow users with one of its GTX 10 series graphics cards (absent the 1050, at least for now) to enjoy some 4K Netflix and chillin'. You do have to jump through a few hoops to get there, but none of them should pose much of a problem.

The requirements to enable Netflix UHD playback, as per NVIDIA, are as follows:
  • NVIDIA driver version exclusively provided via the Microsoft Windows Insider Program (currently 381.74).
  • No other GeForce driver will support this functionality at this time.
  • If you are not currently registered for WIP, follow this link for instructions to join: insider.windows.com/
  • NVIDIA Pascal-based GPU, GeForce GTX 1050 or greater, with a minimum of 3 GB memory.
  • HDCP 2.2 capable monitor(s). Please see the additional section below if you are using multiple monitors and/or multiple GPUs.
  • Microsoft Edge browser or the Netflix app from the Windows Store.
  • Approximately 25 Mbps (or faster) internet connection.
Single- or multi-GPU, multi-monitor configurations
In a multi-monitor configuration on a single GPU, or across multiple GPUs that are not linked together in SLI/LDA mode, 4K UHD streaming will only occur if all active monitors are HDCP 2.2 capable. If any active monitor is not HDCP 2.2 capable, the quality will be downgraded to FHD.
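To make the checklist concrete, here is a minimal sketch of the gating logic described above (an editorial illustration only; the function, parameters, and structure are hypothetical, and the real checks live inside the driver, browser, and PlayReady DRM stack):

```python
# Hypothetical sketch restating NVIDIA's listed requirements as code.
# This is NOT how the driver or PlayReady actually exposes these checks.

def netflix_playback_tier(driver_version, gpu_family, vram_gb,
                          monitors_hdcp22, bandwidth_mbps, client):
    """Return the stream tier a setup would get under the rules listed above."""
    uhd_ok = (
        driver_version >= (381, 74)                  # Windows Insider driver 381.74+
        and gpu_family == "Pascal"                   # GTX 10 series GPU
        and vram_gb >= 3                             # minimum 3 GB of memory
        and all(monitors_hdcp22)                     # EVERY active display HDCP 2.2
        and bandwidth_mbps >= 25                     # ~25 Mbps connection
        and client in ("Edge", "Netflix Store app")  # supported playback clients
    )
    # Per the multi-monitor note: one non-HDCP 2.2 display drops quality to FHD.
    return "4K UHD" if uhd_ok else "FHD"

# Example: a GTX 1070 with an HDCP 1.4 secondary monitor gets downgraded to FHD.
print(netflix_playback_tier((381, 74), "Pascal", 8, [True, False], 100, "Edge"))
```

The key detail is the all() over active monitors: a single non-compliant display in the chain is enough to downgrade the whole stream.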
What do you think? Is this enough to tip you over to the green camp? Do you use Netflix on your PC?
Sources: NVIDIA Customer Help Portal, Eteknix

59 Comments on NVIDIA to Support 4K Netflix on GTX 10 Series Cards

#1
bug
I'm not a fan of any manufacturer bending over to Hollywood's requests.

At the same time, I'm really curious about what's going on here, because Intel has claimed we need Kaby Lake because of some hardware feature absent from other platforms, yet Nvidia seems to be implementing this in software. Or maybe there's some hardware in Pascal GPUs that is now being taken advantage of?
Posted on Reply
#2
Caring1
Do they really need more Beta testers that badly?
Posted on Reply
#3
infrared
bug: I'm not a fan of any manufacturer bending over to Hollywood's requests.

At the same time, I'm really curious about what's going on here, because Intel has claimed we need Kaby Lake because of some hardware feature absent from other platforms, yet Nvidia seems to be implementing this in software. Or maybe there's some hardware in Pascal GPUs that is now being taken advantage of?
It's a completely artificial limitation; there's no reason a Core 2 Duo and 8600GT GTX 660 couldn't run 4K Netflix.
Edit: the 8600GT can't output 4K, derp. My point still stands :)
Posted on Reply
#4
Unregistered
A Xiaomi Mi Box costs $69 and is the cheapest device that does Netflix 4K... It does it with a Mali-450 GPU.

It doesn't take much to do 4K as long as you have HDMI 2.0 with HDCP 2.2... the problem is getting Widevine certified... Pretty sure NVIDIA would have to submit for permission every single time it updated its drivers...
Posted on Edit | Reply
#5
Nokiron
bug: I'm not a fan of any manufacturer bending over to Hollywood's requests.

At the same time, I'm really curious about what's going on here, because Intel has claimed we need Kaby Lake because of some hardware feature absent from other platforms, yet Nvidia seems to be implementing this in software. Or maybe there's some hardware in Pascal GPUs that is now being taken advantage of?
infrared: It's a completely artificial limitation; there's no reason a Core 2 Duo and 8600GT GTX 660 couldn't run 4K Netflix.
Edit: the 8600GT can't output 4K, derp. My point still stands :)
The thing is that Netflix wants hardware-accelerated 10-bit 4K HEVC decoding, which only Pascal and Kaby Lake support.

Polaris is also capable, but it is not included in PlayReady 3.0 (the DRM) yet.
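(Editorial note: restating the compatibility picture from this comment as data may help; the entries below simply mirror the claims above and are not an official support list.)

```python
# Sketch of the support matrix described in the comment: UHD Netflix needs both
# hardware HEVC Main10 (10-bit) decode AND inclusion in PlayReady 3.0.
# Values restate the comment's claims only; not an official compatibility list.

SUPPORT = {
    #  platform          (hevc_main10_decode, playready_3_0)
    "Kaby Lake":         (True,  True),
    "Pascal (GTX 10)":   (True,  True),
    "Polaris":           (True,  False),   # capable hardware, DRM not enabled yet
    "Older CPUs/GPUs":   (False, False),
}

for platform, (hevc10, playready3) in SUPPORT.items():
    tier = "UHD possible" if (hevc10 and playready3) else "no UHD"
    print(f"{platform}: {tier}")
```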
Posted on Reply
#6
infrared
I'm not normally one for conspiracy theories, but it seems like money might have changed hands from Intel/NVIDIA to Netflix for that "requirement" to be established. Software decoding would have worked fine, IMO.
Posted on Reply
#7
hat
Enthusiast
There's no reason devices older than Kaby Lake or GTX10 series can't do 4k hevc decode.

I really hate HDCP.
Posted on Reply
#8
Octavean
I have to wonder if NVIDIA will eventually extend this to UHD 4K Blu-ray.
Posted on Reply
#9
Vayra86
Stagnation: the business plan
Posted on Reply
#10
LightningJR
Is there a way to tell what version of HDCP you have? I see in NVIDIA's control panel that I do have HDCP through my GPU and monitor, but it doesn't say what version.

Is it provided by the GPU? Do I have the latest from the GPU, or does the monitor determine what version it runs at?
Posted on Reply
#11
Beastie
It is like they are deliberately trying to make it difficult for people to pay for their product.

Maybe Netflix's DRM team has been infiltrated by Piratebay?
Posted on Reply
#12
Unregistered
PlayReady, FairPlay and Widevine...
Too many DRM services... By law they should only be allowed one single standard...

I don't care how many platforms there are... 1 standard.
Posted on Edit | Reply
#13
RejZoR
Good thing it's only for UHD, if anything. If they enforced this shit on FHD and HD streams, I'd cancel my Netflix subscription. Though I still think DRM is garbage and shouldn't exist. Anywhere. It's always an annoyance for legit users, while pirates simply enjoy their content anywhere they want.
Posted on Reply
#14
jabbadap
Nokiron: The thing is that Netflix wants hardware-accelerated 10-bit 4K HEVC decoding, which only Pascal and Kaby Lake support.

Polaris is also capable, but it is not included in PlayReady 3.0 (the DRM) yet.
If my memory is not deceiving me, GM206 should have HEVC 10-bit decoding capabilities too (plus HDMI 2.0 and HDCP 2.2). I wonder if NVIDIA will update their driver later to work with those as well.

That 3 GB memory minimum is of what, exactly: VRAM or main memory?
Posted on Reply
#15
eidairaman1
The Exiled Airman
jmcslob: A Xiaomi Mi Box costs $69 and is the cheapest device that does Netflix 4K... It does it with a Mali-450 GPU.

It doesn't take much to do 4K as long as you have HDMI 2.0 with HDCP 2.2... the problem is getting Widevine certified... Pretty sure NVIDIA would have to submit for permission every single time it updated its drivers...
It's only gaming at 4K, 5K, or 8K that brings GPUs to their knees.

My Dell Dimension XPS Gen 1 laptop supported 1920x1080 before it was popular. It ran movies just fine at that resolution; it was only games that had to be toned back. That was with a Gallatin-core P4 and an ATi R9800 256MB.
Posted on Reply
#16
Unregistered
RejZoR: Good thing it's only for UHD, if anything. If they enforced this shit on FHD and HD streams, I'd cancel my Netflix subscription. Though I still think DRM is garbage and shouldn't exist. Anywhere. It's always an annoyance for legit users, while pirates simply enjoy their content anywhere they want.
I hate to say it, but UHD is worth putting up with it... You're right, though; if they did this on FHD I would cancel.
Not on PC, but under the Widevine DRM they have security levels... Most devices only get 720p HD, and very, very few devices get full access... Actually, only high-end Sony, Samsung and Vizio TVs get full access. Other than that you have the NVIDIA Shield TV, Chromecast Ultra and the Mi Box.
And under FairPlay I think only the newest Apple TV and the iPhone 6 get full UHD.

Iron Fist, Marco Polo, Sense8, Travelers, etc... the Netflix shows are literally better in UHD...
I hate the DRM, but it's there so they can afford to make more shows in UHD, so it's worth it.

Edit:
Forgot my point...
I'm glad NVIDIA is doing this.
The reason I went to Android TV over my HTPC was that I didn't have the option for 4K.
I'm not going to upgrade my PC for this alone, since Android is obviously the better, cheaper way to go these days, but it's nice to have the option.
Posted on Edit | Reply
#17
arbiter
infrared: It's a completely artificial limitation; there's no reason a Core 2 Duo and 8600GT GTX 660 couldn't run 4K Netflix.
Edit: the 8600GT can't output 4K, derp. My point still stands :)
infrared: I'm not normally one for conspiracy theories, but it seems like money might have changed hands from Intel/NVIDIA to Netflix for that "requirement" to be established. Software decoding would have worked fine, IMO.
This 100% has the stench of Hollywood all over it, forcing this onto them. It doesn't benefit Netflix to put these restrictions in place unless they have no choice.
Posted on Reply
#18
Unregistered
Hate to say it, but no... Netflix is pushing the DRM... It's not Hollywood...
Almost everything 4K on Netflix is owned by Netflix.
Posted on Edit | Reply
#19
Hotobu
Problem is that Netflix 4K is kinda trash. I have a 4K TV and cancelled the 4K service because the improvement was marginal at best.
Posted on Reply
#20
Aquinus
Resident Wat-man
My ARM-based TV is more than capable of playing back UHD content from Netflix, Amazon, and YouTube.
Hotobu: Problem is that Netflix 4K is kinda trash. I have a 4K TV and cancelled the 4K service because the improvement was marginal at best.
I've found Amazon's 4K content to be pretty good in terms of quality over 1080p. I agree, though, that I haven't found Netflix to really offer any tangible improvement with 4K. YouTube and Amazon seem to be the only half-decent options for 4K content, which is a little disheartening.
Posted on Reply
#21
AsRock
TPU addict
Bah, the BS some were complaining about with the HDMI crap; yes, I am on about HDCP 2.2. Let's face it, it's costing people a fortune, and just to get 4K.
Posted on Reply
#22
Unregistered
Hotobu: Problem is that Netflix 4K is kinda trash. I have a 4K TV and cancelled the 4K service because the improvement was marginal at best.
I beg to differ...
Just 4K... eh, I kind of see what you mean, but I can tell you that 4K HDR is very much worth it...
Iron Fist... the ground pound scene is completely different in HDR... The fireworks scene in Sense8... All those fantastic night fight scenes in Marco Polo... All way better in HDR... Completely worth the extra cost.
Posted on Edit | Reply
#23
xorbe
3 GB of VRAM minimum to watch a movie? What kind of artificial limitation is that, lol.
Posted on Reply
#24
Prima.Vera
xorbe: 3 GB of VRAM minimum to watch a movie? What kind of artificial limitation is that, lol.
Most likely it is using the video as a constant flux of 4K textures that need to be offloaded into the frame buffer first, hence the VRAM requirements. Probably this is how the CUDA algorithms work...
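(Editorial aside: a rough back-of-the-envelope calculation of what decoded 4K frames actually occupy; the buffer depth is an assumption, and this is not a claim about how Netflix's decoder allocates VRAM.)

```python
# Size of a decoded 4K 10-bit frame in a typical P010-style layout:
# 2 bytes per luma sample, with 4:2:0 chroma adding another 0.5x of the luma plane.
# Real players also keep reference frames, display queues and protected surfaces.

width, height = 3840, 2160
bytes_per_sample = 2        # 10-bit samples stored in 16-bit containers
chroma_factor = 1.5         # 4:2:0 subsampling

frame_bytes = width * height * bytes_per_sample * chroma_factor
print(f"One decoded frame: {frame_bytes / 2**20:.1f} MiB")    # ~23.7 MiB

dpb_frames = 8              # assumed decoded-picture-buffer depth
print(f"{dpb_frames}-frame buffer: {frame_bytes * dpb_frames / 2**20:.0f} MiB")
```

Even with a generous buffer that comes to a couple of hundred megabytes, so the 3 GB figure presumably leaves headroom for the rest of the pipeline rather than the frames alone.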
Posted on Reply
#25
evernessince
Prima.Vera: Most likely it is using the video as a constant flux of 4K textures that need to be offloaded into the frame buffer first, hence the VRAM requirements. Probably this is how the CUDA algorithms work...
Videos don't use textures. They are just a series of compressed pictures encoded in a certain manner.
Posted on Reply