Posts: 2,550 · Joined: Dec 2012 · Reputation: 226
2017-08-22, 03:50 (This post was last modified: 2017-08-22, 03:53 by brazen1.)
Thanks. Good to know it will do high-bitrate HEVC. HDR probably won't have much impact on decoding itself. But since Kodi doesn't do HDR and madVR does (added externally, pretty seamlessly), 2GB of VRAM probably isn't sufficient. Netflix won't even stream HDR to a 2GB card, but depending on a user's needs it's still decent for its price point. Some users may find the economics of popping an entry-level card into that old PC vs. buying multiple streaming boxes worth considering. The cheapest option, an RPi, can't do 4K or HDR, so it's kinda losing ground imo. A sub-$100 GPU can do everything except maybe HDR via madVR at low settings (it just needs to pass HDR through). I need a yes-or-no confirmation from someone trustworthy who uses madVR with HDR hardware, or send me a card and I'll test the @%!& out of it.
Posts: 54 · Joined: Aug 2008 · Reputation: 0
Question: If I have a GT 1030 card and run a Linux HTPC box with Ubuntu LTS (no LibreELEC), I would just need to upgrade to the latest NVIDIA drivers (probably 384.59) and Kodi 17 should be able to hardware-decode HEVC (using VDPAU), correct? (A quick way to check is sketched below.)
Thanks,
o_neill
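For anyone wanting to verify this after the driver upgrade, here is a minimal sketch of my own (it assumes the vdpauinfo command-line utility is installed; package name is vdpauinfo on Ubuntu) that simply checks whether the running VDPAU driver reports an HEVC decoder profile. It is a rough convenience check, not anything from Kodi itself:

# Minimal sketch, assuming the vdpauinfo utility is installed.
# It scans vdpauinfo's decoder capabilities table for HEVC profiles.
import subprocess

def has_hevc_vdpau() -> bool:
    # vdpauinfo prints a "Decoder capabilities" table; a Pascal card on a
    # recent driver should list HEVC_MAIN there (profiles the driver does
    # not support are typically printed with "not supported").
    out = subprocess.run(["vdpauinfo"], capture_output=True, text=True).stdout
    return any("HEVC" in line and "not supported" not in line
               for line in out.splitlines())

if __name__ == "__main__":
    print("HEVC via VDPAU:", "reported" if has_hevc_vdpau() else "not reported")

If vdpauinfo lists HEVC_MAIN with sane width/height limits, Kodi 17's VDPAU path should be able to use it; running vdpauinfo directly and reading the table by eye works just as well.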
Soli (Posting Freak) · Posts: 1,068 · Joined: Oct 2011 · Reputation: 27
There's probably no technical reason that you need at least 3GB of VRAM for Netflix 4K. Let's hope they do away with that requirement down the road. Of course you could always buy the fanless Palit GTX 1050 Ti 4GB; you'll certainly have more headroom for madVR processing. Or you could just wait a little for Coffee Lake and native HDMI 2.0(x) if you don't need madVR.
Damn, we thought Skylake was going to be it, but it didn't support hardware HEVC 10-bit. Then came Kaby Lake, but that didn't support native HDMI 2.0 and required the buggy internal DP-to-HDMI 2.0 adapter, and besides various NUCs, that adapter was only included on ASRock Fatal1ty motherboards.
Of course Coffee Lake isn't going to be the final solution either. We're going to have to wait for the next tick (or tock) and new TVs that support HDMI 2.1(x), which will finally give us 10/12-bit RGB@4K60, simplifying all sorts of things (rough bandwidth math below).
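To put rough numbers on why that needs HDMI 2.1, here is a quick back-of-the-envelope sketch (my own arithmetic using the standard CTA 4K60 timing, not anything from the post) showing why 10/12-bit RGB at 4K60 doesn't fit within HDMI 2.0's 600 MHz TMDS limit:

# Back-of-the-envelope sketch: does 4K60 RGB at a given bit depth fit HDMI 2.0?
# CTA-861 timing for 3840x2160@60 uses 4400 x 2250 total pixels.
PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6   # = 594 MHz
HDMI20_TMDS_LIMIT_MHZ = 600                # HDMI 2.0 max TMDS character rate

def tmds_rate_mhz(bits_per_component: int) -> float:
    # TMDS characters carry 8 bits per component, so deeper color
    # scales the character rate up proportionally for full RGB/4:4:4.
    return PIXEL_CLOCK_MHZ * bits_per_component / 8

for bpc in (8, 10, 12):
    rate = tmds_rate_mhz(bpc)
    verdict = "fits" if rate <= HDMI20_TMDS_LIMIT_MHZ else "exceeds"
    print(f"4K60 RGB {bpc}-bit: {rate:.1f} MHz TMDS -> {verdict} HDMI 2.0")

8-bit lands at 594 MHz and just squeaks in; 10-bit (742.5 MHz) and 12-bit (891 MHz) blow past the limit, which is why 4K60 HDR over HDMI 2.0 has to fall back to 4:2:2/4:2:0 and why HDMI 2.1's extra bandwidth simplifies things.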
Posts: 3 · Joined: Aug 2017 · Reputation: 0
What can we do to make the 10-bit HEVC Linux driver issue somewhat of a priority for Nvidia? Collectively ask them for proper drivers?
Klojum (Lost connection) · Posts: 14,208 · Joined: Nov 2009 · Reputation: 711
2017-08-25, 14:34 (This post was last modified: 2017-08-25, 14:35 by Klojum.)
Perhaps we can ask the latest Powerball winner (won some $750 million...) to hire a couple of Nvidia developers for us, and solve that tiny issue.
Klojum (Lost connection) · Posts: 14,208 · Joined: Nov 2009 · Reputation: 711
Nvidia on Linux is not supporting 10-bit; Nvidia on Windows is.
Posts: 3 · Joined: Aug 2017 · Reputation: 0
I am contacting Nvidia through Facebook, asking them if there's an ETA for resolving this matter. Maybe if we all do this, things will change...