The biggest problem for Nvidia & Linux is that 8-bit HEVC hardware decoding is supported, but 10-bit is not (yet?), and decoding that in software requires a pretty big CPU.
There should be NUCs on the way with Kaby Lake boards, which should offer the same functionality as ASRock's J3455 and J4205 boards: 4K 10-bit HEVC @ 60 fps. The only missing feature on those NUCs would be HDR, which also has to be present in the content itself anyway. I'm not that convinced by HDR yet.
10-bit HEVC hardware decoding is not working with a GTX 960 (LibreELEC 7.90.008).
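For anyone who wants to check what their driver actually exposes, a quick sketch (assumes `vdpauinfo` is installed and an X session is available; output format varies by driver version):

```shell
# List the decoder profiles the VDPAU driver reports.
# For 10-bit hw decoding to work, an HEVC Main 10 profile
# (e.g. HEVC_MAIN_10) must show up in this list.
vdpauinfo | grep -i hevc
```

If no HEVC Main 10 line appears, playback falls back to software decoding regardless of what the GPU silicon could do.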