Intel true 10-bit/HEVC/HDR support...?

  • Hi all,

    I just installed LE 9.2.3 on my Intel NUC NUC8i5BEK. I am unable to get HDR to play - it just falls back to plain (SDR) UHD.

    I have been reading this thread, and I understand that currently there is no stable way to get HDR10 to play on Intel?

    I am very happy to test if there is a recent x86 build with a patch or something that I can try. I am using a Samsung UN8000 TV.

    Also, are there newer NUC boards that will play HDR with LE?

    Cheers,

    Ken


  • At the moment I think HDR10 output support for Intel under Linux is limited to Gemini Lake processors, as they are the ones with native HDMI 2.0 output. The issue with other Intel devices is that they use DisplayPort output from the CPU+GPU SoC and convert it to HDMI 2.0 with an LSPCon chip on the motherboard, and HDR support for the LSPCon is still incomplete in Linux. That said, more and more patches are appearing (LSPCon devices can now trigger HDR10 mode on connected TVs, but don't yet display video in that mode).
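
    A quick way to see whether a given kernel build exposes that HDR plumbing at all is to look for the HDR_OUTPUT_METADATA property on the DRM connectors. A rough sketch, assuming libdrm's modetest utility is present on the box:

    ```python
    # Rough check for kernel-side HDR10 signalling support: HDR-capable or
    # patched kernels expose the HDR_OUTPUT_METADATA property on connectors.
    # Assumes libdrm's modetest utility is installed.
    import subprocess

    out = subprocess.run(["modetest", "-c"], capture_output=True, text=True).stdout
    if "HDR_OUTPUT_METADATA" in out:
        print("Kernel exposes HDR_OUTPUT_METADATA on at least one connector")
    else:
        print("No HDR_OUTPUT_METADATA property found - no HDR10 signalling yet")
    ```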

  • Thanks Noggin, for summing it up. Are there no newer NUCs with native HDMI 2.0 support? Or will I have to wait for the NUC 11, which will have native HDMI 2.1 support?

    Are the patches specific to each NUC version? If so, where/how do I go about locating one for my NUC8i5BEK?

    Alternatively, if I use the DisplayPort output with an external adapter (like a Mini DisplayPort 1.2 to HDMI 2.0 4K/UHD 60Hz active adapter from Atlast! Solutions), will that bypass the internal LSPCon and allow HDR? (Actually mine sends DP over USB-C, so it would be a USB-C -> HDMI adapter.)

    *Please excuse me if that is a stupid question lol.

    Cheers!!

    Ken


  • So the big thing that holds me back from getting a Gemini Lake box to replace my Vero 4K+ is that I would then NEED to use a Kodi v19 build instead of continuing with v18. This is only an issue for me because of the Python switch. I don't actually use many add-ons, except one that is key for me: Artwork Beef. I know v18 and v19 can technically download extended artwork themselves, but I haven't had a chance to see how well that works in the real world. I love my Aeon Nox skin and the eye candy, so this is important to me. Can anyone chime in on their experience?

    Gonna bump this up again as I didn't get any response. Anyone have experience with extended artwork NOT using Artwork Beef in Kodi? Or any idea how to mod Artwork Beef to work in v19? I asked this in the add-ons forum on Kodi's forum and so far it's been silent there too.
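
    For reference, the v19 breakage in older add-ons is usually just the Python 2 to 3 switch plus a couple of relocated Kodi API calls, and the addon.xml needs its xbmc.python import bumped to version 3.0.0. An illustrative sketch of the typical changes (generic example code, not Artwork Beef's actual source):

    ```python
    # Typical Kodi v18 -> v19 (Python 2 -> 3) porting changes, illustrated with
    # generic example code; this only runs inside Kodi's Python interpreter.
    import urllib.parse   # Python 2's urllib.quote became urllib.parse.quote
    import xbmcvfs        # xbmc.translatePath moved to xbmcvfs.translatePath in v19

    def artwork_cache_path(title):
        # Build a cache-file path inside the add-on's profile directory
        profile = xbmcvfs.translatePath('special://profile/')
        return profile + urllib.parse.quote(title, safe='') + '.json'
    ```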

  • I think the advice at the moment is not to switch to Intel as your main platform for 4K HDR stuff in Linux. This work is still experimental, there is no >30fps UHD HDR support (in the UK that's needed for live UHD HDR iPlayer, for instance), and by the time it is in a reliable and mainstream state there may be newer (better value?) Intel platforms to consider.

    I was interested in the status of Intel HDR on Linux, and I have some diagnostic kit (and a bit of video-standards awareness), so when a low-cost Gemini Lake box appeared on my radar I was intrigued enough to buy one. It's not my daily driver (that's a combination of an Apple TV 4K with Netflix/Prime and MrMC, and AMLogic boxes running CoreELEC).

  • Thanks Noggin,

    OK, so I'll just move my NUC LE install to my non-4K room for now. I have an AMLogic S912 that does HDR now.

    I'll keep an eye on this thread, and when Intel's Linux drivers gain HDR support, I'll re-investigate. Hopefully I'll be able to use my NUC in the future :)

    Thanks!!

  • I know... I know.

    I just love the performance of my NUC (playback aside) compared to my Vero 4K+, and I want to switch so badly.

  • I mean 18.7... but I understand you.

    It would be very nice to see your drivers in an official build. My Gemini Lake played those files much more smoothly in HDR with your build than my Amazon Fire TV Stick 4K does.

    I hope it will soon be true that HDR runs on my LibreELEC x86 Gemini Lake board.

  • This is the main "blocking" issue from my point of view; it seems to happen when the mode is changed from one 4:2:0 12-bit mode to another 4:2:0 12-bit mode (and a YUV plane is involved). I think LibreELEC-Generic.x86_64-9.80-devel-20200201013402-e776789.img.gz contains a limit to use 8-bit output, and if I remember correctly 4:2:0 50/60 Hz 8-bit works: it activates HDR mode on my LG OLED, but I am sure the colors are not optimal.

    I have not been able to produce a simple reproducible test case (not using Kodi) for this 4:2:0 50/60 Hz distorted-image issue.

    That was actually an issue with my older Samsung TV.

    I just tested with another TV (Samsung UE43TU8000) and 4K 60Hz HDR videos play just fine, no video distortion.



  • Is that still 8-bit output, though?

    You can send 2160p60 8-bit 4:2:0, 4:4:4 or RGB with an HDR EOTF flag and the TV will happily switch into HDR10 or HLG mode - but because you've lost 2 bits of video data, won't you get banding?
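
    If you want to check which EOTFs a set actually advertises, the HDR Static Metadata Data Block in its EDID lists them. A rough Python sketch - the connector path here is an example and will vary per system:

    ```python
    # Parse the TV's EDID (CTA-861 extension) for the HDR Static Metadata
    # Data Block, which lists the EOTFs the set claims to support.
    # The connector path is an example - adjust for your system.
    from pathlib import Path

    edid = Path("/sys/class/drm/card0-HDMI-A-1/edid").read_bytes()

    for ext in range(1, edid[126] + 1):            # byte 126 = extension block count
        block = edid[128 * ext : 128 * (ext + 1)]
        if len(block) < 4 or block[0] != 0x02:     # 0x02 = CTA-861 extension tag
            continue
        end, i = block[2], 4                       # data blocks occupy bytes 4..d-1
        while i < end:
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 7 and block[i + 1] == 0x06:  # extended tag 6 = HDR static metadata
                eotfs = block[i + 2]               # bit 2 = PQ (ST 2084), bit 3 = HLG
                print("PQ (HDR10) supported:", bool(eotfs & 0x04))
                print("HLG supported:", bool(eotfs & 0x08))
            i += 1 + length
    ```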

    Some (often early) displays would only accept 2160p60 with 4:2:0 as they had HDMI 1.4b-bandwidth-limited "HDMI 2.0" inputs, whilst others have different functionality on different inputs (some Sony sets are not 'full HDMI 2.0' on HDMI 1 and 4, and are only 'Enhanced' on inputs 2 and 3; input 3 is usually the ARC HDMI too).
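
    The arithmetic behind those bandwidth limits is easy to sketch: HDMI 2.0 tops out at a 600 MHz TMDS clock, HDMI 1.4b at 340 MHz, and that decides which 2160p60 formats fit where. A quick Python check, using the standard 4400x2250 total raster for 2160p60:

    ```python
    # TMDS clock needed for various 2160p60 formats (4400 x 2250 total pixels
    # per frame including blanking, per the CTA-861 timing).
    TOTAL_PIXELS = 4400 * 2250
    FPS = 60
    HDMI_20_MAX_MHZ = 600   # HDMI 2.0 TMDS limit
    HDMI_14_MAX_MHZ = 340   # HDMI 1.4b limit (the "fake HDMI 2.0" inputs)

    def tmds_mhz(bits_per_component, chroma):
        # 4:2:0 carries half the data of 4:4:4; deep colour scales the clock up
        factor = 0.5 if chroma == "4:2:0" else 1.0
        return TOTAL_PIXELS * FPS * factor * (bits_per_component / 8) / 1e6

    for bits, chroma in [(8, "4:4:4"), (10, "4:4:4"), (8, "4:2:0"), (12, "4:2:0")]:
        mhz = tmds_mhz(bits, chroma)
        verdict = "fits" if mhz <= HDMI_20_MAX_MHZ else "exceeds"
        print(f"2160p60 {chroma} {bits}-bit: {mhz:.1f} MHz ({verdict} HDMI 2.0)")
    # 8-bit 4:2:0 (297 MHz) is the only one that also fits the 340 MHz
    # HDMI 1.4b limit - hence those early sets demanding 4:2:0 at 2160p60.
    ```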

  • Is that still 8-bit output, though?

    Yes.

    Not an issue for me though. The panel is 8-bit with FRC. I will probably get a better result with RGB 8-bit + GPU dithering than with 4:2:0 12-bit.
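
    A quick way to see why dithering can beat plain truncation: quantise a level that falls between two 8-bit codes, with and without dither. A minimal numpy sketch:

    ```python
    # Dithering preserves sub-LSB detail that plain 8-bit rounding snaps away
    # (FRC does the same trick in time rather than in space).
    import numpy as np

    rng = np.random.default_rng(0)
    pixels = np.full(100_000, 100.4)   # a source level between two 8-bit codes

    plain = np.round(pixels)           # every pixel becomes 100 -> banding
    dithered = np.round(pixels + rng.uniform(-0.5, 0.5, pixels.size))

    print(plain.mean())                # 100.0 - the 0.4 LSB of detail is gone
    print(dithered.mean())             # ~100.4 - preserved on average as noise
    ```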
