Intel true 10bits/HEVC/HDR support... ?

  • Oh... Do you perhaps know something we don't?


    "Once HDR support is fully baked (right now it is far from)"

    That doesn't sound like HDR support is right around the corner.

The guys from Intel were testing it themselves:

    Intel GFX - Patchwork



and they pointed to Kodi as the software to test :)


And not fully baked? Yes, since as standards we now have HDR, HDR10, HDR10+, Dolby Vision etc.

and who knows what more we can expect :D

but as far as basic HDR support in Linux goes, it looks like we are there.


In the Ubuntu 19.10 changelog you will find

    • Intel HDR display support for Icelake, Geminilake

and from what I see on the drm-intel patchwork, more is incoming for devices with LSPCON, from Gen9 up:

    Intel GFX - Patchwork

    and HDR10 with DP 1.4


I know there are still some incompatibilities, for example with Samsung HDR TVs, which is actually an issue with the Intel driver.



I compiled a build with working HDR from Kwiboo's work-in-progress branch. Tested on a Gemini Lake J4105 and a Samsung 4K TV. In order for HDR to work, go to Settings --> Player and enable "Allow hardware acceleration - PRIME".

Nice - HDR works! J4105 Odroid H2 and a Samsung MU7000 with 1080p and 4K content. Thanks! It doesn't switch to HDR every time, but it does if I stop the video and start it again.


On a side note, is there any way to switch the display Kodi outputs to? I'd like to give this build a good test, but I need to disconnect an HDMI cable because my second, non-4K-HDR monitor is used automatically; the option to change doesn't appear to be in the Kodi GUI and xrandr isn't present. I still see the LibreELEC splash on the 4K screen though.


Just to try, I loaded this image on my Kaby Lake 7th Gen NUC (NUC7i5BNK) and had no luck. Trying to play back HDR content just results in a black screen, and I wasn't able to select any sound options other than Bluetooth, so I presume the image is missing the drivers for the platform.


Good to see work continuing on this though, as LibreELEC would definitely be my platform of choice given the option.

Just to try, I loaded this image on my Kaby Lake 7th Gen NUC (NUC7i5BNK) and had no luck. Trying to play back HDR content just results in a black screen, and I wasn't able to select any sound options other than Bluetooth, so I presume the image is missing the drivers for the platform.


Good to see work continuing on this though, as LibreELEC would definitely be my platform of choice given the option.

LSPCON support for HDR is not merged into the kernel yet.

I know there are still some incompatibilities, for example with Samsung HDR TVs, which is actually an issue with the Intel driver.

I tried to figure out what is causing that "no signal" issue on Samsung TVs when "HDMI UHD Color" is enabled.

At first I thought that Samsung does not support 4K RGB modes with Deep Color. But the odd thing is that when trying 4K 50/60 modes, the Intel driver switches to RGB 8-bit and that doesn't work either.

    So, the only 4K HDMI modes that work with GLK+Samsung TV are:

    4K 24/25/30Hz RGB 8-bit

    4K 50/60Hz YCbCr 4:2:0 8-bit


I tried to figure out what is causing that "no signal" issue on Samsung TVs when "HDMI UHD Color" is enabled.

At first I thought that Samsung does not support 4K RGB modes with Deep Color. But the odd thing is that when trying 4K 50/60 modes, the Intel driver switches to RGB 8-bit and that doesn't work either.

    So, the only 4K HDMI modes that work with GLK+Samsung TV are:

    4K 24/25/30Hz RGB 8-bit

    4K 50/60Hz YCbCr 4:2:0 8-bit

It's something to do with the colourspaces that the TV expects and the driver sends. You can force UHD modes when not in UHD mode via custom modelines. From memory, to get it working I booted the J4105 with the TV in UHD mode, got the available UHD modelines via SSH, and then added them via a boot script and xrandr. Now I can use 4K60 in non-UHD mode.
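For anyone wanting to try the same workaround, here is a sketch of what such a boot script could look like (assumptions: an X11-based build where xrandr is available, `HDMI-1` as the output name, and the standard CTA-861 timing for 3840x2160@60 as a placeholder modeline - substitute the modelines you dumped from your own TV via SSH):

```shell
# /storage/.config/autostart.sh -- hypothetical example of adding a
# forced 4K60 mode via xrandr. The modeline below is the standard
# CTA-861 timing for 3840x2160@60 (594 MHz pixel clock); replace it
# with the modeline dumped from your TV in UHD mode, and HDMI-1 with
# your actual output name from `xrandr -q`.
xrandr --newmode "3840x2160_60" 594.00 3840 4016 4104 4400 2160 2168 2178 2250 +hsync +vsync
xrandr --addmode HDMI-1 "3840x2160_60"
xrandr --output HDMI-1 --mode "3840x2160_60"
```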


    More info here

    [GLK] no signal - with samsung 4k TV - HDMI UHD Color (ENABLED) (#271) · Issues · drm / intel · GitLab


    I think it would be beneficial if people here with this issue make some noise on the bug tracker to get some attention on this problem.

From memory, to get it working I booted the J4105 with the TV in UHD mode, got the available UHD modelines via SSH, and then added them via a boot script and xrandr.

That will not give you the 12-bit Deep Color modes in 4K (10-bit is not supported by GLK hardware). It doesn't matter how you force it to output the desired modes.

I also tried to mess around with editing the EDIDs dumped with HDMI UHD Color both on and off to narrow down the issue.

Any 4K mode that requires an HDMI pixel clock higher than 297 MHz gives "no signal".

The Intel driver does not support YCbCr 4:2:2 output. When the EDID reports a max pixel clock of 297 MHz (UHD Color off), the Intel driver uses these modes for 4K and they work fine:


    4K 24/25/30Hz RGB 8-bit

    4K 50/60Hz YCbCr 4:2:0 8-bit


When the EDID reports that the "high speed" pixel clock mode (594 MHz) is available, the Intel driver uses these modes (and they give no signal):


    4K 24/25/30Hz RGB 12-bit

    4K 50/60Hz RGB 8-bit


IMO the issue has something to do with Samsung having limited support for 4K RGB modes. It probably expects YCbCr 4:2:2, which Intel does not support.
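The 297 MHz cut-off explains exactly which modes survive. A back-of-the-envelope check (assumptions: CTA-861 total timings of 4400x2250 for 2160p30/60, and the HDMI encoding rules that RGB/4:4:4 scale the TMDS clock with bit depth, 4:2:0 halves it, and 4:2:2 always runs at the 8-bit clock with 12-bit samples packed in):

```python
# Rough TMDS clock estimate per HDMI pixel-encoding rules
# (a sketch, not a full HDMI 2.0 bandwidth model).

def tmds_clock_mhz(pixel_clock_mhz, encoding, bpc):
    """TMDS clock needed for a mode, in MHz."""
    if encoding in ("RGB", "444"):
        return pixel_clock_mhz * bpc / 8      # Deep Color scales the clock
    if encoding == "420":
        return pixel_clock_mhz / 2 * bpc / 8  # 4:2:0 halves the clock
    if encoding == "422":
        return pixel_clock_mhz                # 4:2:2 packs 12-bit at the 8-bit clock
    raise ValueError(encoding)

# 3840x2160 with CTA-861 blanking: 4400 * 2250 totals
px_4k60 = 4400 * 2250 * 60 / 1e6  # = 594 MHz
px_4k30 = 4400 * 2250 * 30 / 1e6  # = 297 MHz

limit = 297  # MHz -- what the TV advertises with "HDMI UHD Color" off

for desc, clk in [
    ("4K60 RGB 8-bit",    tmds_clock_mhz(px_4k60, "RGB", 8)),   # 594.0
    ("4K60 4:2:0 8-bit",  tmds_clock_mhz(px_4k60, "420", 8)),   # 297.0
    ("4K30 RGB 12-bit",   tmds_clock_mhz(px_4k30, "RGB", 12)),  # 445.5
    ("4K30 RGB 8-bit",    tmds_clock_mhz(px_4k30, "RGB", 8)),   # 297.0
    ("4K60 4:2:2 12-bit", tmds_clock_mhz(px_4k60, "422", 12)),  # 594.0
]:
    print(f"{desc}: {clk:.1f} MHz -> {'OK' if clk <= limit else 'no signal'}")
```

The two families of modes that work in the posts above (4K24-30 RGB 8-bit and 4K50/60 4:2:0 8-bit) are exactly the ones that land at 297 MHz or below.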


A couple of releases have passed since this was originally asked.

Is there a way to force YCbCr 4:4:4 / 4:2:0 with Deep Color (12-bit) on an Intel UHD GPU?


There are many threads with similar questions, but I couldn't find a tutorial of sorts. Maybe a wiki page is missing, dedicated to Intel GPUs, stating what is supported and what is not. I don't mind writing it, I'm just missing the how-to details...


    Edit:

Max bpc is 12 but I get RGB 8-bit on the projector (TW5650).

4K 4:2:0 with Deep Color is not part of the HDMI spec. It should always be 8-bit.

No idea how to force YCbCr 4:4:4. The Intel driver knows what's best for you and uses RGB.

4K 4:2:0 with Deep Color is not part of the HDMI spec. It should always be 8-bit.

No idea how to force YCbCr 4:4:4. The Intel driver knows what's best for you and uses RGB.

I calibrated the projector with the Mansel tuning Blu-ray and a Sony Blu-ray player to a specific setting (e.g. YCbCr 4:4:4, 12-bit, 1080p) which causes the least artifacts and provides the best contrast ratio and color coverage. I want to duplicate it on my LE box. 1080p is supported, YCbCr 4:4:4 is supported, Deep Color is supported. But so far I couldn't find how to set this in LE. I really want to avoid installing Windows when LE is exactly what I need in terms of a desktop manager with CEC...

I'm confused - UHD mode is off on my TV for the HDMI port that the J4105 is connected to, and with the LE HDR test build you posted the TV switched to HDR mode with 2160p 25fps 10-bit content. Since GLK does not support 10-bit modes, does that mean the signal being sent was 12-bit? I'm not familiar with GBM (the windowing system that build used?) and xrandr wasn't present, so I wasn't sure how to get more info from the J4105 about what display mode is being used. Because of the lack of xrandr, I didn't set any custom modelines or use an extracted EDID as I do with Ubuntu 19.10. Also, I just tested a 2160p 60fps non-HDR video and it played and displayed fine with UHD mode off on the TV + custom modelines on Ubuntu 19.10.

Since GLK does not support 10-bit modes, does that mean the signal being sent was 12-bit?

No, the signal was 8-bit. Despite popular opinion, HDR does not really require 10/12-bit; the TV will switch to HDR/ST 2084 mode when HDR metadata is present.
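For the curious, this is why an 8-bit signal can still trigger HDR mode: on Linux the kernel signals HDR via the DRM connector property HDR_OUTPUT_METADATA, whose blob follows the kernel's uapi struct hdr_output_metadata, and the TV reacts to that InfoFrame regardless of bit depth. A sketch of packing such a blob in Python (assumptions: the little-endian field layout below mirrors the uapi struct, and the mastering/luminance values are made-up placeholders; eotf=2 selects SMPTE ST 2084/PQ):

```python
import struct

# Sketch: pack a blob matching the layout of the kernel's uapi
# struct hdr_output_metadata (layout is an assumption; see
# include/uapi/drm/drm_mode.h). A real client would hand this to
# drmModeCreatePropertyBlob() and set it as the connector's
# HDR_OUTPUT_METADATA property.

HDMI_STATIC_METADATA_TYPE1 = 0
EOTF_ST2084 = 2  # SMPTE ST 2084 (PQ) -- what makes the TV switch to HDR

def pack_hdr_metadata(max_cll=1000, max_fall=400, max_dml=1000, min_dml=1):
    return struct.pack(
        "<I"    # metadata_type
        "BB"    # eotf, metadata_type (type 1 static metadata)
        "6H"    # display_primaries[3] (x, y) -- zeros here = unspecified
        "2H"    # white_point (x, y)
        "4H",   # max/min display mastering luminance, MaxCLL, MaxFALL
        HDMI_STATIC_METADATA_TYPE1,
        EOTF_ST2084, HDMI_STATIC_METADATA_TYPE1,
        0, 0, 0, 0, 0, 0,
        0, 0,
        max_dml, min_dml, max_cll, max_fall,
    )

blob = pack_hdr_metadata()
print(len(blob), blob[4])  # 30-byte blob, eotf byte = 2 (PQ)
```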

I wasn't sure how to get more info from the J4105 about what display mode is being used.

Enable DRM debug logging in syslinux.cfg:

Code
drm.debug=0x1e log_buf_len=5M

Then:

    journalctl -f | grep -E 'color|bpc'


  • Thanks for the info, that makes sense of what I was seeing. I'll try your code over the weekend on the test build.

  • 4K 4:2:0 with Deep color is not a part of HDMI spec. It should always be 8-bit.

    No idea how to force YCbCr 4:4:4. Intel driver knows what's best for you and use RGB.

That's not correct. Image below from the HDMI.org website (had to go to the Wayback Machine as they have updated the site for HDMI 2.1):


    HDMI :: Manufacturer :: HDMI 2.0 :: FAQ for HDMI 2.0


    The HDMI 2.0 spec supports the following formats and bit depths :


    2160p24-30 :

    • RGB 8-bit, 10-bit, 12-bit, 16-bit.
    • 4:4:4 YCbCr 8-bit, 10-bit, 12-bit, 16-bit.
    • 4:2:2 YCbCr 12-bit only.
    • (NO 4:2:0 support at 2160p30 and below)


    2160p50-60 :

    • RGB 8-bit only
    • 4:4:4 YCbCr 8-bit only
    • 4:2:2 YCbCr 12-bit only
    • 4:2:0 8-bit, 10-bit, 12-bit and 16-bit.


So for 2160p24-30, 4:2:0 is not supported at all, and RGB, 4:4:4 and 4:2:2 are all supported at >8-bit depth (4:2:2 is 12-bit only).


For 2160p50-60, RGB and 4:4:4 are only supported at 8-bit, so not suitable for HDR, whereas 4:2:0 supports all bit depths, and 4:2:2 is 12-bit only.


For a single, fixed chroma subsampling format at UHD that supports HDR at all frame rates, 4:2:2 YCbCr 12-bit is the only option. (No problem outputting 10-bit content padded to 12-bit - that's what a lot of consumer electronics devices do.)


  • That's not correct.

Yeah, it is indeed in the spec. But still, the only useful 4:2:0 modes are 8-bit (for displays that don't support >9 Gbps link speed).

  • I figured out how to force YCbCr 4:4:4 instead of RGB:


Unfortunately this did not fix the Samsung issue. 4K YCbCr 4:4:4 8-bit works fine, but "no signal" with 12-bit.

  • I just have 1 question:


I know that HDR is a work in progress and that Gemini Lake and above should be able to properly show HDR.

And I know that the J4005 is the lowest end of Gemini Lake with UHD 600, and that the J5005 is probably the better choice (2 vs 4 cores, UHD 600 vs 605).


Now the question: should 4K 60Hz with 10-bit HDR run just as well on the J4005?

I know that CvH has tested an HDR file on the J5005 and it worked fine.


I'm asking because in this test (Intel NUC7CJYH im Test mit LibreELEC und unter Windows 10), which was running Kodi on Windows, the test file ran perfectly on the J5005 but not on the J4005, and the CPU speed was not the problem (it was at 20-30% usage).

Can anyone confirm that 4K 60Hz 10-bit will cause problems with the J4005 on LibreELEC?