Intel true 10-bit/HEVC/HDR support... ?

  • Good shout, sorry didn’t even think of that as all my HDR stuff is on a separate disk.


    Yes, non-HDR videos play fine with drmprime.


    I’d still like to wait for someone else to test just to rule out my setup ;) - or me doing something silly /shrug

    I can confirm that I get exactly the same as PlatypusW.

    I was able to configure my audio to pass through 100% properly.

    I played a 720p x264 file with no issue - video on screen and DTS audio.

    I played a 2160p x265 file and got DTS-HD MA audio out of my speakers... the TV transitioned into HDR10 mode... but I got no picture, just black.

    I will say though, progress on the TV switching into HDR10 mode... woohoo.


  • I'm using SMP's build posted a little further up this thread.

    Do the mainstream builds now have Gemini Lake HDR support?

    No, I just meant the patch to fix the audio issue was included in the 9.2.2 hotfix release.

    Those with no video on HDR files, you are not using Gemini Lake, correct?

    No, I just meant the patch to fix the audio issue was included in the 9.2.2 hotfix release.

    Those with no video on HDR files, you are not using Gemini Lake, correct?

    Correct, not using Gemini Lake.

    The build we just tried had patches for the LSPCON controller, which Gemini Lake doesn't have either, so it shouldn't affect Gemini Lake.

    Correct, not using Gemini Lake.

    The build we just tried had patches for the LSPCON controller, which Gemini Lake doesn't have either, so it shouldn't affect Gemini Lake.

    Yup, it looks like LSPCON devices still don't work for HDR, as the patches are a work in progress.

    Only Gemini Lake has native HDMI 2.0, so that is working.

    That makes it difficult for me to upgrade, as everything new still uses LSPCON.

  • So do Intel just support 4:2:0, 4:4:4 and RGB output?

    If so, then 4:2:0 is the only format that will support 2160p50/60 with >8-bit for HDR.

    What do you think about 4:2:0 12-bit vs. RGB 8-bit with dithering?

    The GBM version of Kodi (which is used in the HDR builds) does not support dithering due to the lack of OpenGL. However, hardware dithering can be enabled for 8-bit modes in the Intel driver.

    By default the Intel driver does not dither, and I see some banding with some of my sample videos at 2160p60 8-bit output.

    With dithering enabled those videos look much better, with much less or no visible banding.
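
    If anyone wants to see what the dithering buys you in practice, here is a rough Python sketch - purely illustrative, not the Intel driver's actual algorithm - that quantises a shallow 10-bit gradient down to 8-bit with and without ordered (Bayer) dithering, then measures the error left after a crude "eye blur":

        # Purely illustrative - NOT the Intel driver's implementation. Ordered (Bayer)
        # dithering of a shallow 10-bit gradient down to 8 bits, compared with plain
        # truncation, to show why dither hides banding.
        import numpy as np

        # The 16 thresholds of a 4x4 Bayer matrix, flattened and normalised to 0..1.
        # Tiled along the ramp, every 16-pixel window sees each threshold once.
        BAYER = np.array([0, 8, 2, 10, 12, 4, 14, 6, 3, 11, 1, 9, 15, 7, 13, 5]) / 16.0

        def quantise_to_8bit(ramp10, dither=False):
            """Quantise 10-bit code values (0..1023) down to 8-bit (0..255)."""
            ideal = ramp10 / 4.0                     # ideal value, still fractional
            if dither:
                t = np.resize(BAYER, ideal.shape)    # spatially varying thresholds
                return np.floor(ideal + t).clip(0, 255)
            return np.floor(ideal).clip(0, 255)      # plain truncation -> hard 1-code steps

        # A shallow 10-bit gradient like a sky: 64 input codes spread over 4096 pixels.
        ramp = np.linspace(300.0, 364.0, 4096)
        ideal = ramp / 4.0

        plain = quantise_to_8bit(ramp)
        dithered = quantise_to_8bit(ramp, dither=True)

        def blur(x):
            """Crude stand-in for the eye averaging nearby pixels: a 16-pixel box blur."""
            return np.convolve(x, np.ones(16) / 16.0, mode='same')

        print("perceived error, plain 8-bit   :", round(np.abs(blur(plain) - ideal)[16:-16].mean(), 3))
        print("perceived error, dithered 8-bit:", round(np.abs(blur(dithered) - ideal)[16:-16].mean(), 3))

    The truncated version is left with hard one-code steps (the visible bands), while the dithered one averages back out to within a small fraction of a code - which is essentially all dithering/FRC is doing.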

  • What do you think about 4:2:0 12-bit vs. RGB 8-bit with dithering?

    The GBM version of Kodi (which is used in the HDR builds) does not support dithering due to the lack of OpenGL. However, hardware dithering can be enabled for 8-bit modes in the Intel driver.

    By default the Intel driver does not dither, and I see some banding with some of my sample videos at 2160p60 8-bit output.

    With dithering enabled those videos look much better, with much less or no visible banding.

    For what it's worth - my view is that dithering isn't a great idea. Effectively it adds high frequency noise to a picture to mitigate the lack of bit depth. Plasmas used to do it to allow them to make their subfield processing that delivered something like 6 or 7-bit equivalent grey scale look 'good enough' with 8-bit content - but I was always pretty sensitive to the noise it introduced. (I could also see the subfields when I was chewing...)

    Many HDR displays on sale use 8-bit panels with FRC (which I think is a kind of dither) to achieve the perceived bit depth required for banding-free HDR. I wonder if that will also be used with 8-bit sources with dither (which the TV won't know about).

    Personally I think 4:2:0 10-bit/12-bit or 4:2:2 12-bit are the only routes really worth following for 10-bit SDR and HDR content at 2160p50 and above.

    If the industry had thought it could continue to get away with 8-bit video paths with dither - I'm sure they'd have kept them!

    Of course 8-bit with dither display of a 10-bit SDR source is likely to be better than 8-bit display without dither... (So it does have some merit)
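
    To put some numbers on why those are the only viable routes, here is a back-of-the-envelope Python check against HDMI 2.0's 600 MHz TMDS ceiling, using the usual CTA-861 pixel clocks (594 MHz for 2160p50/60, 297 MHz for 2160p24):

        # Back-of-the-envelope check of which HDMI 2.0 output formats fit at 2160p,
        # using the usual CTA-861 pixel clocks and the HDMI rules that 4:2:0 halves
        # the TMDS clock, 4:2:2 is always carried as 12-bit at the normal clock, and
        # RGB/4:4:4 deep colour scales the clock by bpc/8.
        HDMI20_MAX_TMDS_MHZ = 600.0   # the HDMI 2.0 "600 MHz" / 18 Gbps ceiling

        PIXEL_CLOCK_MHZ = {"2160p60": 594.0, "2160p50": 594.0, "2160p24": 297.0}

        def tmds_clock_mhz(mode, sampling, bpc):
            px = PIXEL_CLOCK_MHZ[mode]
            if sampling == "4:2:0":
                return px / 2.0 * bpc / 8.0   # half clock, then deep-colour scaling
            if sampling == "4:2:2":
                return px                     # packed as 12-bit within the normal clock
            return px * bpc / 8.0             # RGB or YCbCr 4:4:4 deep colour

        for mode, sampling, bpc in [
            ("2160p60", "RGB/4:4:4",  8),
            ("2160p60", "RGB/4:4:4", 10),
            ("2160p60", "4:2:2",     12),
            ("2160p60", "4:2:0",     10),
            ("2160p60", "4:2:0",     12),
            ("2160p24", "RGB/4:4:4", 10),
        ]:
            clk = tmds_clock_mhz(mode, sampling, bpc)
            verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "does NOT fit"
            print(f"{mode} {sampling:9s} {bpc:2d}-bit -> {clk:6.1f} MHz TMDS ({verdict})")

    Which lines up with the posts above: at 2160p50/60 only 4:2:0 deep colour and 4:2:2 12-bit fit under the cap, while 10-bit RGB only becomes possible once you drop to 2160p30 and below.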

  • Personally I think 4:2:0 10-bit/12-bit or 4:2:2 12-bit are the only routes really worth following for 10-bit SDR and HDR content at 2160p50 and above.

    The thing with the Intel driver is that it does not allow deep color 4:2:0/4:2:2 2160p50/60 modes. It prefers RGB 8-bit and I'm not sure this will ever change. I think the driver can be hacked to allow 4:2:0, but I didn't test it.

    10-bit SDR and HDR sources don't look very good when downconverted to 8-bit without dithering (e.g. skies almost always look terrible with tons of banding). With dithering enabled in the driver, the banding is not really an issue anymore.
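
    For anyone who wants to check what their box is actually sending right now (as opposed to what the EDID allows), the kernel exposes some of this through debugfs. A rough Python sketch is below - it assumes an i915 system with debugfs mounted at the usual path and card 0, and the exact field names differ between kernel versions, so treat it as a starting point rather than a recipe:

        # Rough sketch: see what the display pipe is actually outputting right now,
        # as opposed to what the EDID allows. It reads two debugfs files present on
        # i915 systems (exact field names differ between kernel versions, so treat
        # this as a starting point, not a recipe). Needs root and a mounted debugfs.
        from pathlib import Path

        CANDIDATE_FILES = [
            Path("/sys/kernel/debug/dri/0/i915_display_info"),  # i915's own summary
            Path("/sys/kernel/debug/dri/0/state"),               # generic DRM atomic state
        ]
        KEYWORDS = ("bpp", "bpc", "format", "hdr", "connector", "pipe")

        for path in CANDIDATE_FILES:
            if not path.exists():
                print(f"{path}: not present on this kernel/driver")
                continue
            print(f"=== {path} ===")
            try:
                text = path.read_text(errors="replace")
            except PermissionError:
                print("    (needs root)")
                continue
            # Crude filter: only show lines that look relevant to bit depth / format.
            for line in text.splitlines():
                if any(key in line.lower() for key in KEYWORDS):
                    print("   ", line.strip())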

  • Thanks for posting this!


    Pretty much the same result as last time: no video, but I have sound and the TV switching to HDR.

    This time I also got no UI/playback menu, so something has changed at least :S


    Happy to keep testing and waiting :)

  • Nice. I see PlatypusW already tested it and had similar results so I'll just hold off.

    This might also be worth a look: CHUWI LarkBox, the World's Smallest 4K Mini PC, Launched on Indiegogo for $149 and Up

    Normal crowdfunding caveats apply - though there's a nice discount code in that post, I think.

    Interesting. Kinda what I'm looking for, and it includes memory and 128GB of eMMC. I like that the RAM is included; I would pass on the eMMC, as I prefer to use M.2 storage to avoid that bottleneck.

    So the big thing that holds me back from getting a Gemini Lake box to replace my Vero 4K+ is that I would then NEED to use a v19 Kodi build instead of continuing with v18. This is only an issue for me because of the Python switch. I don't actually use many add-ons, but one is key for me: Artwork Beef. I know v18 and v19 can technically download extended artwork themselves, but I haven't had a chance to see how well that works in the real world. I love my Aeon Nox skin and the eye candy, so this is important to me. Can anyone chime in on their experience?

    The thing with the Intel driver is that it does not allow deep color 4:2:0/4:2:2 2160p50/60 modes. It prefers RGB 8-bit and I'm not sure this will ever change. I think the driver can be hacked to allow 4:2:0, but I didn't test it.

    10-bit SDR and HDR sources don't look very good when downconverted to 8-bit without dithering (e.g. skies almost always look terrible with tons of banding). With dithering enabled in the driver, the banding is not really an issue anymore.

    If it's a choice of 8-bit with or without dithering, then sure, add the dither - it will mask the banding by adding extra noise to the picture. But I would always choose clean 10-bit video carried in 10-bit 4:2:0 or 12-bit 4:2:2 over 8-bit RGB/4:4:4 with dither noise.

    So Intel are basically limiting their Linux drivers to HDR at 2160p30 and below, and only allowing 8-bit output above 30fps at 2160p? That's a crazy limitation. It means that any HDR live TV shot at 2160p50 or 2160p60 isn't viewable on their platform under Linux? (BBC iPlayer 2160p50 live sport in HDR, for instance?)

    When you say it 'prefers RGB 8-bit' - what happens if RGB isn't available as an option? Does it then flip to YCbCr 4:4:4 8-bit? (I'm thinking of the case where you use a custom EDID that says 'YCbCr only'.) I wonder if it's possible to create an EDID that says 4:2:2 and 4:2:0 only at 2160p50/60?

    4:2:0 10-bit YCbCr output would be fine for HDR video content (most of this will be delivered as 4:2:0 after all - that's what's used for streaming and UHD Blu-ray). The only compromise is that a 4K UI might suffer slightly, and 4K artwork and photos would have 1920x1080 chroma resolution (4:2:2 would have 1920x2160 chroma resolution).

    I can understand Intel wanting RGB/4:4:4 to be preferred for desktop use - it avoids the smeary chroma that 4:2:2 and 4:2:0 introduce on very fine coloured picture detail, such as pixel-wide coloured text, which would not be nice on a UI monitor. But you don't really need 10-bit or higher bit depth on a UI display, unless you are grading video or watching Netflix in a window.

    However, as there is no >8-bit HDMI 2.0 mode that supports 2160p50 and above in RGB/4:4:4, you have to accept reduced chroma resolution to get the bit depth needed for clean HDR video output (the bulk of which will be sourced as 4:2:2 or 4:2:0 anyway). I wonder if Intel don't 'get' this issue yet?

    Is this discussion here relevant to this issue [GLK] no signal - with samsung 4k TV - HDMI UHD Color (ENABLED) (#271) · Issues · drm / intel · GitLab ?

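    On the custom EDID idea - before overriding anything it is worth dumping what the display actually advertises. A rough Python helper along these lines (it assumes the edid-decode tool is installed and the usual /sys/class/drm layout; connector names vary) pulls out the 4:2:0 and deep colour related lines:

        # Rough helper: dump what the connected display's EDID advertises, so you can
        # see which 4:2:0 / deep-colour modes the TV claims to support before trying
        # any EDID overrides. Assumes the 'edid-decode' tool is installed and the
        # usual /sys/class/drm layout; connector names (card0-HDMI-A-1 etc.) vary.
        import glob
        import subprocess

        for path in sorted(glob.glob("/sys/class/drm/card*-*/edid")):
            with open(path, "rb") as f:
                edid = f.read()
            if not edid:                      # nothing connected on this connector
                continue
            print(f"=== {path} ({len(edid)} bytes) ===")
            try:
                result = subprocess.run(["edid-decode"], input=edid,
                                        capture_output=True, check=False)
            except FileNotFoundError:
                print("    edid-decode not installed; skipping decode")
                continue
            text = result.stdout.decode(errors="replace")
            # Crude grep for the bits that matter here; drop the filter to see it all.
            for line in text.splitlines():
                if any(key in line for key in ("YCbCr 4:2:0", "Deep Color", "TMDS", "bit")):
                    print("   ", line.strip())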

    When you say it 'prefers RGB 8-bit' - what happens if RGB isn't available as an option? Does it then flip to YCbCr 4:4:4 8-bit? (I'm thinking of the case where you use a custom EDID that says 'YCbCr only'.) I wonder if it's possible to create an EDID that says 4:2:2 and 4:2:0 only at 2160p50/60?

    I don't think RGB can be removed from the EDID; it looks like it is mandatory.

    YCbCr 4:4:4 can be forced with a driver hack but I suppose there is not much use for this mode.

    4:2:2 is not even present in the driver source code, so it can't be forced.

    4:2:0 12-bit can probably be forced with a driver hack, but I'm not sure. The driver assumes that RGB 8-bit is the highest quality mode for 2160p50/60 and uses it instead of 4:2:0 12-bit.

    Is this discussion here relevant to this issue [GLK] no signal - with samsung 4k TV - HDMI UHD Color (ENABLED) (#271) · Issues · drm / intel · GitLab ?

    It appears to be some sort of compatibility issue between Gemini Lake's HDMI 2.0 implementation and some Samsung TVs.

    I wonder if Intel don't 'get' this issue yet?

    HDR on Linux with Intel hardware is not a thing yet. I'm sure things will change.

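    To make the "driver assumes RGB 8-bit is best" point concrete, here is a tiny conceptual Python sketch - not the actual i915 selection code, just an illustration of how the ordering of candidate formats decides what you get at 2160p60 under the 600 MHz HDMI 2.0 limit:

        # Conceptual sketch only - NOT the real i915 code. It illustrates the ordering
        # problem described above: a "prefer RGB, then pick the deepest mode that fits"
        # policy lands on RGB 8-bit at 2160p60, while ordering by bit depth first would
        # land on 4:2:0 12-bit, which is what you'd want for HDR.
        HDMI20_LIMIT_MHZ = 600.0

        # (sampling, bits per component, required TMDS clock in MHz at 2160p60)
        CANDIDATES = [
            ("RGB",   12, 891.0),
            ("RGB",   10, 742.5),
            ("RGB",    8, 594.0),
            ("4:2:0", 12, 445.5),
            ("4:2:0", 10, 371.25),
            ("4:2:0",  8, 297.0),
        ]

        def first_that_fits(ordered):
            """Return the first candidate whose TMDS clock fits under the HDMI 2.0 cap."""
            return next((c for c in ordered if c[2] <= HDMI20_LIMIT_MHZ), None)

        # RGB-first ordering (roughly what this thread says the driver effectively does):
        print("RGB-first pick       :", first_that_fits(CANDIDATES))          # RGB 8-bit

        # Ordering by bit depth first, preferring 4:2:0 at equal depth:
        by_depth = sorted(CANDIDATES, key=lambda c: (-c[1], c[0] != "4:2:0"))
        print("bit-depth-first pick :", first_that_fits(by_depth))            # 4:2:0 12-bit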

  • smp & noggin, I love your conversations even if they go over my head. Just wanted to ask a quick question: as it stands right now, even IF my TV flips to HDR mode with Intel hardware, am I not really getting "proper" HDR? Sorry, but I'm not fully versed in all the 4:2:2 & 4:2:0 and such. I do remember reading somewhere that 10-bit is the minimum I'm supposed to expect for real HDR.

    Just wanted to ask a quick question: as it stands right now, even IF my TV flips to HDR mode with Intel hardware, am I not really getting "proper" HDR?

    We were only talking about the optimal HDMI output for 50/60Hz modes. Don't worry too much about it.

    smp & noggin, I love your conversations even if they go over my head. Just wanted to ask a quick question: as it stands right now, even IF my TV flips to HDR mode with Intel hardware, am I not really getting "proper" HDR? Sorry, but I'm not fully versed in all the 4:2:2 & 4:2:0 and such. I do remember reading somewhere that 10-bit is the minimum I'm supposed to expect for real HDR.

    If you are playing movies that are 2160p24 or 2160p23.976 (i.e. 4K at 24fps) - which covers most movies - then I believe you are getting correct HDR (because you can output RGB 10/12-bit over HDMI at 2160p at 30fps and below).

    The topic of discussion here is 2160p HDR content at 50fps and 59.94/60fps - which is more relevant for live TV (BBC iPlayer 2160p50 10-bit HLG HDR) and the occasional Ang Lee HFR title released at 59.94fps (Gemini Man, Billy Lynn's Long Halftime Walk etc.).

    HDMI 2.0/2.0a doesn't support 10/12-bit RGB output at 2160p50 and above, and it seems Intel's drivers prefer 8-bit RGB over the 10/12-bit 4:2:0 modes - so getting 2160p50/60 10-bit HDR content output correctly is proving problematic.

    At the moment it's not possible to get 2160p50/60 10-bit content output correctly (whether it's HDR or SDR).