HDMI Full v Limited Range issues on RPi 5

  • Hi Everyone,

    I know this would count as a crosspost, which is against the rules. However, I haven't gotten much of a response from the Kodi forum, as it seems quite dead, so I'd kindly ask for permission to also post about my issue here, as I feel the users here would have much better input on it. I'm not totally sure where the issue lies in any event - Kodi, LE, the RPi 5 or my TV - which is why it was posted on the Kodi forums first.


    If a moderator is happy with it I will go ahead and repost (with some edits after some further testing)?

  • Thanks Da Flex!

    Original post with edits follows below:

    I'm running LibreELEC on a Raspberry Pi 5, which is served media from an UNRAID server I have set up. I have tested this on LE 11.0.6 as well as nightlies.

    It's hooked up to a Hisense U6K, which I must say has excellent format support - Dolby Vision, HDR10+ etc. One thing that did stump me was setting the HDMI range. There is a setting in Kodi to use either limited or full range. On the TV, there is also an option to set whether Limited or Full Range is being sent by the source device, as well as an "Auto" setting (which I will ignore for this post since this just seems to default to Limited anyway).

    For clarity: when I say Kodi is in "Limited" mode, I mean that the setting "Use Limited Color Range (16-235)" under Display settings is set to ON (the little slider button is moved to the right). Conversely, when Kodi is in "Full" the setting is OFF / the button is moved to the left.
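
    Side note: the same toggle should also be visible in Kodi's guisettings.xml if you want to check it over SSH. A minimal sketch, assuming the default LibreELEC path and the videoscreen.limitedrange setting id (both are assumptions on my part, so double-check on your own install):

        # Read Kodi's saved "Use limited colour range" value straight from
        # guisettings.xml. Path and setting id are assumptions - adjust if
        # your install differs.
        import xml.etree.ElementTree as ET

        path = "/storage/.kodi/userdata/guisettings.xml"  # default LibreELEC location
        node = ET.parse(path).getroot().find(".//setting[@id='videoscreen.limitedrange']")
        print("Use limited colour range:", None if node is None else node.text)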

    For the longest time I had left the setting as Limited in Kodi and Limited on the TV. As long as they match, it should work - at least that's what I thought based on my experience. Plus, it's a TV, and the setting specifically says to use Limited when using a TV and not a monitor.

    I started noticing that HDR content appeared much darker than it should, to the point that dark / night-time scenes were so dark that you couldn't make out any detail.

    This puzzled me for some time until I switched the TV's HDMI range to Full. This solved the issue almost immediately and was a night and day difference - suddenly I could see all the detail I had been missing. The problem was that normal, non-HDR content now appeared brightened and washed out, so one step forward, one step back.

    So, the variables I have noticed when changing the settings:
    HDR content goes between Dark and Normal (I used a dark scene from the latest episode of Shogun to test this. I have verified its intended brightness against the original Disney+ stream).
    Standard content goes between normal and brightened (I used the opening scene of episode 1 of Mr. Robot, since it has a black opening title card with white text. When it is "brightened" the black turns to grey and there is noticeable unnatural greyness/brightness to all scenes).
    The GUI also changes in "brightness" level, although I think the best way to describe this is that it's as if a layer of grey is added to or taken away from the GUI whenever a setting is changed. I have described this as best I can below.

    I have tested the different combinations of ranges with both types of content, results below:

    TV: Limited
    Kodi: Limited
    GUI is brightened (one layer), HDR content is dark, normal content looks normal.

    TV: Limited
    Kodi: Full
    GUI looks the most natural... deepest blue colors from Estuary. HDR content is dark, normal content is normal.

    TV: Full
    Kodi: Full
    GUI is brightened (one layer), HDR content looks normal, standard content is brightened.

    TV: Full
    Kodi: Limited
    GUI is exceptionally brightened (two layers), HDR content looks normal, standard content is brightened.


    "Use Display HDR capabilities" is set to ON and the PRIME render method is set to Direct to Plane (EGL seems to disable HDR capability completely in my brief tests).


    Any input greatly appreciated!

    The Kodi setting for Limited doesn't do what you think it does, I'm afraid. It's been confusing for years.

    Kodi's Limited Range tick box is only useful when your OS is using Full range output (and would map black to 0 and white to 255) but you want to force Limited Range output (with black at 16 and white at 235), because you're feeding a display that won't accept Full range output (or you want to preserve <16 and >235 content and/or avoid the banding that rescaling can cause). This was a useful thing with some Intel GPUs a long time ago...

    HDMI InfoFrames these days mean this is less useful than it might seem (as InfoFrames are now routinely used to flag Full or Limited range from source to display). If your OS is already set up for Limited output then you get grey blacks and dim whites if you also tick the Limited option within Kodi - just as you will if your TV is in Full range mode, or if your OS is in Full range mode, the TV automatically detects this, and you select Limited in Kodi.
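
    To put rough numbers on the remap itself (a minimal sketch of the standard 8-bit full-to-limited squeeze - nothing Kodi-specific, just the scaling):

        # Standard 8-bit full -> limited remap: black 0 -> 16, white 255 -> 235.
        def full_to_limited(v):
            return round(16 + v * (235 - 16) / 255)

        print(full_to_limited(0), full_to_limited(255))   # 16 235 - a correct limited signal
        # Apply it a second time to a signal that is already limited (e.g. the OS
        # output is limited and the Kodi option is ticked as well) and black and
        # white get squeezed again - the grey blacks and dim whites described above.
        print(full_to_limited(16), full_to_limited(235))  # 30 218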

    For a Raspberry Pi, in my experience with Sony and LG TVs, you should leave the Kodi settings on default and your TV Full/Limited range settings on Automatic if you can (so InfoFrames are followed correctly). If not, I'd use Limited (as my TV HDMI setting) as my starting point - as that's the near-universal standard for HDMI video devices.

    I've never had to alter my Kodi settings or TV settings on a Pi 3/4/5 running LibreELEC to get correct black and white levels. Most HDMI systems default to 'Limited' or 'Video/Broadcast' levels (16-235 / 64-940), as that is what broadcast video production and distribution uses (other than Dolby Vision ICtCp stuff). If you remap Video/Limited to Full you can clip <16 Blacker-than-Black (not usually an issue - though it makes PLUGE tricky to use) and >235 Whiter-than-White (more of an issue, as 100-109% are valid video levels - particularly in broadcast TV use).
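
    The opposite direction is where the clipping just described comes from - again only a rough sketch of the plain 8-bit scaling:

        # Limited -> full expansion: 16 -> 0, 235 -> 255. Anything below 16
        # (blacker-than-black) or above 235 (whiter-than-white) lands outside
        # 0-255 and gets clipped.
        def limited_to_full(v):
            return max(0, min(255, round((v - 16) * 255 / (235 - 16))))

        print(limited_to_full(16), limited_to_full(235))  # 0 255 - nominal black/white
        print(limited_to_full(8), limited_to_full(245))   # 0 255 - BTB and WTW detail lost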

    From your posts above

    TV: Limited
    Kodi: Full (which doesn't actually mean the Pi 5 is using Full range - it just means Kodi isn't compensating for a Full range output to a Limited display...)
    GUI looks the most natural... deepest blue colors from Estuary. HDR content is dark, normal content is normal.

    If the Pi 5 is correctly configured for Limited output automatically at an OS level (which happens pretty much all the time on a modern HDMI-connected display), you shouldn't need to select Limited in Kodi's settings.

    That is what I would expect to be normal. When you say HDR is dark, are you objectively comparing it with the same UHD HDR content via a different player route (i.e. comparing a UHD HDR BD player output vs Kodi playing a UHD BD rip of the same disc on the same display), or just making a subjective comment (or, worse, comparing with an HD SDR version of the same content)?

    It's normal for HDR versions of movies to appear darker in PQ (i.e. HDR10) HDR than the same movie release in HD SDR (say comparing a UHD HDR BD with an HD SDR BD). HDR PQ10 is based on SDR content brightness sitting at 100 nits max (with HDR highlights etc. going above 100 nits), whereas most people watch SDR content at display settings brighter than this (so the SDR content often hits 200 nits or more). So when PQ content keeps the SDR range of an HDR signal at 100 nits, 'it looks dark' is often the comment you hear - people are used to watching SDR content pushed into the HDR range of their displays, and 100 nits is quite dark for normal viewing conditions.

    I guess I'm asking whether you are saying for certain that HDR content is definitely incorrectly replayed - or you just feel it's too dark?


    ***EDIT - It may be best to leave your TV on Automatic rather than fixing it at Full or Limited, as regular SDR content is almost always 16-235 / 64-940 video/limited levels - as is HLG HDR (as used by services like BBC iPlayer). However, PQ/ST.2084/HDR10 stuff may be using 1-1023 / 1-254 full range instead and flagging accordingly using HDMI InfoFrames.***


  • This puzzled me for some time until I switched the TV's HDMI range to Full. This solved the issue almost immediately and was a night and day difference - suddenly I could see all the detail I had been missing. The problem was that normal, non-HDR content now appeared brightened and washed out, so one step forward, one step back.

    Only quoting that part, as most of the rest is not relevant on that matter.

    To keep it simple.

    The TV is not producing the same level of brightness/saturation/color on SDR content when that is displayed with HDR, as it most likely supports only a low level of HDR on SDR content (this mostly happens on LCD/LED screens; OLEDs are far better at getting a good SDR-on-HDR picture).

    So the problem with the washed-out colors is normal and might be lessened by adjusting settings like brightness when it is displayed (but that would also affect HDR, as you change it in that mode).

    You might search the net to see whether people have posted settings guides for that TV that you might like to try.


    BTW, switching the color range to Full was needed, as HDR formats like Dolby Vision only work with the enhanced color space.

    The TV is not producing the same level of brightness/saturation/color on SDR content when that is displayed with HDR, as it most likely supports only a low level of HDR on SDR content (this mostly happens on LCD/LED screens; OLEDs are far better at getting a good SDR-on-HDR picture).

    So the problem with the washed-out colors is normal and might be lessened by adjusting settings like brightness when it is displayed (but that would also affect HDR, as you change it in that mode).

    You might search the net to see whether people have posted settings guides for that TV that you might like to try.


    BTW, switching the color range to Full was needed, as HDR formats like Dolby Vision only work with the enhanced color space.

    Not sure what you are trying to say here. SDR is SDR - sure, you can choose to process it and push it into the HDR range - and many people do. This will make the picture look bright - but it will then mean an SDR signal is displayed far brighter than the SDR portion of an HDR10/PQ signal.

    PQ ST.2084 HDR (i.e. HDR10) is based on an 'absolute' HDR standard that maps specific video levels to absolute pixel light output - so a video level that delivers 100 nits is 100 nits on any ST.2084 PQ (aka HDR10) display, unless that display is not following the ST.2084 PQ curve and is being overridden. Displays differ in their max brightness - and in those cases the PQ curve is tone-mapped to mitigate the max brightness limitations, and obviously there are better and worse performers at black level.

    However, 100 nits is 100 nits (and is represented by 51.9% of full range in a PQ ST.2084 signal) - any display correctly displaying the SDR portion of an HDR signal will deliver this 51.9% signal as a 100 nits light level, whatever the screen tech is, if it is correctly meeting the standard PQ specs.
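
    For anyone wondering where that "roughly half the signal for 100 nits" figure comes from, here's a minimal sketch of the ST.2084 inverse EOTF using the published constants (the exact percentage quoted varies a little with rounding and whether it's measured against full or narrow range):

        # SMPTE ST.2084 (PQ) inverse EOTF: absolute luminance in nits -> signal (0.0-1.0).
        m1 = 2610 / 16384
        m2 = 2523 / 4096 * 128
        c1 = 3424 / 4096
        c2 = 2413 / 4096 * 32
        c3 = 2392 / 4096 * 32

        def pq_signal(nits):
            y = (nits / 10000) ** m1
            return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

        print(round(pq_signal(100), 3))    # ~0.51 - SDR diffuse white sits around half signal
        print(round(pq_signal(10000), 3))  # 1.0   - the top of the PQ range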

    Some TVs will use processing that means they no longer track the PQ/ST.2084 curve correctly to let you make the picture in HDR brighter or darker - some in their 'Cinema' or 'Movie' modes won't.

    I recently switched from a Sony FALD LCD to an LG C3 OLED, with both correctly calibrated, and the subjective performance was similar for both SDR and HDR content in picture-brightness terms when watching the same content - though the OLED clearly did better with blacks, there was no FALD backlight blooming, and the OLED overall delivers a nice clean picture.

    This is a really good primer on HDR PQ and HLG and SDR and explains a lot about why HDR can sometimes appear 'dark'. It's written by a former colleague who is an expert on both display calibration and motion picture colour science.

    Understanding PQ HDR and HLG - www.lightillusion.com

    AIUI HDR10 may use 1-1023 rather than 64-940 (aka 16-235 in 8-bit) level space - which is why leaving a TV on Automatic rather than Full or Limited may make the most sense.


    That is what I would expect to be normal. When you say HDR is dark, are you objectively comparing it with the same UHD HDR content via a different player route (i.e. comparing a UHD HDR BD player output vs Kodi playing a UHD BD rip of the same disc on the same display), or just making a subjective comment (or, worse, comparing with an HD SDR version of the same content)?

    It's normal for HDR versions of movies to appear darker in PQ (i.e. HDR10) HDR than the same movie release in HD SDR (say comparing a UHD HDR BD with an HD SDR BD). HDR PQ10 is based on SDR content brightness sitting at 100 nits max (with HDR highlights etc. going above 100 nits), whereas most people watch SDR content at display settings brighter than this (so the SDR content often hits 200 nits or more). So when PQ content keeps the SDR range of an HDR signal at 100 nits, 'it looks dark' is often the comment you hear - people are used to watching SDR content pushed into the HDR range of their displays, and 100 nits is quite dark for normal viewing conditions.

    I guess I'm asking whether you are saying for certain that HDR content is definitely incorrectly replayed - or you just feel it's too dark?


    Thanks for your detailed response. It's quite interesting to learn that the setting isn't really changing anything in the signalling over HDMI, just compensating for devices that would "discard" any values below 16 and above 235 (which I presume was an issue with some HDTVs many moons ago...).


    To answer your question above: Yes, I have compared versus a reference. My reference of choice is one of the latest episodes of Shogun. It was actually what caused me to investigate the issue in the first place.

    In Episode 6 there is a darkened scene with a lot of candlelight - in fact, a lot of the show is shot in these settings. If the TV is set to Limited, this scene is barely recognizable except for the brightest of highlights, and even those look somewhat dull. I originally thought the show was trying to be extremely artistic and moody, so I compared against the Disney+ stream. Sure enough, the Disney+ stream had loads of detail in the backgrounds which was being "crushed" when viewed through Kodi on the RPi. Thereafter I began troubleshooting and found the only setting which seemed to fix the issue - switching to Full range.

    I then placed the same source file being played by Kodi onto a USB drive and inserted it directly into the TV set. This yielded similar results to the Disney+ stream and obviously bypasses the range setting issue, since there is no HDMI source device as a variable.

    I can try to post pictures comparing the two later for reference.


    Interestingly enough, the TV has different image profiles depending on the content it is being fed. It defaults to a "normal" profile (with pre-baked profiles in the settings such as Standard, Dynamic, Cinema Mode etc.), but as soon as HDR content is fed to it, it switches into an "HDR" profile (with pre-baked options such as HDR Standard, HDR Dynamic etc.).


    Feed it Dolby Vision, and the profile switches to "Dolby Vision" (with corresponding Dolby Vision Standard, Dolby Vision Bright etc.). It even has separate ones for HDR10+ and HLG.


    And now I am pleased to say that I have found the solution. Although, rather embarrassingly, it's a fault of the TV itself and not LE or Kodi.


    I had the TV set to "Dot to Dot" mode under Aspect Ratio, which as I understand it is this TV's terminology for "pixel mapping", ignoring any aspect ratio corrections by the TV itself. Switching this back to 16:9 fixed the issue immediately. Taking noggin's advice, leaving Kodi on "Full" (i.e. Use Limited Colour Range set to OFF) and running a full reset of the display profiles for SDR content fixed the issue. I began reconstructing my original settings one by one (but leaving the TV in "Limited" mode) and finally came to the above realisation.


    I can't for the life of me understand why this would be the case. However, it didn't take me long at all to find that I am not the only one who has experienced such an issue:

    https://www.avsforum.com/threads/sharps-dot-by-dot-pixel-mapping-is-very-faded-poor-quality-overscan-issue.2908114/

    Thank you all for your help in the above. Maybe someone finds this thread helpful in future.

    I'm off to kick myself now.

  • Yes - one thing to be aware of and which I should have mentioned is that on many TVs (including my LG OLED and previous Sony FALD LCD) SDR, HDR10/HDR10+/HLG and Dolby Vision sources will often trigger separate display settings for each type of source - plus you wouldn't expect Dolby Vision and HDR10 versions of the same drama to look identical. (The Pi 5 doesn't do Dolby Vision). You often have far less control of settings on Dolby Vision content for instance.

    There are also some dodgy, pirated, 'Hybrid' versions of dramas kicking around apparently. These take a downloaded HDR10 stream from an OTT streaming service, but then graft on Dolby Vision Metadata RPUs from a separate Dolby Vision native download of the same show from that service, to create a file that plays back as HDR10 on non-DV-compatible devices, but will trigger Dolby Vision logos on some Dolby Vision replay solutions (though the RPUs aren't meant for the YCbCr HDR10 stream and instead designed for an ICtCp encoded stream...)

  • Yes - one thing to be aware of and which I should have mentioned is that on many TVs (including my LG OLED and previous Sony FALD LCD) SDR, HDR10/HDR10+/HLG and Dolby Vision sources will often trigger separate display settings for each type of source - plus you wouldn't expect Dolby Vision and HDR10 versions of the same drama to look identical. (The Pi 5 doesn't do Dolby Vision). You often have far less control of settings on Dolby Vision content for instance.

    I am actually glad that there are separate profiles for all of them. The only problem is calibration, by the looks of things - I can't seem to find any DV, HDR10+ or HLG calibration / test patterns, only SDR and HDR10 (apparently CalMan can do this but requires a colorimeter). Probably doesn't matter on a lower-midrange U6K anyway.

    There are also some dodgy, pirated, 'Hybrid' versions of dramas kicking around apparently. These take a downloaded HDR10 stream from an OTT streaming service, but then graft on Dolby Vision Metadata RPUs from a separate Dolby Vision native download of the same show from that service, to create a file that plays back as HDR10 on non-DV-compatible devices, but will trigger Dolby Vision logos on some Dolby Vision replay solutions (though the RPUs aren't meant for the YCbCr HDR10 stream and instead designed for an ICtCp encoded stream...)

    I've seen these too. The only reason I can assume this is done is to avoid the "Green and Purple" issue when playing DV Profile 5 files on non-DV hardware, while still getting DV on licensed hardware. This exact issue crops up with the Pi as well, since it's not licensed for DV.

    I don't think there's a clean solution to this at all unless you run Kodi natively on Android TV on a licensed box (I think the latest Amlogic chips are all DV licensed), but this means dealing with the rubbish software that comes with those media players, and of course no LE.

    The short, sad story is that Dolby Vision was seemingly designed for streaming and maybe 4K BD, where everything is tightly controlled and locked down software-wise. There will never be a solution for home media types, since we are a very small, more hobbyist market and Dolby couldn't care less about us.

  • DeShizz To my knowledge, there is one box with a DV license, and CoreELEC works on it. Go to their forum for details.

    Interesting that you posted this when you did - I notice they have released their Omega stable build today, which seems to add a few more options:

    "Dolby Vision will be supported on Amlogic-ne and Amlogic-ng with Kodi Omega on certain Amlogic devices like Homatics/Dune R 4K Plus, RockTek G2 or Nokia 8010. Other devices on Amlogic-ng like Ugoos AM6+ or Minix NEO U22-XJ (Max) equiped with S922X-J will support Dolby Vision with profile 7 FEL as well."

    (Quoting from their release notes)

    Seems like DV is a high-priority feature set for CoreELEC. But their target market seems to be limited to Amlogic only, so I guess it makes sense for them, as they are not putting development effort into other platforms.


    I'll stick with the Pi 5 for now; my country's choice of electronics is, shall we say, very limited, so the above devices are probably import items at this point.

    If only the Pi Foundation had included HDR10+ on the Pi 5, which seems to do 99% of the same thing as DV. I understand why they didn't, though, since that probably means a whole lot of extra R&D and a more expensive SoC.

    I've seen these too. The only reason I can assume this is done is to avoid the "Green and Purple" issue when playing DV Profile 5 files on non-DV hardware, while still getting DV on licensed hardware. This exact issue crops up with the Pi as well, since it's not licensed for DV.

    Yes - though in effect you're watching a YCbCr PQ10 (i.e. HDR10/10+ but ignoring the metadata) stream with DV metadata RPUs from an ICtCp DV Profile 5 stream grafted in to create a fake hybrid DV stream (like a UHD BD with the MEL/FEL ignored).

    There's no guarantee that the RPUs are correct for the PQ10 output stream (they are designed for a render in a different colour space - and there's no guarantee the HDR10/HDR10+ render will be changed correctly by metadata designed for a different output colour space) - so although the Dolby Vision light might come on - there's no guarantee you're seeing any improvement over the static HDR10 or dynamic HDR10+ metadata - and in fact you might be seeing something worse - but hey, the DV logo has appeared...


    Interesting that you posted this when you did - I notice they have released their Omega stable build today, which seems to add a few more options:

    "Dolby Vision will be supported on Amlogic-ne and Amlogic-ng with Kodi Omega on certain Amlogic devices like Homatics/Dune R 4K Plus, RockTek G2 or Nokia 8010. Other devices on Amlogic-ng like Ugoos AM6+ or Minix NEO U22-XJ (Max) equiped with S922X-J will support Dolby Vision with profile 7 FEL as well."

    (Quoting from their release notes)

    Seems like DV is a high-priority feature set for CoreELEC. But their target market seems to be limited to Amlogic only, so I guess it makes sense for them, as they are not putting development effort into other platforms.

    CoreELEC's DV support (at least in the -ne branch) requires an Android box and an Android OS that already has DV support - as it uses the Android DV code in the Android install (which has to remain on the eMMC storage), called on-the-fly from the CoreELEC code. CoreELEC itself doesn't do the DV decoding - it relies on the Android code (which it doesn't redistribute).

    The two builds of CoreELEC for the two supported platforms have differing support for dual-layer decoding - the -ne is single layer, the -ng is dual layer, and they run on different platforms. This limitation is apparently an Amlogic SDK thing - as the dual-layer decode on the devices supported by the -ne platform is currently broken?
