Posts by noggin

    What is the advantage of bitstreaming the audio vs. sending the decoded LPCM to the AVR? Just trying to understand why people care so much about audio passthrough. I'd like to hear a valid technical reason why bitstreaming is supposed to be better than decoding to LPCM. Since Kodi can decode all known HBR audio formats and send LPCM 7.1 over HDMI, I don't see a single good reason why anyone should bother with passthrough.

    Two reasons mainly :

    1. Spatial audio like Dolby TrueHD with Atmos and DTS:X can't be decoded to PCM without losing the height information - they need to be bitstreamed with passthrough so the extensions can be decoded by an AVR alongside the spatial metadata. If you decode in Kodi you end up with just straight 5.1 or 7.1 - rather than the 5.1.2, 5.1.4, 7.1.2, 7.1.4, 9.1.2 etc. you'd get with a receiver fed a bitstream, which contains additional audio content, additional object metadata etc.

    2. When you bitstream even non-spatial audio, you still pass through additional metadata alongside the audio - containing information like Dialogue Normalisation values, Line/RF mode compression, Centre and Surround downmix levels, Lt/Rt vs Lo/Ro downmix, original mastering information etc. etc. These all provide additional information that can inform processing in the AVR that is just not present with PCM 5.1/7.1. If you output PCM 5.1/7.1 the receiver doesn't receive this metadata. In some situations this may not be an issue - but if you want to do any dynamic range processing, any down- or up-mixing etc., or want to follow the Dolby spec properly, then PCM won't help.
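    A rough sketch of how one of those metadata fields - Dialogue Normalisation - gets used may help. This is a minimal Python illustration (the function name is mine, not from any real decoder API); it assumes the AC-3 convention where dialnorm is carried as a value from -1 to -31 dBFS and the decoder attenuates so dialogue lands at a -31 dBFS reference level:

    ```python
    def dialnorm_attenuation_db(dialnorm: int) -> float:
        """Attenuation an AC-3 decoder applies so dialogue sits at -31 dBFS.

        dialnorm is the indicated dialogue level in dBFS (-1 to -31);
        content already at -31 dBFS needs no attenuation at all.
        """
        if not -31 <= dialnorm <= -1:
            raise ValueError("dialnorm is carried as -1..-31 dBFS")
        return float(31 + dialnorm)

    print(dialnorm_attenuation_db(-27))  # 4.0 dB of attenuation
    print(dialnorm_attenuation_db(-31))  # 0.0 - no attenuation
    ```

    Decode to plain LPCM in the player and this value never reaches the AVR, so level matching between sources is lost.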

    dolby-metadata-guide.pdf

    (The above talks about both Dolby E and Dolby Digital - but gives a good overview of the use of metadata)

    In some ways you can think of some of the DD metadata as a bit like the HDR10 metadata that is used with HDR video - it provides the output device with more information than just the picture or audio samples that let output devices optimise things based on information from the person who mastered that content.

    Ok, that's sad. So, if not the RPi4, could any other box running LibreElec or CoreElec do tone-mapping better? Lots of HDR stuff is out there and I don't want to buy an HDR display soon.

    You will almost universally find that a 1080p Rec 709 SDR movie will look nicer played back on a 4K SDR display than a 4k HDR Rec 2020 movie tone mapped to 4K SDR.

    HDR->SDR tone mapping is essentially a lossy process, and that - combined with the decisions taken when grading content for an HDR display - usually means the end result is sub-optimal.

    If your display is also Rec 709 only and can't cope with Rec 2020 wide colour gamut - you also have to contend with tone mapping to reduce the colour gamut of the Rec 2020 source.

    Tone-mapping on replay on a consumer device is an automatic process, whereas the grading process that generates the Rec 709 SDR 1080p version of a movie, even if starting from a UHD Rec 2020 HDR master, will have a degree of optimisation and care taken over it (and information is available to the conversion process that isn't available to an end consumer). (DV could - in theory - provide more optimisation - but I don't know how many DV playback devices do anything special for SDR displays)

    The other hdr stuff I watch is mostly dolby vision on Netflix or hdr movies on YouTube.

    Both of these look much brighter, especially dolby vision.

    Tonight we're going to watch a movie, and then the living room will also be much darker.

    There should be no significant difference in brightness watching the same content on Netflix in HDR10 or DolbyVision - the DV version should optimise better for your display and may display some scenes subtly differently to the HDR10 version - but the overall brightness should be very similar. (The DV metadata carries information scene-by-scene (or shot-by-shot) to assist your TV in tone mapping stuff it can't display correctly, whereas HDR10 stuff just has basic metadata for the whole show or movie)

    If you are saying one series you have watched in HDR10 doesn't look as bright as a different series you have watched in DolbyVision - well that's comparing apples with oranges (as the two series will have been graded separately with different grading decisions taken about SDR content vs HDR content in the HDR10 and DV grade)

    HDR10 and DV are conventionally graded to keep the SDR elements in the 100nit range, only going over 100nits for highlight detail, speculars etc. that in SDR would be clipped to peak (or have to be heavily compressed down to retain detail). Some people grading content will push more conventional SDR stuff into the HDR range to make it 'look HDR' - Amazon's 'The Man In The High Castle' was a real offender in that regard ISTR. However most shows keep the HDR range just for 'HDR' stuff.

    One difference between DV and HDR10 is that many TVs allow you to adjust far more picture settings when displaying HDR10 than they do with DV (with DV taking more 'control' over your display settings when displaying DV content)


    IIRC tone mapping is only supported in Kodi for Windows (via d3d shaders) and Android (via the mediacodec framework, if implemented by a device). I've read reports that it works fine on the Nvidia Shield, not sure about other devices.

    so long,

    Hias

    I think AMLogic devices in CoreElec have some degree of Tonemapping for HDR->SDR conversion too, as well as HLG->HDR10 conversion.

    Do you have the same video on an original source - comparing an HDR10 UHD Blu-ray played on a UHD Blu-ray player, compared with the same disc ripped as a file (without remastering/transcoding) and played back on the Pi 4B would be the obvious route.

    What I can say is that the video levels leaving the Pi 4B are what is expected within the standard, and the EOTF is being correctly flagged.

    I haven't checked what the MaxCLL and MaxFALL and mastering metadata situation is - but I wouldn't expect that to change the replay level (though it may change how highlights are treated)

    (As I've said above - HDR content is often accused of being too dark when it follows the official HDR10 specs, largely because the specs keep the SDR range at 0-100nits, which is unrealistically low for domestic settings. The reality is that people watch SDR content at far brighter levels than the spec assumes, meaning they watch SDR pushed well into the HDR10 HDR range of >100 nits)

    Properly mastered SDR will keep almost all content but highlights and speculars in the 0-100nit range - so HDR often appears dark as a result (particularly HDR10 content that has a set-in-stone video level to light level mapping, unlike HLG)

    Working for me now, my Sony 65XE9305 switches to HDR - the only thing is that in HDR the overall image looks pretty dark.

    Is there a way to up the brightness in libreelec/kodi somewhere in the settings?

    Have you watched the same content in HDR via alternate sources such as a UHD HDR Blu-ray player? Is that brighter?

    If not - then chances are you're just seeing the normally reported issue with HDR10 from people who have HDR TVs but normally watch SDR on them.

    Most people with HDR displays are watching SDR content with the SDR peak brightness pushed well into the HDR range of brightness levels. The SDR range in HDR10 (and other specs) is officially 0-100nits (i.e. anything over 100nits is reserved for HDR speculars, detail in bright areas etc.). However this feels really dim to many people watching in regular light level rooms, and the reality is that most people watching SDR will have their peak SDR well over 100nits on an HDR TV.

    As a result HDR stuff (which will keep most of the detail in the SDR range of 0-100nits) often looks a lot darker than people are expecting (particularly as many people think HDR = brighter picture, rather than more detail in highlights)

    This is also somewhat complicated by HDR10 officially using PQ (Perceptual Quantisation), where the ST.2084 EOTF (video level -> display light level conversion) according to the standard is fixed to a 1:1 mapping - rather than relative (like SDR and HLG HDR) - so in theory you shouldn't be able to make an HDR picture brighter or darker with user controls (as that stops it being properly compliant with the HDR10 spec). (*)
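    For reference, that fixed code-value-to-light mapping can be sketched directly from the constants published in the ST.2084 spec - a minimal Python version of the PQ EOTF (function name mine), turning a normalised signal into absolute display light in nits:

    ```python
    # ST.2084 (PQ) constants, as defined in the spec
    M1 = 2610 / 16384        # 0.1593017578125
    M2 = 2523 / 4096 * 128   # 78.84375
    C1 = 3424 / 4096         # 0.8359375
    C2 = 2413 / 4096 * 32    # 18.8515625
    C3 = 2392 / 4096 * 32    # 18.6875

    def pq_eotf(signal: float) -> float:
        """Map a normalised PQ signal (0.0-1.0) to absolute light in nits."""
        e = signal ** (1 / M2)
        y = max(e - C1, 0.0) / (C2 - C3 * e)
        return 10000.0 * y ** (1 / M1)

    print(round(pq_eotf(1.0)))    # 10000 nits - the absolute PQ peak
    print(round(pq_eotf(0.508)))  # ~100 nits - roughly the top of the SDR range
    ```

    Note the 100nit SDR ceiling sits at only around half the code range - which is why SDR-range material in a spec-compliant PQ presentation looks dim in a bright room.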

    Of course you will often find this can be overridden on many TVs - though it does mean that you are then likely to be clipping/squeezing more and more HDR detail (reducing the benefits of the HDR source), as the brighter you make the SDR picture, the less dynamic range you have left for the HDR elements.

    Dolby Vision modes on TVs often go one step further and specifically stop you altering the display brightness levels at all.

    (*) Arguably this is why HLG HDR makes a tonne more sense for domestic viewing (particularly as it has ambient light compensation built into the standard - rather than bolted on as an afterthought like Dolby Vision IQ)

    Changing the brightness/contrast of an image should really be a display function not a Kodi function?

    Awesome, this is really good news! Huge thanks for going through all the effort and checking this!

    so long,

    Hias

    No problem. I'm not a Resolve Expert - but things do look good.

    Next step - 10-bit RGB (2160p30 and below) and/or YCbCr 4:2:2 12-bit so we can have HDR (and SDR 10-bit) at all frame rates :)

    (Interesting that some OTT providers are now using h.265 10-bit SDR for some content - SVT Play in Sweden are - as a route to higher quality SDR with far less banding than 8-bit SDR. 10-bit isn't restricted to HDR these days - it has benefits for SDR too )

    Update - just compared the captured bars from the Pi 4B, playing back an h.265 conversion of the source file from the EBU, along with the native bars from the EBU test file in Resolve and they appear identical. I think this suggests that things are good.

    I transcoded the EBU v210 source to h.265 using libx265 with the following ffmpeg command, played it on the Pi 4B and captured the result in RGB 10-bit uncompressed :

    Code
    ffmpeg -i "EBU_HDR_COLOUR_BARS_2160p_v210.mov" -filter_complex loop=loop=20:size=100 -codec:v libx265 -crf 15 -pix_fmt yuv420p10le -color_primaries bt2020 -colorspace bt2020_ncl -color_trc arib-std-b67 "EBU_HDR_COLOUR_BARS_2160p_v210.2020hlg.mov"

    (You have to tell ffmpeg the colour space and EOTF to flag on the output - but those flags don't process the video, they just add metadata AIUI)

    OK - for some reason I had to capture in 10-bit RGB Uncompressed rather than 10-bit YCbCr... Anyway - first glance suggests that the 75% Rec 2020 bars are in the right place on Da Vinci Resolve's vector scope (i.e. in the 75% RGBYCM squares). I'm not a DaVinci Resolve expert by any means - but with a Rec 2020 timeline, and a capture that is correctly flagged as Rec 2020 in Media Info, then I think this is looking good.

    On the luminance ramp below the 4 sets of colour bars you can clearly see truncation to 8-bits from the 10-bit source.

    Hmm - may need to check out my capture solution. I can capture 2160p25 Rec 709 Kodi menus, but the minute the Pi 4B goes into Rec 2020 HLG mode I can see an output, but Media Express (Black Magic recording application) doesn't work properly and just records audio. (The TV I'm feeding via an HDMI splitter still goes into HDR Rec 2020 mode). Will check my software version. I've captured 2160p50 Rec 2020 HLG using this set-up before - so I know it will capture and record Rec 2020 HLG stuff.

    My HD Fury Vertex won't do that (as the RGB/YCbCr conversion matrix isn't an HDMI-signalled thing, as downstream devices don't need to know it) - but I MAY be able to grab the same Rec 2020 content via two player routes (Pi 4B vs Sony UHD BD player or AMLogic CoreElec box etc.) using a Black Magic 4K HDMI capture card if I get a chance, and then compare the results in Davinci Resolve Studio (Vector Scope images and pixel values should both be testable there).

    I'll see if I've got Rec 2020 colour bars as a file on one of my test file archives.

    ** EDIT - The EBU have their nice new colour bars in 4:2:2 v210 10-bit HLG Rec 2020 format here : EBU Technology & Innovation - Colour Bars for use in the Production of Hybrid-Log Gamma (HDR) UHDTV

    And they document the YCbCr and RGB pixel values they should hit in the various areas in the PDF - in 10-bit (so divide by 4 for the 8-bit values)
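    That divide-by-4 relationship is just dropping the two least-significant bits - a trivial sketch (function name mine):

    ```python
    def ten_to_eight_bit(code: int) -> int:
        """Truncate a 10-bit video code value to 8 bits (>> 2, i.e. divide by 4)."""
        return code >> 2

    # 10-bit narrow-range black and white land on their 8-bit equivalents:
    print(ten_to_eight_bit(64))   # 16  (black)
    print(ten_to_eight_bit(940))  # 235 (white)
    ```

    This is also the truncation the Pi 4B output currently performs, which is what makes banding visible on smooth luminance ramps.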

    These bars are also good for checking Rec 709 SDR tone-maps using both Scene Light and Display Light conversions **

    (However BlackMagic HDMI captures from an RGB source may differ from those in YCbCr - they don't have the best reputation in this regard - and my other player may well be YCbCr native output - which is more usual for video players)

    Sorry, I should have avoided the term "whitepoint" in this context. What I meant was what you mentioned later: due to the wrong matrix the RGB values won't be quite correct, colours will be skewed, and white won't be white.

    4kp50/60 output isn't implemented in this build, this is still in the queue (we tested an early patch a few weeks ago but it's not 100% there yet).

    4kp50/60 HEVC decoding is more performant now, so 1080p50/60 output of such files will be a lot better (previously the decoder couldn't keep up resulting in a/v sync drifting away). These optimizations are still WIP but testing showed it's already mostly working fine.

    so long,

    Hias

    Haven't tested 2160p50 HEVC yet - but ~45Mb/s 1080p50 HEVC SDR 10-bit files are decoding and playing back very nicely (albeit in 8-bit)

    The build also contains current WIP HEVC optimizations and latest kodi Matrix changes. High-bitrate and 50/60fps HEVC videos should perform better now but beware, there might still be bugs :)

    Hias

    Is 2160p50/59.94/60 output now supported in these builds? Or is this note about high bitrate & 50/60fps content just for output at 1080p50/59.94/60 and below?

    Hi,

    Thank you for the release,

    It doesn't seem to work for me - I still see RGB 24-bit on my AVR.

    I have the following message :

    As HiassofT said - the output is still 8-bit (so will be RGB 24 bit if your AVR reports that) not 10- or 12-bit.

    However it is now correctly flagged with Rec 2020 colour gamut, rather than Rec 709 :

    "DEBUG <general>: CVideoLayerBridgeDRMPRIME::Configure - setting connector colorspace to BT2020_RGB"

    So the colour primaries flagged to your display are now correct, as well as an HDR EOTF also being flagged (so for HDR10 content the connected display will now go into Rec 2020 AND HDR mode, using a PQ/ST.2084 EOTF - i.e. the right wide colour gamut and the right dynamic range)

    What isn't yet happening is 10-bit or 12-bit output (so the 10-bit HDR HEVC video is truncated to 8-bits, losing the two least-significant bits - potentially causing banding) and also the RGB<->YCbCr matrix being used is likely to be that for Rec 709 rather than Rec 2020 (the two standards use different matrices to convert between the RGB that Kodi uses - and currently outputs - and the YCbCr video carried in the HEVC codec)

    Just tested - 2160p25 10-bit HEVC Rec 2020 HLG content is output 2160p25 RGB 8-bit Rec2020 flagged and with an HLG ARIB B67 EOTF. Progress!

    Not sure I get the 'whitepoint' stuff - the whitepoint metadata (which is I think the CIE x and y co-ordinates of the mastering whitepoint used in the grading process) carried in HDMI InfoFrames (and in the HEVC video) is separate to the RGB<->YCbCr matrix stuff ? (This is the HDMI Infoframe stuff that is carried along with stuff like MaxCLL and MaxFALL mastering data in HDR10/PQ stuff)

    However in the Rec 2020 YCbCr<->RGB matrix there is more Red and less Green and less Blue contribution to the Y signal - so if you use a Rec 709 matrix instead I think you'll get a picture skewed less red and more green/blue than it should be?

    HDR doesn't work for anyone yet (RPi4 LibreElec), they just think it does. I thought so too before comparing clips with lots of red using my TV's built-in player vs LibreElec.

    Hias told us to wait some more.

    That's not 100% correct. HDR works - but Rec 2020 doesn't.

    HDR works OK - the EOTF (Electro Optical Transfer Function) is correctly signalled over HDMI switching your TV from SDR to either PQ/ST.2084 or HLG/ARIB-B67 HDR display modes. This is what makes the video use the HDR range of the display rather than the SDR range, and it is flagged correctly.

    HOWEVER - most HDR content is also in Rec 2020 colour gamut (i.e. Wide Colour Gamut) rather than the Rec 709 colour gamut used for HD SDR (and which is also similar to the colour gamut used for SD video too). The colour gamut defines the actual colours that the Red, Green and Blue primary colours used in a video signal are - and Rec 2020 effectively defines 'redder reds', 'greener greens' etc.

    The Pi 4B HDR implementation currently DOESN'T send the correct Colour Gamut flags for Rec 2020 content - so you get the correct HDR EOTF (and your TV switches into HDR mode) but the video is treated by most TVs as Rec 709 (so appears to have far less vibrant colours). Some of us can force our TVs into Rec 2020 mode using a menu option - which gets much closer to showing HDR10 correctly.

    (It may be that some models of TV automatically switch to Rec 2020 when they detect an HDR EOTF - which may be why some people are happier than others)

    My Sony is not switching ... unless I tell it to do so.

    <snip>

    This leads, in my case, to normal videos with colour space 'auto' and HDR videos with bt.2020.

    Yes - this is a known issue.

    At the moment the Pi 4B builds correctly send the HDR EOTF (HLG/ARIB B-67 or HDR10/ST.2084) signal to switch the TV from SDR mode into the right HDR mode.

    However the other requirement for most HDR material (and all HDR10 material) is to switch from Rec 709 to Rec 2020 colour gamut (which is the Wide Colour Gamut used for all HDR10 and most other HDR content). The Rec 2020 colour gamut can be manually forced on your Sony TV, as it can be on mine.

    That gets the colour into the right gamut for HDR10.

    The two other issues are that the output is still RGB 8-bit not 10-bit - so banding is more likely to be an issue - and if the gamut is flagged as Rec 709 then there is no guarantee that the YCbCr->RGB colour matrices used to generate the RGB won't also still be the Rec 709 ones (which are slightly different to the Rec 2020 ones)

    Rec 709 : Y = 0.2126 x R + 0.7152 x G + 0.0722 x B

    Rec 2020 : Y = 0.2627 x R + 0.6780 x G + 0.0593 x B

    UHD TVs don't automatically assume all UHD inputs are Rec 2020 though - as some (most?) SDR UHD content is often still in Rec 709 gamut (the same gamut used for HD)

    (For reference the SD Rec 601 RGB->YCbCr matrix is also different - Y = 0.299 x R + 0.587 x G + 0.114 x B)
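    A quick way to see the skew a wrong matrix causes is to compare the luma each standard assigns to the same pixel - a small Python sketch using the three sets of coefficients above (the names and structure are mine):

    ```python
    # Luma coefficients (Kr, Kg, Kb) for the three standards quoted above
    LUMA = {
        "rec601":  (0.299,  0.587,  0.114),
        "rec709":  (0.2126, 0.7152, 0.0722),
        "rec2020": (0.2627, 0.6780, 0.0593),
    }

    def luma(rgb, standard):
        """Y contribution of an RGB triplet under a given standard's matrix."""
        kr, kg, kb = LUMA[standard]
        r, g, b = rgb
        return kr * r + kg * g + kb * b

    # A pure red pixel contributes noticeably more to Y under Rec 2020 than
    # Rec 709 - so decoding Rec 2020 YCbCr with the Rec 709 matrix skews
    # the reds (as seen in the "clips with lots of red" comparison above):
    print(luma((1.0, 0.0, 0.0), "rec2020"))  # 0.2627
    print(luma((1.0, 0.0, 0.0), "rec709"))   # 0.2126
    ```

    In each standard the three coefficients sum to 1, so full-scale white always maps to full-scale Y - it's the colour balance within that sum that shifts.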

    That link was great. I stumbled across it yesterday when trying to fix HDR on my AVR X4200. The step which worked for me was:

    "Disable the video conversion in the AVR."

    Something as simple as that (default) setting was killing HDR, resulting in washed-out colors.

    Now to see if my new RPi 4 with LE can reliably play 4K content for entire movies (my S912 with CoreElec struggled).

    Cheers, Carl

    I believe you may still be getting sub-par colours in the current Pi 4 HDR builds - as I don't believe Rec 2020 gamut (which is the gamut HDR10 and the 'in the wild' HLG uses) is being flagged, so whilst the colours look less washed-out than when the HDR EOTF was being blocked (and your PQ ST.2084 content was being displayed in SDR), the Pi 4 still outputs Rec 2020 as Rec 709.

    Some TVs may switch to Rec 2020 when they detect an HDR EOTF I guess (though that's a bit of an 'out of spec' guess), but certainly my Sony TV will display Rec 709 gamut stuff in Rec 709 with an HDR EOTF (which means the colours will be less vibrant than if correctly flagged as Rec 2020 - which I can manually force and which brings things closer to what they should look like) (*)

    (*) The YCbCr->RGB matrices used for Rec 709 and Rec 2020 are slightly different - but the differences between them will be harder to spot. (The differences between Rec 601 SD and Rec 709 HD are more marked ISTR)

    I was talking about 50/60Hz 4K modes. 12-bit works with 2160p/24/30Hz.

    That's good progress - ISTR that in the earlier days it was 8-bit + dither for <50Hz 2160p modes too. I guess this is with RGB or YCbCr 4:4:4? (And not 4:2:2 ?)

    For 2160p50 and above, if 4:2:2 isn't supported, then 4:2:0 will be the only HDMI 2.0 option for >8 bit output I guess (as 4:4:4/YCbCr is 8-bit only at 2160p50 and above in the specs - you have to use 4:2:0 or 4:2:2 for >8-bit output at 2160p50 and above)