Information Regarding Linux on Rockchip SoCs

  • Works now again; WordPress acts weird when you accidentally enter properly formatted links.

    For God's sake, I started cursing Google / Firefox / Safari / browsers and my ISP.

    Thank you. CvH

    I got 3 different URLs for the same Rock64 image link:


    libreelec-rk3328.arm-8.90.013-rock64.img.gz


    libreelec-rk3328.arm-8.90.013


    libreelec-rk3328.arm-9.0-nightly-20190201-974f4cb-rock64.tar

  • Just tested the LibreELEC 9.0.0 Rock64 release and can confirm the issues I had with 50i h.264 content have been solved so far.


    h.264 1080i25 (aka 50i) separate field (early BBC HD Blu-ray releases) and MBAFF (more recent BBC HD Blu-rays and live TV) all seem to play OK.


    50Hz native content is being deinterlaced correctly without field dominance issues.


    (DTS HD MA bitstreaming is broken with my Denon AVR, but DTS core and 5.1 PCM are OK. Interestingly, the DD and DTS bitstreams are being reported as PCM 2.0, not Bitstream, via my HD Fury Vertex, so I wonder if not all flags are being set correctly. The Fury won't analyse the audio content - just the metadata.)


    HDR stuff - ST.2084 PQ stuff flags the EOTF but doesn't flag Max/Average light level metadata (I think this was correctly passed in a previous release?). HLG stuff is correctly flagged with an HLG EOTF (which makes Rockchip the only Kodi devices that do this, I think).

  • HDR stuff - ST.2084 PQ stuff flags the EOTF but doesn't flag Max/Average light level metadata

    That was the next question I was going to ask you.


    In user-friendly English, can you tell us what a normal 4K HDR user would see visually with LE Rockchip at the moment?

    i.e. colour, brightness - what more than Max/Average light level may be missing?


    Is it the same as a Vero 4K or an S912 with an HDR-supporting kernel, for example?

  • ?? From where?

    On the download page there is only LibreELEC-RK3328.arm-8.90.013-rock64.img.gz.

    The Rockchip – LibreELEC page is where the issue was.

    Now I can't reproduce it. Forget it.

    I had the nightlies link bookmarked, so I see where they are now.

    Index of / works fine.

  • Interestingly, the DD and DTS bitstreams are being reported as PCM 2.0, not Bitstream, via my HD Fury Vertex, so I wonder if not all flags are being set correctly.

    This is correct; only LPCM is "supported" correctly in the kernel code at the moment. When Kodi sends an NL-PCM bitstream it is flagged as an LPCM 2.0 16-bit 48 kHz stream. On my TV/AVR this usually gets treated as NL-PCM when using 24p mode, and as static noise in [email protected] mode.
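
    For illustration only (this is not the actual Rockchip BSP code, just a sketch using the mainline HDMI infoframe helpers): a driver that unconditionally describes its audio output like the snippet below will announce any IEC 61937 (NL-PCM) bitstream to the sink as plain 2-channel LPCM, which matches what the Vertex reports.

    Code
    #include <linux/hdmi.h>

    /* Sketch: hard-code the HDMI Audio InfoFrame as 2ch 16-bit 48 kHz PCM,
     * independent of what the audio stream actually carries. */
    static void fill_audio_infoframe(struct hdmi_audio_infoframe *frame)
    {
            hdmi_audio_infoframe_init(frame);

            frame->channels         = 2;                                  /* CC: 2 channels */
            frame->coding_type      = HDMI_AUDIO_CODING_TYPE_PCM;         /* CT: LPCM       */
            frame->sample_size      = HDMI_AUDIO_SAMPLE_SIZE_16;          /* SS: 16 bit     */
            frame->sample_frequency = HDMI_AUDIO_SAMPLE_FREQUENCY_48000;  /* SF: 48 kHz     */
    }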


    HDR stuff - ST.2084 PQ stuff flags the EOTF but doesn't flag Max/Average light level metadata (I think this was correctly passed in a previous release?)

    This should never have worked in earlier releases; the code has always only set the EOTF parsed from the video metadata (there is currently no easy way to get other related HDR metadata from the MPP library back to Kodi). Code that sets HDR metadata.
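
    For context, and purely as an assumption on my part about how to picture the missing pieces (this is the layout later mainline kernels use for the HDR_OUTPUT_METADATA connector property in drm_mode.h, not necessarily what the Rockchip BSP code linked above does): the full set of static HDR metadata a source could pass on looks like this, and right now only the EOTF field gets a real value.

    Code
    #include <linux/types.h>

    /* Mainline DRM uapi layout for HDR static metadata (Type 1), shown only
     * to illustrate which fields exist. Everything except eotf is currently
     * left at zero, i.e. sent as "unknown". */
    struct hdr_metadata_infoframe {
            __u8 eotf;            /* SDR / traditional HDR / ST.2084 / HLG */
            __u8 metadata_type;
            struct {
                    __u16 x, y;
            } display_primaries[3];                    /* not filled */
            struct {
                    __u16 x, y;
            } white_point;                             /* not filled */
            __u16 max_display_mastering_luminance;     /* not filled */
            __u16 min_display_mastering_luminance;     /* not filled */
            __u16 max_cll;                             /* not filled */
            __u16 max_fall;                            /* not filled */
    };

    struct hdr_output_metadata {
            __u32 metadata_type;
            union {
                    struct hdr_metadata_infoframe hdmi_metadata_type1;
            };
    };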


  • After a quick check the situation isn't the same as the S905X/D and S912.


    The Rockchip Rock64 LE 9.0.0 image I've used isn't sending any HDR metadata - it's just flagging the EOTF.

    The AMLogic image I'm using is sending the mastering HDR metadata, but not the Average/Max CLL stuff.


    When I send a file with the following metadata:


    Code
    Color primaries : BT.2020
    Transfer characteristics : PQ
    Matrix coefficients : BT.2020 non-constant
    Mastering display color primaries : Display P3
    Mastering display luminance : min: 0.0005 cd/m2, max: 1000 cd/m2
    Maximum Content Light Level : 1000 cd/m2
    Maximum Frame-Average Light Level : 400 cd/m2


    The Rockchip doesn't send anything other than flagging that the signal has BT.2020 colour primaries and a PQ (aka ST.2084) EOTF.

    No mastering display (mastering primaries or min/max display luminance) metadata is sent, nor is the MaxCLL or MaxFALL data.


    The AMLogic sends the BT.2020 primaries flag, the DCI-P3 D65 mastering primaries, and also the Mastering Display Min and Max Luminance metadata correctly (i.e. it tells you what the monitor the colourist was using was calibrated to show), but doesn't send the MaxCLL or MaxFALL (so you know nothing about the content you are playing other than the settings/performance of the monitor it was graded on).
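
    To make that concrete, here is an illustration (not driver code; it assumes the mainline hdr_metadata_infoframe layout and the CTA-861-G wire units) of what a fully populated set of static metadata for the sample file above would carry. The Rockchip build sends none of these values; the AMLogic build sends everything except the last two.

    Code
    #include <drm/drm_mode.h>  /* needs a uapi/libdrm new enough to define hdr_metadata_infoframe */

    /* Sample file above, in CTA-861-G / ST 2086 wire units: chromaticities in
     * steps of 0.00002, max mastering luminance in 1 cd/m2, min mastering
     * luminance in steps of 0.0001 cd/m2. Colour ordering of the primaries is
     * shown for illustration only. */
    static const struct hdr_metadata_infoframe sample_hdr10 = {
            .eotf          = 2,                         /* SMPTE ST.2084 (PQ)     */
            .metadata_type = 0,                         /* Static Metadata Type 1 */
            .display_primaries = {                      /* Display P3             */
                    { .x = 34000, .y = 16000 },         /* red   0.680, 0.320     */
                    { .x = 13250, .y = 34500 },         /* green 0.265, 0.690     */
                    { .x =  7500, .y =  3000 },         /* blue  0.150, 0.060     */
            },
            .white_point = { .x = 15635, .y = 16450 },  /* D65                    */
            .max_display_mastering_luminance = 1000,    /* 1000 cd/m2             */
            .min_display_mastering_luminance = 5,       /* 0.0005 cd/m2           */
            .max_cll  = 1000,                           /* MaxCLL  1000 cd/m2     */
            .max_fall = 400,                            /* MaxFALL  400 cd/m2     */
    };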


    The reason we have HDR metadata is to tell a display what to expect from the content in light level range terms, as PQ dictates an absolute relationship between video signals and display light levels on a pixel-by-pixel basis. This allows it to optimise its approach to tone-mapping out-of-range (i.e. too bright - or possibly too dark?) content. Without it, the display has no idea what to expect from the content and thus how to cope with content that is out of range for it.


    This is the downside to PQ - you have to have metadata to get the best quality display of PQ content on a display that can't cope with the full PQ range (and consumer displays are a long way from being able to handle the full PQ 10,000 nits range that can be carried by that EOTF). If stuff isn't mastered above 1,000 nits, things get easier - but even then you need MaxCLL/MaxFALL to optimise within this range if your TV can't sustain 1,000 nits.
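
    For anyone wondering what "absolute" means here: the PQ (ST.2084) EOTF is a fixed formula from code value to cd/m2, so a small sketch like the one below (standard ST 2084 constants, plain userspace C) shows how quickly the top of the 10-bit range runs past what a consumer panel can actually reproduce.

    Code
    #include <math.h>
    #include <stdio.h>

    /* ST.2084 (PQ) EOTF: map a normalised code value E' in [0,1] to absolute
     * luminance in cd/m2 (nits), using the constants from SMPTE ST 2084. */
    static double pq_eotf_nits(double ep)
    {
            const double m1 = 2610.0 / 16384.0;         /* 0.1593017578125 */
            const double m2 = 2523.0 / 4096.0 * 128.0;  /* 78.84375        */
            const double c1 = 3424.0 / 4096.0;          /* 0.8359375       */
            const double c2 = 2413.0 / 4096.0 * 32.0;   /* 18.8515625      */
            const double c3 = 2392.0 / 4096.0 * 32.0;   /* 18.6875         */

            double e   = pow(ep, 1.0 / m2);
            double num = fmax(e - c1, 0.0);
            double den = c2 - c3 * e;

            return 10000.0 * pow(num / den, 1.0 / m1);
    }

    int main(void)
    {
            /* A few 10-bit code values: the mid-range sits around 100 nits,
             * while the top of the range reaches 10,000 nits. */
            int codes[] = { 512, 769, 940, 1023 };

            for (int i = 0; i < 4; i++) {
                    double ep = codes[i] / 1023.0;
                    printf("code %4d -> %8.1f nits\n", codes[i], pq_eotf_nits(ep));
            }
            return 0;
    }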


    My understanding is that in practice not carrying the correct mastering and content metadata means your TV, and in particular an HDR projector, won't optimally process and display HDR content that is out of the range of your display - and is more likely to clip details rather than alter the tone mapping to preserve detail.


    How the absence of metadata is handled by displays will presumably vary. The Rockchip is definitely in a worse case - as you have no idea at all of the source video range (or even the primary colour volume it was mastered within inside the BT.2020 colour space), nor do you know the min/max levels of the mastering display (below and above which you wouldn't expect valid content?)


    hdr.pdf: This may not be 100% accurate in all regards, but is worth a read.


  • How the absence of metadata is handled by displays will presumably vary. The Rockchip is definitely in a worse case - as you have no idea at all of the source video range (or even the primary colour volume it was mastered within inside the BT.2020 colour space), nor do you know the min/max levels of the mastering display (below and above which you wouldn't expect valid content?)

    The CTA-861-G standard states: "The data in Data Bytes 3 – 26 are arranged into groups, as indicated in Table 45 Static Metadata Descriptor Type 1 above. When all of the Data Bytes in a group are set to zero, then the Sink shall interpret the data for that group as unknown." and "For MaxCLL and MaxFALL, this may occur when information about the content light level has not been, or cannot be, provided - for example, content that is rendered or broadcast in real-time, or pre-processed content that was delivered without information about the content light level." This means that TVs should support missing metadata, and I expect the result is that no optimization of light levels can be done.
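
    A minimal sketch of how a sink could apply that rule (hypothetical struct and helper, not taken from any real TV firmware): the all-zero group is treated as "unknown", so there is simply nothing to drive any light-level optimisation with.

    Code
    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical unpacked view of Data Bytes 3-26 of the CTA-861-G
     * Static Metadata Descriptor Type 1. */
    struct hdr10_static_metadata {
            uint16_t display_primaries_x[3], display_primaries_y[3];
            uint16_t white_point_x, white_point_y;
            uint16_t max_display_mastering_luminance;
            uint16_t min_display_mastering_luminance;
            uint16_t max_cll;
            uint16_t max_fall;
    };

    /* Per CTA-861-G, a group whose bytes are all zero means "unknown": the sink
     * must not treat 0 as a real light level, it just has no content information
     * to optimise its tone mapping with. */
    static bool content_light_level_known(const struct hdr10_static_metadata *m)
    {
            return m->max_cll != 0 || m->max_fall != 0;
    }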

  • The CTA-861-G standard states: "The data in Data Bytes 3 – 26 are arranged into groups, as indicated in Table 45 Static Metadata Descriptor Type 1 above. When all of the Data Bytes in a group are set to zero, then the Sink shall interpret the data for that group as unknown." and "For MaxCLL and MaxFALL, this may occur when information about the content light level has not been, or cannot be, provided - for example, content that is rendered or broadcast in real-time, or pre-processed content that was delivered without information about the content light level." This means that TVs should support missing metadata, and I expect the result is that no optimization of light levels can be done.

    Yes - this is the reason PQ HDR is seen as a bit of a 'non-starter' for broadcast applications - particularly live TV production, and why almost all broadcast HDR is delivered to the viewer using HLG not PQ. (Though in many cases it's produced in SLog)


    I'm guessing if you have an HDR PQ ST.2084 signal without light level / mastering metadata, the TV behaves in much the same way as it would if you 'forced' HDR10 rendering (as you can with my Sony TV)?

  • Has anyone been playing around with the current level of Panfrost stuff on the RK3399? I am still waiting for my board to show up from Pine, but I have been working with mainline stable 4.20 kernels, trying to migrate some things from my earlier Mali hacks that already work on Amlogic's garbage. I can't test anything yet until my RK3399 boards show up...


    I am just curious as to where everyone is currently sitting regarding the Panfrost stuff.

  • RK3399 can work with Panfrost but needs a mainline kernel, so you hit some issues:


    a) There are only partially working video drivers for mainline RK at the moment

    b) Device-trees are for the Mali blob and Panfrost requires some changes

    c) There are plenty of bugs (as with the S912) that impact stability


    So it works, but YMMV.