Information Regarding Linux on Rockchip SoCs

  • Just tested the LibreELEC 9.0.0 Rock64 release and can confirm that the issues I had with 50i h.264 content have been solved so far.

    h.264 1080i25 (aka 50i) separate-field (early BBC HD Blu-ray releases) and MBAFF (more recent BBC HD Blu-rays and live TV) content all seems to play OK.

    50Hz native content is being deinterlaced correctly without field dominance issues.

    (DTS-HD MA bitstreaming is broken with my Denon AVR, but DTS core and 5.1 PCM are OK. Interestingly, the DD and DTS bitstreams are being reported as PCM 2.0, not Bitstream, by my HD Fury Vertex, so I wonder if not all flags are being set correctly. The Fury won't analyse the audio content - just the metadata.)

    HDR stuff - ST.2084 PQ content flags the EOTF but doesn't flag the Max/Average light level metadata (I think this was correctly passed in a previous release?). HLG content is correctly flagged with an HLG EOTF (which makes Rockchip the only Kodi devices that do this, I think).

  • HDR stuff - ST.2084 PQ stuff flags the EOTF but doesn't flag Max/Average light level metadata

    That was the next question I was going to ask you.

    In user-friendly English, can you tell us what a normal 4K HDR user would see visually with LE Rockchip at the moment?

    i.e. colour and brightness - what more than the Max/Average light level may be missing?

    Is it the same as a Vero 4K or an S912 with an HDR-supporting kernel, for example?

  • Interesting the DD and DTS bitstreams are being reported as PCM 2.0 not Bitstream via my HD Fury Vertex, so I wonder if not all flags are being set correctly.

    This is correct: only LPCM is "supported" correctly in the kernel code at the moment. When Kodi sends an NL-PCM bitstream, it is flagged as an LPCM 2.0 16-bit 48 kHz stream. On my TV/AVR this usually gets treated as NL-PCM in 24p mode, and as static noise in 1080p@60Hz mode.
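    The LPCM-vs-bitstream reporting comes down (in part) to the IEC 60958 channel-status bits: bit 1 of channel-status byte 0 distinguishes linear PCM (0) from non-PCM/IEC 61937 data (1). A minimal sketch of that flag - the macro and function names here are my own illustration, not kernel API:

```c
#include <stdbool.h>
#include <stdint.h>

/* First channel-status byte of an IEC 60958 (S/PDIF / HDMI audio) frame.
 * Bit 1 is the "audio sample word" flag:
 *   0 = linear PCM samples
 *   1 = non-PCM data (e.g. an IEC 61937 DD/DTS bitstream)        */
#define IEC958_CS0_NONAUDIO (1u << 1)

/* Hypothetical helper: mark channel status for a compressed bitstream. */
static uint8_t mark_nonaudio(uint8_t cs0)
{
    return cs0 | IEC958_CS0_NONAUDIO;
}

static bool is_lpcm(uint8_t cs0)
{
    return (cs0 & IEC958_CS0_NONAUDIO) == 0;
}
```

    If the kernel leaves this bit at 0 for every stream, a sink has little choice but to report "PCM 2.0", which matches what the Vertex shows.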

    HDR stuff - ST.2084 PQ stuff flags the EOTF but doesn't flag Max/Average light level metadata (I think this was correctly passed in a previous release?)

    This should never have worked in earlier releases; the code has always set only the EOTF parsed from the video metadata (there is currently no easy way to get the other related HDR metadata from the mpp library back to Kodi). Code that sets HDR metadata.
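    To illustrate what "only sets the EOTF" means in practice, here is a hypothetical sketch - not the actual kernel code - with a struct modeled loosely on CTA-861-G Static Metadata Descriptor Type 1. Everything except the EOTF is left zeroed, which a sink is required to interpret as "unknown":

```c
#include <stdint.h>
#include <string.h>

/* Struct and function names are illustrative, not kernel API. */
enum eotf { EOTF_SDR = 0, EOTF_HDR_TRAD = 1, EOTF_ST2084 = 2, EOTF_HLG = 3 };

struct drm_static_metadata {
    uint8_t  eotf;
    uint8_t  metadata_type;
    uint16_t primaries[3][2];     /* x,y in units of 0.00002 */
    uint16_t white_point[2];
    uint16_t max_mastering_lum;   /* cd/m2 */
    uint16_t min_mastering_lum;   /* units of 0.0001 cd/m2 */
    uint16_t max_cll;             /* cd/m2 */
    uint16_t max_fall;            /* cd/m2 */
};

/* What the current Rockchip code effectively does: flag the EOTF,
 * leave every other metadata group zero (= "unknown" to the sink). */
static void fill_eotf_only(struct drm_static_metadata *m, enum eotf e)
{
    memset(m, 0, sizeof(*m));
    m->eotf = (uint8_t)e;
}
```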

  • That was the next question I was going to ask you.

    In user-friendly English, can you tell us what a normal 4K HDR user would see visually with LE Rockchip at the moment?

    i.e. colour and brightness - what more than the Max/Average light level may be missing?

    Is it the same as a Vero 4K or an S912 with an HDR-supporting kernel, for example?

    After a quick check, the situation isn't the same as the S905X/D and S912.

    The Rockchip Rock64 LE 9.0.0 image I've used isn't sending any HDR metadata - it's just flagging the EOTF.

    The AMLogic image I'm using is sending the mastering-display HDR metadata, but not the MaxCLL/MaxFALL content light-level data.

    When I send a file with the following metadata:

    Code
    Color primaries                          : BT.2020
    Transfer characteristics                 : PQ
    Matrix coefficients                      : BT.2020 non-constant
    Mastering display color primaries        : Display P3
    Mastering display luminance              : min: 0.0005 cd/m2, max: 1000 cd/m2
    Maximum Content Light Level              : 1000 cd/m2
    Maximum Frame-Average Light Level        : 400 cd/m2

    The Rockchip doesn't send anything beyond flagging that the signal has BT.2020 colour primaries and a PQ (aka ST.2084) EOTF.

    No mastering-display metadata (mastering primaries or min/max display luminance) is sent, nor is the MaxCLL or MaxFALL data.

    The AMLogic sends the BT.2020 primaries flag, the DCI-P3 D65 mastering primaries, and the mastering-display min and max luminance metadata correctly (i.e. it tells you what the monitor the colourist was using was calibrated to show), but doesn't send the MaxCLL or MaxFALL (so you know nothing about the content you are playing, only about the monitor it was graded on).
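    For reference, here is how the mediainfo values above would be encoded into the CTA-861 static-metadata fields if they were passed through. The unit conventions (primaries in 0.00002 steps, min mastering luminance in 0.0001 cd/m2 steps, max mastering luminance and MaxCLL/MaxFALL in whole cd/m2) come from the standard; the struct layout itself is just my illustration:

```c
#include <stdint.h>

/* Illustrative struct; field layout is mine, units are CTA-861's. */
struct hdr10_metadata {
    uint16_t primaries[3][2];   /* R,G,B x/y in units of 0.00002 */
    uint16_t white_point[2];
    uint16_t max_mastering_lum; /* cd/m2 */
    uint16_t min_mastering_lum; /* units of 0.0001 cd/m2 */
    uint16_t max_cll;           /* cd/m2 */
    uint16_t max_fall;          /* cd/m2 */
};

/* The example file: Display P3 mastering primaries with a D65 white
 * point, 0.0005-1000 cd/m2 mastering display, MaxCLL 1000, MaxFALL 400. */
static const struct hdr10_metadata example = {
    .primaries = { { 34000, 16000 },   /* R 0.680, 0.320 */
                   { 13250, 34500 },   /* G 0.265, 0.690 */
                   {  7500,  3000 } }, /* B 0.150, 0.060 */
    .white_point = { 15635, 16450 },   /* D65 0.3127, 0.3290 */
    .max_mastering_lum = 1000,
    .min_mastering_lum = 5,            /* 0.0005 / 0.0001 */
    .max_cll  = 1000,
    .max_fall = 400,
};
```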

    The reason we have HDR metadata is to tell a display what to expect from the content in light-level terms, as PQ dictates an absolute relationship between video signal values and display light levels on a pixel-by-pixel basis. This allows the display to optimise its approach to tone-mapping out-of-range (i.e. too bright, or possibly too dark?) content. Without it, the display has no idea what to expect from the content and thus how to cope with content that is out of range for it.

    This is the downside to PQ: you need metadata to get the best-quality display of PQ content on a display that can't cope with the full PQ range (and consumer displays are a long way from being able to handle the full 10,000-nit range that EOTF can carry). If content isn't mastered above 1,000 nits things get easier, but even then you need MaxCLL/MaxFALL to optimise within that range if your TV can't sustain 1,000 nits.

    My understanding is that, in practice, not carrying the correct mastering and content metadata means your TV, and in particular an HDR projector, won't optimally process and display HDR content that is out of your display's range, and is more likely to clip detail than to alter the tone mapping to preserve it.

    How the absence of metadata is handled by displays will presumably vary. The Rockchip is definitely the worse case, as you have no idea at all of the source video range (or even the primary colour volume it was mastered within inside the BT.2020 colour space), nor do you know the min/max levels of the mastering display (below and above which you wouldn't expect valid content?).

    hdr.pdf - This may not be 100% accurate in all regards, but it is worth a read.

    Edited once, last by noggin (February 2, 2019 at 1:23 PM).

  • How the absence of metadata is handled by displays will presumably vary. The Rockchip is definitely the worse case, as you have no idea at all of the source video range (or even the primary colour volume it was mastered within inside the BT.2020 colour space), nor do you know the min/max levels of the mastering display (below and above which you wouldn't expect valid content?).

    The CTA-861-G standard states: "The data in Data Bytes 3 – 26 are arranged into groups, as indicated in Table 45 Static Metadata Descriptor Type 1 above. When all of the Data Bytes in a group are set to zero, then the Sink shall interpret the data for that group as unknown." and "For MaxCLL and MaxFALL, this may occur when information about the content light level has not been, or cannot be, provided - for example, content that is rendered or broadcast in real-time, or pre-processed content that was delivered without information about the content light level." This means that TVs should support missing metadata, and I expect the result is that no optimization of light levels can be done.
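    That "all-zero means unknown" rule is easy to express. A sketch of how a sink might apply it per metadata group (the function name is mine, not from the standard):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* CTA-861-G: when every data byte in a metadata group is zero,
 * the sink shall treat that group as "unknown". */
static bool group_is_unknown(const uint8_t *bytes, size_t len)
{
    for (size_t i = 0; i < len; i++)
        if (bytes[i] != 0)
            return false;
    return true;
}
```

    So a Rockchip frame with only the EOTF set leaves the mastering-display and MaxCLL/MaxFALL groups all zero, and a compliant TV treats them as unknown rather than as literal zero light levels.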

  • The CTA-861-G standard states: "The data in Data Bytes 3 – 26 are arranged into groups, as indicated in Table 45 Static Metadata Descriptor Type 1 above. When all of the Data Bytes in a group are set to zero, then the Sink shall interpret the data for that group as unknown." and "For MaxCLL and MaxFALL, this may occur when information about the content light level has not been, or cannot be, provided - for example, content that is rendered or broadcast in real-time, or pre-processed content that was delivered without information about the content light level." This means that TVs should support missing metadata, and I expect the result is that no optimization of light levels can be done.

    Yes - this is the reason PQ HDR is seen as a bit of a 'non-starter' for broadcast applications, particularly live TV production, and why almost all broadcast HDR is delivered to the viewer using HLG, not PQ. (Though in many cases it's produced in S-Log.)

    I'm guessing that if you have an HDR PQ ST.2084 signal without light-level/mastering metadata, the TV behaves in much the same way as it would if you 'forced' HDR10 rendering (as you can with my Sony TV)?

  • Has anyone been playing around with the current state of the Panfrost stack on the RK3399? I am still waiting for my board to show up from Pine, but I have been working with mainline stable 4.20 kernels, trying to migrate some things from my earlier Mali hacks that already work on Amlogic's garbage. I can't test anything until my RK3399 boards show up...

    I am just curious as to where everyone is currently sitting regarding the Panfrost stuff.

  • RK3399 can work with Panfrost but needs a mainline kernel, so you hit some issues:

    a) There are only partially working video drivers for mainline RK at the moment

    b) Device trees are written for the Mali blob, and Panfrost requires some changes

    c) There are plenty of bugs (as with the S912) that impact stability

    So it works, but YMMV.

  • My RockPro64 finally arrived about a week ago and I am loving this board. So far I have managed to piece together a Linux build, am playing with the 5.1 and 4.21 kernels, and have been poking around with the Panfrost stuff as well; I fixed a couple of issues, reducing some of the lockups and leaks, while testing on my own Kodi 18.1 build. There is still more to fix on this project while I split my time with another, unrelated project. I want to use kernel 5 for that other project, so I will probably stay at this level while I keep working forward. I have now abandoned all S912 projects and started moving all my other boxes over to H96 Max's running 4.21. I still have a few issues with them, but the RK3399 rocks and blows any of the Amlogic SoCs out of the water. The Panfrost stuff is actually pretty good but still requires a fair bit of kernel work; it is a huge step in the right direction.

  • My RockPro64 finally arrived about a week ago and I am loving this board. So far I have managed to piece together a Linux build, am playing with the 5.1 and 4.21 kernels, and have been poking around with the Panfrost stuff as well; I fixed a couple of issues, reducing some of the lockups and leaks, while testing on my own Kodi 18.1 build. There is still more to fix on this project while I split my time with another, unrelated project. I want to use kernel 5 for that other project, so I will probably stay at this level while I keep working forward. I have now abandoned all S912 projects and started moving all my other boxes over to H96 Max's running 4.21. I still have a few issues with them, but the RK3399 rocks and blows any of the Amlogic SoCs out of the water. The Panfrost stuff is actually pretty good but still requires a fair bit of kernel work; it is a huge step in the right direction.

    It did until the arrival of the S922X, which will be presented in the upcoming Odroid N2. This will match or better the Rockchip RK3399 and will cost less. Since this is likely to have CE support on release day, it is likely to grab most of the potential Rockchip market from day one.

    Shoog

  • Hmm... yeah, that may be true, but all of that is a moot point, because until a real working graphics stack emerges nothing will change, no matter how much crap Amlogic floods the market with. So far, after all these years of promises, nothing ever materialises - so only time will tell. There's no point in producing a race car if there's no gas for it.

    But to be fair, the emergence of Panfrost development is a good sign - though it's still some way off...

  • Panfrost is making serious progress. It now passes almost 100% of the GLES 2.0 tests, and Rob Herring and Tomeu Vizoso are making good headway on a proper kernel driver (Rob brings the kernel DRM/DRI knowledge lacking among the core Panfrost developers). If the current pace of change continues, I'm expecting to switch from the ARM kbase driver to a native one in the next month.

  • Hmm... yeah, that may be true, but all of that is a moot point, because until a real working graphics stack emerges nothing will change, no matter how much crap Amlogic floods the market with. So far, after all these years of promises, nothing ever materialises - so only time will tell. There's no point in producing a race car if there's no gas for it.

    But to be fair, the emergence of Panfrost development is a good sign - though it's still some way off...

    I have been horribly disappointed in AML in the past, but reading Odroid's press release for the N2, it seems that times have changed, with a 4.9 kernel and a Wayland driver almost ready to roll:

    "

    Linux

    An Ubuntu 18.04 LTS image is available with Kernel version 4.9.152 LTS at this moment. This kernel version will be officially supported until Jan, 2023.

    A hardware accelerated video decoder (VPU) driver is ready. We have c2player and kplayer examples which can play 4K/UHD H.265 60fps videos smoothly on the framebuffer of ODROID-N2 HDMI output.

    The Mali G52 GPU Linux driver works only on the framebuffer. We tested the latest PPSSPP emulation and it can handle x3 scaling on a 4K display nicely with well implemented VSYNC.

    There will be a Linux Wayland driver a few months later. We are intensively working on it together with Arm and Amlogic.

    Unfortunately, there is no X11 GPU driver since Arm has no plan to support X11 for Bifrost GPUs anymore.

    We hope that the Panfrost open source driver can be ported to ODROID-N2 soon."

    That puts it in about the same place as the Rockchip with regard to Linux.

    Shoog

  • Actually, I have watched Hardkernel's stuff pretty much since they started, as it was one of the few sources of the leaked SoC docs, and if anyone is going to get any real help from AML it would be someone like them. I have a few of their boards here as well, and they're well put together.

    The real issue is going to come down to a real working, full graphics stack, which just never seems to materialise.

    Outside the discussion of missing drivers and such, there is a long-running legal challenge still plaguing the industry over various infringements claimed by AMD, based on patents inherited from its takeover of ATI years ago, against a variety of manufacturers that are supposedly using unlicensed technology - some of which is Arm's Mali GPUs. As the years go by, the manufacturers are caving one by one and signing agreements, as Samsung did a while back. Up until now, companies like Arm - being IP developers that don't really manufacture any hardware - have only been named on the sidelines, but eventually that will change as the hardware manufacturers sign agreements. GPUs are an extremely secretive industry, largely built on IP, so it's a huge battle that will probably go on for quite a while yet.

    Personally, I think AML and others are sitting back and letting public coders create and pitch something into the open, realising that once it's out there their business model will hugely benefit without them taking any of the real risk - which is why I think things have gone the way they have for so long. Seriously, if you stop and think about the business side of this: why would any company release a product in which a KEY part is basically crippled by software, especially when there are other graphics stacks that could have been chosen which would have given the end user the true use of the product? AML and the others aren't dumb. They have seen that a large portion of the market for these products is driven by do-it-yourself people trying to play with and improve on the poor software support provided by ALL of the manufacturers; in some ways, rather than driving the market off with poor support, they have actually grown it through all the hackers trying to fix and improve things.

    As far as writing drivers for Linux goes, that's definitely doable - many have been doing so since the 90s. Docs and good debugging and observation skills are a definite help, and in this case I at least tip my hat to Rockchip for making that material readily available to anyone wanting to learn and try. Amlogic, though, still squanders and hides its info: even something as simple as the S912 docs and register info is only available to a chosen few, leaving leaked material as the only way to get it.

    I want to help, and at some point I will start to do so, but for now I will stay in my corner until I have a complete solution, at which time I will look into how to put it into the public domain without putting myself and others in harm's way. Intellectual property these days is a big mess and not something I want to be on the front line over. I'm sure some will think I'm being paranoid, but believe me, I have spent many years reverse engineering things and am no longer naive about what can happen; anyone who understands what my avatar is based on will understand what I am talking about.

    The X11 comment is interesting, as I really think creating a driver in that area is worth looking at, since, as you say, X11 will soon be abandoned by Arm. I am just not sure how well it would sit with setups like LibreELEC and others that weren't really designed with that in mind, as LibreELEC is not a full-blown Linux OS. My software leans more towards a full Linux install, so supporting Kodi compiled to run on that type of setup, in much the same manner as on a full Linux box, actually works quite well. Now that some of the newer SBCs have more memory and better hardware, it is becoming an option.