Posts by kubrickdave

    Follow-up to my previous post


    The HDR color space issue seems to be related to this:
    thread-6890-post-42631.html#pid42631


If I start an HDR video, my TV goes into HDR mode (full brightness, 2084 EOTF) but does not switch to the BT.2020 gamut. I can then back out to the main menu (with the video still running) and restart the video by selecting it again. Now the TV does not go into HDR mode, but this time it selects the correct color gamut.


    My theory is:
    This appears to be a timing bug. The HDMI InfoFrame containing the colorimetry information is emitted before the refresh rate change happens. My TV apparently needs this reversed (as was the case with the Marshmallow kernel). If I disable refresh rate switching in the settings, everything works as expected.


    kszaq, anything you can do about this?


    Also:
    The 2160p60hz (4:4:4) mode works when I manually select it via /sys/class/display/mode, but not from the Kodi menu, which chooses 2160p60hz420 (4:2:0) instead. So every time I exit playback, Kodi automatically switches back to 4:2:0.
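For reference, a minimal sketch of forcing the full-chroma mode by hand over the sysfs node mentioned above (the exact mode string the kernel accepts may differ per build, and Kodi may override it again on the next mode change):

```shell
# Show which display mode is currently active
cat /sys/class/display/mode

# Force the 4:4:4 variant manually (mode string assumed;
# Kodi may switch back to 2160p60hz420 after playback).
echo 2160p60hz > /sys/class/display/mode
```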


    There might be an issue with fractional frame rates in some cases in the latest build. I hope it'll be fixed in the next build.


    Anyone having issues with video playback needs to post samples. Without them I cannot reproduce any issue; media info alone does not help.


    Try running "dmesg | grep frac" after playing a file with fractional framerate. You'll get something like this (time stamps omitted):

    Code
    hdmitx: set clk: VIC = 93  cd = 4  frac_rate = 0


    Then "echo 1 > /sys/class/amhdmitx/amhdmitx0/frac_rate_policy" and it becomes this:

    Code
    hdmitx: set frac_rate_policy as 1
    hdmitx: set clk: VIC = 93  cd = 4  frac_rate = 1


    The stutters are gone now. You should be able to add this to the platform_init script.
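A hedged sketch of persisting the workaround at boot, assuming the standard LibreELEC /storage/.config/autostart.sh user hook (the file name is an assumption; kszaq may wire it into platform_init instead):

```shell
#!/bin/sh
# /storage/.config/autostart.sh (assumed LibreELEC startup hook)
# Enable the fractional-rate clock policy so 23.976/59.94 modes
# get the correct pixel clock; sysfs path taken from the post above.
FRAC=/sys/class/amhdmitx/amhdmitx0/frac_rate_policy
if [ -w "$FRAC" ]; then
    echo 1 > "$FRAC"
fi
```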
    [hr]
    Unfortunately when playing HDR WCG files, the BT.2020 color space is not signaled to the TV. This used to work on the old kernel.
    I'm running a Beelink Mini MXIII on a Sony Z9D TV through a Denon X4300H receiver.

    Yes, great performance for the money, and it works best with LibreELEC :)
    For different bit depths, formats and specs, maybe check this thread with its various demo links: thread-4382.html
    Just checked, I can play the 10bit HDR demo file on the 905x.


    P.S. Consider copying the files to an SD card or USB drive, because the 100Mbit network is sometimes slow.


    I tried most available clips and they all play; it's just that the output is truncated from 10 to 8 bits. I only noticed that when checking the signal info menu of my AVR.

    Thanks :) Currently the best picture I can get is from LibreELEC. It's really fantastic.
    If we want 10-bit output from the S905X, is there something we can do?


    At this point I'm not really sure if the hardware supports it. Amlogic advertises HDR10 support, which I would understand to mean proper 10bit output without truncation (along with signaling the ST2084 EOTF, BT.2020 color space and associated metadata, all of which seems to work).
    Unfortunately almost no display hardware reports the bit depth of the input signal, so it's hard to find people's experiences on the internet. I also haven't tested the Android firmware that came with my box. Maybe it works there.
    Nevertheless the picture quality is very good. I can spot slight banding in some scenes, but nothing too severe, really. I'm really impressed for the 40 bucks I spent.


    Hi all :)


    I have one question related to s905x image. When I play HDR demo/movie from libreelec, my TV recognizes HDR stream and displays HDR notification.


    Since the "known issues" section of the first post says that HDR is not supported, is this a fake notification (from the TV)?


    HDR is not supported in the sense that kszaq does not have equipment to test it. So no support from him.
    Other than that it works just fine (with the exception that the output is limited to 8bit). Your TV is not lying to you. :)

    kszaq:
    I again took a look at 10bit output and stumbled across this commit:
    PD#118490: hdmitx: add colordepth/HDR feature · LibreELEC/[email protected] · GitHub


    Unfortunately /sys/class/amhdmitx/amhdmitx0/colordepth does not exist and I cannot create it:

    Code
    LibreELEC:~ # echo 30 > /sys/class/amhdmitx/amhdmitx0/colordepth
    -sh: can't create /sys/class/amhdmitx/amhdmitx0/colordepth: Permission denied


    Is there anything that can be done about this?
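A quick way to check from the shell whether the running kernel actually exposes that attribute (a sketch; the node name is taken from the commit above, and its absence would mean the kernel was built without that patch):

```shell
# Probe for the colordepth node before trying to write to it.
NODE=/sys/class/amhdmitx/amhdmitx0/colordepth
if [ -e "$NODE" ]; then
    echo "colordepth supported: $(cat "$NODE")"
else
    echo "colordepth node not present in this kernel"
fi
```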


    I noticed another thing when trying out the Spears and Munsil pattern. There's quite severe banding in the red and blue gradients. This is not the case if you switch to software decoding, where the ramps are perfectly smooth. Also there is no overscan issue with software decoding. I can make a screen cap on the weekend.


    jd17: Maybe switching to RGB output reduces the number of colorspace conversions and gets rid of these artifacts. Have you tried it?


    I have to correct myself. The banding issue is caused by my TV. I tried different sources as well as the integrated player; all show the same pattern. Software decoding clips whiter-than-white luminance levels, which might explain the smoother red and blue gradients. I think the lesson is that there's high variance in equipment and it's hard to make general observations.


    Also I switched the output resolution to 2160p (previously on 1080p) and this got rid of the overscan.



    Krypton builds are now using default kernel settings for deinterlacing and all other video processing. This doesn't mean that Amlogic is fully to blame because they play with these parameters in their Android "native" video player app.


    Would it be possible to compile a list of known hardware decoder bugs? I have heard of slight overscan (missing a column of pixels left and right), as well as the stuff mentioned in the OP of the main thread. Are these unfixable?


    BTW, is there any public documentation on what the available parameters do, or on the SoC in general?


    As was pointed out at the beginning of this thread, many things changed in the move from the old Kodi video player in Jarvis to the new Krypton video player.
    This is a Krypton-specific issue, so I think AMLogic is not to blame...


    I know about the changes in the video player, but these are parameters for one of Amlogic's kernel modules. They are not directly related.


    Hi kszaq. You mention on the first post that 3D and HDR are not supported. Have they been removed from this current build?


    I can confirm that HDR works. The transfer function and color space are correctly signaled. The only problem is that the output seems limited to 8 bits. I sort of remember someone mentioning a Deep Color mode on Android, so we might get the 10 bits on this build too.

    Nice summary, sounds like the right conclusions to me, also these issues are more visible on SD content like you already mentioned :)


    Is this just a misconfiguration in kszaq's build, or the default config coming from Amlogic's kernel source? This and all the other issues really make you wonder just how badly one can screw up video processing. Can't they get anything right without hobbyists having to fix their work afterwards?


    If you can really output 4:2:0 at 1080p, you could circumvent the chroma upsampling, provided it doesn't do 4:2:0 -> 4:4:4 -> 4:2:0 internally. But I think the 4:2:0 modes were only specified in HDMI 2.0 to save bandwidth at 4K resolutions, so your TV might not accept it at 1080p.