Posts by wesk05

    The Sony HW700 doesn't support HDR passthrough and only supports HDMI 1.4 AFAIK, so 4K @ 30 Hz max; that's why there is no direct connection to it.

    I don't think it's Sony's fault though; previously, with 1080p media players, I connected them directly through the Sony and never had issues with switching HD audio streams.

    I asked you to try it directly to rule out an issue with the AVR Key. When I suggested that it could be a problem with the HW700 system, what I meant is that your system may also rely on the burst-preamble Pc bit for detecting bitstream types. Since Amlogic has an issue with that, your HW700 is not detecting the audio streams correctly. It would have worked fine with other media players because those may not have this burst-preamble problem.

    I am also noticing audio passthrough issues with this LE build. So, this is a new problem with this build that's unrelated to the burst-preamble problem that I described.

    Neither of those. I use a signal splitter, the HDfury AVR Key: one output goes into my OLED B7 TV and the other into a Sony MDR-HW700, which is a pretty good surround headphone system for movies.

    Perhaps it's an issue with the AVR Key, but as described previously, if I switch audio streams a couple of times, I usually get to the one I want.

    It could very well be your HW700. Have you tried a direct connection to the HW700? I see that it has an HDMI out.

    The only issue I'm having so far is that sometimes the HD audio formats don't play; after switching to a different audio stream and then back to the previous one, it starts to play. Sometimes I need to repeat this a few times until the audio stream of choice starts to play.

    Is your AVR a Denon or Onkyo?

    All Amlogic devices have an issue with the burst-info preamble of the IEC 60958 packet: they send out packets that do not always carry the Pc bit. The Pc bit defines the bitstream type carried in the IEC 61937 bitstream, and many lower-end AVRs rely solely on this information to determine the bitstream. This seems to be an issue with the Cirrus Logic chipset used in these AVRs, which apparently doesn't parse the sync word as a fallback.
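
    To make the failure mode concrete, here is a rough Python sketch of what a sink does with the burst preamble. The sync words and the data-type codes are from IEC 61937; the stream handling is simplified for illustration and is not any particular AVR's implementation:

    Code

    # A sink scans the IEC 60958 payload words for the IEC 61937 sync pair
    # Pa/Pb, then reads the data type from the Pc burst-info word. An AVR
    # that trusts Pc alone (no sync-word fallback) misdetects streams whose
    # Pc is missing or zeroed, which is the Amlogic problem described above.

    PA, PB = 0xF872, 0x4E1F   # IEC 61937 burst-preamble sync words Pa, Pb

    DATA_TYPES = {            # subset of the IEC 61937 data-type codes
        0: "null", 1: "AC-3", 3: "pause",
        11: "DTS type I", 12: "DTS type II", 13: "DTS type III",
        17: "DTS type IV (DTS-HD)", 21: "E-AC-3", 22: "MAT (TrueHD)",
    }

    def detect_bitstream(words):
        """words: 16-bit values recovered from the IEC 60958 subframes."""
        for i in range(len(words) - 2):
            if words[i] == PA and words[i + 1] == PB:
                pc = words[i + 2]      # Pc: burst-info word
                dtype = pc & 0x1F      # low 5 bits = data type (simplified)
                return DATA_TYPES.get(dtype, f"unknown data type {dtype}")
        return "no burst preamble found (treat as LPCM)"

    print(detect_bitstream([0x0000, 0xF872, 0x4E1F, 0x0015, 0x0ACC]))  # E-AC-3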

    I had reported this issue to Minix and to a Kodi developer (fritsch) more than a year ago. At that time, fritsch was of the opinion that it was an Amlogic problem; he didn't think it was a Kodi problem (I'm of the same opinion).

    What I have seen is that if I switch from a Dolby Digital Plus stream, the next stream will have the Pc bit. This is likely the cause of the DTS HRA issue reported on the OSMC forums. I don't think it is limited to DTS HRA; folks who have issues with DTS HRA likely have problems with HD audio passthrough just like you have reported.

    From a video playback perspective, there is NO normal source video content shot at 100 or 120 fps to even output at those frame rates.

    Conclusion.

    There is no support in any normal media player for playback at those frame rates.

    SES demonstrated next-generation high-frame-rate (100 fps) UHD transmission last year. There was a 5-minute clip of the broadcast available on demo uhd 3d, which is now defunct. You can find the clip (Astra HFR Test) on YouTube, though I'm not sure whether the YouTube versions are actually 100 fps.

    Shield can output 1080p 100 or 120 Hz when connected to *some* displays.


    H.264/HEVC SDR video up to 2160p, 60 fps, Main/Main 10 profile

    HEVC Dolby Vision (Profile 5)/HDR10 (Main 10 profile) up to 2160p

    -------------------------------------------------------------------------------------------------

    It looks like Profile 5 is for 1080p resolutions only.

    Looks pretty useless to me.

    There are Dolby Vision profiles and levels; you are looking at the levels. Check page 7 of the document for the profiles. Profiles 4, 5, and 7 (1:1/4) all support up to the uhd60 level.

    But it's becoming clear that mediacodec on Android and AVFoundation on tvOS have only implemented single-layer DV

    The ATV4K supports dual-layer and single-layer single-track Dolby Vision profiles. The Dolby Vision demo clips floating around all have a dual-layer, single-track Dolby Vision profile. MediaInfo will show this information for the video track:

    Dolby Vision : 1.0, dvhe.dtr@uhd24, BL+EL+RPU

    The dvhe.dtr bitstream profile string indicates that this is Dolby Vision profile 4. The strings for other common profiles are dvhe.stn (profile 5) and dvhe.dtb (profile 7).

    dv = Dolby Vision, he = HEVC, d = dual layer, s = single layer, t = 10-bit, r = backwards compatible (can be decoded to SDR), b = backwards compatible with the Blu-ray format, BL = base layer, EL = enhancement layer, RPU = Reference Processing Unit (metadata). The "n" in profile 5 (dvhe.stn) indicates that it is not backwards compatible; that stream is in the IPTPQc2 (IPT) color space.
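
    As an illustration, a MediaInfo-style string like the one above can be decoded mechanically. This is just a sketch: the profile table is a partial mapping limited to the common strings mentioned above, and the input is assumed to be well-formed:

    Code

    # Decode a Dolby Vision codec string such as "dvhe.dtr@uhd24" into its
    # codec, profile, and level parts (per the legend above).

    PROFILES = {   # partial mapping, common profiles only
        "dtr": "profile 4 (dual layer, 10-bit, SDR compatible)",
        "stn": "profile 5 (single layer, 10-bit, not backwards compatible)",
        "dtb": "profile 7 (dual layer, 10-bit, Blu-ray compatible)",
    }

    def decode_dv_string(s):
        """Split e.g. 'dvhe.dtr@uhd24' into codec, profile and level."""
        codec_profile, _, level = s.partition("@")
        codec, profile = codec_profile.split(".")
        return {
            "codec": {"dvhe": "HEVC", "dvav": "AVC"}.get(codec, codec),
            "profile": PROFILES.get(profile, f"unrecognized ({profile})"),
            "level": level or "unspecified",
        }

    print(decode_dv_string("dvhe.dtr@uhd24"))
    # {'codec': 'HEVC', 'profile': 'profile 4 (dual layer, 10-bit, SDR
    #  compatible)', 'level': 'uhd24'}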

    So I suspect that without a Dolby Vision licence we likely have no hope of processing the DV metadata layer that Blu-rays use. The MINIX U9-H may be DV-capable hardware, but does it even have the (probably needed) DV licence to begin with?

    From Dolby:

    Quote

    Every Dolby Vision playback device must pass Dolby Vision system development kit certification. During the certification procedure, the chipset implementing the Dolby Vision decoder will be tested against the advertised device capabilities, and Dolby will approve the device capabilities.

    When you say you tested 3 different Dolby Vision movies, what was your test material? Also, the M8S Pro is not the Minix U9-H?

    Some additional commands? Hmmm, perhaps someone can remember.

    Transformers: The Last Knight, Spider-Man: Homecoming, The Fate of the Furious.

    Correct, the M8S Pro is not the Minix U9-H, but it has the same S912 SoC as the Minix U9-H, and from a LibreELEC point of view they are identical in features. I can also test the Minix U9-H, but I am 110% positive that the outcome will not be any different.

    I just tested an untouched .m2ts file taken directly from the UHD disc of Spider-Man: Homecoming. MediaInfo verifies that this m2ts has a 1080p DV layer separate from the 2160p HDR10 track. LibreELEC (MINIX NEO U9-H, Amlogic S912) ignores the DV layer and sends an HDR10 stream to my display (Vizio P75 with Dolby Vision).

    I was going to post the exact same thing. I tested 3 different Dolby Vision movies with the latest 8.90.3 GDPR-2 build on an M8S Pro & LG C7 OLED, and they all played back in regular HDR10 mode. The single-track demo clip played back in SDR mode.

    I remember that johngalt had mentioned some commands that had to be used to enable/control Dolby Vision playback. I can't find it now, but I had tried Dolby Vision playback in one of his very early builds. It didn't work at that time either.

    Quote

    A demo clip isn't the best example—demo clips are almost universally single-layer

    The demo clips are single-track, dual-layer. Profiles 4 and 7 are dual layer; profiles 5, 8, and 9 are single layer.

    This is not so.

    Amlogic S905 and S905X output 24-bit/192kHz great over optical S/PDIF (TOSLINK), starting with version 7.0.X (see pic. 1).

    My report -> [8.0.2e] LibreELEC 8.0 for S905/S905X

    In recent 8.1.x releases, 352.8kHz and 384kHz can be output to the DAC over USB (see pic. 2).

    It works on all my players, and on a few of my DAC devices.

    I don't know how to say it... it's woooooorking!!! Finally!!! I have 24/192 over the optical digital output! Yessir! I'm so happy ... :)

    Just because you are seeing XXXkHz on your DAC/AVR doesn't mean that you are actually getting 24-bit/XXXkHz output. It only means that there is no downsampling.

    I don't think any of the Amlogic SoCs actually outputs 24-bit PCM; it is being dithered to 16-bit. The SoCs may be capable of 24-bit output, but there is probably something in the kernel that isn't right.
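
    If you want to check what the kernel is actually negotiating on one of these boxes, the standard ALSA procfs exposes the hw_params of every open stream. A minimal sketch; card/device numbering varies per device, and this only shows the format handed to the driver, not what leaves the HDMI port:

    Code

    # Dump the sample format/rate ALSA negotiated for every open playback
    # stream. A format of S16_LE here would mean 16-bit is going to the
    # driver even when the source file is 24-bit.
    from pathlib import Path

    for p in sorted(Path("/proc/asound").glob("card*/pcm*p/sub*/hw_params")):
        text = p.read_text().strip()
        if text != "closed":      # skip streams that are not currently playing
            print(f"{p}:\n{text}\n")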

    I have looked at the channel status bits in the HDMI audio output, and the word length is only set to 16-bit. One could argue that the channel status bit simply isn't set correctly and the output is actually 24-bit, so I tested this by recording the PCM output of a Minix U1 (LibreELEC 8.2.1.1), an Intel Haswell (Milhouse LibreELEC 9 build) and an nVIDIA Shield. The source was a 1-minute 24-bit 192kHz WAV. The PCM output from the devices was recorded using a 24-bit/192kHz-capable Magewell HDMI capture card, and the recordings were analyzed for bit depth in Adobe Audition and MusicScope.

    The screenshots on the left side show the Amplitude Statistics; there you will find the actual measured bit depth of the recorded PCM. The screenshots on the right side show the bit monitor analysis from MusicScope. The rectangles represent bits: unused bits are blue, red bits do not contribute to the SNR and are considered noise, and the rest vary in grayscale intensity depending on usage (heavily used ones are white).

    [Screenshots: Adobe Audition Amplitude Statistics (left) and MusicScope bit monitor (right) for the original WAV, Minix U1, Intel, and nVIDIA Shield captures]

    The conclusion I can draw from the test is that the Minix U1 output has only 16-bit audio depth, while the Intel and Shield outputs do have 24-bit depth. I have only looked at the HDMI output, but I don't think the SPDIF output will be any different. It is possible that the USB output is different.
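
    For anyone who wants a rough version of this check without Audition or MusicScope: OR all samples of the capture together and count the trailing zero bits. A sketch assuming a 24-bit little-endian PCM WAV named capture.wav (a hypothetical file name); material dithered to 16-bit and padded back to 24 bits leaves the bottom 8 bits at zero:

    Code

    # Estimate the effective bit depth of a 24-bit PCM capture: OR every
    # sample together, then count how many least-significant bits are
    # always zero across the whole file.
    import wave

    with wave.open("capture.wav", "rb") as w:   # hypothetical capture file
        assert w.getsampwidth() == 3            # expect 24-bit PCM
        raw = w.readframes(w.getnframes())      # 3 bytes/sample, little-endian

    used = 0
    for i in range(0, len(raw) - 2, 3):
        used |= raw[i] | (raw[i + 1] << 8) | (raw[i + 2] << 16)

    zeros = 0
    while zeros < 24 and not (used >> zeros) & 1:
        zeros += 1
    print(f"effective bit depth: about {24 - zeros} bits")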

    I didn't see anything close to what you have in your video clip; I definitely don't get that flickering. Is this an HDR TV, and is it really this dark?

    Thanks for your input.
    Edge enhancement might be there too, but I doubt it's the root cause of what I am seeing.
    The affected areas in my case are mainly smooth gradations in skies, where edge enhancement should not have a big impact.

    Would you be so kind as to do that for me?
    It's only a matter of 5 minutes.

    This video is particularly severe regarding the flicker and macroblocking in the skies; it is minor in the other videos I tested.
    I think the reason is that it's mainly stills with very smooth gradations at a low framerate. Normally, these artifacts seem to be hidden by motion or by more obvious content/color changes.

    The vertical lines are not very visible in that demo; I'd have to check which other video is suited to showing them.

    I compared the clip on the Samsung KS8500, nVIDIA Shield, Minix U9-H, and Mecool M8S Pro, and on the internal players of Samsung KS8000/JS8500 TVs. I didn't see any obvious flicker or macroblocking in the skies in any of the scenes. However, I did notice fine, chevron-like, almost transparent vertical lines in some of the sky scenes. I am not positive about this, but I think the colors (esp. red) also looked a little different.

    Do any of you guys by any chance have a Mini M8S II as well?

    Even if not: those of you who have not looked for it yet, could you do me the favor of checking whether you see those artifacts too?

    I have the Mini M8S II and have not seen vertical lines or other artifacts. I have compared it with the nVIDIA Shield, and the only thing I have noticed is the ringing associated with Amlogic's aggressive edge enhancement (look at the mountain edges). I also have the Samsung UBD-K8500 Ultra HD Blu-ray player, but haven't compared this particular video on it yet.

    I checked the latest full Nougat build. 1080p is now 10-bit only; the echo 8bit command doesn't change anything. 4K is now stuck at 8-bit; the echo 10bit command doesn't do anything.

    Sink reads the color depth from the CD fields in the general control subpacket of HDMI signal. The CD fields are indeed being set to 10-bit for 1080p and 8-bit for 4K. What is weird is, I tested this on a simulated display with Deep Color support purposely removed and I could still get a picture and the CD field was still set to 10-bit. This doesn't comply with HDMI protocol which stipulates that the source shouldn't send Deep Color signal if the sink doesn't support it. Since I did get a picture I wonder whether the signal is indeed only 8-bit and that the CD fields are being incorrectly set.