I tried using Mali Utgard instead of Mesa, with no success, which is why those changes remained in the patch. I made the patch from my local git tree and didn't clean it up, sorry about that.
No need to apologize, just curious. Why would you want to use the binary driver? Mesa should be better in all aspects.
By default, the console does not work over UART; there is just silence. I had to do this, following the example of the Samsung profile, and the console now works well.
Maybe it's only this parameter that turns on the console on UART: systemd.debug_shell=ttyS0
It's systemd.debug-shell=ttyS0, note the hyphen instead of the underscore. I also usually add it at the end of the command line, but that shouldn't change anything.
Note that you also don't need to remove all the other H3 boards; you can use UBOOT_SYSTEM=orangepi-lite to limit builds to this particular board.
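For reference, that variable just slots into an otherwise normal LibreELEC build invocation. A sketch, where the PROJECT/DEVICE/ARCH values are my assumption for an Allwinner H3 target, not taken from this thread:

```shell
# Hedged sketch: build an image while limiting u-boot to one board.
# PROJECT/DEVICE/ARCH are assumed values for an H3 device.
PROJECT=Allwinner DEVICE=H3 ARCH=arm UBOOT_SYSTEM=orangepi-lite make image
```

This needs to run from a checkout of the LibreELEC build tree, of course.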
I'm curious why you added the Mali Utgard driver patches? Those are not used by Mesa at all. Also, why did you change DEBUG_TTY="/dev/console" to DEBUG_TTY="/dev/ttyS0"? The serial console works without that change.
LibreELEC's default settings reserve 320 MB of RAM for the graphics core.
That's not exactly true. CMA memory can be used by any core that requires a contiguous memory region. The kernel can even assign it for normal use (user space processes), with some limitations. So making this number bigger is not that problematic. By far the heaviest user of CMA is the VPU (video decoding), which is distinct from the GPU.
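For completeness, the CMA pool size is just a kernel command-line knob. A config fragment sketch, where the value is only an example and not a recommendation:

```
# Kernel command line fragment: size of the CMA pool reserved at boot.
# 512M is an arbitrary example value.
cma=512M
```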
Ah, if it has an RTL8189ETV, then you're out of luck. There is no driver in the mainline kernel, and the team decided against including any out-of-tree wifi driver unless there is a strong indication that it will land in the kernel (there is no such indication for the RTL8189ETV).
I couldn't find any method for reading SDIO IDs from a running system, so opening it up is the simplest thing to do.
The display only features a DVI input, so I use an adapter to get an HDMI cable connected. How could I manually force the correct resolution?
What kind of adapter is that? I tested native DVI monitors in the past, but I didn't notice any issues. Forcing the correct resolution is a bit tricky unless it's a standard one (720p or 1080p). If it's not, you have to find a usable EDID somewhere, create an initrd from it, add it to the SD card partition, and update the pxe config to load that initrd and add a kernel parameter to use it. I've never done it, so I don't know the exact steps.
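A rough sketch of those steps, based on the stock kernel EDID-firmware-override mechanism; the firmware path and connector name (HDMI-A-1) are assumptions, adjust for your setup. A real edid.bin would come from a dump of a working monitor; a zero-filled placeholder stands in here:

```shell
# Create a placeholder EDID blob (replace with a real 128/256-byte dump).
dd if=/dev/zero of=edid.bin bs=128 count=1 2>/dev/null
# Pack it into a minimal newc-format initrd at the path the kernel
# firmware loader searches.
mkdir -p initrd-root/lib/firmware/edid
cp edid.bin initrd-root/lib/firmware/edid/
(cd initrd-root && find . | cpio -o -H newc) > edid.cpio 2>/dev/null
# Then: copy edid.cpio to the SD card boot partition, reference it from
# the pxe/extlinux config (INITRD line), and append to the kernel command
# line something like:
#   drm.edid_firmware=HDMI-A-1:edid/edid.bin
```

Again, I haven't verified this exact sequence on these boards; it's the generic mainline approach.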
Actually, I have no wlan0 device at all, so running iw wlan0 scan doesn't work.
Your box either doesn't have the same wifi module as my X2, or they rerouted some traces. I suggest you open up the box and check which wifi module is used.
I just tested wifi on my X2. The driver loads fine and I see wifi networks listed in the LibreELEC settings. However, after some time they all disappeared. After running iw wlan0 scan over ssh, they appeared again and stayed there. I have no idea whose fault that is.
A 1024x768 resolution almost always means that the EDID couldn't be read. I tested many non-standard resolutions on H3 and they worked, unless there was some issue with clocks, but that shouldn't happen anymore.
In any case, provide the outputs of the following commands:
offbeat I converted NASA.x265.2160p60.ts to mkv with the following command:
I can confirm that the TS file indeed produces iommu errors, but an mkv with the same data does not.
I have already observed issues with TS files in the past, usually because they start in the middle of the stream, with B or P frames. Converting to Matroska ensures the stream starts with a key frame.
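Such a conversion is typically a plain stream copy into Matroska. A hedged sketch, where the filenames and the generated sample clip are placeholders and not the exact command used above:

```shell
# Generate a tiny MPEG-TS sample clip (mpeg2video is a built-in ffmpeg
# encoder, so this runs on any stock ffmpeg build)...
ffmpeg -loglevel error -f lavfi -i testsrc=duration=1:size=128x72:rate=25 \
       -c:v mpeg2video -f mpegts sample.ts
# ...then remux it into Matroska without re-encoding (-c copy), which
# makes the container start at a key frame.
ffmpeg -loglevel error -i sample.ts -c copy sample.mkv
```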
So as far as I can tell, there are three issues:
1. slow decoding loop in ffmpeg
2. rendering issue, probably DRM driver needs some improvement, possibly around YUV scaler (workaround is to use EGL rendering)
3. Cedrus needs to be more resilient to stream errors (mostly missing references for B and P frames)
I guess I'll only PR the frequency fixes and leave the iommu branch for now. I think it's more important to finish 10-bit HDMI output support and, after that, add proper 10-bit output from Cedrus and Hantro (for VP9). EDIT: and of course proper VP9 10-bit support.
Thanks for the report. Can you check if switching the rendering method from "Direct to plane" to "EGL" avoids the iommu issues? It does for me, at least for some videos.
EDIT: Actually, if ffmpeg decoding with -f null - works, I think the issue might be in the display driver, or in misreported buffer information, like size or stride.
offbeat, this patch http://ix.io/3QwZ should improve VP9 decoding speed and hopefully reduce issues with Cedrus-based decoding. It turns out that the Android box runs the VP9 core at a much higher frequency and Cedrus at a slightly lower one. While at least in the Cedrus case the difference doesn't seem like much, it can be just enough to avoid issues. Let me know if this makes VP9 4K 30 fps fast enough for you. I would also like to know if HDR works for you.
Well, if you don't have the wifi+bt module connected, then having an overlay for it is pointless. But I agree that it shouldn't do any harm, just add some overhead. I don't see any issue in the wifi+bt overlay, so maybe you can just omit it and be done?
I can test on a modern HDR-capable monitor if needed, but I guess there's not much point until the H6 HDMI driver gains the ability to send HDR metadata to the display.
That ability has existed for some time now, and it works for HEVC videos; I didn't test VP9 yet (I heard HDR metadata is packed differently in VP9 streams). However, colours are still off: HDR usually goes together with 10-bit output, which isn't implemented yet.
EDIT: one IOMMU issue is probably fixed with https://www.spinics.net/lists/kernel/msg4055249.html
EDIT2: the above patch indeed fixes the kernel warning, and H6 indeed sends HDR info to the display when playing your Costa Rica VP9 10-bit HDR demo. I updated my iommu branch, if you're interested.
I'd say this VP9 fix patch should go to master, nice job!
Thanks for testing! This is just a quick hack, and it uses more memory than necessary for 8-bit videos (an unconditional buf_size *= 2;). I'll make a proper patch series soon, hopefully over the next weekend.
10-bit HDR plays with washed-out colors, but that's probably expected; H6 HDMI hardware tonemapping most likely isn't implemented yet.
I suppose you're testing this on a non-HDR-capable screen? While writing the 10-bit HDMI output support, I noticed that the HW does a proper downscale from 10-bit to 8-bit, but that is probably just averaging. There is HW for applying HDR tonemapping, but I have no idea how to use it nor how to integrate it into the DRM framework.
Yeah, ffmpeg is not optimized yet. As I said before, adopting the RPi HEVC approach should yield more performance, but that probably won't happen soon.
Tested this patch on both master (cma=1024M) and the iommu branch.
Looking good, not a single error in logs!
We can't afford to set CMA that high; some boards only have 1 GiB of RAM. But I'm glad the IOMMU doesn't report any issues. This means my quick patch is sound.
The log shows that the TV was properly detected (all resolutions are listed) and that you even played a video. Make another log, this time making sure it's created during the issue, e.g. when you only hear sound effects but Kodi is not visible.
I'm not sure what needs to be done for the kernel to pick up the edid file. The FAT partition is mounted at /flash, so the proper line would be /flash/edid.bin, but then the kernel might request the file before /flash is even mounted... Anyway, reading the EDID should not be a problem even if the TV is turned off. Try with only video=HDMI-A-1:D
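For reference, the video= syntax can also force a specific mode, not just enable the output. A config fragment sketch, where the connector name HDMI-A-1 is an assumption (check /sys/class/drm for the real one):

```
# Kernel command line fragments:
video=HDMI-A-1:D                # force the connector enabled, digital output
video=HDMI-A-1:1920x1080@60D    # additionally force a specific mode
```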
That's actually a good clue! It means some buffer is too small. With a little extra debug output, it shouldn't be that hard to figure out which one. Well, there are some additional registers to set, but hopefully that's all.
There is a reason why I didn't PR those changes yet, although VP9 worked for me.
There is a merged commit, "Add analogue audio driver to Allwinner H6", for the 5.4 kernel.
Analogue audio is not implemented there. However, I know where the confusion comes from. H6 has the AC200 IC integrated alongside the SoC. AC200 is responsible for several things, like the Ethernet PHY for fast ethernet and also analogue audio. So the patches mention audio, but the only functionality really implemented there is the Ethernet PHY.