My biggest gripe with the RPi4 is its lack of hardware VC-1 decoding, which no release of LE can fix. I've seen people claim it's a niche format, and it is, except that it's standard for Blu-ray movies and is actually quite common, especially among early releases. I wonder whether software playback without frame drops is feasible with a CPU overclock.
VC-1 plays fine with software decoding, which is why the Pi Foundation designers haven't bothered adding a hardware IP block for it since the original RPi and RPi2 boards (which had weaker CPUs and needed one). The same goes for MPEG-4 and a bunch of other codecs.
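If you want to check this on your own hardware before relying on it, a quick sketch with ffmpeg's built-in benchmark mode works: decode a clip as fast as possible, discard the output, and compare the decode speed against real time. The filename below is a placeholder for whatever VC-1 sample you have.

```shell
# Rough feasibility check for software VC-1 decode (sketch, not a definitive test).
# "sample-vc1.mkv" is a placeholder; substitute your own clip.
# If the reported speed stays at or above 1x, realtime software playback is plausible.
ffmpeg -benchmark -i sample-vc1.mkv -f null - 2>&1 | tail -n 3
```

Run it during an overclock test too; decode speed scales fairly directly with CPU clock for software codecs.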
Out of curiosity, what kind of optimization are we talking about? Decoding stuff in the kernel?
RPi0/1/2/3 have no hardware decode capability for HEVC, but it emerged as a more popular codec. The Pi devs used the compute capabilities of the GPU (not the ARM CPU) to assist the software decoding process. It adds just enough compute headroom on the RPi3/3B/3B+ to handle lower-bitrate 1080p, as used by streaming services like Amazon/Netflix. It works, but there are no public APIs in FFmpeg to handle this kind of thing, so it requires a bunch of proprietary code. That code and the optimisation tricks it contains depend heavily on the workflow of the video output pipeline, which is the bit that has been completely reinvented with the move to GBM/V4L2.
Hey, that's a nice setup you have there. Considering that typical 4K HDR content is encoded with 10-bit colour depth, and that the current LE release supports only 8 bits and trims the extra bits, have you experienced any trouble with that?
None. Current 8-bit output works fine and the majority of users won't be able to tell the difference once 10/12-bit output is supported (it is being worked on).
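For anyone curious what "trimming the extra bits" amounts to, a back-of-envelope illustration (my sketch, not how the pipeline literally does it): dropping the two least-significant bits maps the 10-bit code range onto the 8-bit range.

```shell
# Illustration only: a 10-bit code value (0-1023) truncated to 8 bits (0-255)
# by discarding the two least-significant bits.
ten_bit=1023
eight_bit=$(( ten_bit >> 2 ))
echo "$eight_bit"   # prints 255
```

So four adjacent 10-bit values collapse into one 8-bit value, which is why the loss mostly shows up (if at all) as slight banding in smooth gradients rather than anything dramatic.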