It appears that most if not all of my HDR10 videos get sent to my receiver using 12-bit colour. I think the builds prior to 2021.12.18 would send them as 8-bit, as I detailed above.
I thought HDR10 used 10bit color. Is that correct? If so, is there a reason LE outputs them as 12bit?
UHD HDR10 content is typically encoded as 2160p (aka 4K) 4:2:0 YCbCr at 10-bit, with the Rec 2020 colour gamut, using the H.265 codec.
When HDR support first arrived on the Pi 4B in LibreELEC, HDR10 content was played back with the 10-bit video truncated to 8-bit (dropping the two LSBs) and output as RGB, which meant a slightly lower quality picture was being delivered.
(I think some dither noise may have been added to reduce the visibility of the banding this would introduce on 10-bit capable displays, or those that simulate 10 bits with 8-bit panels + FRC.)
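To make the truncation and the dithering concrete, here's a minimal Python sketch. This is purely illustrative, not LibreELEC's actual pipeline; the function names and the simple uniform noise are my own assumptions (real implementations typically use filtered or error-diffusion dither):

```python
# Illustrative only: truncating 10-bit samples to 8-bit, with and without
# a simple random dither to mask the banding truncation introduces.
import random

def truncate_10_to_8(sample_10bit: int) -> int:
    """Drop the two LSBs: maps 0..1023 down to 0..255."""
    return sample_10bit >> 2

def dither_10_to_8(sample_10bit: int) -> int:
    """Add noise below the truncation point before shifting, so the
    rounding error is decorrelated across pixels and banding is
    less visible (at the cost of a little added noise)."""
    noisy = min(sample_10bit + random.randint(0, 3), 1023)
    return noisy >> 2

# A smooth 10-bit ramp collapses into coarse 8-bit steps when truncated:
ramp = list(range(512, 520))
print([truncate_10_to_8(s) for s in ramp])
# -> [128, 128, 128, 128, 129, 129, 129, 129]
```

On a gradient, those flat runs of identical 8-bit values are exactly what shows up on screen as banding; the dithered version breaks the runs up with noise instead.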
This truncation was obviously not ideal, but HDMI 2.0 doesn't support 10-bit RGB at 2160p (aka 4K) at every frame rate, so simply increasing the bit depth while staying RGB wasn't really an option (it would have meant no 2160p50 or above support).
For higher frame rate UHD HDR with at least 10 bits, HDMI 2.0a supports either 4:2:0 YCbCr at 10-bit or 4:2:2 YCbCr at 12-bit. The Pi 4B won't output 4:2:0 (as that requires both horizontal and vertical subsampling/scaling of the chroma, which the Pi GPU doesn't handle?), but it does support the horizontal-only subsampling needed for 4:2:2 YCbCr at 12-bit, which is what's now used instead.
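A toy sketch of the difference between the two subsampling schemes, under my own simplifying assumption of plain block averaging (real hardware uses properly filtered scaling):

```python
# Illustrative only: 4:2:2 subsamples chroma horizontally (one Cb/Cr value
# per 2x1 block of pixels), while 4:2:0 also subsamples vertically (one
# value per 2x2 block). Simple averaging stands in for proper filtering.

def subsample_422(chroma_row):
    """Halve chroma horizontally: average each pair of neighbours."""
    return [(chroma_row[i] + chroma_row[i + 1]) // 2
            for i in range(0, len(chroma_row) - 1, 2)]

def subsample_420(chroma_plane):
    """Halve chroma horizontally AND vertically: average each 2x2 block.
    This is the extra vertical step the Pi GPU reportedly can't do."""
    return [[(chroma_plane[r][c] + chroma_plane[r][c + 1]
              + chroma_plane[r + 1][c] + chroma_plane[r + 1][c + 1]) // 4
             for c in range(0, len(chroma_plane[0]) - 1, 2)]
            for r in range(0, len(chroma_plane) - 1, 2)]

row = [100, 102, 110, 112]
print(subsample_422(row))          # -> [101, 111]  (half the width)

plane = [[100, 102, 110, 112],
         [104, 106, 114, 116]]
print(subsample_420(plane))        # -> [[103, 113]]  (half width AND height)
```

So 4:2:2 keeps full vertical chroma resolution and only needs the horizontal scaling pass, which is why it's the workable option here.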
At UHD resolutions, HDMI 2.0a doesn't define any bit depth for 4:2:2 YCbCr other than 12-bit, so YCbCr output at UHD is always 12-bit. 10-bit content is simply padded to 12-bit with zeroes in the two LSBs.
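The padding is just a left-shift by two bits, zero-filling the new LSBs. A minimal sketch (the function name is mine):

```python
def pad_10_to_12(sample_10bit: int) -> int:
    """Zero-pad a 10-bit sample to 12 bits: shift left by 2, so the
    two new least significant bits are zero. Maps 0..1023 to 0..4092."""
    return sample_10bit << 2

print(pad_10_to_12(0))     # -> 0
print(pad_10_to_12(512))   # -> 2048 (mid-range stays mid-range)
print(pad_10_to_12(1023))  # -> 4092 (10-bit peak in the 12-bit container)
```

No information is gained or lost by this step; the 10-bit values are just carried inside the 12-bit container the 4:2:2 format requires.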
Most of my UHD replay devices (UHD BD player, Apple TV 4K etc.) are set up for 12-bit 4:2:2 output, just as the Pi outputs, as is my AVR. (My Sky Q is 4:2:0 I think; I haven't checked recently.)