The thing with the Intel driver is that it does not allow deep color 4:2:0/4:2:2 2160p50/60 modes. It prefers RGB 8-bit, and I'm not sure that will ever change. I think the driver can be hacked to allow 4:2:0, but I haven't tested it.
10-bit SDR and HDR sources don't look very good when downconverted to 8-bit without dithering (e.g. skies almost always look terrible, with tons of banding). With dithering enabled in the driver, the banding is not really an issue anymore.
If it's a choice of 8-bit with or without dithering, then sure, add the dither; it will mask the banding by adding extra noise to the picture. But I would always choose clean 10-bit video carried in 10-bit 4:2:0 or 12-bit 4:2:2 over 8-bit RGB/4:4:4 with dither noise.
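To make that trade-off concrete, here's a rough Python sketch (my own illustration, not what the Intel driver actually does) of taking a smooth 10-bit gradient down to 8 bits by plain truncation versus adding noise before rounding. Truncation leaves long flat runs of identical codes (the visible bands); the dithered version trades them for fine noise.

```python
import numpy as np

grad10 = np.linspace(0, 1023, 3840)           # one scanline of a 10-bit sky gradient

truncated = (grad10 // 4).astype(np.uint8)    # straight 10-bit -> 8-bit, bands every 4 codes

noise = np.random.uniform(-0.5, 0.5, grad10.shape)
dithered = np.clip(np.round(grad10 / 4 + noise), 0, 255).astype(np.uint8)

def longest_flat_run(line):
    # Banding shows up as long runs of the same 8-bit code along the gradient.
    runs = np.split(line, np.where(np.diff(line) != 0)[0] + 1)
    return max(len(r) for r in runs)

print("longest flat run, truncated:", longest_flat_run(truncated))
print("longest flat run, dithered: ", longest_flat_run(dithered))
```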
So Intel are basically limiting HDR in their Linux drivers to 2160p at 30fps or below, and only allowing 8-bit output above 30fps at 2160p? That's a crazy limitation. It means that any HDR live TV shot at 2160p50 or 2160p60 isn't viewable on their platform under Linux? (BBC iPlayer's 2160p50 live sport in HDR, for instance?)
When you say it 'prefers RGB 8-bit', what happens if RGB isn't available as an option? Does it then fall back to YCbCr 4:4:4 8-bit? (I'm thinking of a custom EDID that said 'YCbCr only'.) I wonder if it's possible to create an EDID that only advertises 4:2:2 and 4:2:0 at 2160p50/60?
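For what it's worth, the route I'd try is an EDID override (untested sketch in Python below). My understanding is that the base EDID block always has to advertise RGB, so the place to restrict 2160p50/60 is the CTA-861 extension (the YCbCr 4:2:0 video / capability-map data blocks); the exact bytes depend on your TV's EDID, so decode it with edid-decode first - the offset in the comment is a placeholder, not a known-good value. The kernel can then load the patched binary at boot via drm.edid_firmware= (if EDID firmware loading is enabled in your kernel).

```python
# Dump the TV's EDID from sysfs, edit it, fix the block checksums, and save it
# where the firmware loader can find it. Connector name and output path are
# assumptions for this sketch.
SRC = "/sys/class/drm/card0-HDMI-A-1/edid"
DST = "/lib/firmware/edid/yuv-only.bin"   # boot with drm.edid_firmware=HDMI-A-1:edid/yuv-only.bin

edid = bytearray(open(SRC, "rb").read())

# ... edit the colour-format / 4:2:0 capability bits here; offsets are
# EDID-specific, so PLACEHOLDER_OFFSET / SOME_BIT are purely illustrative ...
# edid[PLACEHOLDER_OFFSET] &= ~SOME_BIT

# Every 128-byte EDID block ends with a checksum byte that makes the block
# sum to 0 mod 256 -- recompute it for each block after editing.
for block in range(len(edid) // 128):
    start = block * 128
    edid[start + 127] = (256 - sum(edid[start:start + 127])) % 256

open(DST, "wb").write(bytes(edid))
```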
4:2:0 10-bit YCbCr output would be fine for HDR video content (most of this will be delivered 4:2:0 after all - that's what's used for streaming and UHD Blu-ray). The only trade-off would be that a 4K UI might look slightly softer, and 4K artwork and photos would have 1920x1080 chroma resolution (4:2:2 would give 1920x2160 chroma resolution).
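Just to sanity-check those numbers for a 3840x2160 frame (a quick Python throwaway, nothing more):

```python
# 4:2:2 halves the chroma planes horizontally only; 4:2:0 halves them both ways.
W, H = 3840, 2160
subsampling = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}
for name, (sx, sy) in subsampling.items():
    print(f"{name}: luma {W}x{H}, chroma {W // sx}x{H // sy}")
```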
I can understand Intel wanting RGB/4:4:4 to be preferred for desktop use: it avoids the smeary chroma that 4:2:2 and 4:2:0 introduce on very fine coloured picture detail, such as pixel-wide coloured text, which would not be nice on a UI monitor. But you don't really need 10-bit or higher bit depth on a UI display unless you are grading video or watching Netflix in a window.
However, as HDMI 2.0 has no >8-bit mode that supports 2160p50 and above in RGB/4:4:4, you have to accept reduced chroma resolution to get the bit depth needed for clean HDR video output (the bulk of which will be sourced as 4:2:2 or 4:2:0 anyway). I wonder if Intel just don't 'get' this issue yet?
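Rough numbers on why, as I understand the link budget: the standard 2160p60 timing is a 4400x2250 total raster at 60 Hz, i.e. a 594 MHz pixel clock, against HDMI 2.0's 600 MHz TMDS ceiling. The 4:2:2 rule below is my simplified reading of the spec (it's carried in a 12-bit container at the 8-bit clock), so treat this as a back-of-envelope sketch rather than gospel:

```python
PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6   # 594.0 MHz for 2160p60
HDMI20_MAX_TMDS_MHZ = 600.0

def tmds_clock_mhz(pixel_clock_mhz, fmt, bpc):
    """Approximate TMDS character rate for a pixel format / bit depth.

    Simplifying assumptions:
      - RGB and YCbCr 4:4:4 scale the clock by bpc/8 (deep colour).
      - YCbCr 4:2:2 rides in a 12-bit container at the 8-bit clock.
      - YCbCr 4:2:0 halves the pixel rate, then scales by bpc/8.
    """
    if fmt in ("RGB", "YCbCr444"):
        return pixel_clock_mhz * bpc / 8
    if fmt == "YCbCr422":
        return pixel_clock_mhz
    if fmt == "YCbCr420":
        return pixel_clock_mhz / 2 * bpc / 8
    raise ValueError(fmt)

for fmt, bpc in [("RGB", 8), ("RGB", 10), ("YCbCr422", 12), ("YCbCr420", 10)]:
    clk = tmds_clock_mhz(PIXEL_CLOCK_MHZ, fmt, bpc)
    verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "exceeds HDMI 2.0"
    print(f"{fmt:9s} {bpc:2d}-bit: {clk:6.1f} MHz TMDS -> {verdict}")
```

That's the crux: RGB 10-bit at 2160p60 needs ~742 MHz, which is why only 8-bit RGB or subsampled YCbCr survives at 50/60fps on HDMI 2.0.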
Is the discussion here relevant to this issue: [GLK] no signal - with samsung 4k TV - HDMI UHD Color (ENABLED) (#271) · Issues · drm / intel · GitLab?