4K 4:2:0 with Deep Color is not part of the HDMI spec. It should always be 8-bit.
No idea how to force YCbCr 4:4:4. The Intel driver knows what's best for you and uses RGB.
That's not correct, if by Deep Color you mean >8-bit depths. In fact, it's quite the opposite.
4:2:0 is supported in 8-, 10-, 12- and 16-bit 2160p50/60 modes - it's the only sub-sampling format supported at all bit depths for 2160p50/60 output. 4:2:0 was added in HDMI 2.0 and is only supported for 2160p50/60 modes, with no support at 2160p24-30.
RGB/4:4:4 is only supported in 8-bit at 2160p50/60 - so whilst you may be able to flag an HDR EOTF with 2160p50/60 RGB/4:4:4 output, you can't output 10-bit sources in 10-bit or 12-bit RGB or 4:4:4 at 2160p50/60. (RGB/4:4:4 is only an option for HDR at 2160p24-30.)
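For anyone who wants to see why the numbers fall out that way (for both the 4:2:0 and the RGB/4:4:4 cases above), here's a rough back-of-the-envelope Python sketch - my own simplification, not the spec's packing tables. It assumes the standard 594 MHz pixel clock for 2160p60 8-bit 4:4:4 and HDMI 2.0's 600 MHz TMDS character-rate ceiling:

```python
PIXEL_CLOCK_2160P60 = 594.0   # MHz - standard CTA-861 pixel clock for 2160p60, 8-bit 4:4:4/RGB
TMDS_LIMIT = 600.0            # MHz - HDMI 2.0 maximum TMDS character rate (18 Gbps link)

def tmds_character_rate(sampling, bit_depth):
    """Approximate TMDS character rate in MHz for a 2160p60 signal."""
    if sampling == "4:2:0":
        # 4:2:0 halves the character rate relative to 4:4:4, then scales with bit depth
        return PIXEL_CLOCK_2160P60 / 2 * bit_depth / 8
    if sampling == "4:2:2":
        # 4:2:2 is always packed as 12-bit at the same rate as 8-bit 4:4:4
        return PIXEL_CLOCK_2160P60
    # RGB / 4:4:4 scales directly with bit depth
    return PIXEL_CLOCK_2160P60 * bit_depth / 8

for sampling, depth in [("RGB/4:4:4", 8), ("RGB/4:4:4", 10), ("RGB/4:4:4", 12),
                        ("4:2:2", 12),
                        ("4:2:0", 8), ("4:2:0", 10), ("4:2:0", 12), ("4:2:0", 16)]:
    rate = tmds_character_rate(sampling, depth)
    verdict = "fits" if rate <= TMDS_LIMIT else "exceeds HDMI 2.0 limit"
    print(f"{sampling:>9} {depth:2d}-bit: {rate:6.1f} MHz -> {verdict}")
```

Running that shows 8-bit RGB/4:4:4 landing right on 594 MHz, 10-bit and 12-bit RGB/4:4:4 blowing past 600 MHz, and every 4:2:0 and 4:2:2 combination fitting - which is the pattern in the spec's 2160p50/60 tables.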
4:2:2 is the only >8-bit format supported at all 2160p frame rates, from 24p to 60p. It is carried at 12-bit depth only. (So 8-bit SDR and 10-bit SDR or HDR (*) video is padded - there are no 8-bit, 10-bit or 16-bit options for 4:2:2 output at any frame rate in the spec.)
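The padding is just MSB-justification into the 12-bit container with the low bits zero-filled - a tiny illustrative sketch (not any driver's actual packing code):

```python
def pad_to_12bit(sample, source_depth):
    """MSB-justify a sample into the 12-bit 4:2:2 container, zero-filling the low bits."""
    return sample << (12 - source_depth)

# e.g. an 8-bit value 0xB5 becomes 0xB50, a 10-bit value 0x2F3 becomes 0xBCC
print(hex(pad_to_12bit(0xB5, 8)), hex(pad_to_12bit(0x2F3, 10)))
```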
4:2:2 is thus the preferred mode for 2160p HDR output, as it is supported at all frame rates. (This is why it is used by a lot of consumer products.)
4:2:0 is the second-best option for 2160p HDR output at 2160p50/60 - but it can't be used for 2160p24-30.
4:4:4/RGB is only an option for 2160p HDR in 2160p24-30 modes.
Therefore, if you can't use 4:2:2 to output 10-bit HDR material properly (i.e. without truncating to 8-bit), you have to use RGB/4:4:4 output for 2160p24-30 and 4:2:0 output for 2160p50/60. The only format supported at all frame rates is 4:2:2.
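Put as a few lines of Python (the function and its names are mine, purely to summarise the logic above):

```python
def pick_2160p_hdr_output(frame_rate, can_use_422):
    """Summarise the above: preferred HDMI 2.0 output format for >8-bit 2160p HDR."""
    if can_use_422:
        return "YCbCr 4:2:2 12-bit"        # works from 24p to 60p
    if frame_rate <= 30:
        return "RGB/4:4:4 at 10/12-bit"    # only fits at 2160p24-30
    return "YCbCr 4:2:0 at 10/12-bit"      # only >8-bit option at 2160p50/60

for fps in (23.976, 25, 29.97, 50, 59.94):
    print(fps, "->", pick_2160p_hdr_output(fps, can_use_422=False))
```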
See the attached HDMI 2.0 timings chart downloaded from the HDMI site. HDMI 2.0b or HDMI 2.1 may have added >8-bit support for RGB/4:4:4 in 2160p50/60 modes - but it certainly wasn't supported in HDMI 2.0 (or 2.0a, AFAIK).
(*) I watch quite a lot of self-mastered HD SDR 10-bit content - it looks so much nicer than 8-bit content covered in quantisation banding 