Since buying a 4K HDR TV (an LG OLED C9), I have struggled to find a way to display astro-images in HDR mode. As far as I can tell, HDR is only triggered by a 10-bit 4K HDR video source.
Using the (free) ffmpeg software, I have now found a way to generate a 10-bit 4K HDR video that preserves both the natural colour and the natural range of relative intensities. The latter will make the image look underwhelming, because we are used to images that are highly stretched; from a technical viewpoint, however, it is an interesting challenge.
The video (which contains a static image) has been uploaded to YouTube:
The "2160p 4K" quality option will only be available if you have compatible equipment - my TV displays an HDR logo when it switches into HDR mode to play the video.
A preview is here, which does not do the HDR video justice:
Watched in HDR, the stars are intensely bright but the YouTube transcoding has created artifacts around the bright stars. The original video can be downloaded here:
Ignore the terribly washed-out preview that Google Drive provides.
For those interested in the technical details, the main steps were as follows:
- Stack the data (unmodified Canon 600D on Tak Epsilon, total exposure 30min)
- Transform the linear data to the primaries of the Rec. 2020 colour space
- Apply the SMPTE ST 2084 Perceptual Quantizer (PQ) transfer curve to the data instead of the usual Rec. 2020 gamma
- Run ffmpeg on the resulting 16-bit TIF to generate the 10-bit 4K HDR video
- Copy it onto a USB stick for final display on the TV in the lounge!
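For anyone who wants to experiment, here is a rough Python/NumPy sketch of the colour-space rotation and the PQ encode (steps 2 and 3). The matrix assumes the stacked data starts in linear Rec. 709 primaries, and the mapping of linear 1.0 to 100 nits is an illustrative assumption - adjust to taste:

```python
import numpy as np

# Standard linear Rec.709 -> linear Rec.2020 primaries matrix (rows sum to 1)
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def pq_encode(y):
    """SMPTE ST 2084 (PQ) encode; y is display luminance / 10000 nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = np.clip(y, 0.0, 1.0)
    yp = np.power(y, m1)
    return np.power((c1 + c2 * yp) / (1.0 + c3 * yp), m2)

def linear_to_hdr10(rgb_linear, peak_nits=100.0):
    """Rotate linear Rec.709 RGB into Rec.2020 primaries, then PQ-encode.
    peak_nits (an assumption here) sets the absolute luminance of linear 1.0."""
    rgb2020 = rgb_linear @ M_709_TO_2020.T
    return pq_encode(rgb2020 * peak_nits / 10000.0)
```

The PQ-encoded result can then be scaled to 16 bits and saved as the TIFF that ffmpeg reads in the next step.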
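The final ffmpeg step looked roughly like the command below. The filename is a placeholder and the exact flags are illustrative rather than a guaranteed recipe - in particular, check the x265 parameters that tag the stream as BT.2020/PQ against your own ffmpeg build:

```shell
# Loop a single 16-bit TIFF into a 30-second 4K clip, encode 10-bit HEVC,
# and tag the stream with Rec.2020 primaries and the ST 2084 (PQ) transfer.
ffmpeg -loop 1 -i image_pq_2020.tif -t 30 -r 25 \
  -vf "scale=3840:2160" \
  -c:v libx265 -crf 18 -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
  -x265-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc" \
  output_hdr.mp4
```

The `-color_*` flags set the stream-level metadata, while the `x265-params` write the same signalling into the HEVC bitstream itself, which is what finally convinces the TV to switch into HDR mode.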
Edited by sharkmelley, 25 February 2020 - 10:34 AM.