I believe we've been over this before. It appears I failed to make you understand.
You cannot explain to me what you do not understand.
If they weren't talking about higher bit depth, it wouldn't matter which display system was involved. The most widely supported format for HDR displays is HDR10, which uses 10 bits per channel. You can view HDR content on an 8-bit display, but you won't get full fidelity. Remember Half-Life 2: Lost Coast? (Apparently Riven also used HDR techniques, though I can't say I noticed.)
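To put rough numbers on the bit-depth point: 8 bits gives 256 code values per channel and 10 bits gives 1024, so each quantization step covers a quarter as much of the range and banding is correspondingly finer. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope: code values per channel and banding step size.
for bits in (8, 10):
    levels = 2 ** bits               # distinct code values per channel
    step = 100.0 / (levels - 1)      # % of the signal range per step
    print(f"{bits}-bit: {levels} levels, {step:.3f}% of range per step")
```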
I can make HDR images on my camera using Magic Lantern, and have. They're generated from bracketed exposures. The results are striking even on a normal display with a typical color gamut.
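For the curious, here's a minimal sketch of that bracketed-exposure technique, using OpenCV's Mertens exposure fusion rather than Magic Lantern's own pipeline; the filenames are hypothetical:

```python
import cv2
import numpy as np

# Hypothetical filenames: the same scene bracketed at -2 / 0 / +2 EV.
exposures = [cv2.imread(p) for p in
             ("bracket_m2ev.jpg", "bracket_0ev.jpg", "bracket_p2ev.jpg")]

# Mertens exposure fusion keeps the best-exposed regions of each frame.
# Its output is display-referred, which is why the result still looks
# striking on an ordinary SDR monitor with no tone-mapping step.
fused = cv2.createMergeMertens().process(exposures)  # float32, ~[0, 1]
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```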
Even displays with 8-bit panels can get more out of an HDR signal (including >8-bit input), so even I would like to have it with my cheapass LG 43UT80[00]. Unfortunately, I have an Nvidia card.
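The usual mechanism is temporal dithering (FRC): the panel flickers between adjacent 8-bit levels fast enough that the eye averages them into an intermediate value. A toy sketch of the idea, not any vendor's actual algorithm:

```python
import numpy as np

# Temporal dithering (FRC) in miniature: an 8-bit panel alternates
# between neighboring 8-bit levels so the eye averages to a 10-bit one.
rng = np.random.default_rng(0)
target10 = 513                 # a 10-bit code an 8-bit panel can't show
target8 = target10 / 4.0       # = 128.25 in 8-bit terms

frames = np.floor(target8 + rng.random(240)).astype(np.uint8)  # 240 frames
print("8-bit frames used:", np.unique(frames))   # [128 129]
print("perceived level  :", frames.mean() * 4)   # ~513
```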
Now, what were you saying? I don't remember it being very interesting, but do go on.