All HDR is, is that instead of going from, say, 0 to 255 for brightness, we can go from 0 to 1023. Your SDR content stays mapped to 0-255 as it always was, but your HDR content can now use 256-1023 as brightness values, which go much brighter.
This is absolutely incorrect.
First and foremost, PQ and HLG have entirely different curves than SDR gamma.
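If you want to see just how different, here's a quick Python sketch. The 2.4 gamma / 100-nit peak on the SDR side is an illustrative assumption; the PQ constants are the ones from ST 2084 / BT.2100.

```python
# Rough comparison of an SDR gamma decode vs the PQ (SMPTE ST 2084) EOTF.
# The 2.4 gamma and 100-nit SDR peak are illustrative assumptions;
# the PQ constants are the ST 2084 / BT.2100 values.

def sdr_gamma_to_nits(code, bits=10, peak_nits=100.0, gamma=2.4):
    """Decode an SDR code value with a simple power-law gamma."""
    v = code / (2**bits - 1)          # normalize to 0..1
    return peak_nits * (v ** gamma)   # relative signal scaled to the display's peak

def pq_to_nits(code, bits=10):
    """Decode a PQ code value to absolute luminance in nits (cd/m^2)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code / (2**bits - 1)
    p = e ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y                # PQ is anchored to an absolute 10,000-nit scale

for code in (0, 256, 512, 767, 1023):
    print(code, round(sdr_gamma_to_nits(code), 2), round(pq_to_nits(code), 2))
```

Half the 10-bit range in PQ lands around 90 nits, and the very top code is 10,000 nits - it is not "0-255 plus some extra, brighter codes stacked on top".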
And that's how you notice HDR video - a scene can appear normal, but then it moves out into sunlight and it's a lot brighter than it otherwise could be.
Also completely incorrect.
HDR is not about making a scene brighter. It's about making parts of the screen brighter while still retaining detail in very dim parts.
The problem is that most HDR displays are crap - some even claim HDR when they can't do it (DisplayHDR 400, for example, is used on many laptops). That just means the display can do up to 400 nits.
DisplayHDR 400 looks great on an OLED laptop.
A laptop is never more than a half meter or so from your face. It doesn't need 1000 nits to be absolutely blinding.
And remember- brightness isn't about the brightness of the entire scene. It's about small parts of it. That's why peak and sustained are different measures.
The problem is, of course, if your display can only do 400 nits but needs to display 1000-nit images; those are usually tone-mapped, which can give artifacts.
All HDR imagery is tone-mapped in practice.
The artifacts you're referring to are when you try to tone map HDR down into SDR- i.e., gamma compression. This is not a problem when tone-mapping absurdly high-nit masters into one of the various HDR standard spaces.
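To make that concrete, here's a toy roll-off for the case above - a 400-nit display being fed content graded to 1000 nits. This isn't BT.2390 or any vendor's actual tone curve, just the general shape: pass through what the display can reproduce, and squeeze the rest into the remaining headroom.

```python
# Toy highlight roll-off (NOT any particular standard's tone mapper).
# Everything the display can show passes through 1:1; the rest is compressed
# smoothly into whatever headroom is left below the display's peak.

def tone_map_nits(l_in, master_peak=1000.0, display_peak=400.0, knee=0.75):
    """Map [0, master_peak] nits onto [0, display_peak] nits."""
    if master_peak <= display_peak:
        return min(l_in, display_peak)       # nothing to compress
    start = knee * display_peak              # last luminance reproduced exactly
    if l_in <= start:
        return l_in
    span_in = master_peak - start            # input range still to represent
    span_out = display_peak - start          # headroom the display actually has
    k = span_in * span_out / (span_in - span_out)
    x = l_in - start
    # slope 1 at the knee, hits display_peak exactly at master_peak
    return start + x / (1.0 + x / k)

for nits in (50, 203, 300, 500, 1000):
    print(nits, "->", round(tone_map_nits(nits), 1), "nits")
```

Everything below 300 nits comes through untouched; only the highlights get bent, which is a much gentler operation than cramming the whole range into an SDR gamma curve.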
Not that it really matters, since most content doesn't even come close to the 1000 nits of HDR10. HDR analysis of a lot of recent movies shows most generally keep to well under 1000 nits, with some, like Avatar: The Way of Water, only reaching 300 nits. (This is sort of limited because movie screens are brightness limited - most projectors simply cannot do 500 nits, and this applies to home projectors too, which is why very few support more than basic HDR10 - Dolby Vision is practically non-existent.)
This is patently fucking false, lol.
First- many movies peak past 1000 nits.
Second- you're confusing HDR parlance- which isn't entirely your fault- it's confusing.
HDR10 is good up to 10k nits- precisely the same as Dolby Vision. That 10k is, in fact, the set point of the PQ curve.
HLG is good for any amount of nits.
The PQ curve's set point being selected at 10k has nothing to do with the advantages of HDR, since movies are typically mastered at 4000 nits, and as you pointed out- projectors aren't capable of anywhere near that, or even of what a nice TV can do.
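On the HLG side, "good for any amount of nits" is literal: the HLG signal is scene-relative and never mentions nits - the display's nominal peak only enters at display time, via the BT.2100 system gamma. Here's a luma-only sketch (the constants are the BT.2100 HLG ones; the OOTF is simplified to luminance only):

```python
import math

# BT.2100 HLG constants
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(scene_linear):
    """HLG OETF: relative scene light [0,1] -> signal [0,1]. No nits anywhere."""
    if scene_linear <= 1 / 12:
        return math.sqrt(3 * scene_linear)
    return A * math.log(12 * scene_linear - B) + C

def hlg_oetf_inverse(signal):
    """Signal [0,1] back to relative scene light [0,1]."""
    if signal <= 0.5:
        return signal * signal / 3
    return (math.exp((signal - C) / A) + B) / 12

def hlg_display_nits(signal, peak_nits=1000.0):
    """Simplified luma-only OOTF: the same signal on displays with different peaks."""
    scene = hlg_oetf_inverse(signal)
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000.0)  # BT.2100 system gamma
    return peak_nits * scene ** gamma

sig = hlg_oetf(0.26)  # some fairly bright scene value
for peak in (400, 1000, 4000):
    print(peak, "nit display ->", round(hlg_display_nits(sig, peak), 1), "nits")
```

Same signal, three different displays, three different absolute luminances - which is exactly why HLG doesn't care what the peak is, while PQ bakes the absolute 0-10,000 nit scale into the curve itself.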
But that's ok, precisely because the "Brightness Wars" don't fucking exist.
Average scene brightness remains what it always was- but with HDR transfer functions, we can display far more detail there, as well as very bright peaks (like the headlights of a car, for example).
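To put numbers on it, here's where a few luminance levels land on the PQ signal (inverse EOTF). The 203-nit reference white follows BT.2408; the other points are just samples:

```python
# Where typical luminances land on the PQ signal (ST 2084 inverse EOTF).

def nits_to_pq(nits):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (10, 100, 203, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ signal {nits_to_pq(nits):.3f}")
```

Normal "SDR-ish" scene levels (100-203 nits) already sit around the middle of the signal range; everything above that is headroom for detail and peaks, not a brighter overall picture.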