

HDR-2084 Support via REDCAST module (Epic and Scarlet Dragon)

Bruce Allen said:
Also, there's another way to get HDR content onto the TV

You can also encode using x265 and set the transfer function to 2084 (16, I believe) and that will put the display into HDR mode. You should also set the mastering display info, which includes the mastering display primaries, white point and nits capability. Also set MaxFALL and MaxCLL, basically the maximum frame-average and maximum per-pixel light levels in the content. Does not make much sense for static metadata, though.
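To make that concrete, here's a rough sketch of what the x265 invocation looks like, driven from Python. The file names are hypothetical, and the mastering-display string (P3 primaries and D65 white point in x265's 0.00002 chromaticity units, luminance in 0.0001 nit units) plus the MaxCLL/MaxFALL numbers are placeholder examples, not measured values:

    # Sketch of an HDR10 encode via the x265 CLI; all numbers are examples.
    import subprocess

    cmd = [
        "x265",
        "--input", "source.y4m",    # hypothetical source file
        "--output-depth", "10",     # HDR10 is 10-bit
        "--colorprim", "bt2020",
        "--transfer", "smpte2084",  # ST 2084 PQ (transfer characteristic 16)
        "--colormatrix", "bt2020nc",
        # ST 2086 mastering display metadata: primaries and white point,
        # then max/min luminance (here 1000 nits max, 0.0001 nits min)
        "--master-display",
        "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)",
        "--max-cll", "1000,400",    # MaxCLL,MaxFALL in nits
        "--output", "out.hevc",
    ]
    subprocess.run(cmd, check=True)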
 

Stacey, you are a true fount of knowledge.

Bruce Allen
www.twitter.com/bruce_allen
 
If shooting 5k or 6k (or 8k!) how is the UHD output derived? Is it scaled or cropped?
In addition to improved WYSIWYG monitoring for aerial work, it could be very handy to record in the cockpit, off camera, or is that expecting too much?

mike brennan
 

So far this only exists for 6K, and as I understand it, it's a scale to UHD. So when/if it's done for the 8K Weapon, I assume it will work the same.
 
So, there is a lot of misinformation out there in connection with HDR. First of all, the monitor needs to be HDR compliant. That in itself is not so simple. A monitor can be HDR compliant and yet possibly not show HDR. How? Well, the HDR master is actually called HDR-10. It consists of a number of layers, depending on how it was prepared. If it is an Alliance Professional compliant master, then it starts as an HD SDR image. An HDR compliant monitor can decode this signal, but if it is incapable of displaying it (brightness), then you will only see the HD SDR image. An HDR CAPABLE monitor is able to decode a second HDR and higher resolution layer, which consists of extra metadata. Using that extra metadata, the HDR monitor will reconstruct the higher resolution and HDR image. Right now there are two minimum brightness requirements for HDR: one for OLED, which starts at 540 nits (if I remember it correctly), and another for other technologies, which starts at 1000 nits and goes up to 10k nits. OLED is capable of infinite darkness and can therefore meet the HDR spec at much lower brightness. Sony's OLED BVM-X300 ($30K) is 1000 nits. Right now the brightest monitor available is Dolby's Pulsar, which is capable of 4000 nits. That monitor is a requirement if you master for Dolby Vision. It costs $250K and can only be leased from, you guessed it, Dolby.
Interesting fact: for an Alliance Professional HDR-10 type master you normally start with an SDR pass and then apply a PQ 2084 curve (the Dolby-designed gamma curve) matched to your monitor brightness: 1000, 2000, 4000 nits, etc. You then use standard grading tools for that pass. For Dolby Vision you start with your HDR version (4000 nits) and then do an SDR trim pass using Dolby hardware and Dolby tools. You guessed it, it's very expensive.
But as for the result, at least for me, Dolby images are unmistakably HDR; Alliance Professional images are not. There is an HDR standards war raging right now, and I would recommend holding off on buying any HDR monitor. You may end up with a monitor that is outdated because it doesn't support the approved HDR standard.
Incidentally, all HDR-10 masters use the PQ gamma curve, unless you are using HLG (Hybrid Log-Gamma), proposed by the BBC and NHK. That's another standard, to be used mostly for live TV. You start with an HDR image, apply the HLG gamma on the fly and voila, you have a decent SDR-compatible image.
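For what it's worth, the PQ 2084 curve itself is just published math, so here's a minimal sketch of the encode/decode pair using the standard SMPTE constants (just the curve, nothing vendor specific):

    # Minimal ST 2084 (PQ) curve using the published SMPTE constants.
    # pq_encode maps absolute luminance in nits (0..10,000) to a 0..1 signal;
    # pq_decode is the inverse (what the display's EOTF does).
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits: float) -> float:
        y = max(nits, 0.0) / 10000.0
        return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

    def pq_decode(v: float) -> float:
        p = max(v, 0.0) ** (1 / m2)
        return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    # A 1000 nit highlight only reaches ~75% of the signal range, which is
    # why PQ is an absolute curve rather than a relative one like BT.1886.
    print(round(pq_encode(100), 3), round(pq_encode(1000), 3), round(pq_encode(4000), 3))
    # -> 0.508 0.752 0.903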
And finally, there is no such thing as "shooting for HDR". Any modern camera shoots HDR images. HDRx has nothing to do with it.
It's the monitors that need to change.
Phew, good luck with all that...
 

Agree with everything you've written on that pesky little HDR thing :)

Though, I do think it's interesting, in terms of HDRx capture, to explore those additional stops and experiment with various ways of finishing them for these more modern 1000-4000 nit displays. Mainly in the 3-4 stop range of HDRx for me personally.

But yes, any camera with a decent amount of total captured dynamic range can be used to create an HDR deliverable. If your camera can capture more DR, especially if the shot has "more", it's certainly an interesting world to explore.
 
Jake, yes, the module can output SDR (BT.1886/Rec.709) and HDR (ST 2084/BT.2020) at the same time. The OMOD can also do this, assuming you have the color feature. You set the camera to Log3G12 and then 3D LUTs in the module do the rest.
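For anyone curious what "the 3D LUTs do the rest" means mechanically: a 3D LUT is just a lattice of output colors indexed by input RGB, sampled with trilinear interpolation. Here's a toy sketch; a real Log3G12-to-ST 2084 transform would live in the LUT data itself (typically loaded from a .cube file), which I'm faking with an identity lattice so the sketch runs standalone:

    # Toy 3D LUT application with trilinear interpolation (numpy).
    import numpy as np

    N = 33                                  # lattice points per axis
    grid = np.linspace(0.0, 1.0, N)
    r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
    lut = np.stack([r, g, b], axis=-1)      # identity LUT, shape (N, N, N, 3)

    def apply_lut(rgb, lut):
        """rgb: array of shape (..., 3) in [0, 1]; trilinear LUT lookup."""
        n = lut.shape[0]
        x = np.clip(rgb, 0.0, 1.0) * (n - 1)
        i0 = np.floor(x).astype(int)        # lower lattice corner
        i1 = np.minimum(i0 + 1, n - 1)      # upper lattice corner
        f = x - i0                          # fractional position, shape (..., 3)
        out = np.zeros(np.shape(rgb), dtype=float)
        for dr in (0, 1):                   # blend the 8 surrounding corners
            for dg in (0, 1):
                for db in (0, 1):
                    idx = (i1[..., 0] if dr else i0[..., 0],
                           i1[..., 1] if dg else i0[..., 1],
                           i1[..., 2] if db else i0[..., 2])
                    w = ((f[..., 0] if dr else 1 - f[..., 0]) *
                         (f[..., 1] if dg else 1 - f[..., 1]) *
                         (f[..., 2] if db else 1 - f[..., 2]))
                    out += w[..., None] * lut[idx]
        return out

    print(apply_lut(np.array([0.18, 0.5, 0.9]), lut))  # identity -> ~[0.18 0.5 0.9]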

I would also add that Dolby Vision is 12-bit, which PQ was designed for, while HDR 10 is 10-bit. This includes delivery to the consumer. To the consumer, Dolby is 4:2:2 while HDR 10 is 4:2:0.

Dolby also defines the tone and gamut remapping algorithms, while HDR 10 does not. BT.HDR defines some of this. E.g. the Samsung JS9500 and the 2016 version don't remap the gamut correctly; they compress it, so colors inside the native gamut will be wrong. They also did not implement the PQ curve correctly. LG is better in all regards for HDR. A single pixel on the LG can go full brightness; an LCD can't do this. Makes for impressive star fields. LG has its own quirks, but is far superior to Samsung in HDR consumer playback.
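A toy illustration of that clip-vs-compress difference (made-up numbers, one abstract "saturation" axis instead of a real 3D gamut):

    # Toy model: the display reproduces saturation up to 1.0,
    # while the content gamut reaches 1.2.
    CONTENT_MAX, DISPLAY_MAX = 1.2, 1.0

    def clip(sat: float) -> float:
        # Out-of-gamut colors are clamped; in-gamut colors pass through untouched.
        return min(sat, DISPLAY_MAX)

    def compress(sat: float) -> float:
        # Everything is scaled down, so even in-gamut colors get desaturated.
        return sat * (DISPLAY_MAX / CONTENT_MAX)

    for s in (0.5, 0.9, 1.2):
        print(f"sat {s}: clip -> {clip(s):.3f}, compress -> {compress(s):.3f}")
    # sat 0.5 stays 0.5 when clipping but becomes ~0.417 when compressing --
    # that's the "colors inside the native gamut will be wrong" complaint.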

1000 nits looks like SDR next to a 4000 nit Pulsar.
 
One thing I don't understand with HDR: HDR is a monitoring standard and impacts the way you see an image (with the appropriate signal), but if a camera can do 2 stops of DR, why shouldn't it be possible to map those stops to the darkest and brightest values of an HDR image? HDR, as I understand it, has nothing to do with the tonal reproduction of an image captured by a camera, but with the contrast between the darkest and the brightest element in the displayed picture. Now the only problem could happen if there are only a few bits to map those 2 stops, creating banding in the transitions.

Is my understanding right?

Pat
 
The ST2084 curve is an absolute curve, not relative. You don't remap into its full range, as that is 10,000 nits. This brings up another point. If you have a 4000 nit display that you are using from a live camera and the content goes over 4000 nits, you will see it clip on the display even though the content itself may not be clipped. The pro display does not remap outside of the simulated high nits value. E.g. the Canon display allows you, through the menu, to claim the display is 100 to 4000 nits even though its actual nits output is in the hundreds.

Using 10-bit values, today's 0-100 nits would be mapped from code 64 (~0 nits) to ~512 (~100 nits); 940 would be 10,000 nits. Clipping also occurs on the bottom end now. E.g. 64 would be 0 nits. If you master on a display that can only do 0.05 nits, then all values below the code value that represents 0.05 nits (don't recall it off hand) would be clipped by the display. This becomes really interesting if you master on a Pulsar, with 0.005 nits black, and play back on an LG OLED that can do 0 nits black. What do they do with the bottom end? They are supposed to clip and output at that nit value. However, some consumer displays are stretching it down to 0 nits. I don't recall if UHDA Premium allows that.
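If anyone wants to sanity-check those code values, the nits-to-code mapping is simple enough to do yourself: 10-bit narrow range puts a 0.0 signal at code 64 and 1.0 (10,000 nits) at code 940, and the PQ curve does the rest. A quick sketch, using the same published ST 2084 constants as earlier in the thread:

    # Quick check of the 10-bit narrow-range code values quoted above:
    # code 64 = 0.0 signal, code 940 = 1.0 signal = 10,000 nits.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def nits_to_code(nits: float) -> int:
        y = (nits / 10000.0) ** m1
        v = ((c1 + c2 * y) / (1 + c3 * y)) ** m2   # PQ-encoded 0..1 signal
        return round(64 + v * (940 - 64))          # 10-bit narrow range

    for nits in (0.005, 100, 1000, 4000, 10000):
        print(nits, "nits -> code", nits_to_code(nits))
    # -> 0.005 nits is code 77, 100 nits lands near code 509 (the ~512
    #    quoted above), 1000 -> 723, 4000 -> 855, and 10,000 -> exactly 940.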

On the subject of UHDA Premium: this was a program created by display manufacturers with a vested interest, and the bar is pretty low to pass. As Jake pointed out, there are two ways: 0.05 nits black with 1000 nits peak, or 0.0005 nits with 540 nits. Many believe the 540 nits tier was for OLED; it is actually for LCDs with global dimming. The LG OLED can do more than 540 nits; it does more than the new VIZIO P series. On the subject of VIZIO and UHDA, VIZIO spoke out against it today. My comment is that the UHDA Premium logo doesn't mean a display without it can't be superior.

Right now Samsung is fighting LG on how to count resolution. LG OLED is a pentile display, so it has RGBW subpixels. This means that their UHD displays, under some definition of counting pixels, are really 2880 wide vs. 3840. Ironically Samsung phones and computers also use pentile displays, but that market does not care. I think LG should get number of zones added to UHDA. I mean LG could rightly say that they have 6 million zones compared to a couple of hundred on Samsung.
 
The whole HDR business is very confusing. And that includes things that on the surface shouldn't be, like monitor brightness. Apparently, a monitor with 1000 nits brightness doesn't mean that the monitor can hit 1000 nits across the whole picture. And that is the rub: there is no spec on the area of the picture that needs to hit the specified brightness. And another one, like HDR compliant vs HDR capable. Unless I'm mistaken, an HDR compliant monitor can receive the Alliance Premium logo certification, yet the monitor itself wouldn't be capable of displaying an HDR image.
 
Well presented HDR, even next to a grade 1 SDR display, looks more like how we (assuming decent vision) experience real life. In my perfect world, the studios would be re-mastering for HDR at the same time they are pumping out UHD/4K deliverables (in 12 bit, please). The current state of HDR, as Jake notes, is full of warring stakeholders and incompatible specs. Yuck.

As an imaging professional I want to create the most compelling visuals possible, and HDR is a powerful tool. As a resident of planet earth, I sure hope credible HDR displays can be designed that don't have massive power requirements...

Cheers - #19
 
Any word on what Apple are doing in the HDR space?
If it's anything like their Mac Pro line, they're working on a low-dynamic range monitor to buck the trend. (Mad at Apple at the moment.)

Bear in mind this is a company that also never supported Blu-ray or recording TV shows off cable or satellite on their devices.
 
Progress report...

Still very interested in a REDCAST module for DSMC2. TBH, I figured we'd have built-in UHD/4K monitor output on top-tier RED cameras/expanders by now.

It's not like anyone at RED owes me an explanation. If I don't like the feature set I can always buy a different camera. I get it. That said, I have a dream where I can shoot DSMC2 - AND - have live UHD/4K output. Jarred referenced broadcast clients recently; are they putting Dragon sensors into configs that support UHD/4K broadcast infrastructure? If so, do I need to purchase such a camera from a 3rd party? I suppose it's possible that 3rd parties are buying Dragon sensors just for 1080p slo-mo/special applications and don't care about live UHD/4K...

Would appreciate some guidance on the roadmap, if any, to UHD/4K output from camera/expander.

Cheers - #19
 