HDR - It's all down to the nits....

Maybe there is theatrical scope for objects that are briefly 'uncomfortably' bright compared to the rest of the screen.

Perhaps sensing a very bright object in a part of the screen that you are not looking at (e.g. the image of the sun) would actually heighten realism?


Does anyone have insight into the Dolby Vision rollout schedule for the manufacturers that the guidebook / white paper cites?
+) Hisense :
+) Philips : 58PLF8900/F7 : Due 2015_MAY (no sign of it yet)
+) Toshiba :
+) TCL :
+) Vizio R65 : Due 2015_OCT : http://www.vizio.com/r-series

AJ

EDIT : Added Vizio (even though I did not see it listed in the article)
 
You are right. It does play a part (modest or flashy, depending).

I'm interested in your list, but more to do with color gamut. Have you any links, Antony?

I went looking at the TCL quantum dot TVs today, as they were the cheapest such TVs last year, but we only had the biggest, most expensive one. Unfortunately there are still no small ones, or ones with wide color, here (the models were only 106% color, which is bad), and no sign of new cheaper models or acceptable 140% color.

Speaking of HDR, the TCL screens are dull, but the color is amazing. I walked past a poor, dull full-HD OLED screen as well.

So I'm looking for other options.
 
Thanks Antony. This is dust that should have settled several years ago. Poor man's 4K Blu-ray; this is the HD DVD of 4K, as there was supposed to be a UV-based disc coming shortly that was under development years ago; instead we have this. The TV manufacturers could have just gone it alone and done this current 4K Blu-ray, Rec. 2020, HDR etc. five years ago to support their products. Very, very bad and costly timing for the manufacturers during a recession, when big incentives are needed for people to buy. They continue to drip out uninteresting bits of features that people are not that interested in buying: 3D, limited 4K, full 4Kp50/60, 4K Blu-ray. If they had only bunched them together several years ago, it would have been a big buy signal at those high prices before introducing the replacement. By now the full-featured sets would be under $2k, waiting for the 8K release and then successive generations of holo TV in x years' time.
 
Sitting in front of the Sports Arena LED theater screen at the Reef Casino tonight, playing The Judge. They let this thing rip, shifted down on the bottom end, but still showing the LED screen's color gamut and HDR capability. It was the biggest in the country or something when it was put in. They showed a direct-sunlight shot as he started driving, like what was talked about. Tolerable, and the rest of the light levels were tolerable too. But if they stretched it out to the LEDs' full brightness it would be too much. I don't know why they don't map the LED color and luminance spaces to a standard space, but with a bit of extra stretch for spice on the color, and as much luminance from black up as it can take without banding. Such a setup would be very good on the upcoming 4K Blu-ray HDR titles.

If only it were 4K and bigger, with color gamut control and Rec. 2020 support. A 4K laser projector with a special screen (maybe rear-projection) would be a cheap way to do it.
 
There are no Rec. 2020 displays available. Dolby Vision is all P3. The Dolby Pulsar is the only display being used to grade Dolby Vision.
Actually, there are Christie laser projectors that can display Dolby Vision images, and you can "theoretically" buy one (or two for 3D!) and color grade with those. I'm very curious to see how something graded on a laser projector would translate to a conventional xenon-lamp projector.
 
What are those HDR shows being shot on?
You can shoot on any format and still grade in HDR. The first demo of Dolby Vision I saw, a couple of years ago, featured scenes from J.J. Abrams' first Star Trek feature, shot on film.
 
I have seen a Dolby Vision demo of Oblivion and Pacific Rim. This was a year ago. Since then I have seen clips from the Lego Movie, Edge of Tomorrow and Man of Steel. Lego Movie was also UHD.

We have long captured more dynamic range than we can view on an 8-bit display. It's important to understand that the average picture level of an HDR movie should not change; we now just have much greater range for highlights. Of course, anyone can abuse it. Dolby does not impose a limit on the percentage of the screen that can be above a certain nit level. I mention this because OLED gets dimmer as more of the image gets brighter. It will be interesting to see how it all plays out given power regulations.

On the Oblivion demo, I watched someone's skin flush when there was fire on screen. Fire looks amazing in HDR. :)

The claim is that 12-bit PQ is equal to 15-bit gamma-encoded; PQ is that much more efficient. I also want to point out that when we talk about bit depth in a camera, it's usually linear light. E.g., 12-bit linear light is ~10-bit gamma-encoded. I want to say that 12-bit PQ is something like 42 stops of dynamic range; I believe that is what I recall reading. Very future-proof. :)
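For anyone who wants to poke at those numbers: the PQ curve is standardized as SMPTE ST 2084, and a few lines of code show the range a 12-bit PQ signal can span. This is a rough sketch using the constants from the ST 2084 spec; the stop count it prints is a back-of-the-envelope figure, not an official one.

```python
import math

# SMPTE ST 2084 (PQ) constants, written as the exact rationals from the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance (cd/m^2) -> normalized PQ signal in [0, 1]."""
    y = nits / 10000.0          # PQ is referenced to a 10,000-nit peak
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

def pq_decode(v: float) -> float:
    """Normalized PQ signal in [0, 1] -> absolute luminance in nits."""
    vp = v ** (1 / M2)
    y = max(vp - C1, 0.0) / (C2 - C3 * vp)
    return 10000.0 * y ** (1 / M1)

# Luminance spanned by a full-range 12-bit signal (codes 1..4095),
# ignoring the reserved codes that real video ranges set aside.
lo = pq_decode(1 / 4095)
hi = pq_decode(4095 / 4095)
print(f"lowest nonzero code: {lo:.2e} nits")
print(f"highest code:        {hi:.0f} nits")
print(f"span:                ~{math.log2(hi / lo):.0f} stops")
```

The round trip `pq_decode(pq_encode(x))` recovers `x` to floating-point precision, which is a handy sanity check when implementing either direction.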

All the other HDR schemes are 10-bit, hence the term HDR10. Dolby decided to plan for the future. UHD BD has made HDR10 mandatory and 12-bit optional. I plan full 12-bit on my next disc, plus some test clips to show 8 vs 10 vs 12-bit. There are no 12-bit displays today, so dither will be used to get down to 10-bit. I also have some nice 709 vs wide-gamut samples planned.
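On the dithering step: the standard way to drop a 12-bit master to 10-bit without introducing banding is to add a little noise before truncating. Here is a generic TPDF-dither sketch; the function name and parameters are my own illustration, not Dolby's or any mastering house's actual algorithm.

```python
import random

def dither_12_to_10(code12: int, rng: random.Random) -> int:
    """Quantize a 12-bit code value (0..4095) to 10 bits (0..1023)
    using triangular (TPDF) dither: the sum of two uniform noise
    sources is added before rounding, which converts visible banding
    into fine, signal-independent grain."""
    step = 4  # dropping 2 bits means a quantization step of 4 codes
    noise = rng.uniform(-step / 2, step / 2) + rng.uniform(-step / 2, step / 2)
    q = round((code12 + noise) / step)
    return max(0, min(1023, q))

# A 12-bit value halfway between two 10-bit codes lands on each
# neighbor about half the time, so the *average* level is preserved.
rng = random.Random(0)
samples = [dither_12_to_10(2050, rng) for _ in range(10000)]
print(sum(s * 4 for s in samples) / len(samples))  # close to 2050
```

Plain truncation would pin 2050 to the same 10-bit code every time; the dithered average preserves the original level, which is why banding disappears.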

Did a short talk on HDR recently. http://youtu.be/lPnr4H1XGog We will be doing another live on 7/16 over at HT Geeks. It's at 2 PM PST https://twit.tv/shows/home-theater-geeks I recommend you watch a recent episode with Jim Helman, the CTO of MovieLabs. https://twit.tv/shows/home-theater-geeks/episodes/257?autostart=false
 
Stacey,

Thank you for your own interview and the links to the other resources.

The Jim Helman interview answers a lot of questions that I didn't even know I had ;-)

Random thoughts:

+) Regarding Scott's point about the eye stopping down so that you only see the brightest 5 stops of the image you are looking at: does that mean that if a standard-brightness display were compared side by side with a high-brightness display, the standard-brightness display would look worse than if viewed by itself?

+) If you did introduce a backlight to a Dolby Vision screen, would this make it difficult for the eye to fully open and see the dimmest 5 stops of the available HDR imagery?

+) Do you think it was the absolute brightness that hurt your eyes in the Dolby Pulsar demo, or the brightness relative to what was probably a fully darkened room? Would it, for example, be possible to grade HDR footage with the average brightness at 50 nits, but use an even darker room?

+) If absolute brightness is important, is there a standard that replaces the nit with the brightness seen by the eye at a specified (i.e. expected) viewing distance (candela over a set angle at the eye), so that the luminance people grade on a smaller screen will match that in a cinema?

Thanks,
AJ
 
Antony, all very good questions. I will bounce them off Joel, and Dolby, and see if we can answer in the 7/16 podcast.

Antony Newman said:
Do you think it was the absolute brightness that hurt your eyes in the Dolby Pulsar demo - or relative brightness to what was probably a fully darkened room?

It was one clip in particular that hurt my eyes. In this clip, I think 90% of the screen was at a high nit level. Also, I don't think they had proper backlighting in the room. So I would say it was a combination of the two. At NAB, I recall hearing that they could only run the Pulsar at 3000 nits because they kept blowing the breaker. :)

Antony Newman said:
that the standard brightness display would look worse than if viewed by itself?

The first time I saw a DV demo was 1.5 to 2 years ago. At the time they were using two consumer Sharp THX LCD displays, which are capable of ~1200 nits. They played 709 (SDR) on the left and DV (HDR) on the right, with the clips synced. My first thought was that the demo was fake and they were making the SDR look bad on purpose. Since then I have spent a lot of time with HDR and realized it was real. I suggested to them a couple of weeks ago that they should start by showing only the SDR display, with the HDR covered, so the viewer could see how good it looks, maybe with a full run through the demo material. Then unveil the HDR display next to it and do the side-by-side. So yes, the SDR display would look better when viewed by itself, IMO. But even then, it's obvious that HDR can be much better.

Antony Newman said:
If you did introduce a backlight to Dolby Vision screen - would this make it difficult for the eye to fully open and see the dimmest 5 stops of the available HDR imagery?

I don't know. There is an ASC working group studying HDR, and backlight is part of that discussion. A backlight is generally a maximum of 10% of the total light output. I don't know if that recommendation is going to change. I can ask Joe Kane or Don Eklund, who are both part of that working group.

Antony Newman said:
Would it for example be possible to grade HDR footage with the average brightness being 50 nits, but use an even darker room?

Is the 50-nit value for theatrical, where 48 nits is the standard, or for home, where 100 nits is the standard? As I said, the average picture level is staying about the same; you just have a lot more headroom. I don't know if Baselight or Resolve have special HDR tools to control the way the highlights are distributed. Right now Dolby is requiring, as far as I know, a Pulsar for HDR authoring.

As Marc pointed out once, we also need new scopes to deal with HDR among other tools.

Dolby has built an end-to-end solution. The others are playing catchup and HDR10 has several limitations. I am waiting on Dolby to find out what is public knowledge and what is not so I can discuss on the 7/16 podcast.

We are in the early days, and the first-gen HDR sets are like the first-gen HD, 3D and UHD sets. :) Lots of limitations, and those who buy them will want to replace them in 18-24 months with something much better.

There are a whole bunch of complex remapping algorithms. The idea is that they always master on a brighter, wider-gamut display, and things get remapped to what the viewing display supports. In the DV case, these remapping algorithms come from Dolby. They also have a 3D LUT. Display manufacturers are free to add their own "de-hancement" algorithms on top of HDR, like they do with SDR. I was hoping this would not be the case.
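For readers unfamiliar with the 3D LUT mentioned above: a display pipeline stores output colors at the nodes of a small RGB lattice and trilinearly interpolates between them for every pixel. This is a generic sketch of that technique; the lattice layout and names are my own illustration, not Dolby's proprietary remapping.

```python
def sample_lut(lut, rgb):
    """Trilinearly interpolate a cubic 3D LUT.

    `lut[r][g][b]` holds an (R, G, B) output triple for each node of an
    N x N x N lattice; `rgb` is an input color with components in [0, 1].
    """
    n = len(lut)
    idx = []
    for c in rgb:
        x = min(max(c, 0.0), 1.0) * (n - 1)
        i = min(int(x), n - 2)        # lower lattice index on this axis
        idx.append((i, x - i))        # (index, fractional position)
    (ri, rf), (gi, gf), (bi, bf) = idx
    out = [0.0, 0.0, 0.0]
    # Blend the 8 surrounding lattice nodes by their trilinear weights.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = (rf if dr else 1 - rf) * \
                    (gf if dg else 1 - gf) * \
                    (bf if db else 1 - bf)
                node = lut[ri + dr][gi + dg][bi + db]
                for k in range(3):
                    out[k] += w * node[k]
    return tuple(out)

# Sanity check: an identity LUT must return its input (within float error).
N = 5
identity = [[[(r / (N - 1), g / (N - 1), b / (N - 1))
              for b in range(N)] for g in range(N)] for r in range(N)]
print(sample_lut(identity, (0.3, 0.7, 0.2)))
```

A real display LUT would bake the tone-map and gamut-map into the node values, so the per-pixel cost stays constant no matter how complex the mapping is.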

I liked Sony's HDR demo at Cine Gear. It was their OLED BVM, and only a small portion of the screen can be at 1000 nits. What I really liked was one scene with a young woman in a red dress. She looked the same on the 709 BVM and the 2020 BVM (same display, different memory). The BVM covers 80% of 2020. Anyway, the young woman looked the same on both. The other scenes, with sunset and carnival lights, really showed where the wider gamut comes into play. The sunset really stood out between HDR and wide gamut. I don't recall if it was this demo or at NAB, but there was a kayak, and the HDR made the water droplets really stand out.

At HPA one DP said he was not sure how he would use HDR to tell a better story. I was thinking to myself, does it really matter? Can't the image just look better for the sake of looking better? :) I know they had the same dilemma with 3D. Of all the 3D movies, I thought Coraline did the best job of using 3D to improve the story.
 
At HPA one DP said he was not sure how he would use HDR to tell a better story. I was thinking to myself, does it really matter? Can't the image just look better for the sake of looking better? :) I know they had the same dilemma with 3D. Of all the 3D movies, I thought Coraline did the best job of using 3D to improve the story.
I have to say, many (if not most) DPs I work with aren't willing to use the entire 100 nits available in a regular Rec709 monitor. We constantly bring things down to make the scene darker, denser, and more dramatic. It's going to be hard to convince them of the need to use a much larger dynamic range, and also how best to use it. I'm not convinced a one-size-fits-all will be able to handle the conversion.
 
At NAB I was speaking with a colorist from Deluxe, and he said they were being very conservative with HDR.
Which makes you wonder why people are worrying about 1000-nit monitors... :coolgleamA:
 
The first-gen displays are not anywhere near 1000 nits. :) Samsung has provided a special modified version of their SUHD display that has a much beefier power supply in order to get closer to 1000 nits. The one consumers can buy might do half of that if they are lucky. So many things are up in the air at this point in terms of nits, gamut, HDR standards, UHD, HDMI, HDCP, etc. I would not buy a new display until 2017, unless you don't mind buying another one at that time.
 