

Asus ProArt PA32DC OLED Review

The definition technically allows it, but the client specs say something different. I know of cases where the facilities are using LG 32EP950 (which I prefer to the Asus) or the Asus PA32, and just being careful about average Dolby Vision levels not getting too high. As one example, I know of a major -- think $100 million+ -- 2022 feature where the director came in and said, "ya know what... I prefer the SDR picture. Don't make it much brighter than that. You can let the specular highlights stray up to 300, 400 nits, but other than that, keep everything about where it is." Huge movie, already shipped, nobody cared or noticed. And this is a huge A-list director nobody was going to argue with.

Unfortunately, the majority of Dolby Vision shows on Netflix are no more than low-con SDR in an HDR container. Most of these shows continue to be lit in an SDR environment, they're monitored in SDR, and the very first time anyone sees their footage on an HDR display is in the grading suite. Not infrequently, both the post-production house and the director or producer preemptively rule out a version that dramatically departs from the SDR version, the result being that HDR turns out to be little more than a marketing gimmick. According to ARRI and others, grading the HDR version first and making a trim pass afterward often yields a better SDR grade, the downside being that a client who has seen the HDR version first will not be happy with the SDR. Dolby wanted to find out whether industry professionals preferred a 'hand-graded' SDR version of a Hollywood film or the Dolby Vision-derived SDR version, so they went on a worldwide tour of nine cities to do an 'SDR Survey'. The cities they visited were Los Angeles, Tokyo, London, New York, Toronto, Seoul, Mumbai, Munich, and San Sebastián (Spain). The 560 participants included colorists, engineers, post producers, studio executives, studio mastering teams, directors of photography, directors, and creatives. You can watch the video to learn the results. Nevertheless, it is no secret that many in the industry are either hostile or indifferent toward HDR.
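As a rough illustration of why low-con SDR sits so comfortably inside an HDR container: the ST 2084 (PQ) inverse EOTF maps absolute luminance to a signal level, and a 100-nit SDR diffuse white lands only about halfway up the PQ signal range, leaving the entire top half for highlights these shows never use. A minimal Python sketch using the constants published in the standard:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance in cd/m^2 -> PQ signal in [0, 1]."""
    y = nits / 10000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

# 100-nit SDR white encodes to ~0.51; a 1000-nit highlight to ~0.75;
# only the 10,000-nit ceiling reaches signal level 1.0.
for nits in (100, 500, 1000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
```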
 

Marc, thanks for your input on this. Until I saw Joey D'Anna on Mixing Light talk about repurposing Rec709-mastered footage for Dolby Vision, I was under the impression that Dolby Vision content originated from a scene-referred image from the camera and was then graded scene-referred on a minimum of a 1000-nit monitor with an ODT of either P3-D65/ST2084 or Rec2020/ST2084 (P3-D65 limited). But from your example, and from Joey D'Anna's Mixing Light tutorial with Rec709-mastered footage that he said was never intended at the time for future HDR content, it seems Dolby Vision can be whatever someone wants it to be.
 
I enjoy floating in a few worlds when it comes to displays and HDR. Early on, when we mastered Aerial for Dolby Vision, many questions came up, as this was before any sort of standardization, and some were proudly aiming for a 10,000-nit future. After a great deal of testing we made several HDR masters. Notable targets were 400 and 800 nits, but the big one was indeed 1000. I made tests at 1500 and 2000. At that point I said it was going to be a hard stretch for a display in the home to exceed 2000 nits. I stated strongly, even years ago on this forum, that 1000 was likely the way to go. From there I also noted that, even in a 1000-nit container, it would be a hard sell for narrative filmmakers to push beyond 600-700 nits.

Fast forward to various streaming standards: yes, 1000 nits is the hero target for many. In a professional atmosphere it's ideal to have a screen that can hit that, but if you know what you're doing you can use lower-nit displays just fine. In my situation I have SDR through 2000 nits here in my office, across six screens which also span the gamut of LCD, OLED, and MiniLED.

The only time I've pushed out 10,000-nit material to date is for demonstration/exhibition use. Nearly all my professional HDR grades go out at 1000 nits, and that's been the case for years now.

I'm also an LG EP950 owner; I would have to assume the Asus is the same panel with their own jazz. Auto calibration is a nice feature IMO. One of my Dells has that and it's nice. It takes me about two hours to do a fast cal on all the displays I use in the office, another two hours for the other screens outside, and a full day to do all of them with an advanced cal. I only have to do that maybe 2-4 times a year.
 

Thanks, Phil, for posting on this topic! I always wanted to know about your experience with HDR. Yeah, I've always thought that if it wasn't a 1000-nit HDR monitor, it was a waste of my time to even look at or consider. But over the last few years I've come off that belief, and I now find some of the newer, inexpensive HDR monitors that may not reach 1000 nits more viable. It's always good to hear from guys like you and Marc who are more "in the know" about what's really going on in the upper echelon of the moviemaking business out there in LA and beyond.
 
I could write a book on the subject I imagine. But it starts with photographic printing techniques (and materials), projection, and then to emissive displays.

Televisions now are essentially capable of Cibachrome/Ilfochrome intensity levels. Nearly everything I learned printing stills on that medium with backlit displays rings true for modern television screens in HDR. The additional variable is motion, which has lots of considerations, which is why I've put SDR and HDR panning-speed recommendations into my tools.
 

I think HDR will eventually be the norm in a few years' time, and we'll look back and ask how we ever settled for watching movies on anything else.
 

Nah. There's new HDR stuff coming that will exceed the two current standards. But for this decade HDR is the new kid that people get jazzed about. SDR has no reason to die; it has every reason to be reinvented, and it has been a dozen times in the last decade.

Lots of moving parts on the display side of things for the home lately. So much so that, combined with the pandemic, some of the newer planned 8K paths were skipped for a generation, which was unusual. But it's about to come back. I think around 2025 a lot of things will make more sense in the 4K and 8K ecosystems, from creation to audience.
 

Now you've really piqued my interest! But it's probably something you've already signed an NDA for, so I'll be on the lookout for it. I'm going to have to get better informed about all things HDR again, because I never want to be one of those who end up playing catch-up when that HDR wave comes.
 
The HDR thing is interesting...

Imo, camera sensors need more dynamic range before a full and proper HDR standard can be set; until then it's a moving target with limited adoption and usefulness.

Even with the option of shooting under controlled lighting, the standard needs to be set in accord with its use in, and the limits of, the natural world.

Over a hundred years and we still have to choose between exposing for clouds or trees (or buildings or people, or anything in sunlight and shade)?

Nothing wrong with shooting for both and crushing, lifting, and stretching in post; it's still just too limited.

Being able to simultaneously capture detail in light-toned subjects in direct sunlight and dark-toned subjects in shadow, while shooting at f/1.4 with no ND, may sound absurd or unnecessary, but I think those levels of sensor capability and post adjustment need to be reached before 'HDR' can become a new standard, with SDR remaining available for people who like that aesthetic (just as people today emulate or use actual film) or for those who simply insist on shooting with 1800s technology (putting something in front of the camera to darken the image).

Lenses and apertures are hard-limited by the laws of physics and practicality. Displays/screens are limited too, but already bright and dark enough, it seems. So what are the actual limits of digital camera sensor technology with regard to the range of reflectivity or luminance it can capture? For me that's the most interesting thing about where 'HDR' is possibly going.
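To put the sensor question in numbers: one common back-of-envelope measure is engineering dynamic range, the log2 of full-well capacity over read noise. The figures below are illustrative, not vendor specs; a small sketch:

```python
import math

def engineering_dr_stops(full_well_e: float, read_noise_e: float) -> float:
    """Engineering dynamic range in stops: log2(full-well / read noise)."""
    return math.log2(full_well_e / read_noise_e)

# Illustrative numbers: a 60,000 e- full well with 2 e- of read noise
# gives roughly 14.9 stops -- well short of the ~20 stops a scene with
# sunlit highlights and deep shade can present, which is why we still
# choose between exposing for the clouds or the trees.
print(f"{engineering_dr_stops(60_000, 2):.1f} stops")
```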

Wherever it ends up, like most things, people will use it in all sorts of ways, from the gimmicky and gaudy to the sublime, if the means to do so are made available.

Maybe RED can speed up the process of getting there like they did with making 4K viable.

Or maybe we'll find ourselves in another one of those developmental loops where we inch slowly forward towards some undefined goal, one camera, f-stop, display/screen iteration/increment at a time.

Nice looking monitor btw.

And thanks for all the HDR info and insights.
 
Les,

I think you will always have people who will truly do HDR justice and those who won't. I think we still need more time to refine and better communicate what HDR should be. I don't mean what we all already know, like overly bright highlights, crushed blacks, and oversaturated colors. I mean: what type of movie-viewing experience should I expect from HDR, and why wouldn't an SDR version have been better? How did this HDR rendition tell the story of this film better? What emotional visual experience will be heightened by choosing to deliver this in HDR instead of SDR? If you can't come up with a reason other than "because you could," then it probably would have been better to just deliver it in SDR.
 
ASUS ProArt Calibration Software (using PA32DC)

by PC Monitors
Introduction – 0:00
Calibration Targets – 1:19
Calibration Appointments & Supported Colorimeters – 4:12
Calibration Process – 8:05
Customize Target – 14:34
Calibration Reports – 14:57
Note on HDR Calibration – 17:57
Re-assigning and Resetting Calibrations – 20:38
Embedded Calibrator Correlation – 21:23
Information ('i') Section – 22:26
 
I agree, Rand.

In a lot of cases, money, time, ability and creative intent could still make shooting for SDR a better alternative for the foreseeable future.

The HDR I'm imagining has more potential as an image-capturing and processing improvement than as some gimmick that can be used to sell more TVs. Not that it wouldn't change what people see on those TVs.
 
That is true. When I went through the Dolby Vision training process earlier this year, I specifically asked Aby Matthew (Senior Manager of Content Solutions) about studios that want to "repurpose" 4K SDR material to HDR, and he said, "we'd prefer you do the 4K Dolby Vision first, but yes, you can tone map and color-correct a 100-nit Rec709 image to 1000-nit Rec2100 if you wanted to, using a calibrated 1000-nit display for reference. Technically, the color range is not quite what it could be, since it's been limited to Rec709, but with some judicious correction, it's close."
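The repurposing path Marc describes (re-mapping a 100-nit Rec709 master into a Rec2100 PQ container) can be sketched as: decode the display gamma, rotate the primaries to BT.2020 with the BT.2087 matrix, scale to absolute luminance, and PQ-encode. This is a simplified illustration of the direct mapping, not Dolby's tooling; the function names are made up, and a real pass would add the judicious correction he mentions:

```python
# BT.709 -> BT.2020 linear-light RGB conversion matrix (ITU-R BT.2087)
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: cd/m^2 -> PQ signal in [0, 1]."""
    y = nits / 10000.0
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

def sdr709_to_pq2020(rgb, sdr_white_nits=100.0):
    """Map one gamma-encoded Rec709 pixel into a Rec2100 PQ container.

    Direct re-mapping only: SDR reference white stays at sdr_white_nits,
    so the picture keeps its SDR look inside the HDR container; any
    expansion of the highlights is a separate creative decision.
    """
    linear = [c ** 2.4 for c in rgb]                      # BT.1886-style gamma decode
    wide = [sum(m * c for m, c in zip(row, linear))       # rotate primaries to BT.2020
            for row in M_709_TO_2020]
    return [pq_encode(c * sdr_white_nits) for c in wide]  # absolute nits -> PQ signal

# Rec709 reference white ends up near PQ 0.51 -- well below the ~0.75
# a 1000-nit highlight would occupy in a native HDR grade.
print(sdr709_to_pq2020([1.0, 1.0, 1.0]))
```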
 

Thanks again, Marc, for sharing your knowledge and experience on all of this. Reading your earlier post makes me wonder how many major films and other projects have created HDR content from repurposed SDR.
 
Color-accurate 500-nit OLED monitors in this price range do open up real opportunities for editorial departments working on HDR projects, since they'll presumably be making cutting decisions differently than on low-con SDR shows. Colorists could also conceivably purchase one of these monitors to do their initial grade, then rent a grading suite equipped with a proper 1000-nit reference display to put the finishing touches on the specular highlights. Lower-cost displays are also a great way for colorists just getting acquainted with HDR to learn the workflows and gain grading experience.
 

Jon,

Yeah, it's still not a 1000-nit monitor; however, as others have mentioned, as long as you know what to watch out for while grading content with this monitor, you can make it usable.
 
Sure, relatively inexpensive displays like the Asus can fill in handily as production monitors, as Mr Schwarz demonstrated in his promo piece, and for VFX, editing, and quite possibly even QC; not so much for high-end video production. Peak luminance is one of the very first specs to consider when selecting an HDR reference monitor, as it is one of the two defining characteristics of HDR video, the other being WCG. The monitors usually found in high-end post-production facilities can reach and be calibrated to 1000 nits peak luminance. Not being able to take on Dolby Vision projects and being confined to 500 nits will definitely limit your prospects. This is, after all, a VESA DisplayHDR 400 monitor - suitable perhaps for preview and editing suites where display of the entire signal isn't obligatory, not for mastering at premium post-production houses. Not even discerning videophiles are interested in consumer displays capable of only 500 nits. Fewer than 10% of participants in Dolby's study were satisfied with 500 nits peak luminance, and the UHD Alliance doesn't even consider anything less than 540 nits to be HDR.
 

Yeah, for Dolby Vision; for HLG content it's maybe more viable.
 