

Asus ProArt PA32DC OLED Review

[Embedded video: "ProArt for Studios ft. Final Pixel | Global Virtual Production Studio" by ASUS]
 
Yeah, for Dolby Vision not so much; for HLG content it's maybe more viable.

I don't know why that would be, though; HLG is primarily for broadcasting, and the Asus does not even meet the EBU's minimum requirements for a mastering monitor. Furthermore, according to Vincent Teoh, the Asus had issues with the white point and with the EOTF curve in HLG, which made the picture brighter than the spec, making it unsuitable for use as an HLG monitor:

The HLG EOTF tracked brighter than reference. [It] seemed to be operating at a fixed system gamma of 1 in HLG mode likely due to chipset limitation instead of an adjusted system gamma tailored to display luminance and surround luminance as specified in BBC's white paper on HLG.
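To illustrate what that means in practice, here is a minimal sketch, assuming the reference formula from ITU-R BT.2100 / the BBC white paper (the peak-luminance figure below is just an example, not the Asus spec): the HLG system gamma is meant to scale with the display's nominal peak luminance, so pinning it at a fixed 1.0 lifts the mid-tones relative to the reference rendering.

```python
import math

# Reference HLG system gamma per ITU-R BT.2100 / BBC white paper:
#   gamma = 1.2 + 0.42 * log10(Lw / 1000), for nominal peak luminance Lw (cd/m2).
def hlg_system_gamma(peak_nits: float) -> float:
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

# For an achromatic signal the HLG OOTF reduces to:
#   relative display light = Ys ** gamma, with Ys = normalized scene luminance.
def relative_display_light(ys: float, gamma: float) -> float:
    return ys ** gamma

peak_nits = 1000.0                      # illustrative peak; gamma is exactly 1.2 here
gamma_ref = hlg_system_gamma(peak_nits)
for ys in (0.10, 0.25, 0.50):
    ref   = relative_display_light(ys, gamma_ref)   # reference rendering
    fixed = relative_display_light(ys, 1.0)         # "fixed system gamma of 1"
    print(f"Ys={ys:.2f}: reference={ref:.3f}  gamma-1.0={fixed:.3f}  "
          f"-> {fixed / ref:.2f}x brighter")
```

At a fixed gamma of 1, mid-grey content comes out noticeably brighter than the reference rendering, which is consistent with "tracked brighter than reference."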
 
I don't know why that would be, though; HLG is primarily for broadcasting, and the Asus does not even meet the EBU's minimum requirements for a mastering monitor. Furthermore, according to Vincent Teoh, the Asus had issues with the white point and with the EOTF curve in HLG, which made the picture brighter than the spec, making it unsuitable for use as an HLG monitor:

This appears to be a monitor you're just not interested in, so don't buy it; others will.
 
This appears to be a monitor you're just not interested in, so don't buy it; others will.
We're seeing this same refrain echoed over and over again in the comments sections of YouTube camera and display reviews and in online forums in reply to those for whom image quality is paramount and who recognize the importance of standards.
 
We're seeing this same refrain echoed over and over again in the comments sections of YouTube camera and display reviews and in online forums in reply to those for whom image quality is paramount and who recognize the importance of standards.

Jon,

do you not understand that everything you have mentioned, everyone else is already aware of? Do you not understand that everyone else realizes the shortcomings of this monitor as well? Do you not understand that you are just repeating the same info that we are all aware of? The point I keep trying to make, which appears to continually go over your head, is that someone will still purchase this monitor knowing everything you have already mentioned, and they will be happy with that. It appears that you believe you are teaching us something we didn't already know?
 
Jon,

do you not understand that everything you have mentioned, everyone else is already aware of? Do you not understand that everyone else realizes the shortcomings of this monitor as well? Do you not understand that you are just repeating the same info that we are all aware of? The point I keep trying to make, which appears to continually go over your head, is that someone will still purchase this monitor knowing everything you have already mentioned, and they will be happy with that.

Nothing you've said has gone over anyone's head.

Whether people purchase this monitor or not or whether they are happy with their purchase is totally irrelevant.

Even your hero Joey D'Anna sees eye-to-eye 100% with what I've stated about the limitations of inexpensive OLED monitors like this - that they're great for practice, perhaps suitable for prepping a project prior to finishing at a rental facility or for editorial purposes - but not necessarily for final grading.
 
Nothing you've said has gone over anyone's head.

Whether people purchase this monitor or not or whether they are happy with their purchase is totally irrelevant.

Even your hero Joey D'Anna sees eye-to-eye 100% with what I've stated about the limitations of inexpensive OLED monitors like this - that they're great for practice, perhaps suitable for prepping a project prior to finishing at a rental facility or for editorial purposes - but not necessarily for final grading.


You don't seem to be able to read and comprehend what you've read. As far as I'm concerned, any further discussion with you is pointless.
 
I should add, as the former owner of an Asus PA32UCX, that, for a company targeting the professional market, Asus has got to have some of the lousiest after-sales support I've ever experienced in my life.
 
I've heard and read about that test before. And yet the Dolby Vision specification allows for up to 10,000 nits of brightness, with typical reference mastering monitors ranging from 1,000 to 4,000 nits.
As far as I know, there's only experimental displays that can go that high. Almost all the mastering displays out there peak at 1000 nits. And believe me, 1000 nits is plenty bright.

There is a move afoot to modify the D-Cinema spec to something approaching 100 nits, which would be extremely bright if you had (say) a screen 50 or 60 feet wide. It has a much different feel in the pitch-black atmosphere of a cinema, vs. watching something at home in your living room.
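As an aside on the 10,000-nit container versus 1,000-4,000-nit mastering displays: a rough sketch of the SMPTE ST 2084 (PQ) inverse EOTF, the transfer function Dolby Vision is built on, shows how that luminance range maps into the signal. The specific luminance values below are just examples.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: luminance in cd/m2 -> normalized signal (0..1).
m1 = 2610 / 16384            # 0.1593017578125
m2 = 2523 / 4096 * 128       # 78.84375
c1 = 3424 / 4096             # 0.8359375
c2 = 2413 / 4096 * 32        # 18.8515625
c3 = 2392 / 4096 * 32        # 18.6875

def pq_encode(nits: float) -> float:
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (100, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> {pq_encode(nits):.3f} of the PQ signal range")
```

A 1,000-nit master already occupies roughly three quarters of the PQ signal range; everything from 1,000 up to 10,000 nits lives in the top quarter.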
 
As far as I know, there's only experimental displays that can go that high. Almost all the mastering displays out there peak at 1000 nits. And believe me, 1000 nits is plenty bright.

There is a move afoot to modify the D-Cinema spec to something approaching 100 nits, which would be extremely bright if you had (say) a screen 50 or 60 feet wide. It has a much different feel in the pitch-black atmosphere of a cinema, vs. watching something at home in your living room.

Marc,

Thanks again. I don't think I've ever even seen a 4,000-nit display monitor, let alone a 10,000-nit one; I don't think I would want to sit in front of it. Yeah, that 10,000 nits I don't think can be displayed on any monitor in production; I imagine it would have been experimental. I personally can't remember seeing any monitors for sale off the top of my head that ever exceeded 1,000 nits, even though I know they are out there.

That 100-nit modification to the D-Cinema specification would be interesting to see implemented, and to see how it actually compares to watching a film in a home viewing environment.
 
The DCI spec for HDR peak brightness is 300 cd/m2 and 0.005 cd/m2 for minimum black level. The average picture level should be no brighter than any ordinary 48 cd/m2 movie or even darker. If it were brighter, it would be perceived as being of poor quality and create eye fatigue. The extended brightness is for specular highlights, emissive sources and the other instances already discussed earlier. Brighter, contrastier pictures without any purpose would only lead to a drop in ticket sales.
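For a sense of scale, here is the simple arithmetic on those figures (nothing beyond the numbers quoted above):

```python
import math

peak_nits  = 300.0    # DCI HDR peak luminance cited above
black_nits = 0.005    # DCI HDR minimum black level cited above
sdr_white  = 48.0     # traditional D-Cinema (SDR) peak white

contrast = peak_nits / black_nits                    # sequential contrast ratio
stops    = math.log2(peak_nits / black_nits)         # total on-screen range in stops
headroom = math.log2(peak_nits / sdr_white)          # highlight headroom above SDR white

print(f"sequential contrast: {contrast:,.0f}:1")             # 60,000:1
print(f"total range:         {stops:.1f} stops")             # ~15.9 stops
print(f"headroom over 48-nit white: {headroom:.1f} stops")   # ~2.6 stops
```

In other words, the extra range is headroom above the familiar 48-nit white, not a brighter average picture.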
 
Brighter, contrastier pictures without any purpose would only lead to a drop in ticket sales.
You forget about 3D, IMAX, and other large-format venues. Brightness standards are in a state of flux. Watch this:

 
You forget about 3D, IMAX, and other large-format venues. Brightness standards are in a state of flux. Watch this:


When it comes to HDR, peak luminance refers to the maximum brightness level, which allows headroom for specular highlights. HDR is not about making the entire image brighter (apart from the exceptional cases I already mentioned).
 
When it comes to HDR, peak luminance refers to the maximum brightness level, which allows headroom for specular highlights. HDR is not about making the entire image brighter (apart from the exceptional cases I already mentioned).
My company has been a Dolby Vision facility for six months, and I've been certified for Dolby Vision as a colorist for more than a year. I've worked on four Dolby Vision feature releases in the last couple of years. I'm already well-aware of all this. Always consider there are people out there on the net that know much more than you do. Watch the video and consider what I've said: that Dolby Vision standards are changing for theatrical, and it's not a simple situation.
 
As far as I know, there's only experimental displays that can go that high. Almost all the mastering displays out there peak at 1000 nits. And believe me, 1000 nits is plenty bright.

There is a move afoot to modify the D-Cinema spec to something approaching 100 nits, which would be extremely bright if you had (say) a screen 50 or 60 feet wide. It has a much different feel in the pitch-black atmosphere of a cinema, vs. watching something at home in your living room.

For one thing, DCI’s D-Cinema white paper specifies 300 nits. There is no move afoot to regress to 100 nits. Dolby Cinema screens range in size from 15-50 feet and I've never heard anyone complain that 108 nits was uncomfortably bright. To the best of my knowledge, neither did any of the 157 participants in DCI's study report any feeling of discomfort at all while watching clips mastered to 300-nit peak brightness - in fact, they preferred 300 nits and above. Contrary to your assertion, brightness levels below 300 nits were insufficient in achieving a compelling HDR experience for the majority of viewers. Kevin Shaw, recounting his experience grading for the Onyx HDR Cinema screen, writes,

If I had just made a movie, I would fight to have it shown at Onyx cinema. I did prefer a D65 white point to DCI white and I believe that today it makes so much sense to make all HDR formats D65. Control in the highlights was clean and well defined. At 300 nits there is not the harsh cut off at peak white that I have experienced when grading on a Dolby Vision Christie projector. As with all HDR content, the trick is to keep the main program level low and similar to the regular cinema levels of about 48 nits. This allows plenty of headroom for the HDR effect, shows great depth, looks very real and most importantly looks beautiful.

https://icolorist.com/samsung-onyx/

As for your contention that 1,000 nits is plenty bright for home viewing, fewer than 15% of participants in Dolby’s study were satisfied with that meager level of peak luminance.

My company has been a Dolby Vision facility for six months, and I've been certified for Dolby Vision as a colorist for more than a year. I've worked on four Dolby Vision feature releases in the last couple of years. I'm already well-aware of all this. Always consider there are people out there on the net that know much more than you do. Watch the video and consider what I've said: that Dolby Vision standards are changing for theatrical, and it's not a simple situation.
Can you point to where Dolby Vision has made an official announcement that they are adopting the DCI specifications mentioned in the video? I must have missed that part. Did they send you a memo? Standards can change from day to day; that does not alter what I’ve written about peak luminance one iota.

Furthermore, if there's nothing in the least controversial about the information I’ve provided about reference grading monitors and HDR, it's because it is backed up by leading industry authorities, experts in color theory and film technology, prominent DPs, renowned colorists, standards bodies like the EBU and ITU, the ASC, and researchers. On the other hand, as you've shown here and elsewhere, your pronouncements are not always credible - and not infrequently, the very sources you cite contradict your claims.

And if you think that being able to successfully complete an online multiple-choice questionnaire is supposed to impress me, it doesn't. You're also derailing the thread, which is about the Asus monitor, not whether or not you think 100 nits should be the absolute upper limit for theatrical presentation.
 
You're also derailing the thread, which is about the Asus monitor, not whether or not you think 100 nits should be the absolute upper limit for theatrical presentation.
I'm just saying the target is moving. I don't think the Asus can hit 1000 nits corner to corner, top to bottom. The BVM-X300 and X310 can. Have you ever actually used a Grade-1 mastering display for judging color? (I don't see any color credits for you at all in IMDB.)

Note also that getting Dolby Certification is more than just a multiple choice exam. It's 20 hours of classes, plus a lot of back and forth with the instructor, and an exam, and a big pile of documents, and paying a fairly hefty fee. It's not simple or easy, but we felt it was important for our clients to have this option for our work.

Thanks again. I don't think I've ever even seen a 4,000-nit display monitor, let alone a 10,000-nit one; I don't think I would want to sit in front of it. Yeah, that 10,000 nits I don't think can be displayed on any monitor in production; I imagine it would have been experimental. I personally can't remember seeing any monitors for sale off the top of my head that ever exceeded 1,000 nits, even though I know they are out there.
The only 4000-nit monitor I know of is the Dolby Pulsar, which is a rental-only display device you have to get through Dolby Labs. It's bright and tough to look at for more than a half hour. In truth, there's nothing that demands that you use all of that dynamic range. I know of a major (huge) A-list director who routinely requests his DPs and colorists working on his project actually keep the dynamic range fairly similar to that of the regular SDR version. The specular highlights are allowed to shoot up to 200-400-600 nits, but the average program levels are not much different than in the world of 100 nits.

So the decision of "how bright" and "what dynamic range" to use is purely a creative one. Dolby makes their suggestions, but there's no law requiring that you toe that line. Amazon, Apple, Netflix, and Disney actually have more stringent rules for what they want to see with HDR content... but again, if the director stands their ground and says, "no, my creative intent is this," the studio will generally go along with them. (I'm reminded of the first season of The Mandalorian being criticized by some crazy fans who felt it "wasn't HDR enough," when that was a decision by the showrunner, not necessarily by the colorist.)
 
I'm just saying the target is moving.
Actually, what you said was

There is a move afoot to modify the D-Cinema spec to something approaching 100 nits...

which is counter-factual. D-Cinema specifies 300 nits. DCI deemed brightness levels below 300 nits insufficient in achieving a compelling HDR experience for the majority of participants in their study.

I don't think the Asus can hit 1000 nits corner to corner, top to bottom.

No one in this thread or anywhere else claimed that the Asus could hit 1,000 nits, side-to-side, top to bottom, back to front or corner to corner, so you're not going to get an argument there.

Have you ever actually used a Grade-1 mastering display for judging color? (I don't see any color credits for you at all in IMDB.)

Not sure what relevance that has to the conversation. Three letters tacked onto the end of someone's name don’t mean anything either. Everything I've written is backed up by experts in color theory and film technology, prominent DPs, renowned colorists, standards bodies like the EBU and ITU, the ASC, and researchers.

The only 4000-nit monitor I know of is the Dolby Pulsar, which is a rental-only display device you have to get through Dolby Labs. It's bright and tough to look at for more than a half hour. In truth, there's nothing that demands that you use all of that dynamic range.

So the decision of "how bright" and "what dynamic range" to use is purely a creative one. Dolby makes their suggestions, but there's no law requiring that you toe that line. Amazon, Apple, Netflix, and Disney actually have more stringent rules for what they want to see with HDR content... but again, if the director stands their ground and says, "no, my creative intent is this," the studio will generally go along with them. (I'm reminded of the first season of The Mandalorian being criticized by some crazy fans who felt it "wasn't HDR enough," when that was a decision by the showrunner, not necessarily by the colorist.)

No one in this thread or in the universe ever insisted all shows had to hit 1,000 nits. The reason outfits grade on the Pulsar is for future proofing.

I know of a major (huge) A-list director who routinely requests his DPs and colorists working on his project actually keep the dynamic range fairly similar to that of the regular SDR version. The specular highlights are allowed to shoot up to 200-400-600 nits, but the average program levels are not much different than in the world of 100 nits.

That's nothing special. Many DV shows on Netflix are nearly indistinguishable from SDR. Just rattling off a bunch of random nit values is meaningless talk anyhow. It's all in how and when they're used to enhance the story.

Dolby Vision standards are changing for theatrical...

We're still waiting to see a citation here.
 
Actually, what you said was ... which is counter-factual. D-Cinema specifies 300 nits. DCI deemed brightness levels below 300 nits insufficient in achieving a compelling HDR experience for the majority of participants in their study.
You are mistaken. Normal projection for D-Cinema is still 48 nits. Here's a link to the documents that define it back in 2011:

https://ieeexplore.ieee.org/document/7290729

48 nits equates to 14 foot-lamberts, which was the film projection standard (with a clear piece of film in the gate) for many, many years. This would equate to SDR. 108 nits is still the current Dolby Vision standard for HDR cinema. The link for Dolby Vision with this specific number is here:

https://professionalsupport.dolby.co...language=en_US

Minolta has a good document here on how they measure projection system brightness, which also mentions the SMPTE ST 431-1:2006 48-nit standard:

https://sensing.konicaminolta.asia/w...easurement.pdf
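Quick sanity check on the unit conversion used above (1 foot-lambert = 1/π candela per square foot, roughly 3.426 cd/m², i.e. nits):

```python
FL_TO_NITS = 3.4262591   # 1 foot-lambert = 1/pi candela per square foot ≈ 3.426 cd/m2

print(f"14 fL    ≈ {14 * FL_TO_NITS:.1f} nits")      # ~48 nits, the SDR projection standard
print(f"108 nits ≈ {108 / FL_TO_NITS:.1f} fL")       # Dolby Vision cinema peak quoted above
print(f"300 nits ≈ {300 / FL_TO_NITS:.1f} fL")       # proposed DCI HDR addendum peak
```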

The HDR specs (including Dolby Vision Cinema) are, as I said before, a moving target: AMC, Imax, and Dolby all kind of have their own ideas on how bright it can or should be. It gets even more complicated with Large Format Theaters, 3D, and other kinds of venues. And it only works with RGB laser, as far as I know. It's standard practice for major releases to go through 30 or 40 or more different deliverables because of different aspect ratios, different brightness levels, different light source types, different 3D systems, and so on. The possible combinations are almost endless. (I'm told they may hit 100 deliverables for Avatar: The Way of Water, which is now being finished.)

The specs on the proposed 300-nit HDR addendum to the DCI standard only came out in March of 2022, so it's still fairly new:

https://www.dcimovies.com/specificat..._2022-0330.pdf

What the embedded video above says is they're talking about a brighter standard beyond that, which will be possible with new kinds of projection technologies. This was a hot topic at the SMPTE conference in LA last month.

But here's the reality: I would say more than 90% of the projectors out there -- all Xenon -- are still running the old DCI SDR standard. It's true that laser projection can go far above that and stay fairly stable, and this has been true since Christie's original 6P some 9-10 years ago. I like the look of laser projectors quite a bit, and the blacks in particular look really solid. Even in "Fake IMAX" theaters with double-stacked 4K projectors, it can look exceptionally good. But I think setups like this are not exactly affordable, and I don't know a lot of post houses that have more than one laser projector in the building. As a future concept, it's great. But it has nothing to do with Asus OLED displays.


That's nothing special. Many DV shows on Netflix are nearly indistinguishable from SDR. Just rattling off a bunch of random nit values is meaningless talk anyhow. It's all in how and when they're used to enhance the story.
Tell me again how many HDR or Dolby Vision titles you've actually worked on. Watching them doesn't count.
 
You are mistaken. Normal projection for D-Cinema is still 48 nits. Here's a link to the documents that define it back in 2011:

https://ieeexplore.ieee.org/document/7290729

48 nits equates to 14 foot-lamberts, which was the film projection standard (with a clear piece of film in the gate) for many, many years. This would equate to SDR. 108 nits is still the current Dolby Vision standard for HDR cinema. The link for Dolby Vision with this specific number is here:

https://professionalsupport.dolby.co...language=en_US

The HDR specs (including Dolby Vision Cinema) are, as I said before, a moving target: AMC, Imax, and Dolby all kind of have their own ideas on how bright it can or should be. It gets even more complicated with Large Format Theaters, 3D, and other kinds of venues. And it only works with RGB laser, as far as I know. It's standard practice for major releases to go through 30 or 40 or more different deliverables because of different aspect ratios, different brightness levels, different light source types, different 3D systems, and so on. The possible combinations are almost endless. (I'm told they may hit 100 deliverables for Avatar: The Way of Water, which is now being finished.)

I would say more than 90% of the projectors out there -- all Xenon -- are still running the DCI SDR standard. It's true that laser projection can go far above that and stay fairly stable, and this has been true since Christie's original 6P some 9-10 years ago. I like the look of laser projectors quite a bit, and the blacks in particular look really solid. Even in "Fake IMAX" theaters with double-stacked 4K projectors, it can look exceptionally good. But I think setups like this are not exactly affordable, and I don't know a lot of post houses that have more than one laser projector in the building. As a future concept, it's great. But it has nothing to do with Asus OLED displays.

TL;DR: You’re sorely mistaken, Marc. You’re purposely derailing the thread, where we are dealing with the HDR capabilities of the Asus PA32DC OLED monitor. You even shared a video discussing DCI’s HDR specs and somehow got it confused with Dolby Vision [!]. The D-Cinema HDR spec is 300 nits.

https://www.dcimovies.com/specificat..._2022-0330.pdf
 
TL;DR: You’re sorely mistaken, Marc. You’re purposely derailing the thread, where we are dealing with the HDR capabilities of the Asus PA32DC OLED monitor. You even shared a video discussing DCI’s HDR specs and somehow got it confused with Dolby Vision [!]. The D-Cinema HDR spec is 300 nits.

https://www.dcimovies.com/specificat..._2022-0330.pdf
Note it's only a proposed addendum, and it's still a 1.0 document. Everything I said above is still true. Answer the question.
 