- Thread starter
- #41
rand thompson
Well-known member
- Joined
- Aug 9, 2011
- Messages
- 18,878
- Reaction score
- 608
- Points
- 113
ProArt for Studios ft. Final Pixel | Global Virtual Production Studio
by ASUS
Yeah, for Dolby Vision not so much; for HLG content it's maybe more viable.
The HLG EOTF tracked brighter than reference. [It] seemed to be operating at a fixed system gamma of 1 in HLG mode likely due to chipset limitation instead of an adjusted system gamma tailored to display luminance and surround luminance as specified in BBC's white paper on HLG.
I don't know why that would be, though HLG is primarily for broadcasting, and the Asus does not even meet the EBU's minimum requirements for a mastering monitor. Furthermore, according to Vincent Teoh, the Asus had issues with its white point and its EOTF tracking in HLG, which made the picture brighter than the spec, making it unsuitable for use as an HLG monitor:
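The system-gamma point in the quoted review is concrete: ITU-R BT.2100 (following the BBC/NHK work) specifies that HLG's system gamma should adapt to the display's peak luminance rather than sit at a fixed value. A minimal sketch of the nominal formula (the surround-luminance adjustment from the BBC white paper is omitted here):

```python
import math

def hlg_system_gamma(peak_nits: float) -> float:
    """Nominal HLG system gamma per ITU-R BT.2100.

    For displays in the 400-2000 nit range, the reference formula is
    1.2 + 0.42 * log10(Lw / 1000). A gamma fixed at 1.0 (as the Asus
    appears to use in HLG mode) ignores this display-luminance adaptation.
    """
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

# e.g. a 1000-nit reference display should use gamma 1.2,
# while a ~500-nit panel should use roughly 1.07.
```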
We're seeing this same refrain echoed over and over again in the comments sections of YouTube camera and display reviews and in online forums in reply to those for whom image quality is paramount and who recognize the importance of standards.

This appears to be a monitor you're just not interested in, so don't buy it - but others will.
We're seeing this same refrain echoed over and over again in the comments sections of YouTube camera and display reviews and in online forums in reply to those for whom image quality is paramount and who recognize the importance of standards.
Jon,
do you not understand that everyone else is already aware of everything you have mentioned? Do you not understand that everyone else realizes the shortcomings of this monitor, too? Do you not understand that you are just repeating the same info that we are all aware of? The point I keep trying to make, which appears to continually go over your head, is that someone will still purchase this monitor knowing everything you have mentioned, and they will be happy with it.
Nothing you've said has gone over anyone's head.
Whether people purchase this monitor or not or whether they are happy with their purchase is totally irrelevant.
Even your hero Joey D'Anna sees eye-to-eye 100% with what I've stated about the limitations of inexpensive OLED monitors like this - that they're great for practice, perhaps suitable for prepping a project prior to finishing at a rental facility or for editorial purposes - but not necessarily for final grading.
As far as I know, there's only experimental displays that can go that high. Almost all the mastering displays out there peak at 1000 nits. And believe me, 1000 nits is plenty bright.

I've heard and read about that test before. And yet the Dolby Vision specification allows for up to 10,000 nits of brightness, while typical reference mastering monitors range from 1,000 to 4,000 nits.
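For reference, the 10,000-nit ceiling comes from the PQ transfer function (SMPTE ST 2084) that Dolby Vision is built on: the signal encodes absolute luminance up to that value regardless of what any real monitor can reproduce. A sketch of the PQ EOTF, mapping a normalized code value to nits:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value in [0, 1] to absolute luminance in nits."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

# pq_eotf(1.0) -> 10000 nits; a mid code value of 0.5 lands near 92 nits,
# which is why most of the PQ range is reserved for highlights.
```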
As far as I know, there's only experimental displays that can go that high. Almost all the mastering displays out there peak at 1000 nits. And believe me, 1000 nits is plenty bright.
There is a move afoot to modify the D-Cinema spec to something approaching 100 nits, which would be extremely bright if you had (say) a screen 50 or 60 feet wide. It has a much different feel in the pitch-black atmosphere of a cinema, vs. watching something at home in your living room.
You forget about 3D, IMAX, and other large-format venues. Brightness standards are in a state of flux. Watch this:

Brighter, higher-contrast pictures without any purpose would only lead to a drop in ticket sales.
You forget about 3D, IMAX, and other large-format venues. Brightness standards are in a state of flux. Watch this:
My company has been a Dolby Vision facility for six months, and I've been certified for Dolby Vision as a colorist for more than a year. I've worked on four Dolby Vision feature releases in the last couple of years. I'm already well aware of all this. Always consider there are people out there on the net who know much more than you do. Watch the video and consider what I've said: that Dolby Vision standards are changing for theatrical, and it's not a simple situation.

When it comes to HDR, peak luminance refers to the maximum brightness level, which allows headroom for specular highlights. HDR is not about making the entire image brighter (apart from the exceptional cases I already mentioned).
As far as I know, there's only experimental displays that can go that high. Almost all the mastering displays out there peak at 1000 nits. And believe me, 1000 nits is plenty bright.
There is a move afoot to modify the D-Cinema spec to something approaching 100 nits, which would be extremely bright if you had (say) a screen 50 or 60 feet wide. It has a much different feel in the pitch-black atmosphere of a cinema, vs. watching something at home in your living room.
If I had just made a movie, I would fight to have it shown in an Onyx cinema. I did prefer a D65 white point to DCI white, and I believe that today it makes sense to master all HDR formats at D65. Control in the highlights was clean and well defined. At 300 nits there is not the harsh cut-off at peak white that I have experienced when grading on a Dolby Vision Christie projector. As with all HDR content, the trick is to keep the main program level low, similar to regular cinema levels of about 48 nits. This leaves plenty of headroom for the HDR effect, shows great depth, looks very real and, most importantly, looks beautiful.
Can you point to where Dolby Vision has made an official announcement that they are adopting the DCI specifications mentioned in the video? I must have missed that part. Did they send you a memo? Standards can change from day to day; that does not alter what I’ve written about peak luminance one iota. Furthermore, if there's nothing in the least controversial about the information I’ve provided about reference grading monitors and HDR, it's because it is backed up by leading industry authorities, experts in color theory and film technology, prominent DPs, renowned colorists, standards bodies like the EBU and ITU, the ASC, and researchers. On the other hand, as you've shown here and elsewhere, your pronouncements are not always credible - and not infrequently, the very sources you cite contradict your claims. And if you think that being able to successfully complete an online multiple-choice questionnaire is supposed to impress me, it doesn't. You're also derailing the thread, which is about the Asus monitor, not whether or not you think 100 nits should be the absolute upper limit for theatrical presentation.

My company has been a Dolby Vision facility for six months, and I've been certified for Dolby Vision as a colorist for more than a year. I've worked on four Dolby Vision feature releases in the last couple of years. I'm already well aware of all this. Always consider there are people out there on the net who know much more than you do. Watch the video and consider what I've said: that Dolby Vision standards are changing for theatrical, and it's not a simple situation.
I'm just saying the target is moving. I don't think the Asus can hit 1000 nits corner to corner, top to bottom. The BVM-X300 and X310 can. Have you ever actually used a Grade-1 mastering display for judging color? (I don't see any color credits for you at all on IMDb.)
The only 4000-nit monitor I know of is the Dolby Pulsar, which is a rental-only display device you have to get through Dolby Labs. It's bright and tough to look at for more than a half hour. In truth, there's nothing that demands that you use all of that dynamic range. I know of a major (huge) A-list director who routinely requests that the DPs and colorists working on his projects actually keep the dynamic range fairly similar to that of the regular SDR version. The specular highlights are allowed to shoot up to 200-400-600 nits, but the average program levels are not much different than in the world of 100 nits.

Thanks again. I don't think I've ever even seen a 4000-nit display monitor, let alone a 10,000-nit one; I don't think I would want to sit in front of it. That 10,000 nits I don't think can be displayed on any monitor in production; I imagine it would have been experimental. Off the top of my head, I can't remember seeing any monitors for sale that exceeded 1000 nits, even though I know they are out there.
Actually, what you said was:
I'm just saying the target is moving.
There is a move afoot to modify the D-Cinema spec to something approaching 100 nits...
I don't think the Asus can hit 1000 nits corner to corner, top to bottom.
Have you ever actually used a Grade-1 mastering display for judging color? (I don't see any color credits for you at all in IMDB.)
The only 4000-nit monitor I know of is the Dolby Pulsar, which is a rental-only display device you have to get through Dolby Labs. It's bright and tough to look at for more than a half hour. In truth, there's nothing that demands that you use all of that dynamic range.
So the decision of "how bright" and "what dynamic range" to use is purely a creative one. Dolby makes their suggestions, but there's no law requiring that you toe that line. Amazon, Apple, Netflix, and Disney actually have more stringent rules for what they want to see with HDR content... but again, if the director stands their ground and says, "no, my creative intent is this," the studio will generally go along with them. (I'm reminded of the first season of The Mandalorian being criticized by some crazy fans who felt it "wasn't HDR enough," when that was a decision by the showrunner, not necessarily by the colorist.)
I know of a major (huge) A-list director who routinely requests his DPs and colorists working on his project actually keep the dynamic range fairly similar to that of the regular SDR version. The specular highlights are allowed to shoot up to 200-400-600 nits, but the average program levels are not much different than in the world of 100 nits.
Dolby Vision standards are changing for theatrical...
You are mistaken. Normal projection for D-Cinema is still 48 nits. Here's a link to the documents that define it back in 2011:

Actually, what you said is counter-factual. The D-Cinema HDR spec specifies 300 nits. DCI deemed brightness levels below 300 nits insufficient for achieving a compelling HDR experience for the majority of participants in their study.
Tell me again how many HDR or Dolby Vision titles you've actually worked on. Watching them doesn't count.

That's nothing special. Many DV shows on Netflix are nearly indistinguishable from SDR. Just rattling off a bunch of random nit values is meaningless talk anyhow. It's all in how and when they're used to enhance the story.
You are mistaken. Normal projection for D-Cinema is still 48 nits. Here's a link to the documents that define it back in 2011:
https://ieeexplore.ieee.org/document/7290729
48 nits equates to 14 foot-lamberts, which was the film projection standard (with a clear piece of film in the gate) for many, many years. This would equate to SDR. 108 nits is still the current Dolby Vision standard for HDR cinema. The link for Dolby Vision with this specific number is here:
https://professionalsupport.dolby.co...language=en_US
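The nits-to-foot-lamberts arithmetic above is easy to check: one foot-lambert is 1/π candela per square foot, which works out to about 3.426 cd/m² (nits):

```python
import math

# 1 fL = (1/pi) cd/ft^2, and 1 ft^2 = 0.09290304 m^2,
# so 1 fL ~= 3.4263 cd/m^2 (nits)
NITS_PER_FOOT_LAMBERT = 1 / (math.pi * 0.09290304)

def nits_to_foot_lamberts(nits: float) -> float:
    """Convert luminance in cd/m^2 (nits) to foot-lamberts."""
    return nits / NITS_PER_FOOT_LAMBERT

# 48 nits -> ~14 fL (the long-standing film projection standard);
# 108 nits -> ~31.5 fL (Dolby Vision HDR cinema)
```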
The HDR specs (including Dolby Vision Cinema) are, as I said before, a moving target: AMC, Imax, and Dolby all kind of have their own ideas on how bright it can or should be. It gets even more complicated with Large Format Theaters, 3D, and other kinds of venues. And it only works with RGB laser, as far as I know. It's standard practice for major releases to go through 30 or 40 or more different deliverables because of different aspect ratios, different brightness levels, different light source types, different 3D systems, and so on. The possible combinations are almost endless. (I'm told they may hit 100 deliverables for Avatar: The Way of Water, which is now being finished.)
I would say more than 90% of the projectors out there -- all Xenon -- are still running the DCI SDR standard. It's true that laser projection can go far above that and stay fairly stable, and this has been true since Christie's original 6P some 9-10 years ago. I like the look of laser projectors quite a bit, and the blacks in particular look really solid. Even in "Fake IMAX" theaters with double-stacked 4K projectors, it can look exceptionally good. But setups like this are not exactly affordable, and I don't know a lot of post houses that have more than one laser projector in the building. As a future concept, it's great. But it has nothing to do with Asus OLED displays.
Note it's only a proposed addendum, and it's still a 1.0 document. Everything I said above is still true. Answer the question.

TL;DR: You’re sorely mistaken, Marc. You’re purposely derailing the thread, where we are dealing with the HDR capabilities of the Asus PA32DC OLED monitor. You even shared a video discussing DCI’s HDR specs and somehow got it confused with Dolby Vision [!]. The D-Cinema HDR spec is 300 nits.
https://www.dcimovies.com/specificat..._2022-0330.pdf