

Great posting on Cinematography.net on color matching/grading the current cameras

Skin tones on EPIC


Well spotted, Clark.

Geoff Boyle is a great DoP and very down-to-earth and pragmatic ... he carries out camera tests all over the world and always tries to be as objective as possible and suss out the best tool for the job.

What's interesting in this camera comparison test is that Geoff approached it from matching the chip charts as closely as possible to see how the images compared, which gave one set of clues. However, once Geoff made the 'RAW' files available on CML, we downloaded them and then I asked our senior colorist, Aaron Peak, to take a slightly different approach ... to match the skin tone of the female subject as best he could and not to worry too much about the RGB values of the chip charts.

Taking that approach, Aaron worked his magic (he has a great eye) and after he finished I asked him to put the images up on the Barco 2K DLP and let me see them "blind" (i.e., not knowing which camera was which) ... after watching them a few times I gave them some kind of subjective assessment (and no, I couldn't define the exact parameters of my subjective assessment scale, but I have seen a few million images in our DI theater) ... it was easy to spot which was the worst-looking image and which were in the middle range of totally acceptable but nothing special.

However, when it came to choosing which was best (whatever that may mean) there were two cameras that looked pretty damn close in terms of aesthetic appeal ... in fact, so close that I would fluctuate between choosing one over the other ... when Aaron finally revealed which were my top two picks, I was somewhat surprised to see that one was the Alexa and the other one the EPIC .... I say surprised, because we've been grading RED since Day #1 (the first images of the very first RED prototype camera were graded in our DI theater and we've worked on a ton of RED shows since then) and my number one concern with Graeme's colour science over the years has always been how the RED sensor renders out skin tones.

Resolution is important, but DR, skin tones and the overall "filmic look" are more important in my mind (to say nothing of ease of workflow) which is why the Alexa has done so well even though it has less technical resolution than the RED.

The other interesting thing in our approach is that if you look at the girl's skin tones and then at the chip charts you get a good sense of how each camera's sensor and color science is interpreting RGB values.

And guys, before you get your knickers in a twist over Geoff's test and our grading, just remember that this is not a rigorous scientific experiment (there are quite a few variables involved in the lighting set-up and the position of the cameras, etc) so let's not let the discussion degenerate into a typical RU flame-war and drink more Kool-Aid BS session.

The point of our approach is that in a movie, we grade people, not chip charts. If you set up your EPIC correctly and dial her in you will get great looking skin tones!

Neil
 
I love it when people explain approaches and results openly, rather than simply saying "this is good" or "this is bad".

Beauty is in the eye of the beholder, and depends on the needs of the shot/scene/film/still.

Technical things are good to be able to handle, and more is similar than different between the various raw cameras than many tend to think.

The differences are there, and they are good to know.

NOT very surprised by your results.
My experience is pretty much the same, in practical terms.

Then add lights and filters!
And the eye behind the eyepiece.

Thanks!
 
Nice
 
Hey Neil, you mind posting your skin-matched results?

I gotta be honest, in those chip-chart matched "hollywood DI" CML files the Alexa looks (I guess subjectively) head and shoulders above Epic (and everything else)... The BMCC colours look pretty awesome too. The Epic shot looks like it needs the mids (in the skintones) brought down further. While that's certainly possible, it's a giant pain in the ass.
 
Yeah, I got the DPXs and the raws, but if Neil's colourist already did a professional job, it'd likely be as good as (if not better than) anything I could do.

EDIT: OH, I didn't realize the coloured DPXs were actually FROM Neil/Aaron. I thought the top ones were raw directly out of the camera, and the second set was them balanced to the charts.
 
the importance of a calibrated viewing environment when making subjective assessments

Hey Neil, you mind posting your skin-matched results?

I gotta be honest, in those chip-chart matched "hollywood DI" CML files the Alexa looks (I guess subjectively) head and shoulders above Epic (and everything else)... The BMCC colours look pretty awesome too. The Epic shot looks like it needs the mids (in the skintones) brought down further. While that's certainly possible, it's a giant pain in the ass.

The skin-tone matched TIFFS that we did are up on the same CML page as the chip-chart matched shots that Boyle did:

http://cinematography.net/edited-pag...era-match.html

Agree about the differences you see in the chip-chart images that Boyle created by doing a technical match on grey scales only ... but as I said above, in our DI theater we grade people, not chip-charts, which is why I asked Aaron to match the skin-tones first. When we did that, the RGB values of the chip charts from each of the different cameras were all over the place.

You raise a very valid point on the subjectivity of it all ... if I look at the same images on my $3700 Retina Display MBP calibrated 15 inch screen as I am now, I get one set of subjective assessments ... however, if I look at the same images in our DI theater on our $100,000 2K DLP projector on a 20 foot Stewart Snowmat non-perforated screen that's all been tightly calibrated with an $18,000 Photo Research spectroradiometer, I get a different subjective assessment.

Which is correct? .... all depends on what your objective is. But which environment do I trust to make a subjective assessment on image quality? The DI theater of course. But it's still my subjective assessment, and what does that matter? Your mileage may vary even in the same DI theater.

What I offered to do for the CML folks was a 'blind viewing' test where we mix up the chip-chart and skin-tone matched images and play them in a random order with some random letter attached to each image so that no one knows which is which, and let peeps mark them on a scale of 1 to 10 .... see which ones come out on top.

Not that it proves anything about the objective values of each sensor/color science only how a particular group of viewers assesses them at a particular point in time in that particular viewing environment.
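For what it's worth, the shuffle-and-anonymize step of a blind viewing like that is easy to script. A minimal sketch in Python (the file names are hypothetical placeholders, not the actual CML files):

```python
import random
import string

def make_blind_playlist(image_files, seed=None):
    """Shuffle the images and assign each a random letter so viewers
    can score them 1-10 without knowing which camera is which.
    Returns the letters to show and the hidden letter->file key."""
    rng = random.Random(seed)
    files = list(image_files)
    rng.shuffle(files)
    letters = rng.sample(string.ascii_uppercase, len(files))
    key = dict(zip(letters, files))   # revealed only after scoring
    return letters, key

# hypothetical file names, purely for illustration
playlist, key = make_blind_playlist(
    ["alexa_skin.tif", "epic_skin.tif", "f65_skin.tif"], seed=7)
```

Viewers only ever see the letters in `playlist`; the `key` mapping stays with the person running the session until the scores are in.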

Not sure what viewing environment you're looking at the images in at the moment .... maybe we should ask everyone who wants to comment on the results to make explicit their viewing environment - especially how their monitor's been calibrated and to what color space standard ... from my experience, where and how people look at professionally graded imagery is more important than what camera, software package or colorist was used to make the final footage. You'd be amazed at how many times I hear the comment, "It looked way more poppy on my TV at home."

However, in our accurate DI theater I'd be surprised if you came to the same conclusions you do above when you see the skin-tone matched images ... but as we all know, beauty is in the eye of the beholder .... especially when it comes to emotional responses to visual imagery.

Neil
 
... to match the skin tone of the female subject as best he could and not to worry too much about the RGB values of the chip charts.

"Match" to what? I would contend that such an approach has little real relevance because you are, by definition, defining one camera as the "standard" to match to, probably the first one you correct (whichever one that was). You're then skewing the "natural" tendencies of all the others to match that one. This has little relevance to the real world because in most cases, you're not trying to make something look like something else, you're simply trying to get the best version of what you want from what you have, without comparing it to something you didn't shoot. If one wants to use a still photo, or possibly a live scene, as a reference, that's one thing. But one really can't take a bunch of digital cinema cameras and try to "match" them because inevitably one becomes the reference for the others. And as most of us know, they're not the same, for many reasons I won't go into. Not only that, but by avoiding a similar processing path (I assume neither you nor Geoff used a standard rendering transform, like ACES) you're simply playing color grading games and trying to come to some sort of conclusion.

Don't get me wrong, I think comparisons are inevitable and often useful. But the notion that "matching" flesh tones gets you something that has any real meaning other than "we can match flesh tones when we have some match reference" seems a bit presumptuous, at least to me. I think comparisons are much more relevant when you use them to determine things like shadow response, shadow color contamination, highlight retention and/or falloff, and color accuracy by passing them each through their particular "natural", or expected, processing path, rather than make them look like something they normally don't simply to prove you can.
 
Okay, cool.

Yeah, I'm looking off a non-pro display. And, yes, I completely agree that a professional environment is where "real" decisions should be made. That said, if the Alexa looks pimp on my crappy MBP, I'm confident that it'll look as good (or better) on a professional display/theatre. While the Epic may look great (or in this case "identical" to an Alexa) on a pro theatre/display, it raises a red flag that it doesn't look as good as the Alexa on a non-pro display (given that the Alexa still looks great).

Anecdotally, I just noticed in the skin-balanced frames, the BMCC, F65, and Alexa all seem to keep their RGB chips roughly the same, whereas the F55, Epic, and C500 lean far more towards purple in the blue channel. It's also crazy how the Epic looks softer than the Alexa (does that have internal auto-sharpening?) and BMCC (which is only 2.5k, but sans OLPF)... It's essentially double the resolution of both.
 
Okay, cool.

Yeah, I'm looking off a non-pro display. And, yes, I completely agree that a professional environment is where "real" decisions should be made. That said, if the Alexa looks pimp on my crappy MBP, I'm confident that it'll look as good (or better) on a professional display/theatre. While the Epic may look great (or in this case "identical" to an Alexa) on a pro theatre/display, it raises a red flag that it doesn't look as good as the Alexa on a non-pro display (given that the Alexa still looks great).

Anecdotally, I just noticed in the skin-balanced frames, the BMCC, F65, and Alexa all seem to keep their RGB chips roughly the same, whereas the F55, Epic, and C500 lean far more towards purple in the blue channel. It's also crazy how the Epic looks softer than the Alexa (does that have internal auto-sharpening?) and BMCC (which is only 2.5k, but sans OLPF)... It's essentially double the resolution of both.

I'd withhold judgement on the quality of skin-tones comparison between Alexa and EPIC until you see the images in a calibrated DI environment on a big screen ... as I said, I was somewhat surprised at how well the two matched once they'd been dialed in .... as also noted, there's some variations in the positions of the cameras with regard to the subject and the lighting set-up ... not major, but enough to give a different reading on what we're looking at.

And you're right regarding the skew on the color gamuts ... hopefully that will all get ironed out in the ACES framework as per Mike's posting.

The resolution question you raise is an interesting one ... with projection screenings I think you only see a real difference in image quality between the Alexa and the EPIC on a very large screen ... even at the RenMar Studios I've seen some EPIC demos that looked OK but not 'blow you away with 4K' amazing.

Ironically, I think where you do see the difference in resolution come into play is in UHDTV ... 4K TVs at 3840x2160 res really do benefit from EPIC's higher resolution ... we showed some 3840x2160 EPIC footage at our last FCP X Unplugged session and that looked stunning on a UHDTV panel. Aaron did finish all his skin-tone matching at 3840x2160 but I haven't looked at the files on our UHDTVs, only on the 2K Barco. We might do that at our next 4K Unplugged session.

It'll be interesting to see how the new 6K Dragon sensor performs with Graeme's new color science.

Neil
 
Appreciate your input and eyes-on experience, Neil (and it's all stuff I agree with, I just tend to lean toward devil's advocate and have a bit of skepticism due to real-world/real-work results).

As with anything, I tend to light my stuff according to the camera/sensor being used anyway (most often MX) so it generally looks pretty good. That said, I have a slight distaste for people saying that Epic is as good as other cameras 'because it's raw, so you can do whatever you want with the image; plus it's 5k' (or something along those lines). A few other cameras just hit colour/skintones/what-have-you out of the park directly out of the gate - no fuss, no muss. Moreover, when a 2.8k image upscaled to UHD looks eerily similar (in terms of sharpness) to 5k@UHD, I honestly wonder if applying post-sharpening (to get an image as sharp as you'd expect 5k@4k to be) is actually worth the render time.

Anyway, again, thanks for pointing out the caveats. While MM makes some very valid points in terms of pipeline and finding a "base" image (it's like the tests they do with police line-ups; a more accurate scenario is to compare them one at a time, rather than to each other), I do like knowing that the flesh tones of all the cameras can, more or less (more for MX, less for maybe the C500), match.
 
Not sure how relevant this is, but here you go.... Alexa and RedMX from the CML test. I did make a slight curve adjustment in Photoshop.

Do not watch if you are sensitive to flashing images.

[animated GIF comparison: 0GQxgWp.gif]
 
time spent on finessing digital images ...

Not sure how relevant this is, but here you go.... Alexa and RedMX from the CML test. I did make a slight curve adjustment in Photoshop.

Do not watch if you are sensitive to flashing images.

Yikes! ... as you say, not for those sensitive to flashing images :-( ... interesting "side-by-side" comparison .... joking aside, even on my rdMBP I'd be hard pressed to say which of those two stills looked "better" (whatever that may mean) and even which is from which camera .... they both looked pretty good and very close in terms of skin tones.

Another aspect of how we approached Geoff's comparison test, as an alternative to doing just a technical match on the grey scale in the chip chart, was to ask the question - how easy was it to dial the images in and get them looking good? ... that's again where you see a difference in the underlying color science and sensor characteristics of each of the cameras ... a skilled and experienced colorist like Aaron can (with a bit of help from the DP) work wonders on most digital footage but how long it takes him to find the sweet spot varies a lot based on the camera in question. In Geoff's test, because he was able to control the lighting and set-up fairly well between each camera, Aaron was able to dial in the Alexa and the EPIC pretty quickly and easily whereas the other cameras took a little longer and required more tweaking.

I'd be interested to know how much relative time you spent in Photoshop finessing the Alexa versus the EPIC? ... did you try the other cameras? ... if you could combine all the cameras together, is there a way you could slow down the flash rate a smidgen or two?

Of course, looking at still images gives you only half the story of how well a movie camera performs ... how each one handles temporal resolution as well as spatial resolution is the acid test in our industry ... more on that in the follow up posting below.

Thanks,
Neil
 
time spent grading and getting to the sweet spot ...

Appreciate your input and eyes-on experience, Neil (and it's all stuff I agree with, I just tend to lean toward devil's advocate and have a bit of skepticism due to real-world/real-work results).

As with anything, I tend to light my stuff according to the camera/sensor being used anyway (most often MX) so it generally looks pretty good. That said, I have a slight distaste for people saying that Epic is as good as other cameras 'because it's raw, so you can do whatever you want with the image; plus it's 5k' (or something along those lines). A few other cameras just hit colour/skintones/what-have-you out of the park directly out of the gate - no fuss, no muss. Moreover, when a 2.8k image upscaled to UHD looks eerily similar (in terms of sharpness) to 5k@UHD, I honestly wonder if applying post-sharpening (to get an image as sharp as you'd expect 5k@4k to be) is actually worth the render time.

Anyway, again, thanks for pointing out the caveats. While MM makes some very valid points in terms of pipeline and finding a "base" image (it's like the tests they do with police line-ups; a more accurate scenario is to compare them one at a time, rather than to each other), I do like knowing that the flesh tones of all the cameras can, more or less (more for MX, less for maybe the C500), match.

Yes, that was the point of the exercise and our approach - could we make each camera look as good as possible and match the skin tones to each other? Given the reasonably well controlled lighting conditions that Geoff used, we managed (with all but one of the cameras) to get them close. Aaron has worked with all the different digital cameras and takes great pride in being able to bring out the best in each sensor and color science - but some of the "raw" Log files take more finessing than others to get them into the sweet spot from a colorist's point of view - in Geoff's test, the EPIC and the Alexa were both quick to dial in and match ... see post above on how long it takes to find the sweet spot of an image with different cameras and color sciences.

The other question you raise regarding the relative merits of downscaling to UHDTV (3840x2160) from 5K versus upscaling to UHDTV from 2.7K is an interesting one ... we're doing a lot of testing at the moment on different 4K UHDTV panels and cameras comparing different approaches and different kinds of content and it's not so obvious which is the optimum combination.

How each camera combines spatial resolution with temporal resolution, as well as their own unique color science, seems to be part of the mix in terms of what makes a 4K sequence stand out on a UHDTV panel ... resolution per se is not the only factor involved in the subjective assessment of the relative merits of the overall appeal of different footage shot by different cameras ... 3840x2160 up-rezzing without too much loss in image quality is definitely achievable from the Alexa and in certain content scenes works well ... where EPIC (and presumably Dragon) really come into their own is in scenes with highly detailed content and vibrant colors.

We're involved with the making of a mini UHDTV documentary at the moment down at Del Mar race track using EPICs for the wides and GoPros for the PoV shots ... the GoPro Hero3: Black Edition shoots 2.7K at 30 fps with ProTune log and even that, when carefully graded and finessed, looks OK at UHDTV resolution ... of course, when compared to the stunning shots of galloping thoroughbreds we get with the EPIC shooting at 120 fps, the image quality pales in comparison but nonetheless works well when judiciously intercut with the EPIC footage ... and besides, when was the last time you stuck two EPICs on a jockey's helmet and sent him off around a racetrack at 40 mph?

How the Alexa handles temporal resolution is part of the answer of why it looks good at UHDTV spatial resolution. How EPIC handles spatial resolution is obviously why it looks like 3D at UHDTV resolution ... we showed some underwater footage shot by Ron Lagerlof at our last 4K Unplugged workshop and there were a couple of scenes that literally jumped out of the UHDTV screen ... one in particular, a combination of high density coral and vibrant Angel fish swimming in clear blue seas was a sight to behold - totally stunning.

UHDTV is going to be an interesting viewing target for all the camera makers ... I know that JJ and his team are focused on 4K DCI cinema but to my mind, 3840 x 2160 in the living room is where the real battle is going to be fought.

Cheers,
Neil
 
Geez Neil, I've been a colorist on and off for a LONG time, and this isn't rocket science. If you're working from a log-based image, and you just use exposure (offset), contrast, and either a normalizing LUT or a curve, it takes about 10 seconds to "dial in" a flesh tone regardless of the source. If you're looking to balance other things in the image to match something else as well, or impart some kind of artistic look to the result, that can take some time, but dialing in a flesh tone is very quick and pretty simple if that's what you're looking for. It's when you approach all of this using lift, gain, gamma, luminance, and secondaries that it becomes time consuming - and often not as pure. Hence one of the reasons I questioned the use of "match everything" tests.
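As a rough illustration of that offset/contrast/curve workflow, here's a minimal per-channel sketch in Python. The pivot and power values are arbitrary placeholders, not any camera's actual log encoding or normalizing LUT:

```python
def dial_in(log_val, offset=0.0, contrast=1.0, pivot=0.435, gamma=2.2):
    """Minimal log-domain grade: exposure offset, then contrast scaled
    around a mid-grey pivot, then a simple power curve standing in for
    a normalizing LUT. All constants are illustrative placeholders."""
    v = log_val + offset                  # exposure: additive in log space
    v = (v - pivot) * contrast + pivot    # contrast: scale about the pivot
    v = min(1.0, max(0.0, v))             # clamp before the display curve
    return v ** gamma                     # stand-in for a normalizing LUT

# a value sitting exactly at the pivot is unchanged by the contrast control
mid_neutral = dial_in(0.435)
mid_contrasty = dial_in(0.435, contrast=2.0)
```

The pivot invariance is part of why this style of grade is quick to dial in: exposure and contrast can be adjusted independently around mid-grey without fighting each other.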
 
I'd be interested to know how much relative time you spent in Photoshop finessing the Alexa versus the EPIC? ... did you try the other cameras?


I just pulled the mid greens and mid reds down ever so slightly on the RED EPIC MX.
I spent maybe 1 minute or less; it was just a slight tint.

I didn't try the other cameras yet... not sure how relevant it is after being brought down to 256 colors :] maybe it is, maybe it isn't....
 
Skin tone and DRAGON sensor ...

Interesting insight from Jim's latest posting on the Dragon sensor's performance .....

"The color depth of the Dragon was increased to 16 bit. New Graeme color science. Skin tones."

Be great to see Geoff Boyle repeat his camera tests using the Dragon sensor .... given how good the existing EPIC color science already is, this could be a major boost to the look and feel of the RED camera .... how a camera interprets skin tone is such an important aspect of one's perception of the 'cinematic quality' of digital film stock (now there's an oxymoron if ever there was one).

Exciting times ahead!

Neil
 