

Just watched the first part of the Zacuto shootout....

It's RED's forum. If you don't like it, go somewhere else.

Yes it is RED's forum, but there is a difference between being a RED supporter and being an arrogant dick.

(And to be clear I'm not saying you are one, I'm just talking in general.)
 
For my own edification

How could Zacuto do a "proper" test next year, with credible results?

My thoughts...

1. Same lighting for every camera on each test run
2. Same lens / tripod / filters for each camera on each test run
3. Same ISO setting
4. Three different ASC DPs who all have experience with all the cameras (with preparation prior to the test if needed)
5. Each DP gets a test run using all the cameras under identical conditions, the point being to get identical exposure
6. Multiple professional colorists, each giving their best matched version of the footage
7. The result would be each of the three DPs' takes on all the cameras processed through two or three colorists' takes, so that each camera gets similar processing and a fair chance to show its characteristics (see the sketch below the grid)

DP1/C1 DP1/C2
DP2/C1 DP2/C2
DP3/C1 DP3/C2
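
Just to put a number on how much footage that design generates, here's a minimal sketch of the DP x camera x colorist cross-product (placeholder names only, nothing from any actual test):

    from itertools import product

    # Hypothetical labels, purely illustrative of the proposed design --
    # a real test would substitute actual DPs, cameras, and colorists.
    dps = ["DP1", "DP2", "DP3"]
    cameras = ["CamA", "CamB", "CamC", "CamD"]  # placeholder camera list
    colorists = ["C1", "C2"]

    # Every DP shoots every camera under identical conditions,
    # and every colorist grades every resulting clip.
    clips = list(product(dps, cameras, colorists))
    for dp, cam, col in clips:
        print(f"{dp} / {cam} / {col}")

    print(f"{len(clips)} graded clips total")  # 3 x 4 x 2 = 24

With three DPs, four cameras, and two colorists, that's 24 graded clips to compare, which gives you a sense of the logistics involved.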
 
Trouble is that each camera isn't necessarily the same sensitivity nor can they always take the same lens. And if you don't time them closely to match, people assume that the differences are intrinsic to the cameras ("oh, that camera is warmer than the other camera").
 
Trouble is that each camera isn't necessarily the same sensitivity nor can they always take the same lens. And if you don't time them closely to match, people assume that the differences are intrinsic to the cameras ("oh, that camera is warmer than the other camera").
David is of course right ... I just don't think these tests are relevant anymore ... they were interesting when it was film vs digital (when digital was clearly inferior). Now that digital is on par with film ... the question of film vs digital is more of an artistic/story decision and not a technical "specs" one. Obviously RAW is better than compressed on the quality front, but with difficult timelines and production costs, compressed can be better in many circumstances ... raw vs compressed is more of a story/cost decision. Possibly a raw-vs-compressed test with different sensor sizes could be interesting ... but each film has different requirements, so I'm not sure what that type of test would accomplish either.

I personally would like to see something arranged more around testing camera configs against a film's spec/art requirements ... it would be more of a "learning" tool for people like me and not a "shoot out" between manufacturers. In this last shootout, I couldn't learn much, since I would have re-arranged the set to avoid most of those light problems (it was more like watching a PhD thesis defense than a learning experience). I would also like to see a lot more with natural light and complex textures (e.g. paintings, cloth, fruit) with more "found" lighting.
 
How could Zacuto do a "proper" test next year, with credible results?

My thoughts...

1. Same lighting for every camera on each test run

This was already done in this test. In the empirical test, the lighting was the same for all cameras. In the creative test they were allowed to change the lights.

I'm guessing E was the Epic in the creative test?
 
Wow. So many of you are completely missing the point of this test.

It purposely WASN'T a shootout of all the cameras under identical conditions.

It was a comparison of all cameras against each other when each camera is handled by a passionate, experienced expert with that camera - and that expert could guide and tune the results in a variety of ways during the process. That's what this first part was about.

It was: "Hey person who is an acknowledged expert in your respective camera "community" - light this deliberately difficult scene as best you can and shoot it with the tool you know best - then grade this scene as best you can. Now let's compare THOSE results".

It WASN'T to show which camera was "best"; this time, it was to show that there are a lot of really great cameras out there that can give surprising results that are more similar than you might expect....and to show that good results AREN'T just about "my camera's the best" arguments.

It was a different kind of test. On purpose.

But that purpose wasn't to "do evil" to anyone. There was NO conspiracy. And those who created it aren't stupid, crazy, foolish, uninformed or misguided. Just stop with this bullsh*t. It just makes you look petty and, well, insecure.

You may not agree with the direction they decided to take this year (at least in Part 1) - but they knew what they were doing. And there was a positive spirit, intent and message behind it all.

And one other thing: I wish a lot more people here had the balls to actually rank the footage. DPs, shooters, and owners in other communities are happily doing it: Debating, giving interesting and rational supporting arguments about why they like which clip.

But too many here at REDUSER seem, quite frankly, SCARED to do it. It feels like some are scared to say they actually like the results of one of the cameras (and decisions of the associated "Master"), if, God forbid, it turns out to not be the camera they own.

And stop the lame excuses about "But it's YouTube!" - download the full 1080p version and watch it at 1080p (like how....oh, I don't know....EVERYONE ultimately watches what 99% of us produce, at best).

Have the balls/guts/lack-of-insecurity to participate.

It's actually an interesting experiment. Something a little different. And it has merit.
 
Anyone know when part 2 comes out? That first part was boring and not at all like last year's shootout.

I want to see the tests, not a bunch of dudes talking.
 
Anyone know when part 2 comes out? That first part was boring and not at all like last year's shootout.

I want to see the tests, not a bunch of dudes talking.

Those "dudes" happen to be some of the best Cinematographers in the history of film. While it might have been boring, give them some respect. They were just asked about the craft of cinematography and lighting.
 
Those "dudes" happen to be some of the best Cinematographers in the history of film. While it might have been boring, give them some respect. They were just asked about the craft of cinematography and lighting.

I get that, no disrespect intended.

I just clicked expecting it to be more similar to last year's, which was really dense with testing.
 
And one other thing: I wish a lot more people here had the balls to actually rank the footage. DPs, shooters, and owners in other communities are happily doing it: Debating, giving interesting and rational supporting arguments about why they like which clip.
There's been quite a bit of discussion about it on the CML group (Cinematography.net). Many positive words about Red, and some back-and-forth debate on Alexa, F65, and the other contenders.
 
Well, in their defense, the most common criticism of earlier tests by Zacuto and others was that the Red footage was not posted and graded in a way that was specific to the needs of Red footage - that it was "forced" into a common workflow and forced to match other cameras. You can't have it both ways: arguing that these tests all need to use the same parameters, and also arguing that Red needs to be posted in whatever way is best for Red footage.

But I agree that a test where everyone does whatever they need to do to improve the camera image (relighting, etc.) is not scientific; it's just another type of test that some people may find interesting.

No single test is going to satisfy everyone, particularly people who have a vested interest in certain results before the test is even shot. And if someone makes a decision to drop $50,000+ of their own money based on a single test that someone else performed, then they are not very bright, are they? And if they are not very bright, they aren't going to make good decisions no matter what you tell them.

David I really hate to sound sycophantic. I am not. You are not perfect, no human being is. But I must say, yet again, I love your post. I always do love your posts. They always cut sharply through the chaff and chatter and reveal a truth in a clear way.
 
Did they release part two of the shootout yet?
 
Anybody heard any news or results about part 3 of this series? The side-by-side with setups as similar as possible?
 
So where's all that big dynamic range advantage the Alexa/F65/F3 supposedly have over Red? I saw identical DR on the Epic compared to the Alexa/F65, and arguably superior DR to the Sony F3, in those equalized tests (talking about the newly released part 3).
 
So, question: I shot two shows on the RED ONE. Why did the first chip look closer to film than the MX? (I do love my MX.)
Any thoughts?

You mean the Mysterium sensor looks closer to film than the Mysterium-X sensor? I'm not sure that's the case. The only thing I can say is that, due to the poorer color science and weaker dynamic range at the time, a lot of older projects shot on the original Mysterium sensor tend to have a more washed-out or desaturated palette (tones burn out/clip faster and thus get diluted and not as rich-looking, color-wise). So with the advent of better DR and better color science, a lot of Red MX material tended to go overboard and perhaps look overly rich to your eyes, in terms of saturation and such - that, coupled with the ultra-sharp resolution, perhaps gave a more 'video-y' effect?
That's just my conjecture, because Red MX definitely looks more film-like than M to me, and to most people I would imagine.
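
To put a toy number on that "clips faster and gets diluted" point - this is just my own sketch, with made-up values, not anything measured from either sensor:

    # Toy illustration of why earlier highlight clipping desaturates color.
    # RGB values and clip points are invented, purely for illustration.

    def saturation(rgb):
        # Crude saturation measure: (max - min) / max
        return (max(rgb) - min(rgb)) / max(rgb)

    warm_highlight = (1.6, 0.9, 0.4)  # linear RGB, arbitrary units

    for headroom in (2.0, 1.0):  # more vs. less highlight headroom
        clipped = tuple(min(c, headroom) for c in warm_highlight)
        print(headroom, clipped, round(saturation(clipped), 2))

    # With headroom 2.0 nothing clips: saturation stays at 0.75.
    # With headroom 1.0 the red channel clips to 1.0: saturation drops
    # to 0.6, i.e. the same highlight comes out paler / washed out.

Less highlight headroom pulls the channel ratios together, which is exactly that washed-out look.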
 