
Camera shootout 2011

RED does look good. I don't hear any complaints there, and from what I've seen, it holds up well in their images. What I'm curious (and concerned) about is the numbers attached to the graphs, which look very "authoritative" but appear somewhat suspect. For instance, what does the "1024 physical rez" actually mean on the RED? Answers on a postcard to...

Graeme
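[For what it's worth, a back-of-the-envelope sketch of why that "1024" figure is hard to interpret. The 4096-photosite sensor width and the 0.7-0.8 demosaic factors below are illustrative assumptions, not published RED numbers.]

```python
# A minimal sketch of the resolution ceilings for an assumed 4K-wide
# Bayer sensor (4096 horizontal photosites). The 0.7-0.8 demosaic
# factors are common rules of thumb, not measured figures.

photosites_w = 4096                     # assumed horizontal photosite count
nyquist_line_widths = photosites_w      # hard ceiling, in line widths
nyquist_line_pairs = photosites_w // 2  # same ceiling, in line pairs

print(f"ceiling: {nyquist_line_widths} line widths "
      f"({nyquist_line_pairs} line pairs) per picture width")

for factor in (0.7, 0.8):  # typical post-demosaic resolution fraction
    print(f"demosaic factor {factor}: ~{photosites_w * factor:.0f} "
          "line widths per picture width")

# No common convention (line widths or line pairs, per width or per
# height) lands anywhere near 1024 for a 4096-wide sensor, which is
# why the chart's number needs a stated definition to mean anything.
```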
 
I'm not going to get into an attack here on this test, and I commend Scott for replying (remember, he didn't conduct the test itself; he made a documentary reporting on it and people's reactions to it). But I too have a big issue with the charts presented (which were in the SCCE presentation). When one presents a list of numbers, one is making a statement of better or worse on a scaled measure. It draws a conclusion, and does so in a vastly simplistic way. That's probably my biggest gripe.

I actually think the issues were not RED-specific, and felt that RED was treated pretty fairly on the whole. There are a lot of other tests in there besides resolution. I think the test was unfair to pretty much every camera at one point or another. And I was surprised that the one treated worst, to my thinking, was film!
 
Graeme, what do you think of Primes' definition of sensitivity, where he uses the noise floor? Seems to me he's defining D-min, not sensitivity. It doesn't really help when different cameras have different dynamic ranges.
 
Mitch, I'll have to watch that bit again! I wish the videos were downloadable, as it would make this a lot easier. Agreed on the over-simplistic notion of single numbers on graphs, especially when terms are ill-defined and we still don't have an answer on what the "physical rez" of the RED at 1024 actually means... Indeed, the issues are not RED-specific; it's just that this is one camera I know inside out, so when there's a number attached to it that I don't understand, well, it stands out :-)

Graeme
 
I remember on CML when someone objected to the notion of defining sensitivity as the point where shadow and highlight detail are split evenly, whereas they favored the noise-floor idea... which just shows that this is open to debate. How is film ASA determined? When middle grey falls halfway on the straight-line portion of the characteristic curve? Or when it achieves "x" amount of density? I think the situation today is similar to why Kodak adopted the Exposure Index (E.I.) label instead of ISO or ASA: it allowed them to pick what they thought was an optimal rating regardless of where exactly things fell in measurement.
 

Agreed. Both sensitivity and dynamic range are very subjective in terms of how you interpret what is usable.
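[To illustrate why the choice of definition matters, here is a toy model, with every number invented for illustration and no real camera implied, comparing an even-split rating with a noise-floor rating for two hypothetical cameras that differ only in dynamic range.]

```python
# A toy model of the two sensitivity definitions debated above.
# All numbers are made up for illustration; no real camera implied.

def re_rate(base_iso, shift_stops):
    # Raising the rated ISO by one stop places middle grey one stop
    # lower on the sensor: more highlight headroom, less shadow range.
    return base_iso * 2 ** shift_stops

def even_split_iso(base_iso, dr_stops, headroom_stops):
    # Rate so shadow and highlight detail split evenly:
    # headroom should equal half the total dynamic range.
    return re_rate(base_iso, dr_stops / 2 - headroom_stops)

def noise_floor_iso(base_iso, dr_stops, headroom_stops, floor_margin=4.0):
    # Rate so middle grey sits a fixed margin above the noise floor
    # (a D-min-style definition).
    shadow_stops = dr_stops - headroom_stops
    return re_rate(base_iso, shadow_stops - floor_margin)

# Two hypothetical cameras, both nominally ISO 320 with 5 stops of
# highlight headroom, differing only in total dynamic range.
for name, dr in (("12-stop camera", 12), ("14-stop camera", 14)):
    print(name,
          "| even-split rating:", round(even_split_iso(320, dr, 5)),
          "| noise-floor rating:", round(noise_floor_iso(320, dr, 5)))
```

[In this toy model the two definitions disagree, and the gap widens as dynamic range grows, which is exactly why a single sensitivity number needs its definition stated alongside it.]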

On the video, I didn't understand why the ISOs used for comparison were all over the place: sometimes set two stops apart, then visually compared and commented on.
 
Watching the behind-the-scenes material really starts opening up some of the issues I had with the original SCCE presentation. Since there was not, and still is not, any white-paper data on the parameters used, I find it really hard to interpret how they came to certain conclusions. Watching the Zacuto presentation I'm discovering some interesting tidbits, such as the exposure question Deanan just mentioned. For the Siemens star test, "Station Chief" Matt Siegel decided (somewhat arbitrarily) to rate all the cameras at ISO 640, leaving the lighting and the lens the same and swapping out the cameras. Well, I happen to know that the lowest sensitivity setting available on the Phantom Flex is ISO 800, and that it actually has a nominal rating of ISO 1200. I'm sorry the camera is too sensitive for your test, sir, but if you say that contrast is an aspect of perceived resolution and then you over-expose the sensor, just what does that say about the test?
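[The rough arithmetic on that mismatch, assuming the lighting and lens were metered for ISO 640 and left untouched; the 800 and 1200 figures are the ratings quoted above.]

```python
# How far over the metered exposure a more-sensitive camera lands when
# the scene is lit for ISO 640 and nothing else changes.
from math import log2

lit_for_iso = 640

for camera_iso in (800, 1200):  # Flex's lowest setting / nominal rating
    stops_over = log2(camera_iso / lit_for_iso)
    print(f"ISO {camera_iso} sensor: about {stops_over:+.2f} stops over")
```

[That works out to roughly a third of a stop over at ISO 800 and nearly a full stop over at the nominal ISO 1200, which matters in a test that leans on contrast as a component of perceived resolution.]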

That's just one example ...
 
We look at this "test" and just roll our eyes. We would say that you should just do your own tests but it is nearly impossible for everyone to get hold of every camera.

We don't have a solution to this problem, but certainly don't feel that this test does anything to help anyone make a reasonable decision.

Jim
 
There's a lot of cognitive dissonance going on with the audience members at the shootout, it seems to me...

(Except in the case of the DSLRs, where the frame grabs are so soft that there's no choice but to admit there's no comparison.)

Some of the best real-world footage I've seen that illustrates the difference resolution can make is in the Roland Garros shots, where you have very wide, pin-sharp shots of the crowd and can easily identify each person.
http://reduser.net/forum/showthread...ot-on-Epic-4k60p-delivery-for-152-quot-Plasma
 
Jens Bogehegn says in the beginning of Episode Two: "Camera manufacturers were invited to be involved and send a camera master for their camera." I'm a little surprised that RED didn't send anyone.
 
Why don't they let you download the video?

Vimeo can really suck for hosting long videos, and it often chokes. I can barely get through the whole thing.

The approach is a nice idea, but the best thing to do is your own tests with whatever you are going to use.
There is also a dead pixel in the Steven Lighthill interview.
I was seriously afraid that girl's hair was going to get caught in the candle.

On the good side, Zacuto is trying to get all the kids in the same room to compare apples to apples, but with 12 cameras (some of which shouldn't be compared to each other) it can be very tough. Ultimately, I think the point is to see side-by-side comparisons, which are, well, side-by-side comparisons. Drivers win races, in good cars.

Also, this is not a documentary. It is an event video.



David
 
I've downloaded it with Internet Download Manager (for Windows). Highly recommended app.
 
Guys, I don't mean to stir the pot further, but the second episode just left a bad taste in my mouth. Why?

Well, Zacuto goes out of their way to firmly state that they had nothing to do with the test, and that their presentation, dubbed the Great Camera Shootout of 2011, is NOT the test itself but rather a documentary ON the test. However, we all know there is a rule in the film industry that a film is created three times: once in pre-production (the script), once in production (the actual filming), and then it is recreated from scratch a third time in editing (post). Thus, Zacuto themselves may not have RUN the test, but they have close to 90% of the control over the results, and over the sway on the public, via how they edit the footage.
For instance, it is stated that there is no bias, but is it not INDIRECT bias to repeatedly show an audience member stating after every test, "Wow, I thought the Arri Alexa blew the others away"?

I thought the test was supposed to be an objective look, letting viewers decide on their own, yet they continually splice in SUBJECTIVE footage of audience members, subliminally shaping the viewer's perception with remarks that the Arri Alexa is superior to everything else.
Yet the resolution test almost seemed canned, quickly and deliberately playing down how utterly the RED dominates the rest of the cameras. I don't even think the RED was named during that segment, yet in every other segment from parts 1 and 2, Zacuto spliced in at least one person stating how the Arri Alexa dominated. Just because Zacuto did not shoot this themselves, or have any control over the testing, does not mean they do not 100% control the results via the editing of their episodes, and I find it strangely biased.
 
I have to laugh at this thread being about people discrediting tests done by true professionals, with input from working professionals and creatives whose names mean something, only for a load of no-names to come on here and try to discredit their efforts because they don't praise this particular brand's camera. Cameras are horses for courses; listen to any of the pros in this video and it's pretty obvious that's their shared opinion: there will never be one that's head and shoulders above all others in all aspects.
 
Tom, in case you've not been following the thread as closely as others: the criticism is the lack of documentation on how measurements were made, that some measurements don't make sense (like resolution greater than what the sensor is able to achieve), that lights come and go between comparisons, etc. I have to laugh at the "load of no names" comment, which appears to be a case of the pot calling the kettle black.

Graeme
 

Thanks Graeme - didn't catch that one. While I fully understand your point, you could have sent someone along who wasn't necessarily affiliated directly with RED (meaning, recommended someone who knew what the hell he was doing, as it didn't seem like the one they used had any clue - or didn't bother to stand his ground).
 
I don't claim to have a title of any kind at all, far from it. I'm just saying the editing doesn't color someone's views when a room full of very experienced people are all pretty much conveying the same message. They bust their ass to make these videos.
 