Your statement is not really relevant when using typical grading software such as Resolve, Baselight, Lustre, Scratch, Pablo, or any other "standard" grading toolset, which is what I was referring to and what the post I was responding to was asking about.
Last edited by Elsie N; 05-05-2012 at 04:49 PM.
Che didn't even make back its budget. Knowing was critically panned (34% on Rotten Tomatoes). Gamer didn't make its budget back either, and sits at 30% on Rotten Tomatoes. The Girlfriend Experience, Valhalla Rising... it didn't matter what camera was used in these cases. District 9 was shot on RED and did well. Why? THE STORY.
Avatar made money because it's emotionally manipulative, and James Cameron is a master of that. The story wasn't original or good; people were more interested in the world. It's an exception, not a rule.
Notice how many successful and unsuccessful movies are shot on a variety of cameras? As long as people can see what's happening on the screen and are engaged, that's all that matters. Yes, there's a lot that goes into that, and yes, having a great image can contribute to it, but it is hardly a requirement. You've admitted as much yourself by pointing to Paranormal Activity's success.
Here's a thought: forget about tests and go out and shoot something on an iPhone or whatever camera you can get your hands on. Wouldn't that be more productive than complaining about companies with successful business plans that you don't like?
At some point, I think it's not about what camera you are shooting with. It becomes just shooting.
The camera nobody talks about in tests is the camera that played a big part in me getting my current job. I'm not saying the iPhone is what you want to shoot a big movie on, but it is a camera.
Now I owe you more thanks. Great. I should have just made it a crate or built a small structure out of beer kegs and delivered that instead. OK damn it. Need to get more. You prefer regular Sam Adams or their seasonal stuff?
Maybe I'm biased because our last trailer for The Avengers is the new record holder for most internet downloads in a day :)
Personally, I watch trailers at work on my side computer - a 2011 24" iMac with stock monitor settings - i.e., emulating a typical consumer viewing environment. At home, I watch trailers on my Sony projector, my calibrated DreamColor monitor, and/or a 27" iMac.
I just wish - speaking as a RED nut - that I couldn't spot every movie shot on RED by its "RED colors!" look...
For contrast, I remember saying "this likely wasn't shot on RED" 5 shots into the trailer of Hunger Games. Which I watched at 360p on Youtube. The rich, natural colors instantly read as filmic, even at low res: http://www.youtube.com/watch?feature...&v=mfmrPu43DF8
I'm sure it doesn't bug many folks, of course, except crazy old me - the Spiderman trailer looks fantastic otherwise, and marketing and the filmmakers are doing a great job. Seriously, wow. I'm there opening weekend.
I must know at least 50 colorists in LA, and the good ones will generally agree that, very often, "less is more." Start with a reasonable correction, and then, when you really need to go wild, make those adjustments the later layers / nodes / whatever.
This was my reason for several conversations in the Resolve area, where people are trying to build 9 nodes for a relatively simple shot of two people standing by a tree. I've done shots with 20 layers or windows on occasion, but those were whacked-out, crazy, screwed-up situations where we were basically trying to save a very problematic scene or introduce a very heavy, complex look. A straightforward scene, shot well, shouldn't take more than a minute and a couple of nodes to make it look reasonable. Nothing's worse for a colorist than going through a session and realizing that node 7 basically undoes what was done in node 2, or that node 8 just adds noise to compensate for a bad balance in node 1. Color correction can really be a house of cards if you aren't careful.
I honestly didn't think the new Spiderman trailer looked bad at all, and there's always the chance that blotchy, uneven skintones are the fault of bad makeup. On a movie I worked on some years back, Queen of the Damned for Warner Bros., we spent a lot of the color-correction process just fixing makeup problems on the two lead actors - at least 100 hours mainly solving that one issue. You can pull keys and even out blotchy skin, but at some point it becomes more of a visual effect than color correction, per se.
I'm always amazed by what certain people zero in on during a shot. The director worries about a glass window in the back; the DP worries about the second actor being slightly out of focus; I'm looking at the entire frame and how it's dark on one side. None of us is wrong. These are very subjective areas, and our brains are sometimes bigger factors than cameras or monitors.
Back to the original topic: I would be very curious to see what would happen if you took 7 different high-end digital cameras, shot the same set with a decent DP, then went into one color-correction room and spent a week making all the cameras look as identical as humanly possible. At least then the other characteristics would be on an even playing field, and nothing would jump out and look awful or weird.
I've asked the Zacuto people before if they were interested in releasing their past tests for sale on Blu-ray, and so far they haven't been. Me, I'm skeptical of what I can really judge from streaming media on questionable computer displays (even my own). A Blu-ray on a decent consumer monitor would be a lot better than that; a 4K theatrical projection from a proper DCP would be better still.