Marc: "I will again voice the opinion that I don't think the 'K' matters as much as the lighting and the skill of the cinematographer." - In some ways you're right, but a well-implemented higher-resolution sensor produces fewer unnatural artifacts than a lower-resolution one and handles contrast better in terms of MTF, so more K brings the cinematographer many indirect benefits beyond resolution alone. If you want to think of those benefits as primary and absolute resolution as secondary, that's fine with me. To me, they're all benefits, and I use and appreciate them as appropriate.
I would agree that it's better to start with a 4K image and degrade it, rather than trying to take an HD image and uprez it. Still, even with Hugo (which won the Oscar), they worked constantly to soften the image throughout that project, even for film out. An ultra-sharp look would not be appropriate for 1931, and -- as has often been said by directors I've worked with -- "sharpness is not always our friend," especially with actors of a certain age.
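The "degrade 4K rather than uprez HD" point is easy to demonstrate: once detail has been averaged away by a downscale, no upscale can bring it back. A toy numpy sketch (1-D pixel lines and a nearest-neighbour "uprez" as stand-ins for real 2-D resizing):

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical "4K" line of pixels with fine random detail.
line_4k = rng.random(4096)

# Downscale to "HD" width by averaging pairs (4096 -> 2048), a stand-in for a real resize.
line_hd = line_4k.reshape(2048, 2).mean(axis=1)

# "Uprez" the HD line back to 4K by pixel repetition (nearest-neighbour).
line_uprez = np.repeat(line_hd, 2)

# The round trip cannot recover the fine detail that the averaging discarded.
err = np.abs(line_uprez - line_4k).mean()
print(f"mean error after downscale + uprez: {err:.3f}")  # clearly nonzero
```

Starting from a true 4K capture, you can always soften to taste; starting from HD, the information simply isn't there to recover.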
Once there are a dozen 4K cameras on the market, Red is going to have to work on other marketing and technical issues beyond "just the K," and just the price. Strictly my opinion.
Well, there's a vast difference between sharpness and resolution. I agree that sharpness is not always desirable - especially the fake electronic kind which never looks natural. I've had to suffer through DVD releases where the sharpness lines were so thick they looked like a child's wax crayon line around the characters. And indeed, it's not resolution that's a problem for aging actors, but "HD sharpness", which is usually a combination of too much aliasing and too much electronic sharpness.
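Those thick "wax crayon" outlines are the overshoot and undershoot that aggressive electronic sharpening (essentially an unsharp mask) adds around every edge. A minimal numpy sketch on a 1-D step edge, with an assumed 5-tap box blur and an exaggerated "amount":

```python
import numpy as np

# A clean 1-D step edge: dark (0.2) to bright (0.8), values within [0, 1].
edge = np.concatenate([np.full(16, 0.2), np.full(16, 0.8)])

# A small low-pass (5-tap box filter, standing in for the sharpener's blur stage).
kernel = np.ones(5) / 5.0
blurred = np.convolve(np.pad(edge, 2, mode="edge"), kernel, mode="valid")

# Unsharp mask: add back the high-pass difference, scaled by an aggressive "amount".
amount = 2.0
sharpened = edge + amount * (edge - blurred)

# Values now overshoot past 0.8 and undershoot below 0.2 around the edge:
# that halo is the "outline" you see on over-sharpened DVDs and broadcasts.
print("max overshoot:", sharpened.max() - edge.max())
print("max undershoot:", edge.min() - sharpened.min())
```

Note the sharpened signal exceeds the original brightness range on both sides of the edge; that ringing, not resolution, is what's unkind to faces.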
Of course, you've got to wonder why lens designers go to the bother of making their glass sharp if all that is wanted is a soft image, or why we even moved to HD, or why to use 65mm or IMAX to get more resolution out of film. There's obviously a call for resolution and sharpness, and as a camera that crosses the stills / motion boundary, it's even more important.
What we can do with all that resolution is make a camera that responds well to nuances in lighting and filtration. We can keep the MTF50 high, so you get strong contrast rendition across the image, with room to let your filtration work on the fine details while still keeping the image "readable".
4K+ was a necessary move, but there's a fair bit of good reason in what you say. I think we render such resolutions tastefully and with some grace in how the detail presents itself. There are other high-resolution camera systems out there that I've seen throw that detail in your face in a rather aggressive manner, which I'm not keen on at all.
Absolutely: still, you gotta wonder why J.J. Abrams is shooting some of the new Star Trek movie in IMAX, and why Chris Nolan's new Batman movie is mostly shot in IMAX. What are they getting from IMAX that they can't get from 4K digital?
I agree absolutely that the lens makes a huge amount of difference, even on lower-end cameras. It's very telling that Zeiss Master lenses, Cooke lenses, and Panavision Primo lenses own the high end of the digital business, witnessed by movies like Dragon Tattoo, Hugo, and the upcoming Oz: The Great and Powerful. The lenses and the guy lighting the picture make a huge amount of difference -- really more than the camera, in my opinion as a working colorist for a long, long time.
And I would add that often for CG elements I render with 16-32x supersampling in order to remove artifacts.... and then apply a 1-2 pixel gaussian blur in the end. Aliasing can't be blurred away. OLPFs help optical sensors enormously compared to CG but aliasing is aliasing. :)
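That render pipeline (supersample, average down, then a touch of blur) can be sketched in a few lines of numpy. This is a toy 1-D version with a hypothetical hard-edged `scene` function and 16x supersampling; the point is that naive one-sample-per-pixel rendering yields only hard 0/1 jaggies, while supersampling produces the fractional-coverage values that actually kill aliasing before any blur is applied:

```python
import numpy as np

def scene(x):
    # Hypothetical 1-D "scene": white where the sine is positive, black elsewhere.
    return (np.sin(40.0 * x) > 0).astype(float)

n = 64    # output resolution
ss = 16   # supersampling factor (the post mentions 16-32x)

# Naive: one sample per pixel centre -> hard 0/1 values, i.e. aliased jaggies.
centres = (np.arange(n) + 0.5) / n
naive = scene(centres)

# Supersampled: 16 samples per pixel, box-averaged down to the output grid.
fine = (np.arange(n * ss) + 0.5) / (n * ss)
super_avg = scene(fine).reshape(n, ss).mean(axis=1)

# Finish with a small blur (a cheap stand-in for the 1-2 pixel gaussian mentioned above).
kernel = np.array([0.25, 0.5, 0.25])
smooth = np.convolve(np.pad(super_avg, 1, mode="edge"), kernel, mode="valid")

print("naive unique values:", np.unique(naive))
print("fractional edge pixels after supersampling:", np.sum((super_avg > 0) & (super_avg < 1)))
```

Blurring the naive samples would only smear the jaggies around, which is the point about aliasing not being something you can blur away after the fact.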
I continue to be amazed at the amount of aliasing that is "accepted" in displayed images. I would even go so far as to say that viewers have become so accustomed to the "outlining" of edges that when they are not "enhanced" (sic) they perceive them as soft.
How did we get here? I have a few theories:
Distribution via MPEG-2 at insufficient bit rates has led to HD picture quality that is fundamentally lacking in the greater detail CE companies and MSOs are hyping. To artificially compensate for the lack of delivered resolution, "sharpness" is ladled on, often at several steps along the way from mastering to the settings on your display.
TV manufacturers crank up the detail and contrast in their default settings to make them "pop" in the showroom. A huge percentage of buyers never adjust this over-sharpened picture setting.
Many people sit too far from their screens, so they use sharpening in an attempt to overcome the limitations of their eyesight.
While those of us in the industry may have the good fortune to see displays in a proper viewing environment, many consumers put their TV sets in brightly lit rooms, directly across from large windows, etc., in which case juicing up the image is simply a way to overcome glare and high ambient light.
I dream of a world where no sharpening is done anywhere but at the display, kind of like not adding any salt to food during prep and letting the diner decide just how much to sprinkle on to suit their taste (or, in the case of TV watching, adjusted to compensate for current viewing conditions/material).
Cheers
If your captured "sharpness" is disturbing, could it mean you're using the wrong lens for that shot?
On the 8K-and-up front, I would think the bottleneck at this point is data throughput, not the sensor.
Can you store, move, and edit the files? Can it record at over 60 fps? If not, what would it be good for?
I'll take 3K or 4K RAW vs 8K compressed and color-locked for any work I'll ever do.
Now 8K RAW . . . . might someday have a use.
This sensor is meant for 8K Broadcast cameras. For a general digital motion camera you would of course go for a higher resolution sensor.
Report from the last 8K broadcast test demo at IBC, September 2011: "Super Hi-Vision Wows Audiences at IBC 2011"