
  • Hey all, I just changed over the backend. After 15 years I figured it was time to give it a bit of an update. It's probably going to be a bit weird for most of you, and I'm sure there are a few bugs to work out, but it should work much the same as before... hopefully :)

Is the future FF35?

Well let's find out. :emote_popcorn:

There are advantages to digital other than DR. And again, there are things that can be done in post at the RAW stage that simply cannot be done on film.

Like what? I don't see a lot of difference between a RAW sensor record and a film negative. Both are relatively low contrast elements that represent what was in front of them at the time of exposure. The differences to me are simply that the RAW record is captured as linear light, and the film negative image is captured with a specific characteristic logarithmic response curve, and that the film record has an inherent grain pattern while the RAW record has an inherent noise pattern. Once film is scanned into a representative digital form, there is nothing I can see that can be done with the electronic capture that can't be done with the film capture - except that the film capture will have more dynamic range and better retention of highlight information, at least based on current technology. Am I missing something?
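The linear-versus-log distinction described above can be made concrete with a toy transfer function (the mid-grey anchor and 10-stop range below are invented for illustration; real film characteristic curves also have a toe and a shoulder):

```python
import math

def linear_to_log(x, black=0.001):
    """Toy log encoding: express a linear-light value in stops relative to
    mid-grey (0.18), then map a 10-stop range onto 0..1. Illustrative only;
    no real film stock or sensor follows exactly this curve."""
    stops = math.log2(max(x, black) / 0.18)          # stops above/below mid-grey
    return min(max((stops + 5.0) / 10.0, 0.0), 1.0)  # -5..+5 stops -> 0..1

# A linear encoding spends half its code values on the brightest stop;
# a log encoding gives each stop an equal share of the code range.
for linear in (0.045, 0.18, 0.72):                   # -2, 0, +2 stops
    print(round(linear_to_log(linear), 2))           # 0.3, 0.5, 0.7
```

The point of the sketch: equal steps in stops become equal steps in the log-encoded value, which is what makes a scanned negative and a debayered RAW file equally amenable to the same grading operations.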
 

[image attachment]


There are all kinds of crazy things that can be done in the RAW stage. Is Stephen going to set up a chemical lab at his house like some kind of mad scientist to try to match what can be done with a few swipes of a mouse in Camera RAW?
 


All you did was an exposure change. Your first image is printed too dark, and your second is printed too light. A properly done negative scan allows the same (in most cases, more) latitude. If you're talking about the need for extra steps to get from the raw material (in this case, film) to a scan, to your desktop, vs. importing a RAW file directly from a digital camera, yes, of course, that's true. But you were implying that somehow a digitally shot RAW file can yield image flexibility that film cannot, and that is just not the case.
 
What Tom does is faster, cheaper, and almost equally good... when HDR moves to all cameras it's cheaper, faster, and better... so why shoot film?
 
The types of manipulations done to shots like the one posted above (by the way, not my shot), are far too numerous to even begin to list. There could be 20 different levels of manipulation going on. Those types of shots require very advanced RAW processing skills.

(Please note the lack of astrophotographers who currently use film of any kind.)

Maybe you are right. Who knows.

I recently had a shot with elements of light probably 3 stops or more overexposed. With a combination of Adobe Camera RAW virtual Grad NDs and by pulling down the luminance in certain channels, desaturating other channels, etc, I was able to bring a hopeless photo into balance. Maybe this can be done with film. I have no idea, since the last time I spent much time in a darkroom was in high school.

I like having as much control over my own images as I can get from the time the shutter is pressed to the time the image is displayed. Another nice thing about digital.
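The idea behind a "virtual grad ND" of the kind described above is essentially a position-dependent gain applied to linear-light values. A hypothetical sketch (the function name, fade geometry, and strength are all invented for illustration; Adobe's actual tools are far more sophisticated):

```python
def virtual_grad_nd(rows, strength_stops=3.0):
    """Toy 'virtual graduated ND': darken the top of the frame by up to
    `strength_stops` stops, fading linearly to no change at mid-frame.
    `rows` is a list of rows of linear-light pixel values."""
    n = len(rows)
    out = []
    for i, row in enumerate(rows):
        fade = max(0.0, 1.0 - i / (n / 2))       # 1.0 at top, 0.0 by mid-frame
        gain = 2.0 ** (-strength_stops * fade)   # stops -> linear gain
        out.append([v * gain for v in row])
    return out

# A blown-out sky (8.0) over a normal foreground (0.5):
sky = virtual_grad_nd([[8.0]] * 4 + [[0.5]] * 4)
print(round(sky[0][0], 2))  # 1.0 -- top row pulled down 3 stops
print(sky[7][0])            # 0.5 -- foreground untouched
```

Nothing in this operation cares whether the linear values came off a sensor or off a film scanner, which is the crux of the disagreement in this thread.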
 

You seem to be confusing digital processing with digital capture. They are not the same thing. Image capture is image capture, regardless of the medium used to capture it. If you can capture a wide enough range of information, you can then use digital processing techniques to manipulate it. Modern digital cameras capture a very wide range of values. Modern film emulsions capture even more. Once you scan that negative into a digital format, the same processing techniques can be used.

You are certainly not the first to confuse the two. Even some studio execs, high end directors of photography, and directors seem to sometimes jump on the bandwagon and profess that digital capture somehow allows them to use post production tools that film capture doesn't. That sounds good, except that it isn't true. You don't have to capture with a digital camera in order to use digital post tools, just as you don't have to capture with a digital camera in order to have digital projection. It's a choice, each one being a valid one.
 
Why don't astrophotographers use film?

Also, is there as much room to push around colors with film as there is with RAW?
 

I don't know anything about astrophotography, so I can't answer that. I would offer the layman's opinion that much of what is regarded as "astrophotography" has migrated to space-based platforms - i.e., the Hubble telescope, deep space probes, etc. In those situations, electronic imaging is the only choice, because you can't return a piece of physical film to Earth electronically.

I would also point out that on manned Shuttle missions, quite a bit of film is shot. In fact, I attended a presentation recently regarding the use of the IMAX camera aboard the Hubble "refurbishing" mission last year. It was sent up - at great expense - because it was felt by all involved - including NASA officials - to be the best image capture medium currently available. Not to mention the damage that can be and is done to electronic sensors by cosmic rays in the unprotected space environment, with no currently known method of prevention.

However, my entire career has been spent doing "normal" photography, on sets and on location, and in that milieu, what I said is my opinion, which is generally in line with that of most industry professionals.
 
Hi Tom,

please don't take me the wrong way, but the best time-lapse is still shot on film - though not on the S35 that Stephen would like to rule with (his "clunky" TS S35 stuff).

It is shot on 65mm film with anamorphic lenses, as Ron Fricke used for the movie Baraka and was reportedly still using on his latest project, the movie Samsara.

Even when watching movies on DVD/Blu-ray discs, you can still get that feeling of celluloid's sort of "wet emulsion" that you can't find in digitally shot movies.

Maybe somebody can finally make a plug-in called "film look," with a celluloid emulsion effect adjustable from 0 to 100%.
 
Why don't astrophotographers use film?

Also, is there as much room to push around colors with film as there is with RAW?

Mike's right in that a film image is essentially a raw image, simply stored on film. It's just waiting to be transferred to a computer. Film is still one of the most cost-effective long-term storage media in existence. There's a ton of information packed into very little space.

Why don't astrophotographers use film? My guess is reciprocity failure. Long exposures can make film response inaccurate/weird etc. Perhaps digital sensors are superior for long exposures.
 
I don't know anything about astrophotography, so I can't answer that. I would offer the layman's opinion that much of what is regarded as "astrophotography" has migrated to space based platforms...

I'm talking about Earth-based astrophotography, with 35mm cameras. Nearly all of it is done on digital cameras now. I'm also going to guess that there is no other form of photography that requires as much ability to manipulate the image as astrophotography does. So you have to ask yourself: if there is more image information on film, why aren't astrophotographers using it?

I'm just posing the question.

BTW, this thread is now miles off track! :auto::driving:

Sanjin, let me get some screenshots for you.
 
Why don't astrophotographers use film? My guess is reciprocity failure. Long exposures can make film response inaccurate/weird etc. Perhaps digital sensors are superior for long exposures.

Good point, I hadn't really considered that.
 
Again, though, I want to point out that there are other factors beyond DR. Is Stephen's image going to be as stable and flicker-free as mine? Will his S35 image match the detail and clarity of a FF35 RAW? Doubtful. Let's find out.
 

From Wikipedia (it can't possibly be wrong if it's in Wikipedia - Wikipedia knows everything...;-)):

Reciprocity failure is an important effect in the field of film-based astrophotography. Deep-sky objects such as galaxies and nebulae are often so faint that they are not visible to the un-aided eye. To make matters worse, many objects' spectra do not line up with the film emulsion's sensitivity curves. Many of these targets are small and require long focal lengths, which can push the focal ratio far above f/5. Combined, these parameters make these targets extremely difficult to capture with film; exposures from 30 minutes to well over an hour are typical. As a typical example, capturing an image of the Andromeda Galaxy at f/4 will take about 30 minutes; to get the same density at f/8 would require an exposure of about 200 minutes.
When a telescope is tracking an object, every minute is difficult; therefore, reciprocity failure is one of the biggest motivations for astronomers to switch to digital imaging. Electronic image sensors have their own limitation at long exposure time and low illuminance levels, not usually referred to as reciprocity failure, namely noise from dark current, but this effect can be controlled by cooling the sensor.

I think that answers your original question.
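The arithmetic in that quote (30 minutes at f/4 vs. roughly 200 at f/8, where ideal reciprocity would predict 120) can be modeled with the Schwarzschild law - my gloss, not the article's. A minimal sketch, with the exponent p chosen purely so the numbers line up:

```python
def long_exposure_time(t1_minutes, stops_closed, p=0.73):
    """Exposure time needed after closing the aperture `stops_closed` stops,
    under the Schwarzschild law (effective exposure ~ intensity * t**p).
    p = 1 is ideal reciprocity; p = 0.73 here is an assumed value chosen
    only to illustrate the failure, not a datasheet figure for any film."""
    return t1_minutes * (2 ** stops_closed) ** (1.0 / p)

# f/4 -> f/8 is two stops, i.e. a quarter of the light.
print(round(long_exposure_time(30, 2, p=1.0)))   # 120 minutes, ideal
print(round(long_exposure_time(30, 2, p=0.73)))  # 200 minutes, with failure
```

With p below 1, required time grows faster than the light loss, which is exactly why hour-plus deep-sky exposures pushed astronomers toward cooled digital sensors.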
 
MMost, I already conceded from the very first post that film has more DR. No one disputes that.

So getting back to the topic of this thread, is Stephen's S35 timelapse going to be as stable, flicker-free, clear and as full of detail as my FF35 5.6K RAW? Putting DR aside, will his S35 timelapse match the image quality of my FF35 5.6K RAW?
 

Since the FF35 5.6K RAW doesn't exist yet, that's clearly a question that nobody can answer. However, since that 5.6K Bayer chip RAW image will likely represent about 4K of actual resolution once processed - assuming the optics in front of it are capable of passing that - my "quick" answer would probably be yes, based on what current film emulsions are capable of, once again given glass that can yield that. But, once again, this is conjecture because while one currently exists, the other does not.

But there are other considerations. Timelapse is a specific case (much like astrophotography) in which certain characteristics of digital imagers have some advantages given certain conditions. One of the nice things about having high quality digital imagers today is that there are choices, and depending upon the specific imaging requirements, there are different tools that are available. It's not just film anymore.

Instead of looking at these things as one being consistently superior to the other in every case and in every way, one should be looking at the plethora of tools available and matching their strengths and weaknesses to the requirements of the specific job. What works well for shooting explosions may not be the best choice for time lapse.
 
5.6k is probably heading towards 4.5k measured resolution. Of course, the only way to know for sure is to measure. Given that without the OLPF we can induce serious aliasing on the RED One with a lens that's not the most expensive, I'd say that if your aperture is reasonably set, the 5.6k resolution is probably not yet lens limited. But testing will tell us for sure.

Graeme
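As a back-of-the-envelope check on the figures being discussed: demosaiced luma resolution is often estimated at roughly 80% of the Bayer photosite count. The 0.8 factor below is that rule of thumb, not a measurement of any camera:

```python
def bayer_effective_k(photosites_k, factor=0.8):
    """Rule-of-thumb effective luma resolution after demosaicing a Bayer
    sensor: roughly 80% of the horizontal photosite count. The 0.8 factor
    is a commonly quoted approximation, not a measured figure."""
    return round(photosites_k * factor, 2)

print(bayer_effective_k(5.6))  # 4.48 -- in line with "heading towards 4.5k"
print(bayer_effective_k(4.0))  # 3.2  -- why a 4K Bayer chip isn't "true 4K"
```

The true factor depends on the demosaic algorithm, the OLPF, and the lens, which is why Graeme's caveat about actually measuring stands.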
 
Since the FF35 5.6K RAW doesn't exist yet, that's clearly a question that nobody can answer.

Sure it exists. I shoot it all the time.

Here are two frames shot by Ron Fricke on 65mm film and scanned at 8K for the BARAKA Blu-ray. The other two were shot by me recently on a FF35mm 5.6K RAW 5D Mark II. These are screengrabs from similar H264 encodings.

[image attachments]
Just to be clear, I am not comparing the shots, just the format. Everyone knows that I worship Ron Fricke.

When you watch these shots moving, the digital footage is far, far cleaner, more stable, more detailed and clearer, even against the legendary 65mm, shot by a master.
 
5.6k is probably heading towards 4.5k measured resolution. Of course, the only way to know for sure is to measure. Given that without the OLPF we can induce serious aliasing on the RED One with a lens that's not the most expensive, I'd say that if your aperture is reasonably set, the 5.6k resolution is probably not yet lens limited. But testing will tell us for sure.

Graeme

One of the features of the new Leica S2 series is that they don't have an optical low pass filter, opting instead for software based variable low pass in the processing. Any opinion on how that might work for RED?
 

Software OLPF is bollocks, unfortunately. You can, to an extent, deal with chroma moire in software, but luma moire is "burned in" at the point of sampling, and if it were easy to seamlessly remove afterwards, we'd all be doing it. Fact is, there's no way to remove it. There is some software that can make an OK guess - Hassy's raw software tries, but when it fails, and it does, it looks ugly. And there's no way it would stand up to motion.

In the end, the "correct" engineering and aesthetic approach is to optically ensure that the input signal to the sensor is sensibly band limited, and that there are enough photosites to get a reasonable degree of oversampling.

Graeme
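The "burned in at the point of sampling" argument is the Nyquist sampling theorem at work: a frequency above Nyquist produces exactly the same samples as some lower frequency, so no software can separate them after capture. A small one-dimensional sketch:

```python
import math

fs = 8.0                 # sample rate; Nyquist frequency is fs / 2 = 4.0
f_high = 7.0             # detail above Nyquist, e.g. fine fabric texture
f_alias = fs - f_high    # the lower frequency it folds down to (1.0)

high  = [math.sin(2 * math.pi * f_high  * n / fs) for n in range(16)]
alias = [-math.sin(2 * math.pi * f_alias * n / fs) for n in range(16)]

# The two sample sequences are numerically identical: after sampling there
# is no information left to distinguish real detail from moire, which is
# why the low-pass filtering has to happen optically, before the sensor.
print(all(abs(a - b) < 1e-9 for a, b in zip(high, alias)))  # True
```

This is why "software OLPF" can at best guess at what the scene contained, whereas an optical filter band-limits the signal before the information is lost.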
 