
Build 15 update...

Status
Not open for further replies.
Not quite...:bye2:

I have more tests to do and am waiting for build 15, but as of today it's closer to 7.5 stops (without noise) than 9 stops (200 ASA / +3 stops, -4.5 stops)...:blush:

See you.

Pat

Hi,

That's why I am holding back my tests too.

Stephen
 
Hm, bizarre - we easily got 9-10 stops from the very first tests; one of the charts is in fact my backdrop on this monitor.

Have you shot on your own camera and compared with another? The first 100 units in particular had /massive/ issues before the replacement.

Also one can screw up in the raw workflow and when using the 709 monitoring.

I am not saying that you are doing something wrong, just trying to understand why you get such different results.
 
Also one can screw up in the raw workflow and when using the 709 monitoring.

Not saying anything about stops here, but this is a very important issue to understand when dealing with Red.

Those 12 bits are a fragile matter, easily thrown out.

Gunleik
 
We have said all along that dynamic range testing is a slippery slope. You can make a case many ways depending on how you test. I guess the best measure is what the footage looks like... as David Mullen has suggested many times.

We have posted charts and our methodology. If you use different tests, you can certainly get different results.

Jim
 
I certainly agree with Jim's observation. I took some shots to test DNR (firmware #12) which you can see here: http://www.bealecorner.org/red/test-080108/page2.html#Test_5

I used the Imatest ( http://www.imatest.com/ ) program to measure the output TIFF and I obtained anywhere between 7.5 stops and 11.5 stops from the SAME original raw frame, just by varying the gamma adjustment, and depending on where I set my noise tolerance threshold. Right now I don't have much confidence in any DNR range test that concludes "X.X stops". Not from my test, and not from anyone else's.

If you want to compare DNR of Camera A and Camera B, I think just shooting the same scene with both and studying the results is probably a more useful thing to do.
 
Seems like these sorts of numbers are only useful for *comparing* inputs, no? You've got to specify results for Red + 35mm / Dalsa / D-20 / whatever, using the same test, or the number doesn't say very much.

Even more than that, at least the way I tested it, my final DNR number was very sensitive to the input processing. I could get a higher or lower number by several stops, just by changing the gamma curve, from the exact same input.

Maybe the program I used could be improved (no doubt it can), but I suspect that no matter what program or method you use to generate one number for dynamic range, it's going to depend sensitively on exactly how you pre- and post-process your raw or scanned file. Based on my experience, if you sit there for hours tweaking small adjustments to a gamma curve, you can make the DNR number bigger. But what does that say about the real-life images you get from the camera? At this point I'm not sure.
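For what it's worth, the gamma sensitivity is easy to reproduce with a toy measurement. The sketch below (illustrative only; this is not Imatest's actual algorithm, and the sensor numbers are made up) simulates a 1-stop-per-step wedge in 12-bit linear space, counts the patches whose mean signal clears a noise threshold, then applies a gamma curve to the very same data and counts again:

```python
import numpy as np

rng = np.random.default_rng(0)

def measured_stops(patches, snr_threshold=1.0):
    """Count wedge patches whose mean signal clears the noise floor;
    each passing 1-stop patch adds one stop to the reported DNR."""
    count = 0
    for patch in patches:
        mean, noise = patch.mean(), patch.std()
        if noise > 0 and mean / noise >= snr_threshold:
            count += 1
    return count

# Simulate a 14-patch, 1-stop-per-step wedge captured in 12-bit linear
# space: each patch is half as bright as the last, plus read noise.
read_noise = 2.0
patches = [np.clip(rng.normal(4095 / 2**i, read_noise, 10_000), 0, 4095)
           for i in range(14)]

linear_result = measured_stops(patches)

# Now apply a display gamma to the *same* data. The scene information is
# unchanged, but shadow patches are lifted relative to the noise floor,
# so the naive threshold count reports a different number of stops.
gamma = 1 / 2.2
gamma_patches = [4095 * (p / 4095) ** gamma for p in patches]
gamma_result = measured_stops(gamma_patches)

print("linear:", linear_result, "stops; after gamma:", gamma_result, "stops")
```

The point is not the particular numbers but that an identical frame yields a different stop count once a tone curve intervenes, which matches the Imatest behaviour described above.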
 

A CMOS chip manufacturer should be able to measure the noise level and the signal level to come up with the internal "on-sensor" dynamic range. This internal dynamic range should map easily to "external" dynamic range by considering the bit depth of the ADC.

However, manufacturers are often ambiguous about how they measure noise and signal levels (is it RMS, or peak-to-peak, or something else?), and they can pick whichever definition gives them the best numbers.

That is why users are reduced to measuring it experimentally, by putting a chart/pattern in front of a camera. It is difficult to devise a standard illumination environment, ambient lighting, chart-to-camera distance, etc., for different users with different cameras. Hence the natural variation in users' numbers.
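A back-of-the-envelope calculation shows how much the choice of noise figure matters. The sensor numbers below are invented for illustration (they are not RED's specs):

```python
import math

# Hypothetical sensor figures (illustrative only, not RED's actual specs):
full_well_e = 30_000       # saturation signal, in electrons
read_noise_rms_e = 10.0    # RMS read noise, in electrons
read_noise_pp_e = 50.0     # a peak-to-peak estimate (~5x RMS) instead

# The same sensor yields two very different "dynamic range" claims
# depending on which noise figure goes in the denominator:
dr_rms_stops = math.log2(full_well_e / read_noise_rms_e)  # noise = RMS
dr_pp_stops = math.log2(full_well_e / read_noise_pp_e)    # noise = peak-to-peak

# The ADC caps the *encoded* range: a 12-bit linear ADC spans at most
# log2(4096) = 12 stops between the lowest code and full scale.
adc_stops = math.log2(2 ** 12)

print(round(dr_rms_stops, 2), round(dr_pp_stops, 2), adc_stops)
```

With these figures the RMS definition gives about 11.55 stops while the peak-to-peak definition gives about 9.23 stops, a spread of well over two stops from one sensor and one measurement.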
 
I don't think anyone who's serious is attempting to measure DNR with reflective charts. I'm using a carefully constructed light box which has even illumination to better than 0.05 stops across the active area, a transmissive Stouffer test wedge, and there was no ambient (front) illumination. I did some tests to measure effects of lens flare and other factors, but my point is I could get large variability from the *same frame* due to the way I did post-processing, never mind the external variables.
 
IMHO, dynamic range should be kept independent of post-processing parameters such as gamma. Strictly speaking, it just tells us how many distinct levels a sensor can distinguish, so gamma and other downstream parameters should not be included, in my opinion.
 
As a theoretical matter, I agree with you that the gamma should not matter. As a practical matter, it turned out the program I used to measure DNR (Imatest) gives you different numbers depending on the gamma of the image, and I don't have another way to measure it quantitatively without some subjective judgement being involved.

If you'd like to try this, I have a raw file for you on this page:
http://www.bealecorner.org/red/test-080108/page2.html#Test_5

The target is a back-illuminated Stouffer T4110 with evenly spaced areas that differ in transmitted light by 1/3 stop each, for a full range covering 13.3 stops, which ought to be enough to test a 12-bit linear camera like Red.
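To see why that "ought to be enough," the arithmetic, using only the numbers stated above, works out as:

```python
import math

total_stops = 13.3   # full range of the wedge, as stated above
step_stops = 1 / 3   # spacing between adjacent patches

# Number of 1/3-stop intervals on the wedge:
intervals = round(total_stops / step_stops)   # 40 intervals -> 41 patches

# A 12-bit linear encoding spans at most log2(2**12) = 12 stops between
# the lowest code and full scale, so a 13.3-stop wedge more than covers
# anything such a camera could record in a single frame.
camera_stops = math.log2(2 ** 12)

print(intervals, camera_stops)
```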

So, here's the $64 question: what is the actual dynamic range of that image?
 
I agree with you on the practical aspects.

Thanks for the link. A question: isn't the RED raw pattern without any gamma adjustment (gamma = 1)? Is that a safe assumption?
 
jbleale - I had exactly the same issue with Imatest: just putting a different gamma slope on the 16-bit data (which I know does not alter its dynamic range one little bit) would alter the answer I'd get.

Graeme
 
To me, scientific tests are inconclusive if everyone is using different methods. I have no doubt that if Graeme says he measured 11.3 stops with the wedge, then he did.

However, the F23 claims a latitude of 12 stops, the RED 11.3. Yet the F23 currently outperforms the RED by 1.5 stops of latitude in practical testing. This has been observed by different people and in Claudio's tests. Some say more.

That means one of two things: either the F23 is under-rated (and is performing at around 13.5 stops, not 12), or the RED is over-rated (and performing at around 10 stops).

Technical numbers aside, I'm hoping the RED can match the latitude of the F23 with build 15.
 
If I had an F23 here I'd be able to tell you. Please send the $250,000 cheque to me personally :)

Not only have I measured 11+ on a wedge, but seen 11+ on real world images.

Graeme
 

Hi Graeme,

Any chance of posting a frame of each?

Stephen
 
I've posted wedges before. The footage is not mine to post. That's why I was reluctant to post my statement; it's not nice for me to say something without backing it up.

Graeme
 
If I had an F23 here I'd be able to tell you. Please send the $250,000 cheque to me personally :)

Not only have I measured 11+ on a wedge, but seen 11+ on real world images.

Graeme

I'm afraid I spent all my money :) but I'll let you know my observations, as I only got my camera yesterday. Just gotta pick up a lens rental tomorrow :biggrin:
 