Varicam LT vs Red Weapon Dynamic Range testing

I'm fascinated by this result as it doesn't mimic my DSC Labs Xyla-21 tests on Epic Dragon, Weapon Dragon, or the Varicam 35. Another thing to test it seems.
 
Looks like a very well designed and thorough test. Well presented.

The only thing it leaves out, and I think this is an important factor, is that the RED footage can always be push or pull processed. That is to say that what might be "clipping" at 800ISO is not actually clipping and can be recovered from the RAW data with processing techniques.

I get, though, that this is just a direct ISO comparison test at the manufacturer's suggested baseline exposure. I'd venture a guess that when you get to the top end, the RED footage is more recoverable (more like film) than something with a baked-in curve and ISO.
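To make the push/pull point concrete, here is a toy numpy sketch (the 1/2.2 curve and the scene values are made up for illustration, not RED's actual processing math): a gain applied to the linear RAW data before the display curve can bring back highlight separation that looks clipped at the default setting.

```python
import numpy as np

# Toy illustration only -- not RED's actual pipeline. The idea: the
# exposure/ISO slider in a RAW developer is a gain applied to linear
# sensor data *before* a display curve is baked in, so values that look
# clipped at the default setting may still be separable.

def display_encode(linear, gain=1.0):
    """Apply a gain (post exposure adjustment), clip to [0, 1], then a
    simple 1/2.2 display curve."""
    return np.clip(gain * linear, 0.0, 1.0) ** (1 / 2.2)

# Linear scene values; 1.0 is nominal white, larger values are highlights
# the sensor captured beyond it.
scene = np.array([0.5, 1.0, 2.0, 3.0])

default = display_encode(scene)            # both highlights read 1.0 (look clipped)
pulled = display_encode(scene, gain=0.25)  # pulled 2 stops: they separate again
```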

Good methods though, for the type of test it is.

David
 
Well... always the same errors made... But yes, at 800 ISO it looks like the Scarlet-W has 1/2 less highlights. You should probably expose it at 2000 ISO. The underexposure test is really flawed, though.

Pat
 
Well... always the same errors made... But yes, at 800 ISO it looks like the Scarlet-W has 1/2 less highlights. You should probably expose it at 2000 ISO. The underexposure test is really flawed, though.

Pat

This.
 
Hey guys, thanks for posting this. Spent a lot of time on it, and I knew the methodologies would undoubtedly (as always) be argued.

The main thing we were trying to compare was the dynamic range under the exact same lighting conditions, keeping the settings as normal and consistent as possible across both cameras. We didn't want to accommodate each camera specifically for each exposure setup, so as to keep some sort of control, and we didn't want to adjust each camera in post to pull down or anything. We wanted to keep it controlled and present exactly what we got as we shot it.

I'm sure in a real-world shooting situation you would be adjusting your camera for each lighting condition, but this was comparing how both cameras performed at their native ISO of 800. As David said, we needed some baseline exposure. They should react similarly to the same lighting setup, should they not?

I touched upon it on the Red Facebook group earlier, but I would love to try shooting the Xyla-21 chart again. It was one of the tests that I didn't feel we did quite right, because I agree with Phil, it was very different than other results I've seen on the web.
 
Also, you may want to consider pairing DragonColor1 with RedLogFilm, as my understanding is it's a bit less contrasty and less saturated than DragonColor2 (i.e., giving an even flatter, more log-like image).
 
Thanks for chiming in Brian. I loved your test. I think your methodology was strong and well executed. My only wish (and maybe you could still do this) was that you would grade the over/under-exposed images back to proper exposure. As it is, it's hard to tell when the image starts clipping and when the noise becomes too much.

On a different note, I don't know why it's so hard to accept that another camera (besides the Alexa) performs better in some respects than the Dragon sensor. Every time the Dragon comes out the "loser" in a test like this, everyone here calls foul on the testing method! This test perfectly exhibits my experience with the Dragon and replicates the results of numerous other tests I've seen. Even the Xyla chart.

I agree with Brian's methodology: not doing special favors to either camera in processing. Dynamic range is dynamic range. The exposure and ISO values should be equal for both cameras or what's the point in comparing?
 
Was it Panasonic's Xyla? Just curious.
 
Kenneth,

Just in case you thought it was me crying foul. :) I was not. I was simply stating a fact: the images that might indeed look clipped on the Dragon might in fact not be. Your suggestion of grading the clipped images back for comparison is a good one. I don't think you would have the same luxury with the clipping on the Panasonic camera (but I could be wrong).

It's part of a greater education. If people think the RED camera is clipped (when in fact it might only be the settings that create a clipped image), then that is kinda important information.

It is more important to know where the stoplights were.

I agree (and already stated that I think this is a sound test), just that you do actually have more flexibility in the R3D format, meaning all things are actually not equal.

David
 
Thanks for doing these tests, Brian. They are solid. I agree that there needs to be a baseline. What I discovered is that the baseline is only a starting point, because the LUT (look) on the RAW file only tells part of the story. There is a lot of exposure flexibility in the R3D format, meaning that you can choose to rate this sensor natively at a variety of ISOs with excellent results (which I am sure you have done).

But this is a good straight-up, side-by-side test for people who want to rate and use the camera that way.

David



 
I don't know why it's so hard to accept that another camera (besides the Alexa) performs better in some respects than the Dragon sensor. Every time the Dragon comes out the "loser" in a test like this, everyone here calls foul on the testing method!

Because it's a RAW-shooting camera that needs a curve, and perhaps also because it's not an 800 ISO camera?
The best bet is to try to compensate for the over/under exposure, and then you will have your results. The methodology is OK, but not the development of the rushes. The good news is that he still has his R3Ds.
 
Because it's a RAW-shooting camera that needs a curve, and perhaps also because it's not an 800 ISO camera?
The best bet is to try to compensate for the over/under exposure, and then you will have your results. The methodology is OK, but not the development of the rushes. The good news is that he still has his R3Ds.

I'm curious. What ISO do you think you would rate the Scarlet W at?
 
I'm curious. What ISO do you think you would rate the Scarlet W at?

I would do my own ISO tests. I'll have a Magnesium, so probably close to 2000 if there could be some clipping problems. But shooting Dragon for the past two years hasn't given me much trouble with highlight clipping, even with the LLO installed, at 800 ISO.
 
Was it Panasonic's Xyla? Just curious.

It was. If you know anywhere in LA I can check out another Xyla chart, let me know!

Because it's a RAW-shooting camera that needs a curve, and perhaps also because it's not an 800 ISO camera?
The best bet is to try to compensate for the over/under exposure, and then you will have your results. The methodology is OK, but not the development of the rushes. The good news is that he still has his R3Ds.

I wouldn't mind revisiting it at all, thanks for the info.

How do you suggest I develop them to give an accurate portrayal of the dynamic range for the Scarlet-W? Admittedly, this is where I start getting in over my head: how would you really compare the two? Would it be fair to compare RAW vs. Log? How would I go about linearizing the RAW from the R3D, instead of applying the RedLogFilm look, to get at the RAW sensor data?

I thought that since the Varicam was shot in V-Log, comparing both cameras' log looks would be the most accurate comparison between the two. I know each manufacturer's log is different, but most people use RedLogFilm as a grading starting point, and the same goes for V-Log.
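On the linearization question, a generic sketch of the idea (the curve below is a made-up log encoding, not the published RedLogFilm or V-Log transfer functions): any log encode is invertible, so footage from both cameras can be mapped back into the same linear-light space before comparing.

```python
import numpy as np

# Hypothetical log curve for illustration only -- NOT the actual
# RedLogFilm or V-Log math. The point is just that a log encode is
# invertible, so both cameras' footage can be decoded back to linear
# light and compared in the same space.

MID_GREY = 0.18   # made-up mid-grey anchor
TOE = 0.01        # made-up toe offset

def log_encode(linear):
    """Map linear light to made-up log code values."""
    return np.log2(linear / MID_GREY + TOE)

def log_decode(code):
    """Exact inverse of log_encode: code values back to linear light."""
    return MID_GREY * (2.0 ** code - TOE)

linear = np.array([0.01, 0.18, 1.0, 4.0])
roundtrip = log_decode(log_encode(linear))  # recovers the linear values
```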
 
I would do my own ISO tests. I'll have a Magnesium, so probably close to 2000 if there could be some clipping problems. But shooting Dragon for the past two years hasn't given me much trouble with highlight clipping, even with the LLO installed, at 800 ISO.

So, are you suggesting that when rated at 2000 ISO, the highlights won't clip as soon? But wouldn't that just mean redistributing your dynamic range, so that the shadows would just get noisier more quickly? It makes sense in some cases. For instance, the Alexa holds highlights much better than the Weapon, but it's not good with shadows. To get similar results to the Alexa, you could rate the Weapon at 2000 ISO; that would mean more highlight retention, and your shadows would probably act similarly too. But in the test Brian did, the Varicam LT had better highlight and shadow retention, so unless you can work some magic with a flatter gamma curve, I don't see how you can boost your latitude at both ends of the spectrum.
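The redistribution described above is just arithmetic, under the assumption that the sensor's total dynamic range is fixed and re-rating a RAW camera's ISO only moves where middle grey sits:

```python
import math

# Back-of-envelope sketch, assuming total sensor dynamic range is fixed
# and that re-rating a RAW camera's ISO only moves the middle-grey
# anchor: rating the same sensor at a higher ISO trades shadow latitude
# for highlight headroom, stop for stop.

def stops_shifted(new_iso, base_iso=800):
    """Stops of headroom moved from the shadows to the highlights."""
    return math.log2(new_iso / base_iso)

shift = stops_shifted(2000)  # re-rating 800 -> 2000 shifts ~1.32 stops
```

So rating at 1600 would move exactly one stop, and 2000 about a third of a stop more, without creating any new latitude overall.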
 
So, are you suggesting that when rated at 2000 ISO, the highlights won't clip as soon? But wouldn't that just mean redistributing your dynamic range, so that the shadows would just get noisier more quickly? It makes sense in some cases. For instance, the Alexa holds highlights much better than the Weapon, but it's not good with shadows. To get similar results to the Alexa, you could rate the Weapon at 2000 ISO; that would mean more highlight retention, and your shadows would probably act similarly too. But in the test Brian did, the Varicam LT had better highlight and shadow retention, so unless you can work some magic with a flatter gamma curve, I don't see how you can boost your latitude at both ends of the spectrum.


Exactly, Kenneth. Weapon is better in the shadows and Alexa in the highlights at 800 ISO; just set the Weapon to 1280/1600 and everything will be the same or better for the Weapon.
So do you think the Varicam LT would do better than the Alexa and Weapon or Scarlet-W? I doubt it! I want to see the original files before judging; then we can discuss.

Brian, please, can you let us download the original files?
 
So, are you suggesting that when rated at 2000 ISO, the highlights won't clip as soon? But wouldn't that just mean redistributing your dynamic range, so that the shadows would just get noisier more quickly? It makes sense in some cases. For instance, the Alexa holds highlights much better than the Weapon, but it's not good with shadows. To get similar results to the Alexa, you could rate the Weapon at 2000 ISO; that would mean more highlight retention, and your shadows would probably act similarly too. But in the test Brian did, the Varicam LT had better highlight and shadow retention, so unless you can work some magic with a flatter gamma curve, I don't see how you can boost your latitude at both ends of the spectrum.

Again, because he was using DC2, I think his curve may have been more contrasty... and this may have affected DR at the extremes. I wonder what the results would have been on DC1.
 