

RED KOMODO MONOCHROME TEST

rand thompson
Well-known member · Joined Aug 9, 2011
RED KOMODO MONOCHROME w/ LEICA M 0.8 PRIME 50mm NOCTILUX 0.95 NIGHT TEST



By Keith Morton


 
I couldn't confirm the previously posted video actually used a RED Komodo Monochrome camera so I deleted it.
 
Howdy Hill | Red Komodo Monochrome

by
Moviesauce





Shot on Red Komodo Monochrome with Sigma Art 24-35mm
 
The video is of very poor quality. There is obnoxious banding.


A byproduct of recompression/re-encoding via YouTube, and likely the export settings.

 


Phil, I'm not seeing obnoxious banding in the anamorphic video as seen in the spherical monochrome video, but you're comparing a postage stamp sized anamorphic crop with severely reduced bit rate and excessive noise. But for sure, whoever uploaded the video wasn't doing the camera any justice with whatever settings they were using. YouTube compression is why I almost always provide a link to download the original video nowadays.

Screenshot 2023-03-30 at 11.22.10 PM.png
 
Yep. Just downloaded your 10-bit 4:2:2 H.265 and it looks fine.

The YouTube HDR 10 encode looks more or less fine, but the SDR encode shows posterization/banding all over the frame as well as a very, very different color.

Moderately known frustrations in the "tube space". Worth investigating the metadata injector method for YouTube in particular, but they truly need to change that workflow as it's awful.

For cinema and streaming distribution I create several masters for HDR and SDR, though there is a lot of effort going into the "one master" concept. In practice, for the last decade that has remained a pipe dream beyond doing a simple conversion and seeing where things land. Even when converting an HDR master into Rec. 709 space, there are nearly always tweaks that must be made.

The scene-referred buildup does work in practice, but you need quality output transforms for that, which is what RED's IPP2 workflow represents, as well as the continuing journey of ACES. I generally have two or three major transforms I've built to make life easier. All my work for several years has been mastered in both SDR and HDR, with encoding done either by me or by the streaming entity or studio, depending on factors, requirements, and specifications.

We're already in the scary world of H.266 and AV1 without any real hardware support or true code optimization, somewhat similar to where we were with VP9 and H.265 a long while back, before mature encoders and decoders existed. I have one client with a proprietary codec that I can't physically encode to; in that case I submit 16-, 12-, or 10-bit masters.
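For a sense of what an HDR-to-SDR conversion involves at its simplest, here is a minimal NumPy sketch. The PQ constants are from SMPTE ST 2084; the SDR step is a deliberately naive toy (a hard clip at an assumed 100-nit peak plus gamma 2.4 encode), not anyone's actual output transform, which would roll off highlights and handle color as well:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(v):
    """Decode a PQ-encoded signal (0..1) to absolute luminance in nits."""
    p = np.power(np.maximum(np.asarray(v, dtype=float), 0.0), 1.0 / M2)
    y = np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)
    return 10000.0 * y

def nits_to_sdr(nits, peak_nits=100.0, gamma=2.4):
    """Naive SDR conversion: hard-clip at an assumed SDR peak, then
    gamma-encode. Real output transforms roll off highlights instead."""
    linear = np.clip(np.asarray(nits, dtype=float) / peak_nits, 0.0, 1.0)
    return np.power(linear, 1.0 / gamma)
```

Even this toy shows why "one master" is hard: everything above the assumed SDR peak is simply thrown away, which is exactly the kind of thing that needs per-shot tweaks in a real grade.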
 
I should add, I suspect you're viewing in an HDR-enabled environment; flip over to SDR and see how things look. The Monochrome piece is only in SDR, which is why you'll see that banding in either viewing environment.
 
A lot of people will just use the export settings they feel are warranted based on the amount of time they are willing to invest in the upload process, the importance of the video, the size of the video, etc. Usually that means either H.264 or H.265. These two formats don't require as much upload time as higher-quality formats like ProRes or DNxHR, but they don't offer the same image quality either. I personally use DNxHR HQ, SQ, or LB (since I am on Windows), depending on the size and importance of the video.
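To put rough numbers on that upload-time tradeoff, file size is just bitrate times duration. The bitrates below are illustrative ballpark figures I'm assuming for a UHD clip, not official codec specs:

```python
def file_size_gb(bitrate_mbps, duration_s):
    """File size in GB: megabits/s * seconds / 8 bits-per-byte / 1000 MB-per-GB."""
    return bitrate_mbps * duration_s / 8 / 1000

# Assumed ballpark bitrates for a 5-minute UHD clip:
clip_s = 5 * 60
h264 = file_size_gb(50, clip_s)       # ~50 Mb/s H.264 upload -> 1.875 GB
dnxhr_hq = file_size_gb(700, clip_s)  # ~700 Mb/s DNxHR HQ master -> 26.25 GB
```

Roughly a 14x difference in upload size for the same five minutes, which is why most people settle for H.264/H.265 despite the quality hit.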

However, you also have to factor in the pre-export workflow settings, since they also affect how the video looks after exporting and re-encoding on YouTube. Are they using a scene-referred workflow like RWG/Log3G10, ACES, or DWG/DI, or are they grading a transformed Rec. 709 image?




DNxHR Settings

1.png




Are you using "Proxy" or "Optimized Media" settings for export, or proxies for playback?


2.png


10.png



9.png



3.png


4.png




The "Decode Quality" and "Bit Depth" of the project also make a difference.

5.png



6.png


Lastly, how are you processing the images in the rendered video? What cube-size LUTs are you using, 33 or 64/65, and what is their bit depth? I have some LUTs from Mitch Bogdanowicz that are 65-point and 12-bit; I think the LUTs produced by Resolve's "Generate LUT" function are only 8-bit. How many LUTs are you using? Which 3D LUT interpolation method are you using with them, trilinear or tetrahedral? And are you using LUTs or transforms to process the images, with the pluses and minuses of each?
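For what trilinear interpolation actually does, here is a minimal NumPy sketch, assuming the common layout where `lut[r, g, b]` holds the output RGB triple (Resolve and other tools implement this natively; tetrahedral interpolation instead splits each lattice cell into tetrahedra and blends 4 points rather than 8):

```python
import numpy as np

def apply_lut_trilinear(rgb, lut):
    """Look up an RGB triple (values in [0, 1]) in an N x N x N x 3 LUT by
    blending the 8 lattice points surrounding the input position."""
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)  # position in lattice coordinates
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                            # fractional position in the cell
    out = np.zeros(3)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0])
                     * (f[1] if dg else 1 - f[1])
                     * (f[2] if db else 1 - f[2]))
                out += w * lut[hi[0] if dr else lo[0],
                               hi[1] if dg else lo[1],
                               hi[2] if db else lo[2]]
    return out
```

The cube size matters because everything between lattice points is a straight-line blend: a 33-point cube samples a steep region of the transform far more coarsely than a 65-point one, and that coarseness plus low bit depth is one place banding can creep in before YouTube ever touches the file.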


LUTs vs Transforms
LUTs vs Transforms: How They're Different and Why It Matters (frame.io)


Screenshot-9202.png


7.png



There are probably more things than I mentioned above that can affect the final rendered video uploaded to YouTube, Vimeo, etc. So really, just do the best you can with the amount of time you are willing to invest in the video and its upload process, to show what you wanted to show and say what you wanted to say.


You can also use SendGB to share a file of 5 GB or smaller for free if you want to give your audience the video before YouTube or Vimeo compression and re-encoding.



SENDGB link
SendGB | Send Large Files | Free file transfer
 
What Difference Composites Reveal About Your Compressions

By

Jesse Koester


 
Banding is probably my least favourite digital artifact.

Funny that it gets mentioned here, as it's always one of the first things that comes to mind when thinking about using black and white.

I love those RED monochrome sensors and the advantages they have over converting from colour, but I've always thought it would be a challenge to preserve the extra subtle shades and tonalities they can capture all the way through to the final images others could view.

I remember doing direct comparison tests of different 16mm film stocks and seeing even in the B&W version of those how much more apparent the gradations of light and shadow were and how much more attention to the lighting and exposure would be required to get a deliberate result.

It's the same now with the more capable digital sensors, but on top of it we've got the compression issues that weren't there with the physical projection of light through actual film.

Using dithering or blurring of the image, then higher bitrates for the deliverable seems to be the most common way of trying to mitigate the banding effect.
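The dither-then-quantize idea is easy to demonstrate. A toy NumPy sketch, assuming one quantization step of triangular (TPDF) noise, which is a common choice; real encoders and NLEs have their own dithering implementations:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits, dither=False):
    """Quantize values in [0, 1] to `bits` of precision, optionally adding
    one quantization step of triangular (TPDF) noise first so that a smooth
    gradient lands on varying levels instead of stepping (banding)."""
    levels = (1 << bits) - 1
    if dither:
        noise = rng.random(x.shape) - rng.random(x.shape)  # TPDF in (-1, 1)
        x = x + noise / levels
    return np.clip(np.round(x * levels), 0, levels) / levels

# A very shallow ramp -- the kind of gradient that bands -- taken to 8 bits:
grad = np.linspace(0.40, 0.41, 2000)
plain = quantize(grad, 8)                  # collapses to a few hard steps
dithered = quantize(grad, 8, dither=True)  # fine noise, but the ramp survives
```

The undithered version collapses 2000 distinct values into a handful of flat bands; the dithered one trades those bands for grain-like noise while keeping the average level of the ramp intact, which is exactly the tradeoff described above.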

It's also possible to adjust the actual grade to make the content more banding-proof, but going so far as to change the actual look of the content just so it fits within the limits of the delivery system seems a bit much.

Makes me wonder if shooting for large physical prints of still frames would be the best way to see what the sensor can actually capture.
 
Looks great. I just can’t see myself shooting monochromatic video. Mind you, I often convert my still images to B&W.

Han,

Yeah, me neither. I imagine if you are or were a photographer who has shot with monochrome cameras in the past, like those from Leica and others, and you now wanted to make the move over to video, this and other RED Monochrome cameras would be how you would do that in the highest possible quality on the market. But I'm sure other cinematographers would be interested in using a monochrome camera whenever a project might require it.
 
Here's another example RED Monochrome video. I can't confirm that it was shot with a RED Monochrome camera, nor can I confirm that it wasn't, so take this video with a grain of salt.


Chicken

By

Jed McKenna



shot on monochrome komodo
canon 28-135mm
canon 75-300mm
 
RED Komodo Monochrome: The Ultimate Black and White Cinema Camera

by Lensrentals



 
Looks great. I just can’t see myself shooting monochromatic video. Mind you, I often convert my still images to B&W.

I can't see myself purchasing one. However, I once shot a project in 4:3 B&W on standard Komodo-- with a monochrome LUT applied in camera and the sides of the production monitor taped off to avoid showing the unused portion of the image on set. And so I was quite surprised to see the cut in full color 16:9! I guess no one had told the editor-- who also thought it was a creative choice to occasionally see light stands at the edge of the frame!
So that's an argument for me for renting a monochrome camera next time...and I also wouldn't mind if Red added a 4:3 mode.
 
RED Full Spectrum Monochrome Komodo Review


By
Scott Balkum



Know The World - A Message We All Need To Hear - RED KOMODO 6K Monochrome



By Scott Balkum


 
SAMPLE FOOTAGE: Red Komodo 6K Monochrome (Standard + Full Spectrum)


By
LensProToGo


 