
Epic Compression Test

Yes, thank you for your good info, Marc!

I figured the test results would be helpful for others, but I didn't know if the results were good, bad, or expected.
 
This is pretty much what we've told our post clients. This is not a Red problem -- it's the same with any JPEG2000-based imagery. I think 5:1 is a realistic limit for feature projects.

I've had a gut feeling for a few years that this is kind of a "magic ratio" for compression. This is also the compression rate of standard-def Digital Betacam, which (for SD) looks very good and holds up for many, many generations. HDCam-SR also uses about 5:1 compression, and likewise works fine for broadcast. In audio, 320kbps sound files sound very good, and are about 1/4th the size of an uncompressed WAV file.
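For the audio comparison, the arithmetic checks out. A quick sketch, assuming CD-quality source audio (44.1 kHz, 16-bit, stereo):

```python
# Quick arithmetic behind the "about 1/4th the size" comparison.
# Assumes CD-quality PCM source: 44.1 kHz sample rate, 16-bit, 2 channels.

wav_kbps = 44_100 * 16 * 2 / 1000   # uncompressed PCM bitrate = 1411.2 kbps
mp3_kbps = 320                       # the high-quality lossy bitrate from the post

ratio = wav_kbps / mp3_kbps          # about 4.4:1, close to the ~5:1 "magic ratio"
fraction = mp3_kbps / wav_kbps       # about 0.23, i.e. roughly one quarter

print(f"uncompressed: {wav_kbps:.1f} kbps")
print(f"compression ratio: {ratio:.1f}:1")
print(f"relative size: {fraction:.2f}")
```

So 320 kbps against a CD-rate WAV lands right around the same ~5:1 neighbourhood as the video examples.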

Any more than that, you can see, hear, and/or measure the compression artifacts. They aren't always deal-breakers, but they are there.

It's a little more complicated than that.

Firstly, Redcode is JPEG2000-based, but it is compressing a RAW image, not a complete RGB one, so any artifacts will affect the debayering process.

However, Red (Graeme and Jim on the patent app.) are rearranging the raw data prior to encoding to make it more compressor-friendly, which may offset any problem above.

Either way, the raw gets a fair old whack of processing before and after compression..
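To illustrate the sort of rearrangement being described, here's a generic sketch. This is not Red's actual method, and the RGGB layout is an assumption; splitting the mosaic into four same-colour planes is just one common way to make Bayer data friendlier to a generic compressor:

```python
import numpy as np

def split_bayer_planes(raw):
    """Split an RGGB Bayer mosaic into four same-colour planes.

    Grouping same-colour samples together removes the pixel-to-pixel
    colour alternation that generic image compressors handle poorly.
    """
    r  = raw[0::2, 0::2]   # red sites
    g1 = raw[0::2, 1::2]   # green sites on red rows
    g2 = raw[1::2, 0::2]   # green sites on blue rows
    b  = raw[1::2, 1::2]   # blue sites
    return r, g1, g2, b

# Tiny 4x4 mosaic of 16-bit sensor values, just to show the split
raw = np.arange(16, dtype=np.uint16).reshape(4, 4)
planes = split_bayer_planes(raw)
# Each plane is 2x2, and together they hold every original sample exactly once.
```

Whatever rearrangement is actually used, the point stands: compression artifacts land in this pre-debayer domain and then feed into the debayer.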

Also, nice test.. Redcode hates tree bark, or the other way around. Good job 5K has nearly double the pixels that 4K has (I believe the factor is 2.25), as a high-quality downsample works wonders.
 
I agree that there's a lot going on in the capturing and debayering process, but I think my observation about compression in general is pretty accurate.

In addition to tree bark (or any monochromatic area with hundreds of tiny patches of light and shadow), another real torture test is a football stadium, looking at the spectators in the bleachers. All those tiny little dots of color, moving randomly... this is very tough for many compression engines. I've seen broadcast HD just give up and go "blocky" in situations like this.

As it is, it's a miracle that Red's R3D encoding works as well as it does. The amount of data being processed in real-time is really, really intense. To me, it's dicey to push that processing harder by adding in double or triple the amount of compression.
 

I wasn't picking fault, merely responding.. your general observations are pretty good. I'm just making the additional point that, usually, when you decode (or compare, in this case) a lossy or visually lossless image format, it's in the context of RGB (or some other colour triplets). With Redcode that's not quite what happens: you start with a Bayer pattern that 1) gets rearranged prior to compression (and whatever else), 2) has lossy compression applied, and 3) later gets decompressed and debayered (and whatever else, etc.). So with Redcode you have many more parameters to balance than normal. Debayering may smooth out compression artifacts, or at other times amplify them to some degree, so understanding the role each plays is quite important. Personally, I'd like to have different debayering options rather than a single debayer that fits everything.. but I'm greedy, I guess..

.. I'm just emphasizing that there's a bit of apples and oranges going on when references to other image formats are made.

Slightly off topic.. I've tested rearranging and compressing 16-bit RGB PNG files in the past (and recently, just for fun, raw Bayer-pattern data in various formats). PNG compares very well with J2K in terms of compression, though not so well in compression speed. So do some of the other forms of lossless JPEG, if you do the rearranging bit well, and they're faster to encode/decode than J2K.

I noticed FLAC still rules the lossless audio compression arena (for non-float samples) compared to Microsoft's wavelet-based lossless media format. Each data type will have a lossy compression sweet spot and a preferred compression algorithm.
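That kind of experiment is easy to reproduce roughly at home. The sketch below uses zlib (the same DEFLATE compression PNG uses internally) on synthetic data, so the sizes are only illustrative; which byte layout compresses better depends heavily on the content:

```python
import zlib
import numpy as np

# Synthetic "sensor" data: a smooth ramp plus a fixed per-colour offset,
# so samples of the same colour resemble their neighbours.
h, w = 64, 64
raw = np.add.outer(np.arange(h), np.arange(w)).astype(np.uint16)
raw[0::2, 0::2] += 1000   # red sites
raw[1::2, 1::2] += 3000   # blue sites (green sites stay on the ramp)

# Compress the mosaic as-is (colours interleaved pixel by pixel)...
interleaved = zlib.compress(raw.tobytes(), 9)

# ...and again after rearranging into four same-colour planes.
planes = [raw[0::2, 0::2], raw[0::2, 1::2], raw[1::2, 0::2], raw[1::2, 1::2]]
rearranged = zlib.compress(b"".join(p.tobytes() for p in planes), 9)

print(f"interleaved: {len(interleaved)} bytes, rearranged: {len(rearranged)} bytes")
```

Both streams round-trip losslessly; the interesting part is comparing the two sizes on your own material.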
 
Just to update everyone, I plan on installing the new firmware very soon, recalibrating, and redoing the same tests, perhaps with a second shot of a pan.

I have a new question, though, and I am curious if anyone else has experienced this. We did a small studio shoot yesterday, and strangely, the first two shots we took had some blocky compression in the black duvetyne background. We planned on crushing out this background anyway, but what I found strange was that the remaining 10-12 shots all looked much, much better in those dark areas (the blocky compression looked more like an even layer of film grain). It's bizarre, because the camera was locked down, never changing angles, never turned off, and settings were never changed between these shots.

I know you might be thinking it's the camera warming up, but we had the camera on for at least 30-40 minutes and had even run a couple of test shots. Still, could it be the camera was still warming up? Has anyone had this happen?

Also, I must sincerely apologize, but because of a strict non-disclosure agreement, I cannot post any frames for you. I know that's frustrating (very frustrating for me!), but I do think the description I gave above is accurate.

Here is the info:

Scene: Brightly lit objects and people in front of a black screen. I'm limited in what else I can say.

Lens: Zeiss CP.2 35mm, f/4 to f/5 aperture.

Camera settings: We were rolling slomo, 4K HD, 120fps (29.97 timebase), 1/240 shutter, 800 ISO, 10:1 compression ratio (that's the max at that framerate); all other settings to their defaults.

Any insights you might have would be greatly appreciated!

Thank you very much!

Adam
 
What happens if you compress less than 10:1? Can you not do 5:1 at this framerate?
 
I've posted small crops of the black duvetyne background where you can see the differences in the camera's compression, so hopefully this will give you enough of an idea of what I'm seeing.

Here it is: http://www.filesanywhere.com/fs/v.aspx?v=8a70638e5c5f73b4a2ab

These two clips were shot within 15 minutes of each other. The first crop is how the first two clips look, and the second crop is how the remaining 10 clips look. I picked clips 1 and 6 because they were the most similar in terms of content.

Again, the camera had been powered up for at least 30 minutes before shooting, and we had shot a couple of short test shots. Also, none of the camera settings, focus, or camera position and framing changed between these shots.

Thank you for your help!

Adam
 