

Colour Management

RSR is definitely more reasonable... and it's quite a flexible product (though I'm not sure if it works with NLEs... I'm in contact with them now about that).

Still, though, by the time you purchase a probe and the necessary components, you've paid twice as much to calibrate your display as for the NLE suite itself...

Am I asking too much for this to be part of an NLE? Or, at least, for software that's in the hundreds per seat, instead of thousands?
 
What about the new DeckLink HDMI card (Intensity)? Will this give a color/frame-accurate monitoring solution for FCP/AE using LCD computer displays, à la the BMD DeckLink/HDLink option?

I guess it depends on what "accurate" means to you. :)

I don't know about the HDMI card... but "accurate monitoring" and "LCD display" don't usually go together. The only two companies I know that are really making serious attempts at creating color accurate LCDs are eCinema Systems (www.ecinemasystems.com) and Cine-tal (www.cine-tal.com).

Lucas
-----
ASSIMILATE, Inc.
Los Angeles
 
Check out the Prolost blog for Stu's thoughts on mastering in your NLE instead of After Effects.

http://prolost.blogspot.com/

32-bit rendering is definitely a good reason to go to AE for mastering. But that still doesn't negate the need for accurate colour in AE (and Final Cut, PPro, Combustion, etc.) all the way through the pipeline.

I've had long discussions with a friend from a local DI facility... for what I'm asking for (a 90% solution), a 1D LUT system like RSR's EqualEyes looks like the best bet... I'll quit ranting on this subject since a good solution does exist... but I'll sit quietly in my corner and mumble about how many few-thousand-dollar purchases I have to make to get a complete workflow.

I suppose it wouldn't be as much of a complaint if I didn't keep agreeing to do these great projects I'm working on that are mostly for the love and little for the money. My choice!
 
As a person who comes from a bit more of the indie world, and is in the middle of learning Scratch, I do have to say that the art form of real-time compositing and color grading is just that: an art form. Some people can learn it and some can't. When you get into the arena of dealing with clients, they'll ask you to pull a key on a girl's body color, darken it up, and then take out the grain on her skin but leave it in for the background. Then hit play and have it render in real time while they watch... lol, you're playing several different roles: artist, client manager, and IT engineer. That's why a good colorist is paid commensurately.

In regard to color management it's the same game: film-outs, DCI packages, YouTube video releases. They're all different, and yes, whether you're shooting wedding videos or a spot for Nike, you will have a client ask you why their DVD looks so different from their web copy.

Right now RSR is what we're looking at, just because of the price point / level of service. It'll at least get you in the ballpark and in the game.
 
Let's say you have an Eizo ColorEdge monitor (I believe that's their top-of-the-line series) and plug it into an AJA box using an HD-SDI-to-DVI converter. Would you have a colour-accurate monitor or not? If this monitor is as good as it's supposed to be, what's the difference, colour-wise, from one from eCinema and the like?
 
An Eizo (a Cinema Display, or another high-end LCD) will give you reasonably accurate colours... but the question is, accurate to what? NTSC is different from Rec 709 (HD), which is different from film, etc. Each is unique in how it stores and responds to colour.

The solution you mention - an AJA HD-SDI-to-DVI converter - will likely have settings for log footage, Rec 709, etc. It is a reasonable way to go, but it has limitations in usefulness. E.g., how do you calibrate for your particular display? Also, most of these devices use 1D LUTs, which will get you close but can't accurately represent mediums that use a different display technology (e.g. LCD vs. film).
 
I know that it's a bit off-topic, but I have a certain issue with an upcoming project to be shot with the HVX-200. We will be editing and colour-correcting on a Mac Pro, probably with Final Cut for both. We can afford an Eizo ColorEdge monitor for viewing purposes through the second DVI output of the NVIDIA card (don't remember which). A friend has a GretagMacbeth probe, so we'll calibrate it as well. So the question is, since this will be for TV only (satellite HD), is there any need to get a Multibridge card (again, connecting the monitor through the card's DVI output) to be colour accurate, or not?
 
Does anyone know how much a Truelight system costs?

The old adage... "If you have to ask..."

since this will be for TV only (satellite HD), is there any need to get a Multibridge card (again, connecting the monitor through the card's DVI output) to be colour accurate, or not?

If you're choosing to do colour correction in your NLE, you'll need some way of mapping what that display can do to Rec 709 (HD) colour space. Rising Sun Research has EqualEyes, which is a software-only solution. Other 1D-LUT options are the Multibridge or the Matrox MXO. My suggestion would be an Apple Cinema Display plus one of these solutions, which should run you about the same as the Eizo monitor.
 
Can anyone tell me the difference between 1D, 2D & 3D LUTs? Also, what do 1-light and 2-light passes mean in the digital world? Are they related?
DF
 
LUT = look-up table.
http://en.wikipedia.org/wiki/Lookup_table

(From what I understand, correct me if I'm wrong.)
For a 1-D LUT, you take the input number, find that number in the table, and use the associated number. So you might have something like...
00--> 00
01--> 02
02--> 04
03--> 06
04--> 08
05--> 10
(etc.)

A LUT like that would make the whole image brighter. You can apply a separate 1-D LUT to each of the red, green, and blue channels, to do things like adjust the white point/balance of the monitor, change the gamma, etc.
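As a toy sketch of the idea (function names are made up for illustration; real systems operate on whole frames, and often in hardware), the doubling table above could be applied per channel like this:

```python
# Sketch: applying per-channel 1-D LUTs to one 8-bit RGB pixel.
# The doubling LUT matches the toy table in the post, clamped at 255.

def build_doubling_lut():
    """256-entry table mapping each input code value to twice its value."""
    return [min(v * 2, 255) for v in range(256)]

def apply_1d_lut(pixel, luts):
    """pixel: an (r, g, b) tuple; luts: one 256-entry table per channel."""
    return tuple(lut[v] for v, lut in zip(pixel, luts))

lut = build_doubling_lut()
print(apply_1d_lut((3, 5, 200), (lut, lut, lut)))  # (6, 10, 255)
```

Note that each channel is looked up independently - which is exactly why a 1-D LUT can't express adjustments where the output red depends on the input green or blue.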

A 2-D LUT is like a 1-D LUT, except in two dimensions. You might visualize it like a chess board, with 8 x 8 entries. You would number/index the board along its width and height. Each square on the chess board would contain an output value.

A 3-D LUT extends this to three dimensions. You can visualize it as a cube, indexed by the three input channels; each small cube within the whole cube contains an output value.

The 3-D LUT is the most sophisticated and powerful of the three. It can describe transformations that 1-D and 2-D LUTs can't (e.g. the colour response of film).

In practical implementations, 3-D LUTs run into a problem. For a table indexed by 8-bit RGB with 16-bit entries, the size per output channel is:
256 x 256 x 256 x 2 = 33,554,432 bytes (32 MB)

A table that big won't fit into the CPU's cache, so it has to live in RAM. The problem with that is that accessing RAM can be very slow - it can take several hundred CPU cycles. So to get around this, people use fewer bits of the input to index the LUT, i.e. a coarser grid. This reduces the accuracy/precision of the LUT. There are ways of squeezing a little more accuracy/precision out of a coarse 3-D LUT. One of them is trilinear interpolation.
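To make both points concrete, here's a toy Python sketch: the size arithmetic from above, plus trilinear interpolation over a coarse identity LUT (the 17-nodes-per-axis grid and the function names are illustrative choices, not from any real product):

```python
# Sketch: 3-D LUT size arithmetic and trilinear interpolation.

SIZE_FULL = 256 ** 3 * 2          # full-resolution LUT, 16-bit entries
assert SIZE_FULL == 33_554_432    # the 32 MB figure, per output channel

N = 17  # nodes per axis - a common size for coarse hardware 3-D LUTs

def identity_lut(n=N):
    """n^3 grid whose node (i, j, k) stores that node's own RGB value."""
    step = 1.0 / (n - 1)
    return [[[(i * step, j * step, k * step)
              for k in range(n)] for j in range(n)] for i in range(n)]

def trilinear(lut, r, g, b, n=N):
    """Look up (r, g, b) in [0, 1]^3 by blending the 8 surrounding nodes."""
    def split(x):
        pos = x * (n - 1)
        i = min(int(pos), n - 2)   # lower node index, clamped at the edge
        return i, pos - i          # index and fractional position
    (i, fr), (j, fg), (k, fb) = split(r), split(g), split(b)
    out = [0.0, 0.0, 0.0]
    for di, wi in ((0, 1 - fr), (1, fr)):
        for dj, wj in ((0, 1 - fg), (1, fg)):
            for dk, wk in ((0, 1 - fb), (1, fb)):
                node = lut[i + di][j + dj][k + dk]
                w = wi * wj * wk
                for c in range(3):
                    out[c] += w * node[c]
    return tuple(out)

lut = identity_lut()
print(trilinear(lut, 0.25, 0.5, 0.9))  # ~ (0.25, 0.5, 0.9)
```

With an identity LUT the interpolation reproduces the input; with a real colour transform, accuracy depends on how smoothly the transform varies between the grid nodes.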
 
In practical implementations, 3-D LUTs run into a problem. For a table indexed by 8-bit RGB with 16-bit entries, the size per output channel is:
256 x 256 x 256 x 2 = 33,554,432 bytes (32 MB)

A table that big won't fit into the CPU's cache, so it has to live in RAM. The problem with that is that accessing RAM can be very slow - it can take several hundred CPU cycles. So to get around this, people use fewer bits of the input to index the LUT, i.e. a coarser grid. This reduces the accuracy/precision of the LUT. There are ways of squeezing a little more accuracy/precision out of a coarse 3-D LUT. One of them is trilinear interpolation.

I don't know if I remember right, but wasn't there a reason the DCI specs didn't use ICC profiles - that it would take too much processing power before the image gets sent to the projector?
 
I don't know if I remember right, but wasn't there a reason the DCI specs didn't use ICC profiles - that it would take too much processing power before the image gets sent to the projector?

Hmm... this could be solved easily by creating a copy of the media with the profile already applied, instead of trying to apply it to a "fresh" copy in real time. Perhaps they didn't want to introduce that sort of complexity.
 
I don't believe ICC profiles are necessary since the target is always DCI's X'Y'Z' space. Since there is only one target (as opposed to multiple possible targets/inputs), ICC profiles aren't necessary.

It is up to the playback system to map from X'Y'Z' space to the display device's color gamut. It can use whatever system it wants for that.
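As a rough illustration of what that playback-side mapping involves (assuming the usual DCI constants - gamma 2.6 and the 52.37 normalizing value - and the standard XYZ-to-Rec.709 matrix; real systems also do gamut clipping and an output transfer function, both omitted here):

```python
# Sketch: decoding DCI 12-bit X'Y'Z' to linear XYZ, then mapping
# into linear Rec.709 RGB. Constants assume the common DCI encoding:
# code value = 4095 * (X / 52.37)^(1/2.6), with white at 48 cd/m^2.

M_XYZ_TO_709 = (
    ( 3.2406, -1.5372, -0.4986),
    (-0.9689,  1.8758,  0.0415),
    ( 0.0557, -0.2040,  1.0570),
)

def decode_dci(cv):
    """12-bit X'Y'Z' code value -> luminance in cd/m^2 (white ~ 48)."""
    return 52.37 * (cv / 4095.0) ** 2.6

def xyz_to_709_linear(x, y, z):
    """Relative XYZ (Y = 1.0 at white) -> linear Rec.709 RGB."""
    return tuple(m[0] * x + m[1] * y + m[2] * z for m in M_XYZ_TO_709)

# Decode a code value on each channel, normalize against the 48 cd/m^2
# white level, then matrix into Rec.709 primaries.
xyz = tuple(decode_dci(2048) / 48.0 for _ in range(3))
rgb = xyz_to_709_linear(*xyz)
```

The point of the discussion stands either way: nothing in the DCI package itself tells the playback system *how* to do this mapping; each device maker chooses their own method.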
 
I don't believe ICC profiles are necessary since the target is always DCI's X'Y'Z' space. Since there is only one target (as opposed to multiple possible targets/inputs), ICC profiles aren't necessary.

It is up to the playback system to map from X'Y'Z' space to the display device's color gamut. It can use whatever system it wants for that.

Well, sure, you don't need to embed an ICC profile in a DCI movie, because you know what color space it's in. ICC profiles would be useful for that second problem, though: the mapping of X'Y'Z' space to the display device. It would have been nice to have a standard approach to this based on a widely deployed technology, instead of leaving the whole thing up to device makers.
 