Welcome to our community

Be a part of something great, join today!

  • Hey all -- I just changed over the backend. After 15 years I figured it was time to give it a bit of an update. It's probably going to feel a bit weird for most of you, and I'm sure there are a few bugs to work out, but it should work more or less the same as before... hopefully :)

Ask David Mullen ANYTHING

I don't think it would be safe to run the magnetic-ballast HMI on the non-sync generator; I'd be curious whether the electronic ballast can handle it -- probably, within reason. HMIs do need some extra power when striking them, though, so you have to have some leeway there.



120 ASA is really slow -- you'll either have to shoot around bright store windows and whatnot as a background, or light the background. Hopefully your 35mm lens is really fast. You may have to live with opening up the shutter.

Thanks; we'll rent the crystal sync generator.

These adapters are horrible, but we can't afford a faster rig and we need the shallow depth of focus to hide our low budget; we'll go to 270º and f1.4 at night if we have to. The crazy thing is that we're going to a 2.35:1 super35 blow-up. This will be the fuzziest film ever.

Anyhow, thanks again; this is an extremely generous service you're providing.
 
Policar, what about switching adapters for the night scenes? If you switch at that point to something like an SGblade with the faster GG or M2Encore, or a Letus Ultimate then you might not notice it even if coming from an M2 or SGpro. Then switch back for the rest of your day scenes. Just my 0.02.

Just added "Prince of the City" to my queue. Thanks David!
 
That's a good idea, but we only have money for one adapter...so we'll go with the fastest one we can find and use it for everything. Thanks for the good advice, though.
 
Thanks for the thorough response David - I expected nothing less. You are such an amazing resource for this board. Thank you, really!


About Prince of the City -- the cinematographer, Andrzej Bartkowiak, is more or less a father figure for me -- a super gifted DP and maybe the nicest guy on the planet. I urge everyone to check out his work on The Verdict, especially.
 
Thank you for all your responses, David...
Regarding my drive-shoot question:
if you have to shoot four people in a car
(two in the front, two in the back)
and all four have dialogue,
what about the camera axis?
 
The thing to always remember about screen direction is that the rule only exists so that the orientation between people is clear and that you know who is talking to whom when intercutting singles.

So if the geography is clear (you know where everyone is sitting in the car) and you know who is talking to whom (maybe because both are in the same frame, etc.) then screen direction really only matters when you start intercutting singles, and even then, it's not so critical in a car because the geography is so clear.

I mean, let's just say you're talking about two people in a car, both in the front seat -- if you shot the driver's single from the back seat over his right shoulder, so he was looking left to right towards the road and the passenger, but then you put the camera in the driver's window and shot the passenger also looking left to right at the driver, well, technically you've crossed the line... BUT IT DOESN'T MATTER. Only the idiots in the audience would be confused as to whom the person is talking to in their close-up. The driver is obviously talking to the passenger and the passenger is obviously talking to the driver, whether you shot through the front windshield, the side windows, or the back seat, and whether they flipped from screen left to screen right.

Even when you match screen directions, like when covering a raking 2-shot angle across the front seat, from each side of the car, the background ends up going in the opposite screen direction on each cut to the reverse angle anyway.

So don't lose sleep over screen direction issues when covering the four people in the car -- any interesting angles you get will be usable. Lighting four people is a bigger deal, especially in those horrible SUVs where the back seat side windows and rear windows are tinted so dark.

Now you may want to cover mostly from one side of the car, like the passenger side let's say, just to reduce the number of angles possible.

So just keep that in mind -- you only need to follow screen direction when it is necessary in order to not be confusing to the audience (assuming you don't want them confused, sometimes you do), and that tends to happen mostly when intercutting singles. So if something isn't going to be confusing, then don't worry about screen direction so much. Let's say two people are having a fist fight while struggling on the hood of a parked car -- you could shoot that from any angle and no one is going to be confused because the car itself provides such a strong point of geography. But as soon as the two people stop fighting and look at each other, when you cut into close-ups, then screen direction starts to matter so that they look like they are looking at each other.

I brought this up before, but look at these shots from "The Shining":

[Frame grabs from "The Shining": shining1.jpg, shining2.jpg, shining3.jpg, shining4.jpg]

You see that the two wide shots are a 180 degree dead reverse of each other, crossing the line, but that in the singles, Kubrick establishes a line and doesn't cross it.
 
Thanks...
I was thinking the same about the geography of a car -- everyone knows it so well.
I remembered a good example of single shots that are not confusing. Although it's a music video, the four characters interact and the camera is crossing the line the whole time, but I buy it:
ALANIS MORISSETTE - IRONIC
http://www.youtube.com/watch?v=8v9yUVgrmPY
Thanks for the Kubrick shots -- that makes it really clear (and I love him).
So you can jump over the line, but once you're in a frame where orientation gets lost (like a close-up), you'd better stay on one line.
Speaking of screen direction, I've always wondered about round-table scenes. I haven't analyzed one yet, but is there a rule of thumb? You would often have to cross the line. Would I always need to establish that with a wide or dolly shot (if I don't want to confuse)? Or what about five people spread in all directions of a room -- is it enough to establish their positions once?
 
Lighting four people is a bigger deal, especially in those horrible SUVs where the back seat side windows and rear windows are tinted so dark.

That's a very good point.
Reading Michael Ballhaus's workbook (interviewed by Tom Tykwer), he talks about using a gelled string of Christmas lights to get a soft light powered by the car battery. Today one could use LitePanels.
Three questions, if you don't mind:
1. How would you light this four-people-in-a-car situation (day and night)?
2. What experiences have you had with sound recording? Do you by chance know what kind of mics they used -- hidden lavs, or hidden supercardioid capsules?
3. Have you ever noticed any difference in skin tone between HMIs and full-CTB-gelled tungsten lights, or even Kino Flos and LED lights? Is there a difference in skin reflection?

(By the way, I'm trying to read all the posts in this thread -- I'm reaching the middle now and loving it more from page to page -- so if you have to repeat yourself, just tell me and I'll find out while reading!)
 
Thanks...

I always wondered about round-table scenes. I haven't analyzed one yet, but is there a rule of thumb? You would often have to cross the line. Would I always need to establish that with a wide or dolly shot (if I don't want to confuse)? Or what about five people spread in all directions of a room -- is it enough to establish their positions once?

That's too abstract a scenario -- it would depend on the scene. Vaguely, I'd say that you may want enough two-shots, three-shots, and over-the-shoulders to connect people and remind the viewer who is sitting next to whom. But that's just in the abstract.

I mean, look at the trial in Dreyer's "Joan of Arc" -- he avoids wide shots, he avoids making it clear where Joan's interrogators are, he cuts to a bunch of close-ups as the judges stare at her... it's all deliberately unnerving and confusing, giving the impression that Joan is being overwhelmed and overpowered by authority out to destroy her. And Dreyer shot conventional coverage, he just didn't use much of it in editing.
 
In the SUV scenario, I would seriously try to get the tinted windows switched out, whether it was for a day scene or night scene. Lighting at night isn't so hard, you can hide weak lights like LitePads, Mini-Flos, etc. -- it's day interior lighting that can get hard if the people in the backseat are very dark.

--

Don't ask me about mics, I'm not a sound person.

--

HMIs, gelled tungsten, Kinos, etc. all look somewhat different on skin tones even when gelled to match, but the differences may not be enough to worry about; it just depends.
 
... it's all deliberately unnerving and confusing, giving the impression that Joan is being overwhelmed and overpowered by authority out to destroy her. And Dreyer shot conventional coverage, he just didn't use much of it in editing.

You're completely right!
Thank you for opening my eyes again.
It's about which character I want to tell the scene through, and about creating his feelings with the tools I have!
 
Inverse Square Law: Does it ALWAYS apply?

Hello again. I didn't think I'd be back so soon but I've got a dilemma.

I completely understand the Inverse Square Law and how/why it works with lighting units. What I am currently having trouble understanding is why the Inverse Square Law (ISL) does not seem to apply to reflected light bouncing to the eye/camera, or to the apparent surface brightness of a source.

A theory I have is that the ISL does affect the reflected light we photograph and see, but it is negated because moving away (which should reduce the light) also means the light only needs to cover a smaller area of the negative. In turn, moving closer to the reflected light source (which should increase intensity) requires the light to cover a greater area on the negative, again negating the effect.

If this is correct, and I have answered my own question, I still have some issues with this theory.

Why does my reflected-light meter read the same f/stop whether I'm 2 inches or 20 feet away? I loaded a white screen on my laptop and took a reading from literally against the screen, and then another from about 20 feet away (before the spot meter's angle of view was no longer useful). For both readings the screen filled the entire spot meter 'frame' and gave the same exposure.

I'm also racking my brain over how this would apply to zoom lenses. The lens could spread the reflected light over a greater area of the negative, yet without gaining any increase in light, because the camera didn't move forward.

I guess I'm looking way beyond what is necessary, but it boggles my mind that reflected light doesn't appear to be affected by distance, which seems to break the ISL.

Thanks again!

-Ryan
 
Hello David,

I have some questions about pre-production. How do you do shot lists? Do you go to all of the locations with the director and talk about every shot in the movie before production starts? At that time, do you and the director figure out blocking, and do you also decide how to light each shot? What notation or documentation do you produce to describe this? Is this documentation something the gaffer and key grip can read directly? How do storyboards come in? Do you take stills to give to the storyboard artist, or do you describe each shot to him/her?

Thank you,
Chris
 
Why does my reflected-light meter read the same f/stop whether I'm 2 inches or 20 feet away?


First of all, remember that Inverse Square Law only applies to a point source. But even broad sources suffer from fall-off.

But as for why your eyes and a camera lens do not see reality fading into darkness as objects recede, I don't know the scientific answer, but I would guess it is because your eye or the camera lens does not need or use the full volume of light reflected by an object; it is focusing a few photons through a lens, so it doesn't matter as much that a lot of other photons are scattered and lost over distance. Your eyeball isn't being "lit" or exposed by all the bounced light off of an object; it is taking a picture using only a few representative rays of light coming from the subject. That's a guess, though, because it's still confusing when you think about it. I mean, I assume a point source of radiation (visible electromagnetic radiation in this case) behaves according to the inverse square law, but a single stream of particles (photons, electrons, etc.) doesn't. So how does a laser beam behave? Laser beams seem to travel much further than a typical light, so perhaps, because of a lack of scatter, they don't follow the inverse square law as much.
 
Why does my reflected-light meter read the same f/stop whether I'm 2 inches or 20 feet away?

The inverse square law does still apply, in a way. When you move farther away, the light meter is picking up light from a larger area, so each point gives it less light, but it is getting light from more points. With a camera, if you move the camera back and put a longer focal length lens on so you see the same area, you would use the same f/stop, but the diameter of the lens opening will be larger to collect the same amount of light.
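The f-stop compensation described above can be checked with a quick numeric sketch (Python; the distances and focal lengths here are made-up illustrative values, not anything from the thread). At a fixed f-number, doubling the subject distance while doubling the focal length to hold the framing quadruples the aperture area, which exactly cancels the inverse-square fall-off:

```python
# Illustrative sketch: why the same f-stop works at any distance
# when the framing is kept constant with a longer lens.
# All quantities are in arbitrary relative units.

def light_collected(distance, focal_length, f_number):
    """Relative light reaching the sensor from a fixed-size subject,
    framed identically (focal length scales with distance)."""
    aperture_diameter = focal_length / f_number   # definition of f-number
    aperture_area = aperture_diameter ** 2        # proportional to area
    falloff = 1.0 / distance ** 2                 # inverse square law
    return aperture_area * falloff

# Same framing: focal length doubles when the distance doubles.
near = light_collected(distance=5.0, focal_length=25.0, f_number=2.8)
far  = light_collected(distance=10.0, focal_length=50.0, f_number=2.8)

print(near, far)  # the two values are equal: the larger aperture
                  # exactly cancels the inverse-square fall-off
```

This is just the algebra of the post above: light collected scales as (focal_length / f_number)² / distance², so keeping f_number and the framing ratio (focal_length / distance) constant keeps the exposure constant.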
 
Hello David,

I have some questions about pre-production. How do you do shot lists? Do you go to all of the locations with the director and talk about every shot in the movie before production starts? At that time, do you and the director figure out blocking, and do you also decide how to light each shot? What notation or documentation do you produce to describe this? Is this documentation something the gaffer and key grip can read directly? How do storyboards come in? Do you take stills to give to the storyboard artist, or do you describe each shot to him/her?

There's no single method and no right or wrong way to do this. And it's not even necessary in all cases to have shot lists or storyboards, nor if you do, is it necessary to follow them. They serve really two purposes: as memory aids and as communication aids. Memory aids as in you want to remember some interesting ideas you have for shooting the scene, and communication aids in that you want other people to know what you are planning in case it affects their work.

You basically prioritize -- you start by discussing (between you and the director) the most complicated sequences, breaking them down into manageable shots. If necessary, you storyboard them because you want people to see the graphics of the sequence, what might be in the background, etc. Less visually intensive sequences can be shotlisted. Less complicated scenes, well, you can be vaguer about the shot list.

Sometimes you have a dialogue scene with a lot of actors moving around a space and you decide that you need to block it out with the actors in a rehearsal -- you can't adequately predesign the blocking without the actors' input. So you try to be aware of the time restrictions in which you have to shoot and cover the scene, do the blocking rehearsal, and then quickly break it down into shots right then, sometimes with the script supervisor taking notes.

But if you do that -- go without a shot list -- at least try to make a guess at special equipment needs that may require some prep, some advance work to organize. You don't want to block a scene on the day and then decide you need a Steadicam if you aren't normally carrying one (and even if you are, you want to plan for the time it takes to set up).

Shot lists can also be a bit pointless if they are unrealistically long, because you may end up redoing the list on the shooting day anyway to make it realistic. Slightly long is OK if the director can separate the "must haves" from the "nice to get" shots, i.e. prioritize.

There is an old saying that on a film shoot, it's " 'Gone with the Wind' in the morning and 'Dukes of Hazzard' in the afternoon" -- that you spend too much time in the morning and you rush like mad in the afternoon to make up for it.

In other words, you always have less time than you think, and if you think you are on schedule, it means that you are probably behind schedule -- so just assume you are behind schedule.

More often than not, a simple shot turns out to be harder to shoot than planned, so give yourself some wiggle room in your shot list.

It's not a bad idea necessarily to shot list the whole movie because it's like a first draft, one pass at breaking down the movie into images and cuts. That can be enormously helpful for simply getting to know each scene and its dramatic intent -- even if ultimately you find yourself deviating from the shot lists. But it takes a lot of time to go through a whole movie in such detail, and if you start at Page One (not a bad idea) you may not make it to the last page. So I would do the opening and closing of the movie, and key scenes in between, and then fill in the gaps if you still have time.
 
The inverse square law does still apply, in a way. When you move farther away, the light meter is picking up light from a larger area, so each point gives it less light, but it is getting light from more points. With a camera, if you move the camera back and put a longer focal length lens on so you see the same area, you would use the same f/stop, but the diameter of the lens opening will be larger to collect the same amount of light.

Yes, but I think his question is why, for example, when we dolly the camera away from the subject, it doesn't get dimmer? After all, it is receiving less and less of the light reflected off of the subject. But I think that's because the eye or the camera only needs a few representative photons, it doesn't use the entire volume of reflected light.
 
Ah, yes, each camera pixel (or film grain) needs a certain amount of light to expose it properly. As the object gets smaller in the frame, it is lighting up fewer pixels, so the object's smaller amount of light (from being farther away) is still sufficient. Am I still not getting it?
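That per-pixel intuition can be put into a toy calculation (Python; purely illustrative units, assuming a fixed lens and a subject that stays entirely in frame). The total light the lens receives from the subject falls off as 1/d², but the area of the subject's image on the sensor shrinks as 1/d² too, so the light landing on each pixel stays constant:

```python
# Sketch of the per-pixel argument (illustrative units, fixed lens):
# dolly back and the subject sends less total light through the lens,
# but its image also covers proportionally fewer pixels.

def per_pixel_exposure(distance):
    total_light = 1.0 / distance ** 2   # inverse square: light through the lens
    image_area  = 1.0 / distance ** 2   # image of the subject shrinks as 1/d^2 too
    return total_light / image_area     # light per pixel: constant

print(per_pixel_exposure(2.0))   # 1.0
print(per_pixel_exposure(20.0))  # 1.0
```

The two 1/d² factors cancel, which is also why a spot meter (which reads luminance over its fixed angle of view) gives the same reading at 2 inches and 20 feet, as long as the target fills the metering spot.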
 
^ That is my best guess, too.

Thanks for the replies! It's a tough question I've been banging my head against for the last day. I posted here hoping for a 'Hail Mary'.

This discussion came from a different website that I take part in, where the thread was quickly hijacked by this intriguing need to know why.

I still don't fully understand how the spot meter doesn't account for distance differences, but I think your answers have given me a better idea.

Thanks again!
 
I think it's partly about how focused the light is. As with a laser, it doesn't seem to get dimmer as quickly because the light is coherent (meaning it's all going in the same direction). With a lamp the light is getting totally scattered -- less so with a Fresnel, but still pretty scattered -- and the same goes for a reflected light source like some white card or an actor. When a light is shone right in your face it feels much more offensive than when it's far away, but it looks about as bright.

I'm not sure I'm really getting at anything here. But yeah, I guess the fact that when something is farther away, its smaller amount of light is filling fewer photosites or photoreceptor cells, so that compensates.

Edit: perhaps the spot meter is being "polluted" by nearby light?
 