
Charts! Charts! Charts! Too many charts, too confusing...

Thank you!!

I thought I was off base in the other thread. This quote confirms my understanding of sensors:
"the MTF curve for film typically drops off fairly quickly relative to its limiting resolution, whereas solid-state electronic sensors tend to have high MTFs through their resolution ranges, with a sharp drop at the limiting resolution, and often with spurious detail due to aliasing past that point."
 
Does choosing to post resolution results in line pairs per sensor height rather than “TV lines per picture height” have anything to do with how movies are projected? Is it essentially saying that movies now are projected in their own equivalent of SD and so far only film and RED allow them to be projected in "HD"?
 
No, this is just different units; it has nothing to do with how the image is projected.
 
We're talking system MTF though, which is made up of the lens, OLPF, sensor, recording process, and decoding process. Film can alias, but for the most part avoids visible aliasing thanks to its random sampling structure. Of course, if you scan film, you're using a sensor again. If you just look at the MTF of a sensor, there is no sharp drop-off; otherwise we'd not need an OLPF to minimize aliasing.

Graeme
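The "system MTF is the product of its components" idea above can be sketched numerically. The component curves below are illustrative assumptions (a toy lens roll-off, a two-spot OLPF, a pixel-aperture sinc), not measured data for any real camera:

```python
# Sketch: system MTF as the product of component MTFs.
# All component models below are illustrative assumptions.
import math

def lens_mtf(f, f50=0.35):
    """Toy lens MTF: falls to 50% at spatial frequency f50 (cycles/pixel)."""
    return math.exp(-math.log(2) * (f / f50) ** 2)

def olpf_mtf(f, spread=0.5):
    """Toy birefringent two-spot OLPF MTF: |cos(pi * f * spread)|."""
    return abs(math.cos(math.pi * f * spread))

def sensor_mtf(f):
    """Toy pixel-aperture MTF: sinc of the pixel pitch."""
    return 1.0 if f == 0 else abs(math.sin(math.pi * f) / (math.pi * f))

def system_mtf(f):
    # The overall response is the product of the chain's responses.
    return lens_mtf(f) * olpf_mtf(f) * sensor_mtf(f)

# At the Nyquist frequency (0.5 cycles/pixel) the system response
# should be low, which is what keeps aliasing under control.
print(round(system_mtf(0.25), 3), round(system_mtf(0.5), 3))
```

Note how the chain's response at Nyquist is much lower than any single component's: that multiplicative roll-off is why there is no sharp cliff in a well-designed camera's measured MTF.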
 
Graeme, I have a question. If you let the sensor vibrate/move by an amount that is less than a pixel, but maybe more than a sub-pixel, for each frame, could you simulate a more random sampling structure like film emulsion and thus reduce aliasing?
 
Felix, I can see such a motion having the effect of a Gaussian profile, like the old scanning-dot profile of tube cameras, actually. The end result would be very similar to an OLPF, but very much more expensive.

Graeme
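Why sub-pixel jitter behaves like a low-pass filter can be checked numerically: averaged over many frames, jitter convolves the image with the jitter's own probability distribution, so the resulting MTF is that distribution's Fourier transform. For Gaussian jitter the MTF is Gaussian too. A small sketch (the sigma value is an arbitrary illustration):

```python
# Sketch: the averaged effect of Gaussian sub-pixel jitter is a
# Gaussian MTF. We compare the closed form against a Monte Carlo
# estimate of |E[exp(2*pi*i*f*x)]| for x ~ N(0, sigma).
import math, random

def jitter_mtf_theory(f, sigma):
    """MTF of Gaussian positional jitter: exp(-2*(pi*f*sigma)^2)."""
    return math.exp(-2.0 * (math.pi * f * sigma) ** 2)

def jitter_mtf_montecarlo(f, sigma, n=200_000, seed=1):
    """Estimate the jitter MTF by averaging phasors over random offsets."""
    rng = random.Random(seed)
    re = im = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        re += math.cos(2 * math.pi * f * x)
        im += math.sin(2 * math.pi * f * x)
    return math.hypot(re / n, im / n)

# At Nyquist (f = 0.5 cycles/pixel), half-pixel jitter already
# attenuates strongly, much as an OLPF would.
f, sigma = 0.5, 0.5
print(jitter_mtf_theory(f, sigma), jitter_mtf_montecarlo(f, sigma))
```

The agreement between the two estimates is the point: a mechanical jitter scheme buys you nothing an OLPF doesn't already provide, which matches Graeme's "similar but much more expensive" verdict.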
 
It's funny how much stuff ends up as a Gaussian or Gaussian-like. It's also why CRTs were so nice to look at - the Gaussian spot profile was very pretty and hid a lot of nasties.

Graeme
 
One could also try to read out different sub-pixels as one pixel and thus change the pattern a little bit for every frame, no? But then the sensor layout would probably be more complicated. Sorry, I just love to think about that kind of stuff :D
 
Probably need to go to a silly resolution to make that work - you could prototype it in SD from 5k though, if you had the time....

Graeme
 
True. Lots of work that could be done by a post-production plug-in once you shoot at that resolution. Thanks for letting me borrow your mind for a sec :)
 
Thanks for the sanity check, Graeme. I keep forgetting about that darn OLPF. The more I learn about it, the more I despise it...
Where did the 1700 lines on the Alexa come from? I thought that was a 1080 camera...
Speaking of fresh approaches: couldn't the "aliasing" frequency (band) just be removed entirely? In post, for example. Have you tried it?
 
OLPFs are necessary, and complex to factor into the scheme of things, but you're right that you can't ignore them...

Graeme
 
In the article it said the additional lines came from aliasing causing the test to read false extra resolution, which they said is one of the problems you have to deal with when shooting on the Alexa. It was kind of nice to read that, as generally these kinds of tests are all roses about the Alexa; this one seemed very unbiased to me.
 
In all sampled systems you must be concerned about aliasing. We take great care to minimize the dangers of aliasing, as our example resolution and MTF charts show. However, it's just as important to beware of measuring aliasing as false resolution when you're testing resolution. Some traditional charts are not that good at showing aliasing easily, and are tricky to read. The other issue is that some charts are not always that accurate - you should always do a physical measure of the chart and its lines and framing guides to ensure reasonableness. And if you get an answer for resolution that's greater than the pixel count, check, check and check again, as you've done something wrong.

Graeme
 
If you read more carefully you'd understand that they say the aliasing accounts for an extra 76 lines (from the 1620 stated to the 1696 measured). That's understandable. I'm asking about the difference between 1080 and 1696. Can you even get that out of the Alexa? I thought even the raw output was downsampled in camera to 1080...
 
848 line pairs from 1620 Bayer samples on the Alexa would imply 105% debayer efficiency, which is awkwardly not possible.
If they're treating aliasing as resolution, then that makes the test/evaluation methodology suspect.
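The 105% figure is just arithmetic on the numbers quoted in the thread: 848 line pairs is 1696 resolved lines, against only 1620 Bayer rows feeding the debayer:

```python
# The >100% debayer efficiency implied by the thread's numbers:
# 848 line pairs = 1696 lines resolved from 1620 Bayer samples.
line_pairs = 848
bayer_rows = 1620
efficiency = (line_pairs * 2) / bayer_rows
print(f"{efficiency:.1%}")  # 104.7%
```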
 
OK, one more time... They stated in the article that the 76 (5%) lines are aliasing, but that is NOT what I'm asking. What I wonder is how they got more than 1080 lines out of that camera. I thought the output was supersampled in camera down to 1080, so you never get more than 1080 (2K) or so...
 
LP/PH is such an awkward measurement... To get the horizontal resolution, multiply by 2 to get from line pairs to lines, and then by 16/9. 848 LP/PH is therefore 3015 lines, which is greater than the 2880 of the sensor's readout area. You can't at that point say that it's 135 lines more than possible, hence 135 are aliasing and the rest are real. The only conclusion you can draw is that something went wrong in testing; you can't derive "correct" numbers from the data.

Graeme
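The unit conversion Graeme walks through, as arithmetic (the 2880 readout width is the figure quoted in the post above):

```python
# LP/PH to horizontal lines: x2 (line pairs -> lines), then x16/9
# (picture height -> picture width for a 16:9 frame).
def lp_ph_to_horizontal_lines(lp_ph, aspect=16 / 9):
    return lp_ph * 2 * aspect

horizontal = lp_ph_to_horizontal_lines(848)
print(round(horizontal))   # 3015 lines
print(horizontal > 2880)   # exceeds the 2880-photosite readout: True
```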
 