Thread: Cinematography of the 2030s: Ultra HFR! I have witnessed realtime 1000fps on a real 1000Hz display.

Page 2 of 5 (Results 11 to 20 of 43)
  1. #11  
    Senior Member PatrickFaith's Avatar
    Join Date
    Nov 2011
    Location
    California
    Posts
    2,557
    Quote Originally Posted by Mark Rejhon View Post
    Interpolation?

    Yes, clever interpolation techniques can be done to increase frame rates. That will probably be a necessary piece of the jigsaw puzzle, too.

    In the past, interpolation was a really evil thing with lots of artifacts (the soap opera effect). But the art of interpolation is rapidly improving, especially if you give it all the information possible: parallax backgrounds, positional information, and other missing info that spares the guessing logic from creating artifacts.

    One subcategory of SOE artifact (of many) is ultra-smoothness without reduced blur. The camera exposure should be really short for low-framerate video intended to be converted to ultra-HFR: 24fps video with 1/48sec motion blur doesn't always "look right" when converted to 120fps HFR video, unless there's a method of de-blurring (research on this now exists).

    Source-based awareness also helps avoid a lot of interpolation problems. Video compression standards are heavily interpolation-based (H.264 and HEVC), so even Netflix, 4K Blu-rays and E-Cinema are practically >99% built on the predictive math arts found in interpolation -- the I-frames, P-frames, B-frames, and their equivalents. The codec avoids problems because it 'knows' the original uncompressed video. But an intermediate interpolator (e.g. Sony Motionflow) is missing this, along with temporal/positional info, so it has to do a ton of guessing. And the problem is compounded when you're working in 3 dimensions (holographically -- true voxels or polygons, not stereoscopic).

    Likewise, Oculus Asynchronous Spacewarp (virtual reality) converts 45fps into 90fps virtually laglessly, without re-renders, by virtue of high-resolution positional knowledge (mouse trackers, head trackers, etc. are available at ultra-high Hz today -- many gaming mice already operate at 1000Hz).

    Some future HFR video may be recorded with ultra-precise gyroscopic/accelerometer/positional information, to aid a computer in converting the video to a 3D environment with less guessing. Smartphones have all the sensors needed, and they are already being used to "scan" real-world objects into 3D (ARKit and the like), so naturally all of this is happening already. This extrapolates into the theoretical future cinematography techniques needed for ultra-HFR and Holodeck-like environments.

    To reduce the mandatory tracking points (e.g. pingpong balls) of traditional motion capture -- impractical for things like documentaries and real life -- cameras (e.g. RED) might end up recording ultra-precise telemetry too (gyroscope, accelerometer, AR data, synchronized across one, two, three or more recording angles from different cameras, etc.). I imagine some of you are already doing this for certain kinds of cinematography, whether for greenscreen help or for help with CGI overlays. Future HFR framerate-upconversion assistance will also eventually benefit greatly from the improving accuracy of this telemetry data in the coming years (decades).

    The more telemetry data recorded alongside the video, the more accurately future algorithms will be able to convert films into framerate independence, or into true-3D environments, with fewer man-hours of human help.

    It's a massive amount of processing work and manpower, but the cost is rapidly falling. Retina 3D environments (Holodecks) capable of tricking a human, generated from a video, are currently still a figurative Olympus Mons on a faraway planet compared to the Mount Everest challenges we're only barely scaling on this planet today... but there is light at the end of the tunnel within a couple of human generations (or maybe even one, given an Apollo Mission push -- but there isn't one occurring -- so I give it two human generations to successfully pass the Holodeck Turing Test on an 'any-material' basis).

    Mark Rejhon
    Founder, Blur Busters
    Yep, agreed. On the stuff I see, the frames are normally mapped to a mesh, then a virtual camera ray-traces the mesh at whatever speed is desired (with the virtual camera set to a 360-degree shutter, and motion blur applied via the mesh's velocity vectors). Recently though, an app that was doing fine released a version that "felt" like bad/jerky credits -- so there are a lot of little tricks to this.

    My assumption is that the RED Hydrogen is probably doing something fancy here too. For example, if I watch a film in the "3d" mode that is at 24 fps, I bet it converts it to something much faster than that, so that as you move the phone around it "feels" right.

  2. #12  
    Senior Member
    Join Date
    Oct 2010
    Posts
    1,188
    Quote Originally Posted by Mark Rejhon View Post
    ...

    My questions to readers are as follows:

    (1) Have there been any recent innovations in Ultra HFR -- video at frame rates above 120fps outside the laboratory, played in realtime, not slo-mo?
    e.g. presentation of true-240fps video (non-slomo) on currently-shipping true-240Hz eSports gaming monitors (more than a dozen models are now on the market).

    (2) Are there any pre-existing articles in the HFR community that accurately explain, in a simple way:
    e.g. how source persistence & destination persistence interact to create the final human-perceived display motion blur?


    ...

    Not aware of any recent innovations in Ultra HFR recording capabilities.

    When shooting at a cropped resolution of 2K FF (2048 x 1080), RED cameras using the 8K-resolution 'Helium' sensor can shoot 240 fps with an almost visually lossless data compression ratio of 4:1.

    The Phantom Flex 4K camera from Vision Research can record 1977 fps @ 2048x1080 & 1000fps @ 4096x2160.

    Haven't seen anyone presenting 240fps non-slomo footage though, but I'm not an HFR expert myself, so maybe it's happening and I'm just not aware of it.

    Likewise, not aware of any articles explaining how source persistence & destination persistence interact to create the final human-perceived display motion blur, but I am also curious as to where such developments might be.
    Last edited by Les Hillis; 02-05-2018 at 03:54 PM. Reason: Typo

  3. #13  
    Haven't seen anyone presenting 240fps non-slomo footage though, but I'm not an HFR expert myself, so maybe it's happening and I'm just not aware of it.
    It's already happening in the laboratory.

    You can do it too! It's easy and costs under $2000 now.

    Since 2017, it has been possible to begin experimenting with 240fps HFR for only $1500-$2000. The instructions, derived from my Ultra HFR instructions, are relatively simple but produce interesting results.

    Instructions: How to create & view 240fps HFR for under $2000

    1. ~$500+ - Get your favourite slo-mo camera. Begin with a cheap 1080p 240fps camera.
    2. ~$500 - Get a GPU capable of 4K60 playback = enough GPU power for 1080p 240fps.
      For 480fps 1080p HFR, get a GPU capable of 8K 30fps
      For 960fps 1080p HFR, get a GPU capable of 8K 60fps
    3. ~$500 - Get a true-240Hz gaming monitor (e.g. Viewsonic XG2530, ASUS ROG PG258Q, Acer XB252Q). A dozen models have come out in the last 12 months; Google "list of 240Hz monitors".
    4. Film something at 240fps, for one-eighth speed playback at 30fps. Save as .mp4 file.
    5. Install open-source ffmpeg. Run this ffmpeg command line to speed up the slo-mo to realtime:
      ffmpeg -i slowmo240.mp4 -r 240 -vf "setpts=(1/8)*PTS" -an realtime240.mp4
    6. Play back on your monitor. 240fps HFR!
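    To sanity-check the result (my own habit, not part of the original recipe), ffprobe -- bundled with ffmpeg -- can report the stream's frame rate; the filename is assumed from step 5:

      # verify the sped-up file really is 240fps
      ffprobe -v error -select_streams v:0 -show_entries stream=avg_frame_rate -of default=noprint_wrappers=1 realtime240.mp4
      # expected: avg_frame_rate=240/1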

    For future cheap 480Hz and 960Hz displays:
    For 480fps and 960fps HFR, the command lines (for speeding up 30fps slo-mo to realtime) are:

    ffmpeg -i slowmo480.mp4 -r 480 -vf "setpts=(1/16)*PTS" -an realtime480.mp4
    ffmpeg -i slowmo960.mp4 -r 960 -vf "setpts=(1/32)*PTS" -an realtime960.mp4

    The "setpts" is your speedup factor.
    30fps to 240fps = 8x speedup = "setpts=(1/8)*PTS"

    At this stage, you need to purchase 5-figure scientific projectors to successfully play these files in high-resolution realtime true-480Hz (color) or true-1000Hz (usually monochrome, unless you do the 3-projector + color-filter trick). However, tomorrow's displays, capable of higher refresh rates, will be able to pull it off. The ultra-HFR cost bar is falling relatively quickly in the computer world.

    Workflow Issues

    No movie editing software supports 240fps HFR today. And you need a separate microphone to record high-quality audio, because most slo-mo cameras won't properly record audio.

    The best laboratory experimentation workflow (as of 2018):
    1. Use your favourite movie software to edit your film at the original slo-mo speed.
    2. Once done, output your master to an .mp4 file.
    3. Use command line software to convert the video file into a higher frame rate.
    4. Finally, use command line software to attach (separately recorded) audio to the video.

    This is easiest for single-clip experiments (no scene changes). But if you are going to do scene changes, you can note the time offset or frame offset of each scene change, and use those as your reference points to dub the separately-recorded audio onto different parts of the .mp4 file by command line. This can also be automated (with a script or batch file), as sketched below.
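    A minimal ffmpeg sketch of step 4 (the file names and the 1.5-second offset are hypothetical; -itsoffset shifts the audio to your noted sync point):

      # mux separately-recorded audio onto the sped-up video; delay audio 1.5s to sync
      ffmpeg -i realtime240.mp4 -itsoffset 1.5 -i audio.wav -map 0:v -map 1:a -c:v copy -c:a aac -shortest realtime240_audio.mp4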

    Phantom Flex cameras will work for realtime ultra-HFR too. However, since no cinema software today can edit realtime ultra-HFR video, you have to master the film at an intentionally low frame rate (e.g. 30fps, 60fps, 120fps), then use a command-line utility to speed up the frame rate as the final video-related step after all the mastering. Afterwards, merge your separately-recorded audio into the file (via command-line utility). This allows you to use existing cinema software workflows, and do the framerate speedup only as a final step.

    I can help with a world's-first public demo

    There are no known public exhibitions of ultra-HFR, but as of 2017, 240fps HFR works successfully on just a hobbyist budget. 1000fps or 1440fps HFR is now achievable on a small-business budget (5 figures or low 6 figures) for experimentation.

    As Blur Busters, I will be happy to work with anybody who wants to figure out how to set up a public exhibition (at a convention, e.g. NAB) of 1000fps HFR.

    My calculations say it is currently doable with presently-available pieces of equipment that have never been married together (e.g. three ViewPixx quad-Hz projectors running in color-filtered mode, overlapping 3 monochrome images into true 960fps, 1000fps, or 1440fps full-color HFR) -- combined with intentionally sped-up Phantom Flex footage, merged with concurrently-recorded audio from a separate high-quality microphone.

    A demo should happen. It may be a gimmick and a publicity thing, but it would pretty much stun the world that such a thing is already possible today, and it would begin a lot of discussion of "what is it good for?" -- plus let people witness CRT clarity with zero need for camera blur and zero stroboscopic effect, a magical effect only a few human eyes have seen, that many eyes could easily feast on in a public demonstration! For one example, I think big-budget amusement park rides (and certain big customers) would pay a pretty penny for this 'display effect' that feels extremely unique and has been seen by so few eyes, so there might be a business case for some of the outfits whose employees/contractors regularly read these forums.

    I also am developing experimental software that can realtime-split color channels (R/G/B) into three separate fully synchronized video outputs -- 3 video outputs with 1000fps monochrome video of each color channel. This is perfect for commandeering three monochrome 1000Hz scientific projectors, putting a color filter on each of them, stacking them (or pointing the three of them to a dichroic mirror) and display a ultra-HFR color video! I am a computer programmer myself, so I can assist in ultra-HFR experiments, like a "world's first" exhibition (NAB, CEDIA, etc). Lots of starter advice and ultra-HFR due diligence offered free of charge in exchange only for Blur Busters credit. mark[at]blurbusters.com
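    My realtime splitter is custom software, but for offline experiments ffmpeg can approximate the same split -- a rough sketch, with hypothetical file names (extractplanes needs an RGB pixel format, hence the format=gbrp conversion):

      # split a color video into three monochrome per-channel videos (offline, not realtime)
      ffmpeg -i color1000.mp4 -filter_complex "[0:v]format=gbrp,extractplanes=r+g+b[r][g][b]" -map "[r]" -pix_fmt yuv420p chan_r.mp4 -map "[g]" -pix_fmt yuv420p chan_g.mp4 -map "[b]" -pix_fmt yuv420p chan_b.mp4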

    Beyond that, if anyone so desires, I can assist in pulling off a demo at one of the upcoming conventions. Let's start with a 1000fps realtime ultra-HFR video demo -- AFAIK, a world's first. With my skills combined with your skills, this is achievable and exhibitable this year (2018) or next (2019). This would be the "Farnsworth Experiment": showing a display feel that no members of the public have ever seen.

    Player Software Requirements

    Player software quality is extremely variable for ultra-HFR.

    Radeon versus NVIDIA shows major differences in playback quality with different video player software, for example. On some graphics cards, VLC plays better; on others, Windows Media Player works better. I've even seen a web browser <VIDEO> element play full HFR properly (this works at 120fps).

    Fiddling with settings in VLC helps -- e.g. using different video decode settings. One brand of player may play smoother than another.

    Overkill is your friend, too. Extra GPU horsepower headroom helps a lot: a GPU capable of 8K 60fps often plays 1080p 240fps HFR really smoothly, whereas a GPU capable of 4K 60fps is the minimum you need for 1080p 240fps HFR. Headroom is your friend for de-stuttering ultra-HFR video.
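    The rule of thumb works because the decode throughput is nearly identical -- a quick pixel-rate check (my arithmetic):

      $3840 \times 2160 \times 60 = 1920 \times 1080 \times 240 = 497{,}664{,}000$ pixels/sec
      $7680 \times 4320 \times 60 = 1920 \times 1080 \times 960 = 1{,}990{,}656{,}000$ pixels/sec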

    OS Recommendations

    Windows: Microsoft Windows 10 works fine at 240Hz, 360Hz and 480Hz in my lab (I have 240Hz, 360Hz and 480Hz in-house). Intel i7 and GeForce GTX 1080 Ti recommended.

    NOTE: I have only field-visited 1000Hz+ displays, but given my 480Hz experiences, and given that today's 8K60-capable graphics cards can decode 1080p960 video in realtime, I don't anticipate blocking factors.
    At the moment, I'd suggest a GeForce Titan Xp for 1000fps HFR playback. Its video outputs can reach 1000Hz and even 1440Hz with the help of custom-resolution utilities (ToastyX CRU, etc.).


    For the interim (2018), I do not yet recommend Linux for ultra-HFR due to VSYNC issues, but if you know of a Linux workflow that successfully does 120fps HFR, let me know and I'll test it for 240fps HFR -- I want specifics (graphics drivers, window manager, kernel version, modeline settings, VSYNC settings, etc.).

    Not sure about Mac, but if you try one, SwitchResX is your best friend for forcing a Mac to ultra-high Hz.
    I have heard of 120Hz and 240Hz successes.
    Last edited by Mark Rejhon; 02-02-2018 at 05:03 PM.

  4. #14  
    Quote Originally Posted by Alex Lubensky View Post
    And after all, you'll still have a whole century of beautiful cinematography done in 24p and established to be viewed in 24p. That alone could keep the world of 24p going for another half-century.
    I love 24p cinematography, so don't get me wrong! :)

    Yes, 3D is a fad.
    Some things stay a fad, and some things come back over and over.
    Video games were originally thought to be a fad -- then came the 1982-1983 video game crash.
    Games recovered and are now bigger than Hollywood, while using cinematographic techniques too (e.g. motion capture).
    3D has crashed many times too, and this won't be the last time. Holodecks are also a totally different thing from stereoscopic 3D.
    In all likelihood, it will also end up a parallel, blended industry, like movies versus games.

    Quote Originally Posted by Alex Lubensky View Post
    Yes for VR, yes for 3D, yes for viewfinders, on-board monitors etc. But it doesn't relate to the actual world of today's cinematography overall.
    ...That said, I've also seen various arguments over the decades:

    - Motion capture and pingpong balls are not cinematography
    - CGI is not cinematography
    - Digital is not cinematography
    Etc.

    What was your grandfather's cinematography isn't necessarily the exact definition of today's cinematography. Simultaneously, there are attributes that overlap universes, e.g. the overlap between games and movies -- especially motion capture done for games, and the tie-ins frequently done between games and movies, whether completely generated, motion-captured, or insertions of various FMV into the game. So while games/VR are not necessarily cinematography, many cinematography techniques are being utilized, so this is still certainly a legitimate (albeit, for today, somewhat outlier) topic thread when presented from this point of view.

    Quote Originally Posted by Alex Lubensky View Post
    The technology itself is a beautiful thing, but sometimes we get overhyped about it. Personally, I see almost no benefit in shooting HFR for viewing in HFR, for today's perception of the word "cinema".
    There is still much overlap with future cinematography techniques. Concurrently, there will be times when 1000fps needs to be displayed on a flat rectangle, too -- there are still benefits there (once it's cheap enough to do). Instead of 24p->120p->24p, you might have 24p->1000p->24p, if the capability is already in the cameras and the displays. Or it might be utilized differently -- who knows; there are many unthought-of overlaps, and there will be many spinoffs between both worlds. Imagine this: Holodeck-like technologies (turning a movie into a flawless 3D environment) will make possible virtual camera angles -- filming angles for cameras that weren't even there in the original place -- and that cinematography might actually be finished at 24fps with traditional projection technique and traditional storytelling (even as the virtual world continues to be used for a game). That could prevent a reshoot over a missing camera angle or a missing drone location. Just like CGI, digital, motion capture, and all the other innovations of the past, ultra-HFR will have many spinoffs into cinematography that we haven't imagined yet. Considering cinema 100 years ago, versus cinema today, versus what cinema will be 100 years from now, pretty much brings this discussion into the scope of the word "cinema".

    Quote Originally Posted by Alex Lubensky View Post
    I wouldn't call VR movie a movie at all - it's a whole another story
    Agree. Just to point out, I intentionally focus on the word "cinematography" and phrase "cinematography technique", and the broad-ranging spinoffs, as explained above.

    Quote Originally Posted by Alex Lubensky View Post
    I don't state it's wrong, it's just too different -- from the way you create it, to the way you perceive it.
    Today, it's foreign.
    Tomorrow, it's not.
    CGI. Digital. Pingpong balls and motion capture.
    All foreign stuff to the people who filmed the 1939 Wizard of Oz in three-strip Technicolor.

    IMHO, arguably, some may even dare to say: 1939 was potentially a vastly more different universe from today than this thread is to today's cinematographers. Yes, this thread is very foreign today -- but in many elements, less foreign than 1939 versus 2018.

    Quote Originally Posted by Alex Lubensky View Post
    It's more like theatre than cinema -- because of the ways you as a viewer relate to the story.
    Yes, maybe you're right.

    Still -- it's the art of presenting moving images to a viewer (interactive or not) -- and most of those (even games!) often borrow from the universe of cinematography technique.
    Last edited by Mark Rejhon; 02-02-2018 at 01:08 PM.

  5. #15  
    Senior Member Karim D. Ghantous's Avatar
    Join Date
    Oct 2011
    Location
    Melbourne AU
    Posts
    1,794
    Ooh, I had a little bit of an insight. The question some people would ask is how you would do slow motion. Do you need to shoot at 2,000fps to get 2x slow motion? Do you need to shoot at 4,000fps to get 4x slow motion? No! That would be quite pointless. You just slow down the playback frame rate: 500fps to get 2x, 250fps to get 4x, and so on. If you played back at 24fps, you'd get about 42x slow motion, which is almost useless for narrative.
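    In formula form (assuming a 1000fps capture destined for a 1000Hz display):

      $\text{slowdown} = f_{capture} / f_{playback}$
      e.g. $1000/500 = 2\times$, $1000/250 = 4\times$, $1000/24 \approx 42\times$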
    Good production values may not be noticed. Bad production values will be.
    Pinterest | Flickr | Instagram | Martini Ultra (blog)

  6. #16  
    Moderator Phil Holland's Avatar
    Join Date
    Apr 2007
    Location
    Los Angeles
    Posts
    11,324
    Quote Originally Posted by Mark Rejhon View Post
    My questions to readers are as follows:

    (1) Have there been any recent innovations in Ultra HFR -- video at frame rates above 120fps outside the laboratory, played in realtime, not slo-mo?
    e.g. presentation of true-240fps video (non-slomo) on currently-shipping true-240Hz eSports gaming monitors (more than a dozen models are now on the market).

    (2) Are there any pre-existing articles in the HFR community that accurately explain, in a simple way:
    e.g. how source persistence & destination persistence interact to create the final human-perceived display motion blur?
    Without breaking NDAs, and in the spirit of your search:

    1. Yes. With more things coming to market in response to this.

    2. Likely, as there are a few buildings' worth of white papers out there.

    In an effort to assist where you will be going, investigate high refresh rate versus viewing-medium size and distance; resolution also comes into play here. There is a point of diminishing returns on this subject matter, but it gets much more interesting when you take into account the type and style of content being presented.
    Phil Holland - Cinematographer - Los Angeles
    ________________________________
    phfx.com IMDB
    PHFX | tools

    2X RED Monstro 8K VV Bodies and a lot of things to use with them.

    Data Sheets and Notes:
    Red Weapon/DSMC2
    Red Dragon

  7. #17  
    Quote Originally Posted by Karim D. Ghantous View Post
    Ooh, I had a little bit of an insight. The question some people would ask is how you would do slow motion. Do you need to shoot at 2,000fps to get 2x slow motion? Do you need to shoot at 4,000fps to get 4x slow motion? No! That would be quite pointless. You just slow down the playback frame rate: 500fps to get 2x, 250fps to get 4x, and so on. If you played back at 24fps, you'd get about 42x slow motion, which is almost useless for narrative.
    Yes, this is correct.
    Normally, high speed video cameras are for slow-motion.
    You can slow down or speed up. Many video editors and command-line tools let you do this.

    But what many people don't know is that footage from any high-speed video camera can be sped up to realtime, for any display matching its refresh rate.

    Until recently, we didn't have 1000 Hz displays in the lab. Today, a single person running a small business has created a 480Hz display out of an off-the-shelf panel, and display researchers are playing at 1000Hz, 1440Hz, 1700Hz and beyond. Now that we have these, for the first time, researchers have sped up plain high-speed slow-motion clips (e.g. from a Phantom Flex) to play back at realtime framerates. Yes, the cameras were never designed to do this, but lo and behold, we can. What we are discovering is that ultra HFR produces a magical-looking display effect, and this needs to be shared with the HFR community as a whole.

    1000fps-vs-120fps (a 7.3ms difference) is roughly as big a jump as 60fps-vs-120fps (an 8.3ms difference). There are diminishing returns, but the massive jump up the curve largely compensates, and the difference is unmistakably noticeable, indeed.

    Yesterday, I just posted a new entry on Blur Busters containing instructions on how to speed up a slo-motion video clip to real-time video playback:
    Ultra HFR: 240fps Real Time Video Now Possible Today. 1000fps Tomorrow.

    I realize this is a niche purpose, but the effect is quite magical:
    Quote Originally Posted by Mark Rejhon
    By doing 1000fps HFR on a 1000Hz display, you can simultaneously avoid camera motion blur, display motion blur, and stroboscopic effects. Strobeless low persistence (blurless sample-and-hold) without flicker is successfully achieved with 1000 Hz experimental laboratory displays, and is also useful with Ultra HFR video.

    Time-wise, the difference between 120fps HFR versus 1000fps HFR is roughly as big as the difference between 60fps and 120fps. This is because 1/60sec and 1/120sec is an 8.3ms difference, while the difference between 1/120sec and 1/1000sec is a 7.3ms difference. There is a curve of diminishing points of returns, but the massive jump up to 1000fps HFR greatly compensates.

    At Blur Busters, we are among the few people in the world to have witnessed Ultra HFR video!

    We believe this is very useful for many applications in the coming decade, from speciality theatre, virtual reality, amusement park rides, advanced cinema, truly immersive virtual vacations, “Holodeck” video, and other applications once more 1000 Hz displays are commercially available beginning sometime within the next decade and beyond.
    There are careers and money to be earned with Ultra HFR in the decades to come, so Blur Busters is finally breaking out preliminary HFR experiments to the public so that anybody else can work with this.

    Today, people still jawdrop: "1000 Hz??? Are you crazy!" Yet that's what people said about 4K twenty years ago; now a cheap $400 bottom-barrel Walmart HDTV is already 4K! 1000Hz isn't going to be expensive forever, so the first movers get the advantage here. Maybe it won't become widespread (maybe 8K won't either).

    This is mostly anecdotal and not yet in any official scientific papers -- but we have made a breakthrough discovery:

    One important observation: From what I have seen, whether fad or not, it's still far more impressive than plain stereoscopic 3D -- for some material, it successfully jumps over the uncanny valley -- depending on how the video material is done.

    There is less nausea from 1000fps HFR than from 48fps/120fps HFR. It's surprising how the ultra-big jump upwards helps compensate. This is not too different from anecdotes by users of gaming monitors: some people get nausea from phantom-array effects (wagonwheel), and different people get nausea from motion blur (less headache on CRT than LCD).

    By using brute Hz and brute framerate, you kill stroboscopics + camera blur + display blur + flicker simultaneously -- all gone at the same time, which was never achieved before at any lower refresh rate or lower frame rate -- a magic recipe to jump the uncanny valley for a slightly bigger percentage of the population. There are still motion-sickness problems, but fewer than with 48fps HFR and 120fps HFR for certain kinds of niche applications -- e.g. vertigo disconnect from motion (though motion-simulator rides help quite a lot; 1000fps is perfect for massively superior motion-simulator rides).

    Old assumptions (of past research) die with this. You can't just test 200fps vs 300fps; you have to go geometrically up the curve, e.g. 240fps -> 480fps -> 960fps realtime, to properly test the uncanny-valley and nausea problems -- and to chip away at the five-sigma nausea problem, solving as many of the possible image-artifact problems for as many people simultaneously as possible.

    The motion blur mathematics is surprisingly simple (source camera blur and destination display blur are additive). Once somebody views this from a source-persistence (camera shutter) and destination-persistence (display behavior) perspective, it clicks in an "E=mc^2"-style epiphany -- as it has for many people who read the simplified Blur Busters explanations and view the TestUFO Display Motion Tests.

    Blur Busters writings over the years have been hugely educational to many gaming-monitor display engineers. We've moved many needles there, with many successes and acknowledgements by manufacturers. I hope we can educate more camera makers and sensor makers to understand these behaviours better, to guide the long-term masterplan and make sure they're not dismissing Ultra HFR as tomfoolery, in what is necessarily a century-long climb up the geometric diminishing-returns curve.

    ------

    At the end of the day, it's not just research, but my personal interest in making sure the camera community is aware:

    Part of the reason this thread exists is that I want RED and related accessory developers (and, yes, other camera manufacturers) to make sure their cameras are ready for Ultra HFR. The 240fps setting (and faster) isn't only good for slo-mo: it can also be used for Ultra HFR. Most slo-mo cameras are capable of Ultra HFR via intentional framerate speedup to realtime. RED camera content is already used for some VR and game content too, and Ultra HFR will be part of the ballgame in the coming decade(s).

    Agreed, it is hard to slot this topic/thread anywhere on reduser.net, so Cinematography is the closest match (elements of cinematography technique -- e.g. motion capture, HFR, green screens -- are often used for non-cinema purposes like games and VR). But certainly, I think this thread deserves to exist -- few people understand the purposes behind ultra-high Hz / ultra-high framerates -- and some use RED cameras for VR material and non-cinema purposes too.
    Last edited by Mark Rejhon; 02-03-2018 at 03:58 PM.

  8. #18  
    Quote Originally Posted by Phil Holland View Post
    Without breaking NDAs, and in the spirit of your search:

    1. Yes. With more things coming to market in response to this.

    2. Likely, as there are a few buildings' worth of white papers out there.

    In an effort to assist where you will be going, investigate high refresh rate versus viewing medium size and distance, resolution also comes into play here.
    Exactly -- this is the "vicious cycle" effect that I have already written about in my recent Holiday Special article (jump to section). Quoted here:

    Quote Originally Posted by Mark Rejhon
    Blur Busters Law Is Also a Vicious Cycle

    The Blur Busters Law (1ms of persistence = 1 pixel of motion blurring per 1000 pixels/sec) becomes a vicious cycle when it comes to increasing resolutions and increasing FOV. Persistence limitations are more easily noticed with the following:

    1. Higher resolution displays:
      The same physical motion speed travels more pixels per second. This creates more pixels of motion blur for the same persistence (MPRT).
    2. Wider field of vision (FOV) displays:
      The same angular display motion speed (eye-tracking speed) stays onscreen longer. This extra time makes display motion blur more easily seen.
    3. You need lower persistence to compensate:
      Increasingly bigger & higher-resolution screens, as time progresses, require lower persistence (MPRT) numbers to keep motion blur under control.

    Display persistence is more noticeable at bigger FOV (bigger displays or virtual reality) and at higher resolutions (retina resolutions), due to bigger clarity differences between stationary & moving images.

    In the most extreme future case (theoretical 180+ degree retina-resolution virtual reality headsets), display refresh rates far beyond 1000 Hz may someday be required (e.g. 10,000 Hz display refresh rate, defined by the 10,000 Hz stroboscopic-artifacts detection threshold). This is in order to pass a theoretical extreme-motion “Holodeck Turing Test” (becoming unable to tell apart real life from virtual reality) for vast majority of human population.

    However, for general CRT-quality sports television watching, 1000fps at 1000Hz would approximately match 1ms CRT phosphor persistence, on a flicker-free sample-and-hold display. Technologically, this is achievable through interpolation on an ultra-high-refresh-rate display.
    ---------------

    There is a point of diminishing returns on this subject matter, but it gets much more interesting when taking into account the type and style of content being presented.
    Yep.

    That said, the diminishing curve doesn't end until the quintuple-digit Hz.
    It takes increasingly larger leaps up the curve, though.

    The difference between 1/60 and 1/120 (8.3ms) is nearly the same as the difference between 1/120 and 1/1000 (7.3ms) -- this relates to the number of inches of motion blur at the same physical motion speed. Many are surprised that the diminishing curve doesn't end until quintuple-digit Hz (in the theoretical 360-degree retina-VR situation). Quintuple digits -- yes, a five-digit refresh rate -- is the endpoint of the diminishing-returns curve (according to VR scientists).

    It is mathematically determined by a human's maximum accurate eye-tracking speed. A head-turn on a theoretical future 10K-wide retina 180-degree VR headset can be a 20,000 pixels/second pan that moves only one screen-width per half second; at (20,000 pps / 10,000 Hz) that is still 2 pixels of display motion blur on 10,000Hz sample-and-hold -- a very faint, subtle defocus effect. So the diminishing-returns curve is a long, long, long curve, and we have to go geometrically up it for the Holodeck Turing Test.
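    Restating the quoted Blur Busters Law as a formula (my notation), the 2-pixel figure above drops right out:

      $\text{blur (px)} = \text{speed (px/s)} \times \text{persistence (s)}$
      e.g. $1000 \times 0.001 = 1$ px (the Law's baseline); $20{,}000 \times 1/10{,}000 = 2$ px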

    The theoretical Holodeck Turing Test (tricking a person into virtual reality without them knowing they are in virtual reality) obviously requires a wraparound retina display of some quite insane refresh rate -- in addition to solving the other problems preventing a reality match.

    Quite unobtainium, but a mere 1000fps Ultra HFR at 1000Hz is "good enough" for the majority of the reality-match effect, for all but ultra-fast motion on wraparound displays.

    For these reasons -- the geometric necessity -- we are going to have a full century of display progress, simply because we have to sharply climb the curve. In 50 years, that cheap Walmart display might have the 1000Hz feature for free. Who knows? It's easy to dismiss 1000Hz, and it's fine to do so -- it may not be relevant until the 2030s -- but 4K leapt on us faster than some of us expected! Now even our pocket phones can record 4K.

    As explained earlier, camera blur and display blur are additive. And since we exclude flickering displays from the five-sigma goal (reality has no flicker, strobing, etc.), this leaves sample-and-hold displays -- and reducing blur on those requires higher refresh rates (doubling the refresh rate halves display motion blur)...

    360-degree shutter material on sample-and-hold displays:

    60fps@60Hz = 1/60sec camera shutter + 1/60sec display persistence = 1/30sec final motion blur
    120fps@120Hz = 1/120sec camera shutter + 1/120sec display persistence = 1/60sec final motion blur
    240fps@240Hz = 1/240sec camera shutter + 1/240sec display persistence = 1/120sec final motion blur
    480fps@480Hz = 1/480sec camera shutter + 1/480sec display persistence = 1/240sec final motion blur
    1000fps@1000Hz = 1/1000sec camera shutter + 1/1000sec display persistence = 1/500sec final motion blur
    2000fps@2000Hz = 1/2000sec camera shutter + 1/2000sec display persistence = 1/1000sec final motion blur
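    The table above is just one equation applied repeatedly (for a 360-degree shutter at framerate f on an f Hz sample-and-hold display):

      $t_{blur} = t_{shutter} + t_{persistence} = \frac{1}{f} + \frac{1}{f} = \frac{2}{f}$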

    You gotta go geometrically up the diminishing-points-of-return curve.
    As we already know, 360-degree shutters have the least possible stroboscopic effect (e.g. they fix wagonwheel effects). Making 360-degree shutter material as blurless as possible requires ultra-high framerates, too.

    The name of the game is to get the lowest possible final motion blur on a completely flicker-free display, while simultaneously avoiding all stroboscopic effects. This is the five-sigma magic recipe for the Holodeck Turing Test, and for other "looking out a window" reality tests (for fast-motion material).

    Ultra HFR on ultra-high Hz display successfully achieves this:
    Blurless sample-and-hold combined with blurless 360-degree shutter.

    Needless to say, the mathematics meant the refresh rates needed to achieve this perfect match-reality effect were not technologically obtainable -- until today.
    Now that this is finally testable, we can confirm the improved reality effect.

    So if you read the news and see "120fps HFR! Can human eyes see that? It's useless!" -- dismiss that hogwash. Headlines like those are now NATIONAL ENQUIRER eyerolls to those of us in the know. One needs to go really sharply up the geometric curve to partially compensate for the diminishing-return effects.

    Some people are sensitive to flicker or PWM, some to blur, some to stroboscopic effects, and some to vertigo-disconnect problems. Some people cannot use low-persistence VR because of flicker. But low persistence without flicker is achieved via ultra-high Hz with ultra-high framerates. Throwing brute Hertz at the problem definitely hits multiple birds with one stone (even if it doesn't solve the vertigo-disconnect problems -- but that's something VR/rides/etc. can address). Trying to five-sigma all of this is hugely helped by the inclusion of Ultra HFR.

    Indeed, all of this is preliminary observation, but brute-forcing away camera blur, display blur, and stroboscopics -- all three simultaneously -- via sheer Hz really does help the "it looks like reality" effect a lot, for a larger percentage of people. (Now we have to figure out how to cram in contrast ratio and HDR on top of 1000Hz+... currently it is not possible to keep HDR with Ultra HFR -- all engineering problems that many of us need to communicate to display manufacturers about; the scientific display makers don't talk to the mainstream display makers much.)

    As you may have noticed from the recent Blur Busters articles (the #1 site on the Internet for "Better Than 60Hz" displays), I will be publicizing a lot more about this in the coming months and years.

    -------------------

    We're different nuts and bolts in the display-improvement chain. Metaphorically, the people who make rocket nozzles and fuel piping don't necessarily understand the ARM assembly language that goes into a rocket-engine control unit. Similarly, I have educated a few display engineers on low persistence and the art of reducing display motion blur, even though I don't know a thing about operating panel-fab machines or designing chip ICs. I still remember witnessing the days when broadcasting 1080i digitally for the first time required a big truck, lots of electricity, and big rackmount equipment -- containing six separate 480p-strength MPEG2 encoders, multiplexed to achieve a 19Mbps ATSC 1080i stream for a live HD broadcast in the late 90s (8K was a distant dream then). Today, iPhones do 4K H.265 realtime in pocket size. Technological progress.

    I have had contracts with gaming-monitor display manufacturers and even the original Oculus Kickstarter. It's surprising how many brilliant star display engineers know little about what I'm writing in this thread, since it was not a priority (until recently).

    Now I am trying to move the awareness needle for camera manufacturers, raising awareness on the source side of the equation. It will take years of advocacy, though!
    Last edited by Mark Rejhon; 02-06-2018 at 10:01 AM.

  9. #19  
    Senior Member Joel Arvidsson's Avatar
    Join Date
    Nov 2008
    Location
    Sweden
    Posts
    2,900
    24fps is great as long as the camera is static or moves very slowly! But I hope there will be more than Peter Jackson to keep experimenting with HFR for cinema. I hope to see variable frame rate in movies some day, where you can have 24p for the slow-paced parts and ramp up when things start to move quicker. I think we're going to need a good HFR format also, because as cameras get smaller, people tend to move them more and faster.
    Epic #06696
    Epic-W #004069

  10. #20  
    Quote Originally Posted by Joel Arvidsson View Post
    24fps is great as long as the camera is static or moves very slowly! But I hope there will be more than Peter Jackson to keep experimenting with HFR for cinema. I hope to see variable frame rate in movies some day, where you can have 24p for the slow-paced parts and ramp up when things start to move quicker. I think we're going to need a good HFR format also, because as cameras get smaller, people tend to move them more and faster.
    Yes, motion speed plays a massive role. (Most here are already familiar with that!)
    The faster the motion, the more blur appears from a long camera shutter (source-side blur) and from high display persistence (destination-side blur).

    We see 4K demo material in Best Buy that is static or slow-moving. For that, obviously, HFR doesn't matter for blur.

    But as soon as things start panning or moving fast enough, even 120fps HFR becomes a blurry soup. For example, you can't read the TestUFO Panning Map at 960 pixels/second on a 120Hz sample-and-hold LCD or OLED (one that doesn't use any strobing/BFI/flicker technique). You can only begin to read the map labels if you (A) add strobing/flicker (2ms frame-visibility time) or instead (B) use a 480Hz sample-and-hold display (also about 2ms frame-visibility time). The latter is superior in that there's no flicker involved. Strobing/BFI helps blur a ton, but it is simply a band-aid -- a good band-aid nonetheless, but real life doesn't strobe/BFI/flicker.
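    Worked numbers for that panning example (via the additive persistence math above):

      $960 \times \frac{1}{120} = 8$ px of blur (labels unreadable)
      $960 \times 0.002 \approx 2$ px of blur (labels readable)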

    With Ultra HFR (in the quadruple digits), one thing that is so impressive is that ultra-fast panning (with zero camera blur, zero flicker, and zero stroboscopics) looks fully as sharp as stationary images. Longtimers often saw sharp fast panning on CRTs in the sports-broadcast days (when fast-shutter cameras were used), but with a fast camera shutter and short display persistence (from the black periods between short flickers) you got flicker, stroboscopic stepping, and wagonwheel effects. Yes, you can whack-a-mole (fix stroboscopics by adding blur, e.g. a 360-degree shutter), but that does not solve both simultaneously the way Ultra HFR can. Imagine avoiding added camera blur, added display blur, and wagonwheel effects, while still having CRT-clarity fast panning as perfectly sharp as static images -- that matches real life more closely. This is what Ultra HFR achieves (realtime 1000fps on 1000Hz), so it has various useful future applications as described in earlier posts.

    Easier Interim Technological Steps for Blur-Free Motion

    IMHO, on an interim basis for the HFR-needing applications of the nearer future, display-strobed 120fps HFR and display-strobed 240fps HFR are easier technologies as stepping stones (low-persistence impulsed displays above the average human flicker threshold). This doesn't fully fix stroboscopic effects, but it does bring CRT clarity to digital HFR without bothersome 48Hz or 60Hz flicker.

    Strobing/Flicker Is a Humankind Band-Aid for Fixing Motion Blur

    They're good band-aids, but they will never pass the Holodeck Turing Test.
    Real life doesn't strobe.

    Non-360-degree shutters are a humankind band aid (in a century-of-progress perspective).
    Non-sample-hold displays are a humankind band aid (in a century-of-progress perspective).

    Ultimately, strobing is only a humankind stopgap towards the true Ultra HFR (1000fps+ 360-degree shutter on 1000Hz+ strobeless low-persistence sample-and-hold displays) of future decades. That said, it helps for the toolmakers (camera makers, display makers) to properly understand the journey too -- understanding how to get the best temporal resolution out of material, with minimal side effects, for various kinds of applications. Real life has no shutter, flicker, framerate, or BFI, so ultimately you need ultra-high framerates, flicker-free and strobe-free.

    Ultra HFR and VFR file formats are already here today: .MP4 files

    Existing distribution file formats, such as H.264 in an .MP4 container, are fully capable of Ultra HFR and VFR video.
    It's possible to create an .MP4 file that plays back 1000fps on realtime 1000fps displays.
    It's possible to create an .MP4 file that has a 24p -> 120p -> 24p transition.
    (Sometimes you need bleeding-edge tools such as the 'ffmpeg' command-line utility, since editors & mainstream software don't properly understand/handle these outlier .MP4 files. A rough sketch follows below.)
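    A rough sketch of building a 24p -> 120p -> 24p test file with ffmpeg (hypothetical file names; assumes all segments share identical codec settings, so the concat demuxer can join them without re-encoding while each segment keeps its own frame timing):

      # encode three test segments at different frame rates with identical codec settings
      ffmpeg -f lavfi -i testsrc2=rate=24:duration=5 -c:v libx264 -pix_fmt yuv420p seg1_24p.mp4
      ffmpeg -f lavfi -i testsrc2=rate=120:duration=5 -c:v libx264 -pix_fmt yuv420p seg2_120p.mp4
      ffmpeg -f lavfi -i testsrc2=rate=24:duration=5 -c:v libx264 -pix_fmt yuv420p seg3_24p.mp4
      # list.txt: three lines -- file 'seg1_24p.mp4' / file 'seg2_120p.mp4' / file 'seg3_24p.mp4'
      ffmpeg -f concat -safe 0 -i list.txt -c copy vfr_24_120_24.mp4
      # confirm the variable frame timing by inspecting per-frame timestamps
      ffprobe -v error -select_streams v:0 -show_entries frame=pts_time -of csv vfr_24_120_24.mp4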

    True VRR displays also play .mp4 files that have 24p->120p->24p transitions seamlessly, without a mode change, using players such as SMPlayer that support variable-frame-rate .MP4 files (not all players understand them). We've successfully tested variable-frame-rate .MP4 files: they work fine as long as the player supports them accurately (sub-millisecond framebuffer-flip accuracy, preferably best-effort microsecond accuracy, to avoid the beat-frequency/granularity/roundoff errors that create microstutter) and the display is variable refresh rate. All frame rates within the display's VRR range (e.g. 30fps-240fps for a 240Hz GSYNC display), even fractional framerates, are seamlessly supported.

    Keep An Eye On VRR Over The Next Ten Years

    The new HDMI 2.1 VRR standard, while originally designed for games, also benefits VFR video -- e.g. 24p->120p->24p video -- with seamless, no-mode-change support for all possible custom frame rates within the display's VRR refresh range. HDMI 2.1 VRR is fully compatible with AMD FreeSync and VESA Adaptive-Sync (they all use the same VRR signalling technique of a variable-size VBI to vary the interval between display refresh cycles).

    For those who aren't familiar with VRR: if the video file is 88.35fps, the display automatically runs at 88.35Hz. Variable refresh rate means the display syncs to the software, not the other way around. That's a boon for 24p->120p->24p -- or, tomorrow, 24p->1000p->24p. Most of the time, a VRR display is intended for video games, since they fluctuate in frame rate (demo: www.testufo.com/vrr), but it is also applicable to seamless frame-rate changes. Imagine a future VRR display capable of 10Hz through 1000Hz (fully decimal and fractional): you could play any odd fractional frame rate you want, whether a 1920s flick at 18fps, NTSC at 59.94fps, or Ultra HFR at 500fps. That's the beauty of HDMI 2.1 VRR, VESA Adaptive-Sync, AMD FreeSync, NVIDIA G-SYNC, and all the variable-refresh-rate standards (Apple iPad ProMotion is one too, albeit slightly more limited).

    Eventually, that might expand to encompass all displays, and make framerate-agnostic distribution practical. Want to distribute 48fps on Netflix? Done. Want to distribute a silent film at its original 18fps? Done. Want to distribute Ultra HFR at 240fps or 480fps or 500fps? Done.

    We don't know if this will be a reality in twenty years, but VRR means a display doesn't care what framerate you use -- 48fps or 60fps or 59.94fps or 88.315fps or whatever.

    In theory, one could use Ultra HFR masters, and a display would downconvert the framerate -- what happens today for resolution, tomorrow for framerate/refresh rate.

    This may not be the concept that eventually happens; none of us can really predict the future THAT far out. But it would be lovely, in twenty years, to say goodbye to arbitrary frame rates. Different approaches might occur instead (e.g. by the end of the century -- the 2100s -- we may instead be using timecoded photon cameras that record to framerateless video files, and displays/players that can play those files at any framerate you want. Who knows?). That said, the innovation of variable-refresh-rate displays has thrown an interesting new curveball lately.

    Memo to Camera And Video Format Designers: Don't Back Your New Video File Format Into a Corner!

    Yes, it's quite an outlier concern, but nothing is so infuriating as a mastering file format that only defines a few rigid frame rates (23.976, 24.00, 30.00, 59.94, 60.00, etc.).
    Quite useless for Ultra HFR experiments.

    If a camera maker wants to create a new frame-timestamping format, be aware of the new Oculus Flicks unit (a 705.6 MHz clock). Many online H.264 streams are based on a 90kHz clock, as specified in the RFC 6184 industry standard. While workable for Ultra HFR, a more flexible timestamp accuracy than a 90kHz clock is preferred -- whether floating-point timestamps, microsecond-accurate timestamps, or numerator/denominator timestamps (e.g. 60000/1001 to generate 59.94). If you are using integer clock timestamps, please use something more fine-grained than RFC 6184, such as the new open 705.6MHz Flicks standard -- and make sure that frame intervals can vary atomically (each frame with its own unique frame interval). This maximizes futureproofing for Ultra HFR and for VFR/VRR video. They are outlier applications, but at least the choice will exist for future Ultra HFR/VRR/VFR applications.
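    To illustrate why the 705.6MHz clock is attractive: one flick is 1/705,600,000 sec, so common frame intervals come out as exact integers (my arithmetic):

      $705{,}600{,}000 / 24 = 29{,}400{,}000$ flicks
      $705{,}600{,}000 / 1000 = 705{,}600$ flicks
      $705{,}600{,}000 \times \frac{1001}{60000} = 11{,}771{,}760$ flicks (one 59.94fps frame)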

    (Ha, I almost feel like someone from the 1980s warning people about Y2K. Sure, Ultra HFR is not remotely as critical, but it's the same future-proofing point of view.)

    Yes, editing software may not yet support unusual frame rates in industry-standard file formats (Ultra HFR .MP4 files and VFR .MP4 files), but please future-proof your file formats. Such file formats already exist today -- don't be less flexible than them (HFR-wise/VFR-wise), please!
    Last edited by Mark Rejhon; 02-03-2018 at 10:48 PM.
