Awesome. Deanan, how do you imagine this H.264 picture will compare to the RED's current HD-SDI out? Do you see this becoming a standard for video villages, or is it something you would only use when you need wireless?
You are seriously worried about the damn Gig-E connector? My brain is melting.
RED, you get it. . . and I hope you continue to get it.
I'm thrilled at the decision to use a Lemo [or Lemo-esque] connector instead of the traditional jack. Deanan is 100% correct - a jack has no place on a camera... it'll only get trashed.
Any way to incorporate a hinge in those antennas? Or at least a breakaway section of sorts so that when we break the antenna, it's the antenna that breaks and not the actual connector in the module!
In short, it looks just like one of the articulating antennas on my wireless access point.
One question comes to mind:
The module encodes a 1080p stream. Will an iPhone, or even the iPad, be able to decode 1080p? It seems a lot of processing power will be needed for that.
Would it be possible to switch between 1080p, 720p, and SD for "weaker" viewing devices? Or even encode multiple streams with different links?
I'm pretty sure RED will find a solution for the gigantic iPhone 3G crowd, but I just wanted to know if there are any plans yet.
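To make the multi-stream idea concrete, here's a rough sketch of how a viewing app might pick the largest variant a device can decode. Everything here is an assumption for illustration (the variant table, the per-device decode ceilings, and the function names); nothing official from RED.

```python
# Hypothetical stream variants the module might offer (assumed values).
STREAM_VARIANTS = {
    "1080p": {"height": 1080, "bitrate_kbps": 8000},
    "720p":  {"height": 720,  "bitrate_kbps": 4000},
    "480p":  {"height": 480,  "bitrate_kbps": 1500},
}

# Rough decode ceilings per device class (guesses, not official specs).
DEVICE_MAX_HEIGHT = {
    "iphone_3g": 480,   # older phones: SD only
    "ipad":      720,
    "laptop":    1080,
}

def pick_variant(device: str) -> str:
    """Return the name of the largest variant the device can decode."""
    ceiling = DEVICE_MAX_HEIGHT.get(device, 480)  # unknown devices get SD
    best = None
    # Walk variants from smallest to largest, keeping the last one that fits.
    for name, v in sorted(STREAM_VARIANTS.items(),
                          key=lambda kv: kv[1]["height"]):
        if v["height"] <= ceiling:
            best = name
    return best

print(pick_variant("iphone_3g"))  # -> 480p
print(pick_variant("laptop"))     # -> 1080p
```

In practice this is basically what adaptive-streaming variant playlists do: publish several encodes and let the client choose.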
Ideally I would like a system that works a little like Adobe Story, but with the following specific features:
A system built around a centralized, script-centric (or shooting-plan-centric, for docs) database.
During pre-production & previz all shot plans, storyboards, animatics etc are hooked up to the story/script.
During shooting, the 2nd AC can link an auto-generated log of clip metadata and camera notes for what's being shot to this database on set, using an iPad, iPhone, or laptop (maybe even integrated with the movie*Slate app).
Continuity can do a similar thing with their notes. So can anyone else on set who might need to.
All notes link to both the module-generated proxies and the R3D files, using metadata to make this connection harder to break and to enable automatic reconnection as media is transferred to various destinations: dailies viewing (where the Director and others can annotate further and have those notes sync back to the central database and individual clip metadata), or edit stations.
Editor can add own notes during their initial viewings.
Based on a set of preferences (omit shots marked as no good by the 2nd AC, for example), the Editor will be able to conform the shoot onto a multicamera-style timeline according to the script and scenes, and automatically place all media into appropriately named and organised bins. Voilà: a rough cut made in seconds using a multicamera edit-switching technique.
Basically I would love to see tapeless workflows that leverage the metadata and data-centric workflow database style throughout a whole project end to end, thus eliminating much of the laborious repetitive organizational tasks that take away from our creative time.
If RED can provide the pieces of the puzzle to make this happen within their part of the workflow (on-camera WiFi enables the possibility of all this), and either allow for an API or create their own software to do this, I will be in heaven. Things like Adobe Story and scriptwriting/production software hint at this, but we need it to be edit-suite and scriptwriting-software agnostic, and able to update itself live from the cameras during shooting.
Or am I?
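Just to show the auto-conform step isn't magic, here's a rough sketch of the "omit no-good takes, bin by scene" logic. The clip records, field names, and status flag are all invented for illustration; a real system would pull them from the camera/module metadata and the script database.

```python
# Hypothetical on-set clip log (field names are made up for this sketch).
clips = [
    {"id": "A001_C001", "scene": "12", "take": 1, "status": "no_good"},
    {"id": "A001_C002", "scene": "12", "take": 2, "status": "good"},
    {"id": "A001_C003", "scene": "14", "take": 1, "status": "good"},
]

def build_bins(clips, omit_no_good=True):
    """Group clip IDs into per-scene 'bins', optionally dropping NG takes."""
    bins = {}
    for clip in clips:
        if omit_no_good and clip["status"] == "no_good":
            continue  # the 2nd AC marked this take as no good
        bins.setdefault("Scene " + clip["scene"], []).append(clip["id"])
    return bins

print(build_bins(clips))
# -> {'Scene 12': ['A001_C002'], 'Scene 14': ['A001_C003']}
```

From bins like these, laying clips onto a multicamera-style timeline in script order is a straightforward next step, which is exactly why the metadata link has to survive the media moves.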
I am wondering if these modules could be used to record to a format that can be edited in Final Cut version 5. This would allow me to use my existing non-Intel iMac for editing with the Scarlet.