Dailies Workflow
A handy "how to" guide for negotiating the dailies workflow
Jan 18
Please note this article references a PDF download which can be found at the bottom of the article.
At its essence, the steps that take place are the following:
(WORKFLOW DOCUMENT)* > CAPTURE > BACK UP & STORAGE > QC > SYNC > COLOUR > TRANSCODE > SEND
The workflow document summarises and encompasses all the steps that will be taken. As such, all steps need to be worked out before any principal photography has commenced. However, it is also important to see the document as organic; something that can change throughout the first few days or weeks of production. Having the general route of the workflow agreed gives a standard and a common language to which camera, sound, DIT and Lab should adhere, and allows clear communication to and from production, post and, importantly, your insurance. A good workflow document is a PPS's best friend. We have placed an example workflow document at the footer of this article. It is by no means the only way to write one, but it shows all the steps that need to be covered and the level of detail that should be gone into.
The dailies workflow is, as mentioned above, dictated by the workflow document, so let us from this point assume that the document has been written, and see this article as a confluence of the actual steps taken on set / in the lab and the steps that need to be taken beforehand. It would also be remiss of us not to mention that there are of course variations to workflows (e.g. live grade colour notes done during capture, different colour pipelines, or no transcoding needed because the camera produces OCFs that the NLE software can read natively and no proxies are wanted), but starting with the system above > CAPTURE > BACK UP & STORAGE > QC > SYNC > COLOUR > TRANSCODE > SEND | is a great place from which to build when creating your workflow.
Capture: be that sound, light, metadata or timecode.
Before you turn over you need to consider the workflow implications of choosing a specific camera or sound recorder, and in turn the capture codecs / capture formats. Each codec and camera setup stipulates a specific workflow that must then be followed to achieve the best quality sound and picture that can be delivered to post. There are a lot of cameras that shoot in a lot of different ways! To help in choosing or understanding camera choice (as often this is not a choice the PPS gets to make), we think the best way for PPSs to narrow the options down is to work backwards through the workflow - or rather, to go to the end to learn how to begin. As a Post Production Supervisor you are always looking at the bigger picture, the whole workflow from capture to post to delivery: capture / editorial / VFX / colour / delivery and archive, each probably demanding unique codecs and a minimum or maximum of data. By going straight to delivery you know the absolute minimum requirements that camera choice must meet.
Example:
Your main output will be a 4K UHD HDR delivery. Immediately you know you need to be shooting with a camera that can record at least UHD with a colour depth of at least 10 bit; you also know that you will be working in a wide colour gamut (WCG), wider than Rec.709, and that the camera will have to have a dynamic range of at least 13 stops. You will also know that if there is push-back on minimum camera specs (say the camera only shoots 3.2K) you can manage expectations throughout the pipeline and foresee any extra work that must be done, and any extra cost, because of this. So now we have our target specs, and this informs our camera choice. Please see links to example cameras and their manufacturer pages; this is not by any means a definitive list, and other makes and models are available:
https://www.red.com/dsmc2
https://www.arri.com/en/camera-systems/cameras
https://pro.sony/ue_US/products/digital-cinema-cameras
https://pro-av.panasonic.net/en/products/cinema_cameras.html
https://www.blackmagicdesign.com/uk/products
Next, then, is which codec! There is a multitude of differing codecs, but one easy route to choosing is, as above, to go to the end to answer the start: find out what your master deliverable needs to be, and this tells you that you do not want to drop below that level of quality when shooting.
Example: Your main deliverable is 4K UHD HDR and you know that this is to be a ProRes 4444 QuickTime. You now know that if any camera is shooting in a codec with greater compression and lesser colour bit depth, expectations can be managed throughout the workflow; you also know the minimum codec that should be coming out of your cameras.
Below is a little breakdown on codecs; please jump to the next section, 'Back up and Storage', if you do not want a more in-depth explanation of codecs and wrappers.
Codecs & Wrappers: One first needs to understand the difference between container (wrapper) and codec, as they are often confused to mean the same thing, or referred to collectively as 'format'. Codecs are one thing and wrappers are another; the word 'format' is a semantic exercise!
Codecs have formats and containers (wrappers) have formats. In the same way a photograph has dimension and a box has dimension. A .mts file has a format, a .mov file has a format. Both are video containers but they are formatted to different parameters.
A container file (wrapper) is what contains the video / audio / metadata codec file. It is there as a bridge so that more generic software / hardware can play the video / audio codec files, and it allows multiple streams of data to be embedded into a single file. “Since audio and video streams can be coded and decoded with many different algorithms, a container format may be used to provide a single file format to the user”. ( https://en.wikipedia.org/wiki/Container_format_(computing) ). So the container file format holds the video/audio/metadata codec format. A sweet in a wrapper. Examples of wrappers would be .MXF, .MOV, .QT, .WMV etc.
So, to understand codecs: codecs, within a film and television setting, are set rules applied to data streams. They tell your computers and electronic equipment (in our case cameras, sound recorders and computers) how to handle your media. The term codec is actually a portmanteau of coder-decoder: the coder encodes the data stream and compresses it, whilst the decoder works in reverse, unpacking the stream for playback. We should always remember that there is no one codec that fits every production; each production has multiple varying factors that have implications for what codec is or can be shot and used throughout the pipeline: time, money, the elements, whether VFX are involved, location - the list goes on and on.
Once a shooting codec has been decided upon, you are quickly able to see whether any transcoding will need to be done throughout the pipeline, as well as the cost implications of that decision. As a PPS you have so many areas to look at and considerations to make that time is not always on your side, and the steps are often not seen until you are upon them. But as long as you remember that the better the quality of the codec - the closer to uncompressed video you go - the greater the amount of data needed, you will be safer in managing your budget.
Back up & storage:
The first stage is to choose hard drives appropriate to your task. There are multiple options and multiple cost factors. Let the choice of shooting spec and the actual use dictate what hard drives should be bought. This article won't go into a breakdown of every hard drive type, make and model there is, but there are factors you should consider when choosing.
SIZE
How much footage will you be shooting at the specific shooting codec, frame rate and resolution? There are multiple sites that will help you calculate the amount of space needed (often on camera manufacturer sites themselves, such as Arri's tools page: https://www.arri.com/en/learn-help/learn-help-camera-system/tools ).
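If you want a quick sanity check before reaching for a manufacturer's calculator, the arithmetic is simple. The sketch below assumes uncompressed RGB frames, so it is an upper bound; any real codec will compress well below this, and the exact figure must come from a tool like Arri's.

```python
# Rough uncompressed data-rate estimate. Treat this as an upper bound only;
# use the camera manufacturer's calculator for real per-codec figures.

def shoot_day_gigabytes(width, height, bit_depth, fps, hours, copies=3):
    """Estimate storage for one shoot day, uncompressed RGB, decimal GB."""
    bytes_per_frame = width * height * 3 * bit_depth / 8   # 3 colour channels
    bytes_per_day = bytes_per_frame * fps * 3600 * hours
    return bytes_per_day * copies / 1e9

# UHD, 10-bit, 25 fps, 2 hours of recorded footage, 3 backup copies
print(round(shoot_day_gigabytes(3840, 2160, 10, 25, 2)))  # → 16796
```

Even this crude ceiling is useful when arguing for drive budgets: three copies of two hours of uncompressed UHD is already in the tens of terabytes.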
DRIVE SPEED
Data rates (read/write speeds) vary massively from drive to drive. You need to consider your end use here. Are you planning to use the drives to cut from? If so, the quicker the better. Do you just want to back up, then at a later date transfer the data to servers to cut from? If so, a slower drive may suffice. Generally speed is tied to price - the quicker the drive, the greater the cost - though the cost does plateau after a point. There are four 'speed sets' to remember, with examples:
- Bus-powered single HDD (Hard Disk Drive), non-enterprise (such as https://www.lacie.com/gb/en/products/rugged/ )
- Mains-powered single HDD, enterprise class (such as https://www.lacie.com/gb/en/products/d2/ )
- RAID set of HDDs, enterprise class (such as https://shop.westerndigital.com/en-gb/products/external-drives/g-technology-g-speed-shuttle-xl-thunderbolt-3-hdd#0G05850 )
- SSD (Solid State Drive) - there are varying speeds involved with these drives, but generally they are quicker than individual HDDs (such as https://datastores.co.uk/product/lacie-rugged-ssd-pro-with-thunderbolt-3-1tb-2tb/ )
CONNECTION & FORMAT
Let us first get format out of the way quickly. Modern Macintosh systems work on the APFS file system, though you will also commonly see the older Mac OS Extended (HFS+) partitioning option. PC-based hardware most commonly works off the NTFS file system. Linux uses ext2, ext3 & ext4 (ext4 being the most common). For any cross-platform workflow, i.e. Mac to PC or PC to Linux etc., exFAT-formatted drives would be the option to choose. It is important to format the drives in the correct fashion at the start of the job, as you will be stuck later down the line if you have a Mac-formatted drive when wanting to work on a bank of PCs for your online! (Though there are of course always solutions.)
Then how you physically connect to your workstations is important. Know your workstations throughout the pipeline so no confusion occurs: plenty of connection types have become obsolete, plenty of drives won't physically connect to some computers, and some HDDs or SSDs won't work with certain workstations - they connect fine but won't mount. The standard today, as of October 2020, is USB-C and Thunderbolt 3. There is a common belief that because you have a Thunderbolt 3 or USB-C connection your files will copy across quicker - this is not necessarily true. The drive you are using has to be fast enough to take advantage of the connection, as does the computer you are connecting to - it has to have ports that match the spec of the connecting cable. See your connection type - be that USB, FireWire, Thunderbolt, SAS or eSATA - as a pipe that can hold a certain flow; it is your drive speed that dictates how quickly data is actually released or taken in. Once you have taken into account these three factors and the subsequent questions that arise, actual use and cost can be correctly weighed and the drive type/make/model decided upon.
Filing format
We can then look to agreeing upon a file system. There is no one way to do this; post houses and editorial teams each have their own way of approaching folder structure. We find that a clear and short structure is the best system. Please see the example below, where 'ABC' is the show's working code.
ABC >
SEASON_2 >
TESTS>
ABC_S2_DDMMYY_SDXXX_AM/PM>
ABC_S2_DDMMYY_TDXXX_AM/PM>
>OCF
>CAM_#
>SOUND
>ABC2_1AA>
>REPORTS
>PROXIES
>EDITORIAL
>VIEWING_DAILIES
The actual breakdown of rolls for sound and picture can be seen at the end of the example workflow document. But the overall picture you see above is one of simplicity: no overly long folder names, and files placed as close to the root directory as possible. Simple, clean and quick.
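A structure this simple is also easy to script, which helps keep every shoot day identical. The sketch below builds a skeleton like the one above; the show code, season and subfolder names are the article's examples and should be adapted to your own workflow document.

```python
# Minimal sketch: create one shoot day's folder skeleton.
# Folder names follow the example structure above; adapt to your show.
from pathlib import Path
import tempfile

def make_day_folders(root, show="ABC", season="SEASON_2",
                     day="ABC_S2_DDMMYY_SD001_AM"):
    base = Path(root) / show / season / day
    for sub in ["OCF/CAM_A", "SOUND", "REPORTS",
                "PROXIES/EDITORIAL", "PROXIES/VIEWING_DAILIES"]:
        (base / sub).mkdir(parents=True, exist_ok=True)
    return base

day = make_day_folders(tempfile.mkdtemp())   # demo in a temp directory
print(sorted(p.name for p in day.iterdir()))  # → ['OCF', 'PROXIES', 'REPORTS', 'SOUND']
```

Scripting the skeleton means the DIT and the lab can never disagree about where a roll lands on day 14.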
Backups
Now folder structure has been agreed upon, how files will be copied can be decided.
First, the number of copies: always make sure you have a minimum of three copies (for low-budget / commercial work, two copies seems to be commonplace). This helps negate any issues that may occur - fire, flood, theft or more - and it should also keep your insurer happy. The copies that are made should always be split at the end of the day, and should not be housed under the same roof. One should also consider placing LTO archival into the chain here, as these tapes can act as a backup for you as well. Currently the two most common versions of LTO are LTO-6 & LTO-7.
Once you have decided on the number of backups you can then purchase the drives and look next to the backup software your data wrangler or DIT will be using.
Backup Software: This, you will find, is one of the insurer's red lines (if they or you ask). Backups must be done with checksum transfer software using MD5, xxHash or MHL (*please find a fantastic article on MHL at https://mediahashlist.org ). There are multiple softwares to choose from, of which the most common are (in no specific order):
Yoyotta https://yoyotta.com
Silverstack https://pomfort.com/silverstack/
Hedge https://hedge.video
Shotput Pro https://www.imagineproducts.com/product/shotput-pro
Any copies you ever do, be that for sound or picture, should always be done using one of these softwares, and always with MD5, xxHash or MHL checksum verification. Doing this means:
- You have assurance (and can assure your insurer) that the footage has been copied fully from the card to the destination drives.
- Because logging is done, you can pinpoint any errors that may have occurred, whether technical or operator fault.
- You have documentation that helps with archival of the project.
- You have a quicker method of writing to multiple drives (though the checksumming adds additional time to the transfer).
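At its core, what all of these tools automate is the same verify-after-copy step. A minimal sketch of that step, using Python's standard library (the real tools add logging, multi-destination writes and MHL report generation on top):

```python
# Sketch of a checksum-verified copy: hash the source, copy it, hash the
# destination, and only treat the card as 'cleared' when the hashes match.
import hashlib
import shutil

def md5_of(path, chunk=1024 * 1024):
    """MD5 of a file, read in chunks so large OCF files don't fill RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verified_copy(src, dst):
    src_hash = md5_of(src)
    shutil.copy2(src, dst)            # copy2 preserves timestamps
    if md5_of(dst) != src_hash:
        raise IOError(f"Checksum mismatch copying {src} -> {dst}")
    return src_hash                   # record this in your MD5/MHL report
```

The returned hash is what ends up in the MD5 / MHL report files described later - the proof of copy your insurer will want to see.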
Now that we have drives, folder structure and backup software in place, we can start backing up the footage. On transfer completion of each roll, transfer logs / reports should be generated and stored (we will come to reporting shortly). Once the technician is happy that the transfer of the roll has gone correctly, they can start to QC (Quality Control) that roll.
QC (Quality Control)
We are of the opinion there are two forms of QC that can be done on set / in the lab: a light version and a full version. This is dictated by the environment in which the offload workstation is situated; a DIT outside, or on a set with polluting lighting and distractions, can only QC at a far lower level than a lab technician in a controlled environment with a large grading OLED and no distractions. This is often why shows have an on-set DIT and an off-set lab: a copy can be made on set and a quick QC done, then the rushes are sent to the lab where a thorough QC can be done.
Let us for argument's sake assume that we have a lab. The QC process should be multilayered: comparing file sizes, file counts and frame counts, as well as painstakingly watching each frame via playback software and noting any issues (ideally entered as metadata). There are varying ways to approach QC, and often larger studios will dictate how it must be done. Most commonly, lab technicians will scan through the footage noting any issues, then flag and inform all who need to know. (From our experience the best lab ops will always report issues to the absolute minimum number of people, reducing the worry that inevitably comes - keep panic down.) After informing on the issue, the technician reports it by email and lists it in their daily report. But some studios now require the lab to do a full QC with metadata annotations on any issues that may have occurred, using a coding system they have devised; please see here a link to Netflix's QC code glossary as of 28/9/2020: https://help.prodicle.com/hc/en-us/articles/360038427533-Netflix-Originals-Branded-Content-QC-Error-Code-Glossary-V2-5. The steps taken by the lab tech are the same, in that they will inform those who need to know, create a unique email for the issue and note it in the daily reports, but they will also note the issue (as a code) in the clip's metadata, using the clip's frame count or timecode to identify the start point of the issue, along with a short description of its location. This means that from this point forward in the chain, everyone can be aware of the issue.
This really is a useful tool for you as PPSs: immediately knowing of any extra work that might be needed informs your future costings, keeps your team throughout the pipeline apprised of upcoming issues, and in turn gives the piece a better pass rate when it comes to the master QC. After the technician has QC'd the footage and noted any and all issues, other metadata inputs can be done at this stage for ALE creation, to help your editorial team.
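The value of coded QC notes is that they are structured data rather than prose in an email, so they can travel down the pipeline. A small sketch of what that looks like in practice; the clip name, error code and filename here are invented placeholders, not a studio's real coding system.

```python
# Hypothetical sketch of logging QC issues as structured rows so they can
# be exported (CSV here; ALE metadata in practice). Codes are placeholders.
import csv
import os
import tempfile

def log_qc_issue(rows, clip, tc_in, code, note):
    rows.append({"clip": clip, "tc_in": tc_in, "code": code, "note": note})
    return rows

issues = []
log_qc_issue(issues, "ABC2_1AA_0007", "11:23:04:12", "FOC-01", "soft focus on push-in")

report = os.path.join(tempfile.mkdtemp(), "ABC_S2_SD003_QC.csv")
with open(report, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["clip", "tc_in", "code", "note"])
    writer.writeheader()
    writer.writerows(issues)
```

A row keyed on clip name and timecode-in is exactly what lets the online and master QC teams jump straight to a known flaw months later.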
We will go on to describe the reports that come under the 'QC' stage, but please remember these are generated as one of the last steps, after colour and transcodes have been produced. In general there are a number of reports the technician should present to you and your team. Below are examples, with naming conventions one might use, again using 'ABC' as the fictional show code.
- Full Quality Control (Q.C.) - All or any issues noted here on a per clip basis.
- ABC_S2_SD003_01_October_2020_DigitalQC_Report
- Clearance Report - Report to say all cards have been copied to either to your LTO or your master drive with no issue
- ABC_S2_CLEARANCE_REPORT_011020
- Day Summation. - Shorter summation of any issues found, for editorial and HODs
- ABC_S2_SUM_REPORT_SD001_011020
- LTO Archive Report / LTO Table of Contents (TOC) - (This is normally a live document, updated daily)
- ABC_S2_ARCHIVE_TOC
- Transfer ALE, CSV, PDF & MD5 - Proof of copy
- ABC_S2_ALE_SD003_011020
- ABC_S2_CSV_SD003_011020
- ABC_S2_MD5_SD003_011020
- ABC_S2_CDL_SD003_011020
- ABC_S2_MHL_SD003_011020
Now that we have QC'd the footage, we can sync, colour and transcode it.
Sound Syncing
There are two factors that will dictate whether your technician needs to sync or not: 1. Editorial & 2. Viewing Dailies.
1. Editorial. From our experience the greater proportion of shows are cut within Avid, and generally experienced edit assistants will ask for the footage not to be synced - or rather, not to be synced outside of Avid.
2. Viewing Dailies. If viewing dailies have been asked for, then syncing has to be done by the DIT / lab technician. There are two options for the technician here: manual or automatic syncing. If timecode has been locked correctly from sound to camera, then auto-syncing should be a breeze, adding little to no time to their working day. If there is an issue with the timecode, or no timecode at all, the technician must manually sync the footage, adding time to the workday. In summation: editorial prefers to import sound separately and sync within their editing software; all viewing dailies have to be synced.
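Auto-syncing works because sound and picture share locked timecode. The underlying arithmetic is just converting both start timecodes to frame counts and taking the difference - a minimal sketch, assuming non-drop-frame timecode at a fixed frame rate:

```python
# Sketch of timecode sync arithmetic (non-drop-frame assumed; drop-frame
# NTSC rates need extra handling that syncing software does for you).

def tc_to_frames(tc, fps=25):
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def sync_offset(picture_tc, sound_tc, fps=25):
    """Frames to slip the audio so it lines up with picture."""
    return tc_to_frames(picture_tc, fps) - tc_to_frames(sound_tc, fps)

# Sound roll started 5 seconds before the camera at 25 fps
print(sync_offset("10:00:05:00", "10:00:00:00"))  # → 125
```

When this offset is garbage - because someone forgot to re-jam timecode after lunch - is exactly when the technician falls back to manual syncing against the clap.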
Colour
After we have synced we can start to apply colour to the image.
We have misled you here slightly, as often the colour is applied (in a non-destructive manner, not actually burning information into the files) during capture of the image - via software like Pomfort's LiveGrade, applying CDLs (colour decision lists) to each clip. Or an already decided upon LUT (lookup table) might have been used throughout filming. But it is at this point that we look to bake it into the proxy files we produce.
It is here we either import the generated CDLs or LUT from set and apply them to the image, or colour from scratch. The colour pipeline is a separate discussion and merits its own article, so we won't go into great detail here, other than to say that the colour pipeline must be decided upon before starting, and it does involve set all the way through to post, with some production companies stipulating an absolute minimum requirement. A good starting point for a PPS is to talk to your post house (your colourist or head of post) & DIT, as well as the studio's lead tech if shooting for a larger production.
Once the colour has been applied, the colour information needs to be passed down the pipeline along with all the other metadata that may have been applied (such as scene, slate, take etc.). An export of this information is done here - via an ALE & CDL export or a LUT export - and can then pass to everyone from editorial through to post and VFX, so all know what colour values have been applied throughout the process.
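It helps to demystify what that CDL export actually carries: a CDL is just ten numbers per clip - slope, offset and power for each of R, G and B, plus a saturation value. A rough sketch of laying one clip's values out in the ASC CDL XML form (the clip id and grade values are invented examples; take real values from your grading tool's export and verify the layout against it):

```python
# Sketch of an ASC CDL XML export for a single clip. Values are invented
# examples; real CDLs come out of the grading / live-grade software.

def cdl_xml(clip_id, slope, offset, power, sat):
    fmt = lambda triple: " ".join(f"{v:.4f}" for v in triple)
    return f"""<ColorDecisionList xmlns="urn:ASC:CDL:v1.01">
  <ColorDecision>
    <ColorCorrection id="{clip_id}">
      <SOPNode>
        <Slope>{fmt(slope)}</Slope>
        <Offset>{fmt(offset)}</Offset>
        <Power>{fmt(power)}</Power>
      </SOPNode>
      <SatNode><Saturation>{sat:.4f}</Saturation></SatNode>
    </ColorCorrection>
  </ColorDecision>
</ColorDecisionList>"""

print(cdl_xml("ABC2_1AA_0001", (1.02, 1.0, 0.98), (0.0, 0.0, 0.01),
              (1.0, 1.0, 1.0), 0.95))
```

Because it is only ten numbers, a CDL travels easily through ALEs and emails in a way a full grade never could - which is precisely why it is the on-set colour interchange of choice.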
Transcode
Now with colour set and sound synced, transcoding can commence. The DIT/Lab Tech should have collated (and placed into the workflow document) all the information needed for the specs of the proxies produced, be that file specs, framing, burn Ins etc. There are usually two flavours of proxies that are produced: 1. Editorial and 2. Viewing Dailies
1. Editorial – The file type will be dictated by whatever software the edit team is using.
AVID – Will use DNx encoded MXF wrapped files – there are two types of DNx – DNxHD & DNxHR and multiple flavours of the encoding version such as DNxHD 36 or DNxHR LB.
Adobe Premiere – There is no specific codec linked to Premiere - this is its selling point, the ability to work with any file type at native resolution - however your editorial team may ask for ProRes-encoded, QuickTime-wrapped files.
Final Cut Pro – Will use ProRes encoded, QuickTime wrapped files such as ProRes 422LT.
2. Viewing Dailies – The file type created will be dictated either by the viewing platform you are posting the clips to (such as PIX, MediaSilo, Frame.io etc.) or, if you are not hosting the dailies on a platform, by the devices on which the director, DOP and producers will watch the rushes. Usually this is a 720p or 1080p compressed H.264-encoded file within either a QuickTime or MPEG container.
Again if you look to the workflow document you will find two examples for the editorial proxies and the viewing dailies.
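In practice the viewing-dailies transcode is usually an ffmpeg job (whether run directly or under the hood of a dailies tool). A sketch of building such a command, assuming ffmpeg is installed; the bitrate and scale values here are illustrative only - the real spec belongs in your workflow document:

```python
# Sketch: build an H.264 viewing-dailies transcode command for ffmpeg.
# Bitrate/height are illustrative; take the real spec from the workflow doc.

def dailies_cmd(src, dst, height=1080, vbitrate="10M"):
    return ["ffmpeg", "-i", src,
            "-vf", f"scale=-2:{height}",   # keep aspect ratio, even width
            "-c:v", "libx264", "-b:v", vbitrate,
            "-c:a", "aac",
            dst]

print(" ".join(dailies_cmd("ABC2_1AA_0001.mov", "ABC2_1AA_0001_DAILIES.mov")))
```

Expressing the transcode as a function like this is also how labs batch a whole day's rushes: one loop over the QC'd clips, one command per proxy.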
Send
Once the transcodes have been completed the files need to be sent to editorial and, if you are using a screening dailies platform, there too.
Editorial files
These files will be put together into a clear day's package for the editor to unpack each day. An example would look like the following:
EDITORIAL_UPLOAD_ABC_SD003_011020>
>ALE
>MXF
>MASTER_AUDIO
>REPORTS
Coupled with this package you would send an email detailing the package contents: the number of files contained within each folder, the size of each folder and the total size of the package. If it is being uploaded you would also give an estimated upload time, and again reiterate anything you want to flag.
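Those per-folder counts and sizes are easy to generate rather than tally by hand. A sketch that walks a day's package and summarises each subfolder (the demo builds a fake package in a temp directory; the folder names follow the example above):

```python
# Sketch: summarise a day's editorial package (file count + bytes per folder).
from pathlib import Path
import tempfile

def package_summary(root):
    summary = {}
    for sub in sorted(Path(root).iterdir()):
        if sub.is_dir():
            files = [f for f in sub.rglob("*") if f.is_file()]
            summary[sub.name] = (len(files), sum(f.stat().st_size for f in files))
    return summary

# Demo: fake a day's package with 1 KB placeholder files
demo = Path(tempfile.mkdtemp()) / "EDITORIAL_UPLOAD_ABC_SD003_011020"
for sub, n in [("ALE", 1), ("MXF", 3), ("MASTER_AUDIO", 2), ("REPORTS", 2)]:
    (demo / sub).mkdir(parents=True)
    for i in range(n):
        (demo / sub / f"file_{i}").write_bytes(b"x" * 1024)

for name, (count, size) in package_summary(demo).items():
    print(f"{name}: {count} files, {size} bytes")
```

Pasting this summary into the daily email means editorial can immediately spot a short delivery - three MXFs expected, two received.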
When sending the files there are two options:
- Either the files will be offloaded to a shuttle drive and physically sent to the editorial team. This is an option if you have nominal upload / download speeds, or if editorial are located close to set or the lab.
- Or the files will be uploaded and sent via FTP (File Transfer Protocol) or a dedicated file service provider such as Aspera, Sohonet's FileRunner or Signiant's Media Shuttle. (There are multiple options available; for you as PPS, often the best initial port of call is to get pricing from your DIT / lab provider, as they may offer a good price for the service that could be encompassed in the package they deliver. Alternatively, when working with studios they will often have either their own system - look to Netflix's Content Hub - or a deal with a service provider that keeps costs to a minimum.)
Viewing Dailies platform
Once the upload has started it is always good to inform the editorial team - via email - of the ETA of the finished upload, as well as informing yourselves, the PPSs, and your team.
All these files will be sent as an upload, with specific instructions on how to do this. The PPS and their team may want to stipulate an 'unreleased' folder that the lab tech or DIT uploads the daily dailies to before distribution to the lists. This way you and your team can control what is being seen by the execs and all. Having a separate 'unreleased' folder also allows your team to do a quick QC on the dailies before sending them on their way.
Viewing Dailies no platform
Because this is a physical action - hardware needs to arrive at the DIT station / lab for the media to be loaded onto it, which must be done by the DIT or lab op - there is no need to send out emails. The hardware is handed directly to the necessary person.
There are multiple apps available for hardware such as iPads, but the simplest route is probably just dropping the files directly onto the iPad and playing them out via its player; alternatively iTunes is an easy option. You can ask your DIT / lab tech to label each clip in a specific fashion so you can easily find the clip you are looking for. Alternatively, you can ask for a single clip of all the day's rushes to be made as a time-of-day timeline; then you have one clip you can scroll through, using the script supervisor's notes to navigate.
This article was kindly contributed by Charlie Noble

We are Post Super Ltd , your ultimate destination for all things post-production management.
Copyright © 2025