This is the first part in an ongoing weekly series looking at the audio post production workflow. We have enlisted the help of top-end professionals in their respective fields to walk us through what they do and share some tips and tricks along the way.
The series is based on TV drama in the UK, which these days takes a lot from the film workflow, but much of what applies is very similar in other genres, albeit not on as big a scale.
What Is An Assistant Editor?
We start with Conor Mackey, who is an assistant editor. Don’t underestimate the impact a good and careful assistant editor can have on a project long after it has left their control. So many things can go more or less smoothly depending on the care people like Conor take at the start of a project’s journey through the post production workflow. Over to you Conor….
The assistant editor is the librarian, processor and archivist of both the raw picture and sound media (the rushes), and the paperwork from the set.
Metadata Is a Big Help
Much of my responsibility as an assistant editor is syncing the picture with the sound. This task is helped by the metadata embedded in the sound files.
Both the picture and sound files share a common ‘time of day’ timecode, so syncing becomes the simple task of highlighting both sets of media in a bin and clicking an ‘auto-sync’ button. This creates new sub-clips. Anything not synced is left highlighted.
This has to be manually double-checked because frame-accurate timecode still cannot be guaranteed between two synced, but unlinked, pieces of equipment. At best the error will be within one frame, but if one of the machines has drifted the sync could be further off.
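Conceptually, the auto-sync pass is a tolerance-based match on the shared time-of-day timecode. The sketch below (in Python, with hypothetical clip data and field names; Avid does this internally via its AutoSync feature) shows the idea: convert each clip’s start timecode to a frame count, pair picture and sound clips that land within one frame of each other, and flag anything unmatched for manual syncing against the clapperboard.

```python
# Conceptual sketch only: pairing picture and sound clips by shared
# time-of-day timecode. Clip data and field names are hypothetical;
# a real NLE performs this matching internally.

FPS = 25  # UK TV drama typically shoots at 25 fps


def tc_to_frames(tc, fps=FPS):
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f


def auto_sync(picture_clips, sound_clips, tolerance=1):
    """Pair each picture clip with the first sound clip whose start
    timecode is within `tolerance` frames. Unmatched picture clips
    are returned separately, mirroring the clips left highlighted
    in the bin for manual checking."""
    synced, unmatched = [], []
    for pic in picture_clips:
        pic_start = tc_to_frames(pic["start"])
        match = next(
            (snd for snd in sound_clips
             if abs(tc_to_frames(snd["start"]) - pic_start) <= tolerance),
            None,
        )
        if match:
            synced.append((pic["name"], match["name"]))
        else:
            unmatched.append(pic["name"])
    return synced, unmatched
```

The one-frame tolerance reflects the point made above: even well-jammed timecode generators can disagree by a frame, and anything beyond that suggests drift and needs a clap check by eye.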
The double-check is a sift through every take, finding the board and ensuring the clap sound falls on the first frame where we can see the top of the board meet the bottom.
Metadata in the sound files also helps me rename the newly synced media. Rather than hunt through the picture file looking for the slate number scrawled onto the board, the slate number in the metadata can be copied and pasted from the column originally named by the sound recordist to a backup column (Slate & Take, Clip Name; see the first image), and back to the name column once the sub-clips the editor works with have been processed.
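The column shuffle described above amounts to a simple field swap: preserve the original clip name in a backup field, then promote the sound recordist’s slate-and-take entry to the name the editor sees. A minimal Python illustration, with hypothetical field names (`slate_and_take`, `backup_name`); in practice this is done with bin columns in the Avid, not code:

```python
# Conceptual sketch of the metadata column shuffle. Field names are
# hypothetical; in Avid this is a copy-and-paste between bin columns.

def rename_from_metadata(clips):
    """For each clip, keep the original name in a backup field and
    replace the name with the sound recordist's slate/take entry."""
    for clip in clips:
        clip["backup_name"] = clip["name"]       # preserve the original
        clip["name"] = clip["slate_and_take"]    # e.g. '104/3'
    return clips
```

Because the swap is reversible via the backup field, the original clip names can be restored later if anything downstream needs them.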
Other details that can be handy to add to the metadata are track number, character, whether it is a clip mic or boom, bit rate, sample rate, shoot date, etc. The other great news is that, as this is an Avid workflow, the metadata I enter and tidy up in the Avid is passed on seamlessly to the Pro Tools sessions.
Problems With Timecode
If there is a problem with the timecode, almost all of which comes from the camera department, the individual take has to be synced by hand. Rather than getting the computer to match the timecodes, I have to go through each picture clip and find the board, then each sound clip, listening for the clap, and put an ‘in-point’ on each. Then I ‘auto-sync’ each individual clip, which is very tedious, but getting it right now makes it so much easier for everyone downstream.
Common reasons for the camera losing timecode include changing frame rates, changing cards, low batteries, changing batteries, and camera trainees getting to know (or tampering with) the camera.
Other problems I have to deal with include no end board, where the camera has been turned off before the clapper loader has a chance to mark it. In these cases, one can check sync with a footfall, a door close, or, for example, an item being put down on a table. Lip syncing can be done, but it is less accurate and more time-consuming. The synced material is given to the editor, who cuts it into a sequence.
In part 2 Conor will walk us through how he prepares the editing sequences to hand on to the Dialogue, ADR, FX, Foley, and Music editors to work their magic.