An introduction to timecode

How does multicamera timecode synchronisation work?

What is timecode?

Timecode is an important form of production metadata which, if used correctly, can save productions time and money.

How does it work?

Each recorded frame is assigned a specific timecode, allowing editors to find a particular frame across multiple camera and audio sources by referencing this number. If every camera and audio device on a shoot is running timecode, the recorded media can easily be dropped into the edit timeline and automatically aligned.
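
As a rough illustration of the idea, a timecode label of the form HH:MM:SS:FF is essentially a human-readable frame index. The short sketch below assumes a 25 fps, non-drop-frame project; the function names and frame rate are illustrative, not part of any camera or Timecode Systems API. It shows how the same label always maps to the same frame number, whichever device stamped it.

```python
FPS = 25  # assumed frame rate (non-drop-frame)

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an 'HH:MM:SS:FF' label into a frame count since midnight."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int = FPS) -> str:
    """Convert a frame count back into an 'HH:MM:SS:FF' label."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# Two devices stamping the same moment write the same frame number, which is
# what lets an editor line the sources up automatically.
print(timecode_to_frames("10:24:03:12"))   # 936087
print(frames_to_timecode(936087))          # 10:24:03:12
```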

But each device has an internal clock – why can’t I use that?

The internal clocks in the various cameras and sound recording devices run at marginally different rates, causing drift, so synchronisation is gradually lost. Using an external timecode device ensures every camera and audio source is jammed to one incredibly accurate master clock, creating a robust wireless sync network.
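
To put rough numbers on that drift (the figures below are illustrative assumptions, not measurements of any particular camera), even a 10 parts-per-million difference between two internal clocks adds up to several frames over a shooting day:

```python
FPS = 25                    # assumed project frame rate
PPM_OFFSET = 10             # assumed relative clock error (parts per million)
SHOOT_SECONDS = 8 * 3600    # an eight-hour shooting day

drift_seconds = SHOOT_SECONDS * PPM_OFFSET / 1_000_000
drift_frames = drift_seconds * FPS

print(f"Drift after 8 hours: {drift_seconds * 1000:.0f} ms "
      f"(about {drift_frames:.1f} frames at {FPS} fps)")
# Drift after 8 hours: 288 ms (about 7.2 frames at 25 fps)
```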

Server, Client…what?

Most wireless timecode generator systems, including the Timecode Systems solution, use a Server and Client relationship (formerly Master and Slave) as the basis for synchronisation.

To set up your sync network…

  • Every camera and audio source needs an external timecode unit.
  • One unit should be assigned as the ‘server’ device and set to transmit timecode (or genlock if relevant).
  • All of the timecode devices on the other recording units should be set to run as clients, using the same RF channel as the server.
  • When a client device is switched on, it transmits a signal via the set RF channel.
  • The server detects the client and relays its own clock settings to the client.
  • The client sets its own clock to match the server's, jamming all sources to the same incredibly accurate clock (see the sketch after this list).
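
The sketch below restates those steps as a conceptual model. The class and method names are invented purely for illustration and do not represent the actual Timecode Systems protocol or firmware.

```python
class ServerUnit:
    """The single unit set to transmit timecode (the master clock)."""
    def __init__(self, rf_channel: int, clock_frames: int):
        self.rf_channel = rf_channel
        self.clock_frames = clock_frames

class ClientUnit:
    """A unit attached to a camera or recorder, set to client mode."""
    def __init__(self, rf_channel: int):
        self.rf_channel = rf_channel
        self.clock_frames = 0   # free-running until jammed

    def announce(self, server: ServerUnit) -> None:
        # The client transmits on its RF channel; if the server is on the
        # same channel it detects the client and relays its clock settings.
        if server.rf_channel == self.rf_channel:
            self.jam(server.clock_frames)

    def jam(self, server_clock_frames: int) -> None:
        # The client sets its own clock to match the server's.
        self.clock_frames = server_clock_frames

# One server, every other unit running as a client on the same RF channel.
server = ServerUnit(rf_channel=4, clock_frames=936_087)
clients = [ClientUnit(rf_channel=4), ClientUnit(rf_channel=4)]
for client in clients:
    client.announce(server)

assert all(c.clock_frames == server.clock_frames for c in clients)
```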

But what if a camera roams out of range of its server?

If the events of a shoot take an unexpected turn and a camera happens to roam out of the RF range of the server, the Timecode Systems unit will continue to run timecode using its own accurate internal clock, and then sync back to the server as soon as it’s back within reach of the RF signal.
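
Conceptually, that fallback looks something like the sketch below (the names and numbers are illustrative, not real firmware logic): the client free-runs on its internal clock while the link is down and re-jams the moment the server's clock is heard again.

```python
from typing import Optional

def client_tick(in_rf_range: bool, local_frames: int,
                server_frames: Optional[int]) -> int:
    """Advance the client clock by one frame interval."""
    if in_rf_range and server_frames is not None:
        return server_frames      # back in range: re-jam to the master clock
    return local_frames + 1       # out of range: free-run on the internal clock

local = 1_000
timeline = [(False, None), (False, None), (True, 1_003)]  # two frames out of range
for in_range, server_clock in timeline:
    local = client_tick(in_range, local, server_clock)

print(local)   # 1003 – the unit has quietly re-synced to the server's clock
```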

And what if the director’s plans change mid-shoot?

The system is completely flexible. If, for example, additional cameras arrive on location midway through the recording day, users can simply attach an UltraSync ONE to the camera, or an UltraSync BLUE if it's an iPhone. They select the chosen RF channel and get going, knowing the content recorded will be instantly synced against the server unit sitting with the sound department.
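
In practice, "adding a camera" amounts to one setting change. The snippet below is only a conceptual stand-in for the unit's menu options, not a real configuration format.

```python
EXISTING_RF_CHANNEL = 4            # the channel the sound department's server is on

new_unit = {
    "product": "UltraSync ONE",    # or "UltraSync BLUE" for an iPhone
    "mode": "client",
    "rf_channel": EXISTING_RF_CHANNEL,
}

print(new_unit)  # once powered on, it jams to the server as in the earlier sketch
```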

And how does this benefit my colleagues in the edit suite?

At the end of the shoot, the memory cards from the main cameras, PSC cameras, and sound mixer will all contain data files stamped with the same embedded timecode. This allows all media to be easily dropped into the edit timeline and effortlessly aligned for a swift and efficient edit.
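
The sketch below shows why that alignment is effortless: each clip's start timecode fixes its position on a common timeline, so the offsets fall out of simple arithmetic (the frame rate, clip names, and timecode values are illustrative).

```python
FPS = 25   # assumed frame rate

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Start timecodes stamped into each source's files (illustrative values).
clips = {
    "main_camera": "10:24:03:12",
    "psc_camera":  "10:24:10:00",
    "sound_mixer": "10:23:58:05",
}

# Place every clip relative to the earliest source; no manual syncing needed.
origin = min(tc_to_frames(tc) for tc in clips.values())
for name, tc in clips.items():
    offset = tc_to_frames(tc) - origin
    print(f"{name}: starts {offset} frames into the timeline")
# sound_mixer starts at 0, main_camera at 132, psc_camera at 295
```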
