Once you’ve finished shooting, the next step is postproduction. This is where you bring the raw material together to shape the story. Postproduction is completely software-based, giving you a wide range of powerful tools.
For a successful postproduction, you need the following:
- Great quality pictures and sound
- Good coverage of each scene (a range of different shot sizes and camera angles)
- Consistent exposure and colours (where possible) for each scene
In this article, I’ll introduce five different areas of postproduction. They are:
- Video editing
- Sound editing
- Colour grading
- Visual effects
- Motion graphics
When making a film, you’ll use all of them in one form or another.
We strongly associate video editing with postproduction. This is where you assemble, arrange and trim your media. You use video editing software to cut the movie together. The software is also known as a non-linear editor, or NLE for short.
During production, you usually shoot scenes out of order. When you’re at a location, it’s more efficient to shoot all the scenes for that location. Usually those scenes are not in script order. This results in a complex mix of footage. Like starting a jigsaw puzzle, you need to lay out all the pieces to make them easier to find. You need to organise the material before editing begins.
Organising the media:
- If pictures and sound are separate recordings, they’re synced so dialogue matches mouth movements.
- Organise the media in a way that makes sense. Make collections of shots from the same scene. You can also collect scenes into sequences, or scenes into acts.
- Tag clips with the words you would use when running a search. For example, tag all the shots of the character Tom with the word ‘Tom’. Other useful search terms include, ‘good take’, ‘FX shot’ and ‘replace dialogue’.
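The tagging workflow above can be sketched in code. This is a minimal Python sketch with hypothetical clip names and tags (a real NLE provides the same idea through its own interface): each clip carries a set of tags, and a search returns every clip matching a tag.

```python
# Hypothetical clip names and tags, for illustration only.
# Each clip maps to the set of search terms you tagged it with.
clips = {
    "scene12_take3.mov": {"Tom", "good take"},
    "scene12_take4.mov": {"Tom", "replace dialogue"},
    "scene07_take1.mov": {"FX shot"},
}

def search(tag):
    """Return all clip names carrying the given tag, in sorted order."""
    return sorted(name for name, tags in clips.items() if tag in tags)
```

Searching for ‘Tom’ then returns both takes from scene 12, exactly as you’d expect from tagging every shot of that character.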
Editing is Storytelling
First you tell the story in the script. Another version exists if you’ve done pre-viz (pre-visualisation). But the last time you tell the movie’s story is in the edit. You use filmmaking techniques to do this. By cutting to different shots, you provide the audience with new information, which advances the story. You show story points being set up and paid off. Good editing highlights characters making important decisions and the consequences of those decisions.
When the film cuts to a new scene, the audience subconsciously asks these questions:
- Where are we? A new scene usually means a new location.
- What time of day is it? Time might have passed since the previous scene.
- Who are these characters? They could be characters we’re already familiar with, or completely new ones.
- What are they doing? The audience is looking for story and character clues. Actions tell us who the character is and what they want.
A skilled editor understands what’s going on in the audience’s brain. Using filmmaking techniques, you can reveal or conceal important information. You can plant questions in people’s minds, build tension and create drama. If you do a good job, your work becomes invisible. The audience simply sees a great story.
Editors don’t receive the same recognition as directors or cinematographers. But here’s a guide to some influential editors and their films…
Thelma Schoonmaker has eight Academy Award nominations and three wins. She’s been the editor on Martin Scorsese’s movies since the 1980s. Check out her Academy winners: Raging Bull (1980), The Aviator (2004), The Departed (2006).
Like Schoonmaker, Michael Kahn has won the Academy Award for editing three times. He’s worked extensively with Steven Spielberg from the 1970s onwards. Watch his winning films: Raiders of the Lost Ark (1981), Schindler’s List (1993), Saving Private Ryan (1998).
Dede Allen was the editor on several classic movies of the twentieth century. Her work includes: The Hustler (1961), Bonnie and Clyde (1967) and Dog Day Afternoon (1975). She was nominated three times for the editing Oscar. Allen worked with many of the great directors, such as Arthur Penn and Sidney Lumet.
Some editors work in teams. Maryann Brandon and Mary Jo Markey cut Star Trek Into Darkness (2013) and Star Wars: The Force Awakens (2015). Kirk Baxter and Angus Wall have won two Academy Awards, for The Social Network (2010) and The Girl with the Dragon Tattoo (2011).
The next stage in postproduction is sound editing. You can work with audio in your video editing software, but dedicated sound apps have better tools.
You can think of the audio workflow as having five stages:
- The dialogue edit
- Sound effects
- Foley
- Composing or sourcing music
- The final mix
Pictures provide information and move the story forward, but sound is all about emotion. Dialogue gives us insight into a character’s feelings. The sound of approaching danger builds tension. Loud noises create scares. Silence is also very effective. It lulls us into a false sense of tranquillity before the next burst of action.
The Dialogue Edit
Although you edit dialogue with the pictures, it usually needs extra work. Refine dialogue by removing background noise and improving clarity. You can replace the occasional unclear word with careful editing. But if many lines are poorly recorded, then ADR is required. ADR stands for Automatic Dialogue Replacement. This is where actors re-record their dialogue while watching their on-screen performance.
Sound effects are crucial for making the film feel real. They help the audience believe what they see on the screen.
You can source sound effects in three different ways:
- Sound libraries. Some audio apps have built-in effects libraries. You can also buy third party sound effects.
- You can capture your own sounds. The world is full of amazing noises you can record and use as effects. For a punch sound, hit a leather jacket with a baseball bat. Scrunch 16mm film to simulate the effect of walking through tall grass. Flap an umbrella for the sound of bird wings. If you need explosions, get to know a fireworks display company.
- You can use synthesis to create weird and wonderful sounds. It’s especially useful for technical noises. Some audio software, such as Logic Pro, has built-in synthesisers.
Use the effects in your audio software to give sounds more impact and texture. You can make tonal changes with the EQ effect. It enables you to raise or lower certain frequencies. Reverb simulates different room sizes, giving sound spatial characteristics. A compressor controls the dynamic range. Dynamic range is the difference between the softest and loudest sounds in the recording. A compressor reduces the dynamic range by lowering the loudest sounds. There are many other effects for enhancing the audio or making it sound otherworldly.
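The compressor described above is easy to express in code. This is a minimal Python sketch of downward compression, assuming the audio is a list of samples between -1.0 and 1.0; the threshold and ratio defaults are illustrative, not any real plug-in’s settings.

```python
import math

def compress(samples, threshold_db=-20.0, ratio=4.0):
    """Reduce the level of samples above the threshold.

    Above the threshold, every `ratio` dB of input yields only 1 dB
    of output, narrowing the dynamic range by lowering the loudest
    sounds, as described above. Samples below the threshold pass
    through unchanged.
    """
    out = []
    for s in samples:
        # Convert the sample's level to decibels (silence clamped low).
        level_db = 20.0 * math.log10(abs(s)) if s != 0 else -120.0
        # How far the sample exceeds the threshold, in dB.
        over = max(level_db - threshold_db, 0.0)
        # Gain reduction implied by the compression ratio.
        gain_db = -over * (1.0 - 1.0 / ratio)
        out.append(s * 10.0 ** (gain_db / 20.0))
    return out
```

With a -20 dB threshold and a 4:1 ratio, a full-scale sample (1.0) comes out at roughly 0.18, while quiet samples below the threshold are untouched. In practice you’d follow this with make-up gain to bring the overall level back up.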
Foley is the performing and recording of everyday human sounds. It adds subtle realism to the pictures we’re watching. These sounds include footsteps, clothing rustle, and doors opening and closing. Foley artists create the sounds in real time, while watching the movie. To be a Foley artist, you need objects for making sounds, plenty of experience and a good understanding of how sounds are made. Like sound effects, Foley can also be sourced from a library.
The purpose of film music is to manipulate the audience’s emotions. High volume and tempo create a feeling of excitement. Slower, quieter music will calm the audience down. Sad or joyful music helps them empathise with the characters.
Musical cues are used to announce a particular character. Think of the flute flourish every time Din Djarin appears in The Mandalorian. Or the Scavenger Theme used for Rey in the recent Star Wars trilogy. In the Lord of the Rings trilogy, each location in the world had its own unique theme. We know immediately if we’re in Rohan, Gondor or one of the Elven realms.
Music gives scenes momentum and pace. Using the same music across different scenes helps to bring them together as a sequence. A change in the music signals the end of one sequence and the beginning of another.
The final step is to bring the sound elements together for the final mix. The re-recording mixer balances the volume levels. They make sure the dialogue is always audible over the music and sound effects. The mixer uses EQ effects to create tonal separation between the different elements. A stereo or surround mix creates a sense of space and gives direction to the different sounds.
Listen to These Audio Pros
One of my filmmaking heroes is Ben Burtt. He created the iconic sound effects for the Star Wars movies. Burtt made me realise how sound design adds realism to the fantastical. One of his strengths is making noises convey emotion. He used this technique to bring life to characters such as Chewbacca and R2-D2. Check out these videos of Burtt talking about the sound design for the lightsabers and the Emperor’s force lightning.
Lora Hirschberg is a re-recording mixer. This means she’s responsible for the final mix of a movie soundtrack. She was part of the team that won the Sound Mixing Academy Award for Inception (2010).
Gary Hecker is a well-known Foley and sound effects artist. In this video he demonstrates his Foley techniques.
We use colour grading in postproduction to correct and enhance the pictures. It gives you complete control over the brightness and colour. You can work on the whole image, or mask off certain areas. Colour grading overlaps with cinematography. As a colourist, you need to know the look the cinematographer wants for each scene. The colourist also works closely with the director, helping to clarify the story. Grading techniques are used to highlight characters and other features that are important to the story.
The look is a big part of a film’s presentation. It’s decided by the director with the help of the cinematographer. It’s defined by the lighting style, colour palette and emotions required for the story. Then the film is designed, dressed and shot with this look in mind. In post, it’s the colourist who becomes responsible for continuing this aesthetic.
Brightness and colour reveal a character’s emotions. Shadows and highlights will show or hide important details. Grading can add realism to a location. For example, the colours of Northern Europe are very different to those of North Africa. Colour can tell us the time of day. Early morning looks very different to midday. It can also define the historical period. Imagine the natural tones of medieval times, compared to the artificial colours of today.
A complex scene can result in mismatched colours, especially when shot over several days. Inconsistent colours make a scene feel disjointed. The colourist can match the colours between shots to bring the scene together.
As the story develops and characters grow, the film’s colour palette can also change. Animation studios create a colour script to track these changes. The script shows how the colours evolve over the lifetime of the movie. There’s no reason why you can’t colour script a live action movie.
Colour Grading and Fashion
Movie fashion influences colour grading. Over the past few decades, several distinct looks have come and gone. We’ve seen highly saturated pictures and desaturated looks. There’s the teal and orange effect that started with the Transformers movies. And there’s the high-contrast look favoured by Tim Burton and Zack Snyder. All of these looks focus on skin tone. Some protect it, while changing the rest of the image. Others emphasise the warmth of skin tone and contrast that with cooler colours.
The first movie to be graded digitally was O Brother, Where Art Thou? (2000). The colourist was Jill Bogdanowicz. The 35mm print was scanned to media files known as digital intermediates. Once the grading was completed, the digital master was printed back to film for distribution. These days, the whole process is digital. Films are shot with digital cameras, postproduction is done on computers and more cinemas are using digital projectors.
While there isn’t an Academy Award for this category, the Hollywood Post Alliance does give awards for outstanding colour grading. The colourist who’s received the most awards since the category was introduced is Steven J. Scott. His work can be seen in Iron Man (2008), Gravity (2013) and The Revenant (2015). Natasha Leonnet has been nominated several times and won for Spider-Man: Into the Spider-Verse (2018). She was also responsible for grading Whiplash (2014), Hidden Figures (2016) and First Man (2018). Stefan Sonnenfeld was the colourist for 300 (2006) and Beauty and the Beast (2017). He won the award for Alice in Wonderland (2010).
Our next postproduction process is visual effects. We use effects to create shots that would be too dangerous, too expensive or simply impossible to achieve any other way. Visual effects appear in all genres, but are most strongly associated with sci-fi and fantasy.
The Visual Effects Shot
We shoot special effects in-camera and create visual effects in postproduction. Visual effects combine different digital elements into a single, photorealistic shot.
These elements include:
- Actors shot against green screen
- CG (computer generated) backgrounds or objects
- 2D graphics
Colour correction helps to blend the different sources. We sharpen or soften elements to define their distance from the camera. Optical effects such as distortion and lens flare add another level of realism.
Visual effects work includes several different disciplines. 3D animators create the CG elements. The compositor builds shots by combining different material. A rotoscope artist creates animated masks for live action. A matchmover tracks moving shots, then converts the coordinates into virtual camera data.
Situations where you need visual effects are usually obvious. Does the story require spaceships? Do you want to destroy a building? Does the wizard shoot magic from their staff? Visual effects can make the fantastic feel real. But there are less obvious uses. For example, you can replace a boring sky with something more dramatic.
Iconic Visual Effects
The first CG sequence in a movie was the Genesis Device simulation in Star Trek II: The Wrath of Khan (1982). The same team also created the stained glass knight for Young Sherlock Holmes (1985). This team was the Lucasfilm computer division, which later became Pixar.
The Abyss (1989) used CG to create an effect where the aliens control and shape water. The techniques developed by ILM (Industrial Light and Magic) took another big step with Terminator 2: Judgment Day (1991). The shape-shifting, liquid metal T-1000 was entirely computer generated.
CG took a massive leap towards photorealism when Jurassic Park was released in 1993. It might seem dated now, but cinema audiences were amazed at how incredible the dinosaurs looked. Jurassic Park made ambitious filmmakers aware of the possibilities of digital effects. It opened the floodgates in Hollywood, accelerating advances in techniques and software. After Jurassic Park, it’s actually difficult to identify blockbusters that didn’t use digital effects. Highlights include The Lord of the Rings trilogy, the Harry Potter films and the Marvel Cinematic Universe.
The next revolution in visual effects is already happening. Instead of a green screen, the producers of The Mandalorian series for Disney+ are using a huge array of LED screens. Known as The Volume, the screens are arranged in a circular wall and across the ceiling. This enables the visual effects artists to project a photorealistic background behind and above the actors. When the camera moves, the background also shifts to create realistic parallax.
Not only do the LED screens provide the background, they also create most of the light needed for the scene. Environment-based lighting produces more realistic results than standard film lights. Because the background images are generated with a game engine, they are created in real-time and can be quickly altered. Colours, sun position and background objects can all be changed without delaying the shoot.
While related to visual effects, motion graphics are a separate postproduction discipline. As the name suggests, it’s a merging of graphic design and animation. Motion graphics have one major difference from visual effects: they’re not photorealistic. Motion graphics can be 2D layers, 2D objects in 3D space (known as 2.5D), or sophisticated 3D animations.
When creating title sequences and end credits, the emphasis is on text. So experiment with different fonts and styling. Motion graphics are also used for FUIs (fictional user interfaces). Think of Iron Man’s heads-up display, the desk in the film Oblivion (2013) and the holographic interfaces that are a staple of science fiction films. In the movies Non-Stop and John Wick (both released in 2014), text messages between the characters appear on-screen. These are great examples of motion graphics adding another visual layer to the story.
Iconic Motion Graphics
Here are some of the most iconic motion graphics from the past few years…
The Emmy Award winning title sequence from Game of Thrones by Angus Wall, Robert Feng and Kirk Shintani. The graphics changed each season to reflect new locations and important events:
WARNING, FLASHING IMAGES! The title sequence from Gaspar Noé’s Enter the Void (2009):
The Oblivion (2013) light table and HUD designs by GMUNK.
Jayse Hansen’s hologram designs for Star Wars: The Force Awakens (2015).
I’ve identified five areas of postproduction: video editing, sound editing, colour grading, visual effects and motion graphics. Your editing software can do a bit of everything. But beyond video editing, its capabilities are pretty basic. You can buy plug-ins that add extra functions to your NLE. But for best results, use dedicated software for your task. This will always have better features than a plug-in.
When making films on your own terms, use whatever software you like. If your ambition is to work as a professional, you need experience with a range of apps.
Check out these other Postproduction articles from Indigo Film School:
Get the latest videos, guides and filmmaking resources sent straight to your inbox. Sign-up free for exclusive emails from Indigo Film School.