Film and Video Editing
When most people think about what makes great movies and television programs, they might consider the importance of a well-written script, or the magical collaboration between a first-rate director and high-powered actors. But many people outside the entertainment business do not know much about the world of post-production, the sophisticated crafts and technologies that turn miles of raw footage into finished products that can teach us, motivate us, and make us laugh or cry.
Computers play a role in many aspects of post-production, including animation and special effects, but one area in which they have brought about enormous change is editing—the art and craft of arranging pictures and sound to tell the story envisioned by the writer, producer, and director.
Background
Moving images were first captured on photographic film in the late nineteenth century, and for decades film—an optical/chemical medium—was the only way to record and preserve moving images. With the delivery of the first practical videotape recorder in 1956, it became possible to capture moving images electromagnetically, eliminating the expensive and time-consuming process of developing and printing film.
Until the early 1980s, film and videotape had separate, well-defined uses in the worlds of news and entertainment. Film's strengths were seen to be excellent picture quality, long shelf life, portability, and a technology that was widely used around the world. Traditionally, film has been used for theatrical presentation, prime-time network TV production, high-end commercials (in 35 mm), and documentary (in 16 mm).
The strengths of videotape have been economy, speed of production and post-production, and a "live" look. Originally, tape was used for non-prime television such as soap operas, talk shows, and game shows, and for low-end commercials.
During the 1980s and 1990s, the differences between film and video gradually became less pronounced. As video technology grew more reliable and less expensive, videotape became important in news and documentary production, in corporate communications, and eventually in consumer products. Videotape's picture quality also improved greatly, to the point where low-budget feature films such as The Blair Witch Project are now shot on video and transferred to film for theatrical distribution. High definition—the highest quality of video—offers image quality that equals that of film. Production costs in "high-def" are lower than in film, and elaborate special effects are easier and less expensive than traditional film optical effects.
Use of Computers in Editing
For most of the twentieth century, film was cut by hand. Editors were talented craftspeople who worked directly with their material—they assembled scenes by splicing together separate picture and sound rolls, and had great flexibility in rearranging and trimming shots for optimum on-screen impact. The technology was simple, and although the process was labor-intensive, it allowed great creative control.
By contrast, as videotape editing evolved from the 1960s through the 1980s, it was relatively clumsy and required rooms full of expensive equipment. The process involved playing back scenes from two or more video players and recording them in the desired sequence onto another recorder. Specialist technicians had to operate the equipment, so editors had less creative control over their material. Worse, video editing was "linear"—that is, scenes had to be assembled in order from the beginning of the program, and if a change was desired everything had to be rebuilt from that point on. The technology made the process much faster than film editing, but there was less opportunity for subtlety and creativity.
Then came the computer revolution. Starting in the early to mid-1990s, personal computers became powerful enough that production footage (whether it was shot on film or tape) could be transferred to hard disc arrays, and editors working with Mac- or Windows-based editing systems could have the best of both worlds. Computerized systems gave them the precision, random-access ability, and "hands-on" feel of film editing, but, since the systems were electronic, they offered the speed and low cost of the video world.
Use of these computer-based "non-linear editors," or "NLEs"—so called to distinguish the new style of editing from traditional linear video editing—has totally changed the way content is organized in many media. For example:
- News organizations like CNN use huge amounts of computer storage to keep video materials online, allowing several editors simultaneous access to the footage so they can work on multiple versions of the story for different programs.
- All TV shows and commercials, and most movies, are now edited on NLEs.
- Consumers are buying low-cost, easy-to-use hardware and software to edit their home movies.
The Process
This is how non-linear editing is accomplished on a professional level. The original footage can be shot on film or videotape; footage shot on film is transferred to video in an expensive, carefully controlled process. Then the videotape is digitized, or "loaded in," to the NLE's hard drive.
As the footage is transferred into the computer, it usually goes through a compression process so that it will need less storage space on the NLE's hard drives. Video files are very large: one minute of "broadcast-quality" picture and sound can consume about two gigabytes of disc storage, and high-definition footage requires approximately nine gigabytes per minute. Editors often need many hours of footage available to them as they edit a project, and digitizing that much material without compression would require massive and expensive storage systems. But when footage is compressed at a typical ratio of five to one, a two-minute video clip that would require four gigabytes (4,000 megabytes) if digitized uncompressed needs only 800 megabytes.
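To make that arithmetic concrete, the short sketch below works through the storage figures quoted above. The per-minute rates and the five-to-one ratio are this article's approximations, not exact specifications, and the helper function is purely illustrative.

```python
# Back-of-the-envelope storage estimate, using the approximate figures above.
BROADCAST_GB_PER_MIN = 2.0   # "broadcast-quality" video: roughly 2 GB per minute
HIDEF_GB_PER_MIN = 9.0       # high-definition video: roughly 9 GB per minute

def storage_gb(minutes, gb_per_min, compression_ratio=1.0):
    """Disc space needed for footage, optionally compressed."""
    return minutes * gb_per_min / compression_ratio

print(storage_gb(2, BROADCAST_GB_PER_MIN))        # 4.0 GB uncompressed
print(storage_gb(2, BROADCAST_GB_PER_MIN, 5.0))   # 0.8 GB, i.e. 800 MB, at 5:1
```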
The trade-off with compression is that video quality is somewhat reduced (typically, the images are not as crisp and the color fidelity is not as good as in the original). But editors choose a compression ratio that yields a picture good enough for making editorial decisions, and the reduced quality is temporary: the final product will be made from the high-quality original footage.
As the footage is digitized, the NLE's software helps the editor organize the material. The editor or an assistant "logs in" each shot, writing a description of the shot and what scene it is intended for. The system's visual interface generates a "bin" to hold all the shots that belong in a given scene. A database management system keeps track of where everything is, and allows the editor to sort and retrieve information quickly.
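As a rough illustration of this kind of organization, the sketch below models logged shots and scene bins in a few lines of Python. The structure and field names are hypothetical; a real NLE's database tracks far richer metadata.

```python
# A minimal, hypothetical model of shot logging and bins.
from dataclasses import dataclass, field

@dataclass
class Shot:
    tape_id: str       # which source tape the shot came from
    tc_in: str         # start timecode on the source tape
    tc_out: str        # end timecode on the source tape
    description: str   # the logged description of the shot
    scene: str         # the scene the shot is intended for

@dataclass
class Bin:
    name: str
    shots: list = field(default_factory=list)

bins = {}   # scene name -> Bin

def log_shot(shot):
    """File a logged shot into the bin for its intended scene."""
    bins.setdefault(shot.scene, Bin(shot.scene)).shots.append(shot)

log_shot(Shot("TAPE-004", "01:02:10:00", "01:02:42:12",
              "Wide shot, city street at dusk", "Scene 12"))
```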
Once the footage is in the system, the creative part begins—building the program scene by scene. That means not just selecting the best shots to tell the story, but also developing a pace and rhythm that propel the story forward, and selecting visual transitions to blend the scenes together. The software provides an easy-to-use graphical display, and gives the editor a variety of tools to arrange the materials, including drag-and-drop, keyboard commands, and placement of scenes on a visual timeline—very much the way word processing software allows a writer to rearrange words.
The NLE's software precisely tracks all the editor's changes, remembers which shots originated where, keeps picture and sound synchronized, and builds an "edit decision list" that describes exactly which portion of each shot is to be shown, and in what order. The computer plays back the edited scenes in real time, showing the editor, director, producers, and others exactly how scenes will look in the final product. Changes can be made easily, and different ways of organizing the material can be tried out. Because the original material is not being cut or changed during this process—the computer is just displaying the editor's choices of shots in the right sequence—the editing is said to be "non-destructive." The cycle repeats, with scenes continually refined, until the creative team agrees that the best possible choices have been made about how to use the materials. This decision point is often called "picture lock."
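The sketch below shows one plausible, highly simplified representation of an edit decision list, and why playback from it is non-destructive: the system merely reads the selected frame ranges from the untouched source media, in order. The event fields here are invented for illustration.

```python
# A stripped-down, hypothetical edit decision list (EDL): an ordered list
# of events, each naming a source and the portion of it to play.
from dataclasses import dataclass

@dataclass
class EdlEvent:
    source: str    # source tape or clip identifier
    src_in: int    # first selected frame in the source
    src_out: int   # one past the last selected frame

edl = [
    EdlEvent("TAPE-004", 1500, 1620),   # 120 frames from tape 4
    EdlEvent("TAPE-007",  300,  372),   #  72 frames from tape 7
]

def play(edl):
    """Non-destructive playback: read each selected range, in sequence,
    straight from the unmodified source media."""
    for event in edl:
        for frame in range(event.src_in, event.src_out):
            pass   # fetch and display this frame from event.source
```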
Once the picture is locked, the process moves in one of three directions. First, if the production originated on film, and the final product is to be a film print (for instance a movie to be shown in theaters), the NLE generates a "negative cut list." The cut list describes exactly what portions of which shots from the original production footage have been selected. Then craftspeople physically cut or "conform" the original negative, splicing it together in the order the editor specified. Finally, traditional optical/chemical film printing techniques are used to make the prints that will be distributed to theaters.
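One concrete piece of this bookkeeping can be sketched. Standard 4-perf 35 mm film runs 16 frames per foot, so an edit point counted in frames can be located on the negative in "feet and frames." The function below is a simplified illustration of that conversion, not any particular cut-list format.

```python
FRAMES_PER_FOOT = 16   # standard 4-perf 35 mm film

def feet_and_frames(frame_count):
    """Express a frame count as 35 mm feet+frames, cut-list style."""
    feet, frames = divmod(frame_count, FRAMES_PER_FOOT)
    return f"{feet}+{frames:02d}"

print(feet_and_frames(1620))   # "101+04": 101 feet, 4 frames into the roll
```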
Second, whether the production was originally shot on film or tape, if the final product is to be released on videotape, as television programs are, a "video master" is prepared. This can be done in one of two ways:
- In an "online conforming session," the edit decision list prepared by the NLE software controls a video editing system, using two or more playback videotape machines and one recorder. The high-quality videotapes containing the original footage are automatically played back in sequence on two or more videotape players and assembled onto a new videotape, with transitions and special effects generated by a video "switcher."
- Alternatively, the finished product can be generated completely within the NLE. First, the selected shots—which were originally compressed to save storage space—are re-digitized from the original videotapes into the NLE at full resolution. Then the NLE plays them back in sequence to be recorded on a videotape recorder, with transitions and special effects generated and added by the NLE's software. (A sketch of this selective re-capture step follows this list.)
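A sketch of that selective re-capture ("batch digitize") step, under the same hypothetical EDL representation as the earlier example: only the frame ranges that survive into the locked cut are re-loaded at full resolution, grouped by source tape so each tape is mounted once.

```python
# Each event: (source tape, first frame, one past the last frame),
# following the earlier EDL sketch.
edl = [
    ("TAPE-004", 1500, 1620),
    ("TAPE-007",  300,  372),
    ("TAPE-004", 2010, 2100),
]

def batch_capture(edl):
    """Group the locked cut's frame ranges by source tape, so each tape is
    mounted once and only the needed portions are captured at full resolution."""
    by_tape = {}
    for source, src_in, src_out in edl:
        by_tape.setdefault(source, []).append((src_in, src_out))
    for tape, ranges in sorted(by_tape.items()):
        print(f"Mount {tape}: capture frame ranges {ranges} at full resolution")

batch_capture(edl)
```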
The third way the edited material can be released is directly to broadcast. For instance, once a news story has been edited, it can be transferred immediately from the NLE over a high-speed network to an "air server," from which it is played as often as needed.
Sound Tracks
Audio can be edited within the NLE and released with the video as a final product without any further effort. Often, though, only basic sound editing is done within the NLE, and a separate team of specialists does audio mixing and "sweetening" using a similar computer-based system that is optimized for audio work.
In an audio-editing computer, multiple sound tracks are constructed for dialog, narration, music, and sound effects. The sound mixing team listens to these tracks while watching the edited picture, and carefully blends and enhances the separate tracks to tell the audio portion of the story in the most effective way. Once the sound track is finished, it can be "laid down" on digital audio tape or it can remain within the audio computer; either way it is synchronized by computer with the picture from the NLE as the final video master is produced.
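At its core, blending tracks is a gain-weighted sum of sample streams. The toy function below shows only that basic idea; real audio workstations do this at high sample resolution, with automation, equalization, and effects, and the sample values here are placeholders.

```python
def mix(tracks, gains):
    """Blend equal-length sample lists into one track: a gain-weighted sum,
    clipped to the legal [-1.0, 1.0] sample range."""
    return [
        max(-1.0, min(1.0, sum(g * t[i] for t, g in zip(tracks, gains))))
        for i in range(len(tracks[0]))
    ]

dialog  = [0.20, 0.30, 0.10]   # placeholder sample values
music   = [0.50, 0.40, 0.60]
effects = [0.00, 0.80, 0.00]

# Dialog dominant; music and effects blended underneath.
mixed = mix([dialog, music, effects], gains=[1.0, 0.4, 0.5])
print(mixed)
```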
What is the result of all this creativity, effort, and technology? A movie or a television program that—the creative team hopes—is so skillfully prepared that it will entertain us for a little while, or even teach us something about ourselves and our world.
The Future
Technology and economics are driving the evolution of film and video editing. Computer-based video editing at all levels will continue to become less expensive, more powerful, and easier to use. This process lowers the barriers to entry for the novice filmmaker, the enterprising journalist, and the consumer who just wants to make better home movies.
The Internet is only beginning to have an effect on editing, but the possibilities are intriguing. Accessing the same materials from anywhere in the world, multiple editors will be able to collaborate on a program. Producers, directors, and executives will screen and comment on the work in progress from anywhere, and highly specialized work like sound mixing and effects generation will have no geographic limitations.
On the high end, both broadcast television and theatrical features are moving away from film and toward high-definition video—a process that will push technical requirements higher as editing systems are built to handle the extreme demands of high-def. Next will come the replacement of traditional 35 mm film projection in theaters with high-definition video delivered directly by satellite or fiber-optic cable—one more major step in the journey from the film medium to the electronic one.
See also Animation; Graphic Devices.
John Cosgrove
Bibliography
Ohanian, Thomas A. Digital Nonlinear Editing, 2nd ed. Boston: Focal Press, 1998.