Television Broadcasting, Technology of
Television production and postproduction require an extensive array of equipment, but the most important element for television production is patience. Before any equipment has been turned on, before the lights have been focused, before the director has called his or her first shot, creative personnel must take the time to plan the production. Despite the best intentions of the production team, poor planning or a weak premise will only result in a feeble finished product.
Cameras and Accessories
Central to television is the use of cameras, which provide most of the visual elements that the audience sees. Basically, the camera is the visual acquisition tool that uses a lens to project an image (technically, the light reflected from the subject) onto the camera's pickup element, either an electron tube or a solid-state imager. The tube encodes the image into thousands of dots, and then an electron gun "reads" the dots and outputs the signal. Most cameras have replaced the tube system with a "chip" system (solid-state imager) that uses one or several charge-coupled devices (CCDs), which are smaller, more durable, and use less power than tubes.
Camera operators can control various elements of the image through lens selection and camera features. For example, the electronic shutter speed can be varied, video signals can be amplified through gain control, the aperture can be opened or closed to adjust for lighting conditions, and the field of view can be changed without physically moving the camera (provided that the camera is equipped with a zoom lens). Most professional cameras also have built-in white balance control, which compensates for the color temperature of the light source and adjusts the picture for optimal color rendition.
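To make gain and white balance concrete, here is a minimal numerical sketch; the function names and values are illustrative, not drawn from any camera's actual controls. Gain boosts the whole signal by a number of decibels, while white balance scales the red and blue channels so that a patch known to be neutral gray reads as neutral.

```python
# Illustrative sketch only (assumed values, not real camera controls).
# Gain boosts the whole signal; white balance rescales R and B against G
# so a gray reference patch comes out with equal R, G, and B.

def apply_gain(level, gain_db):
    """Boost a video level by a gain in decibels (+6 dB is roughly a 2x boost)."""
    return level * (10 ** (gain_db / 20))

def white_balance(pixel, gray_reference):
    """Scale R and B against G so the gray reference would become neutral."""
    r_ref, g_ref, b_ref = gray_reference
    r, g, b = pixel
    return (r * g_ref / r_ref, g, b * g_ref / b_ref)

# Under tungsten light, a gray card might read warm (too much red, too little blue).
warm_gray_card = (180, 160, 130)
print(white_balance((200, 160, 120), warm_gray_card))  # corrected toward neutral
print(apply_gain(100, 6))                              # low-light boost of +6 dB
```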
Generally, cameras are configured as either studio cameras or portable cameras. Portable cameras, often called handheld cameras, are often used for electronic newsgathering (ENG). Most professional portable cameras are designed to rest on the right shoulder of the operator, with camera controls within reach and the viewfinder resting in front of the operator's right eye. Handheld cameras are designed to run using battery power. Either the battery is attached to the rear of the camera or the camera is connected to a battery belt pack. Modern ENG cameras, often called camcorders, also have a built-in videotape recorder (VTR) and microphone, and they often sport a removable mini-light for better closeups.
In contrast, the studio camera does not have a built-in VTR or microphone. In addition, studio cameras use conventional power instead of batteries because they are in a studio environment. Studio cameras often feature more sophisticated lens systems (a box-type lens) than ENG cameras. Some high-end cameras feature swappable components, so they can be configured as a studio or remote camera.
The camera pedestal provides a large, stable mount with wheels for studio cameras. Pedestals allow camera operators to change the position of the camera on the studio floor easily, as well as the camera height and angle. They are also built to accommodate studio-configured camera controls, often called "configs." Rather than making adjustments directly on the camera, studio configs allow the camera operator to control focus and zoom from remote controls on rods that are attached to the pedestal. The operator can also view the picture on a monitor that is attached to the top of the camera, rather than through the ENG camera's smaller and less convenient viewfinder. Studio configs also incorporate a headset microphone, which allows the camera operator to communicate with the director in the control room.
For remote shoots, tripods are often the preferred camera-mounting equipment, due to their compact size and lighter weight. Tripods allow remote cameras to be configured like studio cameras and save operators from constant handheld shooting. Tripods support the camera on three legs but do not have wheels, so the camera is physically fixed in one position. Camera operators can attach wheels to the bottom of a tripod with a dolly, but movement is not as smooth and predictable as when a pedestal is used.
In special production circumstances, cameras are mounted in a fixed position, though camera operators can maintain camera control remotely. Network football game coverage, for example, routinely includes a camera mounted on the goal post, while cooking programs will often mount a camera above the talent to provide a better view of the work area. Cameras can also be attached to counterweighted jib or crane arms or suspended on a wire for smooth, moving shots. Cameras can even be mounted to the body of the camera operator for more stable handheld shots.
Video Switcher
At the heart of the control room sits the video switcher, operated by the technical director (TD). The switcher has rows of buttons that are used to select which video source will be recorded or broadcast. Each row is called a bus or bank. The program bus controls which image is recorded on a VTR or is broadcast. The preview (or preset) bus, which usually has its own monitor, gives the TD (who is sometimes referred to as the "switcher") the opportunity to check images that are not on-screen before switching to them on the live program bus. TDs can also "key" one source, such as graphics, over another source, such as a camera.
For example, if there are five cameras covering a football game, each camera will have its own button on the program bus and preview bus. When a camera is "hot," or on-the-air, the TD can use the preview bus to preview an effect. Other video sources, such as the graphics station and slow-motion replay, will have buttons on the switcher as well. If a reporter interviews a coach on the sideline, the TD can key the coach's name (using the character generator) over the camera image.
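As a rough illustration of the program/preview arrangement, the two buses can be thought of as two named selections, with a "take" moving the previewed source onto the air. This is a toy model of the concept, not any manufacturer's actual interface.

```python
# Toy model of program/preview bus selection (illustrative only; not a real
# switcher's control protocol). Each bus holds one selected source, and a
# "take" cuts the previewed source onto the program output.
class Switcher:
    def __init__(self, sources):
        self.sources = sources          # e.g., cameras, graphics, replay
        self.program = sources[0]       # what is on the air
        self.preview = sources[1]       # what the TD is checking next

    def preset(self, source):
        """Put a source on the preview bus for checking."""
        assert source in self.sources
        self.preview = source

    def take(self):
        """Cut: swap the previewed source onto the program bus."""
        self.program, self.preview = self.preview, self.program

sw = Switcher(["camera 1", "camera 2", "camera 3", "graphics", "replay"])
sw.preset("replay")   # check the replay before airing it
sw.take()
print(sw.program)     # "replay" is now on the air
```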
Using the switcher, the TD can switch between cameras and other video sources in a variety of ways. A cut, an instantaneous change, is the least obtrusive transition between images. A fade transitions gradually from a color (usually black) to another image, or vice versa. A dissolve simultaneously fades in one image while fading out another, and dissolves are often used to illustrate the passage of time. Finally, wipes and digital video effects (DVE), which can be created using external equipment, are the most noticeable transitions, as one image literally wipes across another to replace it, often in a dynamic pattern that attracts attention. Fades, wipes, and DVE moves are controlled manually with a fader bar, though some switchers offer preset controls as well.
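Conceptually, fades and dissolves are weighted mixes of two pictures, with the fader bar setting the weight. The sketch below shows the idea on a single row of pixel values; it is an illustration of the principle, not broadcast-grade code.

```python
# Illustration only: a cut, fade, and dissolve as simple per-pixel mixes of two
# video sources A and B. `t` plays the role of the fader bar position, from
# 0.0 (all A) to 1.0 (all B).

def mix(a, b, t):
    """Linear mix of two pixel values; t=0 gives A, t=1 gives B."""
    return a * (1 - t) + b * t

def dissolve(frame_a, frame_b, t):
    """Fade out frame A while fading in frame B (a dissolve)."""
    return [mix(pa, pb, t) for pa, pb in zip(frame_a, frame_b)]

def fade_from_black(frame, t):
    """A fade is a dissolve where one 'source' is a solid color (here, black)."""
    return dissolve([0] * len(frame), frame, t)

# A cut is simply switching sources between frames: t jumps from 0 to 1 instantly.
print(dissolve([100, 200, 50], [20, 60, 220], t=0.5))  # halfway through a dissolve
print(fade_from_black([100, 200, 50], t=0.25))         # early in a fade-in
```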
All video switchers essentially do the same job, but not all video switchers are created equal. More advanced switchers can include a number of additional functions and effects. Some switchers can handle dozens of inputs and have multiple input buses, while others have limited sources and only preview and program buses. Other switchers have downstream keyers, which allow key inserts just before the program source leaves the switcher, without using the switcher's mixing or effects systems. Some switchers even have chroma key, which inserts one image over another to create a composite image (often used for weather reports, where the meteorologist appears to stand in front of live satellite footage).
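The principle behind chroma key can be sketched in a few lines. This is a deliberately simplified illustration, assuming a pure green backdrop; real keyers are far more sophisticated. Wherever the foreground pixel matches the key color, the background is shown instead.

```python
# Simplified chroma key sketch (illustrative assumptions: pure green backdrop,
# crude "green enough" test). Real switchers use much more refined keying.

def is_key_color(pixel, threshold=60):
    r, g, b = pixel
    return g - max(r, b) > threshold  # green clearly dominates red and blue

def chroma_key(foreground, background):
    """Composite: keep foreground pixels, except where the key color appears."""
    return [bg if is_key_color(fg) else fg
            for fg, bg in zip(foreground, background)]

# Meteorologist in front of a green screen, keyed over satellite imagery.
fg = [(210, 170, 150), (20, 230, 30)]   # person, green backdrop
bg = [(10, 40, 90), (10, 40, 90)]       # satellite imagery
print(chroma_key(fg, bg))               # person kept, backdrop replaced
```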
Graphics
The graphics station is the home of the character generator (CG), often called the Chyron, based on the name of the dominant CG manufacturer. The graphics operator is responsible for providing pages of text and other graphic images that will be used during a production, from opening titles to the closing credits. Improvements in technology have made graphics more dominant on television than ever, with more detailed and visually energetic images. Digital paint systems can create three-dimensional and animated images, while even simpler PC-based CG programs can manipulate and animate text efficiently.
Most graphics are built in preproduction so that access is simple during production. At a football game, for example, player names would be typed in before the game, and introductory graphic images, such as the starting lineups, would be ready for broadcast before kickoff. Game statistics, however, would be composed on the CG as the game progressed.
Audio
Different audio performances require different audio acquisition tools, some of which are visible to the audience. On-camera microphones include (1) the handheld microphone, which is used extensively for interviews, (2) the desk microphone, which remains in a fixed position on a small stand, (3) the stand microphone, which can be adjusted to match the height of the performer, (4) the headset microphone, which is used by sportscasters to keep the microphone close to the mouth and provide program audio and directions in a noisy environment, and (5) the lavalier microphone, which is an unobtrusive microphone that is clipped to the front of the talent's clothing.
When productions call for microphones that are not visible to the audience, highly directional boom microphones are positioned out-of-frame above or below the action, often using a collapsible aluminum pole (i.e., a fishpole). Comments from a roundtable of panelists can be recorded using a pressure-zone microphone (PZM), which captures sound from a variety of sources but equalizes the volume of all sounds (so shuffling papers sound as loud as a person speaking). Other types of microphones can be concealed throughout a set, though wires must also be hidden and talent must be positioned near the microphones for optimal effectiveness.
Microphones also vary in their audio pickup patterns, or the way in which they capture audio. The omnidirectional microphone, for example, provides equal audio pickup from all sides. It is useful for capturing crowd noise in a stadium during a football game. During that same game, however, a sideline commentator may want to interview a coach. The commentator will use a handheld unidirectional microphone, which features a much narrower pickup pattern, to isolate the coach's comments from the noisy stadium around him. For isolating the sounds of football players on the field, an audio assistant will hold a microphone in a parabolic reflector, which resembles a shallow dish and increases the microphone's sensitivity and directionality. Other audio pickup patterns include the cardioid, which maintains a heart-shaped pattern; the super-cardioid, which features a narrower, more directional pattern with limited pickup behind the microphone; and the bidirectional, which picks up sound in front of and behind the microphone but is deaf to noise on the sides.
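These pickup patterns can be described by idealized polar-response formulas; the cardioid, for instance, is commonly modeled as (1 + cos θ)/2, where θ is the angle between the sound source and the front of the microphone. The short sketch below uses these textbook idealizations, not measurements of any particular microphone, to compare pickup from the front, side, and rear.

```python
# Idealized polar responses (textbook formulas, not measured data).
# theta is the angle from the microphone's front axis: 0 = directly in front.
import math

def omnidirectional(theta):
    return 1.0                           # equal pickup from all sides

def cardioid(theta):
    return 0.5 * (1 + math.cos(theta))   # heart-shaped: strong in front, null behind

def bidirectional(theta):
    return abs(math.cos(theta))          # front and back, deaf to the sides

for name, pattern in [("omni", omnidirectional),
                      ("cardioid", cardioid),
                      ("figure-8", bidirectional)]:
    front, side, rear = pattern(0), pattern(math.pi / 2), pattern(math.pi)
    print(f"{name:9s} front={front:.2f} side={side:.2f} rear={rear:.2f}")
```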
Production sound is coordinated through the audio board (though some remote shoots use smaller but less flexible portable audio mixers). Pre-recorded music, videotape roll-ins, and live sound are all input into the audio board as electrical signals. Each audio source is given its own channel, and the operator sets the volume for each channel using rotating knobs called potentiometers (pots) or sliding bars (faders). Each signal can also be manipulated through equalizers to improve sound quality. The board operator monitors the intensity of the sound level using a volume unit (VU) meter, which indicates the volume in terms of decibels and percentage of modulation. Overmodulation, which occurs when the audio signal exceeds 100 percent modulation, can lead to distortion, especially in digital audio situations. The master output from the audio board provides the audio mix for the program.
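As a simplified illustration of what the meter is reporting, level in decibels can be computed relative to the amplitude that represents 100 percent modulation; anything above 0 dB is overmodulated. The reference value below is assumed purely for the example.

```python
# Simplified sketch (assumed reference amplitude) of level metering:
# decibels relative to 100 percent modulation. Levels above 0 dB are
# overmodulated and risk distortion, especially harsh clipping in digital audio.
import math

REFERENCE = 1.0  # amplitude assumed to correspond to 100 percent modulation

def level_db(amplitude):
    """Signal level in dB relative to 100 percent modulation."""
    return 20 * math.log10(amplitude / REFERENCE)

def percent_modulation(amplitude):
    return 100 * amplitude / REFERENCE

for amp in (0.5, 1.0, 1.3):
    flag = "OVERMODULATED" if amp > REFERENCE else "ok"
    print(f"{percent_modulation(amp):5.0f}%  {level_db(amp):+6.1f} dB  {flag}")
```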
The audio board is not set once and then forgotten for the rest of the shoot. Audio levels vary during a telecast, so the board operator must constantly monitor, or "ride," the levels of the inputs. During a football game, for example, if a commentator gets excited and begins to shout, the board operator must be ready to bring down the volume of that signal or risk distortion. Also, the sideline commentator's microphone only needs to be open, or "hot," during interviews. The board operator must be ready to close or "kill" the microphone when it is not in use to avoid broadcasting comments that are not supposed to be on the air.
Editing
For live programming, the show is usually over when the director declares the production a "wrap" and the crew strikes the set and equipment. However, most productions are followed by some kind of postproduction work. Even when a program is shot live-to-tape, meaning the production is recorded as it happens without stopping, directors may want to insert video or graphics over bad shots, clean up or add sound effects, or add other special effects.
During production, when mistakes are made, the director often says, "We'll fix it in post." Postproduction usually means a trip to the editing suite. Editing is the process of arranging shots in a particular order, varying shot selection, altering the timing of shots to improve pacing, and adding transitions between shots. Editing is a tool that helps the director tell his or her story better, and different programs will use different editing strategies. A promotional video for a sports event, for example, may edit action shots together in a furious frenzy of short clips to create excitement. A music video for a romantic ballad, in contrast, may use slow dissolves and far fewer cuts to create a more relaxed mood.
Traditionally, video editing has been a linear process, where the editor works with videotape and other inputs to build a program in chronological order (assemble editing) or to replace sections of video and/or audio in an existing project (insert editing). The main advantage of linear editing is speed; an experienced editor can cut together a news package or other short, basic project in minutes. Advanced editing systems can also incorporate graphics and other digital video effects (DVE). However, there are several limitations. Editing with analog videotape results in a loss of video quality with each generation; in other words, an edited package that contains video copied from original video will not look as good as the original footage. A copy, or "dub," of that edited footage will look even worse. Some tape formats show minimal loss during the first few generations, but loss of video quality cannot be avoided entirely. Excessive use can also wear down videotape and equipment. Finally, it is troublesome to return to a completed project and make changes, because the rest of the project must then either be re-edited to accommodate the change or be dubbed to a new tape and attached to the changed footage, which results in a generation loss.
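To illustrate generation loss numerically, suppose each analog dub costs a few decibels of signal-to-noise ratio. The figures below are assumed for the example only, not measurements of any specific tape format.

```python
# Purely illustrative numbers (assumed, not measured): quality falls with each
# analog generation, while a digital copy would stay at the original level.
ORIGINAL_SNR_DB = 54       # assumed SNR of the camera-original footage
LOSS_PER_GENERATION = 3    # assumed dB lost on each analog dub

for generation in range(1, 5):
    snr = ORIGINAL_SNR_DB - LOSS_PER_GENERATION * generation
    print(f"generation {generation}: roughly {snr} dB SNR")
```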
Nonlinear editing became an affordable alternative to linear editing in the 1990s. Footage is digitized, or fed into the computer and turned into data. Editors can then manipulate graphics, audio, video, and transitional effects in any order. The advantages of nonlinear editing are numerous. First, there is less tape and equipment wear; once footage is digitized, tape is not needed for the editing process. Because there is no tape, there is also no generation loss. Digitized video maintains its quality no matter how much or how little it is used, although compression can affect quality. Editors can also "jump around" in the nonlinear world; pieces can be edited and re-edited without concern for the timeline. This opens the door to much more creative experimentation and flexibility. The major drawback to nonlinear editing is digitizing time. Footage must be digitized in real time for most systems, so editing has to wait until the footage has been loaded into the computer. For projects with tight deadline constraints, this can be impractical.
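A rough calculation shows why digitizing time and storage were significant constraints. The resolution, frame rate, and compression ratio below are assumptions chosen only to illustrate the arithmetic, not specifications of any particular editing system.

```python
# Back-of-the-envelope sketch (all figures assumed) of the disk space needed
# for digitized standard-definition footage at a modest compression ratio.

def storage_gb(minutes, width=720, height=480, fps=30, bytes_per_pixel=2, compression=5):
    """Approximate gigabytes needed to store `minutes` of digitized video."""
    bytes_per_second = width * height * bytes_per_pixel * fps / compression
    return minutes * 60 * bytes_per_second / 1e9

# One hour of footage must also be played into the system in real time
# (60 minutes of loading) before any editing can begin.
print(f"{storage_gb(60):.1f} GB for one hour of footage")
```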
See also: Cable Television, System Technology of; Digital Communication; Film Industry, Technology of; Television Broadcasting; Television Broadcasting, Careers in; Television Broadcasting, History of; Television Broadcasting, Production of; Television Broadcasting, Station Operations and.
Mark J. Pescatore
Michael Silbergleid