Before you begin creating a digital video project, it’s important to understand some basic terminology. Terms such as frame rate, compression, and frame size abundantly populate Premiere’s dialog boxes. Understanding these terms will help you to make the right decisions as you create new projects and export them to videotape or to disk.
Video frame rates
If you take a strip of motion picture film in your hand and hold it up to the light, you’ll see the individual picture frames that make up the production. If you look closely, you’ll see how motion is created: each frame is slightly different from the previous one.
A change in the visual information in each frame creates the illusion of motion. If you hold up a piece of videotape to light, you won’t see any frames. However, the video camera does electronically store the picture data into individual video frames.
The standard frame rate of video is approximately 30 frames per second (29.97 fps for NTSC video). (The standard frame rate of film is 24 frames per second.) Frame rate is extremely important in Premiere because it helps determine how smooth the motion in your project will be. Premiere enables you to set the frame rate at various speeds.
For example, if you capture video directly into Premiere, you can capture the video at approximately 30 frames per second, or you can capture it at a lower frame rate. The higher the frame rate, the smoother the motion. While you work in your project, you can also set the frame rate. This is the frame rate at which you create the project while you edit.
Often you will want the project frame rate to be the same rate as your capture rate. However, you may wish to edit at a lower frame rate and export the video at that lower frame rate if you are exporting the project to the Web. By creating a production at a lower frame rate, you enable the production to download to a Web browser faster.
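The trade-off described above is simple arithmetic. The following minimal Python sketch (not part of Premiere; the one-minute clip length is a hypothetical example) shows how halving the frame rate halves the number of frames that must be stored and downloaded:

```python
# Sketch: how frame rate scales the raw frame count (and thus data) of a clip.
# 30 fps is the approximate NTSC rate discussed above; 15 fps is a common
# lower rate for Web delivery.

def total_frames(duration_seconds, fps):
    """Number of frames stored for a clip of the given duration."""
    return int(duration_seconds * fps)

clip_seconds = 60                              # a one-minute clip
full_rate = total_frames(clip_seconds, 30)     # frames at full rate
web_rate = total_frames(clip_seconds, 15)      # frames at a Web-friendly rate

print(full_rate)  # 1800
print(web_rate)   # 900 -- half the frames, so roughly half the data
```

All else being equal, the 15 fps version downloads in about half the time, at the cost of less fluid motion.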
Frame size

The frame size of a digital video production determines the width and height of your production onscreen. Productions that will be output to videotape need to be created at specific frame sizes. If you are outputting to multimedia or to the Web, you’ll want to pick the frame size carefully.
The larger the frame size, the larger the file that must be downloaded to a browser. When you create a video production in Premiere, you can freely choose the size of the video frame. In Premiere, frame size is measured in pixels—the smallest picture elements displayed on a computer monitor.
If you create a new project in Premiere and choose one of the settings for NTSC (the U.S. standard) videotape output, Premiere designates 640 × 480 pixels as the frame size. This means that each individual picture is composed of 640 horizontal by 480 vertical pixels.
If you pick the European standard (PAL), Premiere chooses a frame size of 768 × 576. If you pick a broadcast NTSC DV (Digital Video) format, Premiere chooses 720 × 480. (The DV PAL setting is 720 × 576.) However, if you pick a multimedia format, Premiere’s default setting is 320 × 240. Although you could use this frame size on the Web, you can create a smaller frame size using Premiere’s video settings dialog boxes.
RGB color and bit depth

A color image on the computer screen is created from the combination of red, green, and blue color phosphors. The combination of different amounts of red, green, and blue light enables you to display millions of different colors.
In digital imaging programs, such as Premiere and Photoshop, the red, green, and blue color components are often called channels. Each channel can provide 256 levels (2⁸—often referred to as 8-bit color because each channel uses 8 bits, or one byte), and the combination of 256 red × 256 green × 256 blue values results in roughly 16.7 million colors. This color is often called 24-bit color (2²⁴). Although most users will leave Premiere’s color setting at 24-bit color, you can reduce the number of colors in a video project, or choose a specific palette of colors to use.
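The channel arithmetic above can be checked in a couple of lines. This Python sketch simply reproduces the calculation; nothing here is specific to Premiere:

```python
# The bit-depth arithmetic described above: 8 bits per channel,
# three channels (red, green, blue) combined.
bits_per_channel = 8
colors_per_channel = 2 ** bits_per_channel   # 256 levels per channel
total_colors = colors_per_channel ** 3       # 256 * 256 * 256 combinations

print(colors_per_channel)  # 256
print(total_colors)        # 16777216 -- roughly 16.7 million ("24-bit color")
```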
The larger the frame size, the greater the bit depth, and the higher the frame rate, the better the quality of a Premiere digital video project. Unfortunately, a full-frame, 24-bit color, 30-frames-per-second video production requires vast amounts of storage space.
You can easily calculate how much disk space a full-frame production would take up. Start by multiplying the frame dimensions. Assume that you are creating a project at 640 × 480. Each pixel needs to be capable of displaying red, green, and blue elements of color, so multiply 640 × 480 × 3.
Because 640 × 480 × 3 is 921,600 bytes, each frame is approximately 1MB. One second of video at 30 frames a second is thus nearly 30MB. (This doesn’t even include sound.) A five-minute production would consume about 8GB of storage space. To reduce the file size of a video project, you can reduce frame size, frame rate, and color bit depth.
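The storage estimate worked out above can be reproduced exactly. This Python sketch follows the same steps (frame dimensions × 3 bytes per pixel × frame rate × duration):

```python
# The uncompressed storage estimate from the text, step by step.
width, height = 640, 480
bytes_per_pixel = 3        # one byte each for red, green, and blue
fps = 30
minutes = 5

bytes_per_frame = width * height * bytes_per_pixel   # 921,600 bytes (~0.9 MB)
bytes_per_second = bytes_per_frame * fps             # ~26 MB per second
total_bytes = bytes_per_second * 60 * minutes        # five minutes of video

print(bytes_per_frame)             # 921600
print(round(total_bytes / 2**30, 1))  # 7.7 -- about 8GB, audio not included
```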
However, taking all of these steps usually results in a tremendous loss in quality. To store more data in less space, and with a minimum loss of quality, software engineers have created a variety of video compression schemes. The two primary compression schemes are spatial and temporal.
- Spatial compression: In spatial compression, computer software analyzes the pixels within a single frame and stores repeated or similar areas more compactly, saving a pattern that approximates the image rather than every individual pixel.
- Temporal compression: Temporal compression works by analyzing the pixels in video frames for screen areas that don’t change. Without temporal compression, different frames are saved to disk for each second of video, whether the image onscreen changes or not.
Rather than saving many frames that contain the same image, temporal compression stores one complete keyframe and then records only the differences between it and the frames that follow.
For instance, for a video that consists of frames of a flower that sometimes blows in the wind, the computer needs to store only one frame for the flower and record more frames only when the flower moves.
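The keyframe-plus-differences idea can be illustrated with a toy sketch. In this Python example (purely illustrative; real codecs work on blocks of image data, not pixel lists), each frame is a small list of pixel values, and only the pixels that differ from the keyframe are recorded:

```python
# Toy sketch of temporal compression: store one keyframe, then record only
# the (index, value) pairs for pixels that changed in later frames.

def temporal_compress(frames):
    """Return the first frame as a keyframe plus per-frame change lists."""
    keyframe = frames[0]
    deltas = []
    for frame in frames[1:]:
        changed = [(i, new) for i, (old, new) in
                   enumerate(zip(keyframe, frame)) if old != new]
        deltas.append(changed)
    return keyframe, deltas

# Three frames of a mostly static scene: only one "pixel" ever changes,
# like the flower that moves only when the wind blows.
frames = [[1, 1, 1, 1], [1, 1, 1, 1], [1, 2, 1, 1]]
keyframe, deltas = temporal_compress(frames)

print(keyframe)  # [1, 1, 1, 1]
print(deltas)    # [[], [(1, 2)]] -- the unchanged frame costs almost nothing
```

The second frame, identical to the keyframe, compresses to an empty list; only the third frame, where one value changed, stores any data.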
When you work with compression in Premiere, you don’t choose spatial or temporal compression directly; instead, you choose compression settings by specifying a codec. Codec is short for compressor/decompressor. Software manufacturers create codecs.
For instance, in Premiere, you can choose a codec called Cinepak, which provides temporal compression. When you create, edit, or export your project, you can specify how many keyframes you want to output per second. Another codec, Apple Animation, provides spatial compression. When you use this codec, no keyframe choices are provided.
For your computer to use a video compression system, software and sometimes hardware need to be installed. Both Macs and PCs usually come with video compression software built into their operating systems.
QuickTime is the digital video compression system automatically installed with the Mac operating system; Video for Windows is the digital video compression system automatically installed in the PC operating system. QuickTime is also packaged with Premiere and can be installed on PCs.
Because QuickTime is cross-platform, it is a popular digital video system for CD-ROM and Web digital video. When you use Premiere, it automatically accesses the QuickTime or AVI software, enabling you to choose from a list of QuickTime or AVI codecs. If you have a capture board installed in your computer, the capture board typically provides a set of codecs from which you can choose.