For most simple projects that will be shown only once (like news packages), it is often slower to edit on a non-linear system. All footage, music and other elements must be digitized before they can be used. Hard drive space will be in short supply for some years, although this is getting better, especially with FireWire external drives. The new Panasonic P2 format records video on memory cards that can be read directly by non-linear editing systems, so it may eventually eliminate the digitizing process entirely. The high cost of the P2 cards, however, usually means videographers want to off-load their contents to a server as soon as possible so they can re-use the cards.
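To get a feel for why drive space is such a concern, here is a rough back-of-the-envelope sketch in Python. It uses the commonly cited approximation of about 3.6 MB per second (roughly 13 GB per hour) for a DV stream including audio; exact figures vary by format and capture settings, so treat the numbers as estimates.

    # Rough estimate of disk space needed for digitized DV footage.
    # Assumes ~3.6 MB/s (about 13 GB/hour) for a DV stream with audio;
    # actual figures vary by codec and capture settings.
    DV_MB_PER_SECOND = 3.6

    def dv_gigabytes(hours_of_footage):
        """Approximate gigabytes of disk space for a given amount of DV footage."""
        seconds = hours_of_footage * 3600
        return seconds * DV_MB_PER_SECOND / 1024  # MB -> GB

    for hours in (1, 4, 10):
        print(f"{hours} hour(s) of DV footage: about {dv_gigabytes(hours):.0f} GB")

At roughly 13 GB per hour, even a modest shoot can fill a small drive quickly, which is why the external drives mentioned above matter.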
The beauty of non-linear editing shows itself when you have to replace a shot in the middle of a finished project with one that is considerably longer or shorter. In a linear world, the entire rest of the video must be re-edited or suffer a generation loss (not critical with digital formats). Complex "layering" of video and audio elements is also much easier in a non-linear session.
Digitizing
The first step after logging your footage and doing some preliminary off-line decision making with window dubs is to digitize the footage you think you may want to use. You should begin digitizing a few seconds before your intended in point, and let the clip run a few seconds after the intended out point (I use 3 seconds). These extra seconds are called handles, and allow clips to be combined through dissolves or wipes where parts of both clips are on screen simultaneously. Without handles, you may have to stop everything and re-digitize a clip to see how an effect will look. Final Cut Pro will add handles for you during digitizing from a digital deck if you select that option.
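To make the handle idea concrete, here is a small sketch in Python (not a Final Cut Pro feature, just an illustration) that pads a logged in point and out point by a 3-second handle on each side. The timecode math assumes 30 fps non-drop-frame for simplicity; NTSC is really 29.97 fps with drop-frame timecode, which this toy example ignores.

    # Sketch: pad a logged in/out point with handles before capturing.
    # Assumes 30 fps non-drop-frame timecode; real NTSC drop-frame is ignored.
    FPS = 30
    HANDLE_SECONDS = 3

    def tc_to_frames(tc):
        hh, mm, ss, ff = (int(part) for part in tc.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * FPS + ff

    def frames_to_tc(frames):
        frames = max(frames, 0)          # don't run past the head of the tape
        ff = frames % FPS
        ss = frames // FPS
        return f"{ss // 3600:02d}:{ss % 3600 // 60:02d}:{ss % 60:02d}:{ff:02d}"

    def capture_range(in_tc, out_tc, handle=HANDLE_SECONDS):
        """Return (capture in, capture out) padded by the handle on each side."""
        start = tc_to_frames(in_tc) - handle * FPS
        end = tc_to_frames(out_tc) + handle * FPS
        return frames_to_tc(start), frames_to_tc(end)

    print(capture_range("00:04:12:05", "00:04:20:17"))
    # -> ('00:04:09:05', '00:04:23:17')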
As you save each digitized clip to disk, you are asked to name it. If you have a detailed script, you can use scene or shot numbers, but editors usually develop their own shorthand for describing clips effectively in a few words or abbreviations.
When organizing work for non-linear editing, be aware that most non-linear systems use film terminology. What PC users call directories and Mac users call folders, Final Cut Pro calls bins. The term comes from the large canvas bins, with clips along the top edge, that were used to hold strips of film in a clean environment during editing.
The Final Cut Pro software in our lab is set up to store projects (small files that tell the program where to look for video) on a main server, allowing students to work from any workstation. The clips themselves are stored on each workstation's hard drive and are only copied when you "render" a new clip that combines several elements. These rendered clips should also be stored on the server in your home directory, in what Final Cut Pro calls the scratch disk. When video is "played" on a non-linear system, the hard drive is actually jumping around, reading individual frames from various parts of the drive. It isn't really a video until it is "printed" (the Mac term) to video tape or burned to a DVD.
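The idea that a project file is just a set of pointers to media can be illustrated with a hypothetical structure like the one below. This is not Final Cut Pro's actual file format, and the clip names and file paths are invented for the example; it is only a sketch of the concept that the project records where each clip lives and which portion of it is used, while the large media files stay put on the local drives.

    # Hypothetical sketch of what a project file conceptually stores:
    # references to media files plus in/out points, not the video itself.
    # (This is NOT Final Cut Pro's real file format.)
    from dataclasses import dataclass

    @dataclass
    class ClipReference:
        name: str        # editor's shorthand name for the clip
        media_path: str  # where the digitized media lives on the workstation
        in_tc: str       # portion of the clip actually used in the sequence
        out_tc: str

    project = [
        ClipReference("mayor-presser-CU", "/Media/Workstation3/mayor_01.mov",
                      "00:00:04:00", "00:00:12:15"),
        ClipReference("city-hall-WS", "/Media/Workstation3/broll_02.mov",
                      "00:01:10:00", "00:01:14:20"),
    ]

    # The project itself is tiny (a few text records), while the media files
    # it points at may be gigabytes each.
    for clip in project:
        print(f"{clip.name}: {clip.media_path} [{clip.in_tc} - {clip.out_tc}]")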
Even on the best non-linear systems with huge media drives, space for raw footage becomes a problem for longer projects (and as HD becomes more common). Editors sometimes digitize footage at low quality (high compression) to do a rough off-line edit, and then re-digitize just the footage actually used in the project at high quality using the automatic batch digitize feature of the Final Cut Pro software.
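A rough comparison shows why this off-line/on-line workflow saves space. In the sketch below, the DV figure is the commonly cited 13 GB per hour; the off-line data rate and the footage amounts are simply assumed example numbers, not real codec specifications.

    # Compare capturing everything at full DV quality with capturing everything
    # at a low "offline" quality and then re-capturing only the footage used.
    # The offline rate and footage amounts are assumed example figures.
    DV_GB_PER_HOUR = 13.0        # commonly cited approximation for DV
    OFFLINE_GB_PER_HOUR = 1.0    # assumed heavily compressed offline format

    raw_footage_hours = 6.0      # everything shot for the project (assumed)
    finished_hours = 0.5         # footage that ends up in the cut (assumed)

    all_at_dv = raw_footage_hours * DV_GB_PER_HOUR
    offline_then_online = (raw_footage_hours * OFFLINE_GB_PER_HOUR
                           + finished_hours * DV_GB_PER_HOUR)

    print(f"Capture everything at DV quality:      {all_at_dv:.0f} GB")
    print(f"Offline edit, then re-capture the cut: {offline_then_online:.1f} GB")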
On the other hand, some videographers with access to large video servers digitize whole tapes or P2 cards as single clips. These can be organized into sub-clips in FCP, or they can be used much as raw footage is in linear editing.
When digitizing footage for the Final Cut Pro software, you can choose from many formats, but at KU we generally shoot in standard-definition DV or DVCPro formats, so the format you should choose for most projects is NTSC/DV.
Department policy states that faculty members must supervise the digitizing of footage on the Final Cut Pro workstations, so we will primarily use footage prepared for class use. If you would like additional audio or video clips, or would like to digitize a special project, this is possible; discuss it with the instructor. There is an I/O workstation designed for digitizing footage to a high-speed video server called the SAN disk. We will use this server for the actual media we digitize for the group project.
High-end NLE systems today provide many features to assist the editor.
This is a good time to mention a point of confusion in computer-based video editing. The card in the computer that drives the computer's own monitor is called a video card. Virtually all computers have these cards (a few handle monitor video with a chip on the main board rather than a separate card), and they are critical for complex video editing. Most Macs come with a video card capable of supporting two monitors, which is helpful for video editing. Apple's Motion program is especially demanding on the video card and will not run at all on older Macs.
Video capture cards (sometimes just called video cards) for editors are special cards that convert analog NTSC video to a digital format the computer can handle, then convert it back to NTSC when you are ready to print your project to video tape. The newer ones also support direct input of digital information (such as the digital output from a DVCPro machine), with conversion from the tape standard to the computer's standard. FireWire is the most common of these digital input formats, but there are several others suitable for HD video.
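One reason FireWire works so well for DV input is simple arithmetic: the DV stream fits comfortably within the interface's bandwidth. The sketch below uses commonly cited approximations (about 3.6 MB/s, roughly 29 Mbit/s, for the full DV stream, and a nominal 400 Mbit/s for FireWire 400); treat both as rough figures rather than exact specifications.

    # Rough check that a DV stream fits within FireWire 400 bandwidth.
    # Both figures are commonly cited approximations, not exact specs.
    DV_STREAM_MBITS = 3.6 * 8   # ~3.6 MB/s total DV stream (video, audio, overhead)
    FIREWIRE_400_MBITS = 400    # nominal FireWire 400 bandwidth

    headroom = FIREWIRE_400_MBITS / DV_STREAM_MBITS
    print(f"FireWire 400 can carry roughly {headroom:.0f}x the DV data rate.")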
Refer to the handout on Using Final Cut Pro for detailed information on how to work with our editing lab.
Good luck and have fun. Welcome to the digital world.