Backstory: The Film That Wasn't There

Each time you go to the movie theater these days, it is increasingly unlikely that you will be sitting down to watch a film.  Many recent movies (Slumdog Millionaire, The Social Network) were shot with digital cameras, and many more are on their way.  But even movies shot on film (The Master, Lincoln) are increasingly shown via digital projector.  In 2009, only 15 percent of movie screens worldwide were digital.  Today, it’s 65%, and by 2015, it will be 85%. 

The rapid digitization of cinema seems inexorable and (at least to the uninitiated) benign, and as digital resolution has improved (partisans say it has now surpassed film in sheer fidelity) the arguments for continuing to shoot and distribute movies on actual film have become less obvious to the general public.  Nevertheless, this digital revolution has sparked a fiery debate amongst filmmakers, distributors, theater managers and cinephiles.

Digital projection is rapidly eclipsing film because it’s an economic no-brainer for distributors.  Producing hundreds or thousands of film prints (at $2,000 a pop), shipping heavy canisters around the world, keeping track of them and making sure they are returned is many times costlier than copying computer files onto discs or portable (and rewritable) hard drives, or beaming them directly to theaters via satellite or broadband.  And while digital projectors are far costlier than their film predecessors, digital distribution costs are so low that distributors are subsidizing their purchase and refusing to release some movies on film at all.

But it’s not just studio executives who love digital.  While there’s a top-down capitalist imperative driving the conversion to digital projection, the cheapness and flexibility of digital cameras, data storage, and editing software has triggered a bottom-up, populist surge in digital movie production, and the lower cost of distribution has opened new doors for small, independent moviemakers. 

But that which makes film costlier and clunkier is also what makes it special: it’s real.  Yes, all cinema is illusion – an optical trick that conjures invented worlds, but when light bounces off a movie star, passes through a lens and causes a chemical reaction on a strip of film, a physical artifact is left, visible to the naked eye.  A digital camera focused on the same actor receives the light with millions of tiny sensors, each of which translates it into an electronic signal that is then sampled by an analog-to-digital converter and stored in binary code.  To view digital footage, a computer uses algorithms to construct an approximation of the original image based on the sampled data.  A digital movie literally doesn’t exist unless it’s being viewed.

This sounds like a philosophical argument, but it has practical dimensions.  Even the sharpest digital movies do look different from films.  There’s a warm, fluid, organic quality to film that digital cinema hasn’t quite captured, and though digital distribution is less expensive, long-term digital storage is much trickier and costlier than storing film.  A modern 35 mm film stored in its canisters in a cool, dry, dark place will last 1,000 years.  The data on hard drives is much more easily corrupted, and as technology rapidly advances, the file formats in which movies are saved become obsolete.  They must either be converted or risk becoming unreadable by modern machines.  Finally, many argue that the cost and complexity of working with film forces a discipline on filmmakers, a respect for the form and for the time and energy of their collaborators.  Digital democratization certainly opens up the possibility for exciting new voices to be heard, but they may be drowned out in a din of sloppy amateurism.

As the shadow of the digital revolution falls across the titular movie house in The Flick, you may find yourself wondering: will the theater and its denizens grab hold of something real in a world that feels increasingly constructed pixel by pixel?

Alec Strum, Associate Literary Manager
December 2012