Danny Boyle's 'Steve Jobs' showcases an evolution of cinematography
The Danny Boyle-directed "Steve Jobs," which chronicles the life and achievements of the Apple co-founder and former CEO across three distinct acts, shows how Jobs himself spearheaded an evolution in consumer technology. To represent this more fully, Boyle gave each act a unique, context-appropriate "feel" by carefully selecting the technology used to capture its scenes.
According to AppleInsider, the first act, set in 1984 and focusing on the unveiling of the first Macintosh computer, "was shot by Boyle in grainy 16-millimeter film." As the website adds, "The smaller 16-millimeter film size is cheaper to shoot with, which is why it's generally used for television programs and not big-budget feature films." This, Boyle explained, gives the beginning of the movie a "homemade" feel, reflecting the grassroots beginnings of Apple and its co-founder. "It felt like it was early days, and he [Jobs] very much thought of himself as the pirate, the rebel," Boyle said.
For the movie's second act, set in 1988 at the announcement of the NeXT Computer, Boyle upgraded to 35-millimeter film. This is the gauge most commonly used for major motion pictures, and it was the standard for theatrical projection until recently.
Finally, for the third act, which focuses on the unveiling of the iMac in 1998, Boyle made an even more fitting upgrade: to digital cinematography. As AppleInsider reminds us, the iMac ditched the floppy drive; so too does the third act of Boyle's film ditch, well, film itself. The result is a more modern feel to the picture as it approaches the closing credits.
All of this makes "Steve Jobs" less a two-bit biopic and more a true piece of art, which, being a Boyle-directed movie, it was always going to be. I don't know about you, but I can't wait to see it.