Five Minutes. Five Years — BTS


As I briefly mentioned in the Seven Snake Core post, I’ve answered questions about how the Five Minutes-Five Years video was created every single time I’ve shown it. Which I completely understand. But since it’s now “out in the open” I thought I’d give you a better peek behind the scenes.

Side Note
I’ve been given a 10% promo code on all Capture One products. Just enter this at checkout if you’re interested (it’s valid until October 31st 2019):
OCT-AF-SZTU

By the way, I need to mention the inspiration behind this piece: my good friend Kevin Mullins’ unedited wedding montage—which he first presented at Photokina in 2016. If you haven’t seen it please do so...it’s simply breathtaking. I remember sitting in the audience at the time, gasping, feeling like someone had knocked me over. It stayed with me and I eventually thought I could apply this to personal images, as a sort of recap of the blog and, well, our lives by extension. So big tip of the proverbial hat to Kev.

The quantity of images is of course the main driver here, and the idea of managing each file individually in a video editor is enough to give anyone a bad case of vertigo. Which I believe is what usually leads to those questions. So here’s the secret: that is NOT how you do it.

Legacy. Still useful.

Back in the day, we didn’t have iPhones that could edit 4K video and manage thousands upon thousands of images (geez, listen to me putting my old man cap on). I remember creating opening credits for television in After Effects and having to wait 10 hours for the final render. We’re talking mayyyybe a 40 sec intro here? And of course it would sometimes (too often) crash in the middle of the night and I’d wake up to zilch, having to start over. Anyway, one of the formats used for animating images at the time was called Image Sequence. And this was nothing more than a series of images (JPEG or TIFF) held in a folder and named sequentially—image001, image002 and so forth. That’s the trick at work here: instead of importing every image individually and then animating each frame, I used Image Sequence.

This means all the photos in the video were first gathered into a Capture One album (a collection in Adobe parlance); they were then re-ordered according to how I wanted them to appear in the final movie; and they were finally exported using a sequential naming scheme. Final count? 1,871 pictures.
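Capture One handles the sequential naming on export, but if you’re doing this with another tool the renaming can be scripted. Here’s a minimal Python sketch (folder and prefix are hypothetical, not from my actual workflow) that turns a folder of exported JPEGs into an image-sequence-friendly pattern:

```python
from pathlib import Path

def rename_sequence(folder, prefix="image", ext=".jpg"):
    """Rename every JPEG in `folder` to a zero-padded sequential
    pattern (image0001.jpg, image0002.jpg, ...) so that video apps
    can recognize the files as a single image sequence."""
    files = sorted(p for p in Path(folder).iterdir() if p.suffix.lower() == ext)
    pad = max(4, len(str(len(files))))  # enough digits for the file count
    for i, p in enumerate(files, start=1):
        p.rename(p.with_name(f"{prefix}{i:0{pad}d}{ext}"))
```

The zero-padding matters: `image1.jpg, image10.jpg, image2.jpg` sorts wrong alphabetically, which is exactly the kind of thing that breaks sequence detection on import.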

[Screenshot: the assets library showing the generated image sequence names]

But the magic actually happens on import: when an app that supports the Image Sequence format encounters a series of sequentially named files, it will offer to either import the files individually (nope) or as an image sequence. Which really means that all those images will now become ONE video file as far as the editing application is concerned. In this case, I ended up breaking down the individual images into 6 image sequences (easier to manage and providing more editing freedom), which gave me a very short list of assets for such a complex edit. The generated names you see in the screenshot show the number of images in each sequence (the mix bkg audio files are leftovers I didn’t bother to clean up from the assets library).

Apple Motion 5

This film was edited entirely in Motion 5. Sometimes I’ll use iMovie on my iPad Pro to stitch a sequence together (like some of those new Core short films) but everything I do relating to video is always finalized in Motion. It’s a compositing application that’s certainly not meant for long form content, but it offers everything I need for these kinds of projects. It’s incredibly deep software but at its core it remains a timeline, layers (video + audio), keyframes and filters. Everything a boy needs to have fun. Yes, girls too. Sheesh.

Most importantly, it allows quite a bit of control over speed and time remapping of video assets, as well as frame blending—all very important for this type of work. Because the first step in making those image sequences usable is to slow.them.the.hell.down: by default, each image in the resulting video sequence appears as a single frame—that’s potentially (depending on the frame rate) around 24 individual images per second. Unless the goal is some kind of Clockwork Orange subliminal overload, that's a teeny weeny bit fast. So all these sequences were slowed down to between 14% and 20% of their original speed, stretching out their duration in the process.

Once this is done, the work simply becomes about editing (1). I did run into one pretty big hiccup however...

the hills are alive...

Logic Pro X project window.

Editing is about rhythm—even without the presence of music there will always be an inherent flow to the way images follow one another. But when there IS music present, it will immediately become a driving force: we expect cuts on a beat or a visual lull on slower passages. In this particular case—given the nature of the montage—the soundtrack became extremely important, almost acting as a director, guiding the entire assembly from start to finish. This was fine, but due to time constraints (I initially created the video for a specific talk) the first version was edited to a song by The Cinematic Orchestra (To Build A Home). Great song. But although this worked very well (the lyrics adding an interesting layer to the mix) I obviously didn’t have permission to use it. So when I decided to incorporate the video into most of my talks, I knew the soundtrack had to change. I also knew, however, that I didn't want to break the entire work in the process.

So this meant I had to write and record a new song that would maintain the original rhythm and keep all existing cuts aligned. I’d done this before (a KAGE presentation video for a festival some years ago) but it’s always a tough thing to do: the technical aspects are tedious (video and audio are imported into Logic Pro X, tempo is analyzed and a new score is built using these elements as a guide track) but there’s also a very real disconnect as the brain adjusts to new sounds behind now familiar images. It’s crazy how easily we get used to an association and it takes a while for any replacement score to feel “genuine”. Eventually, once the tempo and visual cues were set, I was able to create a soundtrack that was obviously much more personal—despite losing Patrick Watson’s ethereal voice in all its fragile, emotional glory.

Technical details aside, I feel this piece is very much about the point of what we do: of taking pictures, of documenting the ordinary, of marking the passage of time with our cameras. I don’t know...I usually mention how this might be the only work I’ll ever do that will end up meaning anything in the end. Twenty years from now if it somehow survives on a server or a hard drive or a cloud hanging in the blogosphere. It really might be.

Technical details aside…
Have a great weekend.

––––––––––––––––––––––––––––––––

1 This same method was used to create the final image montage in the GF 50mm film.