I covered the macro aspects of editing in my previous post; check that out for my take on how to use subject, balance, distance & pacing to support the music. This is the micro side: zooming in on each cut to make sure it lands on the best frame. It's one of those things that makes a small difference individually and a big difference across the whole video. When you nail your timing, the cuts flow seamlessly (sometimes invisibly) through the song, becoming another aspect that supports the performance rather than distracting from it. My cuts land on one of two cues: visual & audio.
Visual cues are motions that draw the viewer’s attention, covering up the transition of a cut and maintaining the visual flow. Visual cues include:
- a performer moving, turning, waving, nodding, jumping, etc.
- a guitar strum, piano chord, drum hit, etc.
- a light flare or lighting change
- a camera pan or zoom
Continue reading “Editing Live Music Videos: Timing Your Cuts”
For me, editing a live music video is about recreating the performance through the eyes of the audience. Imagine you were the entire audience: not just one person, but everyone in the venue. What would your collective experience be? You'd be watching the performance from the front row, from the balcony, and from the crowd. Your focus would shift from the bassist to the lead singer, to the drummer as he laid down a fill, and to the people rocking out in the front. When I'm cutting together a live concert video, I'm trying to recreate that experience: bringing the viewer into the space and creating the feeling of being right there.
My second goal is to follow the music. The musicians are already telling a story through their words and notes; my job is to support & mirror all those elements. A bad edit works against the music, cutting out of time and focusing on the wrong subjects. A good edit flows with the beat, amplifying the hits & emotional energy of the music. When I'm putting the edit together, I'm focusing on four elements: subject, balance, distance, & pacing.
Editing live music videos is mostly a pragmatic process; my main goal is just to focus on the most interesting element at any moment. On my first pass, I’ll go through all the different angles and pick out the best moments from each (I’ll also cut out any uninteresting or unusable footage). For the lead singer, I’ll pick out the verses & choruses. For the guitarist, I’ll pick out riffs that stand out and any solo sections. For the drummer, drum fills; and so on.
In my second pass, I’ll go through and pick the most interesting moments from all the angles combined. At this point, I’ve got a good starting point to build my edit from. Continue reading “Editing Live Music Videos: Recreating the Concert Experience”
I’ve been studying Alex Chaloff’s music videos recently, as they’re incredible examples of live music captures. It’s a testament to his work that even though I’m taking notes on camera angles, lighting, and sound all the way through, I can’t help but get lost in the performances.
Part of breaking down these videos is figuring out how much it might cost to put together a production like this. That’s helpful for me as it puts my rates into context; it’s also helpful for my clients to understand what goes into creating a video at this level.
There are two parts to the cost of a production: kit fees and labor costs. Kit fees cover the equipment, and labor covers the efforts of everyone involved. $600 is a standard day rate for studios & camera ops here in CO, so I'll use that for my estimates. As for the kit, a standard rate is 1% of its value per day. Let's start with a minimal shoot like this one:
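As a rough sketch of how those rates combine, here's the day-cost arithmetic for a hypothetical minimal shoot. The crew size and kit value below are assumptions for illustration, not actual quotes:

```python
# Back-of-the-envelope day cost for a hypothetical minimal shoot.
# Assumed figures: $600/day per person (the standard CO rate above),
# a crew of 4, and a combined kit value of $30,000 billed at 1% per day.
LABOR_DAY_RATE = 600
KIT_FEE_RATE = 0.01

crew_size = 4          # e.g. a director/editor plus three camera ops (assumed)
kit_value = 30_000     # assumed combined value of cameras, lenses & audio gear

labor_cost = crew_size * LABOR_DAY_RATE   # 4 x $600 = $2,400
kit_fee = kit_value * KIT_FEE_RATE        # 1% of $30k = $300
day_total = labor_cost + kit_fee
print(f"Labor ${labor_cost} + kit ${kit_fee:.0f} = ${day_total:.0f}/day")
```

Swap in your own crew count and gear list and the same two lines of math give you a first-pass quote.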
Continue reading “Breakdown: How Much Does a Live Music Video Cost?”
I shoot with the Sony a7S II, and I’m really happy with the images it produces. Unfortunately, it’s also limited by a codec which restricts it to 8-bit, low data rate video capture. In the real world, this manifests as digital artifacts that reduce the resolution of your footage and make it harder to color grade. I use the process below to clean up my footage before editing, but I’d recommend this workflow for any 8-bit video or footage with visible noise.
Digital noise is one of the biggest obstacles to high-quality video. Whereas analog noise can be aesthetically pleasing, digital noise looks cheap and pixelated. It’s a combination of the noise of your camera sensor (created by amplifying light, more prevalent at higher ISO values) and the artifacts created by the codec (created by compression, more prevalent at low data rates). A high-quality denoiser can make a world of difference on your footage; I’m using Neat Video, and I’ve been really impressed by its capabilities. Here’s a dramatic example (click here for full-size):
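Neat Video's internals are proprietary, but the core principle behind temporal denoising is simple: averaging a static detail across frames shrinks random noise by roughly the square root of the frame count. Here's a toy simulation of that principle (the pixel values are synthetic, not real footage):

```python
import random
import statistics

# Illustration only: temporal averaging, the core idea behind temporal
# denoisers. Averaging N frames of fresh random noise reduces the noise's
# standard deviation by roughly sqrt(N).
random.seed(0)
N_FRAMES, N_PIXELS, SIGMA = 16, 2000, 10.0

# Each pixel's true value is 100; every frame adds fresh "sensor noise".
frames = [[100.0 + random.gauss(0.0, SIGMA) for _ in range(N_PIXELS)]
          for _ in range(N_FRAMES)]

# Average the frames pixel-by-pixel.
averaged = [sum(frame[p] for frame in frames) / N_FRAMES
            for p in range(N_PIXELS)]

noisy_std = statistics.pstdev(frames[0])    # roughly SIGMA (~10)
denoised_std = statistics.pstdev(averaged)  # roughly SIGMA / sqrt(16) (~2.5)
print(f"single frame: {noisy_std:.1f}, averaged: {denoised_std:.1f}")
```

Real denoisers are far smarter than this — they track motion so moving subjects don't smear — but the noise-vs-signal averaging idea is the same.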
Continue reading “Digital Film: Cleaning Up 8-Bit Footage”
Codecs were one of the last things I wrapped my head around when I got into video; it's not exactly an engaging subject, and all the technical terms can be overwhelming. But depending on the camera, your codec can be a real limiting factor. Overall, I'm really happy with my a7S II. The full-frame sensor, lens system, dynamic range, and overall image quality are incredible. But no system's perfect, and the weakest point of the a7S II is its codec. So let's figure out how to work with it.
To put it simply, a codec is a file format. There's a DVD codec, a Blu-Ray codec, a YouTube codec; all the media you watch has been encoded into a format that fits the playback system. Consumer codecs are fairly standardized, but on the professional side there's a huge range of options to choose from (hence the confusion). Luckily, despite the vast number of individual codecs out there, the principles are pretty simple. Frame rate (frames per second) and resolution (number of pixels) are easy to understand, and both are set by the camera before you press record. The main factors controlled by the codec are bit rate and bit depth.
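To see why bit rate matters so much, it helps to compare the raw data coming off the sensor to what the codec actually writes to the card. A quick back-of-the-envelope calculation (simplified to full-color 4:4:4 sampling; real codecs also subsample chroma):

```python
# Why cameras need compression: raw UHD video is enormous.
# Simplified model: 3 full 8-bit channels per pixel (real codecs
# subsample chroma, so the true raw figure is somewhat lower).
width, height, fps = 3840, 2160, 30
bit_depth, channels = 8, 3

uncompressed_mbps = width * height * fps * bit_depth * channels / 1_000_000
codec_mbps = 100  # the a7S II's XAVC S codec records UHD at ~100 Mbps

ratio = uncompressed_mbps / codec_mbps
print(f"Uncompressed: {uncompressed_mbps:.0f} Mbps vs. codec: {codec_mbps} Mbps "
      f"(~{ratio:.0f}:1 compression)")
```

That roughly 60:1 squeeze is where the artifacts come from — the codec has to throw away detail to hit its data rate.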
Continue reading “Digital Film: Understanding Codecs”
I’ve been using Adobe Premiere since I first started editing (nearly ten years ago), but I’ve been curious about Blackmagic’s DaVinci Resolve. With version 14 hot off the presses, I thought I’d dive in, give it a spin and see if it could replace Premiere as my go-to NLE. After 12 hours (straight), here are my thoughts:
Your ability to edit depends more on your familiarity with the NLE than the software itself, and as it’s a pretty standardized process, you won’t find major differences between programs. With both Premiere & Resolve, I was able to cut together a multicam within an hour or so of messing around; no need to open up a manual. But if you go beyond basic edits, Premiere shows its pedigree; its tools are more refined and offer more options for power users.
Continue reading “Gear Talk: Resolve 14 vs. Premiere Pro CC”
I’m a firm believer in using a small set of tools that you know inside & out, and this applies very strongly to my audio workflow. Rather than downloading massive plugin bundles, I’d highly recommend learning to work with just one DAW, one EQ, one compressor, and one reverb until you’re intimately familiar with the principles of each. 90% of my work is done with 4 very powerful pieces of software that I’ve listed below, and I’d highly recommend them to anyone who’s doing audio work at any level.
DAW // REAPER ($60)
Your DAW is your workflow, and workflow is the most important part of the mix process. I’ve been using REAPER for years, and I wholeheartedly recommend it for everything from editing podcasts & recording live shows to mixing & mastering studio albums. REAPER isn’t just cheaper than Pro Tools & Logic; in my opinion, it’s better. There are so many things I love about this DAW; free updates for life, incredible stability, fast & intuitive clip editing tools, flexible audio routing, great hardware integration, and perfect audio quality. I was able to use it competently after a few days of experimentation, but 2 years later I bought a hard copy of the manual and realized how many incredible features I hadn’t even discovered. Continue reading “Gear Talk: Quality Over Quantity”
It took me a while to realize this, but you can buy audio software secondhand. Used software is the same as new software, but with a little patience you can buy your plugins for far less than their sticker price (usually about 50% off). Here’s a list of places where you can buy licenses; all of these are reputable communities with checks & balances to make sure you’re not paying for pirated software.
I’ve been buying from these forums for years now and I’ve only had good experiences; that said, make sure you pay through a service that allows disputes in case anything falls through.
Around a year ago I came across an enlightening post by Bob Macc on the consequences of treating your tracks with aggressive filters. You should go check it out yourself, but the big idea is that steep filters cause phase distortion that reaches all the way up and down the frequency spectrum, robbing your mix of clarity and impact. With that in mind, I've come up with some points of good practice for filtering your audio.
First, use gentle filter slopes. 6 dB/octave is my go-to for LPFs, and 12 dB/octave is my go-to for HPFs. This lessens the impact of phase distortion, but most importantly, it sounds more natural and there's less risk of accidentally cutting out the good stuff.
Second, use a Bessel curve. Bessel filters have a maximally flat group delay, which means near-linear phase and near-zero phase distortion when they're implemented correctly. While some EQs have linear phase DSP options, remember that steep filter slopes bring their own artifacts (like pre-ringing) even in linear phase mode; it's the combination of gentle slopes and Bessel curves that'll preserve your audio.
This advice is in line with the Hippocratic philosophy of mixing, i.e. “do no harm.” I'm mixing mostly acoustic music, but it doesn't matter if you're mixing orchestral, folk, punk, or EDM; phase distortion from filters is rarely something you want. Following the two points above will preserve a lot of mix clarity and let you put distortion only where you actually want it.
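If you're wondering where the 6 dB/octave figure comes from, it falls straight out of the math for an ideal first-order (one-pole) low-pass filter. This is a simplified textbook model, not any particular EQ's implementation:

```python
import math

# Ideal first-order low-pass magnitude response: |H(f)| = 1 / sqrt(1 + (f/fc)^2).
# Right at the cutoff fc this gives -3 dB; well above it, every doubling
# of frequency (one octave) costs about 6 dB more attenuation.
def lpf_gain_db(f, fc):
    return 20 * math.log10(1 / math.sqrt(1 + (f / fc) ** 2))

fc = 1000.0  # hypothetical 1 kHz cutoff
octave_drop = lpf_gain_db(8 * fc, fc) - lpf_gain_db(16 * fc, fc)
print(f"At cutoff: {lpf_gain_db(fc, fc):.2f} dB")
print(f"Attenuation gained per octave above cutoff: {octave_drop:.2f} dB")
```

A 12 dB/octave filter is just the second-order version of the same curve; each added filter order stacks on another ~6 dB per octave — and more phase shift along with it, which is exactly why the gentle slopes above are the safer default.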