Between 2012 and 2016 I worked on trailers for a multitude of games I consider to be career highlights, but two in particular stand out to me. The projects were the cutscenes and trailers for the game BattleBlock Theater, and trailers for the game Firewatch. To this day they're two of the most visible projects I've worked on, and I often encounter people excited to meet or work with me because of them. They're extra special to me because each opportunity stemmed from me spending time on goofy side projects that got the attention of the right people, and then being ready to say yes when I received the job offer. I wanted to write this post as a celebration of the side projects we do out of the joy they give us, and how sometimes they can lead to our most interesting opportunities!
In 2012 a friend of mine got a job at The Behemoth, which was working on its third game, BattleBlock Theater. I'd been a huge fan of their games, and being connected to them in any way was already a surreal experience.
That summer they launched a Kickstarter campaign for a vinyl figure of the Necromancer from their game Castle Crashers. When they heard I was an editor, they asked if I could shoot and edit the campaign video for them.
The Behemoth's games are known for their sense of humor, and while shooting take after take for the video, it became crystal clear where it all comes from. Often they couldn't contain their laughter or off-the-cuff remarks as they struggled to get through the script.
I ended up incorporating several outtakes into the final video, but I was also left with a treasure trove of goofy off-script improv that was too funny to ignore. So I made an alternate cut exclusively for The Behemoth's eyes, with some crudely animated segments I made in After Effects using simple animations, Motion Sketch, and some particle effects.
I spent a couple days on the video because, even though I knew only a handful of people would ever see it, I couldn't help but try to make it as funny as possible. They loved it and had a good laugh. More importantly, something in my crudely animated video caught the eye of Dan Paladin (The Behemoth's co-founder, art director, and animator).
I'm not going to post the video, but its style was very much in the vein of this:
For BattleBlock Theater, Dan was animating the story cutscenes by hand in Flash. The cutscenes are in the style of a stick-figure puppet theater, which is quite laborious to do via keyframe animation. What he saw in my video was that doing stick-figure puppets in After Effects would be dirt simple via Motion Sketch. The game had already been in development for a few years, and with only two artists it was still a monumental task to finish the art for the game, never mind animating eight cutscenes of up to a few minutes in length each. So he asked me if I thought I could take on the job of animating all the cutscenes for the game.
Dan's original hand animated version of the opening cinematic (the script also changed quite a bit)
I said I'd think about it, and then investigated physics plugins for After Effects. This is when I stumbled upon the relatively new plugin Newton. I did a very crude test of a puppet with floppy limbs, and some figures in a boat. It was very rudimentary, but Dan liked the look of it.
So when he asked me again if I could do it, I said YES!
It was by far some of the most fun I've ever had on a project! It took several months of work and a lot of problem solving for looks I didn't know how to create in After Effects. I learned a TON along the way, and if I ever even think about the first After Effects project I made for the game, I cringe at how horribly set up it is. I'm always reticent to take too much credit for the cutscenes since so much is owed to Dan for the art and Will Stamper for the audio and voiceover, but I'm tremendously proud of what I did.
To think this all came about from a silly video I did just to entertain myself, and some friends.
Of course you can wave your hands, and say that I only got that opportunity because I was friends with someone who worked at the company. That is partially true, but had I not made the silly video, the idea of me being hired for the gig would've never come up.
Some opportunities are out of sheer luck, a lot of them are because of friends, some we make on our own, most are a combination of the three. What's ultimately important is being ready when they come up.
In 2014 Campo Santo announced their first game, Firewatch. There was a lot of industry attention on the game because of the talented people who formed the company. A few of them, Jake Rodkin, Sean Vanaman, and Chris Remo, were also part of the Idle Thumbs podcast, which I'd been an avid listener of for several years. So it would be an understatement to say that I was very much looking forward to this game.
After seeing the first footage from the game I knew that I wanted to make a fan trailer when it came out. It has a beautiful art style, and a story told via interactive dialogue between the main characters. Little did I know that a year prior to the announcement I had already planted the seeds that would get me the gig of working on the Firewatch E3 trailer.
Around fall 2013, Valve introduced their Steam Controller, but leading up to its announcement they released a teaser website with some cryptic iconography on it. There was a lot of speculation surrounding the website, and on the Idle Thumbs podcast they put forward their incredibly silly theories. After hearing their discussion I couldn't help but animate it in After Effects the same week the podcast came out, and I put it up on YouTube that weekend.
I had a lot of fun making it, and the Idle Thumbs guys even put it on their blog, and retweeted it!
Later that year, Google purchased the robotics company Boston Dynamics, which birthed another ridiculous Idle Thumbs discussion about how this was going to lead to humans being taken over by robots. It was a much longer conversation, but it had me laughing so much envisioning it that I once again couldn't help but animate it in After Effects. This one was nearly seven minutes long, required some thumbnail storyboarding, and took about a week of work.
They also featured this video on the Idle Thumbs blog!
At this point I still had no direct contact with anyone on the podcast save for some Twitter comments and retweets, but it still never occurred to me to ask them if they were looking for a trailer editor. The following year Campo Santo was at PAX East 2014 to show Firewatch to the public and to hold a separate Idle Thumbs panel.
I eagerly attended the Firewatch panel, and got very excited for the game. Later I met Jake and Sean on the PAX show floor, talked to them a little bit, and gave them a business card. Even then I still didn't think that they'd actually hire me for the job, but I allowed myself to imagine the possibility that they would.
At Double Fine's PAX party I met the rest of the Campo Santo team, and found out their graphics programmer Paolo Surrichio was a HUGE fan of my trailer for the Steam version of BattleBlock Theater! When I get compliments for that trailer I'm usually quick to give credit to Stamper who took my basic premise and bullet points, and sent me liquid gold in audio form. Again, it didn't occur to me then that this might've helped me get the Firewatch trailer job, but in retrospect it might've at least kept my name on the radar.
Skipping to the good part, six weeks before E3 2015 I got a DM on Twitter from Sean asking if I was available to make a trailer that would debut at the Sony Press Conference. They were too busy working on the game to devote time to the trailer. I was positively dumbfounded, and gave an emphatic YES!
For this blog post I asked Sean to confirm the story of how they decided to hire me.
We decided to hire you because your Idle Thumbs fan videos actually communicated two things to us: one, that you were a very competent editor and two, that you shared our bullshit sensibilities in terms of humor and entertainment. That was a good mix. PLUS when we learned that you were doing the job for an agency, and therefore had been in a professional environment, we felt pretty confident that you'd deliver. All of those things together made it a no-brainer.
ALSO, personally, we realized it was so so so much better to get someone else to cut our trailers instead of ourselves, despite Jake and I being really competent with editing suites and making professional-looking video content. By having someone new cut your stuff you find out what's good about it from his or her perspective, instead of your own, which, thanks to years of development, has become pretty myopic.
You can read my post about making that trailer on the Campo Santo Blog. After that trailer I also made the Launch Mini-trailers and Accolades Trailer that announced the Xbox One version and Audio Tour mode. Just for fun I also made a Cinematic Playthrough of the game so that my mom could see what I'd been working on.
After I delivered the final version of the Firewatch E3 trailer I had one more thing for them. After the PAX Firewatch panel there was an Idle Thumbs panel where a reader mentioned he thought Firewatch looked like Gone Home, but in a forest. They had a laugh about it, but I thought to myself: "What if Campo Santo and Fullbright were petty, bitter rivals? What kind of trailer would come from that perspective?" (To be clear, the people at both companies are very good friends)
The result was another incredibly goofy video I spent no more than a day working on. Long story short, everyone at Campo Santo loved it, and sent it to Steve Gaynor, co-founder of Fullbright. Later via this connection I got to work on the launch trailer for the console release of Gone Home, which was another dream gig!
Side projects can be a wonderful thing, but it is also still important to have work experience. That is what can separate you from even the most talented editors posting on YouTube. If you're at the beginning of your career, it's important to find work at a post house so that you have the discipline required to work with others, and hit deadlines. If you can't finish a job in a timely manner then it doesn't matter how talented you are.
Multiple times I've had people mention that they were impressed with my side project work, but knowing I had worked at actual production companies is what gave them the confidence to hire me for the job.
In retrospect, what I realize is that side projects can show how passionate a person is about their work. When I think of my friends' side projects, it shows me they don't work just for the money, and that makes me more interested to work with them.
So if you have side projects you want to work on, then do it! Don't necessarily think about job opportunities they'll create while doing it, because that might take the joy out of it. Do what you find enjoyable, and you never know where it might lead!
Sometimes you need a sound effect or piece of music to reverb out. It could be a scream in a horror trailer, or something else that rings out over a cut to black. Most commonly I use it when I want a piece of music I'm editing to end on a specific note, creating a stop-down moment in a trailer. Mostly it just gives me more options for editing music for trailers.
This is very simple to do, but can take some finessing.
1. Find the precise note you want the cue to reverb out on.
2. Cut off everything after that note, or just before the next note starts. To get a clean reverb you don't want any sounds coming in after.
3. Make a cut a second or two before the note you're reverbing starts. This is to give some head to the clip so that you can fade from the "normal" clip to the "wet" clip with the reverb effect.
4. Nest that section into a new sequence.
5. Add a small fade at the end of the clip to avoid an audio pop, then add black slug into the new sequence. This will extend the duration of the nested clip.
6. Apply these reverb settings to the nested clip in the main sequence.
7. Fade between the "normal" and "wet" clips and tweak to your liking. Usually it works best if you fade into the reverbed clip as late as possible so that you can't hear the effect before then.
8. Adjust the audio levels of the "wet" clip so you don't have a volume spike.
And that's it! This technique is a great thing to have under your belt because it opens up your options for editing music SIGNIFICANTLY. I always tell new editors to learn all sorts of ways to bend music to their will, and this is one of the key techniques I use to do just that.
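If it helps to see the idea outside an editing suite, the steps above can be sketched in code. This is purely a hypothetical Python/NumPy illustration of the concept, not anything from Premiere; a simple feedback echo stands in for a real reverb plugin:

```python
import numpy as np

SR = 48000  # sample rate in Hz

def decaying_echo(dry, delay_s=0.08, decay=0.5, tail_s=2.0):
    # Stand-in for a real reverb plugin: a feedback echo, with extra
    # silence appended so the effect has room to ring out (like the
    # black slug extending the nested sequence in step 5).
    delay = int(delay_s * SR)
    out = np.concatenate([dry, np.zeros(int(tail_s * SR))])
    for i in range(delay, len(out)):
        out[i] += decay * out[i - delay]
    return out

def reverb_out(dry, fade_s=0.5):
    # Crossfade from the "normal" clip into the "wet" clip as late as
    # possible, so the effect isn't audible before the cut (step 7).
    wet = decaying_echo(dry)
    fade = int(fade_s * SR)
    start = len(dry) - fade
    ramp = np.linspace(0.0, 1.0, fade)
    mix = wet.copy()
    mix[:start] = dry[:start]  # pure dry signal before the fade begins
    mix[start:len(dry)] = dry[start:] * (1 - ramp) + wet[start:len(dry)] * ramp
    return mix  # everything past len(dry) is the reverb tail ringing out

# A one-second 440 Hz "note", cut off cleanly right after it plays (steps 1-2):
t = np.linspace(0, 1, SR, endpoint=False)
note = 0.5 * np.sin(2 * np.pi * 440 * t)
result = reverb_out(note)
print(len(result) / SR)  # → 3.0: one second of music plus a two-second tail
```

The key move is the same as in the timeline: the dry clip plays untouched until the last moment, then hands off to a wet copy whose tail extends past the original end.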
Notes for Avid Media Composer, Final Cut Pro 7, and Final Cut Pro X.
Final Cut Pro 7
You can use the exact same steps for Final Cut Pro 7, just use these settings with the "Reverberation" Final Cut Pro filter. If it's not quite the sound you want, then experiment with the other presets.
Avid Media Composer
Since Media Composer can't nest clips the same way as Premiere and Final Cut Pro, after step #3 make an edit a few seconds after the end of the clip you want to reverb.
Select the clip along with the slug and create an Audio Mixdown.
Apply the D-Verb Audiosuite filter to the new clip, then adjust the reverb settings, audio levels and transition as needed.
Final Cut Pro X
There are a couple ways you can do this in Final Cut Pro X. You can do it the same way as before with a compound clip.
Open the compound clip and insert a "gap clip" at the end.
In the main sequence extend out the reverbed clip and apply the AUMatrixReverb filter to reverb out the audio. Trim the normal clip to the right, add a transition and that's it! I like this method because you can use the trimming tools to adjust where the cut is.
For the second method, press "Shift-H" (in the Command List this is "Retime: Hold"); this will create an empty clip after that section.
After that you can apply the reverb effect! You can also adjust where the hold frame starts by clicking the edge of the green retime segment, and click "Edit" which will give you a source frame icon you can move.
This only applies if you're not doing the final audio mix yourself.
If for the final sound mix you're sending an OMF to the mixer, be sure to "bake" in the reverb effect by exporting it as a separate AIFF or WAV. Give the clip enough head and tail so that the sound mixer can smoothly transition to the reverbed clip. If an assistant editor is sending out final audio tracks, make sure to let them know that this needs to be done!
Alternatively, your sound mixer most likely has vastly superior tools to make the audio reverb out, so the other solution is to give them the raw audio with a note about how it needs to reverb out.
I’ve made trailers for several video games, but Cosmic Trip by Funktronic Labs is the first VR game I’ve worked on. Since VR games are still new, people are experimenting with how best to present them in trailers.
The mixed reality trailers for Fantastic Contraption and Google’s Tilt Brush do a fantastic job of showing a person inside the experience. Other trailers I’ve seen either intercut live action footage of the player with gameplay, use picture-in-picture windows, or just focus on gameplay.
For non-narrative trailers I typically use title cards to present the game features, but for Cosmic Trip I decided to simply show the game in tightly choreographed gameplay. My thinking was that the Vive audience is eager to see new VR mechanics, so I didn’t have to make a hard sell. Cosmic Trip is also one of the first real-time strategy VR games, which makes it an extra novel thing to see.
Funktronic sent me a build of the game so I could brainstorm ideas for the trailer. It was still early in development, so there were only a handful of features. But just like game tutorials, it's important for trailers to show features to the audience without overwhelming them. Starting the trailer with game capture of an expert player would risk confusing the audience. If the audience doesn't understand what they're watching or where to look, they're likely to zone out or just stop watching.
After playing the game I homed in on the basic things I should highlight, and in what order. I'd start simple with the basics of the controllers, then show some of the base-building mechanics, then finish off with combat against multiple enemies to give just a taste of what's in store for the game. This would let me build nicely upon each idea of the game, and hopefully not confuse the audience.
I saw right away that the teleport function in the game would allow me to seamlessly cut together pieces of capture because there’s a sudden shift in perspective as you zoom from one location to the other. This reduced the number of consecutive actions I had to capture in each take, which would save me a lot of time because if the trailer had to be one very long take there’d be more room for mistakes.
As much as I wanted the gameplay to be unedited takes, a few features such as the crystal mining took up time I didn't want to spend waiting. So I had Funktronic capture some cutaway shots of the drone mining, then returning to the refinery with the crystal. To avoid a jump cut, it was important that the mining cutaway feel entirely different from the player's first-person point of view. So it was captured with a locked camera from a perspective impossible for the player to achieve from the playfield.
Since I wanted the trailer to move quickly, I choreographed movements so there was always something going on. In the first shot I wanted to look at the planet, then quickly move on to attacking enemies with the controller discs. Funktronic added a feature that allowed me to spawn enemies by pressing the grip button on the controller. This is how I could time the first shot so that I was ready to attack by the time the enemy spawned.
In the crystal mining segment I took special care to focus attention on the game menus. We wanted to showcase the UI of the game, and also give time to read the names of objects I was using. There’s a bit of downtime when waiting for drones to build, so I positioned the flying robot friend in a spot I could turn to, and give it a quick wave before looking back just in time to see the drone pop out of the machine. I used the same approach for building the crystal refinery.
The final shot took the longest to capture by far. In my special build, the third area was set up with machines that had batteries in them. I knew that at this point in the trailer the audience would already have seen machine building and battery insertion, so this setup let me skip straight to the laser drone building. The Mothership's approach was triggered by my teleportation to the third area (it otherwise stayed in the sky).
Originally I was going to do several cutaways to the giant Mothership in order to build suspense while also spawning several enemies that would be attacked by the laser drones. My first attempts ended up being far too chaotic for the eye to read, so I had to focus on what the final shot needed to communicate.
The most important thing was to show the laser drones, and have them attack the enemies dropped by The Mothership. I also wanted to whip out the controllers simultaneously because it’s really fun to do in the game. Lastly I wanted to tease the audience with the game’s turret.
With all this in mind I spent at least three hours doing take after take in order to get it just right. Most of the time was spent exploring just the right sequence of actions to take. For example, it wasn't until much later that I decided to look up at the Mothership during construction to add some suspense. Once I knew what I wanted, it was all about performing those movements just right.
There were a myriad of reasons takes became unusable. For example, sometimes enemies would spawn too far apart, which ruined my framing, but more often I'd mess up one movement in the choreography. The worst, but easiest to avoid, mistake was leaving the mouse cursor in the middle of the screen, which I wouldn't know about until I took off my headset. I blame at least a few of those ruined takes on my cats nudging my mouse.
I captured the title shot by first getting the robot friend in the right position in the last location, teleporting out, and then as soon as I teleported back in I slowly turned my head to the side. This took a bunch of takes just because I wanted the robot in the right position on the screen where you could see it but not have it be blocked by the text on screen.
The music was based on the timing of a rough cut edited using pieces of music that had already been composed. While the trailer doesn’t heavily sync to the music, there are a handful of moments that I specifically synced the capture to for some extra flair. In the future I might run the trailer’s music in a loop while performing game capture so that I can set my movements to the beat, and get that extra level of sync.
Some tips I learned for capturing VR with hand controllers!
1. Add motion smoothing to the monitor window! Unless you’re an amazing hula dancer or this chicken, the headset will capture every single small movement of your head. Trust me when I say that watching footage like this will eventually make you motion sick, and of course you don’t want your audience to get nauseated either. The framerate of the headset might be impacted while capturing because of the extra work the game has to do, but it's worth it.
2. Make sure your mouse cursor isn’t visible on the monitor! I wasted a bunch of time because of this. Make sure the mouse is in an area where it won’t get bumped by either yourself or curious pets. You're obviously not going to be able to see your computer monitor while capturing, so it doesn't hurt to double check.
3. Find your “dominant eye” and make sure that’s the eye you’re looking through when capturing. If you capture with both eyes open, there’s a good chance that when you think your controllers are centered, they’ll actually be off to the left or right.
You can find your dominant eye by forming a small hole between your two hands, and looking through it with both eyes open at a small object. Close your eyes one at a time, and the eye that can still see the object is your dominant eye! I learned this years ago when I took archery classes, and supposedly that tells you which side of your brain grew first.
4. Do test captures to find where the center of the screen is. That's where your controllers should "live" during capture. It's usually much higher up than you think, and it'll feel totally unnatural while doing it.
5. When looking at objects, be very deliberate with your head movements. In real life we move our eyes around a lot, and move our head very little. But if you do this for game capture you’ll end up with a lot of objects off the edges of the frame. If you want to make sure something is in the center of the frame you’ll have to exaggerate where you look.
6. Take it slow! People are much better with their hands than with a traditional controller, so it can be easy for you to perform actions faster than the camera or your audience can read. Be very deliberate so that everything registers. Of course this one depends on the style of game that you’re making. It might actually be the perfect approach to be chaotic for your game, it all depends.
7. Carefully choreograph the movements and actions you make in the game. If you’re not going to rely on a lot of editing then the timing of the trailer is all in the performance. The more you can mitigate randomness via special tools or scripted events, the easier it will be to get multiple takes while capturing.
8. VR capture with hand controllers is a performance! Capturing for traditional video games with a controller is also a performance, but it makes a big difference when every nuance of your head and hand movements is tracked. Your movements say a lot about you and/or the emotion of your actions. A simple hand wave can be happy, sarcastic, insincere, or dull. VR with hand controllers, more than any other type of game capture, made me consider just how much the simplest of movements can communicate to the audience. Also, it kind of makes me want to take dance classes.
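To make tip #1 a bit more concrete, the motion smoothing can be thought of as a low-pass filter on the headset's rotation. Here's a hypothetical sketch in plain Python (the function name is invented, and real engines smooth quaternions rather than raw angles) of an exponential moving average over per-frame head angles:

```python
def smooth_camera(raw_angles, alpha=0.1):
    # Exponential moving average over per-frame (yaw, pitch) head angles.
    # alpha near 0 -> heavy smoothing (steady but laggy spectator camera);
    # alpha near 1 -> the camera follows every tiny head twitch.
    smoothed = [raw_angles[0]]
    for yaw, pitch in raw_angles[1:]:
        prev_yaw, prev_pitch = smoothed[-1]
        smoothed.append((
            prev_yaw + alpha * (yaw - prev_yaw),
            prev_pitch + alpha * (pitch - prev_pitch),
        ))
    return smoothed

# A nervous head wobble of +/- 2 degrees around a steady 10-degree gaze:
jittery = [(10 + (-1) ** i * 2.0, 5.0) for i in range(60)]
steady = smooth_camera(jittery)
# The smoothed yaw settles close to 10 degrees instead of flicking around.
```

Whatever smoothing option your game or capture tool exposes is doing something in this spirit: trading a little responsiveness for footage that won't make the audience seasick.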
Consider whether you want the players in your trailer to look like they're just having some casual fun playing a party game, or like a total lord who deftly executes the perfect actions, giving the audience an idea of how amazing it feels to be good at the game. Either way, you'll still need a certain amount of choreography for it to look nice and readable to the audience.
Already I can tell that people with physical training, whether dance or martial arts, will be at an advantage for first-person VR capture. I don't know if my nine years of martial arts practice helped me do capture for this trailer, but I'd like to think it did. At the very least, it meant I didn't get physically tired from hours of capturing. I can easily imagine a future where "VR performance" is listed as a job skill.
That's it for now! I'm sure in the future I'll learn much more about what it takes to make a good trailer for a VR game, but this is what I've learned so far.
Just some technical tidbits. This was captured on a rig running an Intel Core i7-4790K, GTX 970 and 8GB of RAM. Video was captured using Nvidia’s Shadowplay.
Cosmic Trip is now available on Steam Early Access, and you can follow Funktronic Labs on Twitter!