I’ve made trailers for several video games, but Cosmic Trip by Funktronic Labs is the first VR game I’ve worked on. Since VR games are still new as of this writing, people are experimenting with how best to present them in trailers.
The mixed reality trailers for Fantastic Contraption and Google’s Tilt Brush do a fantastic job of showing a person inside the experience. Other trailers I’ve seen either intercut live action footage of the player with gameplay, use picture-in-picture windows, or just focus on gameplay.
For non-narrative trailers I typically use title cards to present the game features, but for Cosmic Trip I decided to simply show the game in tightly choreographed gameplay. My thinking was that the Vive audience is eager to see new VR mechanics, so I didn’t have to make a hard sell. Cosmic Trip is also one of the first real-time strategy VR games, which makes it an extra novel thing to see.
Funktronic sent me a build of the game so I could brainstorm ideas for the trailer. It was still early in development, so there were only a handful of features. But just like game tutorials, it’s important for trailers to show features to the audience without overwhelming them. Starting the trailer with game capture of an expert player would risk confusing the audience. If the audience doesn’t understand what they’re watching or where to look, they’re likely to zone out or just stop watching.
After playing the game I homed in on the basic things I should highlight and the order to present them in. I’d start simple with the basics of the controllers, then show some of the base-building mechanics, then finish off with combat against multiple enemies to show just a taste of what’s in store for the game. This would let me nicely build upon each idea of the game, and hopefully not confuse the audience.
I saw right away that the teleport function in the game would allow me to seamlessly cut together pieces of capture because there’s a sudden shift in perspective as you zoom from one location to the other. This reduced the number of consecutive actions I had to capture in each take, which would save me a lot of time because if the trailer had to be one very long take there’d be more room for mistakes.
As much as I wanted the gameplay to be unedited takes, a few features such as the crystal mining took up time I didn’t want to spend waiting. So I had Funktronic capture some cutaway shots of the drone mining, and then returning to the refinery with the crystal. To avoid a jump cut, it was important that the mining cutaway feel entirely different from the player’s first-person point of view. So it was captured with a locked camera from a perspective impossible for the player to achieve from the playfield.
Since I wanted the trailer to move quickly, I choreographed movements so there was always something going on. In the first shot I wanted to look at the planet, then quickly move on to attacking enemies with the controller discs. Funktronic added a feature that allowed me to spawn enemies by pressing the grip button on the controller. This is how I could time the first shot so that I was ready to attack by the time the enemy spawned.
In the crystal mining segment I took special care to focus attention on the game menus. We wanted to showcase the UI of the game, and also give time to read the names of objects I was using. There’s a bit of downtime when waiting for drones to build, so I positioned the flying robot friend in a spot I could turn to and give a quick wave before looking back just in time to see the drone pop out of the machine. I used the same approach for building the crystal refinery.
The final shot took the longest to capture by far. In my special build, the third area was set up with machines that had batteries in them. I knew that at this point in the trailer the audience would already have seen machine building and battery insertion, so this setup let me skip straight to the laser drone building. The Mothership’s approach was triggered by my teleportation to the third area (it otherwise stayed in the sky).
Originally I was going to do several cutaways to the giant Mothership in order to build suspense while also spawning several enemies that would be attacked by the laser drones. My first attempts ended up being far too chaotic for the eye to read, so I had to focus on what the final shot needed to communicate.
The most important thing was to show the laser drones, and have them attack the enemies dropped by The Mothership. I also wanted to whip out the controllers simultaneously because it’s really fun to do in the game. Lastly I wanted to tease the audience with the game’s turret.
With all this in mind I spent at least three hours doing take after take in order to get it just right. Most of the time was spent exploring just the right sequence of actions to take. For example, it wasn’t until much later that I decided to look up at the Mothership during construction to add some suspense. Once I knew what I wanted, it was about performing those movements just right.
There were a myriad of reasons takes became unusable. For example, sometimes enemies would spawn too far apart, which ruined my framing, but more often I’d mess up one movement in the choreography. The worst mistake, but the easiest to avoid, was leaving the mouse cursor in the middle of the screen, which I wouldn’t know about until I took off my headset. I blame at least a few of those ruined takes on my cats nudging my mouse.
I captured the title shot by first getting the robot friend in the right position in the last location, teleporting out, and then as soon as I teleported back in I slowly turned my head to the side. This took a bunch of takes just because I wanted the robot in the right position on the screen where you could see it but not have it be blocked by the text on screen.
The music was based on the timing of a rough cut edited using pieces of music that had already been composed. While the trailer doesn’t heavily sync to the music, there are a handful of moments that I specifically synced the capture to for some extra flair. In the future I might run the trailer’s music in a loop while performing game capture so that I can set my movements to the beat, and get that extra level of sync.
Some tips I learned for capturing VR with hand controllers!
1. Add motion smoothing to the monitor window! Unless you’re an amazing hula dancer or this chicken, the headset will capture every single small movement of your head. Trust me when I say that watching footage like this will eventually make you motion sick, and of course you don’t want your audience to get nauseated either. The framerate of the headset might be impacted while capturing because of the extra work the game has to do, but it's worth it.
2. Make sure your mouse cursor isn’t visible on the monitor! I wasted a bunch of time because of this. Make sure the mouse is in an area where it won’t get bumped by either yourself or curious pets. You're obviously not going to be able to see your computer monitor while capturing, so it doesn't hurt to double check.
3. Find out which eye the monitor screen is displaying. Usually it’s the left, but double check so that everything is centered according to that eye’s view.
4. Do test captures to find where the center of the screen is. That’s where your controllers should “live” during capture. It’s usually much higher up than you think, and it’ll feel totally unnatural while doing it.
5. When looking at objects, be very deliberate with your head movements. In real life we move our eyes around a lot, and move our head very little. But if you do this for game capture you’ll end up with a lot of objects off the edges of the frame. If you want to make sure something is in the center of the frame you’ll have to exaggerate where you look.
6. Take it slow! People are much better with their hands than with a traditional controller, so it can be easy to perform actions faster than the camera or your audience can read. Be very deliberate so that everything registers. Of course, this one depends on the style of game you’re making. Being chaotic might actually be the perfect approach for your game; it all depends.
7. Carefully choreograph the movements and actions you make in the game. If you’re not going to rely on a lot of editing then the timing of the trailer is all in the performance. The more you can mitigate randomness via special tools or scripted events, the easier it will be to get multiple takes while capturing.
8. VR capture with hand controllers is a performance! Capturing for traditional video games with a controller is also a performance, but it makes a big difference when every nuance of your head and hand movements is tracked. Your movements say a lot about you and/or the emotion of your actions. A simple hand wave can be happy, sarcastic, insincere, or dull. VR with hand controllers, more than any other type of game capture, made me consider just how much the simplest of movements can communicate to the audience. Also, it kind of makes me want to take dance classes.
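The motion smoothing from tip 1 usually amounts to an exponential filter: each frame, the monitor camera moves only a fraction of the way toward the headset’s actual pose instead of snapping to it. Here’s a minimal one-dimensional sketch in Python of that idea applied to a yaw trace; `smooth_series` and the `alpha` value are my own illustration, not any engine’s API.

```python
def smooth_series(raw, alpha=0.15):
    """Return an exponentially smoothed copy of a list of angles (degrees).

    alpha near 0 -> heavy smoothing (calm but laggy monitor camera);
    alpha near 1 -> little smoothing (follows every head jitter).
    """
    smoothed = []
    current = raw[0]
    for sample in raw:
        # Each frame, move a fraction of the way toward the new reading.
        current += alpha * (sample - current)
        smoothed.append(current)
    return smoothed

# A jittery head-tracking trace: the smoothed version swings far less
# from frame to frame, which is what keeps viewers from getting queasy.
jittery = [0.0, 2.0, -1.5, 2.5, -2.0, 1.0, -1.0, 0.5]
calm = smooth_series(jittery)
```

The trade-off is latency: the smoothed camera lags behind what you actually see in the headset, but the audience watching the flat monitor capture never notices.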
Consider whether you want the players in your trailer to look like they’re just having some casual fun playing a party game, or like a total lord who deftly executes the perfect actions, giving the audience an idea of how amazing it feels to be good at the game. Either way, you’ll still need a certain amount of choreography to make it look nice and readable to the audience.
Already I can tell that people with physical training, whether dance or martial arts, will be at an advantage for first-person VR capture. I don’t know if my 9+ years of martial arts practice helped me do capture for this trailer, but I’d like to think it did. At the very least, it meant I didn’t get physically tired from hours of capturing. I can easily imagine a future where “VR performance” is listed as a job skill.
That's it for now! I'm sure in the future I'll learn much more about what it takes to make a good trailer for a VR game, but this is what I've learned so far.