When we first started preparing to shoot Act of Valor, over four years ago, we had to decide what cameras to use. Up to that point, all of our work at Bandito Brothers had been shot on HD-Cam, XD-Cam HD, P2, HDV, and film. Shooting stereoscopic 3D was considered, but dismissed after watching some NFL test footage. I think Scott Waugh was the first to point out that the level of immersion and intensity we were after would make our audience sick if viewed in 3D. We needed a lightweight camera system that operated well in low light, since SEALs usually work at night.
Mouse McCoy had long before envisioned someday being able to shoot professional video on a camera with a DSLR form factor. The Nikon D90 was released in August ’08, and we purchased one immediately to try it out. We liked the form factor, and 24p was great, even if it was limited to 720p. But high-motion shots clearly didn’t hold up at all under the heavy compression of its Motion JPEG recording. Since everything we shoot is action-packed, high-adrenaline footage, this limitation was a deal-breaker. The D90 was clearly not the fulfillment of McCoy’s vision, but the technology was getting closer.
The Canon 5D MkII was released in Sept ’08, but after our first disappointment, it took us a little longer to look into it. Kevin Ward, who had been the DP on many of our previous projects, brought one by the office one afternoon to play with. This camera clearly had more potential, but there had been a lot of positive feedback and buzz after the initial release, so camera bodies were hard to find. We ordered our own as soon as I found a decent deal that was actually available.
Our first major dedicated shooting test was an all-night endeavor north of LA, with night-scopes and a variety of lenses and lights. The image quality of the video from the 5D was amazing (once you converted it to something that could actually play back on an edit system), and it really performed well in dark scenes. Plus it could use many existing lenses. But there were downsides: the 30p limitation was the biggest one by far, and there was also no manual control over the shutter speed or the aperture. By adapting the camera to use manual Nikon lenses, we were able to override the aperture limitation, and the shutter speed could be temporarily “locked” by aiming closer to or farther from the lights, but this had to be done again for every take.
Our first major project shot primarily on the 5D was done with Shane Hurlbut. The results were beautiful, but we were constantly tricking the camera into doing what we wanted. Reinterpreting every shot to 29.97 and re-syncing the audio was extra work for post, but the native H.264 files didn’t work well in any NLE software, so we were already transcoding everything anyway. All of Bandito’s previous commercial work had been natively 24p, but for a few months, all our cameras were set to 29.97 to be compatible with the 5D footage.
The 1.1 firmware update for the 5D gave us manual control over the exposure. The process on set became easier, but the post issues from shooting 30p remained. We started looking into motion-compensated frame-blending software, so we could return to shooting and editing at 24p and still use content from the 5D. I asked around at NAB ’09 and found RE:Vision Effects and their plug-in Twixtor, which I may have used more than any other person on the planet by now. Back in early 2009, it took an hour to convert one minute of 5D source footage to 24p on an 8-processor Core2 based Xeon workstation. Both the hardware and software have improved since then, so it is about 10 times faster than that now.
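The timing math behind that conversion is simple, even though the hard part (the optical-flow interpolation Twixtor actually performs) is not. As a rough sketch, here is how output frames at 23.976 map onto source frames at 29.97, with naive linear blend weights standing in for real motion compensation (the function names are illustrative, not Twixtor’s API):

```python
from fractions import Fraction

# Exact rational frame rates avoid floating-point drift.
SRC_FPS = Fraction(30000, 1001)   # 29.97p source (5D footage reinterpreted from 30.00p)
DST_FPS = Fraction(24000, 1001)   # 23.976p editing time-base
RATIO = SRC_FPS / DST_FPS         # exactly 5/4: every 4 output frames span 5 source frames

def blend_weights(out_frame):
    """For one output frame, return the two source frames to blend and the
    weight of the second one (a weight of 0.0 means use frame_a unchanged)."""
    pos = out_frame * RATIO       # fractional position in the source stream
    a = int(pos)
    return a, a + 1, float(pos - a)

# Output frame 1 lands a quarter of the way between source frames 1 and 2:
# blend_weights(1) -> (1, 2, 0.25)
```

A true motion-compensated converter replaces that linear blend with per-pixel optical-flow warping, which is why the conversion was so much slower than a simple frame blend.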
The other thing we had to look at was our creative editorial solution. All of our commercials up to that point had been cut in Premiere Pro 2.0 and CS3, on Matrox AxioLE and Cineform ProspectHD based systems. Still running 4GB of RAM on XP32, those systems were maxed out on our commercials, with only 10 hours of source material. We knew Avid was the only reliable option at the time for a project with hundreds of hours of source footage, but nearly all of our existing tools and experience were with Premiere Pro. It was at this point that we were introduced to Siobhan Prior, who became an integral part of the post team for the rest of the project. Her experience using Avid on traditional feature films was the exact opposite of mine using Premiere Pro to integrate all sorts of unusual formats into short commercials. Our long debates over the “best” way to do things from our contrasting perspectives eventually led us to a pretty solid workflow solution after a few months of discussion and experimentation.
I set up our first Media Composer system in the spring of ‘09, and started learning how to interface it with the Adobe Creative Suite. The biggest issue in that regard was that we were changing the time-base of our source assets from 30p to 24p during the edit, so it was a bit tricky to figure out how to keep things straight for the online process. Importing 30p material into a 24p Avid project causes it to drop frames as it converts to DNxHD36 MXF files. The resulting 24p EDL from Avid doesn’t match the time-code of the original 30p Canon .MOV files, but it DOES match the frame count of the Twixtor-processed 24p conversions in Premiere. This is because Premiere can ignore the “real” time-code of a file and just convert the desired in-point to a frame count from the beginning of the file. That allowed us to move sequences from the Avid to Premiere with relative ease, as long as all of our source assets were pre-converted to 24p and they all had unique filenames.
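The frame-count trick is easy to illustrate. In a simplified version of what Premiere is doing, the EDL source in-point becomes an offset from the head of the file, so the converted 24p clip lines up even though its embedded time-code differs from the original 30p .MOV (these helper names are my own, not Premiere’s internals):

```python
def tc_to_frames(tc, fps=24):
    """Convert an HH:MM:SS:FF timecode string to a total frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def in_point_offset(src_in, clip_start_tc="00:00:00:00", fps=24):
    """Frame offset from the beginning of the file -- the file's embedded
    timecode is ignored and the clip is treated as starting at frame 0."""
    return tc_to_frames(src_in, fps) - tc_to_frames(clip_start_tc, fps)

# An EDL in-point 10 seconds and 12 frames in:
# in_point_offset("00:00:10:12") -> 252  (10 * 24 + 12)
```

As long as every converted file has the same frame count as the Avid’s dropped-frame version, and filenames are unique, that offset identifies the same frame in both systems.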
The development of our file naming convention was probably the most widely debated workflow decision, with input from the camera department, editorial, VFX, post finishing, and even our executive producers. The end result accomplished 95% of what we set out to do, since you can’t account for every possibility from the beginning. I limited it to 8 characters, so that it would work with standard EDLs, but we packed as much information as possible into those 8 characters. We also had Ben Gullotti develop a web database to log and track all of the footage, the design of which I based on our file naming convention.
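The actual layout of those 8 characters isn’t spelled out here, but as a hypothetical example of how much fits in that budget, you could pack shoot day, camera, card, and clip number like this (the field layout below is my invention, not Bandito’s convention):

```python
import string

BASE36 = string.digits + string.ascii_uppercase  # 0-9 then A-Z

def clip_name(day, camera, card, clip):
    """Hypothetical 8-character clip name:
    3-digit shoot day + camera letter + card index (base-36) + 3-digit clip."""
    assert 0 < day < 1000 and 0 <= card < 36 and len(camera) == 1
    return f"{day:03d}{camera.upper()}{BASE36[card]}{clip:03d}"

# Shoot day 42, camera A, card 10, clip 7:
# clip_name(42, "a", 10, 7) -> "042AA007"
```

A scheme like this keeps every name unique across the whole shoot while staying within the 8-character reel/clip limits of standard EDLs.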
Once we had manual exposure control, and a post solution that would allow us to intercut the footage with film, there was a major conversation about whether or not it would be feasible to shoot our upcoming feature film on the 5D. It is funny to look back now and realize how little experience I had to be qualified to answer that question, but we eventually agreed that it was the best option. Jacob Rosenberg, as the head of our post department, was confident that we would be able to overcome whatever issues resulted from our decision to shoot with the 5D. There was some concern about the lack of real time-code and the issue of CompactFlash failures, but I was most concerned about the 30p issue and the processing times involved for 24p conversion. I was assured that we would get a render-farm of powerful systems to do that when the time came. (In reality we ended up commandeering every edit station, every night, for six months, but it got done.)
Certain parts of the movie were still shot on film, and that footage was transferred to HD-Cam SR. These tapes were ingested with our Mojo DX directly into the Avid as DNxHD files, to intercut with the dropped-frame footage from the 5D. The segments from the tapes were eventually recaptured with AJA Kona cards into Premiere as Cineform 444 files, as needed based on the Avid EDLs. So we had a workflow ready to go that allowed us to edit in Avid and finish in Premiere, and could handle both Canon 5D source files and SR-based film transfers. Once things fit together successfully after our first shoot, we were pretty confident that we had it all figured out.