In setting up the post-production “ecosystem” for 6Below, the first step was to figure out how much storage we would need to keep all of the 6K footage online. Unlike our previous 50-camera productions, this shoot had only two Red Dragons available, which kept the amount of footage shot to a reasonable level. We initially planned for 6K wide at 8:1 compression, which is 73MB/sec. That works out to 4.4GB/min or 265GB/hr, and 2hrs/day for 30 days results in 16TB of footage. So I bought a 32TB SAS array from ProAvio, the 8-spindle EB800V2, which provided over 1GB/s to my main edit system. All of these calculations, including the critical 60-hour footage estimate, were right on, but the production ended up deciding to shoot at 4:1 instead, doubling the data rates and file sizes. Needless to say, we purchased another ProAvio RAID, twice as big, before we were done shooting. This ended up being a blessing, in that it gave us local storage on both systems, so we could render from one to the other and back for maximum performance. We could have survived with a second 32TB array, but upgrading to 64TB was worth the price, and will be useful on our next project, whatever that is.
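For anyone doing similar planning, the math is simple enough to sketch out. This little calculator just restates the numbers above; the function name and structure are my own:

```python
# Rough storage estimate for a shoot, following the numbers in the text.
# data_rate_mb_s is the camera's bitrate at a given compression ratio:
# roughly 73 MB/s for 6K REDCODE at 8:1, doubling to ~146 MB/s at 4:1.

def footage_storage_tb(data_rate_mb_s, hours_per_day, days):
    """Total storage in TB (decimal units) for recording at the given rate."""
    seconds = hours_per_day * days * 3600
    return data_rate_mb_s * seconds / 1_000_000  # MB -> TB

# Original plan: 8:1 compression, 2 hrs/day for 30 days
print(footage_storage_tb(73, 2, 30))   # ~15.8 TB, i.e. the 16TB estimate

# What actually happened: 4:1 doubled the data rate
print(footage_storage_tb(146, 2, 30))  # ~31.5 TB
```

That second number is why the original 32TB array suddenly looked small once the footage, renders, and exports all had to live on it.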
So now that we have a drive full of footage, what do we do with it? Smooth playback of 6K files requires all the computing power we could get, so we managed to get our hands on two maxed-out Dell Precision T7910 workstations. They had dual-socket Xeon E5-2687W v3 10-core 3.1GHz processors, 128GB of RAM, Nvidia Quadro M6000 graphics cards, and PCIe-based SSDs. They also had 10GbE cards and Thunderbolt 2 ports. While a slightly slower system might still have been able to provide real-time playback, we used every available bit of processing power, frequently rendering dailies in the background while actively editing and ingesting new footage. And we were of course running Windows 7 Pro x64; is there really another option?
The 10GbE cards were the main way of sharing data, but without an expensive 10GbE switch, we had to use them as direct-connect links. Each link had its own subnet, with a triangle of cables connecting our two edit stations to our VFX station. With some careful IP-based drive mapping, everything was running at top speed. We routinely got 800MB/s copying between systems, and occasionally over 1GB/s. The Thunderbolt ports were useful when we needed to temporarily boost storage with a Pegasus Thunderbolt array, which is also how we shuttled 8TB of DPX files to and from Technicolor later in the process. We also had a ton of 8TB USB3 drives for footage backup and archive.
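The addressing scheme is worth spelling out, since it is what makes switchless 10GbE work: each point-to-point link gets its own subnet, so traffic to a given peer IP can only route over the right cable. Here is a sketch of that plan; the hostnames and 192.168.x.x ranges are hypothetical examples, not our actual addresses:

```python
from itertools import combinations

def plan_links(hosts):
    """Give each direct-connect link its own /24 subnet, with the two
    endpoints at .1 and .2, so drive mappings can target fixed IPs."""
    plan = {}
    for n, (a, b) in enumerate(combinations(hosts, 2), start=10):
        plan[(a, b)] = {a: f"192.168.{n}.1", b: f"192.168.{n}.2"}
    return plan

# A triangle of three machines yields three links on three subnets:
for (a, b), ips in plan_links(["edit1", "edit2", "vfx"]).items():
    print(f"{a} <-> {b}: {ips[a]} / {ips[b]}")

# Each system then maps the peer's share by IP rather than by name,
# e.g. on Windows:  net use Z: \\192.168.10.2\Media
```

Mapping by IP rather than hostname is the "careful" part: it guarantees a copy between two edit stations never accidentally traverses the slower office network.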
The next challenge was figuring out how to monitor a 6K movie. I knew that we would someday need to drive three DCI-spec 2K projectors in real time, so that plan dictated some of the specs on the edit systems. But the initial editing work was done on location, using single Sony X850C 55-inch TVs as the primary monitors. The editors ran Premiere Pro on Dell’s U3415W curved 34″ LCDs at 3440×1440, which is everyone’s new favorite display around here.
I should have gotten more of those 34″ ultra-wides instead of the other 27″ UHD displays, but I considered the curved display a bit of a risk, since I wasn’t sure how the editors would take to them. The ultra-wide 2.39:1 aspect ratio made sense for a wide-format project, but when I used them to monitor the Escape 7:1 aspect ratio picture, I still lost the top and bottom thirds to matting. It was even worse on the regular monitors, with a 5″-tall picture on our 55″ TVs. Anyhow, I never heard any negative feedback on the wide curved displays, and I very much appreciate them myself. I would recommend them to anyone who spends all day at their computer; at $700, they are totally worth the investment.
Once we reached the finishing stages, a tiny picture surrounded by huge mattes was no longer an acceptable way to monitor the Escape version of the movie. So I got a third Sony TV, and connected all three to the Quadro M6000 in our main Dell workstation. After a bit of experimenting, I was able to get all three TVs to present themselves to Premiere’s Mercury Transmit plugin as a single 5760×1080 display for full-screen playback. My 6144×1152 sequence frame size scaled perfectly into that, with variable-opacity software mattes to aid in reframing the picture to 858 pixels tall. The three TVs were infinitely better for simulating the Escape environment, especially for judging the seams between screens, which I had previously been marking with the title-safe lines in Premiere; that is not the same thing at all. The shift in viewing angle makes a difference too, so working in that environment totally changed how we approached the reframing process.
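The reason the sequence "scaled perfectly" is that both dimensions shrink by exactly the same factor. A quick sanity check of that math, assuming (as the numbers imply) that the 6144×1152 sequence is three 2048×1152 2K full containers side by side:

```python
# Three-TV wall math: three 1920x1080 TVs present as one 5760x1080 surface,
# and the 6144x1152 sequence scales into it with no distortion.

seq_w, seq_h = 6144, 1152    # 3 x 2048x1152 DCI 2K full containers
disp_w, disp_h = 5760, 1080  # 3 x 1920x1080 TVs side by side

scale_w = disp_w / seq_w
scale_h = disp_h / seq_h
assert scale_w == scale_h == 0.9375  # uniform scale on both axes

# Matting the 1152-tall container down to the 858-tall scope picture
# leaves equal mattes above and below:
active_h = 858
matte_per_edge = (seq_h - active_h) / 2
print(matte_per_edge)  # 147.0 pixels of matte, top and bottom
```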
The final step was to hook up full-sized 2K projectors in an Escape theater to the system, and tweak the brightness of certain parts of the frame on certain shots to enhance the overall contrast in the theatrical environment. There is always a difference in how an image feels between an LCD TV and a projection screen, but Escape exaggerates that difference, with potential reflections on the TVs and washout on the theater screens. So being able to view the project in the theater was important, but even more important was the capability to modify the project in the theater. That required real-time playback on all three screens from 6K source files, with all of the effects and color manipulations applied on the fly.
I was able to control the system remotely with my laptop down in the theater. At the time, we had to use half-resolution rendering to get smooth playback, but I suspect that with the new Pascal GPUs and further optimization in Premiere, we will be seeing full-resolution, pixel-for-pixel playback the next time this is done. Something to look forward to. Also, having QCed the final DCP source files on the office TVs, I believe I was the first person in the world to watch an entire feature film in the Barco Escape format.