FiLMiC Pro is the camera app for users who need more control than the native Camera app provides. Filmmaking on a camera phone is a challenging prospect when compared to filmmaking on an actual camera that is designed expressly for that purpose.
There are some stories that are only going to be captured with something as ubiquitous and invisible as a phone. Unfortunately, most camera video apps aim for the middle, with storage-saving bitrates and automated imaging decisions. FiLMiC Pro removes some of the restrictions that keep an iPhone from reaching its potential as a cinematic capture device. I sent Kevin Buonagurio, the COO at Cinegenix, a series of questions about this latest release.
RSN: How much of the app was rewritten in Swift versus using the previous codebase in the new Swift architecture?
Kevin: The iOS9 v5.0 update to FiLMiC Pro represents a 100% rewrite of source code and core functionality all in Swift 2.0. We started by writing a basic video camera from scratch, then we pulled in the Objective-C user interface code and rewrote the rest of the underlying architecture. Finally, we transposed any remaining Objective-C to Swift and then removed the last Objective-C bits shortly before our ship date.
Was this a major undertaking for your dev team to pull off?
Absolutely. We were under a tight deadline: WWDC and the release of Swift 2 came in mid-June, and the iOS9 launch was September 16th. But we thought to ourselves, if not now…when? Add in the complexity of writing in a language that was in beta all summer and changing every week, and it was a lot to try to stay on top of. At release time there are still a few previously-available FiLMiC Pro features that haven’t quite made it back into the app, but they will shortly. Notwithstanding the challenge, the performance, simplicity and long-term extensibility of writing in Swift made it all worth it.
Does Swift open up new possibilities for features and functions?
Apple did an excellent job of bridging the two languages, as so many of Apple's own APIs are still written in Objective-C. Presumably, there will not be any new flagship device/OS features that will only work from Swift. However, the new possibilities that Swift opens up for the developer are another story. The type safety in Swift is reason enough to make the switch. Over 50% of the crashes we’ve debugged in Objective-C aren't even possible in Swift, and the modern language features like closures, tuples and protocol extensions allow a small dev team to build software that's as clever, robust and sturdy as that of a much larger team working in Objective-C.
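To make the type-safety point concrete, here is a minimal illustrative sketch (not FiLMiC's actual code, and written in current Swift syntax rather than Swift 2) showing how optionals, tuples, and protocol extensions rule out the classic nil-related crashes that Objective-C permits. The `ClipMetadata` type and its fields are hypothetical:

```swift
// Hypothetical example: Swift optionals force nil-handling at compile time,
// where Objective-C would silently message nil or crash at runtime.
struct ClipMetadata {
    let width: Int
    let height: Int
    let frameRate: Double?   // nil until the capture rate has been probed
}

// A tuple return groups related values without defining a throwaway class.
func resolution(of clip: ClipMetadata) -> (width: Int, height: Int) {
    return (clip.width, clip.height)
}

// A protocol extension adds shared behavior to every conforming type.
protocol Labeled { var label: String { get } }
extension Labeled {
    var summary: String { return "clip: \(label)" }
}

let clip = ClipMetadata(width: 1920, height: 1080, frameRate: nil)
let res = resolution(of: clip)
print(res.width * res.height)   // 2073600

// The compiler rejects using clip.frameRate directly as a Double;
// optional binding makes the "no value yet" case explicit.
if let fps = clip.frameRate {
    print("rate: \(fps)")
} else {
    print("rate unknown")       // this branch runs here
}
```

Using the wrapped `frameRate` without unwrapping it simply does not compile, which is exactly the class of bug that becomes "not even possible" in Swift.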
Does the stabilization tool use camera motion data to smooth the image (like Hyperlapse), or is that something you are doing with image data alone?
The guiding philosophy of FiLMiC Pro is to provide the best available, broadcast-capable image quality possible on an iOS platform. For that we prioritize data-rich resolutions. That frees us up to focus our engineering efforts on expanded user control, such as granular-speed focus pulls, ISO and Shutter Priority exposure modes, rack-speed zoom control, and expanded audio functionality.
The Apple stabilization options, Standard and Cinematic, are part of the current iOS framework and supported on a per-device basis (Cinematic is only available on the 6 and 6s devices). They use a combination of motion data, with a small amount of latency, along with a minor crop-in of the frame to produce pretty terrific stabilization results. So we are able to leverage these options in the app and, as we mentioned, focus on the real value-add components that our users have come to expect over these past few years.
Other apps that have built their own stabilization have achieved seemingly higher-quality results, which their users certainly appreciate. We have elected to avoid this route in the short term because these solutions typically come at a cost to the data-rich, quality video we are looking for in a final clip. Building our own stabilization befitting the FiLMiC label would be a pretty large undertaking, and at this point it is a lower priority for us given the pipeline of enhancements we would like to deliver to our customer base.
You said that the FiLMiC team would never scale up pixel data to create larger spatial resolutions (the reason there is no >1080p recording on an iPhone 5). Why is that? What is the principle behind that design decision?
Our design philosophy is to provide a professional-grade filmmaking feature set to everyone. To that end we focus only on those features that provide value through practical application. We don't want to develop features for FiLMiC Pro that would degrade the image, and scaling up to 4K from 1080p is a destructive operation, even though you're going to a higher resolution. 4K video made from a 1080p source is really just the same video at quadruple the file size. Ultimately, if a user truly wants a 4K video file from a camera that can't capture 4K natively, they will be better off using a tool like Adobe Premiere or similar to perform the upscaling.
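The "quadruple the file size" figure follows directly from the pixel counts. A quick back-of-envelope check in Swift:

```swift
// UHD 4K (3840×2160) versus 1080p (1920×1080): exactly four times the
// pixels, so an upscaled clip stores four times the data with no new detail.
let hd  = (width: 1920, height: 1080)
let uhd = (width: 3840, height: 2160)

let hdPixels  = hd.width * hd.height     // 2,073,600
let uhdPixels = uhd.width * uhd.height   // 8,294,400

print(uhdPixels / hdPixels)              // 4
```

Every one of those extra pixels is interpolated from the 1080p source, which is why the operation adds size but not information.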
Is FiLMiC grabbing the video data before native compression into h264? If so, is there any chance of adding codecs in the future? Do you think there will someday be ProRes on an iPhone?
We would love to be able to do this someday. It's not currently possible, as Apple only provides hardware acceleration for h.264 and h.265, and if you give up hardware-accelerated encoding, you drop so many frames while recording that the footage would not be usable.
The other sticking point is disk write speed on the iPhone's NAND flash. The recently released 6s/6s Plus is the first iPhone capable of writing at 100Mbps.
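To put that 100 Mbps figure in perspective, here is a hedged back-of-envelope calculation. The 220 Mbps value is Apple's approximate published target bitrate for 1080p30 ProRes 422 HQ (an assumption for illustration, not a number from the interview):

```swift
// Sustaining a 100 Mbps recording bitrate on device flash storage.
let bitrateMbps = 100.0
let megabytesPerSecond = bitrateMbps / 8.0         // 12.5 MB/s to the flash
let megabytesPerMinute = megabytesPerSecond * 60   // 750 MB per minute of footage

// Rough target bitrate for 1080p30 ProRes 422 HQ (approximate figure).
let proResHQMbps = 220.0
let headroom = proResHQMbps / bitrateMbps          // 2.2× more than the NAND can take

print(megabytesPerSecond, megabytesPerMinute, headroom)
```

Even before the encoding bottleneck, a ProRes HQ stream would outrun the fastest iPhone storage of the day by a wide margin.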
Our CTO John Clem built a proof of concept ProRes encoder for the iPhone earlier this year. At best we were seeing around 5-6fps using ProRes Proxy/LT, and more like 2fps using ProRes HQ.
So the short answer is: the day Apple adds hardware-accelerated ProRes encoding, FiLMiC Pro will support it.
Are the new cameras in the iPhone 6S and 6S+ opening up new functions and features for you?
4K is the big one. There are also smaller improvements over the previous iPhone, like the addition of image stabilization at 720p (previously only available at 1080p) and high-frame-rate capture at 1080p up to 120fps (previously limited to 60fps). We are also looking into taking advantage of 3D Touch to further refine our camera controls.
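Picking a high-frame-rate mode like this comes down to choosing the right capture format. On iOS the candidates come from `AVCaptureDevice`'s format list; the sketch below models that selection logic with a plain hypothetical struct so it runs anywhere, and is not FiLMiC's actual code:

```swift
// Hypothetical stand-in for an AVCaptureDevice format entry.
struct CaptureFormat {
    let width: Int
    let height: Int
    let maxFrameRate: Double
}

// Pick the first format at the target resolution that can sustain the
// requested rate, e.g. 1080p at 120fps on an iPhone 6s.
func bestFormat(from formats: [CaptureFormat],
                width: Int, height: Int, minRate: Double) -> CaptureFormat? {
    return formats.first {
        $0.width == width && $0.height == height && $0.maxFrameRate >= minRate
    }
}

let formats = [
    CaptureFormat(width: 1280, height: 720,  maxFrameRate: 240),
    CaptureFormat(width: 1920, height: 1080, maxFrameRate: 120),
    CaptureFormat(width: 3840, height: 2160, maxFrameRate: 30),
]
let slowMo = bestFormat(from: formats, width: 1920, height: 1080, minRate: 120)
print(slowMo?.maxFrameRate ?? 0)   // 120.0
```

On an older device the 1080p entry would top out at 60fps and the same query would return nil, which is why the feature only lights up on the 6s generation.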
How closely are you working with lens add-on companies, like Moondog Labs, to support their hardware?
We have a great relationship with Moondog Labs and love the work they’re doing, both conceptually and with regard to build quality. Their fit and finish is amazing. Additionally, they share a similar vision of giving everyone an opportunity to be a filmmaker. That vision is best embodied by our collective support for Sean Baker’s wonderful film Tangerine, shot on FiLMiC Pro on a few iPhone 5s’s with an early version of the Moondog Labs anamorphic adapter. We worked with Moondog early on in support of their anamorphic adapter and continue to discuss with them new ways to put our complementary skill sets to work.
Lastly, collaboration and best-of-breed support have been a hallmark of FiLMiC development from the beginning. Back in the days of the iPhone 4s we introduced support for the EnCinema 35mm lens adapter. We recently added support for the Covr lens system and are continuing to review support for complementary hardware platforms where it makes sense for our customers and FiLMiC in general.
Latitude is still an issue for camera phone video. It’s especially true as dedicated lower-cost cameras like Sony’s RX100 IV gain log shooting and 4K. Will there be HDR video modes, log curves, or any other strategies for improving the latitude of cameraphone video in future versions?
We’re going to keep this purposefully vague, which means we have a few tricks up our sleeve. Suffice to say, LOG, S-Log and flat color spaces are things we hear about from our users all the time. We have some unique thoughts on how we aim to approach high dynamic range video and will be focusing on a whole slew of imaging options in an upcoming release of FiLMiC Pro.