Aaron Lindquist
Director, Editor, & Writer


Cinemascope on the iPhone

Magic Hour - Shot on an iPhone 6 with Moondog Labs 1.33x Anamorphic Lens Attachment

What follows are my tests of the Moondog Labs 1.33x anamorphic adapter and the Ikan Fly-X3 Plus gimbal.

Since hearing about Sean Baker's use of the adapter on his Sundance hit Tangerine, I have been curious how practical it might be to shoot a film using the iPhone. I write this having shot with everything from Canon to Sony to Panasonic to RED to ARRI digital cinema cameras. What matters to me is that the tool used to shoot a film be able to tell that film's story. Judging from the movie, the iPhone was a good fit for Tangerine:

I placed my pre-order in April and waited patiently for the adapter to arrive by its mid-May launch date. Julie at Moondog Labs was very courteous in letting me know there would be a slight shipping delay due to the supplier. It arrived on May 21st.


To paraphrase the philosopher John Locke, 

"Knowledge without understanding is useless because it is mere words." 

An anamorphic lens squeezes a wide field of view horizontally into the frame; when the image is expanded horizontally in post, that wider field of view is restored. I did not want to shoot videos with the anamorphic adapter immediately; I wanted to understand what it was doing first, so I concentrated on shooting stills.

I took my first picture with the iPhone's Camera app and immediately saw the horizontal squeeze the anamorphic adapter caused:

I then tried to unsqueeze it horizontally in Photoshop to a 2.39:1 aspect ratio:

Something about it didn't look right...so I opened Photoshop again and adjusted the settings...


There it was. Ultra wide angle. Wait, why was it 16:9?

The first misconception I had using the Moondog Labs 1.33x anamorphic lens adapter with the iPhone 6's native Camera app was over what would create a Cinemascope image. Natively, the iPhone Camera app shoots stills in a 4:3 aspect ratio. Not considering this, I tried to 'unsqueeze' all I could out of the adapter by stretching the image into the 2.39:1 aspect ratio I had expected. I felt somewhat dim once I realized that a 4:3 frame can only ever be expanded 1.33x to a 16:9 ratio (HDTV's shape). Getting this so wrong made me think it through: if I began by shooting in a native 16:9 ratio, then expanding 1.33x would yield a final 2.39:1 Cinemascope image.
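The ratio arithmetic behind that realization is easy to sanity-check. A quick Python sketch (the 1.33x factor is the adapter's squeeze; the ratio names are just labels for the shapes discussed above):

```python
# A 1.33x anamorphic adapter widens each frame by 1.33 when unsqueezed.
SQUEEZE = 1.33

ratios = {
    "4:3":  4 / 3,    # native iPhone Camera app stills
    "16:9": 16 / 9,   # HDTV shape, what an app like ProCamera 8 can shoot
}

for name, ratio in ratios.items():
    print(f"{name} unsqueezed 1.33x -> {ratio * SQUEEZE:.3f}:1")

# 4:3  unsqueezed -> ~1.773:1, essentially 16:9 (1.778)
# 16:9 unsqueezed -> ~2.365:1, the 'Scope frame marketed as 2.39:1
```

Strictly, 16:9 times 1.33 lands a hair under 2.39:1 (about 2.37:1), which is why some 'Scope workflows crop a sliver off the top and bottom.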

I tested at least three different photography apps (many of which do not offer 16:9 as an option) with the adapter, and the one that sold me was ProCamera 8, for its reliability, frequent updates, and ability to shoot both stills and video in 16:9.

I have to thank the ProCamera 8 team for creating such a useful app. It allowed me to get the full benefit of the adapter. Here you can see the squeezed 16:9 image I shot:

War of the Worlds set from the Universal Studios Tour, shot in 16:9 with the adapter.

...which I then unsqueezed 1.33x to 2.39:1 Cinemascope:

Image unsqueezed to 2.39:1 aspect ratio and color graded in ProCamera 8.


For stills, the unsqueeze can be done in Photoshop or GIMP on your computer. On the go, I found Crop-Size to be the best app for the iPhone 6.

Full 16:9 images on the iPhone are 3264 x 1835 pixels. We only want to expand the horizontal pixels: 3264 multiplied by 1.33 is roughly 4341, and 4341 x 1835 works out to approximately a 2.39:1 aspect ratio.
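That pixel math can be scripted. A minimal sketch, using the 3264 x 1835 16:9 still size quoted above (the function name is mine):

```python
def unsqueeze(width: int, height: int, factor: float = 1.33):
    """Expand horizontal pixels only, as an anamorphic unsqueeze does."""
    return round(width * factor), height

w, h = unsqueeze(3264, 1835)
print(f"{w} x {h}  ({w / h:.2f}:1)")   # 4341 x 1835, roughly 2.37:1
```

The exact quotient is about 2.37:1; in practice it is close enough to be treated as a 2.39:1 'Scope frame.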

I took a lot of 2.39:1 stills with the adapter to get the sense of composing scenes in Cinemascope:

I ended up preferring the effect of shooting in 4:3 with the adapter and unsqueezing to 16:9. It's a comfortable field of view for static images. The horizontal pixels are the same, so they still become 4341 pixels after the conversion. The vertical pixels are taller, however, and remain 2448 pixels throughout. While not Cinemascope, it makes a pleasing 16:9 image shot with an ultra wide angle lens:

There is an added benefit to this workflow. Shots can be cropped into a 2.39:1 Cinemascope image without losing any horizontal pixels:

Shot in 4:3 with Moondog Labs 1.33x anamorphic adapter and unsqueezed to 16:9

Unsqueezed 16:9 image cropped to 2.39:1 Cinemascope

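This 4:3-to-16:9-to-'Scope workflow can be worked out numerically. A sketch in pure arithmetic (3264 x 2448 is the iPhone 6's native 4:3 still size):

```python
SQUEEZE = 1.33

# 4:3 still -> unsqueeze horizontally only; height is untouched
w4, h4 = 3264, 2448
uw = round(w4 * SQUEEZE)
print(f"unsqueezed: {uw} x {h4} ({uw / h4:.2f}:1)")  # ~1.77:1, i.e. 16:9

# later, crop vertically to Cinemascope without losing horizontal pixels
crop_h = round(uw / 2.39)
print(f"cropped:    {uw} x {crop_h} (2.39:1)")       # 4341 x 1816
```

The crop discards roughly a quarter of the vertical pixels, but the full 4341-pixel width survives, which is the benefit described above.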

The iPhone's lens is equivalent to a 29mm lens in traditional 35mm film photography. Divide that by the 1.33x squeeze and you get roughly a 22mm lens. I admit the ultra wide angle the adapter creates may not be everyone's cup of tea; if you prefer a longer, flatter field of view you may even hate it.
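That focal-length conversion, and the field of view it implies, can be checked in a few lines. This sketch assumes the standard 36mm full-frame width in the usual field-of-view formula, so the angle is an approximation:

```python
import math

native_equiv = 29.0        # iPhone 6 lens, 35mm-equivalent focal length
squeeze = 1.33             # adapter's anamorphic factor

wide_equiv = native_equiv / squeeze
# Horizontal field of view for a 36mm-wide full-frame sensor equivalent.
hfov = 2 * math.degrees(math.atan(36 / (2 * wide_equiv)))
print(f"~{wide_equiv:.0f}mm equivalent, ~{hfov:.0f} deg horizontal FOV")
```

Roughly 22mm and a horizontal field of view near 79 degrees: firmly ultra-wide territory.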

I never imagined I would shoot so many stills with a tool marketed for videography, but it was a lot of fun. It has a unique look that surprised me. Seeing it happen on an image-by-image basis made me much more critical about shooting widescreen images. 


Now it was time to test the adapter for what it was made for: shooting movies in Cinemascope. Half of my motion tests were shot with the FiLMiC Pro app and the other half with ProCamera 8.

The first test is handheld (without a gimbal) and no stabilization added during post-production:

The second test is the same footage, but with a warp stabilizer effect added in Premiere Pro CC 2014. All but one of the shots benefitted from stabilization in post:

I realized afterward the glaring weakness of the iPhone 6: handheld video is shaky as heck. I addressed this by ordering the Ikan Fly-X3 Plus electronic gimbal from Amazon.com:

Shipping took two days, and once it arrived I put it straight to the test:

Charging the battery took two hours or less. Once it was charged, I slid the battery into the gimbal, seated the iPhone (with the Moondog Labs 1.33x adapter attached), and turned the gimbal on. If you've seen any number of recent sci-fi films with malfunctioning robots, this was like an insect bot that had lost its balance, its robot brain forever trying to rebalance. I switched the gimbal off, understanding that the weight of the anamorphic adapter was the crux of the problem. Ikan's gimbal is designed to account for the weight of the iPhone, not the additional weight of adapters and accessories you might pair with it. This presented a real problem. What could I do to account for it?

It turns out that Ikan already includes a counterweight with the gimbal. It is incredibly small and almost useless on its own, but it provided the seed of the solution: the other side of the gimbal simply needed more weight.

This required another trip to Amazon, where I bought the following:

After measuring the L bar that came with the gimbal, I found that these screws and steel spacers worked best for the size and weight distribution. They resulted in this:

Included Ikan weight (on left) with steel spacers added (on right).

Mounted on gimbal:

Correcting the weight discrepancy allowed me to test shooting tracking shots with more accuracy:

Shot at 24 fps. Graded with Lumetri Looks in Premiere Pro CC 2014

I did not use the manual exposure and focus features (which I recommend for professional use) for testing movement because I wanted to concentrate on alleviating camera shake. Nonetheless, FiLMiC Pro was very easy to use and does yield professional results. I appreciated the automatic 2.39:1 de-squeeze it did during shooting.

FiLMiC Pro has in-camera stabilization (called 'Cinematic' in the stabilization menu), but the app did not support it while using the automatic 2.39:1 de-squeeze (I'm told this is a bug that will be fixed in an update). I'd love to see this corrected because it makes a huge difference in the quality of the footage.

ProCamera 8 ended up being the app with the most reliable in-camera stabilizer for the tests. While this necessitated unsqueezing the 16:9 ratio to 2.39:1 in post, the combination of the gimbal and in-camera stabilization ups the ante for using the iPhone as a professional tool.

Shot at 50 fps, 24 fps, and 30 fps, then conformed to 24 fps. Graded with Film Convert in Premiere Pro CC 2014

The gimbal never performed well on an incline. It excelled at remaining stable on level surfaces and on both ascending and descending staircases. The added weight of the anamorphic adapter and counterweights seemed to exceed the physical limits of the gimbal's design when inclined. You can see a slight intermittent shake in this video:

Shot at 24 fps. Graded with Film Convert in Premiere Pro CC 2014

Post stabilization in Premiere Pro did help, but you can still see the effects of the shake toward the end of the clip:

Shot at 24 fps. Graded with Film Convert in Premiere Pro CC 2014


The iPhone 6 is a 2K camera for most purposes. If you have the storage capacity, FiLMiC Pro will shoot in 3K (since removed in the latest 4.1.3 build 237, but hopefully re-added soon). Some apps claim 4K functionality, but of those I researched, the most stable seemed to be the ones offering a time-lapse function, like TimeLapse. I did not do extensive 4K tests, so I consider the debate open. What is constant is reliable 2K functionality for real-world production. The Moondog Labs 1.33x anamorphic adapter gives you more information to work with onscreen and, without a doubt, adds production value to whatever you are shooting with the iPhone.

The Ikan Fly-X3 Plus is an amazing tool. I can't imagine shooting video on the iPhone without it. You have two options: you either lock the iPhone down to a tripod or you use the gimbal. There is no middle ground. While you are able to shoot handheld for very brief shots, I would not recommend it for the bulk of what you shoot. The gimbal gives you much more consistency. Remember to accurately offset the weight of the anamorphic adapter or the gimbal will not work properly. 

For those who are interested in shooting on a tripod, the Joby JM1-01WW GripTight Mount is inexpensive and does the job:

Hard as it may be to believe, I shot all these tests using a 16GB iPhone 6. It made me a bit more disciplined when shooting videos, but I do not recommend it for a real world production scenario. The bare minimum would be a 64GB iPhone 6. I was able to pause and upload videos to the cloud because I was only shooting short test clips. If I were shooting a narrative or a documentary these pauses would be a serious impediment to continuous shooting. You would be better served to use a larger capacity iPhone and plan for breaks (meals, just before a location move, etc.) during which you would offload the media via AirDrop (to a laptop) and then the cloud (for safety) or a backup hard drive (recommended). 

It's almost crazy to think that an entire feature film could be shot using a smartphone, but judicious use of the Moondog Labs 1.33x anamorphic adapter, a gimbal, and apps like FiLMiC Pro and ProCamera 8 can yield professional results. The burden is on you, the filmmaker, to devise a story that works for these tools and to be the eye that shapes the story into something unique. All the rules of cinema still apply. The technology present in such a small, accessible device will not gift any of us genius, but I feel our ability to learn from it is profound. To be able to take something so diminutive and see an immediate result is a great teacher. We can experience the screen direction present in successful films as we walk down the street (with our iPhone) and understand why a certain shot does or does not work. We can take it a step further and learn to make our own movies or become better filmmakers.

I applaud Moondog Labs and Ikan for extending the Apple philosophy of putting great technology in the hands of everyday people.