Shooting with Drones

Why do we need drones?

I decided to start writing this article about drones because I've been avoiding learning about them for a long time. I strongly dislike drones in my daily life, whether it's one hovering overhead near my home making me feel like I'm being spied on, or whether I'm relaxing on a remote Hawaiian island enjoying the scenery when all of a sudden some nerd sets off a drone and it's a real "BZZZZZZZZZZ" kill. The only drone footage I enjoy is footage of eagles being trained to attack them.


BUT ... 

Drones give us a perspective we wouldn't otherwise have - at a more affordable cost, in a more controllable vehicle than a helicopter or airplane, and in a smaller device that can squeeze into places we couldn't otherwise reach.

Aerial footage adds ENORMOUS production value to travel videos, nature documentaries, real estate videos, and reality TV.  My favorite aerial footage of all time is probably in nature documentaries, particularly ones like the BBC's "The Hunt."  Watching a cheetah chase down a wildebeest from above is the best way to see the brilliant agility and speed of a cheetah as it reacts instantaneously to the movements of its prey. Ocean drone footage has also allowed us to see things in completely new ways, since it's notoriously expensive to film stable footage on the ocean itself.


How are drones used in Narrative Cinematography? 
The obvious answer is any way you like!  Traditional aerial cinematography has been around for a long time and used in countless productions as establishing shots, point of view shots, B-Roll, credits - you name it.  Now, drones have made aerial photography more accessible to filmmakers, and also greatly expanded the capabilities of aerial photography to go where no helicopter or airplane could fit (or descend to) before.   

As exciting as it is, too much or misused aerial cinematography can actually make a movie/video feel slow and disengaging.  In my opinion, drone footage is best used like traditional aerial cinematography: to show a landscape, to make a character seem small with a "God's Eye" POV, as an artistic transition, or even as a superhero's or animal's direct POV.  Using drones on small-budget productions for camera movements that aren't necessarily aerial (high altitude), but would otherwise require unaffordable equipment (like a dolly to a jib to a crane to a Steadicam or whatever), is a creative, economical use for a drone.

image from VIMESSA.COM

 

Things to think about when picking a drone: 

  1. WEIGHT: Most importantly, can your drone hold your camera?  The payload will affect flight time and agility. The heavier your payload, the shorter your flight time will be.
  2. FLIGHT TIME: Flight times vary depending on the drone, battery power, and payload - usually between 10 and 30 minutes, averaging around 18 minutes (see the rough estimator sketched after this list). 
  3. FIELD OF VIEW (FOV): Know how many degrees the drone can angle horizontally and vertically and how far your gimbal rotates. ALSO - consider whether you will see the rotors in your shot when looking up! **With some drones (like the SARAH Flying-Cam) you can be looking straight up through their rotors, but at 24 or 25 fps you would not see them!** Other drones (like the Freefly Alta 8) give you the option of mounting your camera on top of the blades or below, giving you a wider range of view for sky shots or shots angled upwards. 
  4. CONTROL + AGILITY:  Most drones have software and levels of "intelligence" - check what your drone has and which automatic functions might help your shoot go more smoothly.  For example, using GPS and other smart features, an intelligent drone can be easier to handle and can hold a steady hover on its own. Autopilot landing can also come in handy.  
  5. WEATHER RESISTANCE: Make sure your drone is waterproof, cold-proof, whatever-proof, if you're flying in extreme conditions. 
  6. WHAT TO WATCH OUT FOR:  Vibration - test your drone to make sure you don't see vibrations in the image. Noise - it may seem obvious, but drones are noisy. If you're planning to sneak up on an animal, it may take a while to get used to the drone, chase it, or constantly stare at it and ruin your footage, so when filming animals allow time for them to get used to the drone.  
  7. LEGAL STUFF: Make sure you have an FAA-licensed operator and are not flying your drone anywhere illegal, like onto the White House front lawn.  
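
The weight/flight-time tradeoff in points 1 and 2 is easy to ballpark. Below is a toy back-of-the-envelope estimator; the power-per-kilogram figure, battery size, and reserve margin are assumptions made up for illustration, not specs from any drone mentioned here.

```python
# Toy hover-endurance estimator. Every constant is an illustrative assumption,
# not a manufacturer spec.
def estimated_flight_minutes(battery_wh, drone_kg, payload_kg,
                             watts_per_kg=170.0, reserve=0.2):
    """Rough hover time: usable battery energy divided by hover power draw,
    assuming power draw scales linearly with all-up weight."""
    total_kg = drone_kg + payload_kg
    hover_watts = total_kg * watts_per_kg        # assumed power-to-weight ratio
    usable_wh = battery_wh * (1.0 - reserve)     # keep a landing reserve
    return 60.0 * usable_wh / hover_watts

# Same hypothetical airframe, light mirrorless camera vs. heavy cinema package:
print(round(estimated_flight_minutes(500, 5.0, 1.5), 1))   # ~21.7 minutes
print(round(estimated_flight_minutes(500, 5.0, 6.0), 1))   # ~12.8 minutes
```

The exact numbers don't matter; the point is that a few extra kilos of camera can easily cut your usable flight time by a third or more.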

DRONE MODELS

 

SARAH Flying-Cam
 


WEIGHT: >9.1kg/20lbs
FLIGHT TIME: Varies from 10 minutes to 34 minutes. 
FEATURES: 8 motors, Weather Resistance, Vibration Proof


PAYLOAD: < 50kg/110 lbs
FEATURES: Can hold two cameras


PAYLOAD CAPACITY: <20kg/44lbs
FEATURES: Compatible with RED, Alexa Mini, etc.; FIZ compatible


PAYLOAD CAPACITY: <15kg/33lbs
FEATURES: Recommended for DSLR cameras, has built-in anti-vibration

xFold SPY


PAYLOAD CAPACITY: <3.5kg/7.7lbs
FEATURES: Great for action sports and DSLR cameras

AERIGON

PAYLOAD: 9kg/19lbs
 

AERIGON's proprietary power distribution system and dual coaxial design, with 12 powerful counter-rotating rotors and a pre-preg carbon-fiber exoskeleton, provide the power, precision and stability to carry the weight of sophisticated camera configurations with pro zoom lenses and full FIZ (Focus, Iris and Zoom) control.

AERIGON's innovative design absorbs vibrations, conceals and protects cables and electronics from external stress, and its reachable arms give operators the choice between power and endurance, depending on the environment, camera payload or the type of shot requested by the production. This same feature enables the AERIGON to be quickly deployed and ready to go in under 10 minutes and packed down into a pair of road cases in just under 5 minutes.

 

DRONE COMPANIES
(more to come...)

ZM Interactive: San Francisco

Freefly Cinema

Drone Crew: South Africa

Helicopter Film Services (HFS): UK

Intuitive Aerial

Shooting in a VR Environment

It's the future!  In addition to creating virtual reality content, we can now also "traditionally" film people inside virtual reality environments. 

What?? Virtual reality offers many innovative game and narrative experiences. The current problem is that not everyone owns a VR headset; they're still at a high price point and not everyone is into it (until they try it!). The best way to demonstrate a VR experience to someone without access to VR is to traditionally film a person inside it. Below I will tell you how we (myself + Outpost VFX) accomplished it. 

Why??? Filming a composite of a person in a VR experience allows everyone else to witness the experience from a "normal" viewpoint.  Put it this way: if you only watch someone's point of view in VR, it's like watching footage from a GoPro haphazardly falling out of the sky; the player's point of view spins in every direction on a static screen, which is disorienting and chaotic.  And if you watch someone playing VR in reality, they just look like a crazy person spinning around the room making erratic movements. Compositing the person into the environment is the best way to get a sense of what is going on, since from their POV the environment is static, and now we can observe it in a normal way along with them. 

How?? By mounting a VR tracker (or an extra VR controller) onto your camera (any camera), it's possible to film someone while simultaneously showing the VR environment they're inside. This is called "mixed reality."  As a technophobe with an aversion to words like "virtual reality, mixed reality, sautéed reality, and upside-down and backwards reality," I have to say that the process (from a DP standpoint) is simple and a ton of fun.  I couldn't help but get giddily excited once I was operating a camera, seeing and filming things that didn't exist in reality.  Once your virtual camera tracker and real camera are aligned, a feed goes to the computer, which composites the subject (usually shot in front of a green screen) into the virtual environment and sends the composited image back to your viewfinder, allowing the entire VR environment to be filmed. 
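
For the technically curious, here is a rough sketch of just the compositing step described above, assuming the VR application can hand you a rendered background frame that matches the real camera's tracked position. It uses OpenCV for the green-screen key; the file names and the HSV key range are placeholder assumptions, not values from our actual setup.

```python
# Minimal green-screen composite for a "mixed reality" frame.
# Assumes the camera frame and the VR render share the same resolution.
import cv2
import numpy as np

def composite_mixed_reality(camera_frame, vr_background,
                            key_lo=(35, 80, 80), key_hi=(85, 255, 255)):
    """Key the green screen out of the live-action frame and lay the subject
    over the VR render."""
    hsv = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2HSV)
    green = cv2.inRange(hsv, np.array(key_lo), np.array(key_hi))  # 255 where green
    green = cv2.medianBlur(green, 5)                              # soften mask edges
    subject_mask = cv2.bitwise_not(green)                         # 255 where the person is
    subject = cv2.bitwise_and(camera_frame, camera_frame, mask=subject_mask)
    background = cv2.bitwise_and(vr_background, vr_background, mask=green)
    return cv2.add(subject, background)

if __name__ == "__main__":
    cam = cv2.imread("camera_frame.png")   # placeholder green-screen still
    vr = cv2.imread("vr_render.png")       # placeholder matching VR render
    cv2.imwrite("composite.png", composite_mixed_reality(cam, vr))
```

In a live setup the keying and compositing happen in real time on the computer and the result is fed back to your viewfinder, as described above, but the idea is the same: a tracked virtual camera renders the background, and the keyed subject is layered into it.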

Here are some examples of the evolution of the process. Recently we worked with a game called RacketNX to show their game live at some VR conventions. In the future I am curious how this tool can be used in narrative environments or even as a pre-vis tool.  Hopefully this is just the beginning. 

Filming and live compositing at Unity Summit: 

 

This is another early test with a green screen where we filmed multiple VR experiences being played: 

 

This is the first test we did with a webcam and no green screen:  

Cinegear 2016

Cinegear Expo is where most of the coolest new gadgets are on display for film nerds to ogle. One of this year's main attractions (besides the free popcorn stand) was the Panavision XL GT X 8K 16-bit Turbo Charged Camera, to be released in early 2017.  Here I will briefly mention that and a few other booths I ended up stopping at!

Panavision Millennium DXL

This camera is being hailed as the ultimate combination of technologies from Panavision, RED, and Light Iron (and let's be real, it seems like its body/menu design was inspired by Arri).  The main bragging points are its ability to record 8K and 4K internally on SSD cards, a compact and lightweight body (weighing only 10 lbs.), and, well, I'm not sure what else yet... menus on both sides of the camera so you can have a war with your AC over settings. 

Luckily I was able to watch a demo of footage shot with the camera, but it was hard to assess the images it produced.  At Cinegear, Panavision screened footage that was shot on the Millennium DXL in 8K and then projected in 4K on a screen that was barely big enough for 2K. Unfortunately the test contained no information about stop ranges or camera settings. It also did not show uncorrected vs. corrected footage, but I was able to make a few basic assessments.

I am going to assume all of the footage was tweaked in post, since I thought I noticed some power windows/vignettes had been added. The footage looks good and holds up well for night exteriors.  I don't think the image feels as "soft" or filmic as they claim; on an interior lit through what looked like a 4x 250 diffusion frame, the lighting looked harsh and not flattering on pale skin tones. Moving forward, my main questions are about how the camera's codec works and how Light Iron Color works. The technology behind the camera is not well documented online right now, and hopefully we will find out more soon. 

Tech specs: 

Sensor Type: 16-bit, 35.5 Megapixel CMOS
Resolution: 8192 x 4320
Sensor Size: Large Format 40.96mm x 21.60mm (Diagonal: 46.31mm)
Dynamic Range: 15 stops
Max Frame Rate: 60 fps at 8K Full Frame (8192 x 4320), 75 fps at 8K 2.4:1 (8192 x 3456)
Recording Codec: 8K RAW with simultaneous 4K proxy (ProRes or DNx)
Recording Media: SSD (up to 1 hour on a single magazine)
File Type: .r3d (supported in RED SDK)
Color Profile: Light Iron Color (compatible with all popular gamuts and transfer curves)
Weight: 10 lbs.

Additional Features:
6 video outputs
6 1D LUTs or up to 4 3D LUTs
Directly motorize Primo 70 lenses through wireless control
Wireless timecode for genlock (Ambient Control Network)
Dual menus (Operator side, Assistant side)
Quick changeover accessories

 

Other observations at Cinegear: 

Filmotechnic - Working title: "In & Out"


This device can swing a camera (in this case an Alexa Mini) from a 45-degree frontal point of view to a profile of the driver and all the way around to a reverse shot from inside the car behind the driver and passenger. Not having seen this thing in action, I would be worried about the steadiness of the rig, but it might be useful for grabbing practical shots without all the grip gak usually involved with shooting in and around cars. And if the ability to go from front to back in one smooth slide helps tell your story, this could be a helpful tool to do it. 

 

Arri Steadicam Head


This gadget allows full rotation of the camera. On a Steadicam you can go from low mode to high mode, front mode to backwards mode, etc. It may be difficult to operate the head while also operating the Steadicam, but in theory it's a nice feature to be able to go every which way in one movement.

 

 

How to Make a Lazy LUT

On some projects with limited time or means for color correction, the best option is to bake a look directly into your camera.  A basic REC709 curve can give you a "normal" looking image that is passable, but once in a while it is nice to have a look tailored to your tastes or to match the look of another camera you are shooting with. There are many great mathematically accurate ways of doing this (which I will not discuss now, but basically involve measuring the RGB values in decimal points and plotting them onto a graph), but I will describe a simple, non-mathematical way of creating a look to use in-camera.

Download the most recent version of DaVinci Resolve here. The non-studio version is free. 

  1. Shoot your test footage on the camera
    Make sure you are using the codec settings you will be using for your shoot.  Do not add any 709 curves, LUTs, matrix settings, or looks of any type; the 'raw' image should look grey and flat. Make sure your exposure settings, the f-stop range of the environment, color temperature, and ISO are all set similarly to how you intend to shoot when using your LUT.  Including a grey card and a skin tone is helpful. 
  2. Bring your footage into DaVinci
    - Create a new project file
    - Load clips into Media Pool
    - Go to "Edit" in bottom tab
    - Create a new timeline
    - Drag clips onto Timeline
    - Go to "Color" in bottom tab
  3. Create your look
    To alter the image to your liking in the simplest way, click on the thumbnail of the clip and use the adjustment curves on the bottom right. There are tiny dark grey dots by the luminance curve charts; click through these dots and make changes to the luminance, hue, and saturation charts. Be sure to check your scopes: go to View: Video Scopes
  4. Export your look as a LUT
    When you are happy with your image and ready to export your look, right-click on the original thumbnail and choose Generate 3D LUT (.cube). 
  5. Save your look to your camera via an SD Card
    Load the .cube file onto an SD card, placing it in your camera's expected file structure so the camera can recognize the card. For the FS7, the card file structure is below. 

 

Insert the card into your camera and open the .cube file by going to the File menu, then "Monitor 3D Lut," and selecting "Load SD Card."  Save it as User 1, 2, 3, or 4, then make sure to select that User LUT and turn the LUT on in your viewfinder and recording settings to view it or bake it in. 
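
If you want to sanity-check the exported look before loading it onto the camera, note that the .cube file is just plain text: a LUT_3D_SIZE header followed by rows of R G B values, with the red axis varying fastest. Here is a rough sketch (using OpenCV and NumPy, with a simple nearest-neighbor lookup rather than proper interpolation) that applies an exported .cube to a test frame; the file names are placeholders.

```python
# Quick-and-dirty .cube preview: parse the LUT and apply it to a still frame.
import cv2
import numpy as np

def load_cube(path):
    """Parse a 3D .cube LUT: LUT_3D_SIZE header, then N^3 'R G B' rows
    with the red axis varying fastest."""
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue                          # skip blanks and comments
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[-1])
            elif line[0].isdigit() or line[0] in "-.":
                rows.append([float(v) for v in line.split()[:3]])
    lut = np.array(rows, dtype=np.float32).reshape(size, size, size, 3)  # indexed [b, g, r]
    return lut, size

def apply_cube(rgb, lut, size):
    """Apply the LUT to a float RGB image in [0, 1] (nearest-neighbor only)."""
    idx = np.clip(np.rint(rgb * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]   # index order: blue, green, red

if __name__ == "__main__":
    bgr = cv2.imread("test_frame.png")               # placeholder test frame
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    lut, size = load_cube("my_look.cube")            # placeholder exported LUT
    out = (apply_cube(rgb, lut, size) * 255).astype(np.uint8)
    cv2.imwrite("preview.png", cv2.cvtColor(out, cv2.COLOR_RGB2BGR))
```

It won't match the camera's viewfinder pixel for pixel, but it's a quick way to catch an obviously wrong export before you're standing on set.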

Ba-da-bing, ba-da-boom! You got a LUT.