Mission Critical, August 2015


Volume 5 No. 3 | August 2015

HIS IMAGE

AERYON SCOUT SURVEYS CHRIST THE REDEEMER

Inside This Issue: Intuitive Controls for Robots | CNN Trusts in Drones for Its News | Railroads Turn to Unmanned Aircraft for Inspections


USHERING in the future | EXTENDING our capabilities | HELPING the environment | TRANSFORMING disaster relief | STIMULATING the economy | FOSTERING education | IMPROVING research methods | HANDLING dangerous tasks | SAVING lives | ENHANCING public safety

Intelligent robots are helping researchers, public entities and corporations in ways previously impossible. Learn about all the endless benefits of this technology at increasinghumanpotential.org

JOIN THE #IHPrevolution @RoboticsIHP


EDITOR’S MESSAGE

Miles to Go Before We Sleep

In June, I attended DARPA's Robotics Challenge — a competition that put international teams through some of the most difficult tasks ever given to rescue robots. The inspiration for the challenge came out of disasters like the Fukushima Dai-ichi nuclear plant meltdown, where sending humans in to release mounting pressure in the facility was too dangerous.

Danielle Lucey Editor

I had two takeaways from the event. One, robots have a long way to go. But, more importantly, two, there is an extremely passionate community of people that want more out of robotics.


By this point we have all seen videos of the mostly bipedal humanoids quaking at the knees, stumbling over the tiniest door threshold or flat-out eating it just trying to take a step. (If you haven't, you are missing out. Watch. This. Now. https://www.youtube.com/watch?v=g0TaYhjpOfo) The technology on display was beyond impressive, and the winning team, KAIST, proved that a robot can — without a doubt — perform in a disaster scenario. However, an adult human could still far outpace the robots — DARPA challenge director Gill Pratt estimates a person would need at most 10 minutes, versus the top robot time of 44 minutes and 28 seconds. In a pinch, unmanned systems and robotics technology get the job done, but it's still going to be a very long time until a robot rescues someone from a burning building or drives up to a disaster in progress to help.

But back to point two. If the crowd at the Pomona, California, Fairplex is any indication, the future of robotics is healthy. The stadium facility, which houses the L.A. County Fair, was packed. People brought their families out to see these robots perform. And instead of seeming bored or going home early, people were cheering like they were at a sporting event. Each turn of a valve or opening of a door was met with roars, while the crowd collectively moaned with each failure or fall.

The event also had an interactive pavilion area, where kids could try out their own robot hands, drive a SeaPerch underwater robot around or watch small UAS flying in a cage. I suspect that many of the children at this event went home to their parents and asked to join a local robotics STEM initiative. And when that generation gets to try out their own DARPA challenge, I'm sure whatever it is about will be beyond our wildest dreams. And the memories of slapstick-prone rescue robots from 2015 will seem as dated as an old telephone.



It's here. AUVSI's Unmanned Systems is now XPONENTIAL.

CONFERENCE: MAY 2-5, 2016 | TRADE SHOW: MAY 3-5, 2016
Ernest N. Morial Convention Center, New Orleans, USA

www.xponential.org


Volume 5 No. 3 | August 2015

CONTENTS

FEATURES

14 Full Steam Ahead: Automation, Drones to Improve Train Safety
22 A Simple Gesture: Industry Turns to Easy-to-Use Controllers

DEPARTMENTS

4 Essential Components: Yuneec Releases Typhoon; T-Mobile Gets Inspection Drone; Flytrex Aims for Delivery Market
6 Timeline: A History of Artificial Intelligence
8 Q&A: April Blaylock, Aeryon Labs, Projecto Redentor
10 Technology Gap: Apps, Websites to Help New Drone Users
11 Uncanny Valley: Aldebaran's New Robot Wants to be Your Friend
12 State of the Art: Unmanned Systems Help the Environment
18 Spotlight: Wolfram Lets Users Tap Big Data
20 Testing, Testing: EasyJet Uses Drones for Plane Inspections
26 End Users: CNN Sees Drone 'Ecosystem' for News

CONTRIBUTING AUTHORS

Scott Kesselman is advocacy and public relations associate at AUVSI. He is a frequent contributor to Mission Critical magazine.

Marc Selinger is a freelance writer based in the Washington, D.C., area. He can be reached at marc2255@yahoo.com.

MISSION CRITICAL CONTACTS

Brett Davis, Vice President of Publications and Marketing, bdavis@auvsi.org
Danielle Lucey, Editor, dlucey@auvsi.org
Dave Donahoe, Account Executive, ddonahoe@auvsi.org
Wes Morrison, Account Executive, wmorrison@auvsi.org

Cover Photo: Aeryon Labs LLC.


Yuneec International Releases Typhoon

In July, Chinese company Yuneec International launched its new Typhoon Q500 4K drone. The electric-powered unmanned aircraft can shoot video in 4K, as the name suggests, and collect still images at 12 megapixels. The platform has a 25-minute flight time while filming. Its three-axis CGO3 gimbaled camera has an optimized fixed-focus lens, and images are stored on board on an internal memory card, which can stream in real time to the unit's ST10+ Personal Ground Station. The ST10+ has an Android touchscreen display.

"We are very excited to be introducing one of the world's first 4K camera drones," says Tian Yu, CEO of Yuneec International. "The Typhoon 4K has elicited the attention of everyone from professional cinematographers to techie geeks to those just discovering drone aviation. No matter the audience, Yuneec produces spectacular images and provides an unparalleled flying experience."

The system comes with the built-in ability for geofencing, speed control, and follow-me and watch-me features.

The Typhoon Q500 4K comes with a follow-me feature.

Jeremy Wigmore, CEO of Aerialtronics; Lampros Iskos, chief technology innovation officer at T-Mobile Netherlands; and Derk van Luijk, sales director at Aerialtronics.

T-Mobile Takes Delivery of Aerialtronics Drone

In late June, telecommunications company T-Mobile took delivery of its first drone, an Altura Zenith, for use in inspections. The unmanned aircraft will be used by the company to look at its 5,000 antenna masts in the Netherlands. During tests near the Galgenwaard soccer stadium in the city of Utrecht, T-Mobile determined drones cut down on the time it normally takes to inspect cell towers — up to seven days using technicians on cherry pickers.

"In this pilot [program], we flew around the stadium in 15 minutes using HD video on the drone. It was a significant time and cost-saving exercise," says Jeffrey Leentjes, a network specialist at T-Mobile.

In addition to taking longer, inspecting cell towers manually comes with risks. In 2014, there were 12 mobile tower fatalities. The Zenith is outfitted with an HD camera, thermal sensors and transmitters to relay the information it gathers.

"Aerialtronics provides a detachable gimbal where we can switch devices quickly. Next to that, safety is important of course. We also wanted a system that is fully approved by the authorities, can fly under different conditions and is robust," says Leentjes. "The fact that we are stepping into drone technology creates an opportunity to look at the development of the service. I think, in telecoms in general, there will be more utilization of drone technology in future. For T-Mobile and Deutsche Telekom, it's very good to get the experience with Aerialtronics at such an early stage."

Swiss Firm Demos Weeding Robot

EcoRobotix’ weeding robot, shown at the Innorobo 2015 show.


The company ecoRobotix is demonstrating a weeding robot, which looks like a wheeled table, that it says can reduce herbicide use by up to 95 percent. The robot is powered by solar panels and can work up to 12 hours a day. It can be deployed by a standard tractor and configured with a smartphone, according to the four-year-old Swiss company. “The robot is adapted to most types of row crops and performs weeding by combining an advanced vision system that recognizes weeds and a fast robotic arm to remove them,” the company says. The vehicle weighs 100 kilograms (220 pounds), at least in its prototype phase, light enough to avoid soil compaction issues. It can detect more than 95 percent of weeds, according to ecoRobotix. The company plans to put the system on the market by the end of 2016.


ESSENTIAL COMPONENTS

The Flytrex Sky can deliver packages weighing less than one kilogram.

Flytrex Sky Aims to Be First Delivery Drone

Though fulfillment giant Amazon has made many headlines over wanting to create a massive delivery drone push as a part of its business, Israel's Flytrex announced in late June it aims to beat the company to the punch by releasing a platform capable of delivery right now with its Flytrex Sky UAS. The cloud-connected system has 3G connectivity that lets it deliver lightweight packages out of the box. The system uses Collaborative Piloting technology so both the sender and recipient can control the platform along its route.

"We plan for the drone to have environmental awareness," said Flytrex CEO Yariv Bash in an interview with Wired magazine. "Both for terrain and man-made structures. This will be a firmware upgrade in the next few months. The Flytrex Sky is capable of performing firmware upgrades from the cloud, thus this will be a pretty easy-to-install upgrade."

The Sky can also be piloted via the company's Flytrex Pilot app, which lets users move a virtual drone across a map. The system can carry payloads up to one kilogram for up to seven miles, for a maximum flight time of 32 minutes. The system comes with a dual battery and a mount for a GoPro. It is available for preorder in the United Kingdom for around £470, or $749.

Dronecode Project Raises Funds

San Francisco nonprofit Dronecode, which is developing a common and open-source platform for unmanned aircraft, added to its members and list of investors in late June, according to an organization press release. Arsov RC Technology, Erle Robotics, Event 38 Unmanned Systems, Parrot, Team Black Sheep and Walkera all joined as members of the organization. OpenRelief, Open Source Robotics Foundation, The Autonomous Systems, Control and Optimization Laboratory at Johns Hopkins, Team Tiltrotor and Uplift Aeronautics all became new sponsors. Dronecode is merging different partners that have open-source drone projects under one organization governed by The Linux Foundation.

"We're thrilled to welcome today's new members and sponsors so soon after forming Dronecode as a neutral, transparent initiative for advancing [UAS] technology," says Amanda McPherson, chief marketing officer at The Linux Foundation. "Their participation affirms the collaborative development model, enabling more parties to provide resources and support to the already vibrant drone community. From improving wildlife protection and search and rescue to 3-D mapping and precision farming, drones can change our world for both goodwill and economic gains."

Photo: Yuneec International, Aerialtronics, AUVSI, Flytrex, B Go Beyond.

B-Unstoppable Kickstarter Blends Tracked Robot With Quadcopter

A Kickstarter project from the creators of the drone seen in the movie "The Expendables 3" aims to blend the world of tracked ground robots with drones with the B-Unstoppable Tank-Quadcopter. The patent-pending project can navigate rough ground terrain and then take off and fly over otherwise daunting obstacles. The platform, which weighs 84 grams (0.19 pounds), can fly for up to nine minutes or drive for 12 to 18 minutes on its lithium-polymer battery. The product retails for £59 (about $92).

The B-Unstoppable is part tank, part drone.



TIMELINE

Smart Machines

Getting a machine that can think like a human has been a concept for hundreds of years, and researchers are getting ever closer to making computers that may one day even be smarter than people. Here's a look at some of the milestone events in the field of artificial intelligence through time.

1770 - The Turk

Originally built to impress the Empress of Austria, the Turk was alleged to be an intelligent automaton that could play chess. It wasn't. In reality, it was a very elaborate cabinet that could house a human chess master, who could make it seem like the automaton was doing the work.

1950 - Alan Turing

In a paper, Alan Turing postulated that people could create machines that could think. The concept for determining this is now commonly called the Turing Test.

1951 - Ferranti Mark 1

Also called the Manchester Electronic Computer, the Ferranti Mark 1 was the first commercially available electronic computer. It could do 600 10-digit multiplications in three seconds.

1966 - Shakey

Shakey was an attempt to build the world's first mobile, intelligent robot, one capable of sensing its environment and making decisions for itself. The project continued until 1972. Shakey was a big cart, powered by truck batteries. Mainframe computers handled the computations and sent commands back to the robot via antenna.

1974-1980 - The Winter of AI

In the 1970s, research on artificial intelligence ground to a halt, after prior high expectations of how quickly machines would become smarter weren't realized.

1980-1987 - The Expert Systems Era

Artificial intelligence systems once again attracted funding through a series of projects on computers called expert systems. These systems work in a specific domain to solve problems. Notable AI projects from the time include the XCON automated ordering system and the MYCIN bacteria identification system.

1987-1993 - The Second Winter of AI

Interest from business in artificial intelligence bottomed out with an economic downturn. The half-billion-dollar computing industry nearly disappeared overnight with more budget cuts.

1996 - IBM Deep Blue

Though development on IBM's Deep Blue artificial intelligence computer began in 1985, it wasn't until 1996 that the machine was up to the task of defeating Garry Kasparov in a chess match.

2011 - IBM Watson

IBM's question-answering AI system Watson proved it could out-think humans at a quiz game by beating out top "Jeopardy!" contestants for a $1 million prize. Now the system is being integrated into fields like medicine and food.

2045 - The Singularity

Futurist Ray Kurzweil projects that in 2045, artificially generated intelligence will surpass human intelligence in a phenomenon called the Singularity. The term was originally invented in 1958 by mathematician and physicist John von Neumann.




Q&A

APRIL BLAYLOCK
AERYON LABS: PROJECTO REDENTOR

April Blaylock is senior unmanned aerial systems engineer for research and development at Aeryon Labs. For Projecto Redentor, which mapped Rio de Janeiro's famous statue of Christ the Redeemer, she planned and piloted all data acquisition flights with the Aeryon SUAS. Blaylock graduated with a Master of Applied Science in mechanical and mechatronics engineering from the University of Waterloo, Ontario, Canada, with a background in autonomous robotics and computer vision.

How did the Projecto Redentor project of mapping Christ the Redeemer come to be?

The NEXT Lab of PUC University in Rio de Janeiro contacted Pix4D to determine if a 3-D reconstruction could be achieved using both image processing technology and UAS for data acquisition. Pix4D then contacted Aeryon Labs to ask if we would be interested in using our SUAS for data and image capture for the project. [The] Aeryon SUAS platform was chosen because of its built-in fail-safes and safety features, and because, being a VTOL [vertical takeoff and landing] system, it could take off and land in the confined area at the base of the statue. After months of planning amongst the three organizations, the project took place in October 2014.

What were the goals of the project and how did the drone play into meeting those objectives?

The goals were to determine if SUAS technology could be used to collect the data required to create a 3-D model of the statue, and whether it was possible to reconstruct both the statue and its surroundings and combine everything into one model.

What are the traditional methods of 3-D modeling the statue?

Previously, all of the 3-D models and replicas have been created by hand. Accurate 3-D reconstruction has not been possible because the size, location and challenging weather conditions make it difficult and expensive to fly close enough, within airspace regulations, to collect the data. Technologies like lidar have not been able to scan the statue because of these same challenges. The Aeryon SUAS was able to fly close enough to the statue to collect the images and data required. The data collected was within one-centimeter accuracy and provided detailed views of the statue that people don't get to see from the ground.

What were Pix4D and PUC University of Rio de Janeiro's roles in the project?

Pix4D was responsible for the image processing and generation of the 3-D model (point cloud and textured mesh) as the final output result. PUC's NEXT Lab assisted with project logistics, including special permission to fly UAS at the Christ the Redeemer heritage site.

Were there any regulatory concerns going into the project about flying?

As part of our planning process, we received approval from the various authorities involved in the operation of the monument as a tourist attraction, as well as the local aviation authority. Safety was the primary concern for the project team and park authorities. Christ the Redeemer is the most visited monument in Brazil, so the park was open during its regular visitor hours. To ensure the safety of visitors, the project team was given special permission from various administrations to enter the site before and after official visiting hours. This meant that we didn't have a lot of time to capture the data we needed.

Was it particularly challenging flying the drone around the statue?

There were a number of challenges that affected the project. The most important included the size of the statue and its remote location on the peak of the mountain; changing weather and wind conditions; restricted hours for data acquisition; inconsistent lighting conditions and shadows in early morning and late afternoon, resulting in inhomogeneous color balancing of the imagery; electromagnetic disturbances around the statue, which limited the takeoff locations around its base; and a takeoff area restricted to a three-foot-by-three-foot space due to the many electrical cables used to light the monument and surrounding area.

What kind of data was the system capturing, and what is going to be done with the results?

The UAS collected geotagged images that were stitched together and processed using Pix4Dmapper Pro to create the 3-D model as the final output result. The 3-D model will be used and presented by PUC University for a variety of projects and events.

Blaylock preparing the Aeryon Scout for a flight.

A point cloud of Christ the Redeemer.

Photos: Aeryon Labs, Pix4D.



TECH GAP

Apps, Websites Help Drone Users Know Where to Fly

Let's say you have just bought an unmanned aircraft and you aren't sure if you can fly it in your neighborhood. Never fear: There's an app for that. Actually a couple. And a few websites, too.

Speaking at AUVSI's Unmanned Systems 2015, Federal Aviation Administrator Michael Huerta announced the agency's own app, B4UFLY, designed to let UAS users know if they can fly in their current location or where they plan to go.

"We want to make sure hobbyists and modelers know where it is and isn't OK to fly," Huerta said at a press conference held during the show. "While there are other apps that provide model aircraft enthusiasts with various types of data, we believe B4UFLY has the most user-friendly interface and the most up-to-date information."

The app will be released to beta testers this summer, including to members of AUVSI. It's expected to go live later this year. It will include a status indicator that immediately informs operators about whether they can fly where they are or where they are going, as well as information on why that's the case. It includes a planner mode for future flights, interactive maps with filters and contact information for nearby airports. The app complements Know Before You Fly (www.knowbeforeyoufly.org), started by AUVSI and the Academy of Model Aeronautics, with the FAA as a partner.

Although the FAA may bigfoot the app space, it's not the first one out of the gate. Developer Dale Jones released RCFlyMaps in late 2014, selling it for $1.99 on Apple's App Store. The app uses GPS to map no-fly zones and also taps into information from the Academy of Model Aeronautics. "You will see every airport and the three-mile radius around it, you will see as many AMA authorized fields as I could find, and you will see crowdsourced data on the best/worst places to fly," says the app's promotional material. "You will have a list of all the places you like to fly, and you will be able to favorite any location for future reference."

Images: FAA.

There are websites as well that aim to educate people about where they can and can't fly. The mapping company Mapbox has created the website Don't Fly Drones Here, which does what it says by highlighting areas in the United States where unmanned aircraft are not allowed (www.mapbox.com/blog/dont-fly-here). The map highlights military bases, airports, national parks and areas with temporary flight restrictions. Users are encouraged to add data to the map. ICAO, the International Civil Aviation Organization based in Montreal, is also developing a map using software from ArcGIS. It only lists airports, but it does so worldwide (search for ICAO "drone safe to fly"). Last but far from least, drone maker DJI — which builds many of the most popular UAS in use today by both amateurs and professionals — features its own map of no-fly zones on its website, also featuring airport locations worldwide (www.dji.com/fly-safe/category-mc). The company is able to take that one step further, however, by adding flight restrictions to its drone software so that some of its products won't take off if they are too near an airport.
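DJI's onboard geofencing logic is proprietary, but the kind of check these apps and firmware perform is easy to illustrate. The Python below is a minimal sketch, not DJI's or the FAA's actual code: it measures the great-circle distance from a proposed takeoff point to each airport in a tiny hypothetical list and refuses takeoff inside the three-mile radius the RCFlyMaps material mentions. A real app would load the full FAA airport database and many other restriction types.

```python
import math

# Hypothetical sample data; a real app would load the full FAA airport database.
AIRPORTS = [
    ("DCA", 38.8521, -77.0377),
    ("IAD", 38.9531, -77.4565),
]

NO_FLY_RADIUS_MI = 3.0  # the three-mile radius cited above

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in statute miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def ok_to_fly(lat, lon):
    """Return (status, reason) for a proposed takeoff location."""
    for code, alat, alon in AIRPORTS:
        d = haversine_miles(lat, lon, alat, alon)
        if d < NO_FLY_RADIUS_MI:
            return False, f"{d:.1f} mi from {code} (within {NO_FLY_RADIUS_MI} mi)"
    return True, "no nearby airport restriction"

print(ok_to_fly(38.87, -77.03))  # near DCA -> (False, ...)
```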



UNCANNY VALLEY

Peppered With Emotion
Aldebaran's Latest Humanoid Comes With Added Feature — Feelings

There are robots to help you vacuum. There are artificial intelligence programs that can help you cook dinner. There are even robots that, albeit slowly, want to help you fold laundry. But what if there were a robot that just wants to be your friend? That's the untapped market Aldebaran and SoftBank are aiming to corner with their Pepper robot. Unlike companion robots that have come before it with little consumer attention, this robot sold out in less than a minute after it went on sale June 20 in Japan.

A white humanoid that sits atop a wheeled base, Pepper is programmed to analyze the emotions of others and then provide feedback, like playing your favorite song when you are sad or telling you a joke. Its manipulation abilities are limited, but it comes with a suite of sensors to determine what is happening in its environment — four microphones, two HD cameras on its mouth and forehead, a 3-D depth sensor, a gyroscope in the torso, touch sensors in its hands and on its head, two sonars, six lasers, three bumper sensors and a gyroscope. The robot, powered by a lithium-ion battery that lasts 12 hours, is not yet available outside of Japan, but it comes prepared to respond emotionally to whomever it interacts with in Japanese, English, French or Spanish.

Photo: Aldebaran.

Paris-based Aldebaran is no stranger to the robotics market. It is the maker of the prevalent Nao robot, which has sold in the thousands.

Japan's SoftBank, however, is dipping its corporate toe into the world of robotics for the first time, at the direction of its CEO Masayoshi Son. He directed the mobile connection company to select a robot maker to carry out his vision of an emotional robotic companion. "We wanted to have a robot that maximizes joy and minimizes sadness," he said in an interview with IEEE Spectrum.

SoftBank's goal with Pepper is to have a robot that can serve as a companion instead of as a housekeeper.

SoftBank has a financial goal in mind with Pepper too, of course. The roughly $1,900 platform requires a $120 per month data fee to be able to use it, in addition to an $80 monthly damage insurance fee. SoftBank marketed the robot to the public by having Peppers greet customers in its stores in Japan.

Aldebaran has been at work for two years on Pepper, which is manufactured for the public by Taiwan firm Foxconn. The robot uses the same foundational software as its smaller Nao robot, called NAOqi. However, Pepper creates its own emotion-like states through its artificial neural networking. It can interact socially through speech or by offering to take family photos or read recipes to a person cooking. It scans its environment for social cues like language and facial expressions and comes up with a numeric determination as to whether the person in its presence is having positive or negative emotions. It can then interpret joy, surprise, anger, doubt or sadness.

"Our goal at Aldebaran is to create robots for the well-being of humans, kind robots living with humans as a new artificial species," says the company's website. "In order to realize this dream, it's not enough to simply have Pepper working at SoftBank stores. The ultimate goal is for Pepper to live with humans. The stores are just the beginning."



GREEN MACHINES

Keeping tabs on the environment is becoming increasingly important, with more focus on climate change and frequent natural disasters altering the landscape. Here's a look at how unmanned systems from multiple domains have been tasked with monitoring the health of the planet.

Southampton, U.K.: Through a project called the Massive Atmospheric Volume Instrumentation System, or MAVIS, a team from the University of Southampton studied atmospheric layers using unmanned aircraft.

Colorado, Nebraska, Wyoming and Kansas: A team from the University of Colorado at Boulder flew a 10-foot-wide fixed-wing UAS throughout the Midwest targeting the rear-flank downdraft of tornadoes to gather data to help them better understand how tornadoes form.

Fairbanks, Alaska: The University of Alaska Fairbanks has used the Aeryon Scout UAS to monitor wildlife and ice conditions in the Arctic, while ConocoPhillips has used the Insitu ScanEagle in the state to monitor ice floes and migrating whales.

Oregon: Oregon State University is measuring atmospheric temperatures in the boundary layer with fiber optic cables suspended from unmanned aircraft.

Edwards, California: NASA flies UAS for hurricane research out of the Neil A. Armstrong Flight Research Center in Southern California. The long-endurance UAS include lasers, a microwave system, GPS, radar, and temperature, humidity and pressure sensors to gather storm data over 20-hour periods.


Williamsburg, Virginia: Researchers at the Virginia Institute of Marine Science at the College of William & Mary have developed an underwater robot that can help clean up oil spills. The remotely operated vehicle uses acoustic signals to determine the volume of a spill.


STATE OF THE ART

Pacific Ocean: Liquid Robotics supplied the data from its Pacific Ocean crossing to feed the PacX challenge, where researchers were given open access to the information the platforms collected for environmental research.

Gothenburg, Sweden: The Port of Gothenburg has deployed a GPS-guided robotic system that can deploy booms to contain oil spills. The system includes an unmanned underwater vehicle that can tow a 1,300-foot boom around the dock.

Khumjung, Nepal: Hydronalix deployed its EMILY maritime vehicle to study glacier lake outburst floods on Lake Imja.

Western Australia: Western Australia's Emergency Services Commissioner used an Indago quadrotor UAS from Lockheed Martin to provide situational awareness to firefighting teams through ground control stations and laptops, including pinpointing the fire's edge, identifying the location and intensity of hotspots, and finding people and property at risk through smoke.

Queensland, Australia: Researchers at the Commonwealth Scientific and Industrial Research Organisation (CSIRO) used a custom-built unmanned underwater vehicle to study the impact of floods in Moreton Bay.



FULL STEAM AHEAD

Railroad to Test Drones for Long-Distance Track Inspections, Automate Safety

By Lee Ewing



In a groundbreaking research initiative intended to test long-distance commercial use of small unmanned aerial vehicles, the BNSF Railway Co. will fly them to inspect railroad tracks for obstructions or damage hundreds of miles beyond the operator's visual line of sight.

The Federal Aviation Administration's Pathfinder Program, announced May 6, also provides that CNN will test the use of UAS for gathering news in cities within the operator's visual line of sight, and PrecisionHawk UAS will be deployed in a test of extended-line-of-sight flights, including those over rural areas for surveying crops. All three companies had approached the FAA, seeking to participate in the program, and the FAA says others are welcome to apply for Pathfinder.

The unmanned aerial inspections will supplement, but not replace, those conducted by railroad employees, says Michael Trevino, BNSF's assistant vice president for external communications. The railroad's UAS normally will fly at 65 to 250 feet, but in certain isolated training and test areas, the authorized maximum altitude will be 400 feet, says Todd Graetz, BNSF's Unmanned Aerial Systems manager.

Photo: BNSF Railway Co.

In developing its Pathfinder concept of operations, BNSF plans to use lessons learned from flights of the Insitu ScanEagle UAS in the Arctic, Graetz says.

BNSF is supplementing its track inspections with UAS through an FAA program.

Earlier this year, on March 12, the FAA granted BNSF an exemption to the general ban on commercial UAS operations under Section 333 of the FAA Modernization and Reform Act of 2012. The exemption allows the railroad to operate certain UAS systemwide within visual line of sight under conditions similar to those required in exemptions granted to movie studios, real estate agencies and other companies.

The BNSF exemption authorizes the railroad to fly the AirRobot AR180C, a quadcopter with a maximum gross takeoff weight of 11 pounds; the AirRobot AR200, a hexacopter with a maximum weight of 18.5 pounds; or the 3D Robotics Spektre Industrial Multirotor Aerial Vehicle, a quadcopter with a maximum weight of 10.5 pounds. These systems are likely to be among the first to be used in Pathfinder, but the railroad also is working with the FAA toward gaining type certification for one rotorcraft and two fixed-wing UAS. The rotorcraft, which can fly for about 30 minutes, will suffice for the early phases of the test, Graetz says, and the fixed-wing UAS will enable inspections over much longer distances.

The railroad already has benefitted from the Section 333 exemption. In early May, BNSF flew UAS to assess and monitor two train derailments, FAA spokesman Les Dorr says. UAS covered the May 6 derailment in Heimdal, North Dakota, of six tank cars carrying crude oil and the May 8 derailment at Valley View, Texas, of four locomotives and 13 freight cars. The visual data from UAS provided a valuable new perspective on the accidents, Graetz says.

“That was a big win.”

The inspection aircraft also can mount infrared sensors, and the railroad plans to test hyperspectral sensors as well. The 36-month program is expected to begin sometime during the next 12 months, Graetz says. Initially, the UAS operator will be located at a launching point in the field, but Graetz expects that eventually the inspection aircraft will be controlled from BNSF headquarters at Fort Worth, Texas, which has an extensive network with advanced communication technologies.

The Pathfinder Program could have broad-reaching effects on the unmanned systems industry and its customers, FAA Administrator Michael Huerta indicated in announcing the program. "We anticipate receiving valuable data from each of these trials that could result in FAA-approved operations in the next few years," he said. "They also will give insight into how unmanned aircraft can be used to transform the way certain industries do business — whether that means making sure trains run on time, checking on the health of crops or reporting on a natural disaster."



FAA Administrator Michael Huerta announces the BNSF Pathfinder Program initiative at AUVSI's Unmanned Systems 2015.

Photo: Robb Cohen.

AUTOMATING RAILROAD SAFETY

A second major railroad safety initiative — Positive Train Control — has been in the works for 45 years. Positive Train Control is an integrated set of highly advanced technologies designed to automatically stop or slow a train remotely to prevent certain types of accidents. "PTC refers to communication-based/processor-based train control technology designed to prevent train-to-train collisions, overspeed derailments, incursions into established work zone limits and the movement of a train through a main line switch in the improper position," the Federal Railroad Administration says.

The system integrates ground-based sensors, signals, command and control facilities, and Global Positioning System constellations whirling through space. Satellites beam GPS data directly to computers in locomotives equipped with digital radio systems to monitor train location and enforce speed limits. Passive transponders along the tracks contain information on the train's location, speed restrictions and signal locations.

Congress, in the Rail Safety Improvement Act of 2008 (RSIA), mandated that PTC be implemented across a significant portion of the nation's rail industry by December 31 of this year, but the Association of American Railroads, which represents freight railroads, has said its railroads, despite their extensive efforts, can't meet the deadline. Amtrak, the passenger rail system, has said it expects to comply with the PTC mandate by December 31.

"Since enactment of RSIA, railroads have devoted enormous human and financial resources to develop a fully functioning PTC system over the 60,000 miles that are subject to the PTC mandate," the association says on its website. "Progress to date has been substantial. Railroads have retained more than 2,400 signal system personnel to implement PTC and have already spent $5 billion on PTC development and deployment. Railroads expect to spend more than $9 billion before development and installation is complete.

"Nevertheless, due to PTC's complexity and the enormity of the implementation task — and the fact that much of the technology PTC requires simply did not exist when the PTC mandate was passed and has had to be developed from scratch — much work remains to be done. Despite railroads' best efforts, various technical and nontechnical challenges make full development and deployment of PTC by 2015 impossible, as outlined in a March 2014 industry report on the progress of PTC implementation. Consequently, the implementation deadline should be made more realistic to ensure that a fully interoperable PTC system is deployed in a logical manner and thoroughly tested prior to implementation."

Legislation pending in Congress would extend the deadline by five years, to Dec. 31, 2020.

"If on Jan. 1, 2016, railroads required to implement PTC systems are in violation of this statutory deadline, FRA will take appropriate enforcement actions to achieve compliance," FRA's Chief Safety Officer Robert C. Lauby told a U.S. Senate committee June 10. "To address those concerns, the GROW AMERICA Act proposes that FRA be granted authority to review, approve and provisionally certify PTC plans on a railroad-by-railroad basis."

Preliminary findings of investigations into the May 12 derailment of an Amtrak train in Philadelphia, in which eight people were killed and 200 injured, indicate that excessive speed was a likely cause. "The reality is that Positive Train Control is specifically designed to prevent overspeed accidents," Lauby testified. "If we believe that the cause of this incident was overspeed, it would have been prevented by PTC."

The National Transportation Safety Board has called for Positive Train Control since 1970 in more than 50 recommendations.
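To make the overspeed-enforcement concept described above concrete, here is a minimal Python sketch of the kind of check PTC performs: compare the train's GPS-derived position and speed against track-segment limits and act automatically. It is not any railroad's implementation; the mileposts, limits and action names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TrackSegment:
    """A speed restriction between two mileposts, as wayside data might report it."""
    start_mp: float
    end_mp: float
    limit_mph: float

# Hypothetical territory; real PTC data comes from back-office servers and transponders.
SEGMENTS = [
    TrackSegment(0.0, 10.0, 60.0),
    TrackSegment(10.0, 12.0, 25.0),  # e.g., a work zone
    TrackSegment(12.0, 50.0, 70.0),
]

def speed_limit_at(milepost):
    for seg in SEGMENTS:
        if seg.start_mp <= milepost < seg.end_mp:
            return seg.limit_mph
    return 0.0  # unknown territory: treat as most restrictive

def enforce(milepost, speed_mph, warn_margin_mph=3.0):
    """Return the enforcement action for the train's current position and speed."""
    limit = speed_limit_at(milepost)
    if speed_mph > limit:
        return "PENALTY_BRAKE"       # automatic enforcement, no crew action needed
    if speed_mph > limit - warn_margin_mph:
        return "OVERSPEED_WARNING"   # alert the crew before enforcing
    return "NORMAL"

print(enforce(milepost=10.5, speed_mph=40.0))  # -> "PENALTY_BRAKE" in the 25 mph zone
```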


YOUR SHOT HERE

Have a great photo you’ve taken with an unmanned system?

CONTACT: Danielle Lucey, Editor, dlucey@auvsi.org

Send it in to AUVSI's Unmanned Systems and it could be published as our Viewfinder photo of the month. All photos must be high resolution and must have been taken during legal unmanned system operations. If the photos were taken with an unmanned aircraft, the operator must also have adhered to AUVSI's Code of Conduct. See http://www.auvsi.org/conduct for more information.



GOOGLE ON STEROIDS

Searching the 'Computational Universe' With Wolfram

By Scott Kesselman

Wolfram Research has been at the forefront of computer, Web and cloud software innovation and has led advances in computational knowledge since releasing its Mathematica technical computing platform in 1988. The company's extensive database and powerful algorithms combine with an intuitive, simple syntax to create a powerful tool for engineers, scientists, mathematicians and programmers.

In 2009, Wolfram extended its offerings to include Wolfram|Alpha, one of the most complex and ambitious software projects of all time, according to the company. The online software gives millions of users around the globe access to a world of computable information in areas such as mathematics, words and linguistics, units and measurements, statistical and data analysis, people and history, chemistry, culture and media, money and finance, physics, art and design, socioeconomics, astronomy, music, health and medicine, engineering, places and geography, food and nutrition, education, materials and earth science, life sciences, weather, sports, transportation, Web and computer systems, and more.


Wolfram has been able to aggregate the accumulated knowledge of our culture and provide it in an easily searchable and interactive format. With Alpha Pro, users can engage with modular, three-dimensional models of Lagrangian fluid dynamics or nitrile functional groups, nutrient graphs for sweet potato consumption or poker betting odds and everything in between. With Wolfram, the world is becoming computable.

Stephen Wolfram, founder of Wolfram Research, spoke frankly about the future of human cognition and human-robot interaction at the RoboUniverse conference in New York City this spring. In his eyes, computers will eventually be able to analyze data from our world so efficiently that they will develop an understanding of our lives beyond the bounds of language, which we generally require to pass on information.

Imagine an outside observer with no concept of your thoughts or the reasons behind your every move. This observer merely collects all of the sensory data surrounding your actions, including spatial, visual, auditory and thermodynamic information. Through deep learning neural networks, these data can form an incredibly precise model of your existence, with an understanding beyond that of ours, which parses experience and chemistry into arbitrary classifications that we culturally agree on — language. The outside observer, computer or robotic technology, can interpret your life more accurately than you can, and at the end of the day, "humans get to define the goals."

Wolfram calls these "post-linguistic emergent concepts," or fundamental realities that arise independently of, and at a higher level than, language. We have some small connection to this phenomenon when we share deep experiences that cannot be described with language, or when something is readily apparent to us even when we can't understand why or how we know it to be true.

Wolfram|Alpha allows users to leverage big data to pull specific queries, like this earthquake search from Wolfram's mobile app or the sky chart on Page 19.


SPOTLIGHT

Big data can help the world achieve a closer bond with our natural origins, according to Wolfram. Complex interactions of protein chains, water and energy over billions of years have led us to become sentient and social living things able to share at least a basic understanding of what it is like to be alive. And although true robots will never share these same biological foundations, they will help us to understand ours in a way we would otherwise be blind to. In this sense, computational programming is the most natural, and possibly most effective, process for understanding ourselves. Just as we take in and process our environment, so too can robots be an ideal extension of ourselves beyond our biological limits, where we can take full advantage of the computational potential built into our biology and the natural world.

Image: Wolfram Research.

In comes the Wolfram Language, built upon 25 years of experience in technical computing, plus the successful Alpha platform. Its lofty purpose is to enhance the organic capabilities of mankind and become a pure extension of our minds, he says. Wolfram doesn't believe that programming should be an activity exclusively for the appropriately educated, he says, because we all rely on a similar way of processing information to survive and interact every day.

The interface synthesizes code based on natural language processing. It enables users to define the goal of an algorithm, and it will find an algorithm out there that works. He calls this "discovering the computational universe." It will also turn questions into symbolic language, run a model and return results or visualizations — for example, "What will happen when I move this lever?"

The latest update supports lightning-fast builds of simple tasks such as geolocating your IP address and creating concentric circles around that point, which Wolfram demonstrated live at the RoboUniverse conference. It also integrates with the cloud for more complicated tasks involving pulling in and analyzing data from Internet of Things devices. Wolfram has a database of nearly every IoT device ever created for seamless integration with the software.

In the future, Wolfram hopes to create a universal sensor platform that can pare down the countless data sensors on the market to a set of core sensors from which all the other information can be extrapolated.

According to Wolfram, the future is bright, and we have only just started to see how robotic technologies will benefit us. In his eyes, we don't have to worry about the robots taking over, but should instead focus on becoming automated ourselves once robotic technologies are better than humans at everything. Once machine learning is perfected, we will have an auto-suggest for everything in life, an idealized set of recommendations, and it will be up to us to continue to update our definition of life and our goals.

OBJECT IDENTIFICATION

The Wolfram Language also comes with built-in image recognition, currently covering over 10,000 objects. At the event, the program browsed for a picture of a koala. The system had no trouble recognizing the first image of a koala, and, when tasked with a more obscure koala wearing a yellow hat and eating an ice pop, Wolfram himself was surprised at the positive result.

The programming language figures to fit in with the most current issues in data management and analysis from connected devices. For instance, it is plausible to send up an Internet-connected drone over rural farmland and stream data to the Wolfram cloud from the device. Built-in analytics features can sort through the voluminous data and return actionable information to the operator, all over the cloud.
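Beyond the desktop and mobile products, Wolfram also exposes Alpha through web APIs. As an illustration of pulling a specific query programmatically, in the spirit of the earthquake search shown earlier, this Python sketch calls the Wolfram|Alpha Short Answers endpoint; the AppID is a placeholder you would obtain from Wolfram's developer portal, and the query string is just an example.

```python
import urllib.parse
import urllib.request

APPID = "YOUR-APPID"  # placeholder; real keys come from Wolfram's developer portal

def ask_alpha(query: str) -> str:
    """Send a natural-language query to the Wolfram|Alpha Short Answers API."""
    url = "https://api.wolframalpha.com/v1/result?" + urllib.parse.urlencode(
        {"appid": APPID, "i": query}
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8")

# Example query, mirroring the mobile-app earthquake search above.
print(ask_alpha("largest earthquake magnitude in 2011"))
```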



HIGH-TECH HANGAR

EasyJet Turns to Drones for Aircraft Inspections

Being an airplane can be tough. Planes can be damaged by lightning, hailstorms, foreign objects, bird strikes, and even from getting banged into by air bridges and supply vehicles. They need to be inspected frequently, but such inspections can take time, pulling them out of use for half a day or more. If the inspection needs to be done at a remote location, an onsite engineer isn't always available or parts may not be at the ready, adding even more time.

EasyJet, a low-cost British air carrier, has now teamed with U.K. neighbor Blue Bear Systems to speed up that process using small unmanned aircraft. They are using Blue Bear's RISER drone, originally created to measure radiation in nuclear plants. Its name is actually an acronym for Remote Intelligent Survey Equipment for Radiation. The aircraft was developed in 2012 along with Createc Ltd., another U.K. company, to carry radiation detection and mapping payloads. That software also means it can operate indoors in areas where GPS doesn't work, which allows it to inspect aircraft. Three years after it was created, that became its focus.

The vehicle caught the eye of EasyJet, according to Blue Bear Systems CEO Yoge Patel, after it was shown to be able to inspect a nuclear facility in 35 minutes.


TESTING, TESTING

An artist's conception of a Blue Bear Systems RISER UAS inspecting an EasyJet aircraft.

Patel described the origin of drone plane inspections at a recent unmanned aircraft conference sponsored by the International Civil Aviation Organization.

RISER is able to get close to aircraft using a laser scanner to keep from bumping into the wings or fuselage. It's able to image damage of less than one millimeter and do so without hitting the plane or people working nearby. It can conduct its scan automatically, so maintenance crews don't have to fly it. In the future, Blue Bear Systems hopes to allow for automatic damage recognition, which would speed aircraft inspections even further.

"Really, we want these platforms and inspections to be cost-effective assets, so we do not want a huge training burden," Patel said. "This is really pointing to automated cooperation. You allow the vehicle to fly around a commercial vehicle and it just does the inspection."

Using the RISER drone — even in its current, less automated state — has several advantages over the traditional inspection method, according to Patel and EasyJet's Ian Davies. It's faster, for one thing, which keeps the planes available for flight more often, thereby cutting costs and the need for spare airplanes.
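Blue Bear has not published RISER's control laws, but the standoff-keeping role of the laser scanner can be sketched simply. The Python below is a hypothetical single-axis illustration, not Blue Bear's code: it commands the drone toward a desired distance from the nearest scanned point on the airframe.

```python
def standoff_velocity(scan_ranges_m, desired_m=1.5, gain=0.8, max_speed=0.3):
    """
    One control step for holding a fixed standoff from an aircraft surface.
    scan_ranges_m: ranges from one laser-scanner sweep, in meters.
    Returns a forward velocity command (m/s): positive approaches, negative backs off.
    """
    nearest = min(scan_ranges_m)   # closest point on the fuselage or wing
    error = nearest - desired_m    # positive = too far, negative = too close
    cmd = gain * error
    return max(-max_speed, min(max_speed, cmd))  # clamp for gentle motion

# Too close at 1.2 m: the command is negative, easing the drone away from the skin.
print(standoff_velocity([2.4, 1.9, 1.2, 2.1]))  # -> -0.24
```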

Images: Blue Bear Systems.

It also allows for change detection over time, so EasyJet can track the wear and tear its planes experience. EasyJet conducted trials of the system this spring and summer, starting with initial tests using a cabin trainer and moving on to hangar tests with actual EasyJet aircraft.

The RISER UAS can laser-scan planes for inspection.

The trials were successful, and in early June Blue Bear announced that its drones will join EasyJet’s maintenance regime at company hangars in London, Geneva, Basel, Switzerland, Berlin, Paris, Milan and others across Europe. That work will start next year. “Our team looks outside of the industry to see what sort of available cutting-edge technology could be put to use in aviation and drone technology is the perfect example of this. Our successful automated trials have proved that drones can be extremely effective to help us perform detailed aircraft checks,” Davies said in a Blue Bear press release. “Implementing automated drones next year means that checks that would usually take more than a day to complete can now be performed in a couple of hours with greater accuracy, which in turn leads to a reduction in technical-related delays. Excluding natural events outside of our control, like lightning and bird strikes, my ambition is to get technical delays down to zero through the clever and innovative use of technology.”

Blue Bear hopes to expand that work beyond just the initial 10 sites. "What's the target? Why, the entire fleet for EasyJet," Patel said. "Why not? If it works for one aircraft, why would you not want to inspect the entire fleet?"

BEYOND AIRPLANES

Patel also said aircraft inspection isn't the last word for the versatile RISER. "Our RISER system is a game-changer within the world of autonomy and commercial aviation," she said in a press release. "However, it is not just limited to aircraft inspection — it can be used to examine ships, buildings, structures and other vehicles too."

EasyJet's Davies agreed, saying, "We believe there may be other uses for drones in aviation in the future, like the transportation of spare parts."



A Simple Gesture

Drone and Robot Builders are Pursuing a Wide Range of Easier-to-Use Controllers

By Marc Selinger



Developers of drones and robots are trying out and rolling out new controllers that are more intuitive to operate and, therefore, require less training than more traditional devices, such as joysticks and video game-like controllers. Touchscreen tablets, goggle-like headsets and gesture-control devices are among the new options, which offer many benefits, including making drones and robots available to a wider range of people and missions.

"The future of drones lies in their underlying utility, usability and adding autonomy as necessary," says Radley Angelo, chief executive officer of Spark Aerial, a San Diego-based engineering firm. "We see human-computer interaction as the next big frontier for drone innovation."

DRONES

Drone giant DJI recently flew its new Matrice 100 quadcopter using an Oculus Rift virtual-reality headset and a three-inch-long Leap Motion Controller. The M100 is designed for experimentation, and Spark Aerial integrated both controllers with the M100 at DJI's request. During the demonstration, which occurred in the San Francisco Bay area, the Oculus Rift steered the drone's camera, and the hand gesture-guided Leap Motion device directed the drone's flight.

The Oculus Rift controls the camera "with just the turn of a head," Angelo says. With the Leap Motion Controller, "it's possible to just think of your hand like the drone. Move it higher, and the drone flies higher. Spin left or right, and the device will yaw in the air."

Michael Perry, spokesman for China-based DJI, says the test showed that either controller might be easier to use by people who are uncomfortable with joystick-type controllers or have disabilities or poor motor skills. While DJI does not plan to sell the Oculus or Leap Motion devices with its drones, it is exploring how its drones could more easily interface with nontraditional controllers.

"It's not like one user interface fits everybody," Perry says. "It will be exciting when consumers have a lot of choices for how they interact with technology."

DJI and Accel Partners, a Palo Alto, California, venture capital firm, recently launched SkyFund, which plans to invest millions of dollars in promising "early-stage entrepreneurs" who are developing a wide range of aerial robotic technologies, including new user interfaces. While the initial investment in each project will be about $250,000, "the option exists for follow-on funding opportunities from DJI and/or Accel as companies grow," according to the fund's website.

Aeryon Labs says it continues to update the tablet-style controller for its SkyRanger unmanned aircraft to make it more intuitive.

Photo: Georgia Tech, Aeryon Labs.

Georgia Tech developed a tablet-based system that directs a swarm of small robots. Magnus Egerstedt, the professor who led the project, holds the robots.
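The "your hand is the drone" mapping Angelo describes can be sketched in a few lines. The Python below is illustrative only; the palm height and yaw inputs and the scaling constants are assumptions, not the Leap Motion SDK or Spark Aerial's integration.

```python
def hand_to_command(palm_height_mm, palm_yaw_deg,
                    hover_height_mm=200.0, deadband_mm=20.0):
    """
    Map a tracked hand pose to drone velocity commands.
    Returns (climb_rate, yaw_rate) in normalized units [-1, 1]:
    hand above the hover reference -> climb; hand rotated -> yaw.
    """
    dz = palm_height_mm - hover_height_mm
    if abs(dz) < deadband_mm:   # ignore small tremors around the hover point
        climb = 0.0
    else:
        climb = max(-1.0, min(1.0, dz / 150.0))
    yaw = max(-1.0, min(1.0, palm_yaw_deg / 45.0))
    return climb, yaw

# Hand raised 10 cm above the hover reference, rotated 20 degrees right:
print(hand_to_command(300.0, 20.0))  # -> (~0.67, ~0.44)
```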


Leap Motion says lesser-known developers are also testing its controller with a variety of small drones, including the Parrot AR and Puppet PPM quadrotors. Small firms are pursuing other advances in intuition. Lily Robotics is developing Lily, a 2.8-pound drone that it calls "the world's first throw-and-shoot camera." There is "no controller required," according to the company.

Drone developers are testing a variety of intuitive control systems, including this Leap Motion device, which responds to hand gestures.

A Lily user carries a disk-shaped tracking device that can be worn like a wristwatch. The user throws Lily in the air, and the drone automatically flies itself and shoots video and still photographs of the person. A company video shows Lily following a snowboarder down a slope and landing in his palm. "Focus on your activity while Lily flies itself to capture your adventures," the company wrote on its website.

Lily Robotics, which began two years ago in the basement of a University of California Berkeley robotics laboratory, says it plans to begin shipping the drone to customers in 2016. Lily sold for $499 until mid-June and will now "progressively increase up to the regular retail price of $999," the startup says.

CyPhy Works of Danvers, Massachusetts, is designing the six-rotor LVL 1, which promises to allow "absolutely anyone" to operate a drone to shoot video and photographs. Users will fly LVL 1 by swiping a finger on a smartphone screen, and they will be able to post images on social media while the drone is still in the air, CyPhy says. Deliveries of LVL 1 are scheduled to begin in 2016. The drone sold for $495 during a Kickstarter fundraising campaign and will retail for "over $600."

Drone companies are also working to make existing products more intuitive. Aeryon Labs of Ontario, Canada, says it continues to update the software for its map-based, tablet-style controller. For instance, with the new HDZoom30 camera, which can read a license plate from 1,000 feet away, the user of an Aeryon SkyRanger small unmanned aircraft can simply tap the controller's screen to have the camera zoom in on a particular location. The Aeryon controller "ensures that the aircraft and payload respond to operator commands in a fashion that is consistent and logical, and [it does] not require the operator to have any knowledge of aviation or [radio-controlled] flight as is required for joystick-based" controllers, says David Proulx, Aeryon's vice president of product and marketing.

ROBOTS

Builders of unmanned ground vehicles are also working to make their controllers more intuitive. The new uPoint Multi-Robot Control System uses a tablet to operate iRobot's family of robots. Instead of using a joystick to steer a robot to a location, the user simply points a finger on the uPoint screen and the robot automatically goes there.

Orin Hoffman, technical director for iRobot's defense and security unit, says the company picked the tablet because it is easier to use than a joystick-based system and because many of its customers already use tablets for other purposes. "We're in the tablet generation now, so everyone knows how to use a tablet," Hoffman said. "Our users are coming into the experience already pretrained."

The tablet also allows users to control multiple robots, access databases and share images with colleagues who are nearby or at other locations. The company looked at other devices, including Oculus Rift, Leap Motion, Nintendo Power Gloves and speech-recognition technology, but concluded they were not as ready for prime time as the tablet.


cumbersome for a battlefield. “You can’t have someone in an Oculus Rift who can’t see where they’re walking or can’t see who’s shooting at them,” Hoffman says. But iRobot, of Bedford, Massachusetts, continues to conduct research on other types of robot controllers, as do others. A student team at the University of Pennsylvania is testing Oculus Rift with the Dexterous Observational Roving Automaton, which consists of a pair of cameras mounted on a wheeled cart. The DORA telepresence robot allows the person wearing the Ocu-

lus Rift to see through the cameras remotely. As the user turns his head left, right, up or down, the cameras do the same. The Oculus Rift allows the robot to “track nuanced human head movement in all six degrees of freedom and provide real-time visual and auto feedback via a headmounted display,” the DORA team writes on its website. The robot provides “a natural, responsive and clean graphical user interface.” The Georgia Institute of Technology says its researchers have developed a user-friendly, tablet-based system to direct a fleet of Khepera III miniature

The Georgia Institute of Technology says its researchers have developed a user-friendly, tablet-based system to direct a fleet of Khepera III miniature wheeled robots made by Switzerland’s K-Team Corp. The user taps the tablet to control where a beam of light appears on the floor, and the robot swarm then rolls toward the light. “The new Georgia Tech algorithm that fuels this system demonstrates the potential of easily controlling large teams of robots, which is relevant in manufacturing, agriculture and disaster areas,” the university says. For instance, in a tsunami-ravaged region, “the robots could search for survivors, dividing themselves into equal sections. If some machines were suddenly needed in a new area, a single person could quickly redeploy them.”
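Stripped to its essentials, light-guided swarm control has every robot take a bounded step toward wherever the light is projected. Georgia Tech’s published algorithm does considerably more, spreading the team to cover a light-defined region, but the toy loop below captures the basic chase-the-light behavior; all names here are illustrative.

```python
# A toy version of light-guided swarm control: each robot repeatedly takes
# a bounded step toward the operator's projected light point. The real
# Georgia Tech system uses a coverage-control algorithm; this sketch only
# captures the "swarm follows the light" idea.

import math

def step_toward(robot, light, max_step=0.05):
    """Move one robot up to max_step meters toward the light point."""
    dx, dy = light[0] - robot[0], light[1] - robot[1]
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return light
    return (robot[0] + max_step * dx / dist, robot[1] + max_step * dy / dist)

# Usage: five robots converge on a tapped location at (2.0, 1.0).
robots = [(0.0, 0.0), (0.5, 2.0), (3.0, 0.5), (2.5, 2.5), (1.0, 3.0)]
light = (2.0, 1.0)
for _ in range(100):  # 100 control ticks
    robots = [step_toward(r, light) for r in robots]
print(robots)  # every robot ends at the light point
```

Redeploying part of the swarm, as in the tsunami example, is then just a matter of assigning a different light target to a subset of the robots.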

Photo: Leap Motion, iRobot.

The new uPoint Multi-Robot Control System, made by iRobot, uses a point-and-go tablet.



SKY WITNESS NEWS

CNN Sees ‘Ecosystem’ Developing for UAS News Coverage

CNN has been relying increasingly on small unmanned aircraft to add depth to its news coverage. While the broadcast giant is working to develop its own in-house capability, so far it has relied on a growing number of commercial enterprises that can meet its needs for high-quality video and reliable platforms.

CNN broadcast its first domestic story with drone footage to mark the 50th anniversary of the civil rights march in Selma, Alabama.


Since then, it has used drones to cover the anniversary of the federal building bombing in Oklahoma City, the oil spill near Santa Barbara, California, and flooding in Texas.

While the recent spate of Section 333 exemptions is giving CNN a greater choice of vendors, it’s still a small field, says Greg Agvent, CNN’s senior director for news operations. “The reality is there’s only a few dozen companies in the U.S. that are equipped to do this. Equipped is the key word on this thing,” he says. “We see a lot of companies that are getting their exemptions that are just not in the news business or are not flying the level of craft that we might want to fly.”

However, an increasing number are using the DJI Inspire as their main UAS. It is one of the platforms with a good enough camera, safety record, ease of use and flight time, “all of those things that we have to factor in,” he says. All it lacks is a robust enough signal for live video streaming.

CNN goes about acquiring footage by identifying news opportunities, whether they are breaking stories or ones that can be anticipated, such as the Selma event or flooding. Agvent then identifies people in the area who have exemptions and sounds them out, asking, “What are you flying, who’s your pilot, what kind of video can you show me?”

He also needs to know if they are “willing to work the way that we work, because a lot of guys aren’t. If you’re going to shoot for real estate, you can do your job and do it well from 9 to 5. News is not real glamorous. We’re going to ask you to go to places at any hour of the day, in some of the most severe conditions that we find, and figure out a way under the current rules and regulations that your 333 exemption mandates. That’s not easy.”

The current marketplace is a bit of the Wild West, he says, as new players arrive on the scene but can’t always deliver what a 24-hour news operation needs. “I’ll give guys opportunities, and I have given guys opportunities, and there are guys who hit the ball out of the park, and there are guys who quite frankly weren’t quite up to the expectations that we had of them,” he says. “It’s an art and a science to do what these guys are doing.” The art is the videography, which requires a certain kind of eye. The science is the piloting, the aerial knowledge and the ability to fly a UAS in a way that’s safe and reliable.

A UAS operated by Texas-based HeliVideo on behalf of CNN prepares to fly at a 20th anniversary commemoration of the bombing of the federal building in Oklahoma City.

Photos: Mike Shore Photo.

Many of the crews that CNN hires, especially those flying DJI UAS, are “two-man bands,” with one functioning as a spotter and the other flying the aircraft and operating the camera. For bigger operations, such as in Oklahoma or Selma, which were not tied to fast-breaking news, CNN put a third person into the mix, so there was a separate pilot, camera operator and spotter.

The business model of the news business is changing as more people get into the game and get their exemptions with lower-priced craft that meet CNN’s basic needs, Agvent says. “When we first started shooting, and we were shooting with giant cameras, very big, almost film cameras, and film-capable platforms, the cost was extremely high. The costs are coming down, and that’s a good thing. There’s still plenty of room for more guys on this thing, because there are probably only a couple or three dozen that are probably really fully functional at this moment.”

CNN is building its own database by keeping an eye on the Federal Aviation Administration’s Section 333 exemption Web page and then researching the companies that might work. Agvent says he has talked to 10 to 12 companies so far and hired five. He’s also approached by companies offering their wares and adds them to the database.

The news agency also buys footage from people who may not have Section 333 exemptions but come to it with imagery after a news event.

CNN employed a drone for coverage at the memorial to the Oklahoma City federal building bombing.

“There are hobbyists out there now that are shooting video. If they capture a news event and then approach CNN, we’ve had these internal discussions and we’ve purchased that footage from them. At that point we think it’s a rights issue, not anything more.”

CNN is working with Georgia Tech on researching UAS for newsgathering and plans to develop its own in-house capability, Agvent says. The network has been designated a Pathfinder Program participant by the FAA to work on flying in urban settings for news.

“Everybody talks about safety, but it just ramps up when you’re flying in an urban setting,” he says. CNN is talking with vendors about potential UAS platforms, as well as putting the necessary operational practices in place.

Agvent says the agency could have some in-house pilots and a few tiers of aircraft: high-end, lower-end and even tethered UAS for long endurance. It would then still hire other companies to augment that capability.

The industry is seeing the dawn of an ecosystem that will support not just news but also insurance companies, inspection companies and others, Agvent says. “We will have our basic level in house, where we’ll be able to handle any of the assignments that come our way, but we’re going to have to supplement that … with these trusted vendors, with this ecosystem that we’re starting to see.”


