San Francisco, CA, January 4, 2017

Commercial drones have become one of the most promising segments of enterprise technology, poised to revolutionize the way businesses digitize their operations and interact with the physical world. In 2017, significant advances will be made in two key areas of commercial drone technology: Integration and Autonomy.

To date, drones have been used primarily as stand-alone tools. The planning, capture, and analysis of drone data have been conducted as separate efforts. When commercial drone operations scale out widely, it will be due in large part to the following aspects of technology integration:

  • Drones will become increasingly dependent on cloud-based operations planning. Flight plans will be created in the cloud and tasked to drones onsite. Specific sensor settings and configurations will also be determined in the cloud to ensure that entire fleets of vehicles and operators collect data consistently.
  • To enable cloud-based planning, an abstraction layer will begin to form between the requirements of the data and the physical hardware used to collect it. For example, commercial users may only need to specify the geographic boundaries of a job and the desired sampling resolution. The planning abstraction layer will use that information to determine which drone, sensor, and settings are appropriate to conduct the job. Eventually, commercial users may be able to simply specify the desired information product, like a digital terrain model for floodplain analysis, and let the planning system figure out the rest.
  • A major pain point in drone systems today is moving data from the drone's sensor to the cloud for processing. It often requires physically transferring a storage device like an SD card from the sensor to a tablet or mobile device, then transferring from the mobile device to the cloud through a loosely integrated file-sharing tool. This process will be streamlined through every layer of the stack. Tightly integrated sensors will offload data wirelessly and automatically, without any need for a physical transfer. Third-party “middle-man” data transfer tools will be eliminated in favor of direct, secure connections to cloud data analytics platforms. In some cases, drones may connect directly to cloud data APIs through integrated mobile LTE devices.
  • Fleet management and diagnostic data collection are painfully manual processes today. It is often difficult (and sometimes impossible) for drone operators to access operational log data, let alone determine when a drone may need preventative maintenance or scheduled servicing. Beyond maintenance, drone operators manually track usage statistics (hours flown, number of takeoffs/landings, etc.) that they are then often required to report to regulatory agencies like the FAA. As drone systems become more tightly integrated with the cloud, diagnostic and operational data will be automatically uploaded and managed in a central location. This will unburden individual operators and make the management of entire fleets of commercial drones feasible for scaled operations.
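The planning abstraction layer described above can be sketched in miniature. The following is a minimal illustration, not a real planner: the sensor specifications and the `select_sensor` logic are hypothetical, but the altitude calculation uses the standard photogrammetric relation GSD = sensor_width × altitude / (focal_length × image_width). Given only the sampling resolution a user would specify, the planner derives a flight altitude for each candidate sensor and picks one.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    focal_length_m: float   # lens focal length
    sensor_width_m: float   # physical sensor width
    image_width_px: int     # image width in pixels
    max_altitude_m: float   # permitted ceiling for this platform/payload

def required_altitude_m(sensor: Sensor, gsd_m_per_px: float) -> float:
    """Altitude at which this sensor achieves the requested ground sampling distance."""
    return gsd_m_per_px * sensor.focal_length_m * sensor.image_width_px / sensor.sensor_width_m

def select_sensor(fleet, gsd_m_per_px):
    """Pick the sensor that meets the requested GSD at the highest legal altitude
    (higher altitude means a wider footprint per image, so fewer flight lines)."""
    candidates = []
    for s in fleet:
        alt = required_altitude_m(s, gsd_m_per_px)
        if alt <= s.max_altitude_m:
            candidates.append((s, alt))
    if not candidates:
        raise ValueError("No sensor in the fleet can achieve the requested resolution")
    return max(candidates, key=lambda c: c[1])

# Hypothetical fleet; numbers are illustrative, not any specific product.
fleet = [
    Sensor("20MP-1inch", 0.0088, 0.0132, 5472, 120.0),
    Sensor("12MP-compact", 0.0047, 0.0063, 4000, 120.0),
]
sensor, altitude = select_sensor(fleet, gsd_m_per_px=0.02)  # 2 cm/px survey
print(sensor.name, round(altitude, 1))  # selects the 1-inch sensor at ~73 m
```

A fuller planner would fold in the remaining job parameters (image overlap, flight speed, battery endurance) the same way, keeping the user's input limited to boundaries and desired resolution.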

Drones have come a long way, from fully manual “stick-and-thumb” piloting to fully autonomous operations that require zero input from launch to landing. Even so, flight autonomy is still mostly limited to basic survey-style data capture, and typically with fully pre-programmed flight paths. Autonomy in drone operations will develop in several ways to enable safer, more efficient, more consistent data capture for increasingly complex applications:

  • Autonomous data capture via drone has been predominantly “2-dimensional,” in the sense that the drone is tasked with collecting downward-looking data from a conservative flight altitude. This type of data collection is sufficient for many land-survey applications, such as quarry and mine operations, construction site monitoring, and civil planning. More complex data capture tasks are typically piloted manually: inspections of buildings (houses, offices, warehouses), vertical structures (cell towers), and infrastructure (electrical substations, bridges, dams). Autonomy for these types of data capture is significantly more complex than for overhead surveys because the drone must operate in much closer proximity to the structures of interest. Drones will increasingly use pre-measured 3D models for flight planning, tightly integrated proximity sensors to ensure safety during flight, and sensors like lidar, stereo cameras, and structured light to capture 3D information in real time and navigate complex sites.
  • Most drone operations today capture data in an “open-loop” fashion, i.e. data is captured during the flight and interpreted only after landing, when the data can be manually accessed and reviewed. This creates significant inefficiencies for drone inspection operations that require data to be captured at varying levels of detail. Take, for example, a drone inspection of a large warehouse roof. In an initial pass, the drone may collect low-resolution data over the entire roof, land, and present that data to the operator to determine where a closer look may be needed in subsequent flights. As drones begin to carry more powerful onboard computers, computer vision techniques will enable “closed-loop” data capture and re-tasking. Vision processing onboard the drone will automatically determine where subsequent detailed inspection is needed and collect the detailed data during the same flight.
  • Beyond just the capture of data, the analysis of data will become increasingly automated as commercial drone operations scale up. Today, we rely mostly on humans for aerial data analysis, like determining the condition of a house’s exterior or the presence of rust on a structural member. As drones collect more data, and as humans manually annotate this data with assessments of damage or condition, the result will be an increasingly large database with which to train automated image analysis algorithms. This process of training machine learning algorithms with datasets that have been manually labeled is called supervised learning. In the past few years, image segmentation and classification algorithms have made significant improvements through the use of convolutional neural nets for deep and robust learning, and GPUs for fast, parallelized computation. It’s important to note that humans will remain a critical part of the data analysis process. The human effort will simply shift towards the training phases, while drone systems will take on more automated classification and analysis through onboard or onsite computation.
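The “closed-loop” capture idea above can be sketched as a simple two-pass control loop. This is a minimal illustration under stated assumptions, not a real flight stack: `capture_tile` and `anomaly_score` are hypothetical stand-ins for the vehicle's imaging and onboard vision components, and `plan_grid` is a toy waypoint generator.

```python
# Sketch of closed-loop capture: a coarse pass scores each tile onboard, then
# flagged tiles are revisited for detailed capture during the same flight.
# capture_tile / anomaly_score are hypothetical stand-ins for the vehicle's
# real imaging and onboard vision components.

COARSE_GSD = 0.05    # 5 cm/px first pass over the whole roof
DETAIL_GSD = 0.005   # 5 mm/px follow-up pass on flagged tiles
THRESHOLD = 0.7      # anomaly score needed to trigger a revisit

def plan_grid(boundary, gsd_m_per_px, tile_px=1000):
    """Tile-center waypoints covering a rectangle (xmin, ymin, xmax, ymax).
    Each tile's ground footprint is gsd * tile_px metres on a side."""
    xmin, ymin, xmax, ymax = boundary
    step = gsd_m_per_px * tile_px
    waypoints, y = [], ymin + step / 2
    while y < ymax:
        x = xmin + step / 2
        while x < xmax:
            waypoints.append((x, y))
            x += step
        y += step
    return waypoints

def closed_loop_inspection(boundary, capture_tile, anomaly_score):
    """Coarse survey, onboard scoring, and detailed re-capture in one flight."""
    revisit = []
    for wp in plan_grid(boundary, COARSE_GSD):       # pass 1: survey everything
        if anomaly_score(capture_tile(wp, COARSE_GSD)) >= THRESHOLD:
            revisit.append(wp)
    return {wp: capture_tile(wp, DETAIL_GSD) for wp in revisit}  # pass 2
```

The threshold is the key operational knob: a lower value spends more flight time on detailed re-capture, a higher value risks missing damage, and tuning it is exactly where the human-labeled training data from the supervised-learning pipeline comes back in.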

With these advancements in integration and autonomy, commercial drone technology will be more valuable than ever to enterprises, and that value will be progressively easier to derive. Those that take advantage of these advances will find themselves with a competitive edge in their industries.

Buddy Michini, Ph.D.

Buddy Michini is the CTO of Airware. Buddy has a BS, MS and Ph.D. in Aeronautics and Astronautics from the Massachusetts Institute of Technology (MIT). His research has included adaptive control for indoor UAVs, autonomous battery swap for aerial platforms, autopilot design for R&D, and robot learning from demonstration. As CTO, Buddy helps guide the long-term product roadmap to incorporate new developments from industry and academia into the Airware product line.
