Category Archives: UAVs

Drones Counting Ducks Down Under

One of the oldest citizen science projects is bird watching.  For more than a century, enthusiastic birders have amassed vast datasets of avian sightings.  To date, technology has enhanced but not displaced this proud nerd army. Photography, GPS, and databases have greatly improved the data from birders, but nothing has replaced boots on the ground.


This month, a research project at the University of Adelaide reported a demonstration of a UAV-mounted imaging system that, for once, beats human birders [1].

Specifically, the study compared the accuracy of human observers against a small survey quadcopter on the task of counting birds in a nesting colony.  In order to have a known ground truth, the tests used artificial colonies populated by hundreds of simulated birds: repurposed decoys laid out to mimic actual nesting sites.

They dubbed it “#EpicDuckChallenge”, though it doesn’t seem especially “epic” to me.

The paper compares the accuracy of human counters on the ground, human counts from the aerial imagery, and computer analysis of the aerial imagery.

First of all, the results show a pretty high error rate for the human observers, even for the experienced ecologists in the study. Worse, the errors are widely scattered, which suggests that estimates of population change over time will be unreliable.

The study found that counting from the UAV's aerial photos is much, much more accurate than counting by humans on the ground. The UAV imagery has the advantage of being overhead (rather than at human eye level), and it also holds still for analysis.

However, counting birds in an image is still tedious and error-prone.  The study shows that machine learning can tie or beat humans counting from the same images.

Together, the combination of low-cost aerial images and effective image processing algorithms gave very accurate results, with low variability. This means that this technique would be ideal for monitoring populations over time, because repeated flyovers would be reliably counted.
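To give a concrete sense of what the image-analysis step involves, here is a minimal, hypothetical sketch in Python: it counts bright, bird-sized blobs in an overhead photo by thresholding and connected-component labeling. The study used a trained detector rather than this naive pipeline, and the file name, brightness assumption, and size threshold here are all mine.

```python
# Hypothetical sketch: count bright, bird-sized blobs in an overhead photo.
# Not the published method (which trained a detector); just an illustration.
from skimage import io, color, filters, measure, morphology

def count_birds(image_path, min_area=50):
    """Return an approximate count of bright, bird-sized blobs."""
    img = io.imread(image_path)
    gray = color.rgb2gray(img)                   # work on intensity only
    mask = gray > filters.threshold_otsu(gray)   # assume birds are brighter than the ground
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    labels = measure.label(mask)                 # connected components = candidate birds
    return labels.max()                          # highest label = number of blobs

# print(count_birds("colony_orthophoto.png"))    # hypothetical file name
```

Even this crude approach is repeatable: run it twice on the same image and it gives the same answer, which is exactly the property that makes repeated flyovers comparable.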


This study has its limitations, of course.

For one thing, the specific task used is pretty much the best possible case for such an aerial census.  Unrealistically ideal, if you ask me.

Aside from the perfect observing conditions, the colony is easily visible (on an open, flat, uniform surface), and the ‘birds’ are completely static.  In addition, the population is uniform (only one species), and the targets are not camouflaged in any way.

How many real-world situations are this favorable?  (Imagine using a UAV in a forest, at night, or along a craggy cliff.)

To the degree that the situation is less than perfect, the results will suffer.  In many cases, the imagery will be poorer, and the objects to be counted less distinct and recognizable. Also, if there are multiple species, very active birds, or visual clutter such as shrubs, it will be harder to distinguish the individuals to be counted.

For that matter, I’m not sure how easy it will be to acquire training sets for the recognizer software.  This study had a very uniform nesting layout, so it was easy to get a representative subsample to train the algorithm.  But if the nests are sited less uniformly, and mixed with other species and visual noise, it may be difficult to train the algorithm, at least without much larger samples.
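To make the training-data concern concrete, here is a small, hypothetical sketch of fitting a patch classifier ("bird" vs. "background") with scikit-learn. The file names, patch format, and choice of a random forest are all assumptions for illustration; the point is that a uniform colony might need only a few hundred labeled patches, while messier scenes would need far more, and more varied, labels.

```python
# Hypothetical sketch: train a simple "bird vs. background" patch classifier.
# File names and patch format are assumptions; not the study's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

patches = np.load("labeled_patches.npy")   # (n_samples, n_pixels) flattened image patches
labels  = np.load("patch_labels.npy")      # 1 = bird, 0 = background

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, patches, labels, cv=5).mean())   # sanity-check accuracy
```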


Still, this technique is certainly a good idea when it can be made to work.  UAVs are a great “force multiplier” for ecologists, giving each scientist much greater range. Properly designed (by which I mean quiet) UAVs should be pretty unobtrusive, especially compared to human observers.

The same basic infrastructure can be used for many kinds of surface observations, not just bird colonies.  It seems likely that UAV surveying will be a common scientific technique in the next few decades.

The image analysis also has the advantage that it can be repeated and improved.  If the captured images are archived, then it will always be possible to go back with improved analytics and make new assessments from the samples.  In fact, image archives are becoming an important part of the scientific record, and a tool for replication, cross validation, and data reuse.


  1. Jarrod C. Hodgson, Rowan Mott, Shane M. Baylis, Trung T. Pham, Simon Wotherspoon, Adam D. Kilpatrick, Ramesh Raja Segaran, Ian Reid, Aleks Terauds, and Lian Pin Koh, Drones count wildlife more accurately and precisely than humans. Methods in Ecology and Evolution, 2018. http://dx.doi.org/10.1111/2041-210X.12974
  2. University of Adelaide, #EpicDuckChallenge shows we can count on drones, in University of Adelaide – News. 2018. https://www.adelaide.edu.au/news/news98022.html


Robot Blimp For Exploring Hidden Spaces

I noted earlier the discovery of what seems to be a chamber in the Great Pyramid at Giza. The discovery opens the question of how to further explore the hidden space without damaging the ancient structure.  One idea is to drill a small shaft, and push through a tiny robot explorer.

A research group at INRIA and Cairo University is developing a robotic blimp for such a mission.  The deflated blimp can be pushed through a 3 cm shaft, then inflated to reconnoiter the hidden space.  The task requires a very compact, lightweight system that will likely have to operate autonomously.

Evan Ackerman interviewed senior investigator Jean-Baptiste Mouret for IEEE Spectrum [1].  Mouret notes that a blimp is a good choice because it is “pillowy” and less likely to damage the structure.

Mouret describes the challenges imposed by the size and weight limits.  Conventional sensors, including GPS, would be too heavy and power hungry.  They are developing “bioinspired” sensors, based on bees and flies.  These include a miniature optic-flow sensor that can operate in low-light conditions.
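Their sensor hardware is custom, but the underlying idea, estimating ego-motion from the apparent movement of the scene in successive images, can be illustrated with a generic dense optic-flow computation. The sketch below uses OpenCV's Farneback algorithm on two grayscale frames; it is only a stand-in for the bio-inspired low-light sensor described in the interview.

```python
# Generic illustration of optic flow, the kind of signal a motion sensor
# provides. This is NOT the custom low-light sensor described in the interview.
import cv2

def mean_flow(prev_gray, next_gray):
    """Average (dx, dy) pixel motion between two consecutive grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return flow[..., 0].mean(), flow[..., 1].mean()  # crude ego-motion cue
```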

Getting the robot into the space is one thing; making sure that it is retrieved is more difficult.  It is important not to litter the structure with a lost robot, so the robot will need to return to the tiny access hole, dock with the base, and fold up so it can be pulled out.  It will be designed with backup behaviors to search for the dock, even if damaged.

It will be years before any expedition to the Great Pyramid happens. The robot is still being developed and the measurements of the Pyramid are being refined.   The Pyramid is over 4,000 years old, so there is no need for haste.


  1. Evan Ackerman, Robotic Blimp Could Explore Hidden Chambers of Great Pyramid of Giza, in IEEE Spectrum – Automation. 2017. https://spectrum.ieee.org/automaton/robotics/drones/robotic-blimp-could-explore-hidden-chambers-of-great-pyramid


Robot Wednesday

Sun2ice: Solar Powered UAV

One of the important use cases for UAVs is surveillance in all its forms. Small, cheap aircraft can cover a lot of area, carry a lot of different sensors, and swoop in to obtain very close-up information.   In some cases, a human can directly control the aircraft (as in selfie cams and drone racing), but for many cases the UAV needs to be substantially autonomous.

Furthermore, remote observation generally needs long, slow flights, rather than short, fast ones. Range and flight duration are critical.

Remote sensing by UAVs is ideal for many kinds of environmental research, especially in remote areas such as deserts, oceans, or polar regions. A fleet of (inexpensive) UAVs can multiply the view of a single (very expensive) scientist by orders of magnitude, measuring a broad area, and identifying points of interest for detailed investigation.

This summer, a group of researchers from ETH and the AtlantikSolar company demonstrated a UAV that continuously monitored glaciers in Greenland. The Sun2ice project's aircraft is solar powered, so it charges its batteries as long as the sun is shining. In the polar summer there is essentially 24-hour sunlight, so the UAV has power to fly continuously for months, at least in principle. Like other solar-powered aircraft and boats, the AtlantikSolar needs no fuel and should be capable of extremely long missions.
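The arithmetic behind “power to fly continuously” is a simple energy balance: harvested solar power must exceed the power needed for level flight, with the margin going into the batteries. Here is a back-of-the-envelope sketch; all numbers are illustrative assumptions, not AtlantikSolar specifications.

```python
# Back-of-the-envelope solar energy budget. All values are assumptions for
# illustration, not AtlantikSolar specifications.
wing_area_m2     = 1.5    # assumed solar-cell area
solar_irradiance = 400.0  # W/m^2, low polar sun angle (assumption)
cell_efficiency  = 0.20   # assumed photovoltaic efficiency
power_in  = wing_area_m2 * solar_irradiance * cell_efficiency  # ~120 W harvested
power_out = 50.0          # W, assumed level-flight consumption

surplus = power_in - power_out   # ~70 W left to recharge the batteries
print(f"in {power_in:.0f} W, out {power_out:.0f} W, surplus {surplus:.0f} W")
```

A positive surplus during sunlit hours is the whole premise of multi-day endurance; at low polar sun angles that surplus shrinks, which is part of what makes the Greenland flights hard.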

Of course, flying over Greenland is difficult for any aircraft, and flying a small UAV continuously over remote and rugged glaciers is very challenging. The aircraft must deal with high winds and cold temperatures, even in good weather. With no pilot on board, the control systems must be highly automated.

The UAV must navigate over uninhabited territory, far from the humans back at base.  It has to stay on station to collect data continuously, with little help from people. Magnetic compasses are unreliable that far north because the magnetic field is so weak, and continuous daylight means that celestial navigation is not possible either.

The researchers also had to deal with takeoff and landing at a remote field station. The video shows the UAV being delivered to its launch point via dogsled: Pleistocene technology deploying twenty-first-century technology. The test flights were successful, though flying time was less than a full day.

Flying an experimental solar-powered UAV such as AtlantikSolar in Arctic conditions is very challenging due to the narrow sun angle, extreme climatic conditions, the weakness of the magnetic field used for the compass, and the absence of smooth grass-covered terrain to land a fragile airplane.

This technology is ideal for intensive observation of glaciers and other natural phenomena. The UAV flies low enough to obtain high-resolution images, and if it can stay on station, it can provide updated data every hour or less. The UAV is cheaper than a satellite, and even cheaper than a piloted aircraft. It would be possible to deploy a fleet of UAVs to monitor a glacier or volcano in great detail for substantial periods.

Cool.


  1. Philipp Oettershagen, Amir Melzer, Thomas Mantel, Konrad Rudin, Thomas Stastny, Bartosz Wawrzacz, Timo Hinzmann, Stefan Leutenegger, Kostas Alexis, and Roland Siegwart, Design of small hand-launched solar-powered UAVs: From concept study to a multi-day world endurance record flight. Journal of Field Robotics, 34 (7):1352-1377, 2017. http://dx.doi.org/10.1002/rob.21717


Robot Wednesday

Drone Shows

I have noted the cool collaboration between roboticists from ETH and Cirque du Soleil.

This is now a for-hire business, with the tag line, “Drone shows: The magic is real”.

I note that the basic technology is pretty standard stuff; it’s “just quadcopters”. But developing a show or installation involves careful planning for safety, and they also do “costume design” (i.e., dressing up the flyers), choreography (flyers and human-flyer combos), as well as the control systems for the real-time performances.

These theatrical spectacles are probably paving the way for robots in the home and cityscape better than all the engineering studies ever done.  First, the elegant storytelling is enchanting and attractive. I want to dance with these pretty robots.

Second, their choreography is developing a sense and a “grammar” of how humans and UAVs should interact.  Notably, the UAVs have a certain personality that seems appropriately mechanical but still readable and approachable by humans.

I will add one criticism.

Esthetically, their shows are starting to all look the same, and the “gee whiz” factor is wearing off fast.

I’m hoping to see the next thing, something new and different. I didn’t really find that in the 2017 shows. In fact, about half of the 2017 show reel is the exact same material as the 2016 reel.

Perhaps it’s time to open up this technology to more artists.


Robot Wednesday

Collapsable Delivery Drone

I’m not a huge fan of buzzy little quadcopters, nor am I a fan of delivery drones. The former are about as welcome as a cloud of mosquitos, and the latter promises to transfer even more wealth to the 0.001%. (I’m not sure who these drones will be delivering to, when none of us have jobs or money to buy things.)

That said, I was interested to see the “origami-inspired cargo drone” developed by a group at Ecole Polytechnique Fédérale de Lausanne [2]. Their design wraps the copter in a flexible cage, which protects the package and also encloses the dangerous rotors. The cage is foldable, so it closes up to a relatively small package when not in use.

The cage is a nice design. It addresses the safety (and perceived safety) of the drone in a nice way. Rather than depending on complex algorithms to make the drone “safe” and “friendly”, their design makes the drone a soft, beach-ball-like thing: the affordances are obvious and visible. Furthermore, the safety factor is passive. The effectiveness of the enclosure does not depend on either software or humans.

I’m sure that this basic idea can be realized in a lot of geometries. The EPFL design is modular, which means that a variety of cages can be made from the same design. It folds up rather neatly, and, of course, is light and strong.

I could imagine versions of this concept that have a standard coupling to a range of quadcopters. Sort of a “delivery cage” costume for drones. (I smell a new standard for “drone costume attachment” coming.)

Clearly, there is no reason why the cage has to be so bare and undecorated. Why not streamers, glitter, and even LEDs? These might make the drone more appealing, and would also make the drone more visible to cameras, radar, and sonar. (Another standard? Passive safety reflectors for drones?)

I’m still not eager to have my local stores put out of business by Amazon, but if I’m going to have to live with drones, I’d like them to bounce off walls and people, rather than crash into them.


  1. Evan Ackerman, EPFL’s Collapsable Delivery Drone Protects Your Package With an Origami Cage, in IEEE Spectrum — Automation. 2017. https://spectrum.ieee.org/automaton/robotics/drones/epfl-collapsable-delivery-drone-protects-your-package-with-an-origami-cage
  2. Przemyslaw Mariusz Kornatowski, Stefano Mintchev, and Dario Floreano, An origami-inspired cargo drone, in IEEE/RSJ International Conference on Intelligent Robots and Systems. 2017: Vancouver. http://infoscience.epfl.ch/record/230988


Robot Wednesday

Disposable Sensor Drones

Today, the Internet of Things is in its early “steampunk” stage, basically adding mobile phone technology to toasters and refrigerators. The real IoT will be closer to the vision of Smart Dust described two decades ago: massive numbers of very tiny sensors, networked together. No, we really don’t know how to build that yet, but we’re working on it.

For the past few years, the US Naval Research Lab has been working on disposable drones, which are beginning to look more like smart dust. By shrinking robot aircraft down, they are creating a sort of ‘guided dust’. OK, the dust motes are still pretty chunky, but it’s the steam era.  They’ll get smaller.

The CICADA project (Close-in Covert Autonomous Disposable Aircraft) has done a lot of prototypes, and they are showing their Mark 5 this year.

The robot glider is 3D printed and built from already-proven technologies; sensors, radios, and the autopilot and guidance systems are simply dropped in. The design is “stackable”, meant to be dropped in batches from an aircraft. Each glider steers toward a specific target and beams back its data when it lands. The whole thing is cheap enough to be considered disposable (at least by the Pentagon).
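The guidance problem for each glider is essentially “steer the nose toward a fixed drop point.” As a toy illustration (not the NRL autopilot), a proportional heading controller toward a GPS target might look like this:

```python
# Toy illustration of glide-to-target steering; not the NRL guidance system.
import math

def bearing_to(lat, lon, tgt_lat, tgt_lon):
    """Approximate bearing (radians, clockwise from north) to the target."""
    d_north = math.radians(tgt_lat - lat)
    d_east  = math.radians(tgt_lon - lon) * math.cos(math.radians(lat))
    return math.atan2(d_east, d_north)

def rudder_command(heading, lat, lon, tgt_lat, tgt_lon, gain=1.5):
    """Return a steering command proportional to the heading error."""
    error = bearing_to(lat, lon, tgt_lat, tgt_lon) - heading
    error = math.atan2(math.sin(error), math.cos(error))  # wrap to [-pi, pi]
    return gain * error
```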

With different sensors, there are many obvious things that these could do.  Their potential is illustrated nicely by the idea of dropping a batch into a storm to collect readings from inside.  For meteorology, these are sort of like sounding balloons, except they fall instead of float.

“Right now, [CICADAs] would be ready to go drop into a hurricane or tornado,” he said. “I really would love to fly an airplane over, and each of these could sample in the tornado. That’s ready now. We’d just need a ride. And [FAA] approval.” (Quoting NRL’s Dan Edwards)

I’m pretty sure that the military will find less benign uses for this concept, though there are already plenty of guided weapons, so this isn’t anything new.

The prototype is said to run about $250, which is cheap for the Navy, but seems high to me. I’m not seeing anywhere near that much gear in these little birds, and most, if not all, of it can be done with open source components. I would expect that hobbyists could probably replicate this idea in a maker space for a whole lot less per unit. Couple it with inexpensive quadcopters to lift them, and I could see a huge potential for citizen science.

As a software guy, I have to wonder what the data system looks like. Whatever the Navy has done, I’m pretty sure that hobbyists or science students can whip up a pretty nice dashboard to grab, analyze, and visualize the sensor traces.
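As a purely hypothetical sketch, assuming each glider's log lands as a timestamped CSV file, a few lines of pandas and matplotlib get you most of the way to such a dashboard; the file name and column names below are made up.

```python
# Hypothetical ground-station script: load one glider's logged sensor trace
# (CSV assumed) and plot each channel against time. Names are made up.
import pandas as pd
import matplotlib.pyplot as plt

def plot_traces(csv_path, columns=("pressure_hpa", "temperature_c", "humidity_pct")):
    df = pd.read_csv(csv_path, parse_dates=["timestamp"])
    fig, axes = plt.subplots(len(columns), 1, sharex=True, figsize=(8, 6))
    for ax, col in zip(axes, columns):
        ax.plot(df["timestamp"], df[col])   # one panel per sensor channel
        ax.set_ylabel(col)
    axes[-1].set_xlabel("time")
    plt.tight_layout()
    plt.show()

# plot_traces("cicada_042.csv")
```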


  1. Evan Ackerman, Naval Research Lab Tests Swarm of Stackable CICADA Microdrones, in IEEE Spectrum – Automation. 2017. http://spectrum.ieee.org/automaton/robotics/drones/naval-research-lab-tests-swarm-of-stackable-cicada-microdrones
  2. US Naval Research Laboratory. CICADA: Close-in Covert Autonomous Disposable Aircraft. 2017, https://www.nrl.navy.mil/tewd/organization/5710/5712/research/CICADA.


Robot Wednesday

US NSF Funds Antarctic Science Drones

All around the world, Unoccupied Aircraft Systems (AKA drones) are becoming useful scientific instruments. With the technological and economic push-pull of military and consumer demand, drones are becoming ubiquitous and cheap. Cheap enough for poverty-stricken scientists to use.

Small drones have many advantages besides cost. They can carry cameras and other instruments to extend the view of science teams by many kilometers. They fly low, and can, indeed, touch down if needed.   With advances in control systems, it is becoming reasonable to operate flocks of them, to cover even more ground.

Many groups around the world are booting up this technology (e.g., reports by the US Marine Mammal Commission [2] and a coalition in New Zealand [1]).

This week the US National Science Foundation announced funding of the Drones in Marine Science and Conservation lab at Duke University, which is specifically aimed at monitoring animals in Antarctica.

The advantages are obvious. Antarctica is huge, far away, and hard to get to. Satellites are blinded by cloud cover, and limited in resolution. Aircraft can only operate a few days per year, and are awfully expensive. Drones offer the advantages of aerial surveying at a reasonable cost.

As the video makes clear, the basic use is similar to civilian and military scouting, with the advantage that the penguins will neither shoot nor sue.  🙂

These drones are a bit more complicated than the toys under the Christmas tree, because they are equipped with a variety of instruments, potentially including radar, lidar, multispectral cameras, and chemical samplers. As the NSF article points out, they “can even be used to sample breath from individual whales”.

The thrust of the NSF funding is to pull together the rest of the picture, namely data analysis, visualization, and archiving. The project also contemplates training and other assistance to help future projects that want to employ drones.

This is pretty neat.


  1. Lorenzo Fiori, Ashray Doshi, Emmanuelle Martinez, Mark B. Orams, and Barbara Bollard-Breen, The Use of Unmanned Aerial Systems in Marine Mammal Research. Remote Sensing, 9 (6) 2017. http://www.mdpi.com/2072-4292/9/6/543
  2. Marine Mammal Commission, Development and Use of UASs by the National Marine Fisheries Service for Surveying Marine Mammals. Bethesda, 2016. https://www.mmc.gov/wp-content/uploads/UASReport.pdf


Robot Wednesday