Category Archives: Environmental Sensing

Sun2ice: Solar Powered UAV

One of the important use cases for UAVs is surveillance in all its forms. Small, cheap aircraft can cover a lot of area, carry a lot of different sensors, and swoop in to obtain very close-up information. In some cases, a human can directly control the aircraft (as in selfie cams and drone racing), but in many cases the UAV needs to be substantially autonomous.

Furthermore, remote observation generally needs long, slow flights, rather than short, fast ones. Range and flight duration are critical.

Remote sensing by UAVs is ideal for many kinds of environmental research, especially in remote areas such as deserts, oceans, or polar regions. A fleet of (inexpensive) UAVs can multiply the view of a single (very expensive) scientist by orders of magnitude, measuring a broad area, and identifying points of interest for detailed investigation.

This summer a group of researchers from ETH and the AtlantikSolar company demonstrated a UAV that continuously monitored glaciers in Greenland. The Sun2ice UAV is solar powered, so it charges its batteries as long as the sun is shining. In the polar summer there is essentially 24-hour sunlight, so the UAV has power to fly continuously for months, at least in principle. Like other solar-powered aircraft and boats, the AtlantikSolar needs no fuel and should be capable of extremely long missions.
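
To see why the polar summer matters, consider a rough energy budget: perpetual flight is possible only if the energy harvested over 24 hours exceeds the energy spent staying aloft, with the battery riding through any shortfall. The sketch below illustrates the arithmetic with made-up numbers; none of these figures are AtlantikSolar's actual specifications.

```python
# Illustrative (not AtlantikSolar's actual figures): a back-of-envelope
# energy budget for a small solar UAV under continuous polar daylight.
# All parameters below are assumed values for the sake of the example.

wing_area_m2 = 1.4          # solar-cell-covered wing area (assumed)
panel_efficiency = 0.20     # cell efficiency (assumed)
avg_irradiance_w_m2 = 250   # average low-angle polar sun over 24 h (assumed)
level_flight_power_w = 40   # electrical power needed to stay aloft (assumed)
battery_capacity_wh = 350   # onboard battery (assumed)

solar_input_w = wing_area_m2 * panel_efficiency * avg_irradiance_w_m2
surplus_w = solar_input_w - level_flight_power_w

print(f"Average solar input: {solar_input_w:.0f} W")
print(f"Surplus (charging) power: {surplus_w:.0f} W")

# Perpetual flight is plausible only if the 24 h energy collected exceeds
# the 24 h energy spent; the battery just covers cloudy or windy spells.
daily_margin_wh = surplus_w * 24
hours_of_reserve = battery_capacity_wh / level_flight_power_w
print(f"Daily energy margin: {daily_margin_wh:.0f} Wh")
print(f"Battery-only endurance: {hours_of_reserve:.1f} h")
```

At lower latitudes the sun sets, so the battery must carry the aircraft through the whole night, which is a much harder design constraint; continuous polar daylight sidesteps that problem.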

Of course, flying over Greenland is difficult for any aircraft, and flying a small UAV continuously over remote and rugged glaciers is very challenging. The aircraft must deal with high winds and cold temperatures, even in good weather. With no pilot on board, the control systems must be highly automated.

The UAV must navigate over uninhabited territory, far from the humans back at base. It has to stay on station to collect data continuously, with little help from people. Magnetic compasses are unreliable over Greenland, and continuous daylight means that celestial navigation is not possible either.

The researchers also had to deal with takeoff and landing at a remote field station. The video shows the UAV being delivered to its launch point via dogsled—Pleistocene technology deploying twenty-first-century technology. The test flights were successful, though flying time was less than a full day.

Flying an experimental solar-powered UAV such as AtlantikSolar in Arctic conditions is very challenging due to the narrow sun angle, extreme climatic conditions, the weakness of the magnetic field used for the compass, and the absence of smooth grass-covered terrain to land a fragile airplane.

This technology is ideal for intense observation of glaciers and other natural phenomena. The UAV flies low enough to obtain high-resolution images, and if it can stay on station, it can provide updated data every hour or less. The UAV is cheaper than a satellite, and cheaper even than a piloted aircraft. It would be possible to deploy a fleet of UAVs to monitor a glacier or volcano in great detail for substantial periods.

Cool.


  1. Philipp Oettershagen, Amir Melzer, Thomas Mantel, Konrad Rudin, Thomas Stastny, Bartosz Wawrzacz, Timo Hinzmann, Stefan Leutenegger, Kostas Alexis, and Roland Siegwart, Design of small hand-launched solar-powered UAVs: From concept study to a multi-day world endurance record flight. Journal of Field Robotics, 34 (7):1352-1377, 2017. http://dx.doi.org/10.1002/rob.21717

 

Robot Wednesday

Thirty Years of Space Archaeology

Over the sixty years of the Space Age, remote sensing from the air and space has developed into an amazing tool. Originally driven by military necessity, remote sensing from space has revolutionized Earth Science as well as planetary science in the whole solar system. There simply would be no arguments about climate change if not for terabytes of satellite data clearly and irrefutably showing world wide trends.

Airborne and satellite measurements have also begun to revolutionize archaeology. Remote sensing can see through jungle and sand, and cover thousands of kilometers with centimeter resolution. It not only helps find where to dig, it gives an essential bigger picture for understanding what has been dug up.

For many modern archaeologists, remote sensing tools have become as valuable as carbon dating.

This summer Pola Lem discussed the history of space archaeology [1]. Beginning with declassified images from spy satellites, then imagery from the US Space Shuttle, and now an ever-growing fleet of Earth-observing satellites from many nations, archaeologists can explore areas of interest “from their desk” before rolling the dice on an expensive, dangerous, and time-consuming excavation.

Lem recounts the 1985 observations of the Omani desert from the Space Shuttle. Guided by historical guesswork, the radar imagery provided evidence that helped locate the ancient oasis and city of Ubar. This is among the first recorded instances of space imagery being used specifically for archaeology.

Okay, let me get this straight: You want to use my spaceship to find your lost city?

Even more important is the development of LIDAR (light detection and ranging), deployed on aircraft and nowadays on UAVs. Lidar can generate extremely precise elevation maps which reveal buried structures or ancient landscapes. (Lidar is also one of the key technologies for intelligent and self-driving vehicles.) Lidar is especially useful for seeing through jungle foliage, and has led to the discovery of vast new evidence about pre-Columbian Mesoamerica.

In a rather karmic cycle, archaeology is threatened by the very data it thrives upon. Space archaeology was born from the public release of declassified military secrets, and now many archaeologists try to keep their satellite imagery secret to protect sites from looters. (This is unlikely to work for long—it is easy to get remote sensing data.) Archaeologists now seek to use remote sensing to protect ancient sites from tourism and looting.


  1. Pola Lem, Peering through the Sands of Time: Searching for the Origins of Space Archaeology, in The Earth Observatory – Features. 2017, NASA. https://earthobservatory.nasa.gov/Features/SpaceArchaeology/

 

Space Saturday

Citizen Science: NoiseCapture App

Contemporary digital technology offers many opportunities for collecting scientific data. Millions of people are carrying highly capable networked computers (mobile phones), with cameras, microphones, and motion sensors. Most personal devices have capabilities that were available in only a few laboratories twenty years ago.

Furthermore, these devices are in the hands of “civilians”. It is now possible to do “citizen science” for real, using personal devices to collect data and aggregate it through network services.

This has been used for environmental sensing (microbe assays, weather, air pollution, particulates, odors), earthquake detection, food quality, detecting poachers, and wildlife observations (pollinators, bird watching, bird song, insect song).

As I have remarked before, simply collecting data is not actually that useful scientifically. It also invites misguided pseudoscience, if the data is not carefully analyzed or is misinterpreted.

What is needed is the rest of the picture, including data cleaning, careful models and analysis, and useful, valid visualizations and reports. You know, the “science” part.

This summer, a team from several French research institutions is releasing the NoiseCapture app, which allows anyone to “measure and share the noise environnement [sic]”.

Specifically, this app measures noise in a city as the user moves through ordinary activities. The microphone records the sounds, and GPS tracks the location of the device. (There are plenty of tricky details; see their papers [1, 2].)
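
To give a flavor of the measurement step, here is a minimal sketch of computing an (unweighted) equivalent sound level from a block of microphone samples. The calibration offset that maps the phone's arbitrary digital scale onto real sound pressure levels is device specific and is one of the tricky details the papers address; this sketch is illustrative, not the NoiseCapture implementation, and it omits the A-weighting a real sound level meter applies.

```python
import numpy as np

def leq_db(samples, calibration_offset_db=0.0):
    """Equivalent sound level (dB) of a block of microphone samples.

    `samples` are raw PCM values scaled to [-1, 1]; the calibration
    offset maps the phone's digital scale onto dB SPL and must be
    measured per device (the hard part in practice).
    """
    rms = np.sqrt(np.mean(np.square(samples)))
    return 20.0 * np.log10(max(rms, 1e-12)) + calibration_offset_db

# Example: one second of a 1 kHz tone at modest amplitude, 44.1 kHz sampling.
t = np.arange(44100) / 44100.0
tone = 0.1 * np.sin(2 * np.pi * 1000 * t)
print(f"Uncalibrated level: {leq_db(tone):.1f} dB re full scale")
```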

The collected data is transmitted to the project’s server, where it is analyzed and cross-calibrated with other data. Any given measurement isn’t terribly meaningful, but many data points from many phones combine to create a valid estimate of a noise event. They incorporate these data into a spatial model of the city, which creates an estimate of noise exposure throughout the area [1].
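
Combining readings is not just averaging the dB numbers: decibels are logarithmic, so levels have to be converted back to acoustic energy, averaged, and converted back. A toy sketch of that aggregation, binning readings into coarse map cells, might look like the following (the grid and the values are invented, and the project's actual OnoM@p pipeline is considerably more sophisticated).

```python
import numpy as np
from collections import defaultdict

def energy_mean_db(levels_db):
    """Combine several dB readings by averaging their acoustic energy."""
    energies = 10.0 ** (np.asarray(levels_db) / 10.0)
    return 10.0 * np.log10(energies.mean())

# Toy aggregation: readings from many phones binned into ~100 m grid cells
# (rounding lat/lon to 3 decimal places); all readings below are invented.
readings = [
    (47.218, -1.553, 68.0), (47.218, -1.553, 71.0), (47.219, -1.554, 55.0),
]
cells = defaultdict(list)
for lat, lon, level in readings:
    cells[(round(lat, 3), round(lon, 3))].append(level)

for cell, levels in cells.items():
    print(cell, f"{energy_mean_db(levels):.1f} dB from {len(levels)} readings")
```

Energy averaging is also why a single very loud reading can dominate many quiet ones in the same cell.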

It is very important to note that estimating noise exposure from a mobile phone microphone is pretty complicated (see the papers). Crowdsourcing the data collection is vital, but the actual “science” part of the “citizen science” is done by experts.

I’m pleased to see that the researchers have done some careful development to make the “citizen” part work well. The system is designed to record readings along a path as you walk. The app gives visual indications of the readings and the rated hazard level that is being observed. The data is plotted on interactive digital maps so that many such paths can be seen for each city. The project also suggests organizing a “NoiseCapture Party” in a neighborhood, to gather a lot of data at the same time.

Overall, this is a well thought out, nicely implemented system, with a lot of attention to making the data collection easy for ordinary people, and making high quality results available to the public and policy makers.


This research is primarily motivated by a desire to implement noise control policies, which are written with detailed technical standards. Much of the work has been aimed at showing that this crowdsourced, consumer-device approach can collect data that meets those technical standards.

That said, it should be noted that technical noise standards are not the same thing as the subjective comfort or nuisance value of an environment. One person’s dance party is another person’s aural torture. A moderately loud conversation might be unnoticed on a loud Saturday night, but the same chat might be very annoying on the following quiet Sunday morning.

I also have to say that I was a little disappointed that the “environment” in question is the urban streetscape. For instance, the app is not useful for indoor noise (where we spend a lot of time).

Also, I would love to have something like this to monitor the natural soundscape in town and country. When the machines and people aren’t making so much noise, there is still plenty to hear, and I would love to be able to chart that. These voices reveal the health of the wildlife, and it would be really cool to have a phone app for that.

This is what the "dawn chorus" folks are doing, but they don't have nearly such nice data analysis (and non-Brits can't get the app).

Finally, I’ll note that simply detecting and recording noise is only a first step.  In the event that the neighborhood is plagued by serious noise pollution, you’re going to need more than a mobile phone app to do something about it. You are going to need responsive and effective local and regional government.  There isn’t an app for that.


  1. Erwan Bocher, Gwendall Petit, Nicolas Fortin, Judicaël Picaut, Gwenaël Guillaume, and Sylvain Palominos, OnoM@p: a Spatial Data Infrastructure dedicated to noise monitoring based on volunteers measurements. PeerJ Preprints, 4:e2273v2, 2016. https://doi.org/10.7287/peerj.preprints.2273v2
  2. Gwenaël Guillaume, Arnaud Can, Gwendall Petit, Nicolas Fortin, Sylvain Palominos, Benoit Gauvreau, Erwan Bocher, and Judicaël Picaut, Noise mapping based on participative measurements, in Noise Mapping. 2016. https://www.degruyter.com/view/j/noise.2016.3.issue-1/noise-2016-0011/noise-2016-0011.xml

 

Disposable Sensor Drones

Today, the Internet of Things is in its early “steampunk” stage, basically adding mobile phone technology to toasters and refrigerators. The real IOT will be closer to the vision of Smart Dust described a quarter of a century ago—massive numbers of very tiny sensors, networked together. No, we really don’t know how to build that, yet, but we’re working on it.

For the past few years, the US Naval Research Lab has been working on disposable drones, which are beginning to become more like smart dust. Shrinking robot aircraft down, they are creating a sort of ‘guided dust’. OK, the dust motes are pretty chunky still, but it’s the steam era.  They’ll get smaller.

The CICADA project (Close-in Covert Autonomous Disposable Aircraft) has done a lot of prototypes, and they are showing their Mark 5 this year.

The robot glider is 3D printed and built up from already worked-out technologies; sensors, radios, autopilot, and guidance systems are dropped in. The design is "stackable", and meant to be dropped in batches from an aircraft. Each glider steers toward a specific target, and beams back its data when it lands. The whole thing is cheap enough to be considered disposable (at least by the Pentagon).
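
The guidance problem for such a glider is conceptually simple: know where you are (GPS), know where the target is, and keep turning toward it. The sketch below shows that bare-bones logic; it is purely illustrative and is not NRL's actual autopilot.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def heading_command(current_heading, target_bearing, gain=0.5, max_turn=20.0):
    """Proportional turn command (degrees) toward the target bearing."""
    error = (target_bearing - current_heading + 180.0) % 360.0 - 180.0
    return max(-max_turn, min(max_turn, gain * error))

# Example with invented coordinates: glider heading due north, target to the southeast.
tb = bearing_deg(38.99, -76.48, 38.97, -76.45)
print(f"bearing to target: {tb:.1f} deg, turn command: {heading_command(0.0, tb):.1f} deg")
```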

With different sensors, there are many obvious things that these could do. Their potential is captured nicely by the idea of dropping a batch into a storm to take a bunch of readings from inside. For meteorology, these are sort of like sounding balloons, except they fall instead of float.

“Right now, [CICADAs] would be ready to go drop into a hurricane or tornado,” he said. “I really would love to fly an airplane over, and each of these could sample in the tornado. That’s ready now. We’d just need a ride. And [FAA] approval.” (Quoting NRL’s Dan Edwards)

I’m pretty sure that the military will find less benign uses for this concept, though there are already plenty of guided weapons, so this isn’t anything new.

The prototype is said to run about $250, which is cheap for the Navy, but seems high to me. I’m not seeing anywhere near that much gear in these little birds, and most, if not all, of it can be done with open source. I would expect that hobbyists could probably replicate this idea in a maker space for a whole lot less per unit. Couple it with inexpensive quadcopters to lift them, and I could see a huge potential for citizen science.

As a software guy, I have to wonder what the data system looks like. Whatever the Navy has done, I’m pretty sure that hobbyists or science students can whip up a pretty nice dashboard to grab, analyze, and visualize the sensor traces.
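
As a rough sketch of what that could look like (with entirely synthetic data standing in for the real downlink, whose format I don't know), a few lines of Python with pandas and matplotlib already give a usable profile plot per glider:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical telemetry layout (assumed, not the NRL downlink format):
# one row per second with altitude and a sensor reading.
def fake_trace(glider_id, n=120, seed=0):
    """Stand-in for reading one glider's CSV downlink; generates toy data."""
    rng = np.random.default_rng(seed)
    alt = np.linspace(3000, 0, n)                            # simple descent profile
    return pd.DataFrame({
        "glider": glider_id,
        "time_s": np.arange(n),
        "alt_m": alt,
        "temp_c": 15 - 0.0065 * alt + rng.normal(0, 0.3, n),  # lapse rate + noise
    })

traces = pd.concat([fake_trace("cicada_01", seed=1),
                    fake_trace("cicada_02", seed=2)], ignore_index=True)

# Minimal "dashboard": one temperature-vs-altitude profile per glider.
fig, ax = plt.subplots()
for glider, df in traces.groupby("glider"):
    ax.plot(df["temp_c"], df["alt_m"], label=glider)
ax.set_xlabel("temperature (°C)")
ax.set_ylabel("altitude (m)")
ax.legend()
fig.savefig("profiles.png")
```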


  1. Evan Ackerman, Naval Research Lab Tests Swarm of Stackable CICADA Microdrones, in IEEE Spectrum – Automation. 2017. http://spectrum.ieee.org/automaton/robotics/drones/naval-research-lab-tests-swarm-of-stackable-cicada-microdrones
  2. US Naval Research Laboratory. CICADA: Close-in Covert Autonomous Disposable Aircraft. 2017, https://www.nrl.navy.mil/tewd/organization/5710/5712/research/CICADA.

 

 

Robot Wednesday

Penguin Feathers Tell All

One of the important questions for field biology is to document and understand the movements of animals, which reveal many aspects of behavior, including nesting, mating, what they eat, and what eats them. But it isn’t at all easy to track animals in the wild.

For centuries, this difficult problem was tackled through personal observations and with tags. The former is possible only in some fortunate circumstances, and the latter requires capture, release, and recapture, which is difficult, expensive, and lossy. But 21st century technology is now available (and cheap enough) for field biologists to use.

In recent years, electronic location tags have become small and cheap, opening a new age of animal tracking. With a small radio tag attached, almost any animal can be tracked, on land, sea, or air. This still requires capture and release or at least touching the animal to tag it. And tags are cheap but not free.

Another cool advance is the use of chemical analysis of tissue to infer the travels and history of an animal. These techniques have advanced to the point that one discarded feather can speak volumes—without harming the animal.

This month Michael J. Polito and colleagues report on some successful experiments tracking Penguins through this method [2]. The study tagged Penguins with location-tracking tags and, when the birds were recaptured, took one tail feather.

The chemical analysis of the feathers detected carbon isotopes, which differ in different regions of the ocean because those regions have different plankton and fish to eat. The study showed that this method was as accurate as the location tag in identifying which waters were visited by each bird that winter.
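
The underlying idea is a classification problem: compare a feather's isotope signature to reference signatures from candidate foraging regions and assign the bird to the closest match. The sketch below shows a simple nearest-centroid version of that idea with invented numbers; the actual study uses compound-specific stable isotope analysis of amino acids and more careful statistics.

```python
import numpy as np

# Invented reference signatures (delta-13C, delta-15N, in per mil) for two
# hypothetical wintering regions; the real study derives reference values
# from known-origin samples, not from numbers like these.
region_means = {
    "Region A": np.array([-24.0, 8.5]),
    "Region B": np.array([-21.0, 10.5]),
}

def assign_region(feather_signature):
    """Assign a feather to the candidate region with the closest signature."""
    distances = {name: np.linalg.norm(np.asarray(feather_signature) - mean)
                 for name, mean in region_means.items()}
    return min(distances, key=distances.get)

print(assign_region([-23.5, 8.9]))   # -> "Region A" under these made-up values
```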

Cool!

This means that catching a sample of Penguins once (rather than twice) and plucking one feather (rather than attaching a tracker) can reveal where they fed during the dark winter.


  1. Sarah Gabbott, Penguin feathers record migration route, in BBC News -Science & Environment. 2017. http://www.bbc.com/news/science-environment-40868224
  2. Michael J. Polito, Jefferson T. Hinke, Tom Hart, Mercedes Santos, Leah A. Houghton, and Simon R. Thorrold, Stable isotope analyses of feather amino acids identify penguin migration strategies at ocean basin scales. Biology Letters, 13 (8) 2017. http://rsbl.royalsocietypublishing.org/content/13/8/20170241.abstract

US NSF Funds Antarctic Science Drones

All around the world, Unoccupied Aircraft Systems (AKA drones) are becoming useful scientific instruments. With the technological and economic push-pull of military and consumer demand, drones are becoming ubiquitous and cheap. Cheap enough for poverty-stricken scientists to use.

Small drones have many advantages besides cost. They can carry cameras and other instruments to extend the view of science teams by many kilometers. They fly low, and can, indeed, touch down if needed.   With advances in control systems, it is becoming reasonable to operate flocks of them, to cover even more ground.

Many groups around the world are booting up this technology (e.g., reports by the US Marine Mammal Commission [2] and a coalition in New Zealand [1]).

This week the US National Science Foundation announced funding of the Drones in Marine Science and Conservation lab at Duke University, which is specifically aimed at monitoring animals in Antarctica.

The advantages are obvious. Antarctica is huge, far away, and hard to get to. Satellites are blinded by cloud cover, and limited in resolution. Aircraft can only operate a few days per year, and are awfully expensive. Drones offer the advantages of aerial surveying at a reasonable cost.

As the video makes clear, the basic use is similar to civilian and military scouting, with the advantage that the penguins will neither shoot nor sue.  🙂

These drones are a bit more complicated than the toys under the Christmas tree, because they are equipped with a variety of instruments, potentially including radar, lidar, multispectral cameras, and chemical samplers. As the NSF article points out, they “can even be used to sample breath from individual whales”.

The thrust of the NSF funding is to pull together all the rest of the picture, namely data analysis, visualization, and archiving the data. The project also contemplates training and other assistance to help future projects that want to employ drones.

This is pretty neat.


  1. Lorenzo Fiori, Ashray Doshi, Emmanuelle Martinez, Mark B. Orams, and Barbara Bollard-Breen, The Use of Unmanned Aerial Systems in Marine Mammal Research. Remote Sensing, 9 (6) 2017. http://www.mdpi.com/2072-4292/9/6/543
  2. Marine Mammal Commission, Development and Use of UASs by the National Marine Fisheries Service for Surveying Marine Mammals. Bethesda, 2016. https://www.mmc.gov/wp-content/uploads/UASReport.pdf

 

Robot Wednesday

Remote Sensing Penguin Guano

There is so much we don’t know about the Earth and the biosphere. Even for relatively big and easy to see species such as birds, it is hard to know how and where they live, or even how many individuals exist. There are only so many biologists, and humans can only go and see so much.

Remote sensing of the planet from space has given important insights about large-scale processes that can’t be seen easily from a human perspective. For instance, a few images from space make absolutely clear how important dust storms in Africa are for the Amazon forests in South America.

In the past, it has been difficult to learn much about animal populations, because individuals are small and elusive. Biologists are getting better at detecting and tracking animals, especially mass movements of them.

This month NASA calls attention to a successful long term project that uses satellite imagery to locate colonies of Penguins [3]. Penguins are, of course, far too small to be reliably detected from most satellite imagery. However, Penguins live in colonies, and produce immense amounts of guano, which can be seen from space.

In fact, Penguin colonies could be seen from space 30 years ago [2], and space imagery and analysis have gotten a lot better since then.

The basic technique is to detect the color of guano-covered rocks, and to infer how many Penguins live there from the area covered. Cross checking on the ground has confirmed that this indirect and remote measure is a pretty good estimate of how many Penguins there are and where they nest.

As the researchers note, Penguins depend on sea ice, which means that they are a sensitive indicator of how ice conditions change. As sea ice melts in parts of Antarctica, we can document how Penguins relocate in response. Penguins also eat krill and fish, so they are a visible indicator of the health of these food stocks in an area.

Mathew Schwaller, Heather Lynch, and colleagues have completed a global census of Adélie Penguins using imagery from several satellites [1]. They use machine learning techniques to identify the visual signature of nesting areas. Based on the very characteristic nesting habits of Adélies, it is possible to estimate the number of Penguins from the area covered. Naturally, the satellite data is combined with on-site investigations and other reports, in order to validate the remote sensing and the estimation.
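
In outline, the pipeline is: label guano pixels in the imagery, sum the classified area, and convert area to an abundance estimate using the nesting density of Adélies. The sketch below illustrates that chain with a crude spectral threshold standing in for the paper's trained classifier; the reflectance values and the conversion parameters are all invented, and this is not the authors' algorithm.

```python
import numpy as np

# Toy stand-in for a multispectral scene: each pixel is (red, green, blue)
# surface reflectance. Values are invented; guano-stained rock tends to be
# redder/browner than clean rock or snow, which motivates the crude rule below.
scene = np.array([
    [0.44, 0.34, 0.25],   # guano-like
    [0.18, 0.18, 0.17],   # bare rock
    [0.85, 0.86, 0.88],   # snow
    [0.41, 0.32, 0.24],   # guano-like
])

# Crude spectral rule standing in for a trained classifier:
# call a pixel "guano" if red exceeds blue by a margin (threshold assumed).
guano_mask = (scene[:, 0] - scene[:, 2]) > 0.1
guano_pixels = int(guano_mask.sum())
print(f"{guano_pixels} of {len(scene)} pixels classified as guano")

def estimate_pairs(num_guano_pixels, pixel_area_m2=0.25, nests_per_m2=0.5):
    """Convert classified guano area into a rough breeding-pair estimate.

    Both default parameters are assumptions for illustration, not values
    from the published census.
    """
    return num_guano_pixels * pixel_area_m2 * nests_per_m2

# e.g. a colony footprint of 40,000 classified guano pixels:
print(f"~{estimate_pairs(40_000):.0f} breeding pairs")
```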

[Figure 1 from [1]: map of extant Adélie Penguin colonies and colonies presumed extinct, with coastline sections marked where populations are generally increasing (solid bars) or decreasing (dashed bars); the inset shows high-resolution imagery of Devil Island, with areas classified as guano shaded light green. Imagery © 2014 DigitalGlobe, Inc.]

One huge advantage of the satellite data is that there is continued coverage of the whole world, so it is possible to track changes in Penguin populations. For instance, the 2014 report indicates that over the last twenty-some years, nesting sites in West Antarctica have dwindled. This is where sea ice is shrinking. In the same period, new nesting sites have appeared in East Antarctica, where sea ice has increased. Overall, the total population of Adélies seems to have increased in recent years, even as the birds have migrated to more favorable ice.

Ideally, the census can be maintained for a number of years to accumulate a much more detailed baseline, to improve the technique, and refine the understanding of the Penguin population. This census is only one species, so it remains to be seen how similar techniques might track other species.


  1. Heather J. Lynch and M. A. LaRue, First global census of the Adélie Penguin. The Auk, 131 (4):457-466, 2014/10/01 2014. https://doi.org/10.1642/AUK-14-31.1
  2. Heather J. Lynch and Mathew R. Schwaller, Mapping the Abundance and Distribution of Adélie Penguins Using Landsat-7: First Steps towards an Integrated Multi-Sensor Pipeline for Tracking Populations at the Continental Scale. PLOS ONE, 9 (11):e113301, 2014. https://doi.org/10.1371/journal.pone.0113301
  3. Adam Voiland, Penguin Droppings Are Fertile Ground for Science: Image of the Day. NASA Earth Observatory, 2017. https://earthobservatory.nasa.gov/IOTD/view.php?id=90372

PS.  Wouldn’t “Penguin Guano” be a good name for a band? How about ‘Adelie Census’?

 

 

Space Saturday