Category Archives: Robotics

Bioinspired “spring origami”

Our latter-day Prometheans (is that a word?) heartily boast of creating “programmable matter” and “4D printing”.  This would be crazy if it weren’t true that astonishing, near-magical designs are coming every day.

Many of these developments are inspired by nature and by origami.  As I have said, it is clear that all Engineering and Design students should learn origami as part of the twenty-first-century curriculum.

This spring, researchers at ETH Zurich reported a cool development inspired by the wing of an earwig [1].  This is especially interesting because the biological system actually works better than conventional origami.

The wing of the Dermaptera has an extremely large range of motion, from compactly folded to fully open in flight. It also deploys without muscular action (i.e., it unfolds on its own), yet snaps into a strong, rigid form for flight. Their analysis shows that “current origami models are not sufficient to describe its exceptional functionality” ([1], p. 1387).

They conclude that the key feature is that, unlike “strict” origami, the earwig wing is not folded along straight, rigid lines.  Instead, the folds are curved and consist of an elastic, springy biopolymer.  The biopolymer behaves as a system of extensional and rotational springs.

Not origami, but origami plus (biological) clockwork!

The researchers explain that this bioinspired analysis opens a broad design space for “spring origami”, which exceeds the capabilities of traditional origami. The paper has the technical details, which, among other things, involve complex energy landscapes across multiple springs that yield bistable regimes (i.e., snap-through).
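
To make this concrete, here is a toy calculation of my own (the stiffnesses and geometry are invented, not taken from the paper) showing how a rotational spring plus an extensional spring along a crease can produce exactly this kind of bistable, snap-through energy curve:

```python
# Toy model of a bistable "spring origami" crease (illustration only; the
# parameter values are invented, not taken from the paper).  The crease is a
# weak rotational spring that prefers the folded state, plus a stiff
# extensional spring in the crease material that is most stretched mid-fold.
# Their sum has two local minima (near-open and folded) separated by a
# snap-through barrier.
import numpy as np

k_rot, theta_rest = 0.2, np.pi    # rotational spring, rest angle at the folded state
k_ext, L0, a = 100.0, 1.0, 0.2    # extensional spring; crease stretches ~20% mid-fold

theta = np.linspace(0.0, np.pi, 1001)   # fold angle: 0 = open, pi = folded
stretch = a * L0 * np.sin(theta)        # extra length of the crease material
E = 0.5 * k_rot * (theta - theta_rest) ** 2 + 0.5 * k_ext * stretch ** 2

def local_minima(energy):
    """Indices of local minima, endpoints included."""
    found = []
    for i in range(len(energy)):
        left_ok = i == 0 or energy[i] <= energy[i - 1]
        right_ok = i == len(energy) - 1 or energy[i] <= energy[i + 1]
        if left_ok and right_ok:
            found.append(i)
    return found

for i in local_minima(E):
    print(f"stable state near {np.degrees(theta[i]):6.1f} deg, energy {E[i]:.2f}")
print(f"mid-fold energy hump: {E.max():.2f} (the barrier that makes the fold snap)")
```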

This analysis makes possible the design and fabrication of many different low-energy folding systems.

“We transferred the biological design principles extracted from the earwig wing into a functional synthetic folding system that can be directly manufactured by 4D printing” ([1], p. 1390).

“Our ability to tune the energy barrier between bistable states using simple geometrical and material properties […] enables the design and fabrication of spring origami structures that can undergo fast morphing, triggered by an environmental stimulus.”

The researchers see potential for many applications, including antennas and solar arrays for spacecraft, architecture, robots, or packaging.

I’m seeing a fancy new version of the umbrella: a lighter, stronger, and simpler design.


  1. Jakob A. Faber, Andres F. Arrieta, and André R. Studart, Bioinspired spring origami. Science, 359 (6382):1386, 2018. http://science.sciencemag.org/content/359/6382/1386.abstract
  2. Peter Rüegg, Earwigs and the art of origami, in ETH News. 2018. https://www.ethz.ch/en/news-and-events/eth-news/news/2018/03/earwigs-and-the-art-of-origami.html


Robot Origami Wednesday

Robot Concepts: Legs Plus Lift

Lunacity seems to be lunacy, or at least fantasy. “Personal jetpacks” are at the edge of possibility, requiring impractically huge amounts of power to lift a person (and, once aloft, being nearly impossible to control).  But that doesn’t mean that moderate-sized personal jetpacks have no possible use.

Two recent projects illustrate how copter tech can be combined with articulated bodies to create interesting hybrid robots.

One interesting concept is to add ducted fans to the feet of a bipedal (or multi-legged) robot [1].  The lift is used to aid the robot when it needs to stretch for a long step over a gap.  The video makes the idea pretty clear: one foot is anchored, and the other uses the thrust to keep balanced while stepping over the void.

This is the “Lunacity” idea applied to each foot independently, and it is plausible (if noisy and annoying).  There isn’t much hope of lifting the whole robot, but the thrusters probably can add useful “weightlessness” to parts of it.  In this case it is the feet, but the same idea might add lifting power to arms or sensor stalks.
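
A quick back-of-the-envelope sketch (with made-up masses and lengths, not the actual robot’s specifications) shows why lifting just the swing leg is so much more plausible than lifting the whole machine:

```python
# Rough numbers for the "thruster feet" idea (illustrative only; all of the
# masses, lengths, and thrust values below are assumptions, not specs from the
# robot in [1]).  While the swing leg is stretched across a gap, its weight
# produces a tipping torque about the stance foot; a downward-blowing ducted
# fan at the swing foot only has to cancel that torque, not lift the robot.
g = 9.81            # m/s^2

leg_mass = 3.0      # kg, mass of the extended swing leg (assumed)
leg_com = 0.4       # m, horizontal distance of the leg's center of mass from the stance foot (assumed)
foot_reach = 0.8    # m, horizontal distance of the swing foot from the stance foot (assumed)
robot_mass = 15.0   # kg, whole robot (assumed)

tipping_torque = leg_mass * g * leg_com        # N*m about the stance foot
thrust_at_foot = tipping_torque / foot_reach   # N of fan thrust that cancels it

print(f"tipping torque from the extended leg: {tipping_torque:.1f} N*m")
print(f"fan thrust needed at the swing foot:  {thrust_at_foot:.1f} N")
print(f"thrust needed to hover the whole robot: {robot_mass * g:.1f} N")
```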


A second project sort of goes the other way: adding a lightweight, foldable “origami” arm to a flying UAV [2].  The idea is to have a compact arm that extends the capabilities of the flyer within the weight and space limits of a small aircraft.  The design unfolds and folds with only a single motor.  Origami is so cool!

Instead of adding lifters to the robot, a robot arm is added to the flyer, making a hybrid flying grasper.  I see no reason why there couldn’t be two arms, or why the arms couldn’t be legs, or some other combination.


I look forward to even more creative hybridization, combining controllable rigid structures with lifting bodies in transformer-like multimode robots.


  1. Evan Ackerman, Bipedal Robot Uses Jet-Powered Feet to Step Over Large Gaps, in IEEE Spectrum – Robotics. 2018. https://spectrum.ieee.org/automaton/robotics/humanoids/bipedal-robot-uses-jetpowered-feet-to-step-over-large-gaps
  2. Suk-Jun Kim, Dae-Young Lee, Gwang-Pil Jung, and Kyu-Jin Cho, An origami-inspired, self-locking robotic arm that can be folded flat. Science Robotics, 3 (16) 2018. http://robotics.sciencemag.org/content/3/16/eaar2915.abstract


Robot Wednesday


Robot Bonsai Tree

Another cool-looking robot demonstration project from Japan: BonsAI.

Part of the TDK company’s “Attracting Tomorrow” PR campaign, this is an autonomously mobile bonsai.

There have been sensor-equipped plants before, and mobile plants, too.  But BonsAI is way, way cooler because it is aesthetically and psychologically much more attuned to human interaction than most plants.

In addition to the standard “seeking sunlight” and “monitoring water” features, the BonsAI seems to “walk beside you”, and it is implied that the tree can offer sage advice.  As the website says, “BonsAI is a bit smarter than a man, because it lives longer than a human being.” (via Google Translate).

It’s not really clear whether this wisdom is metaphorical (as bonsai in general can be said to offer), or whether the computer augmentation is actually capable of whispering inscrutable treely wisdom to its human companion. It would certainly be technically possible, though I wonder how psychologically effective it would be.  (Whatever actual voice is deployed, it couldn’t possibly sound like a tree to me.)

Overall, this project gives me mixed feelings.

On the one hand, the bonsai is certainly attractive and sensuous in ways that many robot toys and pets are not.  And, as I’ve said, it is more attractive than most augmented plants. I’d rather cohabit with a robot bonsai than with most other personal robots.

But on the other hand, mobility and digital augmentation do little to improve bonsai, which is already a cool augmentation to human life.  Indeed, the digital features may well detract from the value of the bonsai.

For example, part of the pleasure of a bonsai is carefully tending it, which includes carefully attending to it. Having sensors that automatically report what the tree needs, or that even autonomously seek out needed sunlight, eliminates the need for human attention and effort.  This defeats an important and deep part of the relationship between human and living tree.

I would also note that the day the BonsAI starts emitting advertising, appointment reminders, or movie recommendations is the day I say “turn it off” and chuck it out the window. Ick.


Robot Wednesday


Rongzhong Li’s OpenCat

In a world filled with biomimetic robots resembling everything from microbes to hellhounds, not to mention thousands of uncanny humanoids, Rongzhong Li thinks there is a gigantic missing piece:  house cats.

Not just a feline Aibo, but an open source robot cat, OpenCat.

“A programmable and highly maneuverable robotic cat for STEM education and AI-enhanced services. Powered by Arduino and Raspberry Pi, it’s also affordable for DIY makers.”

This is kind of cool, and certainly a solid, DIY-ready project.

Of course, it’s got a long, long way to go to be actually catlike.  The gait could never be mistaken for a real cat’s, and a lot of the moves, like turning and getting up from lying down, are really not feline, at least not yet.  And call me when OpenCat can walk silently and land on its feet when dropped!

Obviously, the thing to do is hack away at the open source boards and algorithms to produce some seriously feline motion.
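
For flavor, here is the sort of toy gait generator a hacker might start from (to be clear, this is not the OpenCat firmware, which runs as Arduino C++; the joint ranges, timing, and phase offsets below are invented purely to illustrate phase-shifted leg cycles):

```python
# Toy quadruped gait sketch (illustration only; not OpenCat code).  Each leg's
# hip and knee follow one cyclic pattern; shifting the phase between legs turns
# the same pattern into different gaits.  Here diagonal pairs move together,
# which is a trot.
import math

TROT_PHASE = {"front_left": 0.0, "rear_right": 0.0,    # one diagonal pair in phase
              "front_right": 0.5, "rear_left": 0.5}     # the other, half a cycle later

def leg_angles(t, period=0.6, phase=0.0):
    """Hip and knee angles (degrees) for one leg at time t (seconds)."""
    cycle = 2.0 * math.pi * (t / period + phase)
    hip = 20.0 * math.sin(cycle)                 # swing the leg forward and back
    knee = 15.0 * max(0.0, math.cos(cycle))      # lift the foot only during the swing half
    return hip, knee

# One snapshot of the gait, e.g. what a control loop would send to the servos this tick.
t = 0.15
for leg, phase in TROT_PHASE.items():
    hip, knee = leg_angles(t, phase=phase)
    print(f"{leg:12s} hip {hip:+6.1f} deg, knee {knee:+6.1f} deg")
```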

As far as behavior goes, particularly play: well, it’s a start. Again, a lot of hacking is needed to create a real cat ‘attitude’.  The “uncanny valley” is the home territory of felines.  OpenCat should be watching you, aloof except when it’s friendly, demanding and commanding.

Ultimately, the biggest issue with OpenCat is captured by one of his images:

We’ve already got cats.

What is the purpose of a robot cat?


One final idea for future development: can you build a robot cat that is remotely operated by an actual cat?  Feline-computer interfaces are notoriously hard because cats just don’t care and won’t cooperate.  But imagine the chaos of a cat with its own robot avatar, or a posse of avatars!  Or maybe the cat will want a robot butler, to fetch food and turn up the heat and generally do whatever she wants.


Robot Wednesday

Self-parking slippers (!)

Now this is what I call a neat demo!

I’ve never tried one of these new self-parking cars, so I don’t really get it.  Sure, parking is tricky, but that’s life, no?  Is this something I need, or even want?  I suspect it’s one of those things that, once you experience it, you can’t live without.

Nissan has put out an interesting demo that illustrates the idea of the technology, but in a different context.

This application is arguably even more useless than self-parking your car, but it is so cool to watch that it is compelling.  It also puts you outside and above the action, with a ‘god’s eye view’, which makes the magic all the more visible.  And I’ve seen cars park many times, but I’ve never seen a slipper park, autonomously or otherwise!

I like it!

Now, I can’t really tell exactly how this is done (and neither can Sensei Evan [1]).  The press materials imply that this is based on the same technology that the self-parking automobile uses.  But that can’t be literally true, since the slippers clearly don’t have multiple cameras and sonar sensors, and I’d be surprised if they have microchips “autonomously” running anything like Nissan Leaf firmware.  Presumably, the slippers are guided by some system using cameras in the room, or something like that.  That would be reasonably cool in itself, and nothing to be ashamed of.
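
For what it’s worth, here is one speculative sketch of how a room-camera guidance loop could work (again: Nissan has not explained the mechanism, and every name and number below is made up for illustration):

```python
# Speculative sketch of overhead-camera guidance for a "self-parking slipper"
# (not Nissan's system; the gains, speeds, and coordinates are invented).  An
# overhead camera is assumed to report each slipper's position and heading, and
# a simple proportional controller steers small wheels in the sole toward the
# slipper's parking spot.
import math

def drive_step(x, y, heading, goal_x, goal_y, dt=0.1):
    """One control tick: turn toward the parking spot, roll forward, return new pose."""
    K_TURN, SPEED = 2.0, 0.3                       # turn gain and forward speed (arbitrary)
    bearing = math.atan2(goal_y - y, goal_x - x)   # direction from slipper to its spot
    error = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
    heading += K_TURN * error * dt
    x += SPEED * math.cos(heading) * dt
    y += SPEED * math.sin(heading) * dt
    return x, y, heading

# Drive a slipper from the middle of the room back to its spot by the door.
x, y, heading = 2.0, 1.5, 0.0
goal = (0.2, 0.2)
for _ in range(300):
    x, y, heading = drive_step(x, y, heading, *goal)
    if math.hypot(goal[0] - x, goal[1] - y) < 0.05:
        break
print(f"slipper ends up at ({x:.2f}, {y:.2f}) m")
```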

Anyway, I love the demo, regardless of how it was done.

“I never knew how badly I needed a self-parking slipper until now.”  (Evan Ackerman [1])


  1. Evan Ackerman, Nissan Embeds Self-Parking Tech in Pillows and Slippers, in IEEE Spectrum – Cars That Think. 2018. https://spectrum.ieee.org/cars-that-think/transportation/self-driving/nissan-embeds-selfparking-tech-in-pillows-and-slippers


Robot Wednesday

Drones Counting Ducks Down Under

One of the oldest citizen science projects is bird watching.  For more than a century, enthusiastic birders have amassed vast datasets of avian sightings.  To date, technology has enhanced but not displaced this proud nerd army. Photography, GPS, and databases have vastly improved the data from birders, but nothing has replaced boots on the ground.


This month, a research project at the University of Adelaide reported a demonstration of a UAV-mounted imaging system that, for once, beats human birders [1].

Specifically, the study compared the accuracy of humans versus a small survey quadcopter on the task of counting birds in a nesting colony.  In order to have a known ground truth, the tests used artificial colonies populated by hundreds of simulated birds.  The repurposed decoys were laid out to mimic actual nesting sites.

They dubbed it “#EpicDuckChallenge”, though it doesn’t seem especially “epic” to me.

The paper compares the accuracy of human counters on the ground, human counts from the aerial imagery, and computer analysis of the aerial imagery.

First of all, the results show pretty high error for the human observers, even for the experienced ecologists in the study. Worse, the errors are quite scattered, which suggests that estimates of population change over time will be unreliable.

The study found that counting from aerial photos taken by the UAV is much, much more accurate than counting by humans on the ground. The UAV imagery has the advantage of being overhead (rather than at human eye level), and it also holds still for analysis.

However, counting birds in an image is still tedious and error prone.  The study shows that machine learning can tie or beat humans counting from the same images.
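
To give a flavor of what machine counting can mean in this easiest of cases, here is a toy sketch (not the authors’ pipeline, which is described in [1]): bright, decoy-sized blobs on a dark, uniform background can be counted with nothing fancier than a brightness threshold and connected-component labeling.

```python
# Toy version of counting "birds" in an ideal aerial image (illustration only;
# the study's actual method and parameters are in [1]).  A synthetic photo is a
# dark, slightly noisy background with bright 3x3 decoys; thresholding plus
# connected-component labeling counts them.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.normal(0.1, 0.02, size=(200, 200))   # dark, slightly noisy ground

true_count = 40
for _ in range(true_count):                       # paint bright decoys at random spots
    r, c = rng.integers(5, 195, size=2)
    image[r - 1:r + 2, c - 1:c + 2] = 0.9

bright = image > 0.5                              # simple brightness threshold
labeled, detected = ndimage.label(bright)         # connected components = detections

# Decoys that happen to touch merge into one blob, a small taste of the
# clutter and overlap problems that make real scenes much harder.
print(f"placed {true_count} decoys, detected {detected} blobs")
```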

Together, the combination of low-cost aerial images and effective image processing algorithms gave very accurate results, with low variability. This means that this technique would be ideal for monitoring populations over time, because repeated flyovers would be reliably counted.


This study has its limitations, of course.

For one thing, the specific task used is pretty much the best possible case for such an aerial census.  Unrealistically ideal, if you ask me.

Aside from the perfect observing conditions, the colony is easily visible (on an open, flat, uniform surface), and the ‘birds’ are completely static.  In addition, the population is uniform (only one species), and the targets are not camouflaged in any way.

How many real-world situations are this favorable?  (Imagine using a UAV in a forest, at night, or along a craggy cliff.)

To the degree that the situation is less than perfect, the results will suffer.  In many cases, the imagery will be poorer, and the objects to be counted less distinct and recognizable. Also, if there are multiple species, very active birds, or visual clutter such as shrubs, it will be harder to distinguish the individuals to be counted.

For that matter, I’m not sure how easy it will be to acquire training sets for the recognizer software.  This study had a very uniform nesting layout, so it was easy to get a representative subsample to train the algorithm.  But if the nests are sited less uniformly, and mixed with other species and visual noise, it may be difficult to train the algorithm, at least without much larger samples.


Still, this technique is certainly a good idea when it can be made to work.  UAVs are a great “force multiplier” for ecologists, giving each scientist much greater range. Properly designed (by which I mean quiet) UAVs should be pretty unobtrusive, especially compared to human observers.

The same basic infrastructure can be used for many kinds of surface observations, not just bird colonies.  It seems likely that UAV surveying will be a common scientific technique in the next few decades.

The image analysis also has the advantage that it can be repeated and improved.  If the captured images are archived, then it will always be possible to go back with improved analytics and make new assessments from the samples.  In fact, image archives are becoming an important part of the scientific record, and a tool for replication, cross validation, and data reuse.


  1. Jarrod C. Hodgson, Rowan Mott, Shane M. Baylis, Trung T. Pham, Simon Wotherspoon, Adam D. Kilpatrick, Ramesh Raja Segaran, Ian Reid, Aleks Terauds, and Lian Pin Koh, Drones count wildlife more accurately and precisely than humans. Methods in Ecology and Evolution, 2018. http://dx.doi.org/10.1111/2041-210X.12974
  2. University of Adelaide, #EpicDuckChallenge shows we can count on drones, in University of Adelaide – News. 2018. https://www.adelaide.edu.au/news/news98022.html


Singaporean Robot Swans

Evan Ackerman calls attention to a project at the National University of Singapore that is deploying robotic water-quality sensors designed to look like swans [1].

The robots cruise surface reservoirs, monitoring the water chemistry and uploading the data to the cloud via wifi as it is collected.  (Singapore has wifi everywhere!)  The robots are encased in imitation swans, which is intended ‘to be “aesthetically pleasing” in order to “promote urban livability.”’ I.e., to look nice.
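
Roughly speaking, each swan is a floating sensor package wrapped around a telemetry loop something like the sketch below (speculative: the project’s actual sensors, message format, and cloud API are not described in the sources, so every name and field here is invented):

```python
# Hypothetical telemetry loop for a water-quality robot (not the NUSwan code;
# the endpoint URL, sensor names, and fields are all invented for illustration).
import json
import time
import urllib.request

CLOUD_URL = "https://example.org/reservoir/readings"   # hypothetical cloud endpoint

def read_sensors():
    """Stand-in for the onboard water-quality probes."""
    return {"ph": 7.2, "turbidity_ntu": 3.4, "chlorophyll_ug_l": 1.1}

def report(lat, lon):
    """Bundle one reading with time and position, and upload it over wifi."""
    reading = {"timestamp": time.time(), "lat": lat, "lon": lon, **read_sensors()}
    body = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(CLOUD_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Called periodically as the swan cruises its patrol route, e.g.:
# status = report(1.3521, 103.8198)   # a point in Singapore
```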

This is obviously a nice bit of work, and a good start.  The fleet of autonomous robots can maneuver to cover a large area, and concentrate on hot spots when needed, all at a reasonable cost. I expect that the datasets will be amenable to data analysis and machine learning, which can mean continuous improvement in knowledge about the water quality.

As far as the plastic swan bodies…I’m not really sold.

For starters, they don’t actually look like real swans.  They are obviously artificial swans.

Whether plastic swans are actually more aesthetically pleasing than other possible configurations seems like an open question to me.  I tend to think that a nicely designed robot might be just as pleasing as a fake swan, or even better.  And it would look like a water-quality monitor, which is a good thing.

Perhaps this is an opportunity to collaborate with artists and architects to develop some attractive robots that say “I’m keeping your water safe.”


  1. Evan Ackerman, Bevy of Robot Swans Explore Singaporean Reservoirs, in IEEE Spectrum – Automaton. 2018. https://spectrum.ieee.org/automaton/robotics/industrial-robots/bevy-of-robot-swans-explore-singaporean-reservoirs
  2. NUS Environmental Research Institute, New Smart Water Assessment Network (NUSwan), in NUS Environmental Research Institute – Research Tracks – Environmental Surveillance and Treatment. 2018. http://www.nus.edu.sg/neri/Research/nuswan.html


Robot Wednesday