Category Archives: Robotics

Under the Antarctic Ice

The Thwaites Glacier in West Antarctica is changing rapidly.  If there is any doubt in your mind, NASA has a nice before-and-after pair of images, from 2001 and 2019 [3].  These images look like winter versus spring on a frozen pond.  But they show the same time of year (summer), less than 20 years apart.  The glacier is breaking up over the water, and this is happening fast.

These changes could mean that the glacier will flow even more rapidly, moving more ice from the interior into the ocean, where it will melt.  If that happens, it is a very big deal.  So there is a major research campaign to measure the bejesus out of Thwaites.

One of the areas of interest is what is happening at the grounding line, where the glacier rests on bedrock before it begins to float.  Friction there is a major brake on the ice, halting or slowing the flow out onto the water.  There is evidence that the ocean water offshore is warming, and if that warmer water reaches the grounding line it could lubricate or otherwise change things, releasing the ice to rush on into the (warm) ocean.  Boom!

(The BBC tags this “the Doomsday Glacier” [4], which I think is a bit over the top.  But it’s certainly important.)

In addition to visiting, sensing, and drilling into the ice [4], the research mission included the first robot submarine visit to the actual grounding line [1, 2].  This was a pretty heroic mission, drilling through 590 meters of ice and remotely operating the sub on 15 km round trips.  Wow!


The mission captured the first images of the grounding line itself.  (The full results will be published soon, I’m sure.)

This same team is contributing to the development of missions to explore under the ice on Europa (should we last that long).


  1. Michelle Babcock, First look under Thwaites Glacier and Kamb Ice Stream, in Life Under the Ice – Blog, January 28, 2020. https://schmidt.eas.gatech.edu/2019-field/firstlookunderthwaitesglacier/
  2. Ben Brumfield, Robotic Submarine Snaps First-Ever Images at Foundation of Notorious Antarctic Glacier, in Georgia Tech News, January 29, 2020. https://cos.gatech.edu/news/robotic-submarine-snaps-first-ever-images-foundation-notorious-antarctic-glacier
  3. Kathryn Hansen, Thwaites Glacier Transformed, in NASA Earth Observatory, February 6, 2020. https://earthobservatory.nasa.gov/images/146247/thwaites-glacier-transformed
  4. Justin Rowlatt, Antarctica melting: Climate change and the journey to the ‘doomsday glacier’, in BBC News – Science & Environment, January 28, 2020. https://www.bbc.com/news/science-environment-51097309


Robot Wednesday

Drones For Solar Farms

OK, let’s combine some of my favorite things, UAVs and Solar Power installations, to get…Percepto.

Over the past few years, I have had a few ideas about drones and PV, most notably some cunning ideas about using drones to sweep snow and other debris from rooftops.  (Shh!  Don’t tell anyone.  I’m still working on it.  : – ))

But I really didn’t know what autonomous drones would do for solar farms, per se.  So I was interested to read about Percepto’s offerings [2].

Percepto aircraft come in a box, and have visual and IR cameras.  The software manages the camera data, steers the UAV, and can use AI to analyze the data.

The main point of the solar power farm version, of course, is to continuously monitor large arrays out in the field.  The drone system can identify malfunctioning panels and infrastructure, as well as intruders and other anomalies.  PV arrays would seem to be particularly juicy targets for aerial monitoring, since they are exposed, static, and generally very regular.  Learning to recognize a solar panel should be easy for AI!  (Hint: look for arrays of rectangles absorbing a lot of light.)
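To make that concrete, here is a minimal sketch of how hot-spot screening on a radiometric thermal frame from the drone might work.  (The thresholds and function names are my own illustration, not Percepto’s actual pipeline.)

import cv2
import numpy as np

def find_hot_cells(temps, delta=15.0, min_area=20):
    """Flag regions much hotter than the typical panel temperature.

    temps: 2D numpy array of temperatures (deg C) from a thermal camera.
    delta: how far above the median counts as a fault candidate.
    min_area: ignore tiny blobs (noise, reflections).
    """
    baseline = np.median(temps)
    mask = (temps > baseline + delta).astype(np.uint8)
    # Group hot pixels into connected blobs and keep the sizable ones.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

A real system would also map the blob coordinates back to specific panels using the farm layout, which is exactly where that nice regular geometry helps.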

I can see that this kind of autonomous inspection could be both cost effective and very thorough.  The drone won’t get tired or bored, and can fly 24/7 (weather permitting).

All of this is important for operators who want to build really big solar farms.   I guess you know if you need it.

I’ll note that Percepto’s technology is also sold for other industrial uses, to monitor infrastructure of many kinds.  The solar farm isn’t a particularly unique application, except that it’s probably easier to implement than other types of infrastructure.  (Basically, there are no moving parts, the processes are really simple, and the geometry is pretty much a 2D layout.)

So, yeah.  Autonomous Drones!  Solar Panels!   Can we work in Blockchain, too? : – )

Actually, the main thing I found here is that it is all so boring, because it all makes so much sense.  Of course, you want to continuously monitor your huge PV array.  Of course, aerial monitoring is a good way to go.  Of course, you don’t need a large or occupied aircraft to survey it.

So, obviously, you want something like this Sparrow system from Percepto.

And there it is.

If you need it, you know you need it.


PS.  It would be cool to have a stripped-down open source version of this kind of thing, suitable for periodic inspection of home and other rooftops.  The requirements are way simpler, but I doubt the economic payback is enough for a business.  But it could be a great PR thing:  “Monitor your home PV with the same technology leading utilities use.”


  1. Ariel Avitan, Autonomous drones make solar projects more productive, cost-effective and secure, in Solar Power World Online, January 30, 2020. https://www.solarpowerworldonline.com/2020/01/autonomous-drones-make-solar-projects-more-productive-cost-effective-and-secure/
  2. Percepto, Autonomous Drones for Solar Farms. Percepto BR-7-01 01/2020, 2020. https://percepto.co/solar-energy-industry/

Robotics is Easy, Comedy is Hard

Robots can stand up, but can they do stand-up?

“there’s nothing like “live”—er, well, “physically embodied”—robot comedy!”

Oregon State U. Professor Naomi Fitter is pursuing a funny (as in peculiar and as in ha-ha) research theme:  creating a robot that performs live stand-up comedy for human audiences [1].   (I guess she’ll tackle robot audiences later.)

Fitter must be a really interesting teacher!

Now, I personally don’t particularly enjoy stand-up.  Probably my sense of humor is out of sync with most audiences.  And, honestly, when stand-up fails to click, it’s really unpleasant.

But I recognize that this sort of improvisation is hard, really hard.  And success is all about connecting with an audience of strangers, on the fly.

So this is a pretty intense, Turing-Test-level challenge for a computer and/or robot.

Cool!

Fitter has studied the art of improvisation herself, so she is effectively trying to capture this expertise in her little robots.  Up to now, she’s just trying to get the robot to tell jokes that people laugh at.  Teaching the robot to write its own jokes will come later.

Writing jokes for the robot is kind of interesting.  She play-acts being a robot, looking for typical comedic material: “annoyances, likes, dislikes, and ‘life’ experiences.”

“I could play-act being a robot, map classic (and even somewhat overdone) human jokes to fit robot experiences, and imagine things like, “What is a robot family?”, “What is a robot relationship like?”, and “What are drugs for a robot?””

Fitter reports that the results confirm one of the oldest rules in the book: comedy is about timing.  Specifically, the timing must be right for the audience.  So the robot needs to grok the human audience, in real time.

That’s going to be hard. Real, real hard.

Her current techniques for this are kind of hacky (using ML to recognize laughter, hard-coding pauses in the delivery).
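As a toy sketch of that idea, the delivery loop might look something like this.  (All the names here, such as robot and classify_laughter, are hypothetical stand-ins, not Fitter’s actual system.)

import time

BASE_PAUSE = 1.5  # seconds to hold after a punchline

def tell_joke(robot, joke, classify_laughter, record_audio):
    robot.say(joke.setup)
    time.sleep(joke.beat)            # hard-coded comic beat
    robot.say(joke.punchline)
    audio = record_audio(seconds=3.0)
    if classify_laughter(audio):     # ML laughter detector
        time.sleep(2 * BASE_PAUSE)   # ride the laugh before moving on
        robot.say(joke.tag)          # optional follow-up tag line
    else:
        robot.say("Tough crowd.")    # canned save when the joke dies

Even this toy version shows where the hard part lives: all the interesting work is inside classify_laughter and the pause lengths.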

How well does it work?  The videos are, well, excruciating.  Let’s say the robot is as good as a lot of bad human comedians.  It’s impressive to be in the ballpark, but boy is it painful to watch.

She mentions in passing her interest in bantering with assistants such as Alexa and Siri and friends.  That would be an impressive upgrade to these so far dull (if frightening) conversational agents.  Whether witty banter, jokes, and puns will make life better or worse, I’m not sure.


Of course, this research inevitably calls forth all sorts of silly joking.   Here I go:


“Robotics is easy.  Comedy is hard.”


The 18 meter Gundam robot does standup:

“YOU WILL LAUGH—NOW!”


  1. Naomi Fitter, What’s the Deal With Robot Comedy?, in IEEE Spectrum – Robotics, February 6, 2020. https://spectrum.ieee.org/automaton/robotics/artificial-intelligence/whats-the-deal-with-robot-comedy


Robot Wednesday

Now That’s What I Call A Robot!

As Evan Ackerman puts it, “if anyone can do it, it’s Gundam Factory Yokohama. Because no one else will.” [1]

At 18 meters tall and 25 tons, this will be the largest humanoid robot ever.

Why?

Because we can!

[Simulation Video]


(And there is an open source simulator available to play with!  Cool!)


  1. Evan Ackerman, Japan Is Building a Giant Gundam Robot That Can Walk, in IEEE Spectrum – Robotics, January 28, 2020. https://spectrum.ieee.org/automaton/robotics/humanoids/japan-building-giant-gundam-robot


Robot Wednesday

Pigeonbot!

People have been building machines that fly for a couple of centuries now, but we generally don’t use feathers.

Feathers are complicated, finicky things, generally beyond puny human engineering.

But feathered flight does really sophisticated stuff, definitely beyond puny human engineering.

Some research has explored designs that use artificial feathers, or at least feather-like entities.  These don’t work that well.

This winter, researchers at Stanford reported remarkable studies of a UAV that uses real feathers [2].

The research is motivated by the gap in performance between natural biological wing systems and human-engineered aircraft.  And, unlike some earlier investigations, this project used real feathers, not feather-like structures.  The bot also has a realistic number of feathers, and they are organized to mimic the original pigeon’s wings.  It’s an amazingly complicated mechanism.

“The outcome, PigeonBot, embodies 42 degrees of freedom that control the position of 40 elastically connected feathers via four servo-actuated wrist and finger joints.” ([2], p. 1)

And it works, looking just like the pigeon it emulates.
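To get a feel for what “underactuated” means here, consider a toy model: four servo angles drive forty feather angles through elastic inter-feather couplings, which can be approximated as a fixed linear map.  (The coupling matrix below is made up for illustration; the paper fits the real kinematics to measured pigeon wing poses.)

import numpy as np

# Toy underactuation model: 4 servo joints drive 40 feathers.
rng = np.random.default_rng(0)
coupling = rng.uniform(0.0, 1.0, size=(40, 4))   # placeholder elastic gains
coupling /= coupling.sum(axis=1, keepdims=True)  # each feather blends the 4 inputs

def feather_angles(servo_angles):
    """Map 4 wrist/finger servo angles (radians) to 40 feather angles."""
    return coupling @ np.asarray(servo_angles)

print(feather_angles([0.1, 0.2, 0.0, -0.1])[:5])  # first few feather angles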


The natural feathers are important.  As Evan Ackerman reports, the researchers discovered that feathers have micron-scale features they describe as “directional Velcro”: “Real feathers can slide to allow the wing to morph, but past a certain point, the directional Velcro engages to keep gaps from developing in the wing surface.” [1]

Cool!

The study has some other interesting implications.

For one thing, these real, biological feathers are not just unique individually; the feathers of an individual bird are a unique interlocking set.  Unlike engineered wings, these wings grew up: all the parts grew and developed together through the life of the bird.  You might say it’s an “organic” wing! : – )

“we observed qualitatively that biological variation between pigeon individuals is too large to exchange a particular flight feather between different individuals without compromising the wing planform. Accordingly, we found that manufacturing of the biohybrid morphing wing is accurate and repeatable, provided that feathers from a single individual are used” ([2], p. 5)

They also observe that this pigeon wing is only one of “10,000 extant bird species—offering unprecedented comparative research opportunities” ([2], p. 11).  If each pigeon wing is subtly different, what will we learn from all the other species of naturally evolved bird wings?

Nice work, all!


  1. Evan Ackerman, PigeonBot Uses Real Feathers to Explore How Birds Fly, in IEEE Spectrum – Robotics, January 16, 2020. https://spectrum.ieee.org/automaton/robotics/drones/pigeonbot-uses-real-feathers-to-explore-how-birds-fly
  2. Eric Chang, Laura Y. Matloff, Amanda K. Stowers, and David Lentink, Soft biohybrid morphing wings with feathers underactuated by wrist and finger motion. Science Robotics, 5 (38):eaay1246, 2020. http://robotics.sciencemag.org/content/5/38/eaay1246.abstract

Another Cool Exoskeleton

The Sarcos Guardian XO “Gives Workers Super Strength” [1].  (And it’s untethered!)

The video and company materials make clear who the target market is: military workers.  Most of these exoskeletons target military uses, though they’ll be pretty useful for a lot of work.  (I’m not surprised to read that Caterpillar and other heavy industries are investing.)

Evan Ackerman’s report helped me understand the design of these systems a lot better [1].

“In a practical sense, the Guardian XO is a humanoid robot that uses a real human as its command and control system”

First of all, this can be thought of as a vehicle: a form-fitting, single-occupant vehicle.  You drive it by moving your body, and it follows and amplifies your movements.  A key part of the design is feedback to the rider/driver.  The suit increases capabilities, but it is important for the human to “feel” the effort.

“It’s better to think of the exo as a tool that makes you stronger rather than a tool that makes objects weightless,”

Second, this is potentially a very dangerous vehicle. If the exoskeleton doesn’t stay in close correlation with the driver’s body, someone’s going to get hurt.  And it’s going to be the puny carbon-based unit that breaks first.  So there are dead man’s switches and regulators to make sure the robot doesn’t disarticulate the rider.

“All of the joints are speed limited, meaning that you can’t throw a punch with the exo”
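Here is a minimal sketch of those two safety ideas: amplify the operator’s effort, clamp joint speed, and cut power when the dead man’s switch is released.  (The gains and limits are invented for illustration; Sarcos’s actual control laws are surely more sophisticated.)

def exo_joint_command(operator_torque, joint_velocity, deadman_pressed,
                      gain=8.0, max_speed=1.0):
    """One joint of a toy amplification controller (units arbitrary)."""
    if not deadman_pressed:
        return 0.0                       # dead man's switch: cut power
    command = gain * operator_torque     # "stronger", not "weightless"
    if abs(joint_velocity) >= max_speed and command * joint_velocity > 0:
        command = 0.0                    # refuse to accelerate past the
                                         # speed limit; braking still allowed
    return command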

Ackerman points out that these systems will be potentially very dangerous around other people (suited or naked).  It’s pretty clear that the operator is responsible for avoiding injury and damage to the people and objects around him.  Speed limitation helps, but still.  This is as dangerous as any other powered machinery.


I’ve been wanting to get this technology in the hands (and feet) of dancers.  Think of it!  Jumping!  Climbing!  Inhuman acrobatics!  Superhuman stamina!  So cool!

This report is a bit deflating for this particular idea of mine.  Obviously, you’d need to adjust the speed limitations and other safety features to be able to dance in one of these.  And that’s not going to be easy or safe.

Dancing is risky, must be risky.  That’s the beauty of it.

Taking risks is not going to be possible in these safety-minded industrial units, at least as currently designed.

But on the other hand, dancers are experts at motion, including making extraordinary movement safely.  It might be quite interesting to do some collaborative exploration, letting dancers carefully loosen the safety envelope, to see what you can do.  I suspect that dancers might help create even better and smarter safety systems.

In fact, I wonder if Sarcos might do well to involve dancers in their design team.  One of my own rules of thumb is, if you want to study embodied computing, you don’t want to rely on engineers.  You want to collaborate with dancers, who are all about embodied motion.


  1. Evan Ackerman, Sarcos Demonstrates Powered Exosuit That Gives Workers Super Strength, in IEEE Spectrum – Robotics, 2019. https://spectrum.ieee.org/automaton/robotics/industrial-robots/sarcos-guardian-xo-powered-exoskeleton


A Self-repairing robot

In one sense, the idea of robots building and repairing robots is obvious and old hat.  And repairing yourself can be a pretty simple extension of repairing some other machine.  But it’s not done very often.

This fall, researchers from the University of Tokyo reported on demonstrations of teaching a self-repair operation to commodity robots [2].  Specifically, the robots learned to use their own manipulators to tighten screws on their own bodies.  (For this demo, the robot didn’t figure out for itself when a screw needed adjustment.)


Now, tightening a screw isn’t a gigantic deal.  However, robot manipulators are not really designed to reach their own bodies, so some screws are going to be challenging.  And some of them require an Allen wrench, which is a different grip and generally calls for changing the grip as you go: “regrasping.”

“The actual tightening is either super easy or quite complicated, depending on the location and orientation of the screw.”  (Evan Ackerman, [1])

They also demonstrate that once you can do screws, you can screw on additional pieces, such as carrying hooks.  Neat.

Part of the trick is that the robots use CAD data describing their own bodies.  They use this data to learn how to operate on themselves.  Duh!  It’s so obvious, once you see it!
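The outline of that idea is simple enough to sketch.  (Every name below, from cad_model.screws to plan_motion, is a hypothetical stand-in; the paper’s actual system does precise screw-pose calculation and graph search over regrasps.)

def tighten_all(robot, cad_model):
    """Tighten every screw the CAD model says is on the robot's own body."""
    for screw in cad_model.screws():           # screw poses in the body frame
        for grasp in robot.driver_grasps():    # candidate grips on the driver
            plan = robot.plan_motion(tool_pose=screw.pose, grasp=grasp)
            if plan is not None:               # reachable with this grip?
                robot.regrasp(grasp)           # change grip if needed
                robot.execute(plan)
                robot.turn_driver(screw.torque)
                break
        else:
            print(f"screw {screw.name}: unreachable from any grasp")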

It seems to me that part of the challenge here is that these generic robots were not designed to self-repair, or even to repair each other.  There is no reason it has to be that way.  With a bit of care, robots can be assembled in ways that are easier for them to self-repair.  One way to assure this is to use robots to assemble the same model of robot.  And CAD systems themselves can analyze designs to maintain self-repairability.

This concept will be especially interesting to combine with evolutionary design.  The robot should not only be able to assemble and repair a robot, it should learn to optimize the assembly/repair process, presumably in tandem with evolutionary design of the robot to be assembled.

(To prevent a runaway robot uprising, the system should have to submit detailed proposals and requests for funding, in order to acquire the resources needed for the new versions.  That ought to keep them under the control–of accountants!)


  1. Evan Ackerman, Japanese Researchers Teaching Robots to Repair Themselves, in IEEE Spectrum – Robotics, 2019. https://spectrum.ieee.org/automaton/robotics/robotics-hardware/japanese-researchers-teaching-robots-to-repair-themselves
  2. Takayuki Murooka, Kei Okada, and Masayuki Inaba, Self-Repair and Self-Extension by Tightening Screws based on Precise Calculation of Screw Pose of Self-Body with CAD Data and Graph Search with Regrasping a Driver, in IEEE-RAS International Conference on Humanoid Robots (Humanoids 2019), Toronto, 2019, pp. 79-84.


Robot Wednesday