Remote Navigation of Turtles


For several years Phill-Seung Lee, Sungho Jo, and colleagues at the Korea Advanced Institute of Science and Technology (KAIST) have been “remote controlling” turtles (red-eared sliders, Trachemys scripta elegans). The turtle’s movements are manipulated remotely by blocking its vision except for a movable slit. Its natural obstacle-avoidance behavior steers it toward light and away from darker stimuli [3].

Figure 1. Depiction of experimental remote-controlled visual stimulus delivery and tracking systems. (A) To examine the turtle’s visual obstacle recognition, an experimental arena was equipped with a camera and two movable cylinders as obstacles (shown from the side view and from above). The dimensions of the arena, surrounding walls, and obstacles are indicated. (B) Experiments performed on the laboratory floor area (with the dimensions indicated) are shown in the drawing. The placements of the turtle, obstacle, and tracking system are shown. (C) The embedded control system to block the turtle’s view is shown in the drawing. The servo motor controls the positioning of the semi-cylinder obstacle (in the image, it is positioned directly in front of the turtle). The red circle on the controller, tracked by a simple tracking algorithm, was regarded as the location of the turtle. (D) The turtle was remotely controlled to follow the desired path by alternating the visual angle of the obstacle between ±180 degrees (no stimulus) and ±90 degrees (Movie S1).
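The steering principle in (D) — swinging the semi-cylinder blinder to ±90 degrees on one side to trigger an avoidance turn, and to ±180 degrees to let the turtle walk straight — can be sketched roughly as follows. This is my own reconstruction, not the authors’ code: the function name, heading convention (degrees increasing counterclockwise), and tolerance value are all assumptions.

```python
# Hypothetical sketch of the steering logic described in [3]: a servo swings
# a semi-cylinder "blinder" into part of the turtle's visual field, and the
# turtle's obstacle-avoidance instinct turns it away from the blocked side.

NO_STIMULUS = 180       # blinder rotated out of view: turtle walks straight
HEADING_TOLERANCE = 15  # degrees of heading error tolerated before stimulating

def blinder_angle(current_heading, desired_heading):
    """Return a servo angle in degrees: -90 blocks the right visual field
    (turtle turns left), +90 blocks the left field (turtle turns right),
    180 presents no stimulus. Headings increase counterclockwise."""
    # Wrap the heading error into [-180, 180) so we always turn the short way.
    error = (desired_heading - current_heading + 180) % 360 - 180
    if abs(error) <= HEADING_TOLERANCE:
        return NO_STIMULUS   # on course: leave the view open
    elif error > 0:
        return -90           # need to turn left: block the right field
    else:
        return 90            # need to turn right: block the left field
```

The modular wrap matters near the 0/360 boundary: a turtle heading 350° with a goal of 10° should get a small left turn, not a 340° right turn.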

Not satisfied with enslaving a helpless captive turtle, the researchers dream of a “repeated theme in fiction [that] involves people imagining themselves in the body of another human or that of an animal” ([2], p. 491). We now have the technology to “approach this appealing topic.”

So… this year they have extended this work to add a brain-computer interface (BCI) for the human controller [2]. A head-mounted display (HMD) and an EEG sensor connect the human to the turtle control system.

“The human operator wears the integrated BCI-HMD system, while the turtle is equipped with devices for stimulation, wireless communication, and imaging. Based on the images acquired from the cyborg turtle, the human uses thought to command the turtle.” ([2], p. 491)

That’s right, the human is remotely riding on the back of the turtle, steering with the HMD and EEG, while the turtle wears a backpack with camera, wireless, and the blinder.
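The pipeline this describes — EEG on the human, a discrete command over a wireless link, a blinder servo on the turtle — can be caricatured as a toy control loop. Everything here is hypothetical: the paper’s actual classifier, command set, and radio protocol are not published as code, and a real BCI would use a trained classifier rather than this illustrative threshold rule.

```python
# Toy end-to-end sketch of the "human steers turtle" loop implied in [2]:
# EEG epoch -> discrete command -> wireless link -> blinder servo angle.
from enum import Enum

class Command(Enum):
    IDLE = "idle"    # no stimulus: turtle keeps walking straight
    LEFT = "left"    # block the right visual field: turtle turns left
    RIGHT = "right"  # block the left visual field: turtle turns right

def classify_epoch(left_power, right_power, threshold=1.5):
    """Stub EEG classifier: compare band power on left/right channels.
    Real systems train a classifier; this ratio rule is only illustrative."""
    if left_power > threshold * right_power:
        return Command.LEFT
    if right_power > threshold * left_power:
        return Command.RIGHT
    return Command.IDLE

def command_to_servo(cmd):
    """Translate a discrete command into a blinder servo angle (degrees)."""
    return {Command.IDLE: 180, Command.LEFT: -90, Command.RIGHT: 90}[cmd]
```

In the actual system the command would be transmitted to the turtle’s backpack over the wireless link before driving the servo; that transport is omitted here.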

This setup is described as “simple and non-invasive”, unlike embedded electrodes and other remote-control methods. The authors see it as “an innovative framework for human-animal interaction systems” ([2], p. 501).

OK, I’m all in favor of “it could be done, so we had to do it” (e.g., this [5] or this [1, 4]), but I’m more than a little concerned about this technology.

First, I’m not sure what problem it is solving. The investigators are fascinated with the idea of experiencing being in the body of an animal, but this technology scarcely does that. The human only has an over-the-shoulder (monocular) view, which isn’t even a turtle’s-eye view, let alone what the turtle really sees.

At the same time, this project suffers from the shortcomings and ethical problems of remotely driving another organism. The turtle has no knowledge of, or communication with, the human rider, and, in fact, is basically blind. The turtle is nothing more than an engine, propelling the camera and radio. This must be rather unpleasant for the turtle, and it cannot really know what is going on.

My primary concern in any Animal-Computer Interface is “what is the benefit to each party, including the animal?” In this case, there is little benefit, and potentially a lot of danger for the turtle.

Specifically, it would be very dangerous for the turtle to go out in a natural setting wearing this apparatus, which blocks its vision and restricts its mobility. The turtle is helpless and at the mercy of the remote human pilot. I note that the risks are very asymmetrical, because there is no danger to the human. In addition, the pilot has only limited information, so he is not capable of protecting the turtle from harm. This is a very bad situation for the turtle.

There are also concerns that this technology might be used to gather intelligence or even deliver a weapon. The mere possibility could lead to a preemptive slaughter of all turtles, just in case one might be a remote-guided bomb.

I know that these researchers take good care of their turtles. But this research has little benefit for the animals, and serious potential to harm them and their species.

  1. Alexis Kikoen, The NeuroMaker: When Science Meets Design (video). National Center for Supercomputing Applications: Urbana, 2011.
  2. Cheol-Hu Kim, Bongjae Choi, Dae-Gun Kim, Serin Lee, Sungho Jo, and Phill-Seung Lee, Remote Navigation of Turtle by Controlling Instinct Behavior via Human Brain-Computer Interface. Journal of Bionic Engineering, 13 (3):491-503, 2016.
  3. Serin Lee, Cheol-Hu Kim, Dae-Gun Kim, Han-Guen Kim, Phill-Seung Lee, and Hyun Myung, Remote Guidance of Untrained Turtles by Controlling Voluntary Instinct Behavior. PLoS ONE, 8 (4):e61798, 2013.
  4. Robert E. McGrath and Johan Rischau, The NeuroMaker 1.0: Personal Fabrication through Embodied Computing. 2011.
  5. University of Illinois, Augmented Alma: The New Image of an Illinois Icon. Big Ten Network, 2013.
