For several years Phill-Seung Lee, Sungho Jo, and colleagues at the Korea Advanced Institute of Science and Technology (KAIST) have been "remote controlling" turtles (red-eared sliders, Trachemys scripta elegans). The turtle's movements are manipulated remotely by blocking its vision except for a movable slit; its natural obstacle-avoidance behavior steers it toward light and away from darker stimuli.
Not satisfied with enslaving a helpless captive turtle, the researchers note that a "repeated theme in fiction involves people imagining themselves in the body of another human or that of an animal" (p. 491), and that we now have the technology to "approach this appealing topic."
So, this year they have extended this work with a Brain-Computer Interface (BCI) for the human controller. A head-mounted display (HMD) and an EEG sensor connect the human to the turtle control system.
"The human operator wears the integrated BCI-HMD system, while the turtle is equipped with devices for stimulation, wireless communication, and imaging. Based on the images acquired from the cyborg turtle, the human uses thought to command the turtle." (p. 491)
That's right: the human is remotely riding on the back of the turtle, steering with the HMD and EEG, while the turtle wears a backpack with a camera, a wireless link, and the blinder.
This setup is described as "simple and non-invasive," unlike embedded electrodes and other remote-control methods. The researchers see it as "an innovative framework for human-animal interaction systems" (p. 501).
First, I'm not sure what problem this is solving. The investigators are fascinated with the idea of experiencing being in the body of an animal, but this technology scarcely does that. The human gets only a monocular, over-the-shoulder view, which isn't even a turtle's-eye view, let alone what the turtle really sees.
At the same time, this project suffers from the shortcomings and ethical problems of remotely driving another organism. The turtle has no knowledge of, or communication with, the human rider, and is in fact basically blind. The turtle is nothing more than an engine, propelling the camera and radio. This must be rather unpleasant for the turtle, which cannot really know what is going on.
My primary concern with any Animal-Computer Interface is "what is the benefit to each party, including the animal?" In this case, there is little benefit, and potentially a lot of danger for the turtle.
Specifically, it would be very dangerous for the turtle to go out in a natural setting wearing an apparatus that blocks its vision and restricts its mobility. The turtle is helpless and at the mercy of the remote human pilot. I note that the risks are very asymmetrical, because there is no danger to the human. In addition, the pilot has limited information, and so cannot protect the turtle from harm. This is a very bad situation for the turtle.
There are also concerns that this technology might be used to gather intelligence or even deliver a weapon. The mere possibility could lead to a preemptive slaughter of all turtles, just in case one might be a remotely guided bomb.
I know that these researchers take good care of their turtles. But this research has little benefit for the animals, and serious potential to harm them and their species.
- Alexis Kikoen, The NeuroMaker: When Science Meets Design (video). 2011, National Center for Supercomputing Applications: Urbana. https://youtu.be/Mq8xjnlqdUo
- Cheol-Hu Kim, Bongjae Choi, Dae-Gun Kim, Serin Lee, Sungho Jo, and Phill-Seung Lee, Remote Navigation of Turtle by Controlling Instinct Behavior via Human Brain-computer Interface. Journal of Bionic Engineering, 13 (3):491-503, 2016. http://www.sciencedirect.com/science/article/pii/S1672652916603220
- Serin Lee, Cheol-Hu Kim, Dae-Gun Kim, Han-Guen Kim, Phill-Seung Lee, and Hyun Myung, Remote Guidance of Untrained Turtles by Controlling Voluntary Instinct Behavior. PLoS ONE, 8 (4):e61798, 2013. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3629119/
- Robert E. McGrath and Johan Rischau, The NeuroMaker 1.0: Personal Fabrication through Embodied Computing. 2011. http://cucfablab.org/sites/cucfablab.org/files/NeuroMaker_Rischau_McGrath.pdf
- University of Illinois, Augmented Alma: The New Image of an Illinois Icon. 2013, Big Ten Network. http://youtu.be/qLvKfAF_KjQ