
Virtual Reality for Animals

For several years now, Andrew Straw and colleagues at the Albert-Ludwigs-Universität Freiburg have been doing interesting work developing (mainly visual) Virtual Reality for non-human species.

This is trickier than it sounds, because VR depends on a deep understanding of how the world is subjectively experienced through vision. For humans, we have both research and extensive first-hand experience to guide development. For other species, we have no such experience, and it is much more difficult to understand the individual animal’s subjective experience.

At the same time, if you can make VR work, it has a lot of advantages for learning about non-human perception. Conventional experiments require the animal to be restrained while test stimuli are presented. Even if this is not uncomfortable, it is an unnatural situation, and it precludes a fully natural response to the stimuli.

An all-round VR experiment that allows the animal to move naturally is a much more realistic situation, one that can elicit normal behavior. In addition, a VR space can be configured and reconfigured in many ways, and can include an extensive virtual world. These capabilities open the way for extensive experimentation without seriously harming the animals.

Building on earlier work which created a VR experience for flies, the research group has extended and generalized the system so it can be used with other species. In their recent paper they report on experiments with flies, mice, and fish [2]. (Land, sea, and air—get it?)

The FreemoVR system exploits contemporary 3D video game technology to rapidly render a realistic scene all around the animal. It also uses computer vision to non-invasively track the position of the animal. The system rapidly renders the correct perspective view of the virtual world as the animal moves.
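In outline, the closed loop is simple: track, then redraw from the animal’s viewpoint, every frame. Here is a minimal sketch in Python; tracker, renderer, and their methods are hypothetical stand-ins, not FreemoVR’s actual API:

```python
def vr_frame(tracker, renderer, world):
    """One frame of the closed loop. All names here are
    illustrative stand-ins, not the actual FreemoVR API."""
    # Markerless computer vision gives the animal's 3D position and
    # an estimate of its gaze direction (both as 3-vectors).
    position, gaze = tracker.get_pose()

    # Put the virtual camera at the animal's eye point, so the
    # rendered perspective is correct for that one viewpoint.
    renderer.set_camera(eye=position, target=position + gaze)

    # Redraw the virtual world on the surrounding display surfaces.
    renderer.draw(world)
```

Run fast enough (game engines render at well over 100 frames per second), the loop makes the projected world appear stable and three-dimensional from wherever the animal happens to be.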

To prove out the system with different species, a species-appropriate VR world has to be created for each.

For a given species, a virtual world needs to be designed that reflects the animal’s natural environment and is rendered for its sensory apparatus. A fly’s world is different from a fish’s, and their eyes see differently.

The computer vision also needs to be trained to recognize the body pose and motion of each species. The VR depends on accurately tracking both the animal’s position and where it is looking.

In earlier work, they showed how the system can reveal how the fly uses visual cues to navigate. The current work illustrates other creative experiments. For instance, the fish were presented choices in the form of “teleportation” ports, which instantly shifted the fish to a new scene. (Apparently, this didn’t distress the fish as much as it would upset me!)

This is a classic single-user VR system that presents a world registered to one point of view. It isn’t suitable for experiments with multiple animals at the same time, because the viewpoint is correct for only one of them. It is, as they say, a CAVE for animals.

However, they are able to present some group or even “social” situations by projecting other animals nearby. And, in the case of the fish, they can simulate a school of fish that the subject swims along with. These effects make it possible to explore interactions, at least those based on visual cues.
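The paper’s actual fish model isn’t described here, but a virtual school can be driven by simple boids-style flocking rules. A minimal sketch, assuming a plain numpy implementation (illustrative, not the authors’ model):

```python
import numpy as np

def school_step(pos, vel, dt=1.0 / 120,
                cohesion=0.5, alignment=0.3, separation=1.0, radius=0.1):
    """One timestep of a boids-style virtual school (an illustrative
    model, not the one used in the paper). pos and vel are (N, 3)
    arrays of fish positions and velocities."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        offsets = pos - pos[i]                      # vectors to the others
        dists = np.linalg.norm(offsets, axis=1)
        near = (dists > 0) & (dists < 3 * radius)   # neighbors, excluding self
        if near.any():
            # Cohesion: steer toward the local center of the group.
            new_vel[i] += cohesion * offsets[near].mean(axis=0) * dt
            # Alignment: match the neighbors' average heading.
            new_vel[i] += alignment * (vel[near].mean(axis=0) - vel[i]) * dt
            # Separation: move away from neighbors that are too close.
            crowded = near & (dists < radius)
            if crowded.any():
                new_vel[i] -= separation * offsets[crowded].mean(axis=0) * dt
    return pos + new_vel * dt, new_vel
```

Called once per rendered frame, this produces a coherent, moving school for the real fish to join.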

Indeed, they also presented a world full of cartoonish “space invaders”, which did seem to worry the fish a bit.

Image: Straw Lab

 

The technology is open source, but kind of complicated, building on video game VR and computer vision libraries. They also use the Robot Operating System (ROS) as the framework, presumably for its modular, low-latency message passing between the tracking and rendering components. (Despite the name, ROS is middleware rather than an operating system.)
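With ROS, the tracker and the renderer can run as separate nodes connected by published messages. A minimal rospy listener, with a topic name I made up for illustration:

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Pose

def on_pose(msg):
    # A renderer node would update its virtual camera from each
    # tracked pose message as it arrives.
    rospy.loginfo("animal at (%.3f, %.3f, %.3f)",
                  msg.position.x, msg.position.y, msg.position.z)

rospy.init_node("vr_renderer")
# Hypothetical topic name; a tracking node would publish here.
rospy.Subscriber("animal_pose", Pose, on_pose)
rospy.spin()
```

The payoff of this modularity is that a tracker for a new species can be swapped in without touching the rendering side.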

Cool stuff!


  1. Charles Q. Choi, Virtual Reality Platform Created for Lab Animals, in IEEE Spectrum – The Human OS, 2017. http://spectrum.ieee.org/the-human-os/computing/hardware/virtual-reality-platform-created-for-lab-animals
  2. John R. Stowers, Maximilian Hofbauer, Renaud Bastien, Johannes Griessner, Peter Higgins, Sarfarazhussain Farooqui, Ruth M. Fischer, Karin Nowikovsky, Wulf Haubensak, Iain D. Couzin, Kristin Tessmar-Raible, and Andrew D. Straw, Virtual reality for freely moving animals. Nature Methods, advance online publication, August 21, 2017. http://dx.doi.org/10.1038/nmeth.4399

IEEE Computer July Issue: Interesting Articles on Virtual Reality

The July issue of IEEE Computer magazine (full text available electronically in your friendly local library) has a group of articles on how Virtual Reality is being used for complex explorations of brains and nervous systems, plus several other items of interest.

Stowers and colleagues report on “Reverse Engineering Animal Vision with Virtual Reality and Genetics” [1] — cool!!

I learn that VR is just the thing for vision research, assuming the system has sufficiently low latency and high resolution.  Classic methods of presenting stimuli for vision studies can now be seen as tepid compromises, accepted for lack of better alternatives.  But with VR, you can create a complete visual field, changing with time, and registered to the eyes of the subject–what we wanted to do all along.  (Latency really does matter: a fly saccading at roughly 1,000 degrees per second accumulates a 10-degree registration error from just 10 ms of lag.)

The scientists in question study non-human species (flies, spiders, things with cool eyes).  So the VR system needs to present stimuli relevant to the particular subject’s visual system.  Yet another “species appropriate” computer interface!

In fact, the parameters of non-human vision systems are quite different from humans’, so the VR displays depart from those of ordinary human-centric systems.  The authors have created an open source package, flyvr, to enable the precise specification of these visual parameters.
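I haven’t checked flyvr’s actual configuration format, so the sketch below only illustrates the kinds of parameters involved; the values are textbook figures for Drosophila vision:

```python
# Illustrative parameter set; not flyvr's actual schema.
fly_display = {
    "refresh_hz": 200,              # flies resolve flicker well beyond 60 Hz,
                                    # so human-rate displays look like slideshows
    "field_of_view_deg": 300,       # compound eyes see nearly panoramically
    "spatial_resolution_deg": 5.0,  # coarse inter-ommatidial angle: pixel
                                    # density matters far less than speed
}
```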

Similarly, it is necessary to track the movements of the subject, whether freely moving or tethered.  This is done through extensions of standard VR visual tracking that take account of species-specific locomotion (e.g., flying), or by simulating movement for tethered subjects (e.g., by measuring wingbeats).
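For tethered flight, the classic approach is to treat the left-right difference in wingbeat amplitude as intended turning, and rotate the virtual world to match. A sketch, where the linear mapping and the gain are my assumptions rather than the authors’ calibration:

```python
def virtual_yaw_rate(wba_left_deg, wba_right_deg, gain=2.0):
    """Map a tethered fly's left-right wingbeat-amplitude difference
    onto a virtual turning rate in deg/s. A larger left amplitude
    means the fly is trying to turn right. The linear form and the
    gain are illustrative assumptions."""
    return gain * (wba_left_deg - wba_right_deg)

# Example: left amplitude 5 degrees larger than right -> rotate the
# virtual world as if the fly were turning right at 10 deg/s.
heading_rate_deg_s = virtual_yaw_rate(50.0, 45.0)  # -> 10.0
```

Feeding that rate back into the rendered scene closes the loop: the fixed fly “steers” through the virtual world.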

The authors note the additional difficulties in calibrating the system, since the subjects cannot give verbal reports about the effects of latency or other problems.

This system has been used to implement “species appropriate” VR systems, such as FlyCave.  These environments enable less constrained and more natural movement, and can be programmed for complex, naturalistic scenarios.

(The ethics of these experiments is a close call, as far as I’m concerned.  But if you are going to play god with spiders and flies, you should do it right.)


 

Another article discusses “Open Source intelligence” (OSINT) — Robert David Steele’s explanation of why “Open Source” collection and analysis are the right thing to do.

It’s kind of a strange piece, full of strong but questionable assertions, such as, “open source is the only form of engineering that’s affordable, interoperable, scalable, and therefore sustainable.”

One crucial point Steele makes is that a large amount, he says 80% or more, of what you need to know is not secret, and is available from public or open sources.   Only a tiny fraction of critical information comes from secret sources and methods.

A second point is that lots of people need “intelligence”–data and analysis about the state of the world–not just the “National Security” sector that runs intelligence agencies.

Steele clearly has a whole hive of bees in his bonnet about the US National Intelligence apparatus and political leadership.  One doesn’t have to agree with his sweeping condemnations to see his point.

Of course, the powers that be don’t actually want to “fix” the fact that they own the system.  His blithe proposals, that everyone should do everything open source and that we should deep-six the national security state, are, well, not believable.

But anything that moves in that direction would probably be good for everyone.


 

There is also an interview with Sensei Andrew Tanenbaum — we are not worthy!  Where did Linux come from?  From Andy’s lab, which showed it could be done.

Heroes still walk amongst us.


References

  1. Stowers, John R., Anton Fuhrmann, Maximilian Hofbauer, Martin Streinzer, Axel Schmid, Michael H. Dickinson, and Andrew D. Straw, Reverse Engineering Animal Vision with Virtual Reality and Genetics. Computer, 47 (7):38-45, July 2014.
  2. Berghel, Hal, Robert David Steele on OSINT. Computer, 47 (7):76-81, July 2014.
  3. Severance, Charles, Andrew S. Tanenbaum: The Impact of MINIX. Computer, 47 (7):7-8, July 2014.