
ProtoSpray Ubiquitous Control Surfaces

Over the past couple of decades there have been many experiments in interactive computing interfaces that enable some part of the physical world to become a “button” that triggers digital events. The basic trick is to recognize when a person touches the designated “button” and signal that event to the controlling software. Usually this has involved computer vision, though there are fun variations such as Makey Makey.

There has also been rapid development of “electronic ink,” which lets a person literally draw a circuit. (This technology is of much more use to people who understand circuits better than I do! :-) )

This spring, researchers at MIT and the University of Bristol demonstrated a mash-up of these technologies: ProtoSpray, “sprayable user interfaces” [1].

The basic idea is that the system creates stencils, and spraying the appropriate layers of electronic ink through them produces the “buttons.” (This helps the circuit-clueless like me create something that works.)

A major goal is to allow these interfaces to be applied to many surfaces, including rough and irregular surfaces.

The “button” is created with 3D design software [2,3]. The software generates stencils that control the application of the inks. These might be cut from cardboard or projected onto a curved surface. Then the inks are airbrushed on. Attach leads from an Arduino to the painted interface and, voilà!
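For the curious, here’s roughly what the Arduino side of such a “button” might look like. This is my own sketch, not the researchers’ code: I’m assuming the sprayed pad behaves as a simple capacitive touch sensor, and I’m using the widely available CapacitiveSensor Arduino library. The pin numbers and threshold are placeholders you’d have to tune for your own pad and ink.

```cpp
// Read a sprayed conductive pad as a capacitive touch "button".
// Illustrative only: pins and threshold are made up.
#include <CapacitiveSensor.h>

// Send pin 4, receive pin 2; the receive pin is wired to the painted pad.
CapacitiveSensor pad(4, 2);

const long TOUCH_THRESHOLD = 1000;  // tune for your pad, ink, and surface

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  long reading = pad.capacitiveSensor(30);          // average of 30 samples
  bool touched = (reading > TOUCH_THRESHOLD);
  digitalWrite(LED_BUILTIN, touched ? HIGH : LOW);  // the "digital event"
  Serial.println(reading);      // watch this to pick a sensible threshold
  delay(50);
}
```

(The right threshold depends heavily on the ink, the pad geometry, and the surface underneath, which is presumably part of the skill I mention below.)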

It looks to me like you still need a working knowledge of simple circuits to make a working “button.” And it looks like there is some skill involved in creating a good interface, not just one that sort of works.

Of course, in the end it is just a fancy touch interface. The demonstration is kind of cool, but not something I would need or want. I mean, why do I need a really complicated light switch or doorbell?

At least it’s not a touchscreen!

But it’s an interesting technology, and it would be interesting to see what clever designers might make of it.


  1. Rachel Gordon, Sprayable user interfaces, in MIT News, April 8, 2020. http://news.mit.edu/2020/mit-csail-sprayabletech-sprayable-user-interfaces-0408
  2. Ollie Hanton, Michael Wessely, Stefanie Mueller, Mike Fraser, and Anne Roudaut, ProtoSpray: Combining 3D Printing and Spraying to Create Objects with Interactive Displays, in Proceedings of CHI 2020, 2020. http://www.michaelwessely.com/data/spraying_3dprinting.pdf
  3. Michael Wessely, Ticha Sethapakdi, Carlos Castillo, Jackson C. Snowden, Ollie Hanton, Isabel Qamar, Mike Fraser, Anne Roudaut, and Stefanie Mueller, Sprayable User Interfaces: Prototyping Large-Scale Interactive Surfaces with Sensors and Displays, in Proceedings of CHI 2020, 2020. http://www.michaelwessely.com/data/sprayable_wessely.pdf

Yet More Improved Robot Skin

Just about six years ago, I predicted that “remote haptics” was the next big thing. This was perhaps premature, but I stand by my basic point:  teledildonics is—dare I say it—coming soon.

Recently I noted progress in haptic skin that can simulate touch. This is the “output device,” if you wish: it lets a computer touch the user (“partner”?).

To complete the picture, researchers from Paris report on a biomimetic artificial skin that is designed to be an “input device” [1]. This is a multilayer sensor skin that senses touch a lot like human skin does.

Their demo is strange and icky: they wrap a mobile phone in skin, so you can communicate via touch. They also demonstrate touch pads and wrist bands; the latter suggests the potential for wearable interfaces. These are rather icky because they look (and maybe feel) so much like skin that it is like something out of a horror film. Vat-grown mutant “clone people” or something.

I’ll note that the prototype looks like “flesh”, and tellingly, it is ‘flesh colored’—the band-aid pink of European skin.  To be fair, the researchers discuss the range of colors possible, including interesting colors not natural to any human skin.  But I assume that they chose colors that appeal to themselves.

In their research article, the researchers discuss the possible uses, but stay in technical mode.  They suggest that the enhanced touch interface might be efficient, enhancing work and user productivity.  They briefly touch on “applications for emotional communication”, including the “embodiment” of virtual agents.

They demonstrate “mobile tactile expression”, by which they mean “a messaging application where users can express rich tactile emoticons on the artificial skin.” ([1], p. 316)  Essentially, you can tickle your phone to write an emoji.
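The paper doesn’t spell out the recognition logic (at least not in a way I followed), so here is a purely made-up sketch of how such a gesture-to-emoticon mapping might look. The features, thresholds, and emoticons are all my own invention, just to make the idea concrete:

```cpp
// Hypothetical mapping from touch gestures to "tactile emoticons".
// Everything here is invented for illustration; it is not the paper's method.
#include <iostream>
#include <string>

struct TouchGesture {
  double duration_s;  // how long the contact lasted, in seconds
  double pressure;    // average pressure, normalized 0..1
  bool repeated;      // several quick contacts in a row?
};

std::string toEmoticon(const TouchGesture& g) {
  if (g.repeated && g.pressure < 0.3) return "tickle :)";
  if (g.pressure > 0.7 && g.duration_s < 0.5) return "pinch >:(";
  if (g.duration_s > 1.5) return "stroke <3";
  return "tap .";
}

int main() {
  TouchGesture light_and_quick{0.8, 0.2, true};
  std::cout << toEmoticon(light_and_quick) << "\n";  // prints "tickle :)"
}
```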

This is obviously not even close to the real goal.  The paper indicates that future work must include “output” capabilities, which, as I have noted, have been demonstrated down the road at Lausanne.

Putting this all together, we see now that we can build remote haptic interfaces.  Wearing a full body suit (or however much of the body you want to play with), the computer can generate touches.  Given a robot or doll or whatever form factor you like, you can touch the skin of the computer.

The obvious application is a virtual world in which each person has an avatar that receives and sends touches through these interfaces. These can be delivered via a network, along with whatever audio and video might make sense. And we will have achieved the Icky-arity: actual Internet caresses.

(It will be interesting to find out how lag, jitter, dropped packets, and so on “feel” when you are using such a channel.)
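The standard answer from audio and video streaming is a jitter buffer: hold incoming packets briefly, play them out in order on a fixed delay, and skip anything that arrives too late. Here’s a toy version; nothing below comes from the paper, and a real haptic protocol would surely be fancier, but it shows how lost or late packets would turn into gaps in the rendered touch:

```cpp
// Toy jitter buffer for a haptic stream: frames arrive out of order,
// we play them back in sequence, and late/lost frames become gaps.
// Entirely illustrative; no real haptic protocol is being modeled.
#include <cstdint>
#include <cstdio>
#include <map>
#include <optional>

struct HapticFrame {
  uint32_t seq;    // sender's sequence number
  float pressure;  // one scalar "touch" sample
};

class JitterBuffer {
 public:
  explicit JitterBuffer(uint32_t depth) : depth_(depth) {}

  void push(const HapticFrame& f) {
    if (f.seq >= next_) pending_[f.seq] = f;  // frames already due are dropped
  }

  // Called at the playout rate. Returns nothing while the buffer fills,
  // or when a frame was lost (presumably a brief gap in the caress).
  std::optional<HapticFrame> pop() {
    if (pending_.size() < depth_ && pending_.count(next_) == 0)
      return std::nullopt;  // wait for more frames before starting
    auto it = pending_.find(next_++);
    if (it == pending_.end()) return std::nullopt;  // lost or too late
    HapticFrame f = it->second;
    pending_.erase(it);
    return f;
  }

 private:
  uint32_t depth_;                           // frames to hold before playout
  uint32_t next_ = 0;                        // next sequence number due
  std::map<uint32_t, HapticFrame> pending_;  // ordered by sequence number
};

int main() {
  JitterBuffer jb(2);
  jb.push({1, 0.4f});  // frame 1 arrives before frame 0
  jb.push({0, 0.9f});
  while (auto f = jb.pop())
    std::printf("seq %u pressure %.1f\n", (unsigned)f->seq, f->pressure);
}
```

The trade-off is the same as in a voice call: a deeper buffer smooths over jitter but adds latency, and I’d guess latency in a caress is at least as noticeable as latency in a conversation.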

Of course, this communication is digitally mediated, so any number of algorithms might—dare I say it—manipulate the data.  Autotune to “optimize” kissing?  Magnified personal capabilities? Replays of greatest hits?  Libraries of celebrity partners?

Finally, I’ll note the interesting opportunities for hacking or malicious use.  Who is really out there touching you?  Just how crazy could you make someone by taking over and messing with their haptic interface?   There will be fatalities, and they might be murder.

“Ick” doesn’t begin to cover it.   But it’s going to be here soon.


  1. Marc Teyssier, Gilles Bailly, Catherine Pelachaud, Eric Lecolinet, Andrew Conn, and Anne Roudaut, Skin-On Interfaces: A Bio-Driven Approach for Artificial Skin Design to Cover Interactive Devices, in Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. 2019: New Orleans. p. 307-322. https://dl.acm.org/citation.cfm?doid=3332165.3347943
  2. University of Bristol, Artificial skin creates first ticklish devices, in University of Bristol News, 2019. https://www.bristol.ac.uk/news/2019/october/skin-on-interface.html