The IEEE “Computing Edge” calls attention to an article from 2016, “Genteel Wearables: Bystander-Centered Design”. In this case, “genteel” seems to mean “polite to other people in public”.
This seems like ancient history now—who can even remember what wearable tech was like in 2015–16? But the basic point is, if anything, even more salient. With the deployment of more and more sensors, connected to who knows where, accumulating who knows what data, the world is becoming ever less “genteel”. Extending the IoT into wearables is only the latest step in invasiveness.
“Compared to users, bystanders are often considered a second-order phenomenon; in other words, “human-centered design” really applies only to the user. I argue that bystanders have a decisive effect on the behavior of public wearable users.”
In particular, wearables must fit into social norms and customs. To the degree that wearables provide novel capabilities, they must be designed with bystanders in mind, which is a challenge because no norms and customs yet exist for those capabilities.
Writing in 2016, Flammer was reacting in part to then-current experiences with Google Glass and similar devices. These devices potentially record everyone in range, and potentially upload the results to the Internet. Yet they are designed to be “invisible” when in use, which is a bad idea. (Hence the emergence of the now obsolete derogatory term “Glassholes”.)
“aiming for invisibility is actually equivalent to hiding what the users are up to from bystanders.”
Flammer outlines some of the key issues, which center on data collected about bystanders. Wearable devices might incidentally capture people nearby, e.g., in an image intended for navigation. Devices might also deliberately monitor and even identify people nearby, perhaps for social applications, threat assessment, or more invasive uses such as “lie detection”.
As obnoxious as such short-range data use is, uploading to the Internet is much worse. Advertising companies such as Facebook and Google are eager to obtain second-by-second traces of everyone, all the time. So are governments. So are criminals. So are teenage hackers down the street. What could possibly go wrong?
Flammer gives some “design imperatives”, which are actually good advice for any human-oriented design.
- Be concerned about the worst case scenario
- Engage in negotiated action
- Provide an opt-out solution
- Avoid the Panopticon effect
The first point is just good design. But in this case the opacity of the wearable’s actions means that bystanders may have radically different perceptions of the situation, which may escalate rapidly. Ignoring this possibility is a recipe for conflict and disaster.
The second point is basically, “don’t be opaque”. One way or another, obtain permission—tacit or explicit—from bystanders.
The third point is a variation of one of Bob’s Design Principles: Every interface must have an “Off” button that works.
The final point is obvious to me, but apparently not to Silicon Valley. Major advertising companies such as Facebook and Google, not to mention every kind of retailer, have made a Panopticon the crux of their business model. As we see today, when people become aware of this fact, they tend to strongly dislike the companies and services, all the more so when they have been deceived by the same arrogant snoops.
One approach Flammer advocates is “Peacock Design”, in which the wearable’s actions are materialized and thereby announced to bystanders. This is tricky to get right, because the wearable generally needs to be unobtrusive for its user, so it is counterproductive to have to telegraph every action. I would add that exactly what kinds of signals might be used, and how well bystanders would understand them, is not at all obvious.
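To make the “announce before acting” idea concrete, here is a minimal sketch of what it could look like in software. The `PeacockWearable` class and its LED-style signals are invented for illustration; Flammer’s article does not specify any particular mechanism.

```python
from enum import Enum

# Hypothetical outward-facing signals; the mapping to an LED is an
# invented example of "materializing" the device's actions.
class Signal(Enum):
    IDLE = "led off"
    RECORDING = "led solid red"
    UPLOADING = "led blinking blue"

class PeacockWearable:
    def __init__(self):
        self.signal = Signal.IDLE
        self.log = []

    def _announce(self, signal):
        # In hardware this would drive a visible indicator;
        # here we just record the state change.
        self.signal = signal
        self.log.append(signal.value)

    def record_frame(self):
        # The signal becomes visible BEFORE any capture happens.
        self._announce(Signal.RECORDING)
        # ... image capture would occur here ...

    def stop(self):
        self._announce(Signal.IDLE)

w = PeacockWearable()
w.record_frame()   # bystanders see "led solid red" before capture
w.stop()
```

The key design choice is ordering: the announcement precedes the action, so a bystander always has a chance to object before any data exists.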
He also suggests a “privacy dashboard”, like those already used in web browsers and phones. The idea would be to check with nearby people’s devices and honor their preferences. This approach has a lot of challenges, including the mere ability to recognize bystanders—and what if they don’t have a cooperating device?
This “solution” also requires action on the part of everyone, unless the default is set to “no permissions at all”. It seems far-fetched that everyone will take the time to fiddle with their permissions. More likely the permissions will simply be left off, and the genteel wearable will be rendered useless.
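The default-deny problem can be made concrete with a small sketch. Everything here is hypothetical: the `BystanderPreference` record and its fields are invented stand-ins for whatever a cooperating device might broadcast. The point is that a bystander with no device, or with nothing configured, yields no preference record at all, and the only safe interpretation of “no record” is denial.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical preference record, as might be broadcast by a
# bystander's cooperating device. Field names are invented.
@dataclass
class BystanderPreference:
    allow_capture: bool = False   # default-deny: no capture unless opted in
    allow_upload: bool = False    # uploading is a separate, stricter grant

def may_capture(pref: Optional[BystanderPreference]) -> bool:
    # A missing record (no cooperating device in range) must be
    # treated as a denial: the bystander never consented to anything.
    return pref is not None and pref.allow_capture

def may_upload(pref: Optional[BystanderPreference]) -> bool:
    # Upload requires both capture consent and upload consent.
    return may_capture(pref) and pref.allow_upload

# A bystander with no device produces no preference record:
assert may_capture(None) is False
assert may_capture(BystanderPreference(allow_capture=True)) is True
assert may_upload(BystanderPreference(allow_capture=True)) is False
```

Under this logic, if nearly everyone leaves the defaults alone, nearly every check returns “denied”, which is exactly why the wearable ends up useless in practice.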
I’ll add that this kind of privacy profile suffers from many fundamental logical flaws. For one thing, the semantics of the “permissions” are hazy at best. What exactly am I permitting? This is especially problematic because different people may have different interpretations of the settings, leading to misunderstanding and conflict.
Also, if the permissions are defined by the requesting device, then they don’t necessarily reflect the bystander’s interests at all. For that matter, most permissions should really be granted on a case-by-case basis: who, what, where, when, why. Who is going to deal with thousands of encounters, case by case? It is more logical to set “denied” and forget about it.
Worst of all, the bystander has to trust the invasive device to honor the permissions (whatever they mean). Since the wearable device is operating in its own interests and under the control of its user, should it be trusted to honor the interests of bystanders (assuming they are even understood)? Probably not.
This article raises lots of important challenges. But I don’t think the suggested solutions are viable.
As a historical note, we were thinking about these things long ago, before the iPhone and Android put everyone on the net all the time. We recognized these problems and considered these same solutions back then. They have not been adopted yet, which suggests they are not going to be.
The only solution I know of is to design devices to not be intrusive in the first place.
That hasn’t happened either.
- Ivo Flammer, “Genteel Wearables: Bystander-Centered Design.” IEEE Security & Privacy, 14(5):73–79, 2016.