A group of authors in the August 2014 issue of IEEE Computer outlines the pros, cons, and examples of proximity-sensing technology that initiates advertising or other actions, and may report your presence to some data-collection process. The article is called The Dark Patterns of Proxemic Sensing.
There are simple examples most folks have encountered: the faucet that turns on when you put your hands near it, followed by the automated hand dryer or paper towel dispenser. The paper identifies some current examples that many of us may not have encountered: the mirror that presents advertising, a wall of virtual “paparazzi” that flash cameras at you accompanied by cheering sounds, and urinals that incorporate video gaming. Some of these systems are networked, even connected to the internet. Some interact anonymously; others are at least capable of face or other forms of recognition.
The article identifies eight “dark” aspects of this proximity interaction:
- Captive Audience – the system can impose unexpected or undesired interactions in places the individual must go for other reasons.
- Attention Grabbing – detection and interaction allow these systems to distract the target individual, which may be problematic or merely annoying.
- Bait and Switch – initiating interaction with an attractive first impression, then switching to a quite different agenda.
- Making personal information public – for example, displaying or announcing your name upon recognition.
- We never forget – tracking an individual from one encounter to the next, even spanning locations for networked systems.
- Disguised data collection – quietly feeding (personalized) data back to some central aggregation point.
- Unintended relationships – inferring that the person next to you is related to you in some way; oh, there she is again next to you at a different venue…
- Milk factor – forcing a person to go through a specific interaction (move to a location, provide information …) to obtain the promised service.
Most of these are traditional marketing/advertising concepts, now made more powerful by automation and ubiquitous networked systems. The specific emerging technologies are one potentially disturbing area of social impact. A second is the more general observation that activities we have historically considered innocuous, or even desirable, may become more problematic with automation and de-personalization. The store clerk might know you by name, but do you feel the same way when the cash register or the automatic door knows you?