Computers as Bad Social Actors: Dark Patterns and Anti-Patterns in Interfaces that Act Socially
The Computers Are Social Actors paradigm suggests that people exhibit social and anthropomorphic biases in their treatment of technology. Such insights have encouraged the design of interfaces that interact with users in more social (chatty or even friend-like) ways. However, in typical 'dark pattern' fashion, social-emotional responses to systems as (seemingly sentient) agents can be harnessed to manipulate user behaviour. An increasingly common example is app notifications that adopt person-like tones to persuade or pressure users into compliance. Even aside from deliberate manipulation, difficulties in meeting contextual social expectations can make automated social acting seem rude, invasive, tactless, and even disrespectful, constituting 'social' anti-patterns. This paper explores ways to improve how automated systems treat people in interactions. We combined four qualitative methods to elicit user experiences and preferences regarding how automated systems "talk" to/at them. We identify an emerging 'social' class of dark and anti-patterns, and propose guidelines to help (social) interfaces treat users in more respectful, tactful, and autonomy-supportive ways.