Calm Tech Antipatterns
Definition
Antipatterns in Calm Technology are recurring design mistakes that disrupt natural attention patterns and create unnecessary cognitive load. They are common but problematic approaches that violate the principle that technology should work smoothly with human attention and awareness.
These antipatterns are particularly harmful because they:
- Create ongoing anxiety and stress
- Reduce trust in technology
- Prevent development of healthy attention patterns
- Increase cognitive load unnecessarily
- Interfere with natural task completion
- Degrade important signal channels
- Create learned helplessness in users
While patterns in Calm Technology help information move naturally between center and periphery of attention, these antipatterns force inappropriate attention shifts, create anxiety, or demand unnecessary focus.
1. Anxious Notifications
- Definition: Notifications that create ongoing anxiety through the mere possibility of their arrival.
- Example: Email notifications that appear at random throughout the day keep users in a constant state of anticipation, unable to focus fully on other tasks because they are always half expecting the next alert. The psychological cost lies not just in the interruption itself, but in the continuous state of waiting for interruption.
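One calmer alternative is to batch non-urgent notifications into a periodic digest, so there is no ambient expectation of interruption. The TypeScript below is a minimal illustrative sketch, not taken from any particular product; the DigestQueue class and the hourly cadence are assumptions.

```typescript
// Illustrative sketch: collect non-urgent notifications and deliver them as a
// scheduled digest instead of interrupting as each one arrives.

interface AppNotification {
  title: string;
  body: string;
  urgent: boolean;
}

class DigestQueue {
  private pending: AppNotification[] = [];

  constructor(
    private deliver: (batch: AppNotification[]) => void,
    intervalMs: number = 60 * 60 * 1000 // assumed cadence: at most once per hour
  ) {
    setInterval(() => this.flush(), intervalMs);
  }

  push(notification: AppNotification): void {
    if (notification.urgent) {
      this.deliver([notification]); // genuinely urgent items still pass straight through
      return;
    }
    this.pending.push(notification); // everything else waits for the digest
  }

  private flush(): void {
    if (this.pending.length === 0) return; // nothing new: stay silent
    this.deliver(this.pending);
    this.pending = [];
  }
}

// Usage: route incoming email alerts through the digest.
const digest = new DigestQueue((batch) =>
  console.log(`${batch.length} new message(s):`, batch.map((n) => n.title))
);
digest.push({ title: "Weekly newsletter", body: "Highlights", urgent: false });
```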
2. Attention Thievery
- Definition: Interfaces that hijack attention without user consent or clear purpose.
- Example: Autoplay videos that start when scrolling through an article, forcing users to actively hunt for and silence the unexpected audio. This pattern doesn't respect the user's current attention state or choice about where to focus.
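A less invasive approach, sketched below with standard browser APIs, is to autoplay only when the video is muted, visible, and the user has not asked for reduced motion. The function name and usage are illustrative assumptions.

```typescript
// Illustrative sketch: video plays only muted, only while on screen, and
// never when the user has asked the platform for reduced motion.

function setupCalmAutoplay(video: HTMLVideoElement): void {
  if (window.matchMedia("(prefers-reduced-motion: reduce)").matches) {
    return; // the user has already said what they want; respect it
  }

  video.muted = true; // never seize the audio channel unannounced

  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        void video.play().catch(() => {
          // Autoplay blocked by the browser: fall back to the user pressing play.
        });
      } else {
        video.pause(); // out of view means out of attention; stop
      }
    }
  });
  observer.observe(video);
}

// Usage: apply to every inline video on the page.
document.querySelectorAll("video").forEach((v) => setupCalmAutoplay(v));
```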
3. False Urgency
- Definition: Using urgent signals for non-urgent information.
- Example: A shopping app that sends "emergency" notifications about sales, degrading the meaning of urgency and training users to either be unnecessarily anxious or to ignore potentially important alerts. This crying-wolf pattern damages the trust relationship between user and technology.
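One mitigation is to separate message priority from delivery channel, so promotional content can never borrow the signal reserved for real emergencies. The tiers and channel settings below are illustrative assumptions, not a standard.

```typescript
// Illustrative sketch: each priority tier maps to a fixed delivery channel,
// and marketing content is capped at the quietest tier.

type Priority = "ambient" | "normal" | "critical";

interface Channel {
  sound: boolean;
  vibrate: boolean;
  interrupts: boolean; // may appear over whatever the user is doing
}

const channelFor: Record<Priority, Channel> = {
  ambient: { sound: false, vibrate: false, interrupts: false }, // badge or list only
  normal: { sound: false, vibrate: true, interrupts: false },   // quiet banner
  critical: { sound: true, vibrate: true, interrupts: true },   // reserved for emergencies
};

function notify(title: string, requested: Priority, isPromotional: boolean): void {
  // Promotional messages can ask for anything; they still get the ambient channel.
  const priority: Priority = isPromotional ? "ambient" : requested;
  console.log(`[${priority}] ${title}`, channelFor[priority]);
}

notify("Flash sale ends tonight!", "critical", true);  // delivered quietly
notify("Severe weather warning", "critical", false);   // allowed to interrupt
```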
4. Attention Splitting
- Definition: Systems that force unnecessary multitasking or context switching.
- Example: A car interface that requires navigating multiple levels of touch-screen menus to adjust basic functions like temperature, forcing drivers to split attention between the road and menu navigation when simple physical controls would be safer and more intuitive.
5. Modal Madness
- Definition: Overuse of modals or popups that demand immediate attention regardless of context.
- Example: Cookie consent popups that appear immediately upon visiting a website, interrupting the user's intended task and forcing an immediate decision about something that could be handled more gracefully or at a more appropriate time.
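A gentler pattern is a dismissible banner that appears after the reader has had a moment with the page and never blocks the content behind it. The sketch below uses plain DOM APIs; the five-second delay is an illustrative assumption.

```typescript
// Illustrative sketch: a non-blocking consent banner that waits briefly,
// instead of a modal that interrupts the page load.

function showConsentBanner(onChoice: (accepted: boolean) => void): void {
  const banner = document.createElement("div");
  banner.textContent = "This site uses cookies. ";

  const makeButton = (label: string, accepted: boolean): HTMLButtonElement => {
    const button = document.createElement("button");
    button.textContent = label;
    button.onclick = () => {
      onChoice(accepted);
      banner.remove(); // the decision is made once, then the banner gets out of the way
    };
    return button;
  };

  banner.append(makeButton("Accept", true), makeButton("Decline", false));
  document.body.append(banner); // sits at the edge of the page; nothing is blocked
}

// Delay the prompt so it never lands on top of the reader's first moments with the page.
window.setTimeout(() => {
  showConsentBanner((accepted) => console.log("consent:", accepted));
}, 5000);
```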
6. Information Flooding
- Definition: Overwhelming users with more information than they can process peripherally.
- Example: A "smart" dashboard that displays every possible metric simultaneously, preventing users from easily finding relevant information or maintaining peripheral awareness of important changes.
7. Forced Focus
- Definition: Technology that demands attention at inappropriate moments.
- Example: Software updates that force immediate action during focused work, disrespecting the user's current task and attention state when the update could easily wait for a natural break.
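A more respectful update flow waits for a sustained quiet period rather than interrupting active work. The idle detection below is an illustrative approximation of "a natural break", with an assumed ten-minute threshold.

```typescript
// Illustrative sketch: apply an update only after the user has been inactive
// for a while, instead of forcing a restart mid-task.

function applyUpdateWhenIdle(applyUpdate: () => void, idleMs = 10 * 60 * 1000): void {
  let idleTimer: ReturnType<typeof setTimeout> | undefined;

  const resetIdleTimer = (): void => {
    if (idleTimer !== undefined) clearTimeout(idleTimer);
    idleTimer = setTimeout(applyUpdate, idleMs); // act only after sustained inactivity
  };

  // Any interaction counts as "still working"; postpone quietly, never nag.
  for (const eventName of ["keydown", "pointerdown", "scroll"]) {
    window.addEventListener(eventName, resetIdleTimer, { passive: true });
  }
  resetIdleTimer();
}

applyUpdateWhenIdle(() => console.log("Installing update during a quiet moment."));
```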
8. Trust Breaker
- Definition: Systems that act unpredictably or contrary to user expectations.
- Example: AI assistants that make decisions without clear feedback or control, leaving users uncertain about what will happen next and unable to develop reliable mental models of system behavior.
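One way to preserve predictability is for the assistant to propose actions and act only after explicit confirmation, so the user always knows what will happen next. The ProposedAction shape and confirmation callback below are illustrative assumptions.

```typescript
// Illustrative sketch: the assistant describes what it intends to do and acts
// only after explicit confirmation, so behavior stays predictable.

interface ProposedAction {
  description: string; // shown to the user before anything happens
  execute: () => void;
}

async function runWithConsent(
  action: ProposedAction,
  confirm: (description: string) => Promise<boolean>
): Promise<void> {
  const approved = await confirm(action.description);
  if (!approved) {
    console.log(`Skipped: ${action.description}`); // visible feedback either way
    return;
  }
  action.execute();
  console.log(`Done: ${action.description}`);
}

// Usage with a stand-in confirmation step (a real UI would prompt the user).
void runWithConsent(
  { description: "Archive 42 old conversations", execute: () => {} },
  async (description) => {
    console.log(`Assistant proposes: ${description}`);
    return true; // pretend the user tapped "Confirm"
  }
);
```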
9. Attention Gambler
- Definition: Interfaces that create intermittent reinforcement patterns leading to compulsive checking.
- Example: Social media "pull-to-refresh" mechanisms that mirror slot machine mechanics, creating addiction-like patterns of attention and preventing healthy disengagement.
Further Reading
- Cooper, Alan. "The Inmates Are Running the Asylum" (1999)
- Harris, Tristan. "Time Well Spent" (2016)
- Norman, Donald. "The Design of Everyday Things" (1988)
- Weiser, Mark and Brown, John Seely. "The Coming Age of Calm Technology" (1996)