Essential Utility
Systems should optimize for user survival and autonomy. Features serve human need, not engagement metrics or corporate extraction.
What it is: Essential Utility means the system's reason for existence is to serve clearly identified human need. Every feature is evaluated against "does this help users survive or maintain autonomy?" Engagement metrics, retention mechanics, and monetization schemes that conflict with user welfare are rejected.
Why it matters: Many systems are designed to extract attention, data, or money from users — not to serve them. A vulnerable person cannot afford a tool that optimizes for something other than their survival. Essential Utility is a philosophical commitment: this system is for you, not through you.
When to use: Always. No system should sacrifice user welfare for any other goal. Systems serving vulnerable populations especially must be ruthlessly focused on essential use-cases.
Why Essential Utility Matters
Many contemporary systems optimize for metrics that harm users:
- Engagement metrics: "Time on app" incentivizes addictive design, not usefulness
- Attention capture: Dark patterns, notifications, auto-play video waste users' cognitive bandwidth and battery
- Monetization: Ads prioritized over utility; free services harvest users as product
- Lock-in: Features designed to prevent user departure, not to serve user goals
- Distraction: "Recommended for you" feeds pull users away from their intent
For privileged users, this might be tolerable. For vulnerable users — someone in crisis, with limited cognitive bandwidth, or using metered connectivity — distraction is not a luxury problem. It's a threat to survival.
Implementation Patterns
Define Essential Use Cases Explicitly
For each feature, ask: "Does this serve a survival or autonomy need?" If not, deprioritize.
- Crisis communication app: Essential: send/receive messages. Useful: offline messaging. Nice-to-have: stickers, filters. De-prioritize the nice-to-have.
- Medical device: Essential: take vital readings, alert on critical values. Useful: sync to record. De-prioritize: social sharing, gamification.
- Financial tool: Essential: check balance, transfer funds. Useful: budgeting advice. De-prioritize: investment recommendations, FOMO alerts.
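The triage above can be made concrete as a tiny classification pass over a feature inventory. The tiers and the feature list below are illustrative, not a prescribed schema:

```python
from enum import Enum

class Tier(Enum):
    ESSENTIAL = 1     # serves a survival or autonomy need
    USEFUL = 2        # supports an essential use case
    NICE_TO_HAVE = 3  # first to cut

# Hypothetical feature inventory for a crisis communication app.
FEATURES = {
    "send_message": Tier.ESSENTIAL,
    "receive_message": Tier.ESSENTIAL,
    "offline_queue": Tier.USEFUL,
    "stickers": Tier.NICE_TO_HAVE,
    "filters": Tier.NICE_TO_HAVE,
}

def prioritized(features: dict) -> list:
    """Feature names ordered essential-first; nice-to-haves land last."""
    return sorted(features, key=lambda name: features[name].value)

def cut_list(features: dict) -> list:
    """Features to de-prioritize: everything that is merely nice to have."""
    return [name for name, tier in features.items() if tier is Tier.NICE_TO_HAVE]
```

Running the triage on this inventory puts `send_message` and `receive_message` at the top and marks `stickers` and `filters` for de-prioritization. The point is not the data structure but the forcing function: every feature must be placed in a tier before it ships.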
Remove Dark Patterns
Systematically eliminate features designed to exploit:
- Dark patterns to audit:
  - Notifications that manipulate (FOMO, social pressure, urgency)
  - Infinite scroll that wastes time and battery
  - Hidden settings that default to the worst user outcome
  - Confirmations that trick users into sharing data
  - Paywalls hidden until after work is done
  - Friction to unsubscribe or delete account
- Replace with transparent design: what does this feature do, and is it optional?
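One way to make the audit repeatable is to encode it as a checklist that every design review walks through. The pattern names and questions below are an illustrative sketch, not a canonical taxonomy:

```python
# Hypothetical audit: each entry pairs a dark pattern with the question a
# design review should answer about the feature under review.
DARK_PATTERN_AUDIT = [
    ("manipulative notifications",
     "Does any notification use FOMO, social pressure, or false urgency?"),
    ("infinite scroll",
     "Can the user reach an end state, or does content load forever?"),
    ("hostile defaults",
     "Does any setting default to the worst outcome for the user?"),
    ("trick confirmations",
     "Could a dialog be read as consenting to data sharing the user did not intend?"),
    ("late paywall",
     "Is any cost revealed only after the user has invested work?"),
    ("exit friction",
     "Does unsubscribing or deleting the account take more steps than signing up?"),
]

def audit(answers: dict) -> list:
    """Return the names of patterns the review flagged (answer was True)."""
    return [name for name, _question in DARK_PATTERN_AUDIT if answers.get(name)]

# Example review outcome: two patterns flagged, both must be fixed before ship.
flags = audit({"late paywall": True, "exit friction": True})
```

A flagged pattern blocks the release; the transparent-design question ("what does this feature do, and is it optional?") is the acceptance criterion for the fix.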
Minimize Cognitive Load
Users in crisis have limited attention. Remove unnecessary choices:
- Standard workflows: "Send a message" should be one button, not a menu of options
- Smart defaults: Assume the most common, safest action. Expert users can override.
- No decision paralysis: If a user faces 20 options, they make worse decisions
- Plain language: Jargon burns cognitive cycles on parsing, not action
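Smart defaults can be expressed directly in a settings type whose zero-configuration state is the safest one. The settings and their defaults below are hypothetical examples for a crisis messaging app:

```python
from dataclasses import dataclass, replace

# Hypothetical settings: every default is the safest, most common choice,
# so a user under stress configures nothing at all.
@dataclass(frozen=True)
class Settings:
    end_to_end_encryption: bool = True   # safest default: always on
    read_receipts: bool = False          # don't leak presence by default
    media_autodownload: bool = False     # protect metered connections
    notification_previews: bool = False  # no message content on lock screen

def expert_override(base: Settings, **overrides) -> Settings:
    """Experts can opt into riskier behavior; the default path asks nothing."""
    return replace(base, **overrides)

safe = Settings()                                   # one-button path: all defaults
custom = expert_override(safe, read_receipts=True)  # explicit, deliberate override
```

Freezing the dataclass makes the override path explicit: the safe configuration cannot be mutated in place, so any departure from it is a deliberate act, not an accident.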
Measure Success by User Outcome, Not Engagement
Change your metrics:
- Bad metrics: DAU (daily active users), time on app, feature adoption
- Good metrics: Did user accomplish their goal? Did they need customer support? Did they leave the system satisfied or frustrated?
- For vulnerable users: Did they remain safe? Did they maintain autonomy? Did the system reduce crisis?
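The shift from engagement to outcome shows up in the metrics themselves. The session fields and metric names below are illustrative, assuming a log that records whether each session reached the user's goal:

```python
# Hypothetical session log: instead of summing time-on-app, record whether
# the user accomplished their goal and whether the system failed them.
sessions = [
    {"goal_reached": True,  "support_ticket": False, "minutes": 2},
    {"goal_reached": True,  "support_ticket": False, "minutes": 1},
    {"goal_reached": False, "support_ticket": True,  "minutes": 25},
]

def outcome_metrics(sessions: list) -> dict:
    n = len(sessions)
    return {
        # Good metric: did the user accomplish what they came to do?
        "goal_completion_rate": sum(s["goal_reached"] for s in sessions) / n,
        # Good metric: how often did the system fail them into support?
        "support_rate": sum(s["support_ticket"] for s in sessions) / n,
        # Time on app is reported only as a cost to minimize, never a win.
        "median_minutes_to_goal": sorted(s["minutes"] for s in sessions)[n // 2],
    }

metrics = outcome_metrics(sessions)
```

Note the inversion: in this framing a rising "minutes" number is a regression, whereas under engagement metrics it would be celebrated.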
No Ads or Invasive Monetization
Ads require attention capture and privacy invasion; they are never acceptable in systems serving vulnerable users:
- Fund via donations, grants, or mainstream pricing (subscription, one-time purchase)
- If free is required, fund transparently (nonprofit, philanthropic support)
- Never harvest user data to monetize
- If you monetize, be transparent about how, and give users a way to opt out without losing core service
Anti-Patterns (What to Avoid)
Engagement optimization: System optimizes for time-on-app before user goals; Facebook, TikTok, and Twitter exemplify this. Essential Utility is the opposite.
Consequence: Users lose attention to system goals. If the system serves a vulnerable population, this can be dangerous.
Lock-in: App makes unsubscribing difficult, hides the delete-account option, and uses notifications to pull users back.
Consequence: Users cannot leave even if the system no longer serves them. The relationship is coercive.
Surprise paywall: Core feature is free until deep engagement; then a paywall appears.
Consequence: Users invest time, then discover they must pay or abandon their work.
Addictive mechanics: Variable rewards, notifications, streak mechanics, and leaderboards designed to be habit-forming.
Consequence: Users in crisis cannot afford addictive distraction. The system is exploitative.
Covert data sale: System claims to be free but actually sells user data to advertisers.
Consequence: Users think they're using a tool; actually they're the product. This violates consent and trust.
Real-World Examples
Signal — Essential Messaging, Nothing Else
What it is: Encrypted messaging that does exactly one thing: let you communicate securely.
Why it's essential utility:
- No ads, no engagement metrics, no dark patterns
- Funded by donations and grants, not user data sale
- Open-source: users can verify nothing evil is hidden
- Feature decisions driven by security and privacy, not engagement
- Can delete account anytime; Signal deletes your data
Consequence: Dissidents, journalists, and abuse survivors trust Signal. It serves them, not itself.
Wikipedia — Knowledge Commons
What it is: Free encyclopedia, no ads, no engagement metrics.
Why it's essential utility:
- Funded by the nonprofit Wikimedia Foundation, not advertisers
- No algorithm to keep you scrolling
- Content is donated; system optimizes for accuracy, not virality
- Offline distribution; works without internet
Consequence: People worldwide depend on Wikipedia for genuine knowledge, not filtered information.
Tor Browser — Anonymity Utility
What it is: Browser that routes your connection through anonymization network.
Why it's essential utility:
- Funded by grants, not ads or data harvesting
- Open-source; security researchers can audit
- Designed to serve journalists, dissidents, and those under surveillance
- No dark patterns; does one thing well
Consequence: In hostile contexts, Tor is survival-critical.
Anti-Example: Facebook / Meta Ecosystem
The problem: Optimizes for engagement (time on platform) and monetization (ad targeting) before user welfare.
- Algorithm surfaces outrage-inducing content because emotion drives engagement
- Notifications designed to pull users back repeatedly
- Financial incentive to know everything about users (better ads)
- Hard to delete account; company resists user departure
Consequence: Billions of users, but system is not designed to serve them. It's designed to extract from them.
Anti-Example: Medical Startup with Dark Patterns
The scenario: Telehealth app that helps users book therapy sessions. Core value: connecting with mental health providers.
The dark pattern: App sends notifications: "You haven't booked a session in 3 days. Book now!" (FOMO), shows "X therapists are reviewing your profile" (social proof), makes unsubscribe hard.
Consequence: User in crisis feels more pressure, not less. System exploits their vulnerability for engagement.
Scope and Applicability
All systems should prioritize Essential Utility. This is not optional.
Systems serving vulnerable populations must be ruthless about it:
- Mental health, crisis support, suicide prevention tools
- Abuse-survivor resources, domestic violence support
- Medical devices, emergency communication
- Refugee and displacement support
- Privacy and security tools for dissidents and persecuted minorities
Even for non-vulnerable users, Essential Utility improves everything:
- Users accomplish goals faster
- System is more reliable (less complex = fewer bugs)
- Trust increases (no fear of dark patterns or data harvesting)
- Differentiation is clearer (while competitors optimize for engagement, you optimize for actual utility)
Synthesis Lineage: Disciplinary Roots
Essential Utility formalizes principles from ethics, design, and usability:
Design Ethics & Value-Based Design
Contemporary design ethics (Costanza-Chock, Schuler & Namioka) emphasizes that all design embodies values. Technology is never neutral. If you don't explicitly value user welfare, your system will value something else (profit, engagement, control).
Protective Computing applies: Choose human welfare as your primary value metric. Every other metric must serve this, not compete with it.
Dark Patterns & Deceptive Design
HCI research (Susser et al., Gray et al.) catalogs "dark patterns" — manipulative UI/UX designed to trick users. Essential Utility is the antidote: transparent design that respects user autonomy.
Protective Computing formalizes: Audit your product for dark patterns; eliminate them systematically.
Usability & Human-Centered Design
Foundational HCI work (Norman, Nielsen) teaches that well-designed systems match user goals. System complexity should serve user needs, not system convenience.
Protective Computing applies: This principle intensifies for vulnerable users, who cannot afford to put up with poor usability.
Technology & Social Justice
Critical technologists (hooks, Bietti, Zuboff) argue that technology is a site of power. Systems can liberate or oppress. Essential Utility is liberatory: technology designed for human autonomy, not control.
Protective Computing formalizes: Technology serving vulnerable populations cannot be neutral. It must actively resist oppression.
Relationship to Other Principles
Essential Utility is strengthened by:
- Degraded Functionality: Focusing on essential features naturally leads to graceful degradation. Non-essential features are the first to go under resource scarcity.
- Exposure Minimization: Collecting no unnecessary data also means no data to monetize or misuse. Aligns incentives away from exploitation.
- Reversibility: Undo/redo prevents mistakes, reducing user frustration. Complements simplified UI.
Next Steps
For system designers:
- Define essential use-cases explicitly. What problem does this system solve?
- List all features. For each, ask: "Is this essential, useful, or superfluous?" Cut the superfluous.
- Audit for dark patterns: notifications, friction to exit, surprise costs, addictive mechanics.
- Change success metrics from "engagement" to "user goal accomplishment."
- Make monetization transparent. If you charge, say so upfront. If you harvest data, say so.
- User test with actual target users. Do they accomplish their goal? Are they frustrated or satisfied?
- Commit to no dark patterns. Document and enforce this in design reviews.
Related Principles
- Reversibility: Undo/redo is essential utility; it reduces mistakes and frustration so users focus on goals, not system recovery.
- Exposure Minimization: Collecting no unnecessary data eliminates monetization conflict. System incentives align with user welfare.
- Local Authority: Local operation is essential; cloud-first architecture introduces dependency that compromises utility.
- Coercion Resistance: Preserves essential utility under threat; systems for vulnerable users cannot afford to optimize for anything else.
- Degraded Functionality: Essential features survive resource scarcity. Graceful degradation sustains utility under stress.
All principles working together: