Essential Utility

Systems optimize for users' survival and autonomy. Features serve human need, not engagement metrics or corporate extraction.

Definition Summary

What it is: Essential Utility means the system's reason for existence is to serve clearly identified human need. Every feature is evaluated against "does this help users survive or maintain autonomy?" Engagement metrics, retention mechanics, and monetization schemes that conflict with user welfare are rejected.

Why it matters: Many systems are designed to extract attention, data, or money from users — not to serve them. A vulnerable person cannot afford a tool that optimizes for something other than their survival. Essential Utility is a philosophical commitment: this system is for you, not through you.

When to use: Always. No system should sacrifice user welfare for any other goal; systems serving vulnerable populations must be especially ruthless about focusing on essential use-cases.

Why Essential Utility Matters

Many contemporary systems optimize for metrics that harm users: time on platform, ad impressions, retention, and data extraction rather than the user's actual goals.

For privileged users, this might be tolerable. For vulnerable users — someone in crisis, with limited cognitive bandwidth, or using metered connectivity — distraction is not a luxury problem. It's a threat to survival.

Implementation Patterns

Define Essential Use Cases Explicitly

For each feature, ask: "Does this serve a survival or autonomy need?" If not, deprioritize or cut it.
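The evaluation step above can be sketched in code. This is a hypothetical illustration: the feature names and the `serves_need` tag are invented, not from any real product.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: tag each feature with the user need it serves,
# then flag anything that serves no survival or autonomy need.
@dataclass
class Feature:
    name: str
    serves_need: Optional[str]  # e.g. "private communication", or None

features = [
    Feature("end-to-end encryption", serves_need="private communication"),
    Feature("message delivery", serves_need="communication"),
    Feature("animated sticker store", serves_need=None),
]

essential = [f.name for f in features if f.serves_need]
deprioritized = [f.name for f in features if not f.serves_need]
print("keep:", essential)
print("deprioritize:", deprioritized)
```

The point of forcing an explicit `serves_need` field is that "None" becomes visible in review, rather than hidden inside a roadmap.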

Remove Dark Patterns

Systematically eliminate features designed to exploit users: re-engagement notifications, friction on exit, surprise paywalls, streak and variable-reward mechanics, and undisclosed data monetization.
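A systematic audit can be as simple as a checklist run in review. In this sketch the pattern names mirror the anti-pattern list later in this document; the boolean answers are invented sample data and would come from an honest review of the actual product.

```python
# Hypothetical dark-pattern audit. True = the pattern is present.
DARK_PATTERNS = {
    "re-engagement notifications": False,
    "friction on unsubscribe or account deletion": False,
    "surprise paywalls": True,
    "variable-reward / streak mechanics": False,
    "undisclosed data monetization": False,
}

violations = [name for name, present in DARK_PATTERNS.items() if present]
if violations:
    print("audit FAILED:", ", ".join(violations))
else:
    print("audit passed: no known dark patterns")
```

Keeping the list explicit and versioned means every release can be checked against it, instead of relying on designers remembering the rules.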

Minimize Cognitive Load

Users in crisis have limited attention. Remove unnecessary choices: ship protective defaults and a single clear path to the core task, so nothing demands a decision before the user's goal is met.
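One way to remove choices is protective defaults: the safe configuration is the zero-decision configuration, and anything riskier requires explicit opt-in. The setting names below are illustrative, not from a real product.

```python
# Sketch of protective defaults: a user who touches nothing still gets
# the safe configuration. All names and values are hypothetical.
SAFE_DEFAULTS = {
    "encryption": "on",
    "notifications": "off",
    "data_sharing": "off",
}

def effective_settings(user_overrides=None):
    """Return the protective defaults, changed only by explicit opt-in."""
    settings = dict(SAFE_DEFAULTS)
    settings.update(user_overrides or {})
    return settings

print(effective_settings())                         # zero decisions required
print(effective_settings({"notifications": "on"}))  # opt-in, never opt-out
```

The design choice here is opt-in rather than opt-out: the cognitive cost of deviating from safety falls on users who can afford it, not on users in crisis.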

Measure Success by User Outcome, Not Engagement

Change your metrics: track whether users accomplish their goals, and how quickly, rather than how long you can hold their attention.
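The metric swap can be made concrete. This sketch computes goal-completion rate and time-to-goal from invented session records; under a time-on-app metric the longest session would look like the best one, when it is actually the worst.

```python
# Invented sample data: each record is one user session.
sessions = [
    {"goal_completed": True,  "seconds": 40},
    {"goal_completed": True,  "seconds": 65},
    {"goal_completed": False, "seconds": 300},  # long session, goal NOT met
]

completed = [s for s in sessions if s["goal_completed"]]
completion_rate = len(completed) / len(sessions)
mean_time_to_goal = sum(s["seconds"] for s in completed) / len(completed)

# Higher completion rate and LOWER time-to-goal are success; the
# 300-second failure would be a "win" only under engagement metrics.
print(f"goal completion rate: {completion_rate:.0%}")
print(f"mean time to goal: {mean_time_to_goal:.1f}s")
```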

No Ads or Invasive Monetization

Ads require attention capture and privacy invasion, which is never acceptable for systems serving vulnerable users. If the system must be funded, fund it transparently: upfront pricing or donations, never attention or data.

Anti-Patterns (What to Avoid)

❌ "Engagement-First" Product Design

The system optimizes for time-on-app before optimizing for user goals, as Facebook, TikTok, and Twitter do. Essential Utility is the opposite.

Consequence: Users' attention is diverted from their own goals to the system's. If the system serves a vulnerable population, this can be dangerous.

❌ Dark Patterns & Friction for Exit

App makes unsubscribing difficult, hides the delete-account option, and uses notifications to pull users back.

Consequence: User cannot leave even if system no longer serves them. Relationship is coercive.

❌ Surprise Paywalls

Core feature is free until deep engagement; then paywall appears.

Consequence: User invests time, then discovers they must pay or abandon work.

❌ Addictive Design (Slot Machine Mechanics)

Variable rewards, notifications, streak mechanics, leaderboards designed to be habit-forming.

Consequence: Users in crisis cannot afford addictive distraction. System is exploitative.

❌ Monetizing User Data

System claims to be free, but actually sells user data to advertisers.

Consequence: User thinks they're using a tool; actually they're the product. Violates consent and trust.

Real-World Examples

Signal — Essential Messaging, Nothing Else

What it is: Encrypted messaging that does exactly one thing: let you communicate securely.

Why it's essential utility: no ads, no tracking, no engagement mechanics; one function (secure communication), funded by donations through a nonprofit rather than by user data.

Consequence: Dissidents, journalists, and abuse survivors trust Signal. It serves them, not itself.

Wikipedia — Knowledge Commons

What it is: Free encyclopedia, no ads, no engagement metrics.

Why it's essential utility: no ads, no engagement optimization, no sale of reader data; a nonprofit, donation-funded model removes any incentive to capture attention.

Consequence: People worldwide depend on Wikipedia for genuine knowledge, not filtered information.

Tor Browser — Anonymity Utility

What it is: Browser that routes your connection through anonymization network.

Why it's essential utility: it exists solely to protect anonymity and resist surveillance; free, open source, and nonprofit-run, with no monetization of users.

Consequence: In hostile contexts, Tor is survival-critical.

Anti-Example: Facebook / Meta Ecosystem

The problem: Optimizes for engagement (time on platform) and monetization (ad targeting) before user welfare.

Consequence: Billions of users, but system is not designed to serve them. It's designed to extract from them.

Anti-Example: Medical Startup with Dark Patterns

The scenario: Telehealth app that helps users book therapy sessions. Core value: connecting with mental health providers.

The dark pattern: The app sends FOMO notifications ("You haven't booked a session in 3 days. Book now!"), displays "X therapists are reviewing your profile" as manufactured social proof, and makes unsubscribing hard.

Consequence: User in crisis feels more pressure, not less. System exploits their vulnerability for engagement.

Scope and Applicability

All systems should prioritize Essential Utility. This is not optional.

Systems serving vulnerable populations must be ruthless about it: every non-essential feature is a cost the user pays in attention, bandwidth, and risk.

Even for non-vulnerable users, Essential Utility improves everything: clearer products, lower cognitive load, and a relationship built on trust rather than extraction.

Synthesis Lineage: Disciplinary Roots

Essential Utility formalizes principles from ethics, design, and usability:

Design Ethics & Value-Based Design

Contemporary design ethics (Costanza-Chock, Schuler & Namioka) emphasizes that all design embodies values. Technology is never neutral. If you don't explicitly value user welfare, your system will value something else (profit, engagement, control).

Protective Computing applies: Choose human welfare as your primary value metric. Every other metric must serve this, not compete with it.

Dark Patterns & Deceptive Design

HCI research (Susser et al., Gray et al.) catalogs "dark patterns" — manipulative UI/UX designed to trick users. Essential Utility is the antidote: transparent design that respects user autonomy.

Protective Computing formalizes: Audit your product for dark patterns; eliminate them systematically.

Usability & Human-Centered Design

Foundational HCI work (Norman, Nielsen) teaches that well-designed systems match user goals. System complexity should serve user needs, not system convenience.

Protective Computing applies: This principle intensifies for vulnerable users, who cannot afford to put up with poor usability.

Technology & Social Justice

Critical technologists (hooks, Bietti, Zuboff) argue that technology is a site of power. Systems can liberate or oppress. Essential Utility is liberatory: technology designed for human autonomy, not control.

Protective Computing formalizes: Technology serving vulnerable populations cannot be neutral. It must actively resist oppression.

Relationship to Other Principles

Essential Utility is strengthened by the framework's other principles, each of which protects a dimension of user welfare.

Next Steps

For system designers:

  1. Define essential use-cases explicitly. What problem does this system solve?
  2. List all features. For each, ask: "Is this essential, useful, or superfluous?" Cut the superfluous.
  3. Audit for dark patterns: notifications, friction to exit, surprise costs, addictive mechanics.
  4. Change success metrics from "engagement" to "user goal accomplishment."
  5. Make monetization transparent. If you charge, say so upfront. If you harvest data, say so.
  6. User test with actual target users. Do they accomplish their goal? Are they frustrated or satisfied?
  7. Commit to no dark patterns. Document and enforce this in design reviews.
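The commitment in the final step can be encoded as a blocking gate rather than a convention. This sketch paraphrases the checklist above as review fields; the field names and values are hypothetical.

```python
# Hypothetical design-review gate: a release is blocked unless every
# item from the checklist has been signed off.
REVIEW_CHECKLIST = {
    "essential use-cases documented": True,
    "superfluous features cut": True,
    "dark-pattern audit passed": True,
    "metrics track user goals, not engagement": True,
    "monetization disclosed upfront": True,
    "tested with actual target users": True,
}

failed = [item for item, ok in REVIEW_CHECKLIST.items() if not ok]
assert not failed, f"design review blocked: {failed}"
print("design review: approved")
```

Making the gate fail loudly (an assertion, not a warning) is the enforcement the step calls for: the commitment holds even when a deadline argues otherwise.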

Related Principles

All of the principles work together; each reinforces Essential Utility's focus on serving the user.