dontsurveil.me

Threat vectors

The map

Multiple threat vectors. One pattern.

A "threat vector" is the path a surveillance harm takes to reach you. They look unrelated on the surface. The shape underneath is the same.

The shift, in one move

Old default

Private unless someone gets a warrant.

New default

Visible unless you take steps to protect yourself.

Encryption mandates

State forces providers to engineer access to encrypted communications

Laws that require messaging apps, cloud storage, and other services to build a "second key" — a way for state actors to read communications that are otherwise end-to-end encrypted. The harm is the architecture itself: once the backdoor exists, it doesn't only get used by the police it was built for. Salt Typhoon proved that.
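A toy sketch of why the "second key" is an architectural harm, not just a policy one. This is not real cryptography (it uses a deliberately insecure stdlib-only stream cipher, and all names are invented); the point is the last line: an escrowed copy of the key decrypts exactly as well as the original, no matter who ends up holding it.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustration only -- NOT secure, never use for real messages."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# End-to-end case: one session key, known only to the two endpoints.
session_key = secrets.token_bytes(32)
ciphertext = keystream_xor(session_key, b"between you and your sister")

# A "second key" mandate: the provider must also preserve access for
# the state. Modelled here, crudely, as keeping a copy of the key.
escrow_copy = session_key  # the architectural change, in one line

# The intended recipient decrypts...
assert keystream_xor(session_key, ciphertext) == b"between you and your sister"
# ...and so does ANYONE who obtains the escrow copy -- the police it
# was built for, a contractor, or an intruder who breaches the store.
assert keystream_xor(escrow_copy, ciphertext) == b"between you and your sister"
```

The code never distinguishes *who* presents the key; neither does the math. That is why "lawful access only" is a policy promise layered over an architecture that cannot enforce it.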

What it feels like You text your sister about something hard — a diagnosis, a divorce, a kid in trouble. You assume the conversation stays between you. After this, it might not. Someone in a Crown attorney's office, eventually, could pull up the thread. You'd never know they did.

Types of action

  • Political: Sign open letters opposing the bill. Contact your MP. Push for explicit no-backdoor language in committee.
  • Personal: Use end-to-end encrypted messengers (Signal, iMessage). Turn on iCloud Advanced Data Protection. Enable encrypted backups.
  • Collective: Support EFF, OpenMedia, Internet Society, Mozilla — orgs litigating and lobbying.

Bulk metadata retention

Providers must keep records of who-talks-to-whom, on everyone, for months or years

Laws that force providers to retain communications metadata — who messaged whom, when, from where, on what device — for fixed periods, regardless of suspicion. Metadata patterns are often more revealing than content; knowing who called a divorce lawyer at 9pm, a moving company at 10am, and a mother in between is enough to know what's happening in someone's life.
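A minimal sketch of the inference the paragraph describes, using entirely hypothetical call records (the timestamps and contact labels are invented). No content is ever examined; sorting and counting the metadata is the whole "analysis."

```python
from collections import Counter
from datetime import datetime

# Hypothetical retained metadata: timestamp and callee only -- no content.
call_log = [
    (datetime(2024, 3, 1, 21, 4),  "family lawyer"),
    (datetime(2024, 3, 2, 10, 12), "moving company"),
    (datetime(2024, 3, 2, 10, 40), "mother"),
    (datetime(2024, 3, 9, 2, 13),  "crisis line"),
    (datetime(2024, 3, 16, 2, 41), "crisis line"),
]

# Who gets called in the small hours, and how often?
late_night = Counter(
    callee for when, callee in call_log if when.hour < 5 or when.hour >= 23
)

# The ordered sequence of daytime contacts -- a narrative by itself.
daytime = [callee for when, callee in sorted(call_log)
           if 5 <= when.hour < 23]

print(late_night.most_common())  # [('crisis line', 2)]
print(daytime)                   # ['family lawyer', 'moving company', 'mother']
```

Five rows, ten lines of code, and a life event is legible. Now scale the log to an entire population retained for years.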

What it feels like Nobody is reading your messages. They don't need to. Someone could lay out exactly who you talked to at 2am every Saturday for the past year. Your bank. Your therapist. The number you only called once. The pattern tells a story you didn't agree to write.

Types of action

  • Political: Push for narrowly scoped preservation orders, not blanket retention. Demand sunset clauses.
  • Personal: Use messengers that minimize metadata (Signal logs almost nothing). Use disappearing messages.
  • Cultural: Make the metadata-vs-content distinction visible — most people still hear "we don't read your messages" as a privacy guarantee.

Border & device search

State agents inspect phones and laptops at borders, airports, and immigration checkpoints — usually without a warrant

The encryption on your phone doesn't help when a customs officer can compel you to unlock it. Border-search regimes vary, but most Five Eyes countries permit some level of warrantless device inspection at entry points. It falls hardest on travelers from over-policed countries, refugees, and people whose work makes them subjects of state interest.

What it feels like You land after a long flight. The officer asks for your phone and your password. Saying no means a longer interview, possibly missing your connection, possibly being sent back. So you unlock it. They see whatever's on it — texts, photos, what you've been searching, who you've been talking to. The encryption you set up at home doesn't follow you across that line.

Types of action

  • Personal: Travel with a burner device. Power down phones at borders. Use full-disk encryption (e.g., FileVault). Know your rights to silence and counsel.
  • Political: Push for warrant requirements before device searches. Limit "exceptional" no-warrant inspections.
  • Educational: Share know-your-rights guides (CCLA, EFF border-crossing materials).

Covered by

  • No PSA yet.

Surveillance commerce

Your behavior is a product — collected by platforms, resold by data brokers, purchased by anyone with a budget

No law required, no warrant required. The advertising/data-broker ecosystem collects and sells location, behavior, demographic, and inferred-trait data at scale. Police, lawyers, employers, foreign governments — anyone with the money — can buy what citizens never knowingly opted into providing. For most people, this is the biggest day-to-day surveillance footprint.

What it feels like Your phone knows you're pregnant before your family does — because the app you downloaded sold the signal to a broker, who sold it to an advertiser, who sold a slightly different version to your insurer. Nobody asked. Nobody told you. You agreed once, four years ago, scrolling past a screen you didn't read.

Types of action

  • Personal: Use a privacy-respecting browser (Firefox, Brave). Block ad-tech. Refuse cookies. Delete data-broker records (DeleteMe, Privacy Duck).
  • Political: Push for comprehensive privacy reform (PIPEDA update in Canada, federal privacy law in the US). Ban government purchase of broker data.
  • Cultural: Make the invisible visible — share what gets collected and who buys it.

Covered by

  • No PSA yet.

State-grade spyware procurement

Governments buy off-the-shelf mass-surveillance tools from private vendors — no new law needed

Distinct from encryption mandates in that it bypasses the legislative process entirely. Government agencies acquire commercial spyware (Pegasus, Cellebrite, Clearview AI), IMSI catchers (Stingrays), and face-recognition platforms through procurement contracts. No parliamentary debate, no public scrutiny, no warrant requirement built into the purchase itself.

What it feels like Your phone on the nightstand can become a microphone without a button being pressed. Not by a teenage hacker — by a vendor a government department bought a license from, under a procurement contract that never made the news. If you're a journalist, an activist, an ex of someone with connections, it might already have happened. You wouldn't know.

Types of action

  • Political: Demand procurement transparency. Push for parliamentary oversight of surveillance-tool purchases. Sanction abusive vendors.
  • Personal: Maintain device hygiene. Use Lockdown Mode (iOS) if at elevated risk. Watch Citizen Lab and Amnesty reports.
  • Collective: Support investigative journalism that exposes contracts and uses.

Covered by

  • No PSA yet.

Platform compulsion

Laws that force platforms to identify users, censor content, or moderate at scale

Surveillance dressed as safety. State-mandated content takedowns, age-verification gates, real-name policies, and deplatforming rules. Often arrives framed as child protection or anti-misinformation. The result is the same: the platforms become arms of state speech enforcement and identity verification.

What it feels like You go to upload something — a photo, an essay, footage of a protest. The platform asks for your government ID first. You scan it. The platform now has your face, your address, and a record of what you tried to say. The site is "safer." So is the file on you: a permanent entry in the record of who said what.

Types of action

  • Political: Oppose age-verification mandates that require ID disclosure. Demand narrow takedown processes with judicial review.
  • Personal: Use pseudonymous accounts where lawful. Prefer federated and decentralized platforms (Mastodon, Bluesky's AT Protocol) where moderation can't be centrally compelled.
  • Cultural: Push back on "safety" framing that conflates surveillance with protection, especially around children.

Cross-border data sharing

Legal architecture that lets one country's authorities reach into another's data

Mutual legal-assistance treaties (MLATs), CLOUD Act executive agreements, and bilateral arrangements that make one country's subpoena power effective inside another. A force multiplier on every other vector: a backdoor mandated in Country A becomes accessible to Country B through a treaty, and the public conversation in either country may never surface what's happening in the other.

What it feels like You're a Canadian using a U.S. service. Or an Australian using a Canadian one. The terms of service mention data sovereignty in passing. None of that protects you when a treaty quietly signed between two governments lets one of them reach into the other's servers — and ask about you — without your country's courts ever weighing in.

Types of action

  • Political: Demand parliamentary review of every new bilateral or executive agreement. Insist on transparency reports.
  • Personal: Pick providers in jurisdictions with stronger data-protection regimes (e.g., Swiss, German hosting).
  • Educational: Track which agreements your government is negotiating — most never make the news.

Algorithmic decision-making

Machine systems decide credit, hiring, immigration, policing, sentencing — using the same data the other vectors collect

This is where the data collected by every other vector goes to be acted on. Opaque scoring systems decide who gets a loan, a job, an apartment, a visa, a longer prison sentence. The harm is the inability to know what data was used, what weights were applied, or how to contest the result. A meta-vector: it doesn't add new data collection; it weaponizes the data already collected.
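A deliberately tiny sketch of the contrast between an opaque decision and what a "right to explanation" minimally demands. Every feature name, weight, and threshold here is invented for illustration; real systems are vastly larger, but the asymmetry is the same: the operator can always compute the second function, and today mostly ships only the first.

```python
# Hypothetical linear scoring model -- all names and numbers are invented.
WEIGHTS = {
    "years_at_address": 0.8,
    "postal_code_risk": -2.5,   # proxy features like this smuggle in bias
    "broker_purchase_score": -1.2,
}
THRESHOLD = 1.0

def decide(applicant: dict) -> bool:
    """The opaque version: a yes/no with no reasons attached."""
    score = sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)
    return score >= THRESHOLD

def explain(applicant: dict) -> list:
    """The contestable version: each feature's contribution to the
    score, sorted so the most damaging factor comes first."""
    contributions = [(f, WEIGHTS[f] * applicant[f]) for f in WEIGHTS]
    return sorted(contributions, key=lambda fc: fc[1])

applicant = {"years_at_address": 2,
             "postal_code_risk": 1,
             "broker_purchase_score": 1}

print(decide(applicant))   # False -- denied, no person to call
print(explain(applicant))  # the same denial, decomposed feature by feature
```

Nothing about the explanation is technically hard; withholding it is a choice, which is why the remedy is legal ("right to explanation," human review) rather than algorithmic.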

What it feels like Your loan gets denied. Your visa stalls. Your job application disappears into a portal. No person to call. No reason given. A score, a model, a weighted thing you'll never see — decided. The data feeding it came from places you never knowingly agreed to be tracked by; the consent, if anyone produces it, was a checkbox on a screen you scrolled past four years ago. Consent stretched that thin isn't really consent at all.

Types of action

  • Political: Push for "right to explanation," human-in-the-loop requirements, algorithmic accountability laws, and bans on automated decisions in high-stakes domains.
  • Personal: When affected, ask what data the decision used and request human review. Document outcomes for advocacy use.
  • Collective: Support orgs investigating algorithmic harm (AI Now Institute, Citizen Lab, Data & Society).

Covered by

  • No PSA yet.

Financial surveillance

Payments infrastructure as observation infrastructure — including central bank digital currencies designed with programmable controls

Money is increasingly a surveillance signal. Card networks already log every purchase; CBDCs (central bank digital currencies) being designed now can add programmable conditions, time limits, and category restrictions to money itself. Combined with surveillance commerce and cross-border data sharing, financial surveillance becomes the most granular real-time map of behavior governments have ever had.

What it feels like Every coffee you buy, every donation you make, every magazine subscription — already a row in someone's database. The next step is money you can spend only on what's allowed, only where it's allowed, only by people the system approves. The infrastructure is being designed now. The public conversation about it isn't happening in your language.

Types of action

  • Political: Demand privacy-preserving design in CBDC consultations. Push for cash-equivalent privacy as a non-negotiable design requirement.
  • Personal: Use cash where legal and practical. Understand what your bank reports automatically.
  • Cultural: Make the programmability tradeoffs visible — most public CBDC debate ignores them.

Covered by

  • No PSA yet.

How to read this map

The vectors aren't ranked; they're a set of distinct mechanisms, and any one of them can be the entry point for the next.

If you know an emerging threat that fits one of these vectors and isn't announced yet, tell us about it →