How to See Through the Lies: A Digital Survival Guide

Yesterday, I spoke to 35 young people from across Europe.

It was part of an EU-funded programme called Facts Over Fiction, focused on fake news and disinformation. Preparing for it made me rethink some of my core ideas on digital literacy and critical thinking—the same ones I share with activists and political movements.

The audio of the talk is below (reasonable quality!), the slides I used are here, and scroll down for a short summary of the transcript (with links to sources). A video will be out soon.

Think For Yourself: Digital Literacy and the Fight for Critical Thinking

I think these are useful insights for any activist or news consumer navigating today’s information war. As ever, if you have thoughts on these too, let me know.

cheers

Mehran


Who I Am

Thank you for having me. I’m a political consultant, and I help activists challenge power, balancing ethics and strategy. My clients include NGOs, political movements, and individual activists, and it’s from this perspective that I approach the challenge of media literacy and critical thinking.

Today, I’ll share some principles I’ve learned that have helped in my work.

  • First, I’ll outline the problem.
  • Then, I’ll explain why the mainstream solution is wrong.
  • Finally, I’ll share what actually works.

What’s at Stake?

Let’s start with a real-world example: the Rohingya massacre in 2017.

In Myanmar, a country the size of France, the military launched a brutal crackdown on the Rohingya minority.

  • 700,000 people fled to Bangladesh.
  • Villages were burned, women were raped, and tens of thousands were killed.
  • This didn’t start with guns—it started on social media.

For months, nationalist groups flooded Facebook with hate speech, calling the Rohingya “maggots” and “rapists” who should be exterminated. Facebook, the country’s main news source, amplified the content, helping fuel the genocide. Facebook later admitted its failure, but by then, the damage was done.


1. The Problem: Lies and Their Consequences

Lies Are Everywhere

Anyone can publish anything. We’re flooded with:

  • Scams, misinformation, deepfakes, AI-generated propaganda.

And AI is only making it worse—soon, we'll be drowning in machine-generated rubbish.

Some lies are harmless:

  • The Earth is flat.
  • The Great Wall of China is visible from space.
  • COVID was spread by 5G.

Others are deadly:

  • The Rohingya genocide, fueled by Facebook hate speech.
  • WhatsApp rumours in India (2018) led to 23 mob lynchings over false child-kidnapping claims.

Then there’s another category: lies from those in power.


Establishment Lies Are Worse

The Iraq War (2003) was built on a lie.

  • The U.S. and U.K. claimed Saddam Hussein had WMDs—he didn’t.
  • Media repeated the government line without evidence.
  • The war killed hundreds of thousands of civilians, shattered the Iraqi state, and helped create ISIS.
  • No one was held accountable—not Bush, Blair, or the journalists who sold the war.

Iraq is not the only example. Every time authorities are caught lying, trust collapses. That's dangerous.


2. The Wrong Solutions: Censorship & Control

Censorship Doesn’t Work

  • “Don’t worry, we’ll remove the lies for you.”
  • Examples:
    • Hunter Biden Laptop Story—censored in 2020, later confirmed as true.
      • Just before the 2020 U.S. election, a New York Post report revealed emails from Hunter Biden’s abandoned laptop suggesting his father, Joe Biden, had knowledge of his dubious business dealings abroad.
      • Facebook and Twitter suppressed the story, claiming it was Russian disinformation. The New York Post was locked out of its account for reporting it.
      • After the election, major media outlets and U.S. intelligence confirmed the laptop was real—raising questions about whether censorship influenced the election outcome.
    • Germany’s ban on Palestine protests (2023)—framed as hate speech, but looks like censorship.

Why Censorship is Dangerous

  1. Power always abuses it—governments and corporations decide what you can see.
  2. It doesn’t stop bad ideas—it makes them more attractive. This is called the Streisand Effect.
    • In 2003, singer Barbra Streisand tried to suppress an aerial photo of her house, filing a lawsuit to have it removed from the internet.
    • Before the lawsuit, the photo had been downloaded just six times (two of those by her own lawyers). After news of the lawsuit spread, it was viewed over 420,000 times in a month.
    • The more you try to silence something, the more people want to see it.
  3. Who decides what gets censored? Once censorship tools exist, they’re never just used for extreme cases.

Algorithms & Fact-Checking Trap You in a Bubble

  • Algorithms: “Don’t worry, we’ll curate your online experience for you.”
  • Fact-checking: “Don’t worry, we’ll correct the lies for you.”

Why This is a Problem

  1. Algorithms narrow perspectives. They trap you in echo chambers, making you easier to manipulate.
  2. Fact-checkers reflect establishment biases. They are not neutral.
  3. Community Notes (X's crowd-sourced fact-checking) is better, but it can still be gamed by interest groups.

The real issue? These “solutions” don’t empower you.

They infantilise you.


3. The Only Real Solution: Strengthening Your Digital Immune System

Shift Your Mindset

  1. Accept that deception is part of the internet. With AI, it’s only getting worse.
  2. Accept that all journalism is advocacy. Every choice of which facts to report, and which to leave out, creates a narrative.
  3. Accept that experts are more biased than you think. A UCL study found experts reinforce their biases and are less likely to change their minds than non-experts.

Your Critical Thinking Toolkit

  1. Find multiple sources. If you only read one viewpoint, you’re being fed a narrative. Use expert input, but don’t outsource your thinking.
  2. Recognise manipulation. Ask: What are they trying to make me feel?
  3. Use probabilistic thinking. Don’t rush to pick a side—evaluate the likelihood of competing claims.
  4. Be unoffendable. It's a superpower: if they can't offend you, they can't manipulate you.
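Probabilistic thinking can be made concrete with a toy Bayesian update. This is only a sketch—the claims, priors, and likelihoods below are invented for illustration—but it shows the habit: instead of picking a side, assign rough probabilities and revise them as evidence arrives.

```python
# Toy Bayesian update for weighing two competing claims.
# All the numbers here are invented for illustration.

def update(prior, likelihoods, evidence):
    """Return posterior probabilities after observing `evidence`.

    prior: dict mapping claim -> P(claim)
    likelihoods: dict mapping claim -> {evidence: P(evidence | claim)}
    """
    unnormalised = {c: prior[c] * likelihoods[c][evidence] for c in prior}
    total = sum(unnormalised.values())
    return {c: p / total for c, p in unnormalised.items()}

# Two competing claims about a viral story, starting at 50/50.
prior = {"story is true": 0.5, "story is false": 0.5}

# How likely is a second independent confirmation under each claim?
likelihoods = {
    "story is true":  {"independent source confirms": 0.8},
    "story is false": {"independent source confirms": 0.2},
}

posterior = update(prior, likelihoods, "independent source confirms")
# P(story is true) rises from 0.50 to 0.80 — stronger, but still not certainty.
```

The point is not the arithmetic but the posture: new evidence moves your confidence by degrees, rather than flipping you between "true" and "false".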

Escape the Algorithm

  1. Use RSS readers (like Unread or Readwise Reader) to curate your own news.
  2. Follow voices you disagree with. Use the Follow tab on X, not “For You.”
  3. Use independent search engines (like Kagi) instead of Google.
  4. Be sceptical of AI answers—always check the sources.
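Curating your own news with RSS needs no special tooling: a feed is just XML, so nothing stands between you and the headlines. A minimal sketch using only the Python standard library—the feed content below is made up, standing in for what you would fetch from a real feed URL with `urllib.request`:

```python
import xml.etree.ElementTree as ET

def headlines(rss_xml):
    """Extract (title, link) pairs from an RSS 2.0 feed string."""
    root = ET.fromstring(rss_xml)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

# A made-up feed for illustration; in practice this string would come
# from urllib.request.urlopen(feed_url).read() for each feed you follow.
sample = """<rss version="2.0"><channel>
  <title>Example Feed</title>
  <item><title>Story one</title><link>https://example.org/1</link></item>
  <item><title>Story two</title><link>https://example.org/2</link></item>
</channel></rss>"""

for title, link in headlines(sample):
    print(title, link)
```

Loop this over a handful of feeds you chose yourself—including ones you disagree with—and you have a reading list no recommendation algorithm has touched.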

The Rohingya and Why This Matters

We started with the Rohingya genocide, because it shows what happens when:

  • Hate speech is amplified by algorithms.
  • No one questions the dominant narrative.
  • A tech giant refuses to act.

The same system that fueled that genocide shapes what you see online today.

Don’t let an algorithm think for you. Don’t let power dictate your reality.

Curate your own sources. Challenge your own beliefs. Think critically.

Use your mind, and trust others to do the same.