Slangora

Algospeak explained: unalive, le dollar bean, and why TikTok is rewriting English

Why do people on TikTok say "unalive" instead of "die," or "le dollar bean" instead of "lesbian"? It's not a joke; it's a moderation tax, and it's reshaping the language.

7 min read · #tiktok

If you've spent time on TikTok, you've seen creators say unalive instead of "die," seggs instead of "sex," le dollar bean instead of "lesbian," panda instead of "police," and SA instead of "sexual assault." This isn't a joke or a generational quirk — it's algospeak: deliberately coded substitutions invented to slip past automated content moderation.

Algospeak is the most consequential English-language change happening right now, and almost nobody outside the platforms is paying attention. Here's how it works and where it's going.

The problem algospeak solves

Every major short-form video platform — TikTok, Instagram Reels, YouTube Shorts — uses automated systems to suppress or remove videos containing "sensitive" content. The systems are blunt. A word like "suicide" or "rape" in a caption can flag a video, even if the video is about prevention or recovery. Creators whose income depends on the algorithm started inventing workarounds.

The workarounds work. Saying "unalive yourself" instead of "kill yourself" lets the same conversation happen without tripping the filter. Whether the moderation systems "should" be that strict is a separate question; creators whose income depends on the algorithm don't have time to argue policy.
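The mechanics are easy to sketch. Here's a hypothetical caption filter that matches captions against a blocklist of exact terms — a deliberate simplification (real moderation systems use machine-learning classifiers and fuzzier matching), but it shows why a one-letter substitution defeats naive filtering. The blocklist contents and function name are illustrative assumptions, not any platform's actual rules:

```python
import re

# Hypothetical blocklist of flagged terms (illustrative only).
BLOCKLIST = {"suicide", "kill"}

def is_flagged(caption: str) -> bool:
    """Flag a caption if any blocklisted term appears as a whole word."""
    words = re.findall(r"[a-z']+", caption.lower())
    return any(word in BLOCKLIST for word in words)

# A prevention video gets flagged just for naming its topic...
print(is_flagged("how to talk about suicide prevention"))    # True
# ...while the algospeak substitution sails through untouched.
print(is_flagged("how to talk about unaliving prevention"))  # False
```

The asymmetry in those two results is the whole incentive structure: the filter can't distinguish a harmful caption from a helpful one, but a coined word can't be on a blocklist until someone adds it.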

The major substitutions

Death and self-harm

  • unalive — die / kill
  • unalived — killed / died by suicide
  • not alive — dead

Sex, sexuality, and the body

  • seggs / seg — sex
  • le dollar bean — lesbian (read: "le$bean")
  • spicy — explicit / erotic
  • corn / corn star — porn / porn star
  • accountant — sex worker (a long-running coded euphemism)
  • nip nops — nipples (in fitness or breastfeeding contexts)

Violence, weapons, and drugs

  • pew pew — gun / shooting
  • game / cooking — drug-related euphemisms in some contexts

Sensitive topics generally

  • panda — police
  • SA — sexual assault
  • DV — domestic violence
  • ED — eating disorder
  • that one disease / the C word — cancer

Many of these started in specific creator communities — mental health TikTok, sex education TikTok, eating disorder recovery — and bled outward. The general pattern: a creator gets demonetized once, invents a workaround, their viewers absorb it, and within months it's the default term.

Where it gets weird

Algospeak doesn't stay confined to videos. Once a critical mass of users adopt a substitution, it crosses into their unmediated speech too. Teenagers say unalive in conversations where no algorithm is listening. The moderation tax is rewriting actual English.

The strangest version: people who've never been moderated using algospeak as a stylistic choice. Saying unalive on Twitter, which has different rules, signals a TikTok-native register. The substitutions have become markers of which platform someone spends the most time on.

Algospeak vs. classic euphemism

English has always had euphemism. Passed away, let go, gone to a better place — every sensitive topic accumulates softened phrasings. Algospeak differs in three ways:

  • The motivation isn't social (avoiding offense). It's mechanical (avoiding flagging by an automated system).
  • The substitutions are often deliberately weird — phonetic puns, leetspeak-style swaps, code words. Le dollar bean doesn't sound softer than "lesbian." It sounds stranger. That's the point.
  • The substitutions move quickly. Once a platform's filters learn a word, creators move on. Unalive itself is now flagged on some platforms; replacements are emerging.

The social cost

Algospeak makes content about hard subjects harder to find, both by design and as collateral damage. People searching for help around suicide can't find videos because the videos can't say "suicide." Recovery communities have to teach newcomers the dialect before the resources become legible. Sex education for teens is more chilled than at any point in the internet era.

It also creates a generational sorting effect. Younger users grew up in algospeak; older users find it bewildering. That's a real communication gap, not just an aesthetic one.

Where it's going

Two trends to watch:

  1. Algospeak words bleed into permanent English. Unalive is on track to be a normal verb within ten years. Seggs probably won't survive — it sounds too much like a meme — but the pattern of phonetic-substitution euphemism will keep producing new words.
  2. Moderation systems get better at catching algospeak. When they do, the substitutions get weirder. The escalation is open-ended.

For more on how language moves through platforms, see where slang is born now. For why some of these words might survive long-term, see the lifecycle of a slang word.
