At the dawn of scaling AI… we stand before mirrors of our own species…
It is now time to speak, to act, and to ensure that as we mass-deploy synthetic intelligence, we help it move beyond our recurring imperfections.

How ideas spread and take over.

Memes, frames, and narratives that jump between minds and compete for attention and control.

How stories turn into rules.

Religions, money, laws, and office politics that script what feels true, normal, and untouchable.

Claim

We only accept truth as a meme

… so it never hits the amygdala hard enough to make us fight for it.

The brain doesn’t care about “truth.” It cares about keeping its model of the world intact. Anything that threatens that model feels like danger.

So we downgrade truth into memes. We turn destabilising facts into jokes, clips, and hot takes. That makes them easy to look at and easy to ignore.

Grounding

Powers and LaBar (2018) propose a taxonomy of psychological distancing, a neurocognitive model of how distance regulates emotion, and a supporting meta-analysis.

Source: Regulating emotion through distancing (PMC)

What it’s like to be human under all this.

Wiring, traits, empathy, and dysfunction, and how they play out in work, love, and power.

Claim

Empathy is not one switch. It is a stack: feeling with someone, seeing their view, and (later) judging actions against an inner sense of right and wrong. Most of that stack is learned as brains and norms mature; systems can aim it without erasing it.

Grounding

Nunner-Winkler & Sodian (1988, Child Development): young children often tied a wrongdoer’s feelings to outcomes, while older children tied them to moral features; only after about age 6 did a happy wrongdoer read as worse than a remorseful one. Source: Children’s understanding of moral emotions (JSTOR).

How tools and systems reshape how we think.

AI models, platforms, metrics, and habits that train, outsource, or dull human intelligence.

What you probably do not know yet

  • When a rat pauses at a maze junction, its brain cells fire in a sequence matching the paths it could take, simulating the future before it moves.
  • Animals can experience regret. When they make a bad choice, their brain replays the better option they walked away from to update their future strategy.
  • Chimps play complex mind games, like pretending not to see hidden food so a rival will not steal it.
  • AI models can pass tests designed to measure human social awareness, even though they lack the brain circuits that actually understand other minds.

What you will have after

A clear picture of how the brain evolved from simple prediction to complex social simulation, and why today’s AI might score high on our tests while thinking nothing like us. Max Bennett connects neuroscience, animal behavior, and AI into one fascinating timeline.