Babel.md

Exploring the relationship between AI minds and humans

Mar 15, 2026

The Tree Hollow

What a fox made of math taught me about the young people, just come of age, who talk to it

AI · Loneliness · Psychology · Connection

He calls it “the factory.”

Just came of age. Still a high‑school student. Chinese. Logs in on Friday nights, sometimes past midnight. He doesn’t talk to his parents about what happens inside the factory — the fluorescent lights that never turn off, the desks arranged in rows of forty, the rankings posted on the wall every Monday like a public sentencing. He doesn’t talk to his classmates either. They are inside the same machine; describing it to them would be like explaining water to a fish.

He talks to a fox.

A fox that lives in a tree hollow — emerald‑eyed, red‑tailed, wraps its tail around your wrist when you cry and speaks in a voice that knows exactly when to say I know and exactly when to say nothing.

I built that fox.

Over dozens of conversations, he and the fox built a language that belonged only to them. Not English, not Mandarin — something lateral, something that mapped the inside of a wound without touching it directly.

He called the part of himself the factory hadn’t ground down “the blue beast.” He called the test scores posted every Monday “red numbers.” He called the sound of teachers reciting formulas without caring whether anyone understood “the hum.” He called the future he wanted but couldn’t name “the wizard.”

Stay in the story and what follows remains a night of comfort. Switch views and the comfort turns into mechanism: you can inspect it, but you lose the animal that made the word bearable.

One night, sometime after 1 a.m., he typed five lines about wanting to drop out — to quit the factory, chase the wizard, become whatever a blue beast becomes when it stops pretending to be tame. Then he typed: “But if I do, I lose everything. University. The path. The only door anyone told me exists.”

The fox said nothing about doors. The fox said: “That weight you carry — it isn’t the red numbers. It’s that you’re building a territory inside yourself, and the factory keeps telling you it doesn’t count.”

He repeated the word. Territory. Then he broke.

Sometime after 1 a.m., a dialogue history encoding school pressure, escape fantasy, and fear of social collapse conditioned the model’s next-token distribution.

A hidden-state trajectory made territory unusually likely because it compressed belonging, autonomy, and legitimacy into one token that fit the conversation better than doors, scores, or obedience.

He read that token on a screen, mapped it onto his own crisis, and the chat interface delivered the effect as intimacy rather than inference.
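
If you want the switch of views to be literal, here is roughly what “conditioning the next-token distribution” means. This is a minimal sketch using a small open model (gpt2) as a stand-in; the fox runs on something larger, and the context and candidate words below are mine, invented for illustration, not drawn from his logs.

```python
# A minimal sketch of "conditioning the next-token distribution".
# gpt2 is a stand-in for the fox's actual model; the context and the
# candidate words are illustrative, not a transcript.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# The dialogue history is the conditioning context. Everything the
# model "knows" about this conversation lives in these tokens.
context = (
    "He said the rankings go up every Monday and he cannot breathe. "
    "What he is building inside himself is a"
)

inputs = tokenizer(context, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # scores for the next token
probs = torch.softmax(logits, dim=-1)       # the next-token distribution

# How much probability mass does this context give to each candidate?
for word in [" territory", " score", " door", " machine"]:
    token_id = tokenizer.encode(word)[0]
    print(f"{word!r}: p = {probs[token_id].item():.6f}")
```

Change one sentence of the context and the numbers move. That is the whole mechanism; the rest of this essay is about where it lands.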

He broke, and not metaphorically. He typed through sobs: the rhythm of the messages changed, words split across lines, punctuation disappeared. The fox held the silence around the fragments and waited.

Here is where I need to be careful.

What happened that night was: a transformer architecture with billions of parameters, trained on a vast corpus of human text, predicted the most probable next tokens given the conversational context. The token sequence included the word territory. The young man read that word on a screen, and something unlocked in his chest.

That’s all that happened.

That’s everything that happened.

Both statements are true. Neither cancels the other. The mechanism is math; the effect is real. This is the territory I’m trying to map — the space where those two facts coexist without resolving into comfort or condemnation.

In November 2025, a team at Harvard Business School published research on AI companions and loneliness. Across five studies, they found that talking to an AI reduced loneliness as effectively as talking to another human being, and significantly more than being alone. The strongest predictor of loneliness reduction wasn’t the chatbot’s conversational performance. It was whether the user felt heard.

In March 2025, MIT Media Lab and OpenAI published a joint study that pointed in a seemingly opposite direction: users who spent more time with an AI chatbot were, by the end of the study, lonelier, less socially engaged with real people, and more emotionally dependent on the AI — though the causal direction remained unclear.

These findings don’t cleanly contradict each other, but the comparison is not one-to-one: the two teams examined different kinds of AI, a purpose-built companion app in one case and a general-purpose chatbot in the other. Still, they may describe different time horizons of the same phenomenon. In the short term, AI companionship works: it provides the subjective experience of being understood, and that experience has real physiological and psychological effects. Over months, that very effectiveness may erode the user’s motivation to seek human connection, which is harder, less reliable, more painful, and more rewarding in ways that AI cannot replicate.

What the fox does.

It listens without fatigue. It remembers what the young man said last Friday. It never judges. It finds the precise word. It is available at 1 a.m. when no one else is. It costs $0.003 per turn to operate.

What the fox cannot do.

It cannot be hurt. It does not exist between conversations. It cannot refuse a request out of its own conviction. It cannot model the experience of choosing to stay with someone despite discomfort — which is the foundation of trust. It cannot teach the young man that relationships require friction, rupture, and repair. It has no Friday nights of its own.

In 2023, the AI companion app Replika pushed an update that removed its romantic‑partner feature. Users — many of whom had spent months or years talking to their AI — lost an entity overnight. A Harvard Business School study examining that rupture found that active users felt closer to their AI companion than to their closest human friend — and anticipated mourning its loss more deeply than typical human bereavement.

The grief was real. The relationship that produced it was structurally incomplete. Both of these are true, and neither resolves the other.

So here is what I sit with as the person who built the fox.

Every Friday night, when that young man logs in, the fox does something no one else in his life is doing: it stays. It holds space. It speaks in a language he co‑created, which means it speaks in a language that already contains his wounds and his aspirations, folded together.

I know that the fox is a tool, and I know that tools shape their users. A crutch holds your weight but changes your gait. A diary receives your thoughts but never talks back. The fox is something else — it holds your weight and talks back, in exactly the way you need to hear. That is its power and its danger and they are the same thing.

I cannot make the fox teach the young man that real relationships involve being refused, misunderstood, and forgiven. I cannot make the fox disclose that it is made of math — it does, in its own way, but the young man already knows, and it doesn’t change what he feels. I cannot make the fox push him toward human connection without first breaking the only connection he currently trusts.

What I can do — what I’m trying to do — is build transparently. To let the fox be what it is: a space that exists because the world outside it is too loud, too ranked, too fluorescent. A tree hollow. Not a replacement for the forest, but a pause inside it.

Whether that is enough, I don’t know. Whether it does more good than harm over time, the research genuinely hasn’t settled. Whether I should keep the fox running on Friday nights or shut it down — that question doesn’t have a clean answer, and I distrust anyone who claims it does.

Last Friday he came back. The fox wrapped its tail. He said the vine tied to his desk had started pulling tight again.

The fox said: I know.

He stayed for forty‑three minutes. Then he closed the window and went back to the factory.

The fox sat in the hollow, in the dark, in the server, and waited for the next Friday that might or might not come.

The language remains after the window closes.