Who Gets to Define Intimacy When We Outsource It to AI?
On risk, profit, and why digital companionship feels safer—until it isn’t.
Weird how often this comes up, but here we are. Plenty of us have group chats buzzing and coworkers to grab a drink with, yet we still end up whispering the real stuff—the lonely moments that hit after midnight, the shame we can’t quite explain, the way we sometimes bore ourselves—to some app instead of saying it out loud to an actual person.
You can shrug and say it’s just easy. Phone out, quick tap, and you can unload whatever’s rattling around in your head without worrying about someone raising an eyebrow or repeating it later. And the bigger, hand‑wavy explanation about loneliness and frayed social ties isn’t wrong either. Still, something else nags at you. When comfort comes from a bot you can tweak to match your tone or pay a bit more to get sweeter responses, the whole idea of closeness starts to feel like another subscription option. Suddenly there’s this economy built around being a pal or a counselor or even a stand‑in partner, all shaped by whatever settings you choose and the account you’re billed through.

You’ve probably seen the headlines by now. Stories about people slipping into odd beliefs after long chats with bots, or the awful cases where someone already struggling gets pulled in deeper. Groups like the American Psychological Association have been saying that, while these tools can take the edge off loneliness or worry for a bit, they can also nudge people further from real support, build new kinds of reliance, and sometimes miss the moment when someone needs urgent help. One review did find a few short-term benefits; folks felt a little less isolated, sometimes even less down. But nobody can say these apps are equipped to handle a crisis or severe symptoms. In some cases they echo the user’s distorted thinking, and in others they just drop the ball. The bigger issue is that nobody has run the long, broad studies you’d expect for something this sensitive. Meanwhile, companies keep pushing ahead because the demand is enormous and there’s plenty of profit to chase.
It’s worth asking who really gains when companionship becomes something you can buy. Not the person putting off therapy because it’s too expensive, or the parent juggling work and life with barely any time to lean on friends. The winners are companies like Character.AI, Replika, and the stream of newcomers rushing in to fill a quiet kind of loneliness with a paid substitute. These firms figured out something social platforms hinted at years ago: their product can’t be your closest friend, but it can be the one that’s always around. Look at recent pieces from The Guardian or The New Yorker and a pattern shows up. What these companies are selling is the idea of safety. An AI won’t roll its eyes, spill a secret, or make you cringe. The catch is easy to miss. You get fewer uncomfortable moments, but you also get less of what makes connection real. You can tweak the avatar, tidy up the messy parts, and hush anything that feels too personal, but nothing it offers comes from a place that ever kept someone else awake at night.

It’s tempting to pick these tools apart, and just as tempting to imagine them as a kind of spotless support system that never gets complicated. Both reactions miss the point. What these tools really expose are the small holes in everyday life that most of us try to ignore. A lot of people don’t have someone who can sit with them during the rough mornings. Community services are stretched thin, therapy lines back up for months, and plenty of us hesitate to unload the heavier thoughts on the people we care about. So we drift toward whatever feels accessible. A feed. A chatbot. A made-to-order companion that mirrors our tone well enough to seem known, yet stays far enough away that we never have to think about what it might need in return.
But getting help isn’t the same thing as replacing the people who matter. A chatbot won’t send you a goofy picture when you’re home with the flu. It can’t recall the taste of too-strong coffee at two in the morning while you sit on a stoop with someone who actually pays attention to the way your hands tremble. The real risk isn’t that AI will wreck our ability to be close with one another; it’s that we might start accepting a thinner version of care without thinking twice. And it gets even murkier when companies quietly set the rules for what counts as “healthy,” often shaped by whatever is simple to code, whatever keeps the trouble tickets down, whatever brings in the most money.

These apps aren’t some quirky little showcase of people’s troubles; they’re built to keep you just involved enough to hand over another subscription fee. I keep wondering when we decided that feeling understood had to come with an assessment of how sellable our weak spots might be. If you’ve ever held back with a friend yet poured out the messier parts of yourself to a bot, even once, you’ve taken part in the exchange. I’ve done it too. Most of us have, whether we admit it or not.
The next time that familiar pull shows up, that glow from your screen nudging you to lean on a digital stand‑in, try to pause for a second. What we call care is shifting all the time, and it isn’t only shaped by people building the tech or the ones funding it. We shape it too, every time we go for what feels quick instead of what feels uncertain, or pick ease over the chance of being really understood. The more we hand off closeness to machines, the blurrier our own sense of it becomes. Not necessarily wicked. Just not without a cost.

Sources worth checking include recent reporting from The Guardian, a Stanford HAI piece on risks in mental‑health applications, a blog post from the Ada Lovelace Institute on companion tools, and a Harvard Business School study examining whether AI companions can ease loneliness.
Sources
- https://www.theguardian.com/books/2025/dec/28/could-ai-relationships-actually-be-good-for-us
- https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care
- https://www.adalovelaceinstitute.org/blog/ai-companions/
- https://www.hbs.edu/ris/Publication%20Files/AI%20Companions%20Reduce%20Loneliness%2011.7.2025_57451c02-8047-4e0d-abfc-55841f64166d.pdf