I went to a conference where a speaker from Meta stood up and pitched AI companions for the elderly. He was genuinely excited about robots helping with loneliness.

The audience raked him over the coals during Q&A.

But before we debate whether AI companions are morally right or wrong, there’s a simpler question worth asking:

Does it even work?

In today’s newsletter:

  • What happens when you swap human texting for a chatbot?

  • 3 questions about the use of AI companions.

  • Can I get your feedback on something?

Can chatbots actually make you less lonely?

Researchers at the University of British Columbia set out to answer that across two studies.

In one, they took nearly 300 first-year college students and split them three ways: one group texted a randomly assigned fellow student every day for two weeks, one group chatted daily with a ChatGPT-powered Discord bot, and a control group journaled a sentence about their day.

In the other, they tracked 2,000 adults across four surveys over 12 months, measuring how chatbot use and loneliness shaped each other over time.

Here's what they found:

1. Chatbots offer short-term emotional relief… similar to fast food.

Daily chatbot use reduced negative mood compared to journaling, and that finding holds up across earlier research too. Brief chatbot interactions produce immediate emotional benefits that look, in the moment, comparable to talking with a human.

Fast food, or junk food that we like, also makes us feel good in the short term. It affirms our sense of taste. Chatbots are built to affirm us emotionally... but a chatbot is still a robot.

Keep reading.

2. But over time, more chatbot use led to more loneliness.

The 12-month study found that heavier social chatbot use at one point predicted higher emotional isolation four months later.

And in the student experiment, two weeks of daily chatbot conversations did nothing to reduce loneliness, doing no better than the journaling control. Only the students paired with another human showed a meaningful drop.

Short-term relief seems to compound the problem.

3. Curing loneliness requires two-way empathy.

In the student study, the chatbot was actually rated as more empathetic than the human partner. But the participants themselves expressed less empathy talking to the bot than they did talking to a peer.

The researchers argue that what actually moves the needle on loneliness is being a source of care for someone else. And that requires another person on the other end who actually needs you, too.

It is better to give than to receive, and this principle holds true in emotional support.

3 Questions On The Use Of AI Companions (I Want Your Answer To The Last One)

I can think of at least four times in the last year that I’ve heard someone say, "ChatGPT is my best friend."

It was said half as a joke, but also half as a recommendation. Like, hey, you should try this! And not as a tool, but as a… companion.

Don’t get me wrong. I’m not anti-AI. I use it at work, and it’s great at performing certain tasks.

But using it as a stand-in for human companionship?

That gives me pause.

I want my opinions on the matter to be informed by research rather than giving in to alarmism or sensationalism.

The research is still early, but here are three questions I’m asking right now:

1. When should children be introduced to chatbots (and by whom)?

Last month, my fourth-grade daughter came home and told us her class had used ChatGPT. I was upset and so were other parents. Schools are moving fast on AI adoption, and the conversation about how to introduce it to kids is lagging behind.

When kids are given a computer interface that responds like a person and they can ask about anything… they can ask about anything.

2. What does it mean when a chatbot feels safer than a trusted adult?

A Common Sense Media study found that a third of kids who use AI companions said they'd rather go to a chatbot than a trusted adult when they have something hard to talk about.

This is a great study and one I think every parent with a teen or pre-teen should understand. We will be doing a deeper dive in a future newsletter.

3. My question for you: what platforms are your kids or students using that I don't know about?

After early reports of ChatGPT reinforcing users' suicidal thoughts, OpenAI has invested heavily in safety.

But ChatGPT is just one platform. There are dozens of AI chatbot products built on open-source models with no meaningful safety standards, and kids are finding them the same way they've always found things online: a friend sends a link.

Most parents are still only worried about ChatGPT. But their kids have already discovered platforms like Character.AI and Replika—which is literally marketed with phrases like: “Your AI friend. Always here to listen. Always on your side.”

I have investigated these platforms and they are not even remotely safe for children or adults.

If you're seeing other chat tools show up at home, in the classroom, or in your counseling work please reply to this email and let me know!

My 7 year old son got a puppy puppet for Christmas (try saying “puppy puppet” three times fast).

At Mom’s suggestion, I built a puppet show stage out of cardboard and old 2x4s, and we delivered a full show of the Good Samaritan to some of the neighbors’ little kids.

It was fantastic. 🤌

Studies Referenced

The UBC study — Ruo-Ning Li and colleagues at the University of British Columbia ran the two-week experiment with first-year students, published in the Journal of Experimental Social Psychology (2026). Link

The year-long study — Dunigan Folk and Elizabeth Dunn (also at UBC) tracked 2,000 adults across four surveys over 12 months. Forthcoming in Psychological Science. Link

Keep Reading