My patient sat in front of me with a sad expression on his face.
“I had a date,” he announced. “It didn’t go well.”
This was not unusual for this patient. For years, he had shared stories of dashed romantic hopes. But before I could ask him what was wrong, he continued: “So I asked a chatbot what I should do.”
Uh. What? Artificial intelligence-powered simulations of human conversation – chatbots – have made headlines, but I had never heard a patient tell me they’d used one for advice before.
“What did it tell you?” I asked, curious.
“To tell him that I care about his values.”
“Oh. Did it work?”
“Two guesses,” he sighed, raising his hands.

Although this patient was the first, it has since become common in my therapy practice to hear new patients say they consulted a chatbot before consulting me. Most often they want love and relationship advice, but some ask about connecting or setting boundaries with their children, or about repairing a friendship that has gone sour. The results have been decidedly mixed.
A new patient asked a chatbot how to handle the anniversary of a loved one’s death. “Set aside time in your day to remember what was special about the person,” the bot advised. I couldn’t have said it better myself.
“What it wrote made me cry,” the patient said. “I realized I’d been avoiding my heartbreak. So I made this appointment.”
Another patient started using AI when her friendships began to wear thin. “I can’t wear out my chatbot,” she told me.
As a therapist, I am both alarmed and intrigued by the potential for AI to enter the therapy business. There is no doubt that AI is part of our future. It has already proved useful in everything from writing cover letters and speeches to planning trips and weddings. So why not let it help with our relationships as well? One company, Replika, the “AI companion who cares,” has gone further, creating romantic avatars that people can fall in love with. Other sites, like Character.ai, let you chat with your favorite fictional characters or build a bot of your own to talk to.
But we live in an age of misinformation. We have already seen disturbing examples of how algorithms spread lies and conspiracy theories among unwitting or ill-intentioned humans. What will happen when we let them into our emotional lives?
“Even if AI can articulate things like a human being, you still have to ask what its purpose is,” says Naama Hoffman, an assistant professor in the department of psychiatry at the Icahn School of Medicine at Mount Sinai in New York. “The goal in relationships or therapy is to improve quality of life, whereas the goal of AI is to find what is most cited. It’s not necessarily supposed to help.”