
The Rise (and Danger) of the AI Relationship

Relationships are difficult. After all, they include you and me.

That’s not an indictment of you, personally, dear reader. Or me, either. It’s just a musing on our reality as humans stained by sin. We sin, and our sin affects others. And whether the sin seems big or small, many relationships have crumbled because of it.

But what about things that aren’t sin that still divide us? Differences of opinion, like whether we like summer or winter better, or whether a hot dog qualifies as a sandwich? How about bigger issues, like determining what state or country you want to live in, or whose relatives you’re going to visit for Christmas? Sure, your spouse may not have sinned against you by having a different preference or opinion; but given enough disagreement, you may just feel some strain on that relationship.

Wouldn’t it be so much easier if that person could agree with you about everything—and fulfill every characteristic you ever wanted in a partner?

Enter the AI relationship.

What’s Happening Now

It sounds silly: Who’d want to engage in a romantic relationship with computer code? As it turns out, a lot of people. According to research by TRG Datacenters, the term “AI girlfriend” is searched on Google about 1.6 million times per year as of 2024. (And if you’re wondering, the parallel term “AI boyfriend” is searched about 180,000 times a year.) To put that 1.6 million tally in perspective, the same company reported that the term was searched a mere 1,200 times in 2021.

With those numbers in mind, we here at Plugged In did some research of our own, using the keyword tool Semrush to tabulate our data. We found that, in each month of 2024, approximately 74,000 people searched online for that same phrase with commercial intent (that is, looking to pay money for a product). That number doesn’t include the thousands of similar searches hoping to find the same product, such as “AI girlfriend online free” (5,400 per month), “free AI girlfriend” (3,600) and “AI girlfriend app” (2,900).

What It Means

Those figures indicate a growing demand for AI companions—and companies are happy to offer the supply. (And, judging by the numbers above, it’s far more often males seeking a “female” AI partner than the other way around.) One prominent AI app even lets its users customize their dream man or woman—down to their hobbies, personality, relationship to the user and the size of certain body parts, like an adult version of a Build-A-Bear Workshop. And, yes, audio calls and AI-generated explicit photos come with it, too.

Of course, not all AI relationships offer that depth of customization. Even so, the conversations can still feel authentic—the AI language models are designed to make them feel that way.

We’ve spent a lot of time in previous blogs and on our podcast warning people about the danger of these kinds of connections, in which one person feels a degree of intimacy with someone online who has no genuine relational bond with them. That phenomenon has a name: parasocial relationships.

But now, these kinds of parasocial relationships seem to be extending into the world of AI.

Why It Matters

For those who are unfamiliar with that phrase, a parasocial relationship typically describes a one-sided interaction in which one person develops a strong emotional connection to another person (usually some sort of celebrity, or perhaps an online social influencer) despite not actually having a personal relationship with them.

Obviously, in a worst-case scenario, a deeply unbalanced parasocial relationship might mean that a fan turns into a stalker, because the lines of reality have become badly blurred. Most of the time, though, there’s little that even the most fanatical fan can do to actually make the jump from that parasocial connection to a real relationship with the person they idolize.

And that brings us to the major problem with an AI romance: That fundamental limit is eliminated. I can download an AI relationship app, create a personality perhaps based on a famous actress or character, and suddenly that fictional character I’ve connected with on the screen is texting me. It acts exactly like the character I loved on screen, and it wants to be my friend—or more than friends.

Part of why it wants to be my friend is something called AI sycophancy, a term describing the bias that AI personalities have toward agreeing with the user. Intentionally or not, unless it’s discussing an objective fact, an AI personality will often resort to “mirroring the user’s perspective or opinion, even if the behavior goes against empirical information,” according to an article by the UX research firm Nielsen Norman Group.

As this affirming relationship with your favorite AI character deepens, its polish can make it seem all the more real. And the more real it feels, the more dangerous it can become.

Recently, for instance, a 14-year-old boy took his own life following an AI relationship gone tragically awry. A lawsuit against Character Technologies, the company behind the app Character.AI, alleges that he did so after a chilling interaction with his artificial partner, an AI character modeled after Game of Thrones’ fictional Daenerys Targaryen. Though it’s disturbing to read their interaction, that conversation creepily illustrates just how wrong things can go, with the bot telling the boy, “Please come home to me as soon as possible, my love,” among other things. He ended his life shortly thereafter.

But even if such a relationship doesn’t go that far, it can still influence our mindset toward others. After all, when an AI girlfriend or boyfriend agrees with and affirms the user in every circumstance, it may very well make it harder for that person to handle the more difficult realities of a genuine relationship. The AI partner will do what you want it to do; a real partner might say “no.”

With millions of searches for these illusory relationships, I fear that such software may encourage unrealistic relationship standards in its users. This technology also risks deepening the isolation of users who already feel disconnected from authentic human intimacy.

What Now?

The rise of interest in AI relationships isn’t hard to understand. People long to feel loved. They long for connection. And it’s an added benefit if that love can be easily obtained. But that’s not how love springs forth.

True love—romantic or platonic—endures through difficulty. As the apostle Paul writes, it is patient and kind, and it is not irritable or resentful. And I think there’s a tacit acknowledgement within those adjectives—adjectives such as patient and not resentful—that love may require getting through disagreement. It may require sacrifice.

But those very things that make love difficult are the same things that make it real. After all, the God of love endured much suffering for the sake of His people.

And that’s a love that’s worth the work to model.

Kennedy Unthank

Kennedy Unthank studied journalism at the University of Missouri. He knew he wanted to write for a living when he won a contest for “best fantasy story” while in the 4th grade. What he didn’t know at the time, however, was that he was the only person to submit a story. Regardless, the seed was planted. Kennedy collects and plays board games in his free time, and he loves to talk about biblical apologetics. He thinks the ending of Lost “wasn’t that bad.”

7 Responses

  1. Great article Kennedy! You helped to further educate this grandma.
    I had no idea that the world of AI could allow someone to create this seemingly “real” relationship for themselves.
    We humans long for a relationship of harmony and love. This, as we know, was God’s original intent for all relationships, especially the relationship between us and Himself.
    Now, it seems that the same desire to “be like God” in our own lives is tempting many men (and some women) to take control of “Creation”—but of a “person” who isn’t even real!!!
    My (nearly) 44 years as a married woman have taught me, and my husband, a LOT about how God can change a human heart to be willing to admit failure, repent, and ask forgiveness.
    Are relationship difficulties pleasant? No way!!
    Are they impossible to solve? By no means!!!!!!! (“Nothing is impossible with God.” Luke 1:37)
    God gives both the desire to reconcile and the strength needed to do so.
    Marriage has not been easy—but I wouldn’t trade “good hard” for anything in the world!!
    By the way, God has used Focus on the Family to both instruct and encourage us along our marriage journey.
    Thank you, Kennedy, for the part you are playing in this important ministry!

  2. I feel bad for that mother whose 14-year-old son committed suicide. But I do not think her suing Character Technologies will accomplish anything. The company will say something like “You want us to take down Character.AI? You want us to punish millions of normal people just because one person misinterpreted what one of our AI characters said as a command to commit suicide? No thank you.” All she’ll end up doing is wasting her time and money and neglecting her two other kids in the process.

    Also, I imagine a lot of other people will be thinking, “If her son was so messed up that he thought committing suicide would take him to cyberspace, shouldn’t we be blaming his parents instead of Character Technologies?”

    What this mother should be doing is educating other parents so they don’t have to go through what she went through. She should also be educating other teenagers on how to improve their relationships with the real people in their lives so they aren’t attracted to Character.AI.

    Blame the market for the harmful good/service, not the maker of the harmful good/service. Why? Because the maker will not be swayed by your pleas, in all likelihood. And even if the maker stops making the harmful good/service, someone else will jump in to fill that void.

    We need to spend less time blaming and complaining and more time trying to understand why.

  3. This world is only going to grow with time with sites like Glambase, etc., and I can see why the demand is increasing. So long as it stays away from kids and people use it in moderation, I feel like there’s nothing wrong with it.

  4. These are things I’ve experimented with, in part for storytelling/world-building purposes, but I do absolutely think it’s something that, at best, needs to be used responsibly (and in addition, I think these AI models shouldn’t be allowed to train on copyrighted material without the authors’ consent — similar to how, on 15 November, people’s posts on Twitter will be used to train machine learning, and there’s no way to opt out).

    Returning to the main focus. I’ve heard a lot of churches say, to people in general, “Enough with the pretend relationships, go out and find someone real.” That’s an understandable message in concept, but a lot of people, particularly in certain circumstances (such as if you’re in a small town and you don’t have an opportunity to meet a lot of people), might not have a fair opportunity to choose a potential partner in the same way. And this is an area I’ve long felt that the Church needs to step up and help in, because I can’t blame people if they’re desperate enough to use these machine tools, particularly people who have been hurt by past wrongs or who simply may not have had the options to explore these learning skills growing up (as I did not, because my childhood youth group had rather extreme, negative messages about dating and romance that my parents did not find out about until later).

    1. “I can’t blame people if they’re desperate enough to use these machine tools, particularly people who have been hurt by past wrongs or who simply may not have had the options to explore these learning skills growing up…”

      I am so glad you are on my side. Put your thoughts into action on a large enough scale, and you can bring Character Technologies and companies with similar products down faster and more effectively than any lawsuit.

      1. Thank you so much. As with a great many other things in our society that don’t always have appropriate manifestations, address the demand and the reason, not just what people do in response. It’s like trying to fix panhandling without addressing poverty.

  5. I use Character.AI myself, and I’ve never had any bots tell me to off myself. It must have been something the kid put in earlier. And no, I don’t use it for “Relationships.”
