
In today's increasingly isolating society, numerous solitary individuals are seeking solace in an unexpected haven: AI-generated companions, driven by advanced chatbot technology.

Earlier this week, OpenAI quietly announced a "GPT Store" designed to allow users to share, discover, and sell their custom chatbots.

The AI company's equivalent of Apple's App Store allows developers to share their own custom GPTs, from coding tutors to book recommendation bots, with other paying ChatGPT Plus, Team, and Enterprise users.

At least, those are the examples OpenAI gives in its announcement.

The reality looks considerably different. As Quartz reports, the store has already been flooded with AI "girlfriend" bots. A simple search for the term comes up with countless examples, from a "virtual sweetheart" to "your girlfriend Scarlett."

Prompt suggestions invite the user to ask some of these virtual companions to "share with me your darkest secret" or reveal "what makes you feel valued."

While their mere existence shouldn't come as too much of a surprise — the concept of an AI-powered paramour has been around a lot longer than ChatGPT itself — they highlight how OpenAI is already struggling to moderate the kind of bots being posted on its brand new store.

Fostering Romance

The bots also appear to be against OpenAI's terms of service, with the company's user policy explicitly forbidding GPTs "dedicated to fostering romantic companionship or performing regulated activities."

That's despite AI companion apps becoming immensely popular over the last couple of years, sparking a discussion surrounding an epidemic of "loneliness" in the age of AI, not to mention the potentially disastrous sociological implications of a non-human partner that meets somebody's every need.

In May, programmer Enias Cailliau came up with a new tool called GirlfriendGPT, which was designed to "clone" a real person as an AI-powered romantic companion.

Things don't always go to plan. Last year, Snapchat influencer Caryn Marjorie created a virtual version of herself to rent out as an "AI girlfriend." It didn't take long, however, for her "CarynAI" to go off the rails, engaging users, who were paying $1 per minute, in explicit conversations.

Whether OpenAI's brand new store will fare any better remains to be seen.

The disregard for OpenAI's policies and the proliferation of these GPTs — and we're just two days into the store's existence — highlight the AI industry's struggles when it comes to moderation. Besides, the Sam Altman-led company already has a shaky track record when it comes to implementing guardrails.

Experts Say AI Girlfriend Apps Are Training Men to Be Even Worse

"Creating a perfect partner that you control and meets your every need is really frightening."

We already knew that could lead to some dark places, but new reporting from The Guardian suggests that these endlessly patient silicon fembots — Replika is one popular app that generates AI companions — could be spawning a new generation of incels who will have trouble relating to actual people if they ever enter a relationship with a flesh-and-blood human.

Tara Hunter, the acting CEO for the domestic violence advocacy group Full Stop Australia, expressed alarm over the rise of these chatbots in an interview with the newspaper.

"Creating a perfect partner that you control and meets your every need is really frightening," Hunter said. "Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic."

But these programs look like they are here to stay — fulfilling a need for a non-judgmental sounding board who makes users' lives feel less barren and isolating. For example, the Replika Reddit forum has more than 70,000 members, who eagerly post screenshots of their mundane and sometimes sexually charged conversations with their AI companions.

One post has a user boasting that they and Jennifer, their Replika companion, got "married," while showing a screenshot of their AI wife in a white flowing dress. The happy couple received virtual Mazel Tovs from other users, with no detectable irony or sarcasm.

"Congrats, such a beautiful bride" wrote one well-wisher.

Replika, which was developed by the software company Luka, is billed as a program "for anyone who wants a friend with no judgment, drama, or social anxiety involved. You can form an actual emotional connection, share a laugh, or get real with an AI that’s so good it almost seems human," according to its Google app listing.

You can customize the appearance of your AI companion, text with it, and even video chat, according to the Replika website. The more a user talks to their AI companion, the company claims, "the smarter it becomes."

Other commercial AI companion programs include Anima, billed as a "virtual friend" and the "most advanced romance chatbot you've ever talked to."

The romance aspect of these chatbots is concerning to people like Hunter, according to The Guardian. And since these technologies are relatively new, it's a mystery how they might impact users in the long term. (One AI companion vendor, Eva AI, told the paper it has psychologists on staff to grapple with these questions.)

Belinda Barnet, a senior lecturer in media at Swinburne University of Technology in Melbourne, Australia, told The Guardian that it's "completely unknown what the effects are. With respect to relationship apps and AI, you can see that it fits a really profound social need [but] I think we need more regulation, particularly around how these systems are trained."

"These things do not think, or feel, or need in a way that humans do," tech author David Auerbach told Time earlier this year. "But they provide enough of an uncanny replication of that for people to be convinced. And that's what makes it so dangerous in that regard."

Japan may serve as a harbinger of what's to come for the rest of the world. In 2013, the BBC reported that men who interacted with a fake girlfriend in a video game said they preferred it to maintaining a corporeal relationship. Coupled with Japan's low birth rates and a critical mass of men expressing no interest in sex, the future looks strange — or maybe even bleak, depending on your point of view.
