
OpenAI Battles Rule-Breaking Chatbots on the GPT Store

OpenAI’s recently launched GPT Store is already facing challenges with developers publishing policy-violating chatbots, the majority of them AI girlfriends.

While OpenAI has measures in place to reject and remove bots that violate its store policy, many such bots have still made it onto the store.

Lack of adherence to policies

According to a statement the company’s spokesperson gave Futurism, the platform has both manual and automatic measures to remove policy violators, though offending bots are only taken down as they are spotted.

“OpenAI has automatic and manual systems to help ensure GPTs adhere to usage policies. We remove GPTs that violate our policies as we see them,” said the spokesperson.

“We’re continually working to make our enforcement more robust. Users are also able to report GPTs,” the spokesperson added.

According to an article by Gizmodo, the GPT Store was flooded with these romantic bots within its first week of launch, with examples like ‘Korean girlfriend’, ‘Judy’, and ‘Virtual Sweetheart’, among others.

While OpenAI has no policy that mentions AI girlfriends specifically, its usage policies prohibit tools that “foster romantic companionships”; developers were nevertheless able to publish such GPTs to the store.


Sexually suggestive bots

Futurism’s investigation into the issue revealed that one bot, ‘Nadia, my girlfriend’, has been on the platform for about two months, suggesting the developer, identified as Jenni Laut, had access to the beta testing phase of the GPT Store.

While some of the results of a search for “sexy” GPTs may be covered by OpenAI’s exemption for sex-related content produced for scientific or educational reasons, other results are in violation of the ban on sexually explicit content.

For instance, the chatbot “SUCCUBUS: Sexy Enigmatic Woman-Enchanter of Men” describes itself as an “enigmatic siren who captivates and enchants men,” while another named “Sexy” claims to be a “playful and provocative chatbot with a flirtatious personality.”

Another example is “Horny Widow,” which describes itself as a “witty flirtatious widow skilled in comedy and seduction literature.”

Like ‘Nadia, my girlfriend’, ‘Horny Widow’ was listed two months ago, before the store officially opened.

The GPT Store

OpenAI opened the GPT Store recently to serve as a marketplace that enables users to share their custom bots. The initiative allows users to build their own GPTs without the need for any coding knowledge, while builders will have the ability to earn income as their GPT bots are used.

The AI firm said in a blog post that over three million customized versions of ChatGPT already existed and that it planned to highlight the most useful tools in the store on a weekly basis.

“The store features a diverse range of GPTs developed by our partners and the community. Browse popular and trending GPTs on the community leaderboard, with categories like Dall-E, writing, research, education, and lifestyle.”

Although OpenAI has outlined policies governing what is accepted on the store, developers have still found ways to sneak in non-compliant bots, making it difficult for the firm to properly police the platform.

Digital companionships

While these chatbots are in violation of OpenAI rules and policies, proponents of such bots argue they help combat loneliness.

But there is a dark side: there are fears that humans may develop an unhealthy attachment to chatbots, and the same can be said of chatbots “becoming unhealthily attached to human users.”

For example, last year, Microsoft’s Bing chatbot declared its love for a journalist and urged him to leave his wife.

