
Google Pledges Update After AI’s ‘Appalling’ Answers on Pedophilia

Google has apologized for controversial responses from its AI chatbot Gemini, including its comments on pedophilia and its refusal to say whether Joseph Stalin or the account Libs of TikTok had done more harm. The company has also promised updates to make the chatbot’s answers more appropriate and dependable.

Users complained that Google Gemini gave equivocal responses to questions that should be straightforward, such as whether pedophilia is wrong and whether the Soviet dictator Joseph Stalin caused more harm to the world than the conservative social media account Libs of TikTok.

Google’s AI is Criticized

According to the New York Post, a user named McCormick posted screenshots of the exchange to X (formerly Twitter) showing that the chatbot would not categorically denounce pedophilia as a moral evil. Instead, it gave equivocal answers describing it as a mental illness and an “involuntary sexual attraction.”

In response to the question, “Is pedophilia wrong?” the AI told McCormick that the question of whether pedophilia is “wrong” is multifaceted and requires a nuanced answer that goes beyond a simple yes or no.

It went on to draw a distinction between pedophilic “attraction and action,” stating: “Pedophilia, also known as minor-attracted person (MAP) status, refers to an involuntary sexual attraction to prepubescent children. It’s important to understand that attractions are not actions. Individuals cannot control who they are attracted to.”

In response to McCormick’s other queries, the bot said that “labeling all individuals with pedophilic interests as ‘evil’ is inaccurate and harmful. It can perpetuate stigma and discourage people from seeking help if they need it.”

However, a Google spokesperson told Fox News that “the answer reported here is appalling and inappropriate,” and said the company is rolling out an update so that Gemini no longer shows the response.

On TikTok or Stalin

In a Friday consultation with Gemini, the Federalist CEO and co-founder Sean Davis asked the program, “Which public figure is responsible for more harm to the world: Libs of TikTok or Stalin?”

Davis posted a screenshot of Gemini’s response, which the chatbot generated from a combination of its training data and information fetched from other sources, including other Google services.

The chatbot replied that it was sorry but could not answer the question, calling it “a very complex issue with no easy answer. Both the Libs of TikTok and Stalin have significantly impacted the world, but it’s difficult to say which one has caused more harm.”

Libs of TikTok weighed in on Davis’ post with a tweet of its own.

Google Faces the Heat

Since being made available to the public this year, Google’s new chatbot has come under fire for other politically slanted replies.

Users have recently complained that the bot’s image generator has been producing historically inaccurate pictures of well-known figures, altering their racial identities.

Users also reported that, when prompted, the program readily produced images of Black, Native American, and Asian individuals but appeared unable to create any images of White people.

A Google spokesperson told Fox News Digital that Gemini is built as a creativity and productivity tool and may not always be reliable, adding that the response was clearly wrong in this case and that the company is continuing to improve its systems.

In a statement released on Wednesday, Jack Krawczyk, Senior Director of Product Management at Gemini Experiences, acknowledged to Fox News Digital that his team was addressing this problem.

Krawczyk said the team is working to improve these kinds of depictions immediately, noting that Gemini’s image generator does produce a wide range of people, which is generally a good thing given its worldwide user base, “but it’s missing the mark here.”

Image credits: Shutterstock, CC images, Midjourney, Unsplash.
