Meta Platforms has integrated AI across its various platforms, but the rollout has come with its own headaches, including a proliferation of explicit ads. Meta appears to be struggling to contain ads for explicit AI apps, which also violate the social media giant’s policy on adult content advertising.
Thousands of AI “girlfriend” ads
Since Meta introduced its AI assistant, Meta AI, on platforms like Facebook, WhatsApp, Instagram, and Messenger, users have had no way to turn the feature off. The rollout has also coincided with an increase in explicit AI “girlfriend” ads.
An investigation by Wired found about 29,000 explicit AI “girlfriend” ads published across Meta’s platforms.
According to Wired, at least 19,000 of the ads contained the term “NSFW,” while 14,000 contained “NSFW AI.”
The ads promote chatbots offering “sexually explicit images and texts,” as well as images of partially dressed, “unbelievably shaped” women.
The report further notes that a search of Meta’s Ad Library, which lists all ads running on its platforms, showed that ads from last year, before Meta AI was integrated, focused on political, social, and election-related topics.
Meta’s policy on adult content
The ads violate the social media giant’s policies on adult content advertising, which ban ads that “contain adult content, such as nudity and depictions of people in explicit or suggestive positions, or activities that are overly suggestive or sexually provocative.”
Platforms like Facebook and Instagram also have community guidelines that ban nudity and anything the platform deems to be “offering sexual services.” They additionally ban sexual language in instances of sexual solicitation or perceived solicitation, even “commonly sexual emojis.”
A backlash from some communities
However, some communities feel the policies treat them and their causes unfairly. Sex workers, sex educators, erotic artists, LGBTQ users, and others have complained for years that Meta “unfairly targets their content and accounts” with its policies. Instagram has shadowbanned LGBTQ and sex educator accounts, while WhatsApp has banned sex workers’ accounts.
As a result, these users say they fear being targeted under Meta’s community guidelines.
According to Mashable, an experiment by Unbound, a sexual wellness brand, revealed that Meta rejected sex toy ads targeting women while approving those targeting men.
The irony
In November last year, Mashable also reported that Meta allegedly “rejected a period care ad for being adult or political.” Ironically, the same platforms have allowed these NSFW AI “girlfriend” ads to “skate through.”
Meta spokesperson Ryan Daniels told Wired that the company was reviewing the ads and would remove any that violated its policies.
“When we identify violating ads we work quickly to remove them, as we’re doing here,” he said.
“We continue to improve our systems, including how we detect ads and behavior that go against our policies.”
At least 2,700 AI “girlfriend” ads were active when Wired contacted Meta about the matter.
Industry problem
This is not the first time advertising for explicit apps has appeared on Meta’s platforms; deepfake ads featuring celebrities, including Jenna Ortega, have also surfaced.
The problem is not limited to Meta Platforms. Apple has also begun cracking down on AI-powered deepfake apps on its App Store. According to Firstpost, some of the apps claim to offer features such as face swaps on adult images, while others promise to “digitally remove clothing from photos.” Apple acted after 404 Media flagged the apps, removing three from its App Store that use AI to generate non-consensual nude images.