Worrisome Racial Bias as AI Tools Alter Skin Tones on Professional Images

While generative AI has garnered praise for its ability to create images, videos, and audio, the same technology has also attracted attention for the wrong reasons, among them racial bias.

The technology has been criticized for distortions, some of which have been described as racially biased. Several women of color who asked AI tools to generate professional headshots said the technology distorted their facial features and skin tone and mangled their hairstyles.

From Asian to white

Rona Wang, a 24-year-old Asian Massachusetts Institute of Technology (MIT) graduate, was shocked when her AI-generated image showed a different person. She had prompted the Playground AI image editor for a more professional version of a photo she intended to use on her LinkedIn profile.

She uploaded her original picture to the editor with the instruction: “Give the girl from the original photo a professional LinkedIn profile photo.”

To her surprise, she got an image of a white woman with dark blonde hair and blue eyes.

“My initial reaction upon seeing the result was amusement,” she told Insider.

“However, I’m glad to see that this has catalyzed a larger conversation around bias and who is or isn’t included in this new wave of technology,” she added.

Wang, who took to the X platform to express her displeasure, caught the attention of Playground AI founder Suhail Doshi, who said the company was unhappy with the result and would fix it, although doing so would take more “effort than something like ChatGPT.”

Responding to Wang’s tweet, he said, “The models aren’t instructable like that so it’ll pick any generic thing based on the prompt. Unfortunately, they’re not smart enough.”


The misfiring

Rather than forking out as much as a thousand dollars for a photo shoot, users are turning to AI-powered image generators to edit their pictures and give them a professional look.

However, according to the Wall Street Journal, women of color say the tools are misfiring in ways that go well beyond familiar mishaps such as hands with six fingers or too many teeth.

New Orleans-based Danielle DeRuiter-Williams said she used AI SuitUp and was shocked to see the tool narrow her nose and lighten her skin.

“It was just more comical than anything,” said the 38-year-old diversity, equity and inclusion specialist.

“And just a demonstration to me of how in the nascent stages a lot of this technology is, especially when it comes to its adaptability to non-white people,” she added.

In another instance, Nicole Harris used Secta Labs to create an image for her website. Instead of reflecting her “cultural” clothing style, the AI tool produced several images showing her wearing a bindi, even though she is not Hindu or South Asian. Other images showed her with “ethnically ambiguous adornments.”

Recurring problem

AI racial bias has become a recurring topic amid the generative AI boom. However, the bias goes beyond facial or physical appearance. Studies have shown AI tools also struggle to recognize the speech patterns of nonwhite people.

A recent study also revealed that chatbot detection tools are biased against non-native English speakers, mislabeling essays they had written as AI-generated and raising further questions around bias and fairness.

ChatGPT maker OpenAI has acknowledged the racial bias, saying it is working to improve its tools and reduce bias against any group of people.

“Bias is an important industry-wide problem,” OpenAI spokesperson Alex Beck told the New York Times by email, adding that the company would try “to improve performance, reduce bias and mitigate harmful outputs.”

Last month, OpenAI, together with other tech companies including Google, Meta, Amazon, Anthropic, and Inflection, made voluntary commitments to the White House on responsible AI development. One of those commitments is to develop tools that do not produce harmful material or show bias against any group of people.

Image credits: Shutterstock, CC images, Midjourney, Unsplash.
