Facebook Under Fire, Failed to Protect Kids’ Privacy

Facebook is under fire from US regulators for failing to protect the privacy of children using its Messenger Kids app, with the Federal Trade Commission (FTC) saying the social media platform misled parents. The agency also charges that Facebook misrepresented the access it gave app developers to private user data, prompting it to propose changes to an existing privacy order.

The FTC on Wednesday proposed sweeping changes to a 2020 privacy order with the company, now known as Meta, that would prohibit it from profiting from data it collects on users under 18.

The company would also be subject to other limitations, including restrictions on its use of facial recognition technology, and would be required to provide additional privacy protections for its users.

According to the FTC, the prohibition would cover data collected through Meta’s virtual reality products. The agency says Meta has failed to fully comply with the 2020 order.

“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection.

“The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”

Facebook and broken promises

The FTC also said that, from late 2017 until 2019, Facebook misrepresented the extent to which parents could control whom their children communicated with through the Messenger Kids app.

Despite the company’s promises that children using Messenger Kids would only be able to communicate with contacts approved by their parents, children were, in certain circumstances, able to communicate with unapproved contacts in group text chats and group video calls, according to the FTC.

Facebook launched Messenger Kids in 2017 as a way for children to chat with family members and friends approved by their parents. The app does not give kids a separate account; instead, it works as an extension of a parent’s account, giving parents controls such as the ability to decide whom their kids communicate with.

At the time of launch, the company indicated Messenger Kids would not show ads or collect data for marketing purposes, though it would collect some data said to be necessary to run the service.

Politically inspired move

In a prepared statement, Meta said the announcement was politically motivated, adding that it had spent years engaging with the FTC without being given a chance to discuss the new claims.

“Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory,” said Meta.

“Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil,” added the company.

Responding to the FTC’s announcement on Twitter, one user identified as Lordiori described the move as ideological warfare.

“You just want to ban everything related to American companies. An endless ideological battle… if it’s BIG Technology, the hand of the blockade will shake,” said Lordiori.

Others feel the FTC has been unfair and deserves to be investigated itself.

“Seriously?!?!?!?! Shouldn’t you be more concerned about what Tik Tok is doing?” questioned Mayma.

Meta, which has 30 days to respond to the FTC action, said it will “vigorously fight” the decision and expects to win.

Facebook criticized

Despite Meta’s stance, experts from various groups have criticized the company. In 2018, a group of 100 experts, advocates, pediatricians, educators, and parenting organizations signed a letter contesting Facebook’s claim that the app was filling a need kids had for a messaging service.

“Messenger Kids is not responding to a need – it is creating one,” said the letter.

“It appeals primarily to children who otherwise would not have their own social media accounts,” reads another passage of the letter, while another criticized the social media company for “targeting younger children with a new product.”

In response to the letter at the time, Facebook said the app “helps parents and children to chat in a safer way,” emphasizing that parents were always in control of their kids’ activities.

However, the FTC says this has not been the case. The 2020 privacy order, under which Facebook paid a $5 billion fine, required an independent assessor to evaluate the company’s privacy practices.

According to the FTC, the assessor identified “several gaps and weaknesses in Facebook’s privacy program.”

Good intervention

Other critics backed the FTC’s intervention, blaming Facebook for violating users’ data privacy.

“They have Always violated peoples’ legal rights. Tracking/Spying on devices u didn’t even Use facebook/Instagram on… Violating Every agreement Ever made #Spying_on_people #Meta #Criminal behavior,” tweeted Mr. Stocks Freeman.

Center for Digital Democracy executive director Jeffrey Chester applauded the FTC’s action, calling it a long-overdue intervention in what has become a “huge national crisis for young people.”

Chester said Meta’s platforms, including Facebook and Instagram, “are at the center of a powerful commercialized social media system that has spiraled out of control, threatening the mental health and well-being of children and adolescents.”

He added that the company is unleashing “even more powerful data gathering and targeting tactics fueled by immersive content, virtual reality and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards.”

Image credits: Shutterstock, CC images, Midjourney, Unsplash.
