Tech Platforms Struggling to Verify Users’ Age

As parents and lawmakers grow increasingly worried about how children and teenagers use online services, social media and streaming platforms are grappling with how best to verify a user’s age, Axios reports.

Minimum age requirements for social media vary by platform and jurisdiction, but in many countries the floor is 13. That threshold traces back to the Children’s Online Privacy Protection Act (COPPA) in the United States, which requires websites and online services to obtain verifiable parental consent before collecting personal information from children under 13.

Some jurisdictions set the bar higher. In Europe, the General Data Protection Regulation (GDPR) sets the default minimum age at 16, although individual countries can lower it to 13.

In practice, however, children under 13 can easily lie about their age to gain access to social media and the content on it.

Recently enacted laws in the United Kingdom and California have pushed companies to try new processes for ensuring underage users aren’t getting onto sites and services meant for older people.

Age verification and age estimation are among the measures platforms are adopting to make technology safer for kids, in response to growing complaints about mental health harms, privacy violations, and other issues affecting children.

“Age verification is something that seems like a very reasonable, very easy ask, but an online world creates all sorts of problems,” said Cody Venzke, senior counsel at the Center for Democracy and Technology’s Equity in Civic Technology project.

Utah limits minors’ social media use

A bill in Utah proposing restrictions on social media use by children and teenagers without parental consent and requiring adults to verify their age is now heading to the governor’s desk.

The proposed law is one of many attempts across individual states to address concerns over the negative impact of technology on children’s mental health, privacy, and safety.

Following the passage of SB 152 by the Utah Legislature, Governor Spencer Cox announced on Friday, the final day of the 2023 general session, that he intends to sign the bill.

Cox said the state was “holding social media companies accountable for the damage that they are doing to our people.”

Under the bill, the state’s Division of Consumer Protection would determine how to verify users’ ages and obtain parental consent.

‘Industry should support raising age limit’

Experts say the industry should support updates to COPPA that raise the age limit to keep up with modern times, even if the measure wouldn’t be a catchall for keeping kids safe online.

“We’re dealing with such a substantially different internet experience now, compared to in the 1990s when we had very primitive types of advertising,” said Jennifer King, a privacy and data fellow at Stanford’s Institute for Human-Centered Artificial Intelligence.

She called the current age limit of 13 both “problematic” and “arbitrary.” Raising it, advocates argue, could also help parents keep an eye on what their kids are doing online.

“If we increase the age, we’re banning behavioral ads, it’s still going to really help parents who are trying their best to keep an eye on what their kids are seeing,” said Irene Ly, policy counsel at Common Sense Media.

As concerns about the safety of children online continue to rise, social media and streaming platforms are under pressure to implement effective age verification measures to prevent underage users from accessing their services.

Image credits: Shutterstock, CC images, Midjourney, Unsplash.