Social media platforms are struggling to navigate a patchwork of US state laws that require them to verify the age of users and give parents more control over their children’s accounts.
States including Utah and Arkansas have already passed child social media laws in recent weeks, and similar proposals have been put forward in other states such as Louisiana, Texas and Ohio. The legislative effort is designed to address fears that online platforms are harming the mental health and well-being of children and teens amid a rise in teen suicides in the US.
But critics — including platforms, as well as some children’s advocacy groups — argue that the measures are poorly designed and fragmented, potentially leading to a raft of unintended consequences.
A senior staffer at a major tech company who leads its state legislative policy described the patchwork of proposals as “nightmarish [and] absurd, if not Kafkaesque”.
“Being able to confidently prepare for this is a daunting task,” the person said, describing the “engineering lift” involved. They added that legal teams are still working out how to interpret the various rules and the risks associated with them.
There is a growing body of research linking heavy use of social media by children and adolescents to poor mental health, prompting calls for better protection of children from toxic content.
Republican Utah state representative Jordan Teisher, the bill’s House sponsor, said it was created in response to a number of studies showing “some really devastating effects of social media on teenagers.”
“We strongly believe that parents know best how to care for their own children. Parents were coming to us saying, ‘I need help,'” he said of the decision to introduce the law, which will take effect in March 2024.
The Utah law requires social media platforms to verify the age of all state residents and then obtain parental consent before allowing anyone under 18 to open an account. Platforms must also give parents access to those accounts and are barred from showing minors ads or targeted content.
Governments and regulators around the world are rushing to introduce legislation, with both the UK’s Online Safety Bill and the EU’s Digital Services Act obliging social media companies to protect children from harmful content.
In the US, a new federal proposal, the Kids Online Safety Act, introduced by Senators Marsha Blackburn, a Republican, and Richard Blumenthal, a Democrat, would place a duty of care on platforms to protect children. Earlier this year, Republican Senator Josh Hawley also introduced a bill that would set a minimum age of 16 for social media users.
Social media platforms and experts agree that federal laws would be most effective in imposing a uniform nationwide standard. But a flurry of state laws in the meantime has forced platforms to adapt.
Zaman Qureshi, co-chairman of the Yuva Alliance, which advocates for safer social media for young people, said states have split into “two lanes” in taking action on the issue. In one, several Democratic-led states such as California have focused on regulation aimed at “forcing technology companies to make design changes to their products to better protect minors”, he said. In the other, a large number of Republican-led states have focused on the role of parents.
A common theme in Republican state legislative efforts is to require platforms to conduct age verification for all users. This also paves the way for a further requirement in some states that platforms obtain parental consent before allowing under-18s on their apps, and in some cases give parents access to their child’s account.
Given the lack of specificity in the measures, platforms have been left unsure how parental consent should be gathered, whether a simple check-box exercise would suffice or whether companies would be required to collect, for example, a copy of a birth certificate, according to multiple people familiar with the matter.
Academics and advocacy groups have also raised free speech concerns and questions about the privacy of the children the laws are designed to protect. Qureshi warned that certain state regulations could leave behind LGBT+ children whose families do not support them.
“What an active parent means is very different for every child or every young person,” he said.
The age verification mandates pose major challenges for companies. Verification, which typically involves requesting ID or estimating age through face-scanning technology, will result in underage users being removed from platforms, cutting into advertising revenue. If ID is the main verification method, critics warn, not all minors have access to official identification, and age estimation remains an inexact science.
Arkansas, for instance, whose law goes into effect in September, has mandated that platforms use third parties to verify age, raising concerns about whether there are enough such providers to meet demand.
Yoti, a small British provider of age verification technology, is already used by Meta’s Instagram and Facebook Dating, the company said, and TikTok is weighing using the technology, according to two people familiar with the matter. One of the largest companies offering age verification technology is MindGeek, owner of the pornography sites Pornhub and RedTube, according to two tech policy employees.
Meanwhile, social media platforms including Meta and Snap have begun pushing the idea that age verification should be managed through app stores where they are downloaded or at the device level – for example, on the Apple iPhone.
A Meta spokesperson said the company has already developed more than 30 tools for teens and families, including parental supervision tools. “We will continue to evaluate proposed legislation and work with policymakers on these important issues,” the spokesperson said.
Snap, which has also developed parental controls, said it was in discussions with industry peers, regulators and third parties about how to address the age verification challenge. TikTok said it believes “industry-wide collaboration” is needed to address the issue.
However, some children’s advocacy groups argue that the laws’ focus is misplaced. “The theme puts the onus on parents and gives parents more rights . . . It says there’s no need to change the platforms,” said Josh Golin, executive director of the nonprofit FairPlay. “Really, we think what we should be focusing on is making platforms safer and less exploitative of children.”