A California Assembly member representing Long Beach is introducing bills to keep kids under 16 off social media platforms and to more tightly regulate the companies that operate them. The move comes after two juries recently ordered Meta and Google to pay hundreds of millions of dollars for harming children whose lawsuits alleged the platforms are addictive and unsafe.
Democratic Assemblymember Josh Lowenthal said the recent landmark court verdicts in New Mexico and Los Angeles “reaffirm our need to keep kids safe online.” Yet while the rulings “are a good start,” they protect no children beyond the plaintiffs themselves. Lowenthal aims to change that by demanding oversight of the multi-trillion-dollar social media industry.
In recent years, he has proposed legislation to hold social media platforms accountable when they are found to have knowingly harmed children. Now, Lowenthal is taking that effort a step further, presenting two companion bills that would forbid social media companies from granting access to children under 16 and would create an oversight commission to establish safety standards.
Lowenthal said he is adopting the “Australia model,” referencing the country’s under-16 social media ban, which took effect in December 2025 and restricts minors’ access to major platforms like TikTok and Instagram and levies hefty fines against companies for noncompliance.
Lowenthal’s proposed age-gating legislation has gained bipartisan support in the state assembly, and Gov. Gavin Newsom has spoken out in favor of it. “The momentum is there,” said Gwen Shaffer, a faculty member at Cal State Long Beach who teaches classes on internet regulation. (Shaffer also serves on the board of directors for the nonprofit that owns the Long Beach Post. She has no direct say in editorial decisions.)
“If there’s a possibility of an age verification law passing in the U.S., California is probably the place where it’s going to pass,” she said.
Yet using technology to verify users’ ages before granting access raises privacy issues, Shaffer said, adding that more clarity is needed around what system will be in place and how long users’ data will be retained.
Others have raised concerns about the balance between protecting young users from harm without infringing on their First Amendment rights, “the central dilemma of the digital age,” according to Jason Shepard, dean of the college of communications at Cal State Fullerton.
“California often sets the tone nationally, so if this law survives legal challenges, it could become a model,” said Shepard.
In the absence of federal regulation, California has moved to crack down on kids’ use of social media in the past. The state has passed laws that restrict social media platforms from serving addictive, algorithmic feeds to minors without parental consent; require platforms to respond when adults report content that threatens kids’ safety; and require platforms to display warning labels when minors use social media.
But Lowenthal’s legislation would be the state’s most direct attempt to limit minors’ access to social media. He called it a necessary “draconian measure.” Because courts have already ruled that social media platforms addict and harm minors, Lowenthal hopes platforms may be less likely to challenge legislation like this. Alternatively, platforms may simply “double down on the party line” that they already protect young users, Shaffer said.
Regardless, many see an urgent need for action, especially when teens spend an average of 4.8 hours a day on social media, and those with the highest levels of use report the poorest mental health, according to the American Psychological Association.
“I have a laboratory right in front of me,” Lowenthal said, referencing what he observes in his own children: the impacts on socialization and school performance, the inability to be present.
“This is my life’s work,” Lowenthal said, referencing his 25 years as a tech executive, when his focus was maximizing engagement and shareholder value. “It wasn’t my mandate to consider impacts,” he said. Since becoming a legislator, “I became squarely focused on impacts.”
Many of the risks of social media have come into focus, yet in designing legislation, “the challenge here isn’t just identifying actual harms, but it’s crafting a solution that protects young people without giving the government too much control over who can speak, what they can access and when,” Shepard said.