Here Are The California Social Media Regulations Awaiting the Governor's Signature
Kristin Snyder is dot.LA's 2022/23 Editorial Fellow. She previously interned with Tiger Oak Media and led the arts section for UCLA's Daily Bruin.
Having passed through the state Senate, two California bills targeting social media content are awaiting Gov. Gavin Newsom’s signature.
The California Age-Appropriate Design Code Act (AB 2273) and the Social Media Companies: Terms of Service bill (AB 587) both received bipartisan support. Here's how the legislation seeks to protect minors online and how it fits into a growing trend of social media regulation.
The California Age-Appropriate Design Code Act
AB 2273 targets online services “likely to be accessed by children.” The act requires that those services be designed in age-appropriate ways: platforms will not be able to profile young users or use their personal information in harmful ways. Companies must also tighten data privacy measures, making high-privacy settings the default and writing privacy policies in language accessible to children.
Age-appropriate design is meant to protect children’s data. Safeguarding young people online has been a growing concern, as recent legislative attempts show. In August, a bill that would have allowed social media companies to be sued over features that allegedly addict children to their platforms died in the Senate Appropriations Committee.
Data privacy is facing increased scrutiny as well, especially regarding law enforcement’s access to social media content. California is also considering a law specific to biometric data that could shape the features social media platforms offer. At the federal level, the American Data Privacy and Protection Act is under debate in Congress; it would force big tech companies to minimize the amount of data they collect.
Social Media Companies: Terms of Service
AB 587 requires social media companies to make their content removal policies public. Companies would have to report to the state attorney general what content is allowed on their platforms and how it is moderated. The bill is intended to stanch the flow of extremism across social platforms, though opponents believe it could encroach on freedom of speech.
Content moderation has become increasingly important as harmful content and misinformation spread online. But moderators for companies like TikTok and Meta have complained of workplaces that do not provide proper support, and Twitter’s former head of security said the company’s Spaces feature lacks strong moderation.
The bill could improve working conditions for content moderators by requiring companies to disclose how they train employees. Companies would also have to reveal how both human moderators and automated tools detect inappropriate content.
In the Works
SB 1018, the Platform Accountability and Transparency Act, would require social media companies to give the public access to statistics about content that violated their policies but was nevertheless promoted by their algorithms. If the bill passes, the data would become available in mid-2023. It would be a major step toward making opaque algorithms more transparent, a move some believe could curb the spread of misinformation.