The bill could require many social media sites, games and other online services used by minors to install wide-ranging protections for users under 18.
Natasha Singer, a technology reporter, has covered children’s online privacy since 2012.
California will adopt a broad new approach to protecting children online after Gov. Gavin Newsom signed a bill on Thursday that could transform how many social networks, games and other services treat minors.
Despite opposition from the tech industry, the State Legislature unanimously approved the bill at the end of August. It is the first state statute in the nation requiring online services likely to be used by youngsters to install wide-ranging safeguards for users under 18.
Among other things, the measure will require sites and apps to curb the risks that certain popular features — like allowing strangers to message one another — may pose to younger users. It will also require online services to turn on the highest privacy settings by default for children.
“We’re taking aggressive action in California to protect the health and well-being of our kids,” Governor Newsom said in a statement that heralded the new law as “bipartisan landmark legislation” aimed at protecting the well-being, data and privacy of children.
Called the California Age-Appropriate Design Code Act, the new legislation compels online services to take a proactive approach to safety — by designing their products and features from the outset with the “best interests” of young users in mind.
The California measure could apply to a wide range of popular digital products that people under 18 are likely to use: social networks, game platforms, connected toys, voice assistants and digital learning tools for schools. It could also affect children far beyond the state, prompting some services to introduce changes nationwide, rather than treat minors in California differently.
Industry groups had opposed the measure, saying its scope was too broad and its provisions too vague to carry out.
Groups like TechNet, which represents many of the largest U.S. tech companies, pressed California legislators to narrow the bill’s definition of a “child” to a person under 16, rather than a minor under 18. They also warned that the bill’s wide focus on online services for general audiences “likely to be accessed by children” would subject far more sites and apps than necessary to the bill’s requirements.
Civil liberties experts argued that the measure could also have deleterious consequences for adults. In compelling sites to treat children differently, they warned, popular services for general audiences might set up invasive age-verification systems requiring all users to provide sensitive personal details to companies.
But children’s groups said the legislation was needed to protect young users from automated systems that could expose them to harmful content, put them in contact with adult strangers or prod them into staying online for hours on end.
Earlier this month, regulators in Ireland issued a fine of about $400 million to Meta, the social media giant formerly known as Facebook, for violating European data protection rules in its treatment of children’s data on Instagram. Meta has said it disagreed with the Irish regulators’ charges and planned to appeal the decision.
The California legislation will require online services likely to be used by children to protect their privacy and safety by design and default. It also specifically prohibits online services from profiling children, nudging children to provide personal information or tracking their precise locations.
“Our kids deserve to be safe,” said Buffy Wicks, a Democrat in the State Assembly who co-sponsored the children’s online safety bill with a Republican colleague, Jordan Cunningham. “Our kids deserve to be assured that strangers can’t direct message them on their social media accounts,” she added, and that social networking apps cannot send phone notifications at all hours of the night, disrupting children’s sleep.
The California law, which is to take effect in 2024, comes one year after Britain instituted comprehensive online protections for minors. Last year, as British regulators were developing that effort, Google, YouTube, Instagram, TikTok, Snap and other major platforms announced new safeguards for younger users worldwide.
Congress is also working to boost online safety for youngsters.
In July, the Senate Committee on Commerce, Science and Transportation advanced a privacy bill that would prohibit online services from targeting teens and children with ads or collecting their personal data without permission. Separately, the committee advanced a bill that would require online services to protect minors by, among other things, not recommending potentially harmful content, such as material about suicide or eating disorders.