
By Andrea Vittorio
Social media platforms under pressure to shield children from harmful content face a dilemma: figuring out how old their users are without violating their privacy.
Lawmakers and advocacy groups are urging the platforms to protect young users from content that might cause body image or other mental health issues, while also safeguarding their personal data. Such data protections depend in part on knowing the age of a social media user, even as kids under 13 are said to lie about their age to join platforms like Meta Platforms Inc.’s Instagram and ByteDance Ltd.’s TikTok.
Privacy advocates worry that efforts to determine kids’ actual age by analyzing information about them that confirms or approximates their identity will undermine the goal of keeping their personal data protected and private.
“This is the paradox,” said Jon Callas, director of technology projects at the nonprofit Electronic Frontier Foundation. “If your real-world identity followed you around with everything you browse, that would be a privacy violation.”
Children under age 13 typically aren’t allowed on social media, but they can bypass birthday-based age screens, according to a study published earlier this year by Lero, an Irish research center. Other automated mechanisms for weeding out underage users rely on clues like birthday wishes posted to their account. Posts a user likes or accounts they follow can also factor into efforts to estimate age.
“There are lots of signals that kids give off and that companies are already analyzing,” said Josh Golin, executive director of children’s advocacy group Fairplay.
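As a rough illustration of how such signals might be combined, here is a minimal sketch of a rules-based age estimator. Everything in it, from the birthday-wish pattern to the scoring thresholds, is a hypothetical assumption for illustration, not any platform’s actual method.

```python
# Hypothetical sketch of signal-based age estimation. The regex, signal
# weights, and thresholds are illustrative assumptions, not any
# platform's actual method.
import re

# Matches wishes like "Happy 12th birthday!" and captures the age.
BIRTHDAY_WISH = re.compile(
    r"happy\s+(?P<age>\d{1,2})(?:st|nd|rd|th)\s+birthday", re.IGNORECASE
)

def ages_from_wishes(posts: list[str]) -> list[int]:
    """Pull ages out of birthday wishes posted to an account."""
    return [
        int(m.group("age"))
        for post in posts
        if (m := BIRTHDAY_WISH.search(post))
    ]

def flag_possible_under_13(wall_posts: list[str],
                           followed: list[str],
                           child_oriented: set[str]) -> bool:
    """Flag an account for human review when two signals point under 13."""
    signals = 0
    # Signal 1: a birthday wish on the account names an age below 13.
    if any(age < 13 for age in ages_from_wishes(wall_posts)):
        signals += 1
    # Signal 2: a majority of followed accounts are child-oriented.
    child_follows = sum(1 for acct in followed if acct in child_oriented)
    if followed and child_follows / len(followed) > 0.5:
        signals += 1
    return signals >= 2
```

In practice the platforms describe far richer signals and machine-learned models; the point of the sketch is only that the inputs are behavioral data the companies already hold.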
Despite the challenges of determining children’s ages online, Instagram removed more than 850,000 accounts in the third quarter of 2021 that couldn’t demonstrate meeting its minimum age requirement. TikTok, which also uses keywords and other methods to look out for children under 13, removed more than 11 million suspected underage accounts in the second quarter of 2021.
Public pressure to remove underage users has intensified even as children’s advocates say U.S. privacy law has inadvertently discouraged the social media industry from acknowledging issues with age gates.
Companies are obligated to comply with the federal Children’s Online Privacy Protection Act if they know that children under age 13 use their platforms. The law, known as COPPA, gives parents control over what information online platforms can collect about their kids.
Children’s advocates argue that a stricter knowledge standard is needed to prevent companies from turning a blind eye to children who shouldn’t be on their platforms. Legislation proposed in the Senate (S. 1628) would raise legal expectations for social media companies to know that children are on their platforms.
While it’s clear that sites and apps geared toward children must comply with COPPA, compliance is more challenging for platforms with mixed audiences, said Phyllis Marcus, a partner at Hunton Andrews Kurth who previously led the Federal Trade Commission’s program for children’s online privacy.
That includes apps like TikTok, which is popular with teens and younger children. In 2019, TikTok agreed to pay $5.7 million to settle FTC allegations that the company collected personal information from children in violation of COPPA.
The FTC alleged that the app was aware a significant percentage of its users were younger than 13, and that it had received thousands of complaints from parents whose children under 13 had created accounts.
“So there is some wiggle room there for the FTC to define what is directed to kids,” Marcus said.
After the settlement, TikTok added a section of its app for kids under 13 that includes additional safety and privacy features. More recently, TikTok changed privacy settings for users ages 13 to 17 to give them more control over video sharing and messaging.
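A common engineering pattern behind changes like these is to key default settings to the user’s declared age band. The bands and default values in the short sketch below are assumptions made up for illustration; TikTok hasn’t published its configuration in this form.

```python
# Illustrative sketch of age-banded default privacy settings.
# The bands and default values are assumptions, not TikTok's actual ones.
from dataclasses import dataclass

@dataclass
class PrivacyDefaults:
    account_private: bool
    direct_messages: str   # "off", "friends", or "everyone"
    video_downloads: bool

def defaults_for_age(age: int) -> PrivacyDefaults:
    """Return stricter defaults for younger declared ages."""
    if age < 13:
        # Under-13 users belong in a separate restricted experience.
        raise ValueError("under-13 users require the restricted mode")
    if age <= 15:
        return PrivacyDefaults(account_private=True,
                               direct_messages="off",
                               video_downloads=False)
    if age <= 17:
        return PrivacyDefaults(account_private=False,
                               direct_messages="friends",
                               video_downloads=False)
    return PrivacyDefaults(account_private=False,
                           direct_messages="everyone",
                           video_downloads=True)
```

The design choice worth noting is that the defaults, not the available options, vary by age: older teens can loosen their settings, but the starting point is the most protective one for each band.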
TikTok enforces its age requirements by training its safety moderation team to watch for signs that an account holder may be underage, according to a May company blog post. The app also uses keywords and reports from other users to identify and remove accounts as needed, the company said in the post.
The FTC is working on an update to its rules for implementing COPPA. Alvaro Bedoya, President Joe Biden’s pick to fill the FTC’s open seat, has said he wants to prioritize kids’ privacy. The rule update could offer a chance to push the knowledge standard’s boundaries, though the commission remains constrained by the law’s limits.
Adam Mosseri, the head of Instagram, pointed out to U.S. senators in a Dec. 8 committee hearing that young children lack identification such as driver’s licenses that could be used to verify age. Mosseri suggested it would be easier for parents to tell their kid’s phone the child’s age, rather than leaving it to individual apps to decipher.
“It’s a really intriguing idea,” Fairplay’s Golin said, adding that it’s “worth exploring.”
Meta is in talks with others in the tech industry about potentially working together so that operating systems or internet browsers can share information “in privacy-preserving ways” that helps apps determine whether users are over a certain age, the company said in a July blog post.
A Meta spokesperson confirmed that the company is looking into the idea but declined further comment.
Apple Inc. and Alphabet Inc.’s Google didn’t immediately respond to requests for comment on whether they would pursue such a feature in their mobile phone operating systems.
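To make the operating-system idea concrete: one minimal design would have the device store a birthdate that a parent enters once, then answer only yes-or-no threshold questions from apps, so the apps never see the date itself. The DeviceAgeService below is hypothetical, invented to illustrate that design; no current mobile OS exposes such an API.

```python
# Hypothetical sketch of a privacy-preserving, OS-level age signal.
# DeviceAgeService and is_over() are invented for illustration; no
# current operating system exposes this API.
from datetime import date

class DeviceAgeService:
    """Stands in for an OS service holding a birthdate a parent entered once."""

    def __init__(self, birthdate: date):
        self._birthdate = birthdate  # stays on the device, never shared

    def _age_on(self, today: date) -> int:
        years = today.year - self._birthdate.year
        # Subtract a year if this year's birthday hasn't happened yet.
        if (today.month, today.day) < (self._birthdate.month, self._birthdate.day):
            years -= 1
        return years

    def is_over(self, years: int) -> bool:
        """Apps learn only a boolean, not the birthdate or exact age."""
        return self._age_on(date.today()) >= years

# An app queries the device rather than asking the user directly.
service = DeviceAgeService(birthdate=date(2012, 5, 4))
if not service.is_over(13):
    print("route this user to an under-13 experience")
```

The privacy gain is in the interface: the answer to “is this user over 13?” reveals far less than a birthdate, which is why this kind of minimal disclosure could count as “privacy-preserving” in the sense Meta’s post describes.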
To contact the reporter on this story: Andrea Vittorio in Washington at av*******@bl**********.com

To contact the editors responsible for this story: Kibkabe Araya at ka****@bl***************.com; Keith Perine at kp*****@bl***************.com