November 26, 2024

Lawmakers will deliver a blistering indictment of Facebook at a Capitol Hill hearing Thursday, as company scandals generate renewed political will behind long-stagnant proposals to reform the laws protecting children online.
Antigone Davis, Facebook’s global head of safety, will testify, and lawmakers are expected to grill her on the company’s internal research into its products’ negative effects on children’s and teens’ mental health.
The hearing was called days after a Wall Street Journal report revealed internal research in which teen girls reported that Instagram made their body image issues worse. Though the report was one in a series focusing on the tech giant, lawmakers seized on the story about the Facebook-owned app as evidence of the company’s willingness to exploit children for advertising data.
Facebook is signaling that it will be on the defensive: on the eve of the hearing, it released an annotated version of the internal research in which it sought to downplay its own researchers’ findings.
For years, lawmakers and regulators have railed against tech companies, which they allege place profit over children’s well-being, but Congress has yet to pass legislation that would rein in those practices. Reforming the decades-old laws that govern companies’ relationships with kids online is a pet project of a number of legislators, including Sen. Edward J. Markey (D-Mass.).
Consumer advocates and a growing number of members of Congress warn that the Children’s Online Privacy Protection Act (COPPA) has been under-enforced, and that a rapid evolution of apps, video games and other websites has outpaced the 1998 law, which restricts the tracking and targeting of people under 13.
Lawmakers from both parties have proposed bills intended to keep kids safe, ranging from updates to COPPA to entirely new legislation that would take aim at the design of tech platforms, including a proposed ban on “autoplay” settings that lead to lengthy sessions on apps for kids and young teens. These bills have faced significant opposition from the tech industry’s powerful lobbying force and have not advanced in a Congress mired in inertia and partisan rancor.
But there is renewed optimism among lawmakers and advocates that Congress may be reaching a turning point.
“Clearly, there is a growing acknowledgment of the urgency of these problems and an emerging consensus that we cannot afford to ignore how technology and media are harming young people,” Markey said.
Jim Steyer, president of Common Sense Media, compared the moment to 2018, when California passed a state privacy law. Public events, including Europe’s adoption of a robust data privacy law and the Cambridge Analytica privacy scandal, generated enough widespread outrage to shift political will, he said.
“This is potentially a seminal moment that it’s time for Washington to step up to [the] plate and act,” Steyer said.
The flurry of activity also highlights how children’s digital safety has emerged as a rare bipartisan issue in a bitterly divided Congress, in part because of its strong resonance with voters. The issue has even united strange bedfellows. Sen. Richard Blumenthal (D-Conn.) has been working closely with Sen. Marsha Blackburn, a Tennessee Republican at the opposite end of the political spectrum. (Blackburn accused President Biden of having a “socialist agenda” Wednesday on Twitter; Blumenthal defended abortion access in a news conference that same day.)
Thursday’s hearing is just one in a series that they’re planning together. Next week, the same Senate panel will host a “Facebook whistleblower” for another hearing expected to focus on the Instagram research.
Lawmakers are setting up more hearings on protecting children online with other prominent tech companies, Blumenthal said.
“We have the same interest and goal to protect children,” Blumenthal said in an interview. “There’s nothing partisan about that goal.”
COPPA was designed in the dial-up era, and consumer advocates and lawmakers say it wasn’t built for the world of streaming, gaming and social networking that dominates children’s time online in 2021. In the interim, COPPA has been unevenly enforced and weakened by court rulings. Companies have sidestepped the law by not asking users their age because enforcement is triggered by “actual knowledge” that a user is under 13, as The Washington Post has reported.
“The law was very well intended and was effective for many years,” said Angela Campbell, a Georgetown Law professor emeritus who also serves as the chair of Fairplay, which advocates for children’s protections. “The world has just changed so much since 1998.”
Beyond Congress, regulators at the Federal Trade Commission have also reviewed ways they could update COPPA for the modern era of Fortnite, Snapchat, TikTok and other services. The agency solicited comments in 2019 about potential updates to its rules for enforcing the law but has not announced major changes.
Meanwhile, other countries have pushed ahead of the United States. In September, the United Kingdom implemented its “Age Appropriate Design Code,” which aims to stamp out design tactics that keep children glued to a service for long stretches of time.
In the United States, lawmakers are increasingly recognizing that children’s online safety is an issue that resonates with voters — especially parents and grandparents who have been struggling with children’s increased screen time during the pandemic.
Blackburn says she calls these voters “the security moms.”
“They are indeed very concerned about people using devices and some of these apps — TikTok and Snapchat — to track children online,” she said. “They’re very concerned, as they should be, about the implications of that, of where it could lead.”
Blackburn says she herself is not just a “security mom,” but also a “security grandmother.”
Blumenthal said the pressure extends beyond mothers to fathers and grandfathers as well, adding that he was perhaps a “security dad.”
“I would say outrage is an understatement in what I’ve heard from constituents,” he said of the recent Facebook revelations.
The political pressure is not limited to the halls of Congress; state attorneys general from opposing parties have also teamed up on this issue. Earlier this year, a group of 44 state attorneys general sent a letter to Instagram urging the company to call off its plans to launch a service specifically for kids. Two of the state attorneys general on that letter, Nebraska Attorney General Doug Peterson (R) and Massachusetts Attorney General Maura Healey (D), say the company’s recent announcement that it would pause Instagram Kids is insufficient and that they’re continuing to monitor the situation.
“One of the things that creates bipartisan engagement for the attorneys general is when we see a business such as Instagram operating in such a way that they place profit over social good for young people,” Peterson said in an interview.
