New rules will require online services to overhaul how they treat the personal details of children in the country.
Britain unveiled sweeping new online protections for children on Tuesday, issuing expansive rules over the objections of tech companies and trade groups.
The rules will require social networks, gaming apps, connected toys and other online services that are likely to be used by people under 18 to overhaul how they handle those users’ personal information. In particular, they will require platforms like YouTube and Instagram to turn on the highest possible privacy settings by default for minors, and turn off by default data-mining practices like targeted advertising and location tracking for children in the country.
The new rules are the most comprehensive protections to arise from heightened global concern that popular online services exploit children’s information, suggest inappropriate content to them and fail to protect them from sexual predators. The British children’s protections far outstrip narrower rules in the United States, which apply only to online services aimed at children under 13.
The new rules will soon be submitted to Parliament, which called for online standards for children as part of a 2018 data protection law and is unlikely to change them. The code should go into effect eight to 10 weeks after it is sent to the lawmakers.
“This is a significant shift in the landscape,” said Elizabeth Denham, Britain’s information commissioner, an independent regulator who drafted the new rules. “The code is a set of principles and standards that require companies to think about, to focus on and to be accountable for the way they are serving children.”
The tech industry lobbied Ms. Denham to weaken the rules, arguing that they were too onerous, vague and broad. In particular, some industry experts said that the code would cause online services to collect even more personal data in order to distinguish their child users and treat them differently.
“While we support the code’s ambition, we do have real reservations,” Antony Walker, the deputy chief executive of techUK, an industry group that represents Amazon, Facebook, Google and other companies, said in a statement. He added that his group was particularly concerned that the code “could lead to some unnecessary age-gating of online services.”
Trade groups also say that smaller companies may have to curtail free services for children because advertising to young users could become harder to monetize.
“Some people may see this as a victory for children, but we’ll actually see a restriction in the services that start-ups can build for kids,” said Dom Hallas, executive director of Coalition for a Digital Economy, an advocacy group for start-ups that has received funding from Google, Intuit and Stripe.
Ms. Denham did not weaken the rules in response to the industry pressure, but she did clarify and amend some provisions. For instance, the final code suggests that instead of trying to determine a user’s age, online services could just apply the standards for children to all users.
The new rules, called the Age-Appropriate Design Code, are intended to give minors in Britain special rights and protections online — much like in the real world where children generally have the right to attend school and are prohibited from going to bars.
“We already treat children differently in the offline or analog world than we do adults,” said Ms. Denham. “So why shouldn’t we also treat them differently in the virtual world?”
The code lays out 15 different principles that sites, apps and other online services likely to have users under 18 in Britain must follow. Among other things, it prohibits such services from influencing minors to share unnecessary personal information or select weaker privacy options.
It also requires sites and apps to collect as little personal information from minors as possible. And it prohibits online services from using children’s personal data in ways that could be detrimental to them, such as by automatically recommending sexual or violent content based on their searches. That is of particular concern in Britain, where the family of Molly Russell, a teenager who died by suicide, said her death was influenced by images of self-harm she saw on Instagram. Instagram subsequently banned images of graphic self-harm.
The new British children’s rules may require online services to make cultural changes as well as practical ones. The code requires online services to put the best interests of children first, above their bottom lines.
Companies with major violations of the code could face fines of 4 percent of their annual worldwide revenue.
The rules underscore a growing movement by regulators on both sides of the Atlantic to rein in tech industry data-mining practices. In September, Google agreed to pay a $170 million fine and make changes to settle charges that its YouTube service violated the children’s online privacy law in the United States.
“We’re going to see the spread of significant data protection laws for children in Europe, Australia, New Zealand and Africa,” said Baroness Beeban Kidron, a children’s digital rights advocate who is a member of the House of Lords. “I would like to suggest that the vast majority of American parents would want this for their children as well, if only they knew this was something they could have.”