November 22, 2024

The tech giant becomes the latest to adopt tougher standards as the industry comes under fire for not doing enough to protect children.
Google says it plans additional privacy measures to protect teenage users on YouTube and its search engine, becoming the latest technology giant to adopt tougher standards in the face of criticism that companies are not doing enough to protect children.
In a blog post on Tuesday, Google announced that videos uploaded to YouTube by users 13 to 17 years old would be private by default, allowing the content to be seen only by the users and people they designate.
Google will also begin allowing anyone under 18 years old, or a parent or guardian, to request the removal of that minor’s images from Google Images search results, the company said. It is unclear how easy and responsive that process will be, given Google’s historical reluctance to remove items from search results.
In addition, Google said it would turn off location history for all users younger than 18 and eliminate the option for them to turn it back on.
The company plans to roll out the changes in the “coming weeks,” it said.
There is growing bipartisan support in Washington for pressing technology companies to do more to protect children. In the last few months, lawmakers have introduced two bills, one in the House and one in the Senate, that seek to update the Children’s Online Privacy Protection Act. The 1998 law, known as COPPA, restricts the tracking and targeting of children under 13 years old, and the bills would extend those protections to teenagers.
Google has repeatedly faced scrutiny over its handling of data related to children. In 2019, it agreed to pay a $170 million fine for violating COPPA by collecting children’s data without parental consent.
Google’s announcement comes on the heels of changes unveiled last month by Facebook to protect teenage users on Instagram. Among those advertising and privacy policy changes, Instagram said, accounts created by children under 16 will be private by default.
Both Facebook and Google said they were limiting the ability of marketers to target teenagers with advertising, but in slightly different ways. Facebook said advertisers would be able to target people under 18 based only on their age, gender and location — and not on their interests or their activity on other apps and websites.
Google said it would block personalized ads based on age, gender or interests from being shown to people under 18. It will still allow ads based on context, such as a person’s search queries.