The recent announcements from social platforms like TikTok, YouTube, and Instagram about their increased child safety efforts have gotten a lot of attention among parents looking for more ways to help protect their children online. And while these are all positive steps, they will not actually solve the broader problem of keeping kids safe online. Ultimately, the announced measures provide only the illusion of safety.
We absolutely need social platforms to play a vital role in children’s online safety (reporting CSAM, removing bad actors, providing transparency around their data practices, etc.), but the core problem in the current narrative is the expectation that every social platform can — and will — independently build sufficient tools to actually keep children safe. Parents, rather than Big Tech, ultimately own that responsibility — they just need the access that empowers them to fulfill it.
It’s no secret that a large social platform’s primary duty is to drive shareholder value by serving its users ads, accomplished by optimizing for user numbers and engagement metrics. Platforms are inherently disincentivized to expend any significant energy on child safety features that reduce usage from one of the largest groups on their platform. Updates made in the name of child safety are generally of the “check-the-box” variety, because social platforms aren’t nonprofits, and they’re not dedicated to providing safe online spaces for kids.
Some platforms do offer child-focused versions of their apps designed for younger children (i.e., users to whom they can’t yet serve ads because of the Children’s Online Privacy Protection Act, a law that protects the privacy of kids under 13). Examples include Facebook Messenger Kids and the rumored upcoming Instagram for Kids. Putting aside the question of whether we want to encourage more social media usage at younger ages, these types of apps usually work reasonably well for kids ages 6 to 10.
But the kid versions simply aren’t cool for tweens and teens to use, are limited in functionality, and are tied in awkward ways to a parent’s account on that platform. To switch to the adult version of the app, all a kid has to do is fudge their birth date. With no age verification, they can start exploring the entire app or chatting with strangers.
Alternatively, some platforms provide “parental controls” for kids using the adult versions of their apps, but in most cases, kids can simply turn them off at any point without their parent’s say-so. And even when enabled, parental controls are inherently blunt instruments that often dramatically limit the usefulness of the app. Disabling specific features can certainly be useful when a (typically younger) child is solely consuming content, but such controls have little impact on the biggest challenges kids face once they’ve graduated to using the platforms as tools to communicate with others.
At online safety company Bark, we’ve learned firsthand that when kids are online, the chances that they’ll experience some sort of issue are incredibly high. In Bark’s 2020 annual report, we analyzed more than 2.1 billion messages across SMS, email, and 30+ social media and messaging platforms, and we found alarming rates of cyberbullying, suicidal ideation, sexual predation, and exposure to explicit content.
And a platform’s parental controls, even if implemented well, would have helped with almost none of these issues. Parental controls simply don’t help when a child is being bullied, pressured to send nude photographs of themselves, expressing suicidal thoughts, or being sent unwanted pornography via a DM or text message.
Granted, on a small handful of platforms, a parent could use the platform’s built-in parental controls to turn off DMs altogether — but this has the side effect of rendering the app largely useless for older kids, and simply shifting the problematic communication to another platform with less restrictive controls. And, most importantly, this naïve “block-or-allow” approach erects a wall between the child and parent — there’s no feedback loop for the parent to offer support.
Rates of teen/tween suicide, depression, bullying, and online predation have dramatically risen over the last decade, but parents are facing a massive awareness gap when these issues occur. It is too big a burden on our children to expect them to always know when and how to ask for help (statistically, they overwhelmingly don’t), and history has shown that expecting platforms to independently and proactively implement sufficient tools for parents and kids is not a viable option.
So what is actually needed? While I certainly encourage platforms to keep working on better tools, the number one thing needed is neither controversial nor difficult for platforms to provide: Let users truly own and access their data. Again, the biggest challenges our kids face — rising teen/tween suicide, mental health issues, online predators, and bullying — are all buried within the content of their messages.
Without the ability to move data out of a platform’s walled garden, caregivers have neither awareness of major issues nor teachable moments to guide their kids with responsible usage of technology.
Just as we can choose to send our data to other services (e.g., biometric data to Fitbit, or financial data to Mint), families need data access so they can use specialized online safety tools to gain the awareness needed to help with the situations kids face online.
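To make that concrete, here is a minimal sketch of what a consent-based safety tool could do once a platform exposes message data. Everything in it is an assumption for illustration: the endpoint URL, token, and response shape are invented (no platform offers such an API today), and the keyword screen stands in for the far more sophisticated analysis a real tool would perform.

```python
# A sketch of what data portability could enable, assuming a purely hypothetical
# export endpoint and message format. No real platform offers this API today.
import requests

EXPORT_URL = "https://api.example-platform.com/v1/messages"  # hypothetical

def fetch_child_messages(access_token: str) -> list[dict]:
    """Pull a child's message history using a parent-consented OAuth token."""
    resp = requests.get(
        EXPORT_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: {"messages": [{"sender": ..., "text": ...}, ...]}
    return resp.json()["messages"]

def flag_concerning_messages(messages: list[dict]) -> list[dict]:
    """Toy screen for illustration; real tools use trained classifiers, not keyword lists."""
    risk_phrases = ("kill myself", "send pics", "don't tell your parents")
    return [m for m in messages if any(p in m["text"].lower() for p in risk_phrases)]

if __name__ == "__main__":
    token = "parent-granted-oauth-token"  # obtained via a standard consent flow
    for msg in flag_concerning_messages(fetch_child_messages(token)):
        # A real product would alert the caregiver with context, not dump raw text.
        print(f"Review suggested: message from {msg['sender']}")
```

The important part is the consent flow, not the code: the parent, not the platform, decides where the data goes, just as a Fitbit user decides who sees their step count.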
While “data portability” may not sound as powerful as “parental controls,” it is the key to helping parents actually be parents in the digital age. Without it, a parent’s only two options are A) don’t let their kids use tech (i.e., the “don’t ride that bike, you might fall off” approach), or B) give their kids access to unlimited dangers with zero guidance.
Ultimately, the solution is not to demand more “parental controls” from tech companies that aren’t incentivized to provide them in the first place. It is time for platforms to empower us to fully protect our children.
Brian Bason is a dad of two and the founder and CEO of Bark Technologies, Inc., an online safety company that helps protect more than 5 million kids across the US.