TikTok has recently faced scrutiny over child safety issues in the US and elsewhere due to its youth-skewing userbase and reams of inappropriate content on the platform. Now, the company (owned by China’s ByteDance) has announced that it’s giving parents more control over what their teens can see. It’s adding new content filtering controls to its “Family Pairing” feature, letting parents filter out videos containing specific words or hashtags — while still keeping kids in the loop.
TikTok introduced Family Pairing back in 2020 as a way to let parents link directly to their kids’ accounts and then remotely disable direct messages, set screen time limits and enable a “restricted content” mode. And last year, it added a tool that automatically filters out videos with words or hashtags users may not want to see in their For You or Following feeds.
The new controls essentially combine those two features, giving parents the option to remotely filter out videos containing specific words or hashtags from their kids’ For You or Following feeds. “We’re bringing this [content filtering] tool to Family Pairing to empower caregivers to help reduce the likelihood of their teen viewing content they may uniquely find jarring,” TikTok wrote.
At the same time, kids will be alerted to their parents’ selected filters and can choose not to opt in, the company told Sky News. “By default, teens can view the keywords their caregiver has added and we believe this transparency can also help to prompt conversations about online boundaries and safety,” the company wrote. “We also wanted to make sure we respect young people’s right to participate.”
TikTok also announced that it will form a global Youth Council later this year. The aim, it said, will be to “listen to the experiences of those who directly use our platform and be better positioned to make changes to create the safest possible experience for our community.”
TikTok has been criticized for exposing children to videos showing self-harm, eating disorders and other inappropriate content, often disguised by slightly altered hashtags designed to bypass moderation. The company is facing new content regulations in the UK via the Online Safety Bill, and US lawmakers are working on a Kids Online Safety Act that would force social media companies like TikTok to add online safeguards for children. TikTok was recently banned in Montana, but the company is suing the state on the grounds that the ban violates the First Amendment and other laws.