Amid renewed focus on the risks social media may pose to teenagers, TikTok on Wednesday unveiled a set of features designed to help users customize what they watch and filter out content that may be troubling or too mature for young viewers.
According to Cormac Keenan, TikTok’s head of trust and safety, the short-form video app is adjusting its recommendation system so that users see fewer videos on subjects that “may be fine as a single video but potentially problematic if viewed repeatedly.” Keenan cited diets, extreme exercise, and grief as examples of this kind of content. (TikTok rival Instagram has previously moved to block teenagers from seeing certain weight-loss products.)
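TikTok has not said how this dispersal works under the hood, but the general idea of capping how often a sensitive topic surfaces within a short stretch of the feed can be sketched in a few lines. Everything below, including the topic labels, the window size, the cap, and the `disperse` function, is a hypothetical illustration, not TikTok’s actual implementation.

```python
from collections import deque

# Hypothetical sketch of the "dispersal" idea: cap how often videos on a
# sensitive topic are served within a short window of the feed. Topic
# labels, window size, and cap are invented for illustration.

SENSITIVE_TOPICS = {"diets", "extreme_exercise", "grief"}  # examples Keenan cited
WINDOW = 5          # look back at the last five served videos (assumed)
MAX_PER_WINDOW = 1  # allow at most one video per sensitive topic (assumed)

def disperse(candidates):
    """Yield (video_id, topic) pairs, skipping a candidate when its
    sensitive topic already hit the cap within the recent window."""
    recent = deque(maxlen=WINDOW)
    for video_id, topic in candidates:
        if topic in SENSITIVE_TOPICS and sum(t == topic for t in recent) >= MAX_PER_WINDOW:
            continue  # too much of this topic recently; skip it
        recent.append(topic)
        yield video_id, topic

feed = [("a", "diets"), ("b", "diets"), ("c", "cooking"), ("d", "diets")]
print(list(disperse(feed)))  # [('a', 'diets'), ('c', 'cooking')]
```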
TikTok also announced a new system that classifies content by thematic maturity, much like the rating systems used for movies and television. The new safeguards will assign a “maturity score” to videos that may contain mature or complex themes.
The aim, Keenan says, is “to help prevent content with overtly mature themes from reaching audiences between ages 13-17.”
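TikTok has not published the scoring model or the thresholds behind these protections, but the concept of gating videos by a maturity score can be illustrated with a minimal sketch. The `Video` class, the `visible_to` function, and the cutoff value below are all assumptions made for illustration, not TikTok’s actual API.

```python
from dataclasses import dataclass

# Hypothetical illustration of a maturity-score gate. TikTok has not
# published its scoring model, score range, or thresholds; the names
# and the cutoff below are assumptions.

TEEN_CUTOFF = 0.5  # assumed maximum score for 13-17 accounts

@dataclass
class Video:
    video_id: str
    maturity_score: float  # assumed range: 0.0 (all audiences) to 1.0 (overtly mature)

def visible_to(video: Video, viewer_age: int) -> bool:
    """Return True if the video may be shown to this viewer: adult
    accounts see everything; 13-17 accounts only see videos scored
    below the assumed teen cutoff."""
    if viewer_age >= 18:
        return True
    return video.maturity_score < TEEN_CUTOFF

clip = Video("v1", maturity_score=0.8)
print(visible_to(clip, 15))  # False: withheld from a teen account
print(visible_to(clip, 25))  # True: shown to an adult account
```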
Late last year, after a Facebook whistleblower renewed concerns about the influence social media platforms have on their youngest users, senators questioned executives from TikTok, YouTube, and Snap about the measures their platforms take to protect kids online.
Earlier this year, a group of state attorneys general also launched an investigation into TikTok’s impact on young Americans. At the time, TikTok said in a statement that it limits its features by age, offers parents tools and resources, and designs its policies with the well-being of young people in mind.
In a blog post published Wednesday, Keenan said the company is “focused on further safeguarding the teen experience” and will soon roll out a new feature offering more granular content filtering options.