TikTok is taking a significant step toward reducing the impact of appearance-altering filters on younger users. In response to growing concerns about the mental health effects of social media on teens, the company announced a new initiative on Tuesday that limits access to certain filters for users under 18. The move follows ongoing legal scrutiny: TikTok is currently facing lawsuits from 13 U.S. states and the District of Columbia alleging harm to young people’s mental well-being.
Aiming for Mental Health Protection
The new restrictions will apply globally, though TikTok’s plan remains vague: it will limit only “some” appearance-altering effects for underage users, and the exact filters affected are still unclear. The platform has also promised greater transparency, requiring more detailed information about how each filter changes a user’s appearance. The company plans to refresh its guidance for creators in Effect House, TikTok’s filter-building tool, to raise awareness of the psychological impact certain effects may have on younger audiences.
The Push for Greater Accountability
In addition to limiting filters, TikTok is ramping up its efforts to detect and ban users under the age of 13, a group that is not allowed on the platform under its terms of service. The company says it already removes about six million suspected underage accounts every month. It is now exploring machine learning technology to automatically detect and flag accounts that appear to belong to under-13s. The system will be tested in the UK first, with plans to expand globally.
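TikTok has not published how its detection model works, but at its core this is a binary-classification problem: score an account’s behavioural signals and route high-risk accounts to review. The Python sketch below shows the general shape of such a system under that assumption; every feature name, data point, and threshold here is a hypothetical illustration, not TikTok’s actual method.

```python
# Hypothetical sketch of ML-based underage-account detection.
# The features, training data, and threshold are illustrative
# assumptions only; TikTok's real system is not public.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row holds made-up behavioural signals for one account:
# [stated_age, avg_session_minutes, share_of_teen_followees, school_hours_activity]
X_train = np.array([
    [16, 45, 0.10, 0.05],
    [21, 30, 0.02, 0.01],
    [13, 90, 0.60, 0.40],
    [14, 80, 0.55, 0.35],
    [25, 20, 0.01, 0.00],
    [13, 70, 0.50, 0.30],
])
y_train = np.array([0, 0, 1, 1, 0, 1])  # 1 = labelled as likely under 13

model = LogisticRegression().fit(X_train, y_train)

# Score a new account; a production system would likely send
# high-risk scores to human review rather than ban automatically.
candidate = np.array([[14, 85, 0.58, 0.38]])
risk = model.predict_proba(candidate)[0, 1]

if risk > 0.8:  # arbitrary illustrative threshold
    print(f"Flag for review (risk={risk:.2f})")
else:
    print(f"No action (risk={risk:.2f})")
```

At the scale TikTok describes, the hard part is less the classifier than the error trade-off: false positives lock out legitimate teens and adults, which is presumably why a human appeals process accompanies automated removals.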
Balancing Safety and Creativity
While these steps may help protect the mental health of younger TikTok users, the platform faces the challenge of balancing safety with creativity. The upcoming changes to its filter system and the ongoing push to identify underage users will be closely watched to see whether they deliver tangible improvements in teen well-being or come to be seen as superficial gestures aimed at polishing the company’s public image.
To learn more, read the full article from Engadget here.