TikTok has announced a new set of resources to support users who may be struggling with mental health issues, particularly around eating disorders, self-harm and suicide.
First, the platform is expanding its resource guides to support people who choose to share their personal experiences on the platform.
As TikTok explained:
“While we do not allow content that promotes, glorifies, or normalizes suicide, self-harm, or eating disorders, we do support people who choose to share their experiences to raise awareness and help others who may be struggling, and to find support within our community. To make that easier, we've developed new well-being guides to support people who choose to share their personal experiences, with guidance from the International Association for Suicide Prevention, Crisis Text Line, Live For Tomorrow, Samaritans of Singapore and Samaritans (UK).”
The new guides, now available in TikTok's Safety Center, offer tips to help users share their experiences, as well as advice on how to engage responsibly with others who may be struggling or in distress.
In addition, TikTok is highlighting a new set of dedicated in-app content from partner organizations, providing more upfront information on key well-being topics.
The new program is currently underway and will run until September 16th.
TikTok is also expanding its search interventions: when users type in queries related to eating disorders, they'll be directed to professional tools and support resources.
“We've added a new Safety Center guide on eating disorders for teens, caregivers and educators. Developed in consultation with independent experts, including the National Eating Disorders Association (NEDA), the National Eating Disorder Information Centre, the Butterfly Foundation and Bodywhys, this guide will provide information, support and advice on eating disorders.”
TikTok is also adding similar interventions for searches related to suicide and self-harm, with links to local resources and support options.
Finally, TikTok is also updating its warning labels for sensitive content.
“Starting in September, when a user searches for terms that may surface content some viewers find distressing, such as ‘scary makeup’, the search results page will be covered by an opt-in screen. Individuals will be able to tap ‘Show Results’ to continue on to view the content.”
These opt-in screens already appear over videos that may be graphic or distressing to some viewers, and such content is also ineligible for recommendation in anyone's For You feed.
TikTok has gained significant traction among younger audiences in recent years, and it continues to add more users. With that growth comes an obligation to protect these more impressionable users where it can, both by shielding them from harm and by providing support.
TikTok has faced various challenges in this area. Last year, the app was temporarily banned in Italy after the death of a girl who took part in a challenge within the app, while TikTok has also been criticized for exposing young girls to exploitation, with its highly tuned algorithms seemingly surfacing the kind of content that predators might seek out.
Similar to Instagram, the visual nature of the platform can readily take a toll on mental health, so TikTok must make it easier for users who need support to find it, with more resources, more in-app tools and connections to community help.
There's no way to solve such problems completely, but it's good to see TikTok continuing to add new tools in this area.