Popular video-sharing app TikTok on Tuesday announced a number of updates to its community guidelines to better promote user safety, transparency and integrity on the platform.

What You Need To Know

  • Popular video-sharing app TikTok on Tuesday announced a number of updates to its community guidelines to better promote user safety, transparency and integrity on the platform 

  • The community guidelines now explicitly prohibit content that deadnames or misgenders individuals, or promotes conversion therapy; misogynistic content is also banned 

  • Another change in Tuesday’s update, which will be rolled out over the coming weeks, was the decision to remove content that promotes disordered eating

  • There has been growing concern in the United States about the risks young adults are exposed to on apps like TikTok, Instagram and Snapchat

One of the changes was to offer additional “clarity on the types of hateful ideologies prohibited on our platform,” Cormac Keenan, TikTok’s head of trust and safety, wrote in a statement. The community guidelines now explicitly prohibit content that deadnames or misgenders individuals, or promotes conversion therapy; misogynistic content is also banned. 

“Though these ideologies have long been prohibited on TikTok, we've heard from creators and civil society organizations that it's important to be explicit in our Community Guidelines,” Keenan wrote in part. “On top of this, we hope our recent feature enabling people to add their pronouns will encourage respectful and inclusive dialogue on our platform.”

“Deadnaming,” according to the Cleveland Clinic, occurs when someone refers to another person by a previous name – often when a transgender individual has changed their birth or legal name to one that better reflects their identity. 

“A transgender person may decide to no longer use their birth or legal name. Instead, they’ll choose a name that better aligns with their identity,” psychiatrist Jason Lambrese told the health center in an interview published last year. “When someone uses their old name after being asked not to, that is what we call ‘deadnaming.’ The person who they once were is dead, but the new person is alive, so their current name should be used.” 

Another change in Tuesday’s update, which will be rolled out over the coming weeks, was the decision to remove content that promotes disordered eating. The app already removed content that promoted specific eating disorders – like anorexia and bulimia – but will now take the extra step of deleting videos that promote behaviors like over-exercising or short-term fasting, which TikTok says are “frequently under-recognized signs of a potential problem.”

TikTok had, in late 2020, added a policy specifically targeting eating disorder content, putting forward “additional considerations to prohibit normalizing or glorifying dangerous weight loss behaviors.” The update mandated that any content that depicts self-harm or promotes disordered eating would be removed “regardless of the user's intention of posting it.” 

"We want our community to feel comfortable and confident expressing themselves exactly as they are," a TikTok spokesperson told Spectrum News of the disordered eating guidance. "Our updated guidelines incorporate feedback and language used by mental health experts to improve our policies on self-harm, suicide, and eating disorder content and avoid normalizing self-injury behaviors. Our policy on eating disorder content has additional considerations to prohibit normalizing or glorifying dangerous weight loss behaviors."

TikTok on Tuesday also clarified its guidelines on potentially harmful conduct, with a particular focus on preventing suicide hoaxes from going viral on the platform. While the company announced a stricter approach in November – which included advice for caregivers, improved language on warning labels and prompts to visit the app’s safety center – it will now highlight that portion of the guidelines in a separate policy category for easier access. 

There has been growing concern in the United States about the risks young adults are exposed to on apps like TikTok, Instagram and Snapchat, particularly the impact certain content might have on users’ mental health. 

Sen. Richard Blumenthal, D-Conn., last year called on TikTok representatives to testify in front of a Senate subcommittee over the so-called “devious licks” challenge, a destructive trend that targeted schools and which rose to popularity on the video-sharing app in early September. The trend traveled across the country, with some schools reporting vandalized bathrooms, various stolen items and graffitied walls over the next several weeks. 

At the time, Blumenthal cited a number of other challenges on the app that he deemed harmful to teenagers, singling out the “milk crate challenge,” which saw individuals attempt to climb milk crates stacked in a pyramid formation, as well as the “blackout challenge,” where users were encouraged to hold their breath until they passed out, as particularly dangerous.

"We expect our community to stay safe and create responsibly, and we do not allow content that promotes or enables criminal activities,” a company spokesperson confirmed to Spectrum News at the time. “We are removing this content and redirecting hashtags and search results to our Community Guidelines to discourage such behavior." 

TikTok’s vice president and head of public policy, Michael Beckerman, along with the heads of other major social media companies, did testify in front of the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security to discuss potential rule changes to better protect children online. 

TikTok had banned the #deviouslicks hashtag in mid-September, and in November released its update on potentially harmful content, including hoaxes and challenges.