
Snap Provides New Insight Into its Content Moderation Rules, Additional Controls for Parents

Snapchat has added a new element to its Family Center that enables parents to restrict the content their kids see in the app. It’s also looking to provide more transparency into its content guidelines, to help parents and regular users better understand how it ranks and distributes uploads.

First off, on the Family Center update: Snapchat’s new sensitive content toggle within the Family Center control panel will give parents additional peace of mind that their kids aren’t being exposed to offensive material in the app.

“Our new Content Controls in Family Center will allow parents to filter out Stories from publishers or creators that may have been identified as sensitive or suggestive. To enable Content Controls, parents will need to have an existing Family Center set up with their teen.”

Snapchat’s Family Center, which it first launched in August last year, gives parents a way to monitor who their kids are interacting with in the app, without giving them access to the messages themselves, thereby protecting the child’s privacy. That provides an additional layer of assurance and insight, while the new sensitive content controls will give parents even more peace of mind in regards to their child’s usage of the app.

But the bigger update, at least in terms of broader relevance, is that Snapchat is also publishing its Content Guidelines for the first time, providing full insight into how it vets and moderates content in both Stories and Spotlight.

“We have always shared these guidelines with our media partners and Snap Stars. By publishing these full content guidelines for anyone to read, we want to offer greater transparency into the stronger standards we set for public-facing content and into our eligibility requirements for distribution.”

The guidelines cover all the rules around what Snap allows in the app, including content that’s eligible for recommendation, what’s considered sensitive, and what it prohibits from distribution in the app.

Most of the rules are much as you’d expect, covering sexualized content, violence, hate speech, etc.

But there are some interesting elements, including its notes on ‘Creative Quality’.

The guidelines provide some valuable insight for creators and marketers looking to maximize their exposure in the app, while they’ll also offer more reassurance for parents as to how Snap monitors and moderates the content that their kids could see.

Finally, Snap says that it will also soon add a new element to its Family Center that will give parents a level of oversight of their kids’ use of its ‘My AI’ tool.

Snapchat added My AI, which incorporates generative AI elements from ChatGPT, last month, giving users a way to interact with an AI chatbot within the app.

But reports have suggested that My AI could be dangerous in some respects, with some users finding that the bot has provided information about drugs and alcohol, and how to hide things from your parents.

Generative AI is still an unknown element in many respects, with these systems literally generating new responses on the spot. As such, Snap will be looking to provide additional assurance around the tool, while it will also add controls so parents can stop their kids from using My AI if they’re unsure about it.

My AI is currently only available to Snapchat+ subscribers.

These are good additions, both for parents, and for general users, with some valuable insights into how Snap works, and what it restricts in the app.

And while parents are the main focus, there’s some key insight for marketers too, which could help to guide your Snap content approach.