Meta, TikTok and Snap have agreed to be independently rated on how safe their platforms are for teens. The idea is simple: each platform gets a public grade that parents, schools, brands and regulators can actually point to.
The headline is that this could change what gets pushed and what gets flagged when teens are likely to see your content. If your videos reach under-18s or regularly get recommended to them, expect tighter enforcement as platforms try to protect their scores.

What’s changed?
This is part of a new initiative led by the Mental Health Coalition called Safe Online Standards (S.O.S.). Platforms that join agree to share evidence of how they protect teens – things like safety tools, design choices, and how they handle exposure to harmful content – and then an independent panel reviews that info to publish a simple public rating.
If a platform scores well, it can point to that rating as proof it’s safe for young users. If it doesn’t, the pressure is on to tighten things up fast, because nobody wants to be the app with the worst grade.
It’s also worth seeing this in the wider context, with some countries moving towards hard limits for teens on social apps. In December 2025, Australia brought in rules requiring age-restricted platforms to take reasonable steps to prevent under-16s having accounts. France has voted through a bill to ban under-15s from social media, while Italy currently sets its age limit for social platforms at 14.
Against that backdrop, the Safe Online Standards ratings are a proactive step. They aren’t a ban, but they let platforms show they’re taking teen safety seriously, so regulators have less reason to push for tougher restrictions.

Why this matters for creators
Even though this is a voluntary move from the platforms, it’s still likely to change what creators feel day-to-day. Once a public rating exists, platforms have a clear reason to look safer, and the quickest way to do that is usually by adjusting how they handle potentially risky videos.
If your content reaches teens or could be recommended to them, you might see more scrutiny around framing, language, and anything that could be interpreted as harmful or overly intense.
This increased responsibility towards users also fits into a wider pattern creators have been watching for a while. Regulation is creeping into more corners of social media, and even when the rules aren’t aimed at creators directly, creators still end up adapting to them.
Filmbro co-founder Arnaud predicted exactly this in a recent Uppbeat interview: “I think we’re going to see more and more regulation. In some countries you used to be able to do whatever you wanted, but now you have to be clear on what you can and can’t do.”
These teen safety ratings won’t force you to rewrite your content overnight, but they can nudge platforms toward stricter standards. In time, that will influence what gets pushed, what gets limited, and what brands feel comfortable sponsoring.

Uppbeat’s take: Safer platforms are a positive. Make your content easy to trust.
Platforms have a duty to protect the people using them, especially younger viewers, so moves like this are always welcome. If a public teen-safety grade pushes Meta, TikTok and Snap to take harmful content exposure more seriously, that’s a net positive for creators too. It usually means clearer expectations, fewer grey areas, and less surprise enforcement.
You don’t need to sanitise your channel to fit in
If your content might be seen by teens, the practical goal is to remove the avoidable stuff that could get you flagged. Start with the videos that already reach the widest audience and give them a quick check for anything inappropriate. Titles, thumbnails, and the spikiest moments should feel like they're serving the point of the video, not trying to win easy engagement with shock tactics.
When you touch heavier themes, make the context clear early on. A simple line that frames intent goes a long way, whether it’s meant to be educational, a personal account, or something fictional. It helps viewers understand what they’re watching, and it helps platforms and brands see that you’re handling the topic responsibly.
Watch for platform tweaks and keep your monetisation clean
If you find your reach shifts around certain themes, don’t panic and rewrite your whole format overnight. These kinds of initiatives tend to trigger platform-wide tweaks in recommendations and moderation as apps try to improve their scores. Watch what happens across a few uploads, then adjust with intent instead of reacting to one dip.
And while scrutiny is rising, it’s worth keeping your content monetisation-safe too. The last thing you want is avoidable claims on your audio or visuals on top of everything else. Using royalty-free music, sound effects, and motion graphics from Uppbeat keeps your edits polished and brand-friendly without putting ad revenue at risk.
