The Pupil Safeguarding Review: Top 5 platforms pupils feel unsafe using


Type of Resource

Research and Evidence

Publication Date

June 27, 2023


Digital Wellbeing / Gaming / Online Bullying / Screen Time / Social Media and Apps / Video and Livestreaming
New research has uncovered the top five platforms young people feel the most unsafe whilst using. The Pupil Safeguarding Review investigated the effectiveness of school safeguarding policy, with an aim of understanding whether pupils feel safe in a variety of settings.
Pupils most commonly feel unsafe whilst using:

Roblox
Snapchat
Instagram
TikTok
Fortnite
Our online safety experts have created a guide to help you understand why children and young people might feel unsafe on these platforms and what you can do to help create safer online experiences.


What is Roblox?

Roblox is an online gaming and game creation platform. Users can play and create games for others to play. The game includes social features such as friend requests and chats in which players can design an avatar and speak to others while they play.

Roblox has a PEGI rating of 7 in the UK.

The platform’s Terms of Use say that users under the age of 18 require parental consent to use the services.

What are the risks of Roblox for children and young people?

While Roblox has parental controls, there are still instances where conduct within the game may be inappropriate for a child’s developmental stage.

Read Roblox ‘The Game’ – Our latest Online Safety Update

The concern is that this kind of activity on the platform may normalise inappropriate interactions between children, or between children and adults; normalising sexual activity with children is often part of the grooming process.

If parental controls are not configured, and young people play unsupervised, they may be at risk of exposure to harmful content and contact from strangers.
While there is moderation within Roblox, it is not guaranteed to catch everything. Word filters can be bypassed when users disguise explicit words, for example writing ‘sex’ as ‘S£Gs’.
Some children have reported unsolicited friend requests from strangers. Roblox may see children encounter people who use anonymity to conceal their intentions or their true age. Those with ill intentions could use this as an opportunity to build rapport and move to a more private server.
In some cases, platforms like Discord are used to live stream games and chat one-to-one away from the game.
The temptation or requirement to buy in-game currency to improve gameplay can be used to exploit children.


What is Snapchat?

Snapchat is a social media platform used to share photos, messages, and short videos that can be customised with filters, text, and stickers.

It’s one of the most popular social media platforms and is widely used by young people to send ‘disappearing’ pictures, videos, and messages.

Users must be 13 or older to use Snapchat.

One report found that children in the UK spent 82 minutes a day on Snapchat in 2021.

What are the risks of Snapchat for children and young people?

As parents and carers are unable to see the content of the messages or ‘Snaps’ their young people send or receive, there is no way to know if they contain anything that might be inappropriate or harmful.
Young people might feel pressured to share content on Snapchat due to ‘Snapstreaks’ – a tally of messages sent between users on consecutive days that many aim to get as high as possible. The disappearing messages, images and videos feature on Snapchat may also make young people feel ‘safer’ about sharing content that is inappropriate or harmful. Although this content disappears after a short period of time, it is still possible for it to be recorded or screenshotted and for the sender to lose control of that content.
Quote from Lucy, aged 15 “I think snapchat could make their app more safe by monitoring what is sent from person to person as snapchat deletes a message as soon as it’s been seen by the receiver meaning that if someone wanted to report another person they may not have any evidence. I think in general a good idea would be to add links on all of the apps home pages to different anti bullying and safeguarding websites so people can seek advice straight away especially if they are not comfortable asking for help in real life.”

Snap Map and Snapchat Live Location sharing allow users to share their location with friends via the app. These features have the potential to be misused.

For example, a young person could feel pressured to share their location with their Snapchat ‘friends’, but not all contacts on the app are people they know in real life, meaning they could be exposing their location to strangers.

What’s the difference between Snap Map and Live Location?

Snap Map allows users to share their general location, while the Live Location tool shares a more precise position that updates in real time and continues to run even when the app is closed.


What is Instagram?

Instagram is a photo and video sharing platform from Meta (previously Facebook). It’s one of the most popular online platforms, with over 30 million users in the U.K.

Users can interact with others by commenting on and liking posts, following profiles, sending private messages and more.

Instagram’s minimum age requirement is 13 years old.

What are the risks of Instagram for children and young people?

Despite sensitive content control and safety settings, children and young people may be exposed to inappropriate, upsetting, or harmful content. Beyond sexual and violent content, some platform users also create communities that encourage harmful behaviour such as content that encourages disordered eating behaviours and self-harm.

If a profile is set to public, anyone can send a direct message or comment on a user’s posts or reels.

The bio section of every profile is always public, no matter the age of the user. Often young people add their Snapchat username to their bio, creating risk as anyone can view this information and add them.

Users can tag their location on their profile posts and in stories which can then be viewed by followers. This could leave young people open to risks, such as strangers gaining access to their location.

Learn about Instagram’s latest safety updates.

The pressure of ‘curating’ a feed may lead to obsessive behaviour and influence the choices a young person makes (e.g., not doing or enjoying something because it’s not ‘Instagrammable’).


What is TikTok?

TikTok is a social media platform that allows users to share short-form videos and photos. Users share content ranging from lip-syncing videos to dance challenges and daily routines.
TikTok has an age rating of 13+ years old.

What are the risks of TikTok for children and young people?

Any social media brings the risk of cyber bullying, but the risk can be heightened on platforms where users specifically share personal elements of their lives, hobbies and personality. For example, a young person may share their interest in an unusual or unpopular hobby and then be targeted for this.
Similar to many other social media platforms, young people should be cautious about what details they are sharing via stories. They may not realise that they are giving away information about where they live, what school they go to, their current location etc.
Quote from Chloe (not her real name), aged 13 “I feel the least safe on TikTok because people can find you through mutual connections or your account will be suggested to people in your contacts. TikTok is also not a secure app and strangers can tag you in videos or comments.”
Users on TikTok can interact with each other by liking, commenting, following and messaging. This could leave young people at risk of inappropriate and harmful contact and being groomed, particularly as they may share interests that could be misused to build connection.

Young people may be tempted into creating and sharing content in a bid to attract likes, followers and status. Social media apps like TikTok are often centred around an aim to become popular and/or have content go viral. A young person may engage in more risky behaviour as a result, including taking part in dangerous TikTok challenges that may result in injury or even death.

Read more about online challenges


What is Fortnite?

Fortnite is an online multiplayer video game. It is a “shooter-style” game, meaning players battle against opponents using a range of weapons, but it is also a construction-style game that allows players to create and personalise their own game components.
Fortnite has a PEGI rating of 12.

What are the risks of Fortnite for children and young people?

Fortnite makes full use of persuasive design features to captivate users and encourage repeated gameplay. The game has continued updates and changes that can contribute to the feeling that the game is always ‘active’ and users may experience fear of missing out if they don’t keep playing.
Young people may feel pressured into spending money to buy the in-game currency ‘V-bucks’. This allows users to purchase add-ons like character skins and ‘dances’ (special moves their characters can make).
The voice chat feature means users can interact with and hear from strangers, leaving young people open to harmful or inappropriate conversation.

Top Tips

Many of the above risks apply to other social media platforms and games and although these are some of the main risks our Online Safety Experts have identified, the list is not comprehensive. That’s why we’ve created our top tips that apply to all of the above platforms plus more.
Don’t be afraid to make your own account for the games and social media platforms your child or young person is using. Seeing how it works yourself will help to give you a better understanding of the experiences your child is having.
Just as you teach the child or young person in your care how to be responsible with money offline, talk to them about the value of online money too. Make sure children in your care know to ask for your permission before purchasing anything in the app. Suggest using pocket money as a way for them to learn the value of money.
Advise children and young people never to share personal information online, even with friends, such as their phone number, real name, address, school, or the names of clubs they attend. They should also avoid mentioning usernames in public.
Talk about what’s appropriate to share online and what to do if someone asks them to send a photo, video or message that makes them feel uncomfortable. And just in case, make sure the young person in your care knows what to do if they lose control of an image.
Have conversations with the children and young people in your life about what cyber bullying looks like, what to do if they experience it and who the trusted adults they can turn to are. Remember, this works both ways so ensure you talk about being kind online too!
Remind them that if they are uncomfortable or don’t want to do something, they do not have to do it. They might be feeling pressure to ‘look popular’ or ‘not be boring’. Discuss the feeling of missing out and how to achieve a healthy balance between online and offline life. You may also want to discuss influencers and that not everything we see on social media is a true reflection of real life. Visit our Home Learning Hub for more information.
From how to mute mics, make profiles private and report others, you can find a wealth of information on the most popular platforms on our Safety Centre.
For guidance on having discussions with your young person about their online world, watch our video below and use our handy ‘Conversations Starters’ guide.

Conversation Starters

Discussing Online Life With Your Child
