Publication Date: December 18, 2023

Digital Wellbeing / Social Media and Apps

Our online safety experts have received reports from our Safer School partners about a peer support app that could be reinforcing harmful behaviour. Vent markets itself towards children and young people as a platform where they can express themselves, “chill out” and have their mood “lifted”. We have reviewed and tested this app and found that it features unhealthy and potentially dangerous behaviours, some of which are age-inappropriate or illegal.

We’ve produced this brief for parents, carers and safeguarding professionals to learn about what Vent is, the risks it poses to those in their care, and how they can respond.

Here’s what you need to know…

What is Vent?

Age Rating

Vent states that users can only post to the app if they are 'aged 13 years or older'. However, other ratings suggest 16+ or 17+ age limits, as the app may include "suggestive themes" such as profanity/crude humour, mild sexual content, nudity, and drug use references – content that is not suitable for Vent's suggested age rating.

Our testers also found that age verification on the platform is ineffective. Users need only an email address and a password to create an account, and are simply asked to tick a box that says, "I am over 16 years of age."

What are the key functions?

‘I Need Help’

Within the user profile section, users can find an 'I Need Help' option. This is meant to provide anyone who is "having a really difficult time" with anonymous support from a trained provider. Vent also includes instructions for a 24/7 text line it calls 'Vent Crisis Messenger'. After testing the app, we discovered this is actually connected to the Shout helpline.

Our online safety experts reached out to Shout with concerns about the use of their helpline in the Vent app. A Shout spokesperson confirmed that Shout has no relationship with Vent, and that the signposting had been done without Shout's knowledge or consent.


What are the risks?

- There is an abundance of inappropriate and explicit content, including harmful or triggering topics such as sexual fetishes, eating disorders, and self-harm methods. These appeared on the app even after our testers chose to block these topics.
- Negative and unhealthy language is widespread in user posts. Groups include 'Eating Disorders', 'Depression', and 'Abuse', and user posts carry similar tags.
- Our researchers found no controls in place to safeguard against a user suggesting continuing the conversation on a different platform. This leaves a child or young person at risk of moving to another app or site that has call and/or video chat options, or end-to-end encryption, escalating the potential for victimisation.
- Peer support and self-help apps have become popular due to limitations on in-person help. However, the absence of professional advice could mean a young person develops unhealthy coping mechanisms.
- A young person may come to rely on, or give, inaccurate or harmful advice that fits in with their own views and beliefs. This may inspire or encourage self-destructive behaviours.
- Some posts contain graphic or explicit details of harmful behaviours and habits, which could lead to irreversible damage or medical emergencies if a young person attempts them.
- Young people might use these apps to try to 'fix' problems that need professional help because they are embarrassed, confused, scared, or feel misunderstood.
- A vulnerable child could be more easily swayed by those who 'understand them', introducing harms such as bullying or grooming by other users who wish to exploit their emotional state.

Our Advice/Top Tips
