Most UK adults think social media platforms should provide better measures to protect the mental health of users.
In a survey of 4,000 people conducted by wellbeing company Soul Analyse, which focused on social media use during the pandemic, 53% of respondents felt that social media platforms were doing a poor job of protecting user wellbeing, while only 4% thought the opposite.
When asked how to tackle the issue, most respondents were in favour of measures to crack down on abusive and potentially triggering content, alongside tougher consequences for deliberately antagonistic accounts.
Over half (56%) of those in favour of better mental health measures think that platforms should offer details about helplines or supportive communities alongside potentially triggering content.
Almost two thirds (64%) of respondents also think that sensitive content should carry a trigger warning, while around half think there should be an option to easily hide sensitive content.
The results may be a reflection of people spending more time than ever on social media during the pandemic: 48% of respondents said they were using social media more, with 67% spending up to an additional hour doing so.
Regarding content that negatively affected respondents, posts about COVID-19 (65%) and potentially false information (54%) were the leading causes of feelings of anxiety, sadness or fear.
A vast majority of users (73%) also said that social media 'trolls' should be banned permanently from the platforms on which they offend.
“The majority of social media users clearly feel platforms must play their part in creating a healthy online environment," said Stephanie Dunleavy, founder of Soul Analyse.
"Social media companies could introduce a ‘conduct agreement’ box for people to tick each time they log on, which may help to deter trolls."
In December of last year, the government published its final response to the previously proposed Online Harms white paper, which sets out new rules and restrictions for websites that allow users to post their own content and interact.
The rules will directly affect social media firms, which could be fined, have their websites blocked, or see senior executives held liable if they fail to remove or limit the spread of illegal content on their platforms.
In addition, the most popular social media platforms are expected to go further by providing and enforcing terms and conditions around legal content that could cause physical or psychological harm to adults, such as content about suicide or self-harm.
"I’m unashamedly pro tech but that can’t mean a tech free-for-all," said Digital Secretary, Oliver Dowden, in a press release regarding the proposed new regulations. "We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.
"This proportionate new framework will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives."
Results from the survey follow the proposed changes from the government, with 53% saying that social media platforms should provide clear guidelines on how users should not interact with one another.
Over a third (36%) of respondents also said that companies should look to the government for guidance on how to limit potentially harmful content.
Although the survey results paint a mostly negative picture of social media's effect on mental health, many respondents also described a positive effect on their wellbeing from using social media.
Overall, 57% reported experiencing positive emotion when browsing social media, 64% said that optimistic posts were the main reason for feeling happy or calm, and 40% said social media helped them feel 'connected'.
Written by Marco Ricci
Editor and contributor for Talking Mental Health