Release: Half of teens exposed to harmful social media




    • Two in three don’t report posts that they believe shouldn’t be allowed
    • Three in four people believe social media companies should do more to protect children from harmful content

    Nearly half (46%) of teenagers aged 13-17 who use social media have seen posts that they believe should not be allowed, new research published today by the Chartered Institute of Marketing (CIM) has revealed.

    The survey of over 2,500 adults and teenagers published ahead of the close of the Government’s consultation on online harms, shows that 95% of young people aged 13-17 have a social media account, with the most popular being YouTube (79%), followed by Instagram (73%), Snapchat (66%) and Facebook (45%).

    Despite many children coming across potentially harmful posts on social media platforms, very few are doing anything about them. Almost two thirds (62%) of teenagers who have seen content they believe should not be allowed say they either rarely or never report these posts. Only 7% say they always do.

    Seeing these posts does seem to be discouraging some children from engaging on social media; close to half (44%) agree that they would be put off from engaging in discussion and conversations online. But very few are prepared to give up their accounts; two in three (66%) said that seeing posts on social media that should not be allowed would not make them want to delete their account, while more than half (52%) said it would not put them off signing up for an account in the first place.

    The survey put similar questions to adults and found that almost half (44%) of those who had seen harmful content on social media say they rarely or never report it, while only one in five (20%) say they always do.

    Who is responsible?

    When it comes to who should be protecting children under the age of 18 from harmful or inappropriate content on social media, the public place responsibility on parents and social media companies.

    Three quarters of people over 18 say it is the responsibility of parents/guardians (76%) and social media companies (74%) to protect children on social media.

    However, most people believe strongly that social media companies should be removing harmful content from social media.

    • Who’s responsible? Eight in ten (83%) said that social media companies have a responsibility to monitor for harmful content on social media. Many people also felt there was a role for government (49%) and for individuals themselves (57%).

    • Who pays? When it came to paying for dealing with harmful content on social media, the vast majority of the public felt this was the responsibility of social media companies: 67% of adults said the cost of monitoring and regulating harmful content should be borne by the platforms themselves, compared with only 14% who said government was responsible.

    Revenue from marketing and advertising is the main source of income for most social media companies, and the Chartered Institute of Marketing believes more must be done to protect users if UK businesses are to continue spending their marketing budgets reaching customers through social media platforms.

    Chair of the Chartered Institute of Marketing, Leigh Hopwood, said:

    “Social media is a crucial marketing channel, but professional marketers need to be confident that enough is being done to protect the users of these platforms, especially children. With regulation of social media under review we felt it was important to seek out the views and experiences of the users themselves.

    It is alarming that so many children have seen inappropriate posts on social media and failed to report them. Moreover, while more adults do report harmful content, it is concerning that only one in five always do so.

    Our research shows that we could make a huge difference quickly if we all take the simple action of hitting the report button when we see something that shouldn’t be on social media. When the new regulations take effect, social media companies will have a legal responsibility to act on content once we have reported it.

    We are calling for a public education campaign to show people, especially children, how to report harmful content and to highlight the importance of reporting it whenever you see it. We don’t believe we should wait for the regulations; this is something that can happen now.”

    The research also demonstrates the prevalence and impact of harmful content being seen by adults on social media:

    • Harmful content: Three in ten (29%) adults said that in the last six months they had seen content that could be damaging if seen by children, could encourage illegal activity, or could be considered abusive or offensive. Only one in five (21%) said that they had not seen harmful content, while a third (32%) were not sure or could not recall.

    • Who’s seeing it? Younger adults are much more likely to recall seeing harmful content than older generations; 46% of 18-24 year olds say they had seen it in the last six months, compared with only 16% of those aged 55 and over. Those most active on social media are the most likely to have seen harmful content: among those active on all three of the most popular platforms, Facebook, Instagram and Twitter, 44% say they had seen harmful content in the past six months.

    • Stifling debate: Three quarters of people who use social media (74%) say that the presence of abusive or offensive content can put them off engaging in discussions on social media, while more than half (52%) agree that it would make them consider deleting their account.



    Notes to editors

    All figures, unless otherwise stated, are from YouGov Plc. Total sample size was 2,032 adults and 550 children aged 13 to 17. Fieldwork was undertaken between 31st May and 4th June 2019. The survey was carried out online. The figures have been weighted and are representative of all GB adults (aged 18+) and all GB children (aged 13 to 17).

    The Government’s Consultation on Online Harms was announced on 8 April 2019 and closes on 1 July 2019.

    For media enquiries

    For further information please contact the Chartered Institute of Marketing press office at Good Relations on CIMTeam@goodrelations.co.uk

    For CIM enquiries

    James Delves
    CIM Head of PR and External Engagement

    Ally Lee-Boone 
    Content and Engagement Executive


    About CIM

    The Chartered Institute of Marketing (CIM) is the world’s leading marketing body. CIM’s mission is to create marketing advantage for the benefit of professionals, business and society, with a focus on export, data and skills. It believes marketing is the critical factor in driving long-term organisational performance.

    CIM provides members and organisations with five key benefits: 

    • Partnership – CIM is a professional and organisational partner to support performance and career development
    • Education – CIM allows individuals and businesses to continuously upskill
    • Information – CIM keeps members up to date with the latest marketing thinking, and keeps organisations at the forefront of practices
    • Connection – CIM provides access to services, expertise and peers
    • Recognition – CIM is the global benchmark of professional competence

    For more than 100 years, CIM has supported, represented and developed marketers, teams, and leaders across the profession. There are 130 CIM study centres in 36 countries and exam centres in 132 countries worldwide. In the last year, over 7,500 people registered at over 230 UK CIM events.

    Find out more about CIM by visiting www.cim.co.uk.






