Online Safety: Ofcom publishes latest research and guidance on media literacy


Ofcom has published its latest research into the experiences of adults and children online, alongside guidance on how online services can promote media literacy on their platforms through so-called ‘on-platform interventions’.

Ofcom is charged with supporting people to navigate the online world as safely as possible, in part through a statutory duty to promote media literacy. As Ofcom explains, “media literacy is about people’s knowledge, understanding and skills. Together, these play an important role in helping internet users to engage with online services critically, safely, and effectively, so that they can maximise the benefits and minimise the risks associated with being online”. Its latest research suggests that more needs to be done to improve media literacy among adults. For example, whilst the majority of adult users said that the benefits of being on social media outweighed the drawbacks, many admitted to inadvertently amplifying untrue stories, some still find avoiding scammers difficult, and close to three quarters of respondents doubted their ability to distinguish between human-generated content and content generated by AI.

Ofcom has already consulted on 12 Best Practice Principles for Media Literacy by Design (on which we commented here). Alongside the latest research on adults’ experiences online, it has published a summary of responses to that consultation, as well as guidance focusing in particular on so-called ‘on-platform interventions’. These include measures such as overlays, which provide users with additional information before they view content, as well as labels, prompts, and notifications.

Ofcom suggests that services go beyond using on-platform interventions purely as preventative measures to address potential harm, and instead adopt a more ‘proactive’ approach, considering how media literacy skills “can be encouraged and developed by their users/communities when interacting with the service”. Platforms are also encouraged to consider interventions that are suitably tailored to specific user groups that may be particularly vulnerable: Ofcom gives the example of designing interventions that tackle content and activity that disproportionately affects women and girls online. It also makes clear that any media literacy initiative needs to be monitored and evaluated effectively to ensure that it is working as intended.

Ofcom has also published its research into the experiences of younger age groups online. It found that approximately one in four children aged five to seven now owns a smartphone, three quarters use a tablet, and a third use these devices without supervision. Concerningly, it also reported that 40% of 8-17 year olds admitted to providing a fake age to gain access to a new site or app, and that there was a “disconnect between older children’s exposure to potentially harmful content online, and what they share with their parents about their online experiences”. Furthermore, only three in ten children over eight years old can recall ever having had regular lessons on online safety in school, and only a third of parents know the correct minimum age requirements for most social media platforms.

To address concerns about children’s safety online, Ofcom has confirmed that it is launching a consultation in May on its draft Children’s Safety Code of Practice, as well as a consultation later in the year on how automated detection tools can be used to mitigate the risk of illegal harms and content most harmful to children.

The research into the experiences of adults and children online can be read – respectively – here and here. The guidance on on-platform interventions can be read here.