Online Safety Act: Ofcom publishes guidance for online video games industry

Ofcom has issued guidance for the online video games industry on how to comply with the Online Safety Act 2023 (OSA). 

The guidance sets out the various risks associated with online gaming from the perspective of the OSA. For example, it points to Ofcom’s research indicating that nearly half of 13- to 17-year-olds report being concerned about trolling, abusive behaviour and threats when playing games online. Other organisations have highlighted the use of voice or text chat services in online multiplayer games to groom children, and even the recruitment of children into trafficking through online video games.

Central to the OSA is placing greater responsibility on services to root out such risks, and the online gaming industry is no exception. As the guidance explains, services which enable users to interact with each other, or to create, share or upload content, may well be caught by the OSA. Ofcom sets out examples of how this might apply in the context of online video games services, such as users interacting “by creating or manipulating player profiles, avatars, objects, and the environments themselves, or by using voice and text chat (including, for example, team-based channels or open-world chats)”. Similarly, it states that “online safety rules would apply to content where games use matchmaking systems to connect users with each other, including strangers, through mechanisms such as populating lobbies and/or by assigning players to teams, and where services enable livestreaming”.

Ofcom stresses that those who are unsure whether they are caught by the OSA’s regulatory regime should take advice and consult Ofcom’s resources. If the OSA does apply to them, they must (1) carry out an illegal content risk assessment, (2) carry out a children’s access assessment and, if applicable, a children’s risk assessment, and (3) put in place relevant protections.

To read the guidance in full, click here.