
In the Spotlight: Online Safety Bill

Earlier this month, Jess attended the SACPA Digital Safeguarding conference, which was opened by the Rt Hon Nadine Dorries MP, Former Secretary of State for Digital, Culture, Media and Sport. Whilst in office she was responsible for bringing the Online Safety Bill to parliament in 2022, a Bill which she believes is crucial for the safety of children and young people (those aged 18 years and younger), who make up one in three of global internet users. As educators and parents, it is impossible for us to police young people’s internet usage, so it is incumbent on government to hold internet providers and websites to account when their platforms cause harm to children and young people. Dorries believes that, despite some of the Bill being ‘watered down’, its current scope will improve the safety of children and young people online.

The Online Safety Bill is wide-ranging, technical and complex – so what do we need to know? The Bill is currently in the House of Lords and, if passed, will become law later this year and will impact everyone in the UK. The aim of the Bill is to make the UK the safest place to be online by:

1. Making it more difficult for young people to access harmful content online.

2. Imposing new legal responsibilities on companies that provide online services, such as social media sites and search engines.

3. Providing adults with greater choice over the content they see when using social media.

The Bill doesn’t apply to all websites – it focuses on online services which host content posted by other people (e.g. Facebook, Twitter and Instagram) and search services (e.g. Google, Yahoo and Bing).

What protections will it provide for children and young people?

There are currently very few laws that regulate what content can be shared online. Online safety campaigners have long been concerned about the impact the unregulated internet is having on users, especially children. The Bill responds to this by creating a legal responsibility (a ‘duty of care’) for the operators of user-to-user services to protect users under the age of 18 from harmful content. The Bill includes new requirements such as:

• Enforcing minimum age requirements – where platforms specify a minimum age for users, the Bill requires them to clearly explain how these are enforced.

• Publishing risk assessments – the Bill requires user-to-user services to carry out risk assessments on the dangers posed to users. These must be published and allow parents, guardians and the wider public to understand the potential risks posed by using the service.

• Protecting children from harmful content published on the service – this includes, for example, ensuring that potentially harmful content is not displayed to users under 18. This content will, however, continue to be available to adults.

• Properly applying the Terms and Conditions – many social media sites already have policies about what content should or shouldn’t be allowed. In practice, these are inconsistently enforced. The Bill requires that affected services properly apply these rules in order to protect users.

How will the Bill be enforced?

Ofcom, the UK’s existing communications regulator, will be responsible for making sure that platforms within scope comply with the new legal duties introduced by the Bill. Ofcom will therefore be responsible not only for content broadcast on TV and radio but also for content published online. It will have powers to obtain information from website operators on how they deal with online harms and to act if they fail to comply with their new duties. Ofcom will have power to investigate website operators and issue financial penalties to companies that do not comply with the new regulations (a maximum fine of either £18 million or 10% of the company’s global turnover – whichever is higher). Providers will face criminal consequences if they breach the Bill’s regulations.

It is recognised that the Bill is a landmark step towards improving online safety for everyone. It has gone through a lot of changes since it was first introduced, but does the Bill go far enough? Some campaigners are concerned that the latest version of the Bill does not go far enough in addressing the volume of harmful content online, because many types of websites and potentially harmful content fall outside the scope of the Bill. Others think that the Bill goes too far in limiting what can be said online. As the Bill is not yet law, it may go through further revisions before approval. Whatever your thoughts, there is no doubt that the Bill will bring a higher level of scrutiny to the online world and require website and social media developers to think carefully about protections for children and young people.
