Source: Department for Science, Innovation and Technology. Published 19 May 2024.
The Online Safety Act 2023 (the Act) is a new set of laws that protects children and adults online. It places a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms. Providers will be required to implement systems and processes to reduce the risk that their services are used for illegal activity, and to take down illegal content when it does appear.
The strongest protections in the Act have been designed for children. Platforms will be required to prevent children from accessing harmful and age-inappropriate content, and to provide parents and children with clear and accessible ways to report problems online when they arise.
The Act will also protect adult users, requiring major platforms to be more transparent about the kinds of potentially harmful content they allow, and giving people more control over the types of content they want to see.
Ofcom is the independent regulator for online safety. In codes of practice, it will set out steps providers can take to fulfil their safety duties, and it has a broad range of powers to assess and enforce providers’ compliance with the framework.
Providers’ safety duties are proportionate to factors including the risk of harm to individuals and the size and capacity of each provider. This ensures that, while safety measures must be put in place across the board, small services with limited functionality are not required to take the same actions as the largest corporations. Ofcom is required to take users’ rights into account when setting out these steps, and providers have a parallel duty to pay particular regard to users’ rights when fulfilling their safety duties.