
Enhancing Online Security for Kids: Breakdown of Amendments to the Online Safety Bill

The UK Online Safety Act, whose child protection duties take effect from July 25, 2025, marks a landmark step forward in keeping children safe online. The legislation requires digital platforms to implement robust age verification so that users under 18 cannot access age-restricted material, particularly adult content.

Under the Act, age checks must be technically accurate, reliable, and fair, without bias against particular user groups. Ofcom, the UK regulator, has published a non-exhaustive list of acceptable methods, including cross-referencing email data, authentication via mobile network operators, and facial age estimation techniques that do not store biometric data. Simple self-declaration of age, or payment methods that do not themselves verify age, are not acceptable.
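
To illustrate how a service might gate access using only approved checks, here is a minimal Python sketch. The AgeCheckMethod names, the APPROVED_METHODS set and the verify_age function are hypothetical stand-ins, not anything defined by Ofcom; a real platform would delegate to a certified age-assurance provider rather than this placeholder logic.

    from enum import Enum, auto

    # Hypothetical illustration: the method names mirror Ofcom's non-exhaustive
    # list (email cross-referencing, mobile network operator checks, facial age
    # estimation). A real service would delegate to a certified provider.
    class AgeCheckMethod(Enum):
        EMAIL_CROSS_REFERENCE = auto()
        MOBILE_NETWORK_OPERATOR = auto()
        FACIAL_AGE_ESTIMATION = auto()
        SELF_DECLARATION = auto()           # not acceptable under the Act
        PAYMENT_WITHOUT_AGE_CHECK = auto()  # not acceptable under the Act

    APPROVED_METHODS = {
        AgeCheckMethod.EMAIL_CROSS_REFERENCE,
        AgeCheckMethod.MOBILE_NETWORK_OPERATOR,
        AgeCheckMethod.FACIAL_AGE_ESTIMATION,
    }

    def verify_age(method: AgeCheckMethod, provider_says_over_18: bool) -> bool:
        """Grant access to age-restricted content only via an approved method."""
        if method not in APPROVED_METHODS:
            # Self-declaration and bare payment details cannot gate adult content.
            return False
        return provider_says_over_18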

Data privacy protections are also a priority. Platforms can use methods like facial scans or photo ID to verify age, but they must handle this data securely. The law emphasizes minimizing data storage and ensuring third-party data handlers meet strict security standards to reduce the risk of breaches.
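
The data minimization point can be made concrete with a small sketch: keep only the outcome of a check, never the raw facial scan or photo ID. The AgeAssuranceRecord type and its field names are assumptions for illustration, not anything prescribed by the Act or the Information Commissioner's Office.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical sketch of data minimization: persist only the outcome of an
    # age check, never the underlying facial scan or photo ID.
    @dataclass(frozen=True)
    class AgeAssuranceRecord:
        user_id: str
        is_over_18: bool
        method: str          # e.g. "facial_age_estimation"
        checked_at: datetime

    def record_age_check(user_id: str, raw_evidence: bytes, is_over_18: bool) -> AgeAssuranceRecord:
        """Return a minimal record; the raw scan or ID image is never stored."""
        record = AgeAssuranceRecord(
            user_id=user_id,
            is_over_18=is_over_18,
            method="facial_age_estimation",
            checked_at=datetime.now(timezone.utc),
        )
        del raw_evidence  # dropped immediately, not written anywhere
        return record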

To mitigate circumvention, platforms should implement controls that prevent users from bypassing age verification, including addressing children's use of VPNs and content that encourages circumvention.
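
One way a platform might operationalise this is to require a fresh age check when an unverified session arrives through a known VPN or proxy range. The sketch below is purely illustrative: VPN_RANGES stands in for a commercial IP-intelligence feed, and nothing in the Act mandates this particular heuristic.

    import ipaddress

    # Illustrative heuristic only: trigger re-verification when an unverified
    # session arrives from a known VPN/proxy range. VPN_RANGES stands in for a
    # commercial IP-intelligence feed; the Act does not mandate this approach.
    VPN_RANGES = [ipaddress.ip_network("203.0.113.0/24")]  # documentation range

    def needs_recheck(client_ip: str, age_verified: bool) -> bool:
        addr = ipaddress.ip_address(client_ip)
        via_vpn = any(addr in net for net in VPN_RANGES)
        return via_vpn and not age_verified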

The Online Safety Act places clear and unequivocal duties on platforms to protect freedom of expression, in addition to their legal duties to keep children safe. Services must comply with the UK's data protection laws, and the Information Commissioner's Office has set out the main data protection principles that services must take into account in the context of age assurance.

Internet Matters, a UK-based organisation focused on online safety for children, sees today as an important milestone towards ensuring that online services are designed with children's safety in mind. Chris Sherwood, the Chief Executive, supports the Online Safety Act, stating that it can be a vehicle for significant and lasting change.

The Act does not require platforms to age gate any content other than that which presents the most serious risks to children, such as pornography or suicide and self-harm content. However, platforms are also expected to ensure that strangers have no way of messaging children, for example by preventing children from receiving direct messages from unknown accounts and by not recommending accounts for them to connect with.
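
A simple way to picture the expected messaging restriction is a permission check that blocks direct messages to child accounts from anyone outside their approved contacts. The function and field names below are hypothetical and not drawn from any specific platform's API.

    # Hypothetical permission check: block direct messages to a child account
    # unless the sender is already an approved contact. Names are illustrative.
    def can_send_dm(sender_id: str, recipient_is_child: bool,
                    recipient_contacts: set[str]) -> bool:
        if recipient_is_child and sender_id not in recipient_contacts:
            return False  # strangers cannot message children
        return True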

Failure to meet these obligations can lead to severe penalties, including fines of up to 10% of global revenue or £18 million, whichever is greater. Barnardo's Lynne Perry supports the new protections, but emphasizes the need for robust enforcement.
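
The penalty ceiling is a straightforward "greater of" rule, shown below as a short Python calculation; the £500m revenue figure is invented purely for illustration.

    # The Act caps fines at the greater of £18 million or 10% of global
    # revenue; the £500m revenue figure below is invented for illustration.
    MAX_FIXED_PENALTY_GBP = 18_000_000
    REVENUE_SHARE = 0.10

    def max_penalty(global_revenue_gbp: float) -> float:
        return max(MAX_FIXED_PENALTY_GBP, REVENUE_SHARE * global_revenue_gbp)

    print(max_penalty(500_000_000))  # 50000000.0, i.e. £50m for a £500m-revenue firm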

The Online Safety Act is considered the most significant step forward in child safety since the internet was created. People now have to prove their age to access pornography or other harmful content on social media and other sites. Platforms must confirm a user's age without collecting or storing personal data, unless absolutely necessary.

DSIT media enquiries can be directed to [email protected] or 020 7215 3000 (Monday to Friday, 8:30am to 6pm). The Act comes into force to protect under-18s from harmful online content such as pornography and material relating to self-harm, suicide and eating disorders.

While the Act is a significant step forward, it is crucial to remember that it does not ban any legal adult content, but protects children from viewing material that causes real harm in the offline world. According to Internet Matters, 3 in 4 children aged 9-17 experience harm online, from exposure to violent content to unwanted contact from strangers. Children as young as 8 have accessed pornography online, and 16% of teenagers have seen material that stigmatises body types or promotes disordered eating in the last 4 weeks.

The Online Safety Act is a testament to the UK's commitment to ensuring a safer online environment for its young population. It is expected that platforms will adapt to these new requirements, ensuring that children can enjoy the benefits of the internet without the risk of harm.

  1. The UK Online Safety Act, applicable from July 25, 2025, extends beyond blocking adult content into areas such as health, wellness and mental health, where platforms must ensure that children are not exposed to content promoting disordered eating or stigmatizing body types.
  2. Alongside the education and self-development opportunities the digital world offers, the Online Safety Act fosters a safer environment for under-18s, requiring providers to implement robust age verification systems while safeguarding privacy and minimizing data storage.
