
The Impact of the Online Safety Act on UK Businesses


Introduced to address growing concerns about the safety of internet users – particularly children and vulnerable groups – the Online Safety Act (OSA) represents a major shift in the regulatory landscape for companies operating online platforms in the UK.

Passed in October 2023 and being implemented in phases, it introduces a wide range of new obligations, imposing stricter requirements for transparency, age verification and content moderation to create a safer online environment.

Under the law, companies operating online must now ensure transparency by regularly publishing their safety measures and reporting on their efforts to regulators. This means not only creating new policies when needed, but also providing evidence that these policies effectively mitigate the risks associated with harmful content. The law focuses specifically on platforms accessed by children, requiring additional safeguards and age-appropriate design features.

To comply with these new regulations, digital platforms will be required to implement stricter risk mitigation policies and are mandated to cooperate with Ofcom, the UK’s communications regulator. Ofcom will oversee implementation of the law and impose penalties on those who do not comply. Companies must also maintain detailed compliance records, constantly updating and improving their safety procedures to keep pace with evolving risks.

Effective age verification and safeguards for children

One of the most important elements of the OSA is its focus on protecting children and young people when they access the internet. By 2025, online platforms accessible to minors will be required to implement age checks to determine whether users are children.

Ofcom will publish final guidance in early 2025. In the meantime, it is already clear that basic or outdated age verification systems – such as a simple “yes/no” checkbox or self-declared age – will not be sufficient, and that highly effective age assurance measures will be expected. Innovative technologies that verify users’ ages while protecting their privacy are not just a pipe dream; they are available and ready to deploy.

Platforms are also expected to incorporate more age-appropriate design features that reduce the risk of children being exposed to harmful content. This means filtering out explicit material, protecting personal data, and placing limits on interactions with adults, all while maintaining a user-friendly experience. For example, social media platforms will need to evaluate how they moderate conversations, structure social interactions, and regulate the visibility of certain types of content.

The need for content moderation and transparency

Encouraging effective content moderation is another key element of the Online Safety Act. Companies are required to implement systems to mitigate harmful content, including hate speech, violence, and inappropriate material that can harm users, especially minors. To achieve this, platforms must adopt proactive rather than reactive measures, preventing harmful content from being uploaded or disseminated before it reaches their users. Content moderation efforts should also be transparent, with companies documenting and publishing their policies, any actions taken, and the results.

The law is designed to hold platforms accountable not only for the safety measures they implement, but also for how well the measures work in practice. Companies that fail to demonstrate strong content moderation could face legal repercussions or fines from UK regulator Ofcom.

Technologies to make the Internet safer

Safety technology providers are constantly innovating to keep up with an ever-changing and challenging online environment. In the area of age assurance, technological advances and the introduction of AI-based methods mean that providers can now offer a range of highly accurate, privacy-preserving age assurance techniques that protect user privacy, reduce friction, and ensure compliance with evolving regulations.

While some methods require user interaction, such as uploading a photo of an identity document or taking a short video, other methods use existing user data. This data, such as an email address, can often be collected as part of the account creation process or during checkout on online marketplaces, and can be processed in the background without requiring further user interaction. Email-based age estimation can accurately infer a user’s age without the need for sensitive personal information, allowing companies to maintain compliance while protecting user privacy.
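To make the trade-off concrete, here is a minimal sketch of how a platform might layer these methods: try a low-friction background signal first (such as email-based age estimation) and fall back to an interactive check only when needed. All function names, the lookup table, and the fallback logic are hypothetical illustrations, not any specific vendor’s API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeCheckResult:
    estimated_age: Optional[int]  # None if no method could determine an age
    method: str                   # which method produced the estimate

def email_age_estimate(email: str) -> Optional[int]:
    # Hypothetical placeholder: a real provider would infer age from signals
    # tied to the address, without the user uploading any ID documents.
    known_signals = {"longtime.user@example.com": 34}
    return known_signals.get(email)

def document_check(user_id: str) -> Optional[int]:
    # Hypothetical placeholder for a higher-friction fallback
    # (e.g. ID upload or facial age estimation).
    return None

def assure_age(email: str, user_id: str) -> AgeCheckResult:
    # Prefer the background, low-friction signal.
    age = email_age_estimate(email)
    if age is not None:
        return AgeCheckResult(age, "email_estimation")
    # Fall back to an interactive check only when necessary.
    age = document_check(user_id)
    return AgeCheckResult(age, "document_check" if age is not None else "unresolved")

def is_adult(result: AgeCheckResult, threshold: int = 18) -> bool:
    # Fail closed: an unresolved check is treated as under-age for gating.
    return result.estimated_age is not None and result.estimated_age >= threshold
```

The "fail closed" default matters for compliance: when no method can establish an age, the safer assumption is that the user may be a child.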

In the context of content moderation, artificial intelligence (AI) will play a crucial role in helping platforms maintain a safer environment. This technology can be used in conjunction with human moderators to add an extra layer of support and scalability, and quickly remove malicious material at scale.
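One common way to combine AI with human moderators, as described above, is threshold-based routing: the model assigns a harm score, confidently harmful content is removed automatically, uncertain content is queued for human review, and the rest is published. The thresholds and category names below are illustrative assumptions; real systems tune them per harm category (for example, stricter for content reachable by children).

```python
from enum import Enum

class Action(Enum):
    REMOVE = "remove"        # model is confident the content is harmful
    REVIEW = "human_review"  # model is uncertain -> human moderator decides
    ALLOW = "allow"          # model is confident the content is benign

# Hypothetical thresholds on the model's harm probability (0.0 to 1.0).
REMOVE_THRESHOLD = 0.95
REVIEW_THRESHOLD = 0.60

def route(harm_score: float) -> Action:
    """Route one piece of content based on its harm score."""
    if harm_score >= REMOVE_THRESHOLD:
        return Action.REMOVE
    if harm_score >= REVIEW_THRESHOLD:
        return Action.REVIEW
    return Action.ALLOW
```

This is what gives the hybrid approach its scalability: the model handles the clear-cut volume at machine speed, while human judgment is reserved for the genuinely ambiguous cases.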

Opportunity for UK businesses

For UK businesses, the OSA is not just another regulation to follow but a hugely important opportunity to make the internet safer. By adopting cutting-edge safety measures and prioritizing transparency, companies can build trust with their users and demonstrate a commitment to protecting children when they venture online.

Companies that proactively implement effective age verification and content moderation will also benefit by avoiding regulatory fines and adapting quickly to future regulatory changes. Given the fast-paced nature of the internet, companies that stay ahead of regulatory requirements now will be better positioned to thrive and grow in the years to come.

As a new piece of legislation, the OSA naturally requires businesses to change the way they operate, which can be difficult at first. However, by staying up to date on regulatory changes and by adopting and implementing evolving technologies effectively, companies can position themselves as a trusted voice in their field and ultimately better protect children and young people online.


Lina Ghazal

Head of Regulatory and Public Affairs at VerifyMy, specializing in ethics, regulation and online safety. She previously held roles at Meta (formerly Facebook) and Ofcom.
