About the Act
In March, the Online Safety Act's duties come into force for online services with links to the UK.
The Act attempts to curb the spread of harmful content online and puts the onus on service providers to ensure that online spaces are kept safe from such content. The Act applies to user-to-user services (those that host user-generated content) and to online search services.
Services of all shapes and sizes that have “links” to the UK are in scope for the Act, which is being regulated by Ofcom. Ofcom have also published a tool you can use to check if your service is in scope. If a service is in scope, penalties for non-compliance are very high and potentially personal, even for people who are officers of limited liability businesses.
Opinion
The motivations for the Act seem completely proportionate and sensible to me; people shouldn’t be using online services (or any mechanism) to spread abusive or harmful material. Equally, people (and particularly children) should feel free and safe to use online services without being harassed, abused, or exposed to harmful content. We’ve likely all experienced hateful content at one time or another (whether directed at us or at someone else), and even that alone can be enough to cause worry, stress, and trauma. The Online Safety Act is designed to limit this content, along with significantly worse and more damaging material.
Some of the opaqueness surrounding the specifics of the new legislation, and how it will be enforced, leaves a number of loose ends that make the risk to individuals managing online services very high, even if substantial steps are taken to meet the requirements of the Act. I very much hope that Ofcom will position themselves as a helpful ally collaborating with online services, rather than as a bully, so that the small-services ecosystem (whose members often lack the resources for lawyers and yet more bureaucracy) can remain safe and compliant.
Having been through most of the guidance published by Ofcom, I found it far from accessible: there are no particularly concise, clear, or complete steps or checklists explaining how services can become compliant. The guidance document on creating a risk assessment alone is a staggering 84 pages long and not particularly easy to consume. Concise, quick, easy-to-access reference guides would allow smaller services to make a real difference in a shorter period of time.
Some recent blog posts on the topic (example 1 from Terence Eden and example 2 from Russ Garrett) probably do a better job of comprehensively and easily conveying the requirements of the Act than Ofcom’s bizarre PDF maze of “materials”, “chapters” and “volumes”.
Don’t get me wrong; I feel (and hope) that the implementation of the Act will result in a net positive outcome for all online services and the safety of their users. Ofcom just need to be wary that the potential threat of their severe penalties may force small and not-for-profit businesses to close purely out of fear, and thus remove competition from the larger companies that can afford the legal expertise, time, and bureaucracy – as well as the fines.
Indeed, some services are already planning to close down ahead of the deadline. For example, the long-established London Fixed Gear and Single-Speed cycling community has decided to end its services ahead of March 17th. Not because they intend to be non-compliant – but the combination of the vagueness of the “rules”, the substantial additional documentation, training, and technical changes required, and the huge personal risk to the administrators means that continuing would be untenable for them. Especially if there is an added risk of disgruntled users inappropriately weaponising the Act, as they fear.
Thinking practically
Anyway, all that aside, the Act is seemingly here to stay, so we need to assess whether our services are in scope and determine what actions to take.
Treadl is an open source project that I maintain. Anyone can use the source code and run their own version of the platform, for which there are instructions available. As the lead project maintainer I also run an “official” version of the software that anyone can sign up for and use free of charge at treadl.com, and I do this as a way of giving back to the open source community. There is no business or financial reward for any of this; the servers alone cost me more money to run than the small number of kind donations I receive, but I don’t do it for the money.
The Treadl project is aimed at textile weavers and provides a place for these artists to store, manage, and share their work. Although the target audience is clear (and pretty niche), because the platform hosts user-generated content (including text and images) and I am based in the UK, the service I run at this domain falls firmly within the scope of the Act.
Treadl only has a few thousand active users and so would be deemed a “smaller service” by Ofcom. I also consider it to be “low risk” due to its niche and focused nature (though the full risk assessment is still in progress). However, there are certainly process and technical changes that can be made to reduce this “risk” even further.
I am keen to keep Treadl up and running as I believe the aim of the legislation is not to shut services down, but to simply ensure (through law) that they are run in a safe manner to minimise harm to users. As such, I intend to (try to) follow Ofcom’s guidance to take the steps necessary to be compliant with the Act, relevant to the risk level.
Specifically, I plan to:
- Conduct a full risk assessment, using the guidance from Ofcom and considering Treadl’s primary risk factors and the 17 priority kinds of illegal content;
- Assign a responsible person (i.e. me) for handling processes, comms, reports and complaints related to online safety;
- Based on the risk assessment, make relevant technical changes to the platform (such as adding pre-publication moderation capabilities, a new reports/complaints form referenced throughout the platform, and removing features that I cannot safely moderate);
- Based on the risk assessment, introduce appropriate process changes to ensure online safety is maintained on an ongoing basis (e.g. periodic risk assessment reviews, periodic reviews of updated legislation, procedures for handling complaints, etc.).
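To make the technical side of the plan a little more concrete, here is a minimal sketch of what a pre-publication moderation queue with a report/complaint flow might look like. This is purely illustrative – the class and method names are my own invention for this post, not Treadl’s actual code or data model.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List

class Status(Enum):
    PENDING = "pending"    # held for review before (re-)publication
    APPROVED = "approved"  # visible to other users
    REJECTED = "rejected"  # blocked; never shown publicly

@dataclass
class Post:
    author: str
    body: str
    status: Status = Status.PENDING
    reports: List[str] = field(default_factory=list)

class ModerationQueue:
    """Holds user submissions until a moderator has reviewed them."""

    def __init__(self) -> None:
        self.posts: Dict[int, Post] = {}
        self._next_id = 1

    def submit(self, author: str, body: str) -> int:
        """New content starts as PENDING rather than going live immediately."""
        post_id = self._next_id
        self._next_id += 1
        self.posts[post_id] = Post(author=author, body=body)
        return post_id

    def review(self, post_id: int, approve: bool) -> None:
        """A moderator decision either publishes or blocks the content."""
        self.posts[post_id].status = Status.APPROVED if approve else Status.REJECTED

    def report(self, post_id: int, reason: str) -> None:
        """A user complaint records the reason and pulls the content back for review."""
        post = self.posts[post_id]
        post.reports.append(reason)
        post.status = Status.PENDING

    def visible_posts(self) -> List[Post]:
        """Only approved content is ever served to other users."""
        return [p for p in self.posts.values() if p.status is Status.APPROVED]
```

The key design choice here is that nothing becomes publicly visible by default: content must be explicitly approved, and a complaint immediately removes it from public view pending re-review. A real implementation would also need audit logging of decisions and reports, which is useful evidence of the ongoing processes the risk assessment calls for.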
Much (if not all) of the above I will try to do in the open within the Treadl repository and Wiki for public transparency and version control. OpenBenches have openly published an excellent summary of their actions as a result of their risk assessment, and I hope to do something very similar for Treadl.
I hope that this can be an opportunity for the ecosystem of small services to come together to share learnings and best practices for maintaining compliance with this new legislation. If you run a small service and are interested in collaborating through learnings and process implementation, please do feel free to reach out.