The Joint Committee on the Draft Online Safety Bill has published its Report of Session 2021-22, outlining a long list of recommendations for how the draft Bill could be improved.
The 194-page report seeks to address some of the most challenging questions posed by oral and written evidence. It follows several months of pre-legislative scrutiny on the draft Online Safety Bill which techUK summarises here.
techUK and our members support the objectives of the draft Online Safety Bill to improve online safety for UK citizens by providing a higher level of protection for children, following a proportionate approach and safeguarding freedom of expression and privacy.
We welcome the Committee's proposals to simplify the next iteration of the draft Bill, helping companies and society gain a clearer understanding of what this legislation is trying to achieve and how it will deliver on its objectives.
The next stages of the legislative process will be critical and techUK would like to see Government move quickly to develop a workable Bill to be presented before Parliament in 2022.
What does the report say?
Objectives
The report begins by giving an overview of the evidence which the Committee has heard around the 'content and activity that creates risks of harm experienced by different groups of people online', including children, adults, LGBTQ+ people, women and girls, religious groups, disabled people, and society more broadly.
The Committee outlines some of the factors which may contribute to increased risk of harm before recommending that the objectives of the Bill are clarified and placed at the beginning of the Bill text to ensure greater understanding about what the Bill is trying to achieve.
Services in scope
The Committee recommends that parts of the Bill, such as the child safety duties, apply to all 'internet society services', taking the same wide-net approach as the Age-Appropriate Design Code, which applies to a range of services beyond user-to-user and search.
Under this new scope, the Committee would like to see the categorisation process 'overhauled', meaning there would no longer be a differentiation in obligations based on whether a service is Category 1, Category 2A or Category 2B.
Instead, the Committee would like to see more weight given to individual companies' 'risk profiles', asking Ofcom to consider factors such as 'reach, user base, safety performance and business models' when determining individual services' obligations.
As part of the risk profiles, the Committee recommends that end-to-end encryption should be identified as 'a specific factor', though it would also like to see Government provide more clarity on this issue.
'Legal but harmful' content
The Committee recommends a revised approach towards harmful content which considers a 'single test to determine regulated content and activity'.
The aim is to 'better reflect the range of online risks people face and cover new forms of interaction that may emerge as technology advances', while considering the need to address societal harms such as misinformation and disinformation.
The Committee endorses the Law Commission's test for determining harm-based offences, which assesses whether activity is 'likely to cause harm to a likely audience', and suggests that, if adopted, this could 'reduce regulatory burden and improve consistency'.
Illegal content
The Committee highlights consensus among stakeholders that illegal content should be prioritised before recommending the adoption of the Law Commission's proposals for reform of the Communications and Hate Crime Offences.
If implemented, this would add harm-based offences, such as cyber flashing, encouraging self-harm and false communications which intentionally cause psychological or physical harm, to the scope of the illegal content duties.
Ofcom would be required to issue a legally binding Code of Practice to assist providers in 'identifying, reporting on, and acting on illegal content', in addition to the Codes proposed in the draft Bill on terrorism and child sexual exploitation.
Fraud and paid for advertising
The Committee recommends that the scope of the Bill be extended to include fraud, but no other types of economic harm.
Under the Committee's proposals, Ofcom would be required to take responsibility for acting against service providers who consistently allow 'paid for advertisements that create a risk of harm to be placed on their platforms', with a specific focus on fraud.
The Committee outlines how Ofcom would need to respect the boundaries of other bodies who have responsibility for advertising regulation, such as the Advertising Standards Authority. However, there is limited reference to the Financial Conduct Authority's ongoing regulatory remit over financial fraud.
Greater protection for children
The Committee agrees with Government that the Bill should ensure a higher level of protection for children and that there should be a broad definition of content that is harmful to children.
However, the Committee questions the existing definition in Clause 10 of the draft Bill, which refers to 'a child of ordinary sensibilities', and recommends that the definition be tightened and that known risks of harm to children, such as pornography, gambling, and violence, be set out on the face of the Bill.
Regarding the child safety duties, the Committee outlines support for the 'likely to be accessed by a child' test and recommends that this test should be the same as in the Age-Appropriate Design Code, while asking Ofcom and the ICO to issue a joint statement on how this would work in practice.
The Committee would also like to see Government require Ofcom to develop an 'age assurance technology and governance code' before the Bill's implementation, which would provide minimum standards for age-appropriate protection.
Safety by design
The Committee recommends that the Bill tackles 'design risks' by placing a responsibility on service providers to have in place 'systems and processes to identify reasonably foreseeable risks of harm and take proportionate steps to mitigate risks of harm'.
This would involve the Bill setting out 'a non-exhaustive list of design features and risks associated with them', which could be amended by Parliament to 'respond to the development of new technologies'.
If implemented, Ofcom would be required to produce a mandatory 'Safety by Design Code of Practice' setting out steps for services to address risks 'relating to, but not limited to, algorithms, auto-playing content, data collection, frictionless cross-platform activity and default settings on geolocation'.
For platforms that allow anonymity and pseudonymity, the Committee would like the Safety by Design Code to include risk mitigation measures such as suspending users who violate companies' terms or providing users with 'options on how they interact' with verified and non-verified accounts.
Freedom of speech requirements, journalism, and content of democratic importance
The Committee notes how balancing people's right to freedom of expression with online safety was one of the most controversial subjects in their inquiry.
The report outlines the Committee's view that some of the existing definitions in the Bill, such as those for content of democratic importance and journalistic content, were both too broad, creating a loophole to be exploited by bad actors, and too narrow, 'excluding large parts of civil society'.
The Committee would like to see the 'journalistic content' and 'content of democratic importance' duties replaced with a 'single statutory requirement' which seeks to protect content where there are 'reasonable grounds to believe it will be in the public interest', such as 'contributions to societal debate and whistleblowing'.
Role of the regulator
Many of the Committee's recommendations around the role of the regulator relate to the need for Ofcom to start developing key components of the regime, such as Codes, risk profiles and risk assessments, immediately.
In addition, the Committee outlines that, to ensure a service provider does not 'underestimate the level of risk on their service', the Bill should be amended to consider reasonably foreseeable risks and Ofcom should be required to set 'binding minimum standards for the accuracy of risk assessments'.
Other recommendations in this section include providing a framework in the Bill for how regulators will work together, ensuring that Ofcom has enough power to 'co-designate', and removing the powers of the Secretary of State to a) 'modify codes to reflect Government policy' and b) 'give guidance to Ofcom'.
Transparency reporting and redress
The Committee recommends that transparency reporting should be a regular requirement for in-scope services as part of efforts to create a 'standardised reporting process' and share data with external researchers.
There are several areas which the Committee would like to be included in transparency reports, such as 'safety by design features', 'most viewed/engaged with content', 'proportion of users who are children' and 'time taken to deal with reports'. The recommendations also suggest that Ofcom should be able to request any other relevant information.
In regard to user redress, the Committee recommends the introduction of an Online Safety Ombudsman to initially consider complaints about actions by 'higher risk service providers', though this could also extend to 'lower risk providers'.
Joint Committee on Digital Regulation
Finally, the Committee would like to ensure that digital regulation is subject to dedicated Parliamentary oversight and recommends a Joint Committee to oversee digital regulation. The role of this Committee would be to scrutinise the Secretary of State, review Codes of Practice, consider the impact of new technologies and help generate solutions to ongoing issues in digital regulation.
What does techUK think?
Commenting on the report, Antony Walker, Deputy CEO of techUK, said:
The joint committee report is well thought through and does a good job of disentangling and restructuring a complex Draft Bill. We welcome the focus on clarity, proportionality and a risk-based approach for the 24,000 companies in scope.
There is much in this report that could significantly improve the Government's Draft Bill. Importantly it focuses on the need for the Bill to be crystal clear about what it is asking businesses to do. Lack of clarity has been one of the sector's biggest concerns.
The ultimate test of this legislation remains whether it enables in-scope companies and the regulator to make swift and effective decisions without unintended consequences for fundamental freedoms online.
The committee has recognised the widespread concerns about using the term 'legal but harmful' and has proposed an alternative approach for a 'single test for regulated activity' which could be an improvement and deserves serious consideration.
The committee recommendations to streamline the child safety duties with the Age-Appropriate Design Code (AADC) would ensure greater consistency with existing legislation on child safety. We also welcome the recommendation to ensure Ofcom's independence and limit the powers of the Secretary of State.
Overall, the report takes a nuanced and proportionate risk-based approach. However, there are areas of concern:
- There is a risk that taken together, these proposals could amount to a requirement for general monitoring of user behaviour with a negative impact on individual fundamental rights.
- There is also a risk that the extension of scope to include fraud could overburden the communications regulator. We believe the issue of financial fraud should remain in the remit of the Financial Conduct Authority.
- There is also concern that a potentially overly prescriptive approach to age assurance could lead to the age gating of services to the detriment of young people.
The next phase of this process must include thorough engagement by Government with in-scope services to get into the detail of how the Bill will work in practice for different companies.
We encourage Government to come forward with a revised and improved Bill quickly so that we can get this new regulation into law.