The digital wild west met its sheriff this week. The joint parliamentary committee scrutinizing the draft Online Safety Bill announced on Tuesday that it had approved its report on the forthcoming legislation. Unveiling the document, the committee's Conservative chairman, Damian Collins MP, said: "The committee was unanimous in its conclusion that we need to call time on the Wild West online."
The 192-page report recommends a comprehensive revision of the draft law, which is aimed at companies that host user-generated content – ie social media networks such as Facebook and Twitter and video-sharing platforms such as YouTube and TikTok – as well as search engines such as Google. The bill places a duty of care on technology companies to protect users from harmful content, with the threat of substantial fines from Ofcom, the communications regulator that would enforce the law.
The bill matters because it is the first attempt to properly regulate social media companies, video-sharing sites and search engines under one legislative umbrella. And the report matters too. It is a thorough piece of work by a cross-party group of knowledgeable parliamentarians, with unanimous support for its 170 conclusions. The government has already promised to give it serious consideration, and the committee is likely to remain in place as the bill's guardian – a supervisory structure modeled on the Joint Committee on Human Rights. Here are some of the changes to the bill that the report recommends.
Make the goals of the bill clear
The committee says the bill should set out its core objectives "clear from the start" – which is also handy for TechScape readers trying to keep track of a complex piece of legislation. According to the report, Ofcom should protect UK citizens online by ensuring that technology companies do the following: comply with UK law and do not endanger public health or national security; provide a higher level of protection for children than for adults; identify and respond to the risk of "reasonably foreseeable harm" arising from the operation and design of their platforms (algorithms and the like); recognize and respond to the disproportionate level of harm experienced by people on the basis of protected characteristics (disability, age, sexual orientation, religion, etc); make sure their systems are safe by design – that is, they don't lead users down rabbit holes of dangerous content; protect freedom of expression and privacy; and operate with transparency and accountability on online safety.
Legal but harmful content
One of the most controversial parts of the bill was clause 11, which set out the duty of care towards adults: protecting them from legal but harmful content. This caused concern because, under the bill, the culture secretary would play a key role in defining such content – meaning Nadine Dorries would, at least technically, have a censor's role in deciding acceptable online speech – and because the clause contained an amorphous threat to freedom of expression.
The report proposes scrapping clause 11 altogether and replacing it with categories of offence that mirror illegality in the offline world. This would mean banning online content that amounts to abuse, harassment, or incitement to violence or hatred based on the protected characteristics in the Equality Act 2010. In doing so, the committee hopes to force technology platforms to tackle hate speech.
The new approach would also mean banning other forms of content that are illegal in the offline world, such as intimidating candidates in elections and facilitating human trafficking. The report's rationale is that, since these harms are already illegal offline, "society has recognized that they are legitimate grounds for interfering with freedom of expression" online.
Journalistic exceptions – and citizen journalists
While we are on the subject of freedom of expression, the report also recommends strengthening the exemption that protects content from news organizations from being taken down by technology platforms. Under the committee's new recommendation, if it is not illegal, it stays up: "We recommend reinforcing the exemption for news publishers to include a requirement that news publisher content should not be moderated, restricted or removed unless it is content that clearly constitutes a criminal offence."
There is also an attempt to cover "citizen journalists" such as bloggers by addressing the draft law's protection for content of "democratic importance". Instead, the report recommends that the bill protect content that is in the "public interest". Citizen journalists whose content has been wrongly or unjustifiably removed would be able to get back to work quickly via a special, expedited complaints procedure.
Protecting children
In the bill, one of the three duties of care is to protect children from harmful content (the other two cover protecting users from illegal harm and protecting adults from legal but harmful content). The committee recommends a number of new protections for children. These include: requiring all pornography sites to prevent children from accessing their content, which could mean introducing age assurance measures; taking the definition of internet services that children are likely to access from the Information Commissioner's age-appropriate design code (AADC); introducing minimum standards for age assurance measures (from entering your date of birth via a pop-up form to more stringent age verification); and having Ofcom develop a code of practice for protecting children online, which should refer to the UN Convention on the Rights of the Child, the AADC and children's right to information under the European Convention on Human Rights.
Algorithms, misinformation, anonymity and safety by design
The report recommends tackling malicious algorithms, anonymous abuse and the spread of misinformation with a "safety by design" code of practice overseen by Ofcom. The code would require platforms to examine how their operation and design could harm users. For example, tech companies would need to look at the algorithms that deliver content to users and stop them leading people down dangerous "rabbit holes". The mass spread of misinformation would be countered by measures that add friction to the sharing of content at scale, which the code would also cover. Anonymous trolling – and the spread of misinformation through anonymous accounts – would likewise be added to the code, as a specific category for new accounts.
The big players would also be required to commission annual independent audits of the impact of their algorithms, their risk assessments (which tell Ofcom what harm their services could cause) and their transparency reports (which cover things like the prevalence of illegal and harmful content and how many users have encountered it). Ofcom would be empowered to inspect these audits and carry out its own.
New criminal offences
The report recommends the creation of new criminal offences, including: cyberflashing; encouraging someone to harm themselves; deliberately sending flashing images to someone with epilepsy (with the intention of causing a seizure); and knowingly sending false, persistent or threatening communications. Tech executives are also affected by an expansion of criminal liability. The report calls for technology companies to appoint a board-level executive as the company's "safety controller", who would be liable for a new offence: failing to address "repeated and systemic failings that create a significant risk of serious harm to users". The committee sees the latter offence as a backstop, but technology companies clearly don't like it one bit.
Fraudulent advertising
Martin Lewis, the consumer champion, made a passionate appearance at a committee hearing in October, telling MPs and peers that scammers who used his image in fraudulent online advertisements had "destroyed" lives. The committee listened, and recommends that fraudulent advertising be brought within the scope of the bill. Under its proposal, Ofcom would be responsible for taking action against technology companies that consistently allow fraudulent or harmful advertising on their platforms.
No currency for crypto
The Bank of England issued more cryptocurrency warnings on Tuesday. The bank's deputy governor, Sir Jon Cunliffe, said the price of digital currencies such as Bitcoin could "theoretically or practically" fall to zero. A Bank staff blogpost, referring to Bitcoin, put it this way: "It's just a bunch of code that only exists in cyberspace. It's not backed by the state." Threadneedle Street is right, but that only serves to underline the anti-establishment status of cryptocurrencies.
If you want to read the full version of the newsletter please subscribe to get TechScape in your inbox every Wednesday.