
Playing catch-up with technology is not enough when it comes to safeguarding issues

In 2020 the Age Appropriate Design Code, which sets out a number of design standards connected to the duty of care, became law, and after a twelve-month transition period online services must now comply with it. More legislation is in the works, with an Online Safety Bill expected to come into force at the end of 2023 or the beginning of 2024.

“The Online Safety Bill introduces the concept of a regulator for the internet. Ofcom will have the power to ensure that the major internet companies demonstrate a duty of care to their users. The bill aims to ensure that children are not exposed to online harm and that companies can and will be fined for failing to fulfill their duty of care,” explains Carolyn Bunting of Internet Matters.

While the Online Safety Bill is seen as a largely workable model, the NSPCC believes it needs to be strengthened to:

  • Stop the spread of grooming and abuse between apps
  • Stop abuse at the earliest opportunity
  • Fix large gaps in child safety requirements: high-risk sites like Telegram and OnlyFans could be excluded because only companies with a “significant” number of child users would be required to protect them, which could result in high-risk activity being displaced to smaller services
  • Hold senior managers accountable, with companies facing criminal sanctions if the duty of care is not met
  • Commit to a statutory user advocate for children.

Sonia Livingstone of the London School of Economics (LSE) points out another problem with both the Online Safety Bill and the Age Appropriate Design Code: neither covers the technology used in schools for learning or for safeguarding, “because the contract is not provider-to-user but provider-to-school, the legal responsibility appears to rest with the school rather than the digital provider”.

“The bill aims to ensure that children are not exposed to online harm and that companies can and will be fined for failing to fulfill their duty of care.” – Carolyn Bunting, Internet Matters

She adds, “Given the pace of technological change, it is vital for schools and businesses alike to adopt forward-looking strategies such as privacy impact assessments, safety by design, and child rights impact assessments.”

The Child Rights Impact Assessment (CRIA) was introduced as a way to assess the impact of policies and programs on children’s rights. Research is now being carried out by the Digital Futures Commission into the feasibility of using CRIA as a means of embedding children’s well-being in the digital world.

Developers of new systems are often more focused on product development than on safeguarding. As technology continues to evolve, there is a risk that lawmakers and educators will be left playing catch-up rather than taking the initiative.
