What has happened to the Online Safety Bill?
Judith Thompson 29-11-2022
The Online Safety Bill has been amended to remove plans to make tech companies take down harmful but legal content. The arguments that led to the amendment centred on freedom of speech, but dismay has been voiced by those who work with victims in this area, including the Samaritans.
One counter argument, put forward by Culture Secretary Michelle Donelan, is that adults will have more control over the content they can see online.
However, since the death of teenager Molly Russell, who viewed content relating to suicide and self-harm before taking her own life, concerns have grown about the continued availability of such material.
The Bill was drafted to include a provision that the biggest and most influential social media platforms would have to take action to tackle such harmful material, even if it was strictly legal. This has been replaced with a requirement for these companies to create more filters so that users can remove content they don't want to see.
If they don't comply with the Bill, tech companies can be fined up to 10% of their worldwide turnover. However, this seems entirely to defeat the object of the original Bill, which was to prevent this harmful content being available in the first place.
The law still says that children should be prevented from being able to access harmful and inappropriate material. However, the tragic deaths of Molly Russell and others demonstrate just how weak these provisions are, and how easy they are to get around for anyone with even the most rudimentary IT skills.
The Samaritans pointed out that even when someone becomes an adult, they can still be seriously harmed by some of the material which is available online. People who have pre-existing issues with their mental health are likely to be at a higher risk of harm.
Victims of harmful online content and publications will have to continue to rely on the existing law, including provisions surrounding defamation, breach of privacy, misuse of personal data and other causes of action.
There will be stricter provisions requiring tech companies to confirm how they check the age of users, as this is key to ensuring that children are protected. How this can be successfully achieved remains unclear.
If you are concerned about the effect of online content upon your children, you should contact the police and the social media platform. If your attempts to protect your children online fall on deaf ears, we may be able to help. Contact us to speak to a specialist today.