Facebook and Google will soon have to contend with a new law in the UK. Companies will be fined if they fail to remove certain harmful content from their platforms quickly.
The law targets content that encourages terrorism, child abuse, and exploitation. This is yet another example of a government cracking down on technology companies, something we have seen repeatedly in recent months.
The UK government argues that Facebook is not simply a content distributor but a 'media empire'. The law would hold company directors liable if harmful content is not removed within a specified time.
The UK is putting pressure on Google and Facebook
These measures are also intended to combat misinformation, so-called fake news, and possible interference in elections. The passage of Articles 13 and 11 will likely reinforce them.
Reports indicate that this UK pressure on online content was prompted by the case of Molly Russell, a 14-year-old girl who took her own life in 2017 after allegedly viewing suicide-related content on the internet.
The terrorist attack in New Zealand and its live streaming also motivated the UK to create the new law. Apparently, the UK government does not believe that companies are doing enough to protect their users.
The new proposal aims to protect UK citizens from potentially harmful content and to ensure that companies do not escape their responsibilities.
Facebook will not be the only company affected: search engines such as Google and online messaging services will also be regulated under the new rules.
In conclusion, this law can be interpreted as a filtering tool for harmful content. When it comes to children, however, one could argue that this responsibility lies with parents rather than the government.