Social media executives from Facebook, Google and Instagram could be held personally liable for harmful content distributed on their platforms, according to a person familiar with U.K. government plans.
U.K. ministers are frustrated at the slow speed of self-regulation by internet giants amid the dissemination of images relating to terrorism, self-harm, suicide and child abuse.
In legislation due to be published Monday, executives could be found liable for such content under a new legally binding duty of care imposed on the companies.
A regulator, likely to be funded by the companies themselves, will be able to impose fines and hold bosses accountable.
Search engines, online messaging services and file-hosting sites will also come under the remit of the regulator, according to the person familiar with the plans.
Companies will also be required to publish annual reports on what they have done to remove and block harmful content.
The need for legislation was underscored by last month's terrorist attack in New Zealand, in which 50 Muslims were killed and footage of the attack was live-streamed online.
In the U.K., the case of 14-year-old Molly Russell has focused minds. According to her father, the teenager killed herself in 2017 after viewing self-harm and suicide content online.