A national charity has published a strategy to regulate social media and hold tech companies to account when they fail to protect children on their platforms.
The NSPCC's 'blueprint' was backed by nine out of ten British parents who took part in a survey and supported regulation of social networks to make tech firms legally responsible for protecting children.
In the same survey, six out of ten adults said they do not think social networks protect children from sexual grooming or from inappropriate content such as self-harm, violence or suicide.
The proposals come ahead of a White Paper on online harms, which will set out the Government's future policies on how it could protect children online and regulate social media sites.
The charity is calling on the Government to create an independent, statutory regulator with legal powers to investigate tech firms and demand information about their child safety measures.
It also wants social networks to meet a set of minimum child safeguarding standards, with tough sanctions for failures to protect their young users – including fines for tech firms of up to €20m and a new criminal offence for platforms that commit gross breaches of their duty of care.
The charity says its plans would make Britain a 'leader in online child protection'.
Among the ways social media sites could promote child safety are giving children's accounts the highest privacy settings by default, turning off friend suggestions and making children's accounts invisible to public searches.