Civil society campaigners have welcomed a parliamentary report recommending the draft Online Safety Bill be strengthened, saying the proposed changes would better protect vulnerable people and wider society online.

The Joint Committee on the draft Online Safety Bill has published its report after an inquiry into online harms and the proposed legislation to combat them.

Among its recommendations, the report says that paid-for advertising should be added to the scope of the Bill, as should cyberflashing, content promoting self-harm and the deliberate sending of flashing images to people with photosensitive epilepsy.

It also said the proposed regulator, Ofcom, should be given greater powers to audit and fine companies that breach new mandatory codes of practice, and that firms should appoint named senior managers as “safety controllers” who could be held liable if their company fails to protect users.

In response, civil society groups including Demos, Reset, Hope Not Hate and the Antisemitism Policy Trust said they believed the recommended changes would help reduce the spread of harmful content online.

Poppy Wood, UK director of Reset, said: “Unregulated social media has led to the proliferation of Covid-19 disinformation, the promotion of damaging content to children, and unfettered racist and misogynistic abuse.

“Everyone has a right to free speech – but it’s time to move past a binary debate between liberty and censorship.

“Instead, we need to challenge big tech platforms’ business models that prioritise and amplify content that causes division and violence and undermines trust.”

The group of organisations backing the report also includes FairVote, Glitch, Carnegie UK Trust, Compassion in Politics, SumOfUs and Tell Mama.

William Perrin, trustee of Carnegie UK, said the recommendations were “sensible and proportionate” and that it was “vital” the Government accepted the proposed changes, because they would help tackle content causing “real-world harm, distress, discrimination and suffering”.

Tech giants have also expressed support, with Meta, the parent company of Facebook, Messenger, Instagram and WhatsApp, welcoming the latest work on the Bill.

“We have long called for new rules to set high standards across the internet and are pleased the Online Safety Bill is moving forward,” a Meta spokesperson said.

“While we already have strict policies against harmful content on our platforms, regulations are needed so that private companies aren’t making so many important decisions alone.

“We look forward to continuing the discussion with the government, Parliament and the rest of the industry.”

However, Dr Bill Mitchell, director of policy at BCS, The Chartered Institute for IT, said that while BCS supports the recommendations, he had concerns over the clarity of the Bill.

“If I were the mum of a young daughter whose social life is mainly online, it wouldn’t be clear to me whether this Bill really does do enough to keep her safe,” he said.

“What would bother me is it seems to rely entirely on the platforms’ own risk assessments and their own reporting on how well their systems work at reducing harm.

“I’d want to have more reassurance around how the Bill will guarantee auditing and that accountability of social media platforms is open, transparent and rigorous.”