Parents call for stricter social media regulation

Tuesday 12th February 2019 12:06 EST
 

The National Society for the Prevention of Cruelty to Children (NSPCC) has launched a plan to tame the 'Wild West Web', with more than eight out of ten adults backing regulation of social networks. The plan includes a new criminal offence for gross breaches by tech companies, fines running into millions, naming-and-shaming tactics, and the disqualification of directors.

Debanjali Bhattacharjee is a mother of all trades and jack of none. She is a part-time worker at Sainsbury's and a full-time mother of two sons, aged 11 and 16. Though away from home for most of the day, Debanjali ensures that dinner is family time. But that does not ward off her concerns arising from recent media reports about the improper use of social media.

“It is easy to sign up for various social media apps these days even if you're under 18,” she says.

In an effort to maintain a 'friendly relation' with their children, many parents are themselves joining social media, 'following' their children on Instagram and becoming 'friends' on Twitter. But even so, it is difficult for them to monitor what their children are actually viewing.

“Being away from home and not having enough time with the children makes it difficult to know what exactly they are consuming on the internet. We try to have an open atmosphere at home where we can share everything with each other, but children are not always comfortable talking about everything, and that is where these social media websites come into play,” she said.

According to a new NSPCC survey, 86% of adults back regulation of social networks to make tech firms legally responsible for protecting children.

More than half of adults in London (56%) do not think social networks protect children from sexual grooming, and the same proportion do not think networks protect children from inappropriate content such as self-harm, violence or suicide. Across Britain, six out of ten parents do not feel social networks are doing enough to keep children safe from sexual grooming.

Ruth Moss, whose daughter Sophie took her own life at the age of 13 after looking at self-harm and suicide content on social media, is backing the NSPCC’s campaign for statutory regulation.

Ruth said: “Sophie’s death devastated me. No mother, or family, should have to go through that. It was so unnecessary; she had so much to live for. She was only 13.

“I found out that she had been looking at completely inappropriate things online. Some of the images were so graphic that even as an adult, I was shocked.

“She was also communicating with people in their 30s and pretending to be older than she was, under a made-up persona. Whilst the internet was heavily controlled at home and at school, Sophie had free Wi-Fi when she was out, making it very hard to 'police' her internet use 24 hours a day.”

Peter Wanless, NSPCC Chief Executive, said: “The support for statutory regulation of social networks is now overwhelming. The Government’s Online Harms White Paper must impose a legal duty of care on social networks. Our proposal to tame the Wild West Web would make the UK a world leader in protecting children online. We urge the Government to be bold and introduce these measures without delay.”

A huge majority of adults in the NSPCC’s survey also backed a call for social networks to be legally required to make children’s accounts safe, including the highest privacy settings by default, friend suggestions turned off, not being publicly searchable, and geolocation settings turned off.

Tech firms would have a duty to risk-assess their platforms and promptly notify the regulator if children had come to harm or been put at risk on their sites.

In the case of gross breaches, tech firms could face criminal charges, and directors overseeing the duty of care could face disqualification.
