A former Meta employee has said that the company is wilfully failing to protect children from violent and harmful content.
Meta, founded by tech billionaire Mark Zuckerberg, owns and operates Facebook, Instagram, Threads and WhatsApp, among other products and services.
On Newstalk Breakfast, Arturo Bejar, Meta’s former director of engineering for Protect and Care, said that dangerous and inappropriate content could be removed from Meta’s platforms in just six months.
“If Mark Zuckerberg woke up tomorrow and said, ‘I want to create the safest environment for teenagers that I can with my company,’ it would probably take around six months to create a feed that’s free of [things] like pornography, violence, self-harm content,” he said.
“[Also] to change messages for kids, [so if] they get unwanted advances, all they need to do is flag it - no reporting - just say, ‘Hey, this is not for me’ - and these changes wouldn’t really affect their bottom line.”
“I think it’s a matter of values. They keep demonstrating that we cannot trust them with our kids.”
Mr Bejar said that he was directly involved in online moderation when he was with the company, but that many policies have since changed.
“For six years I was the person responsible for all of the different aspects of engineering and product and research to help people deal with harm online,” he said.
“So, there would be things like stopping spam, helping with bullying and harassment, helping people who might be thinking about...