Meta on Friday announced a set of new efforts to keep kids safe on its platforms, notably after a whistleblower recently testified to Congress about the company’s failure to protect children.
Meta laid out several steps it is taking to make children safer on its social media apps like Facebook and Instagram, including changing the content it suggests to younger users, targeting “suspicious adults,” and working with online child safety experts. This all comes after a former Meta employee told Congress less than a month ago that the company has not done enough to keep minors safe. And earlier this week, Meta sued the FTC, the latest move in a legal battle in which the FTC claims Meta has failed to make good on privacy promises made in a 2020 settlement.
“We take recent allegations about the effectiveness of our work very seriously, and we created a task force to review existing policies; examine technology and enforcement systems we have in place; and make changes that strengthen our protections for young people, ban predators, and remove the networks they use to connect with one another,” Meta’s statement reads, alluding to the recent testimony made against it.
Reels and Instagram Explore, both of which feed and suggest content to users, will expand their protections. The social media giant says these tools are already designed to avoid suggesting upsetting or rule-breaking content, but a central list across Facebook and Instagram will grow to include...