A new report reveals significant public concern over Meta's rollback of safety measures to promote free expression on social media.
Meta, the parent company of Facebook and Instagram, has faced substantial backlash following its January announcement of changes to its content moderation policies aimed at enhancing 'free expression.' A report from the Molly Rose Foundation, a suicide prevention charity, indicates that a significant majority of the public opposes these policy shifts.
The foundation conducted a poll involving over 2,000 adults, revealing that 86% believe social media platforms should be obligated to actively seek out and address harmful content.
As part of the revamped policies, Meta CEO Mark Zuckerberg stated that the company would reduce its proactive scanning for harmful content in certain cases, asserting that this approach was intended to promote free speech and minimize what he described as 'censorship.' Rather than removing some harmful content automatically, the company will rely on users to report it.
On this specific change, 71% of respondents said they opposed the reduction in automatic content removal.
The Molly Rose Foundation was established in memory of Molly Russell, who tragically died by suicide in 2017 after encountering distressing online content.
The charity has cautioned that the relaxation of moderation policies could increase the risk for vulnerable young individuals accessing harmful material.
It has called upon the government to fortify the Online Safety Act to prevent similar policy adjustments by social media firms.
Andy Burrows, the foundation's chief executive, emphasized the dangers posed by these changes, stating that they could significantly elevate risks of suicide and self-harm among youths.
He urged immediate governmental action, reasserting that decisions regarding the protection of children online should be made by elected officials rather than shaped by large tech companies.
When introducing the policy changes, Zuckerberg claimed that these measures were a response to an increased global crackdown on American technology companies.
He also mentioned plans to collaborate with governmental officials to counteract these pressures, asserting that the opportunity to restore free expression was both timely and crucial.
In conjunction with these changes, Meta has begun testing the loosening of content moderation rules on topics that are frequently discussed in political contexts, such as immigration and gender issues.
Furthermore, Zuckerberg stated the company would eliminate its use of fact-checkers, whom he criticized as politically biased, opting instead for user-generated community feedback.
In response to the growing concerns over online safety, a representative from the UK Department for Science, Innovation and Technology affirmed that all social media companies operating in the UK are legally required to remove illegal content, including materials that promote self-harm or suicide.
With the upcoming implementation of the Online Safety Act, social media platforms will also be required to enhance protections for children against exposure to harmful content.
The department stated that it would closely monitor the impact of these laws and take action if necessary to ensure child safety online.