Meta has decided to pull back on its content moderation policies, claiming that its approach has been over-enforced and error prone and that its third-party moderators have shown bias. The company is removing many restrictions on topics such as immigration, gender identity, and gender.
The technology conglomerate has acknowledged that its moderation practices were shaped by social and political pressures, resulting in enforcement "going too far" and ultimately hindering free expression.
Of note, the announced changes will affect only organic content on the platform. For ads, Meta will continue to uphold brand safety through its robust suite of advertiser tools, recognizing the importance of giving advertisers more transparency and control over brand suitability.
Some key takeaways from the announcement include:
Free expression: Meta will allow more conversation by lifting restrictions on topics that are part of mainstream discourse and will focus enforcement on illegal and high-severity violations.
Political content: Meta will take a more personalized approach to political content, giving users who want to see more of it in their feeds the ability to do so.
Meta’s New Approach to Content Moderation
Meta is shifting toward community moderation through a new "Community Notes" feature, similar to the approach X adopted after Elon Musk purchased the platform in 2022 and scaled back its content moderation.
Community Notes on X are now being studied as a possible tool to curb misinformation, although some argue that the rollback of content moderation on X may have led to an increase in misinformation.
Research thus far has indicated that crowdsourced fact-checking through these notes can effectively reduce misinformation, as users are more likely to retract false information when presented with credible corrections. Meta appears to be leaning into these findings.
For the details of this announcement, read this Newsroom post from Joel Kaplan, Meta’s Chief Global Affairs Officer.
What Is a Community Note?
As on X, Meta's Community Notes are meant to empower the community to decide when posts are potentially misleading and need more context. Contributors from a diverse range of perspectives decide what sort of context is helpful for other users to see.
Meta won't write Community Notes or decide which ones show up. Notes are written and rated by contributing users, and publishing one will require agreement between users with a range of perspectives to help prevent biased ratings. Users can sign up now to become Community Notes contributors.
Meta’s Content Moderation Background and the Significant Reversal
These changes mark a significant reversal in how Meta has handled false and misleading claims on its platforms. Meta launched an independent fact-checking program in 2016 in response to claims that it had failed to stop foreign actors from using its platform to spread disinformation and sow unrest among Americans.
In the years since, Meta has taken measures with this program to limit the spread of controversial content on its platform, such as misinformation around elections, anti-vaccination stories, violence, and hate speech.
Meta created safety teams and introduced automated programs to limit the visibility of these false claims. While Meta's fact-checking partners claimed they looked at both sides of every claim, many users have said the system restricted their voices.
Starting in the U.S., Meta says it is ending its partnerships with third-party fact-checkers, shifting its focus to providing an environment where users can express themselves freely.
Meta also plans to adjust its automated systems that scan for policy violations to focus only on checking for illegal and “high severity” violations such as terrorism, child sexual exploitation, drugs, fraud, and scams. All other concerns will need to be reported by users before Meta evaluates them.
What Is the Timing of These Changes?
Meta plans to roll out Community Notes in the U.S. over the next couple of months and will continue to refine the feature over the course of the year.
During the transition, Meta will remove its own fact-checking controls and stop demoting fact-checked content. Instead of overlaying full-screen interstitial warnings, it will use a much less obtrusive label indicating that additional information is available for those who want to see it.
What Is Our Recommendation?
MERGE will continue to monitor Meta's new content moderation process to see how the new policies affect the platform. Loosening content moderation could lead to a spike in misinformation, and surfacing more political content may alienate some users, although this remains to be seen.
The addition of Community Notes is meant to help curb the spread of misinformation while preventing valid information from being wrongfully removed by content moderation tools.
For organizations with an organic social strategy on Meta, we recommend closely monitoring user commentary on posts so your team can moderate quickly or limit comment options where needed.
For those with paid ad programs on Meta, continue to monitor the platform's reliability and safety for brand messaging in ads. However, since none of the brand safety tools are being removed, advertisers should be able to proceed as usual.
Lastly, organizations that have not yet focused on inventory filters, which give more control over the types of content their ads appear near, should review and update these settings. Different levels of inventory filtering can be applied depending on the types of sensitive content an advertiser wants to keep its ads away from.
Does your brand need help implementing an organic or paid media strategy on Meta? The media experts at MERGE can help. Get in touch.