
Essential Insights on Meta’s New Content Guidelines
January 7, 2025

When Meta announced significant changes to its content moderation standards last week, it ignited the first major social media dispute of 2025. The changes loosen the rules governing what users can post on its platforms and end the company’s third-party fact-checking program.
Many believe that this move is an effort to appease Donald Trump, the incoming US president, who has previously voiced his distaste for Meta. But why did Zuckerberg and his colleagues make the choice they did? And how will these changes affect Facebook, Instagram, and Threads users—for better or worse?
We’ll examine the main elements of Meta’s contentious policy shift and explore why the company chose this particular course of action at this particular time.
What is changing?
Meta is ending its third-party fact-checking program and revising its standards to allow more in-depth discussion of sensitive topics. In place of fact-checkers, it will use a crowdsourced Community Notes system, similar to the one on X (formerly Twitter). Meta also plans to reintroduce politically controversial content into users’ feeds, reversing a four-year trend of reducing its prominence.
The policy update specifically targets issues that are regular flashpoints of political controversy, such as immigration and gender identity, with the stated aim of promoting a wider range of viewpoints and conversations.
The Intercept has seen Meta’s amended moderation standards, which are currently being distributed to staff. It has also published examples of remarks and descriptions that were previously forbidden but are now permitted under the updated rules.
The wording of Meta’s update raises the prospect of further revisions down the road, depending on the political issues being discussed at the time.
In the meantime, Meta’s own data (described below) suggests that removing fact-checkers will likely make it harder for the company to fight false content. Furthermore, the return of political material may make contentious conversations more visible on Meta’s platforms.
What is the rationale for eliminating fact-checkers?
Zuckerberg asserts that Meta’s fact-checking partners are politically biased:
“Some of the people whose job is to do fact-checking, a lot of their industry is focused on political fact-checking, so they’re kind of veered in that direction. We kept on trying to basically get it to be what we had originally intended, which is not to judge people’s opinions, but to provide a layer to help fact-check some of the stuff that seems the most extreme. But it was never accepted by people broadly. I think people just felt like the fact-checkers were too biased, and not necessarily even so much in what they ruled, but a lot of the time it was just what types of things they chose to even go and fact-check in the first place.”