Getty Images.
  • Facebook will reduce how much political content people see in their News Feeds, Axios reports.
  • The platform will specifically stop pushing as much content in front of people based on past engagement habits.
  • Facebook has tried to deprioritise political posts on its site over the past year.

Facebook is rolling out another effort to stymie the proliferation of potentially contentious political posts online.

Axios reported on Tuesday that Facebook will use negative user feedback to deprioritise political and current events content in users' feeds.

Specifically, the platform will stop relying so heavily on the algorithm that predicts how likely someone is to share or comment on a given post based on their past engagement, according to the outlet. Instead, it will lean on what users express interest in through surveys and other feedback.

The change, which the company plans to start testing in countries outside the US, could affect news publishers whose content focuses on politics. Facebook did not immediately respond to Insider's request for comment.

Facebook has been heavily criticised in recent years, especially in 2020, over its role in spreading political misinformation online. It has historically taken a hands-off approach to moderating all kinds of content in an attempt not to be the "arbiters of truth," as CEO Mark Zuckerberg has put it.

Critics have zeroed in on its algorithm specifically for pushing more extreme and partisan content in front of people it deems likely to engage with it, prompting them to spend more time on the platform.

The move detailed in the report isn't the first effort Facebook has made to limit the amount of political and potentially divisive content on its platform.

In June 2020, Zuckerberg wrote in a USA Today op-ed that the company would allow users to turn off political ads.

"Everyone wants to see politicians held accountable for what they say - and I know many people want us to moderate and remove more of their content," Zuckerberg wrote.

In January, after the US Capitol insurrection - whose participants were found to have organised in advance on Facebook and other websites - the company said it would stop recommending political groups to users for the "long term." US Senator Ed Markey a few days before had written a scathing letter to Zuckerberg, condemning Facebook groups as "breeding grounds for hate."

And in February, the company started testing a temporary reduction of political posts in some News Feeds for users in the US, Canada, and other countries. The move, according to Zuckerberg, came because many users on the platform didn't want their feeds to consist so heavily of political content.

But the company didn't start grappling with this problem in 2020. In late 2019, Facebook was fielding heavy blowback over its policy to not fact-check political advertising.

"We don't believe, however, that it's an appropriate role for us to referee political debates and prevent a politician's speech from reaching its audience and being subject to public debate and scrutiny," Facebook's VP of global affairs and communications Nick Clegg said at the time.
