Facebook is now allowing comments to be turned off on public posts for the first time since the social media platform launched in 2004.
Users will also be offered more insight into and control over what content appears in their News Feed as part of what the company calls a "significant shift" in how it operates its algorithms, which have been widely criticised in recent years.
The ability to disable comments on posts will be given to all people and pages as of today, Facebook says, adding the feature is "intended to be used reactively in the case of harassment and other unwanted interactions".
But it could be used however users want.
"It is alleged that social media fuels polarisation, exploits human weaknesses and insecurities, and creates echo chambers where everyone gets their own slice of reality, eroding the public sphere and the understanding of common facts,"
For news outlets, this could free up staff time currently used for moderating and removing comments that reveal legally suppressed information and other content that breaches their guidelines.
For businesses or politicians, it could mean turning off comments whenever they don't like what the commenters are saying, for whatever reason.
To coincide with Facebook announcing the changes, the company's vice president of global affairs Nick Clegg has written an essay defending its algorithms.
"It is alleged that social media fuels polarisation, exploits human weaknesses and insecurities, and creates echo chambers where everyone gets their own slice of reality, eroding the public sphere and the understanding of common facts," writes Clegg.
"Perhaps it is time to acknowledge it is not simply the fault of faceless machines? Consider, for example, the presence of bad and polarising content on private messaging apps - iMessage, Signal, Telegram, WhatsApp - used by billions of people around the world.
"None of those apps deploy content or ranking algorithms. It's just humans talking to humans without any machine getting in the way. In many respects, it would be easier to blame everything on algorithms, but there are deeper and more complex societal forces at play."
Facebook's algorithms have often been cited in recent years as a leading cause of increased polarisation around the world, ultimately contributing to atrocities such as the attack on the US Capitol earlier this year and the 2019 terrorist attack on two mosques in Christchurch.
That atrocity was particularly closely associated with Facebook, as the terrorist used the platform to livestream the mass murder.
Prime Minister Jacinda Ardern has frequently cited the role of social media algorithms in her responses to the Christchurch terror attack, and signatories of the Christchurch Call committed to "review the operation of algorithms", among other things.
A report on violent extremism and disinformation online, published by New Zealand's Classification Office in late 2020, specifically blamed algorithms for spreading the Christchurch terrorist's footage.
"The video was amplified by algorithms on platforms like Facebook, and even reached victims' family members and friends. It was a horrific wake-up call to how digital technology can be weaponised in new and devastating ways," said the Classification Office.
"Globally, we have since seen attacks with clear links to the Christchurch terrorist attacks and a growing movement of white supremacists, 'incels', and violent extremist action often linked to disinformation and conspiracy theories spread online."
Amid a public spat between Facebook and Apple over privacy, Apple CEO Tim Cook earlier this year called out social media algorithms as being responsible for real-world violence, among other ills.
"It is long past time to stop pretending that this approach doesn't come with a cost - of polarisation, of lost trust and, yes, of violence. A social dilemma cannot be allowed to become a social catastrophe."
"At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement - the longer the better - and all with the goal of collecting as much data as possible," said Cook.
"What are the consequences of prioritising conspiracy theories and violent incitement simply because of their high rates of engagement? What are the consequences of not just tolerating, but rewarding content that undermines public trust in life-saving vaccinations? What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more?
"It is long past time to stop pretending that this approach doesn't come with a cost - of polarisation, of lost trust and, yes, of violence. A social dilemma cannot be allowed to become a social catastrophe."
Among the defences in Clegg's essay is an insistence that polarisation and extremist content are bad for Facebook's bottom line.
"Before we credit 'the algorithm' with too much independent judgement, it is of course the case that these systems operate according to rules put in place by people. It is Facebook's decision makers who set the parameters of the algorithms themselves, and seek to do so in a way that is mindful of potential bias or unfairness," writes Clegg.
"The reality is, it's not in Facebook's interest - financially or reputationally - to continually turn up the temperature and push users towards ever more extreme content. Bear in mind, the vast majority of Facebook's revenue is from advertising.
"Advertisers don't want their brands and products displayed next to extreme or hateful content - a point that many made explicitly last summer during a high-profile boycott by a number of household-name brands."
However, while Facebook argues against criticisms of its algorithms, its latest actions show it clearly agrees there is room for improvement.
The impact of the changes will likely be monitored by concerned governments and regulators around the world.