Facebook Inc's independent oversight board demanded more transparency from the social media giant on Thursday, saying the company was not "fully forthcoming" on how it deals with certain high-profile user accounts.
The comments follow a Wall Street Journal report last month that said millions of Facebook accounts belonging to celebrities, politicians and other high-profile users were exempted from some internal checks.
The board said Facebook has not been transparent about its 'cross-check' system, an internal program the social media network says it uses to double-check enforcement actions against certain users.
"Facebook needs to commit to greater transparency and to treat users fairly," the board said in a tweet.
In relation to its May decision to uphold the indefinite suspension of former US President Donald Trump's account after the January 6 riot, the board said Facebook did not mention the cross-check system when it referred the case, and only did so after being asked.
"Given that the referral included a specific policy question about account-level enforcement for political leaders, many of whom the Board believes were covered by cross-check, this omission is not acceptable," it said.
Facebook has asked the board, in the form of a policy advisory opinion, to review its cross-check system and make recommendations on how it can be changed.
A company spokesperson said the board's work had been "impactful," which is why it asked for input into the cross-check system.
Facebook created the board mainly to address criticism over how it handles problematic content; the board issues independent verdicts on a number of thorny content moderation decisions.
Going forward, the board will publish quarterly and annual transparency reports to assess whether its recommendations were implemented.
In its first quarterly report, the board said over half a million Facebook and Instagram users submitted appeals between October 2020 and the end of June 2021, more than a third of which concerned content covered by Facebook's rules on hate speech.