There are rules about what you can post on Facebook and Instagram. They change over time, as does the way they are enforced by automated and human moderators, but in theory they are the same for all five billion users of the two sites.
Unless, that is, you are a celebrity, a politician, or a business partner of Meta, Facebook and Instagram’s parent company. Then your posts, along with those of 5.8 million other influential users, are handled by a programme known as cross-check.
These exemptions can have significant consequences. When the automated moderation system flags an ordinary user’s post, the post is removed immediately. When it flags a VIP’s post, the post stays up until human moderators have given it a second, third, fourth, or fifth look.
Take the Brazilian footballer Neymar, who in 2019 posted intimate images of another person, apparently without their permission, to his Facebook and Instagram accounts. The video clearly violated Meta’s content policies, which prohibit far milder forms of nudity, yet according to The Guardian it remained online for more than a day and had been viewed 56 million times by the time it was removed.
The reason for the delay? Neymar, who later announced a business partnership with Meta to promote Facebook Gaming, was on the cross-check list, and the cross-check system was struggling with a backlog at the time.
Delays like that, which average five days across the programme and rise to 12 days in the United States and 17 in Syria, are one aspect of cross-check sharply criticised by Meta’s Oversight Board, the semi-independent “court” created by Mark Zuckerberg to help settle difficult moderation questions.
The board has been reviewing the programme since last year, when whistleblower Frances Haugen exposed its extent by leaking company documents to the Wall Street Journal.
In a report published on Tuesday, the board urged Meta to reform the programme, arguing that it prioritises “users of commercial value” over the company’s “human rights and company values”.
Thomas Hughes, director of the Oversight Board administration, told Sky News the system had caused “real harm”. He stopped short of calling for it to be dissolved, however, saying “you need some sort of secondary review process”.
Instead, the board asked Meta to make cross-check more efficient and transparent, and to refocus it on human rights concerns, such as preventing the accidental removal of journalistic material.
The board said Meta should establish clear criteria for inclusion in cross-check and publicly mark the accounts of some of those covered, such as state actors and business partners. At present, even people on the cross-check list do not know they are included.
According to the report, Meta prefers not to enforce its rules against these partners, to avoid creating a “perception of censorship” and stirring up “public controversy”.
To limit the harm such delays cause, however, the board recommended that content flagged as “highly severe” at first review should be taken down while it is reassessed.
Meta is not obliged to follow the board’s recommendations, and has declined to do so on numerous occasions, but Mr Hughes said the company tends to implement the majority of them. In this case there are 32.
While Meta will not adopt all of them, he said, the rate at which it has implemented past recommendations suggests it will adopt most, and the board considers its recommendations feasible.
Yet although the report calls on Meta to “radically improve transparency around cross-check”, the board itself was not given full transparency, and its report is missing crucial details as a result.
Despite repeatedly asking, the board was not told who is on the cross-check lists, could not verify exactly how many people the programme covers, and was not given examples of posts that had been cross-checked.
The board complained in its report that “this limited disclosure impairs [its] ability to carry out its mandated supervision responsibilities”.
The board had previously said Meta “had not been fully forthcoming” about cross-check: the company failed to mention the programme when the board examined the suspension of President Trump, then claimed it was small when it in fact covered millions of users.
Ms Haugen accused Meta of “repeatedly lying about the scheme”, but Mr Hughes disagreed, saying he believed the information given to the board was accurate and complete, and that the board had “flexed its muscles” in investigating the programme.
Critics, however, say Meta’s fundamental problems are too big for the Oversight Board to fix, because implementing its most significant recommendations would require the company to employ many thousands more human moderators, particularly in countries other than the US and Canada.
The board found that 42% of cross-checked content came from those two countries, even though they account for just 9% of monthly active users.
“The Haugen documents reveal a picture of systemic inequalities in which the US, despite all its moderation issues, gets the lion’s share of moderation resources and almost everywhere else gets practically nothing,” said Cori Crider, director at Foxglove, a legal non-profit which is suing Meta on behalf of former Facebook content moderator Daniel Motaung.
“Until this imbalance is rectified, I don’t see how Oversight Board opinions can make much of a difference.”