In its report, the board calls for a “significant revision” of the double-review program known as “cross-check” to make it more transparent, responsive, and fair.
Currently, when posts or images that potentially violate Facebook or Instagram policies are flagged, they are removed immediately if they are deemed too risky and come from ordinary users.
But if the author is “whitelisted,” the content stays online while it undergoes further review, a process that often takes several days and sometimes months.
The report describes this two-tier system as “inequitable,” one that “offers additional protections to what certain users express, chosen in part based on Meta’s economic interests.”
As a result, “content that is identified as being contrary to the Meta rules remains visible on Facebook and Instagram, spreading virally and causing potential harm,” the board warned.
The board recommends speeding up reviews of content from public figures who post important human rights messages, and removing or hiding high-risk content while the internal review is pending.