The Oversight Board – better known to many as the Facebook Oversight Board – has invited public input as it formulates a "policy advisory opinion" regarding the Meta corporation's so-called "cross-check" system.

As revealed in the very first of the "Facebook Files" reports by the Wall Street Journal, the once-secret program was built with the intention of protecting high-profile accounts from abuse in the form of false-positive automated content moderation actions. But in practice, the list (which grew to include millions of people) often had the opposite effect, shielding from reprisal the abusive behavior of some of Facebook's digitally sanctified elite. The net result has been a further blow to public trust in Facebook, an increase in doubt as to the efficacy of the Oversight Board, and more fuel poured onto the burning fires of regulator interest.

The R Street Institute respectfully submits these comments in response to the request for public comment issued by the Oversight Board in connection with its consideration of a request by Meta for a policy advisory regarding the "cross-check" system used to help detect some "false positive" content takedowns in certain circumstances. The comments below offer thoughts and recommendations for changes in policy and practice by Meta, grounded in transparency and receptivity to external input, that can begin to address the harm that has been done and make some progress on the long journey toward improving public trust.

R Street recognizes the value, and arguably the necessity, of a user-centric system such as Meta's cross-check to supplement the normal operation of automated content review mechanisms. However, as detailed below, such a system must provide far more transparency, and on the basis of that transparency it should be reviewed and improved over time with the benefit of input from external perspectives, to ensure that its output is balanced and, on net, positive for the entire ecosystem.

The sine qua non of the cross-check system is Facebook's extensive use of automation for content moderation. At the scale of Facebook, automation is essential for effective content moderation and can serve to greatly mitigate harm through techniques such as "virality circuit breakers" driven by metadata associated with online activity. Indeed, one of the richest propositions discussed in the development of R Street's multi-stakeholder content policy project was the use of automation to detect potential policy violations in real time. However, as our project participants were quick to note, automation struggles with context, which often requires a very human understanding of evolving offline social cultures and structures. Furthermore, automation can have disparate impacts, with racial bias in hate speech detection a known example cited in our final report.

With these inconsistencies in mind, a properly designed system such as cross-check can layer in a degree of pre-programmed review tailored to known challenges for automated systems. Substantial external input into the design and operation of such a system, and adequate disclosure of its impact in practice, could greatly mitigate the problem of online harm and increase trust and confidence in the entire social network. However, that has not been the case thus far with Meta's cross-check system.

R Street appreciates the Oversight Board's intent to offer constructive guidance regarding improvements to the system. Many such improvements will focus on the mechanics of the system, such as the factors used in its review. Creativity is possible in these considerations above and beyond the current state: for example, a geographic, content-based scope could be used to trigger review of certain content in "hot" geopolitical regions regardless of the identity of the author. With regard to the third issue identified for comment by the Oversight Board, R Street submits for consideration that perfect neutrality and a full removal of bias are, in practice, patently impossible. Because of the wide-ranging scope of creative possibilities, which will likely be the topic of other submissions to the Board, R Street will focus the remainder of this submission on structural considerations around the system, such as how it is evaluated and used.