Implications of Facebook/Instagram Fact-Checking Mechanism: The Nigerian EndSARS Hashtag Instance
- Collins Okoh
- November 16, 2020
- Digital Rights
Photo by Glen Carrie on Unsplash
Introduction
Social media has become a swift and far-reaching channel for spreading news, both true and false, and is arguably the most dominant means of communication today.[1] Facebook, which also owns the social networking app Instagram and the private messaging app WhatsApp, therefore wields significant power over communication. Analysing the 2020 Reuters Institute Digital News Report, the BBC takes the view that Instagram, a Facebook property, will soon overtake Twitter as a news source, especially for young people.
An example of Facebook’s influence over communication is the recent Nigerian #EndSARS movement, a campaign by Nigerian youths and their supporters against police brutality and for the abolition of the Special Anti-Robbery Squad (SARS). The way content was taken down during the campaign illustrates the deficiencies in Facebook’s fact-checking mechanisms.
The Facebook and Instagram Fact-Checking Mechanism
Facebook takes a three-pronged approach to mitigating misinformation. The first prong is the removal of content or accounts that misinform. The second is the reduction of misinformation through algorithms that identify and filter fake stories and headlines designed to drive traffic to sites that profit from advertising. Third, Facebook informs users about the nature of certain posts by adding tags or watermarks to posts and accounts suspected of disseminating false or misleading information.
To carry out these steps, Facebook explains that it uses two methods: algorithms and independent third-party fact-checkers. Facebook uses the results and information from the third-party fact-checkers to update its algorithms. Across the globe, Instagram works with 45 independent third-party fact-checkers certified by the International Fact-Checking Network. The Network primarily targets the pages of public figures and institutions that spread information of public interest and assesses the validity of such communications. Facebook limits the visibility of any information flagged as false by the fact-checkers. Additionally, it penalises further dissemination of that information by temporarily or permanently disabling the source page and/or any page promoting it.
The reality is that Facebook is often ineffective in protecting the public against misinformation. Its fact-checking mechanism can be counter-productive: it mislabels genuine social media awareness of real occurrences and either limits the visibility of such content or removes it permanently. The error is compounded when the company goes as far as penalising the source page or account for disseminating genuine information. This stifles the voices of people who rely mostly on social media to raise awareness and communicate with the wider world in times of crisis. Although Facebook allows an appeal against any such action, the approach remains a challenge, since timely dissemination of information in times of crisis is of great value. The case of Nigeria during the #EndSARS movement illustrates this discouraging reality.
The Counter-Effect of the Facebook and Instagram Fact-Checking Mechanism
Nigeria experienced a critical moment when its youths took to the streets to protest police brutality. The protest gave rise to the trending hashtag #EndSARS on 4 October 2020. Of particular focus is the incident that took place at the Lekki Tollgate in Lagos on Tuesday 20 October 2020, when Nigerian soldiers opened fire on peaceful protesters who sat on the ground waving the national flag. The shooting claimed the lives of many protesters and injured others. Several eyewitnesses confirmed the incident, and human rights organisations such as Amnesty International verified the event.
The incident, referred to as the ‘Lekki Massacre’, inspired Black Tuesday, a day on which Nigerians commemorate the deaths of the peaceful protesters. Additionally, Nigerians and well-wishers took to social media to post, among other symbolic pictures, a blood-stained flag used to wrap one of the victims, in order to raise awareness of the massacre and express their grief. These symbolic posts were flagged as false news on both Facebook and Instagram. Similar treatment was given to comments containing certain word combinations related to the incident, as well as to the majority of posts under the hashtag.
Human rights advocates argue that ‘straight forward censorship would undermine the fundamentals of a democracy, doing harm to the public good.’ No one should therefore be penalised for true statements. The public good in this scenario is the freedom from police brutality that the protesters sought. The information regarding the Lekki Massacre was also not false, as Facebook’s labels implied. Nor was the restriction necessary to protect the rights and reputations of others, the permissible ground for limiting expression under Article 19(3) of the International Covenant on Civil and Political Rights.[2] Hence, the limitation by Facebook is unlawful, and the affected public could bring a class action against Facebook.
Conclusion
The taking down of posts during the #EndSARS protest shows that Facebook’s current fact-checking mechanisms can be counter-productive. The events suggest that Facebook was unaware of, or out of touch with, a topic that was of interest not only in Nigeria but across the continent and the world. If the posts were taken down on algorithmic recommendations, this demonstrates the fallibility of automated decision-making and the need for human intervention. Nigeria is Africa’s most populous country, with a significant number of monthly active users: 19 million on Facebook and 5.70 million on Instagram as of 2018. As a widely used means of communication, Facebook ought to apply more nuanced representation and decision-making to trending issues in Nigeria, regardless of whether a post has been reported by other users.
[1] Newman N et al, Reuters Institute Digital News Report 2020 (2020) 11. Available at https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2020-06/DNR_2020_FINAL.pdf
[2] Article 19(3), International Covenant on Civil and Political Rights.