Facebook is trapped in an eternal braid.
While it seeks to moderate content at full scale for the safety of its users, it still refuses to accept that it is a media platform. Yet it is not entirely clear that it is just a technology platform either. This media/technology duality could be Facebook’s original sin; so why doesn’t the social networking giant pronounce its position once and for all?
I had a brief interview with Daniel Mumbere of Africanews about Facebook’s plans to secure its users in Africa from harmful content. This article is an extension of my thoughts on Facebook’s content moderation plans for sub-Saharan Africa.
Since 2017, Facebook has captured the headlines for all the wrong reasons. It has come under fire for enabling fascism, promoting impenetrable echo chambers, facilitating the weaponization of fake news, and disrupting democratic dispensations. Most pronounced is its role in enabling Russia’s supposed meddling in the 2016 U.S. elections, which saw alt-right Donald Trump elected to the highest office. It has also been faulted for enabling the genocide against the Rohingya through its failure to moderate inflammatory content and hate speech on its platform.
Closer to home, Facebook was accused of enabling large-scale data manipulation and profiling of voters in Kenya’s 2017 elections in order to swing public opinion. Through its global partnerships with Cambridge Analytica and similar companies, users’ data has been shared without consent and, in many cases, used against their interests. Sceptics say that Facebook has shared even more data with traditional partners such as Oracle and Cisco. Some reports show that despite warnings and summonses by the U.S. and the E.U., Facebook continues to enable nefarious data practices such as racially targeted and political advertising.
It’s important to understand why Facebook has moved so aggressively into the content moderation/review business.
As highlighted above, the majority of Facebook’s 2.2 billion users have been victims of fake news and inflammatory content in one way or another. For example, during the 2016 U.S. presidential election, fake news was arguably shared more widely than genuine news, because (fake) news that appeals to people’s biases naturally follows the path of least resistance, especially given the rise of impervious filter bubbles and echo chambers on social media.
In a bid to clamp down on the viral spread of content, especially fake content, Facebook founder Mark Zuckerberg announced in May 2017 that Facebook would hire a global army of 3,000 to “review the millions of reports we get every week, and improve the process for doing it quickly.” These content moderators would be stationed mainly at the Facebook campus in Dublin, Ireland, but would also work from distributed stations around the world.
Facebook calls content moderators by another name: market specialists. Recently, it put out a call for market specialists in seven African languages (Afrikaans, Amharic, Hausa, Luganda, Oromo, Somali, and Swahili) to be based at its campus in Dublin.
The Cleaners (directed by Moritz Riesewieck and Hans Block) is a riveting 2017 documentary about the role of content moderators in the inflamed world of social media. In particular, content moderators are armed with tools to determine whether a piece of content — be it child pornography, political propaganda, or hate speech — should be “filtered” or “deleted” altogether. The documentary delves into the precarity of this digital trade, citing subsistence wages and occupational hazards including trauma and exhaustion.
Meanwhile, The Kenya Wallstreet reported last week that Facebook will employ approximately 100 reviewers by the end of the year, supporting a number of languages including Somali, Afaan Oromo, Swahili, and Hausa. These would be based at a content review centre in Nairobi run by Samasource, a non-profit digital labour platform founded in San Francisco whose African operations span Gulu, Uganda, and Nairobi, Kenya.
Facebook has around 170 million users in Africa, with 94% accessing the social network via mobile. Egypt, Algeria, Morocco, Nigeria, South Africa, and Kenya constitute nearly 65% of the total number of Facebook users on the continent.
About 3,000 living languages and dialects are spoken throughout Africa; Sudan, Nigeria, and the DRC alone account for about 1,500 of them. However, it seems Facebook is targeting the more widely spoken languages and dialects south of the Sahara rather than the countries with the highest numbers of users on the continent (Egypt, Algeria, and Morocco).
Facebook’s platform model is creating large text repositories for many local languages that previously had little of a printed canon. Swahili, a language spoken by over 90 million people in East Africa, is more commonly used on text-based social media than most local and some international languages (compare about 80 million French speakers and some 35 million Portuguese speakers south of the Sahara).
Is Facebook a media platform?
Facebook is careful not to be labelled a media platform for fear of deeper scrutiny and subjection to multi-jurisdictional media laws. However, even as a technology platform, its governance and content mediation do not exist in a vacuum. In other words, the algorithms that superintend over newsfeeds are built and deployed by humans and corporations that carry their own biases. As political and technology scholar Robert Gorwa so well articulates: how can we strive for accountable algorithms if the corporate entities [such as Facebook] that build and deploy them are not accountable, fair, transparent, or ethical, and if they seem to be entrenching, rather than combatting, existing social prejudices?
While Facebook continues to invoke platitudes of innovation and disruption, it functions no differently from the traditional media enterprises and corporations of old, even though it does not, at least explicitly, accept its position as a media platform.
In fact, in what seems like a soul-searching mission over the last year, Facebook has courted NGOs, think tanks, researchers, and journalists, among others, to institute an independent global oversight board for content decisions. In a January 28, 2019 blog post, Facebook released a draft charter outlining the scope of the oversight board and part of its mandate. This board would “render independent judgment, [that] is transparent and respects privacy” on issues around freedom of expression, technology and democracy, procedural fairness, and human rights.
As Facebook hires ever more content moderators and institutes an independent board for content decisions, it only shows that the company is rewriting its rulebook on its own terms. Indeed, it might appear to be in the crosshairs of the U.S. and an unflinching E.U., but it is steering clear of the scrutiny typically applied to traditional (media) entities.
Despite the relatively small number of users, Facebook’s platform power and influence in Africa is remarkable. The capability to stem ethnic hate speech and puncture political propaganda during elections through human moderation only concentrates more power in Facebook’s hands. In fact, as stated in the fine print, the global oversight board would be obligated to the people who use Facebook, not to Facebook the company. Yes, it would not be accountable to governments or other relevant authorities, but ambiguously to “Facebook users”, who would themselves be moderated by digital labourers and faceless algorithms that answer to Facebook (and its rulebooks).
Soon, we’ll all be accountable to the Facebook corporation-nation-state.