The Frances Haugen leak, the gift that keeps on giving, exposes Facebook’s secret method of ranking “problematic” countries that require stricter content moderation. And it is clear that Israel features prominently
Mark Zuckerberg. Photo: Giktiim
The dam gates breached by Frances Haugen, the Facebook whistleblower, are still open – and today (Monday) several new stories were published that shed light on the problematic conduct of the American giant. One of them deals, among other things, with Israel and the content-moderation efforts Facebook invests in certain countries – while the rest are forgotten along the way.
Israel is among the “top” in the world
A new report from The Verge, based on documents leaked from Facebook that reached the US Congress and were passed on, partially redacted, to the media, reveals that Facebook created an internal tiering system for content moderation – mainly around election periods.
The internal ranking system included four tiers – tier 0, tier 1, tier 2, and “everything else” – with the content-moderation efforts and resources Facebook devotes to each country decreasing from tier 0 downward. The documents obtained by The Verge show that the highest tier contains three countries, among them the United States and India. These countries received the most resources, which included “war rooms” for managing the platform in real time, with dashboards for tracking network activity and direct lines to local actors in case any problems were identified.
The second tier – which includes five countries – holds Israel, Iran, Indonesia, Germany, and Italy. These countries are considered less explosive than those in tier 0 and receive almost all of the same resources – including the “war rooms” (or “enhanced operations centers,” as Facebook called them) – except for some internal resources allocated only to the three top-tier countries.
Tier 2 comprised 22 countries, which received increased content-moderation efforts – but not the “enhanced operations centers” granted to the previous eight countries. The last tier held the rest of the world’s countries, which received virtually no content-moderation effort from Facebook, except for election-related content – and only if the situation in them escalated. It is important to note that even then, review was handled by content moderators alone, with no investment of significant resources.
Facebook has never revealed the criteria by which countries are ranked, and the rankings are of course not subject to external review. What is certain is that Facebook took care to monitor content with a fine-tooth comb in its home country, the United States, and in two other countries where there was a significant chance of violence and instability arising from heated debate around elections. In practice, all of these resources apparently did not help when thousands stormed the Capitol building last January – in part over content posted on Facebook.
The leaked documents show that the upgraded resources Facebook allocated to the most problematic countries, Israel included, covered translating the user interface into all relevant local languages; translating the community standards in an attempt to reduce violations of them; building AI-based content-classification systems that can identify problematic content and misinformation in each country’s source languages; and creating teams that work 24/7 to analyze viral content and act promptly when calls for violence or misleading content spread.
Forgot the countries that needed oversight most
The leak shows that while Facebook placed certain countries in the higher tiers, it set aside others where the social situation was particularly volatile – and where its platform, far from helping to resolve the situation, made it worse. One of these countries is Ethiopia, which has been embroiled in a bloody conflict in recent years and where content-moderation efforts were nonexistent, because it did not sit in Facebook’s top tiers.
The documents also show that alongside Ethiopia, Facebook was unprepared to moderate content in other countries engulfed in bloody conflicts – such as Myanmar, which underwent a military coup in recent months, and Pakistan. Facebook also failed to assign translator teams to the tier-2 countries in an attempt to close language gaps – gaps that significantly impair content-moderation efforts.
The leaked documents also show that Facebook repeatedly cites the high cost of these content-moderation efforts, especially in the high-tier countries. Beyond the cost, it is also a slow process: according to the documents, it takes Facebook roughly a year to build the AI-based content-classification systems tuned to the source languages of each country.