Despite ongoing promises to protect the platform from illegal activity and threatening behavior, another release of documents has revealed that Facebook did little to stop the human rights abuses it was helping to propagate. Company documents not only confirm that there were many instances of women alleging criminal conduct on the platform, but also that Facebook, in its own words, was "under-enforcing on confirmed abusive activity."
Approximately two years ago, Apple threatened to pull Facebook and Instagram from its App Store over concerns that the platforms were being used as tools to trade and sell maids in the Mideast. That threat roughly coincided with certain crackdowns Facebook undertook as the 2020 US Presidential Election approached, in the wake of the Cambridge Analytica scandal, in which more than 50 million Facebook profiles were harvested and used to target American voters.
These latest leaked Facebook documents show that as early as 2018 the company knew it had a problem with what it labeled "domestic servitude." The documents also confirm that Facebook defined the problem as a "form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception."
Facebook even went so far as to create an acronym for the issue of human trafficking, calling it "HEx, or human exploitation." To quell Apple's concerns, Facebook apparently disabled over 1,000 accounts on the platform. Labeling the problem and claiming to disable these accounts seemed to be enough for Apple: shortly after Facebook shared the minimal procedures it had initiated, Apple apparently dropped its threat to pull the Facebook and Instagram apps from its devices.
The recent release of papers by Facebook whistleblower Frances Haugen, which comprises Facebook's own internal analysis of the problems, described an issue the social media company had only scratched the surface of, acknowledging that "domestic servitude content remained on the platform." Perhaps the most salient point contained within the Facebook documents is the following statement from Facebook's own program analysts, who concluded: "Removing our applications from Apple platforms would have had potentially severe consequences to the business."
The problem of human trafficking continues today across both the Facebook and Instagram platforms. The recently leaked internal documents reveal instances where Facebook engineers encountered problematic messages in maid-recruiting agencies' inboxes, including one in which a Filipina is specifically described as being "sold" by her Kuwaiti employers.
One leaked Facebook document confirmed: "In our investigation, domestic workers frequently complained to their recruitment agencies of being locked in their homes, starved, forced to extend their contracts indefinitely, [being] unpaid and repeatedly sold to other employers without their consent." The internal report concluded that, faced with these huge red flags, "agencies commonly told [the maids] to be more agreeable."
The Mideast remains a crucial source of work for women in Asia and Africa, who are merely trying to provide for their families back home, and Facebook has acknowledged that some countries within the region have “especially egregious” human rights issues when it comes to laborers’ protection. The Facebook whistleblower reported the following shocking conclusion: “We also found recruitment agencies dismissing more serious crimes, such as physical or sexual assault, rather than helping domestic workers.”
The weak protests from Apple and the supposed reforms from Facebook appear to have done little to curb the current epidemic of online human trafficking. Just last week, CNN conducted a search using the terms listed in Facebook's internal research on the subject.
Active Instagram accounts purporting to offer domestic workers for sale, similar to the Facebook accounts that researchers had flagged and removed, were still live and viable on the site. After CNN asked Facebook about those suspicious Instagram accounts, the company removed the accounts and posts, confirming that they had indeed violated its purported policies.
A quick search for maids in Arabic will also bring up accounts featuring posed photographs of Africans and South Asians with ages and prices listed next to their images. The Philippine government has a team of workers who focus solely on scouring Facebook each day for human trafficking posts like these, in order to protect desperate job seekers from criminal gangs and unscrupulous recruiters using the site. The problem, however, remains largely under the radar.
A report from the BBC's Arabic service on the human trafficking trade in the Mideast in October of 2019 reportedly prompted Apple's threat to remove Facebook's apps from its platform. The recently leaked documents reveal that Facebook acknowledged being aware of both the exploitive conditions faced by foreign workers and the use of Instagram to buy and trade maids online even before this 2019 report.
According to leaked internal documents, Facebook engineers found that nearly three-fourths of all problematic posts, which included video of maids along with screenshots of related conversations, occurred on Instagram. The engineers determined at the time that actual links to maid-selling sites predominantly affected Facebook, rather than Instagram.
Facebook’s own analysis in 2019 of the instances of human trafficking found that over 60% of the offending material came from Saudi Arabia, with about a quarter coming from Egypt. Facebook was able to break down the predominant risk and isolate the regions that appeared to be violating basic human rights. The information was always available, yet there is no confirmation that Apple insisted on learning more about the threats.
The computer conglomerate recognized the significant issues involved but somehow determined that naming the problem and clearing out 1,000 accounts was an adequate solution. This reflects the bigger problem: our cultural dependence on Facebook has created an environment in which many would rather look away than face a decrease in profits or convenience.
The consumers who continue to use Facebook are no better, as the ongoing lack of massive outrage means that too many are still willing to sacrifice human rights protections for the ease of sharing family photos and checking in on old flames. It’s time to hold everyone accountable: Facebook, Apple, and even ourselves. We knew human trafficking was likely occurring and now we have proof of it. The question is: will we be bold enough to actually do something about it this time?
For a tutorial on how you can take action now by completely deleting your Facebook and Instagram accounts, click here. You can also help by sharing this article with every person you know and imploring them to take action.
Amee Vanderpool writes the SHERO Newsletter and is an attorney, published author, contributor to newspapers and magazines, and analyst for BBC radio. She can be reached at firstname.lastname@example.org or follow her on Twitter @girlsreallyrule.
Paid subscriptions and one-time tributes, embedded in each article, allow me to keep publishing critical and informative work like this, which is often made available to the public. Thank you. If you like this piece and want to further support independent journalism, you can forward this article to others, get a paid subscription or gift subscription, or donate once, as much as you like, today.