For more than two years, a team of civil rights activists has been examining Facebook's actions under a microscope. They have issued various interim reports; this week they produced their final report, which evaluates how well Facebook has done in implementing their extensive recommendations. The short answer: not very well.
The report covers a wide range of activities, including eliminating hate speech, policing posts that threaten democratic elections and the collection of US Census data, changes in advertising policies and algorithmic bias, incitement of violence, and policies promoting diversity and inclusion. It would be a tall order for many tech companies to resolve all of these issues, but for a business the size and scope of Facebook, I would expect to see more coherent and definitive progress.
At first glance, Facebook seems to be trying, or at least trying to try. "Facebook is in a different place than it was two years ago," as the report mentions. The company has begun several initiatives toward making amends for some of its most reprehensible actions, including:
- Setting up better screening of posts that encourage hate speech or promote misinformation or harassment. The auditors mention that while there have been improvements during the study period, specific recommendations haven’t been implemented.
- Prohibiting ads that mention negative perceptions of immigrants, asylum seekers or refugees.
- Creating new policies prohibiting threats of violence relating to voting and election outcomes.
- Expanding diversity and inclusion efforts, although in interviews with Facebook staff the auditors found there is still plenty of room for improvement and that the company could do a lot more.
- Eliminating explicit bias in targeting housing, employment and credit application ads by age, gender or ZIP code.
- Making changes to its Ad Library to make it easier and more transparent for researchers to search for bias and to determine whether Facebook is making progress in implementing these policies.
But when you read the entire 90-page report, you see that while the company has moved (and is continuing to move) toward more equitable and appropriate treatment, it has only just begun to move the needle. "It is taking Facebook too long to get it right," the auditors state.
Megan Squire, a CS professor at Elon University, wrote to me with her reaction. “The report highlights the same kinds of inconsistencies and persistent failures to act that I have experienced as a researcher studying the hate groups. These groups still routinely use Facebook’s platform to recruit, train, organize, and plan violence. Onboarding civil rights expertise is something they have yet to do in the white supremacist and domestic terror space, but I hope they strongly consider something like this in the future.” Squire refers to hiring civil rights specialists to round out various teams. The final report mentions this hiring in several contexts, but doesn’t touch on it when it comes to the sections on fighting hate speech and improving Facebook’s content moderation.
One thing that occurred to me as I was reading the report is how many of the issues mentioned have to do with the actions of our President and his campaign staff. Many of his statements, on Twitter and Facebook and in his campaign advertising, violate the auditors' recommended actions. The auditors mention a trio of Trump posts in May which contained false claims about mail-in voting and an attempt at voter suppression. The posts were removed by Twitter but left online by Facebook. "These political speech exemptions [justifying keeping them online] constitute significant steps backward that undermine the company's progress and call into question the company's priorities," the auditors say. "For many users who view false statements from politicians or viral voting misinformation on Facebook, the damage is already done without knowing that the information they've seen is false." The auditors mention civil rights advocates' claims that Trump's content is "troubling because it reflects a seeming impassivity towards racial violence."
The auditors specifically address this, saying “powerful politicians do not have to abide by the same rules that everyone else does, so a hierarchy of speech is created that privileges certain voices over less powerful voices.” They mention how Facebook has reined in anti-vax proponents but ironically has been “far too reluctant to adopt strong rules to limit misinformation about voting.” They go on to state, “If politicians are free to mislead people about official voting methods (by labeling ballots illegal) and are allowed to use not-so-subtle dog whistles with impunity to incite violence against groups advocating for racial justice, this does not bode well for the hostile voting environment that can be facilitated by Facebook in the United States.”
Facebook has tried to blunt the auditors' criticism, saying that from January to March 2020, it removed 4.7M pieces of hate speech-related content, more than twice what was removed in the prior three months. That's progress, but just the tip of the hate-speech iceberg. Earlier this week, Zuck once again promised to address the auditors' issues. And last week, the company announced it is still trying to lock down API access to private data, after yet another revealing breach of private user data was discovered. Clearly, they could do a better job. "Facebook has a long road ahead on its civil rights journey," the auditors conclude. I agree. It is time we see progress over promises.