Facebook has admitted a serious problem with the platform’s advertising function that is allowing racial discrimination on its site. But there is a way to fix it — if the company is willing.
In the spring of 2016, Facebook rolled out its “ethnic affinity” feature, which allowed advertisers to target Facebook users labeled as African American, Latino, or Asian American based upon their behavior on Facebook. Advertisers could opt to include or to exclude users in these categories. Facebook said that these labels were not equivalent to race because they were based not on users’ actual racial identities, but on whether they engaged with Facebook pages associated with those racial communities. Nonetheless, it identified the categories as “demographics” in its options for advertisers.
The system made it easy to exclude users marked as African American from seeing ads for anything, including job postings and credit or housing opportunities. Yet civil rights laws like the Fair Housing Act make this kind of discriminatory advertising illegal.
Using this system, ProPublica was able to place a housing-related ad that targeted house hunters and those likely to move while excluding users marked as African American, Asian American, or Hispanic. The story prompted an immediate outcry and drew the attention of the Department of Housing and Urban Development, which enforces fair housing laws.
Make no mistake, this is not simply an advertising problem — this is a civil rights problem made all the more dangerous by social media’s technological advances. Online personalization opens up significant possibilities for discrimination against marginalized communities, including people of color and other members of protected classes. In the offline world, we have thankfully moved past the era of housing advertisements that explicitly stated that people of certain races, religions, or ethnicities could not apply. But with behavioral targeting online, discrimination no longer requires that kind of explicit statement. Instead, a property manager can simply display ads for housing only to white people, or Christians, or those without disabilities.
In response to ProPublica’s 2016 investigation, Facebook pledged to solve this discrimination problem built into its ad targeting business. We at the ACLU and other advocates spent many hours helping the company move toward some fixes. We helped Facebook settle on a system that we were told would use machine learning to detect ads for housing, credit, or employment and treat them differently. In those categories, Facebook promised, ethnic targeting would be disabled, and advertisers would have to certify that they were not violating the law or Facebook’s anti-discrimination policy before their ads would run.
At the time, we praised Facebook’s changes — although they left some significant questions unanswered — hoping that this acknowledgment of civil rights law would become standard throughout the online advertising ecosystem.
Fast forward to last week. ProPublica tested the system again and found that Facebook still allowed advertisers to prevent users from seeing an ad for rental housing based on race, a category now rechristened “multicultural affinity.” ProPublica was also able to exclude people in wheelchairs and Spanish speakers, among others.
We have been extremely disappointed to see these significant failures in Facebook’s system for identifying and preventing illegal advertising discrimination. Facebook’s representations to us over the course of the last year indicated that this problem had been substantially solved, but it now seems clear that was not the case. Discrimination in the rental housing market is one of the most toxic and tenacious forms of contemporary discrimination. And, as the recent Pulitzer Prize-winning book “Evicted” demonstrates, discrimination in rental housing can have devastating consequences for families and communities.
In a statement, Facebook apologized for the failures revealed by ProPublica’s recent findings and continues to express a desire to get this right. The company now says that all advertisers who want to exclude groups of users from seeing their ads — and not just those advertising housing, credit, or employment — will have to certify that they are complying with anti-discrimination laws. And, just yesterday, Facebook announced that it would temporarily turn off all advertisers’ ability to exclude users by race while it continues to work on these problems.
Still, this story makes the need for greater transparency and accountability from these online platforms that much more urgent. Discrimination in the virtual world is no less damaging than offline discrimination, but it can be even more difficult to root out. People who are excluded from viewing Facebook advertisements, for example, would never know that the housing opportunity existed, and so they would not apply.
What’s worse, unlike in the pre-digital world, where organized communities and advocates could spot discriminatory ads and report them, it is impossible for someone who didn’t see a relevant housing ad to prove that discriminatory targeting is responsible. And of course, it is the most vulnerable Americans who tend to lose the most from predatory business practices.
The good news is that there is a way to find these issues: audit testing by academic researchers. Had Facebook allowed outside researchers to examine the system it had created to catch discriminatory ads, those researchers could have spotted the problems and shut down this mechanism for discrimination sooner.
There are whole communities of researchers ready, willing and able to conduct the audits that could help protect the public from some of the digital platforms’ most pernicious effects on civil rights. Having learned that it is not well-equipped to police its own systems, Facebook should commit to allowing independent audits. Justice, fairness and civil rights laws demand no less.