Facebook failed to fix ad-targeting gender discrimination, study finds

Two years ago, researchers at the University of Southern California published a study showing that Facebook’s algorithms could deliver job and housing ads to audiences skewed by race and gender. However, the methodology they used didn’t account for differences in the job qualifications of the targeted audiences. In a new paper, the coauthors of the original research say they have found evidence of a gender skew in job ad delivery on Facebook even when controlling for qualifications.

“Our results show Facebook needs to re-evaluate how their algorithms that optimize for user relevance or their business goals in a non-transparent way may result in discriminatory job ad delivery,” Aleksandra Korolova, assistant professor of computer science at the University of Southern California and a lead author on the study, told VentureBeat via email. “Our study also shows that, from an external auditing point of view, Facebook has not made visible progress in improving the fairness of its ad delivery algorithms despite prior studies and a civil rights audit that raised concerns about the role its algorithms may play.”

In response, a Facebook spokesperson told VentureBeat via email: “Our system takes into account many signals to try and serve people ads they will be most interested in, but we understand the concerns raised in the report. We’ve taken meaningful steps to address issues of discrimination in ads and have teams working on ads fairness today. We’re continuing to work closely with the civil rights community, regulators, and academics on these important matters.”

Many previous studies have established that Facebook’s ad practices are at best problematic. This came to a head in March 2019, when the U.S. Department of Housing and Urban Development filed suit against Facebook for allegedly “discriminating against people based upon who they are and where they live,” in violation of the Fair Housing Act.

When questioned about the allegations during a Capitol Hill hearing in October 2019, CEO Mark Zuckerberg said that “people shouldn’t be discriminated against on any of our services,” pointing to newly implemented restrictions on age, ZIP code, and gender ad targeting. Facebook claims its written policies ban discrimination and that it uses automated controls — introduced as part of the 2019 settlement — to limit when and how advertisers target ads based on age, gender, and other attributes.

Platforms like Facebook leverage algorithms to deliver ads to a subset of a targeted audience. Every time a user visits the company’s website or app, Facebook runs an auction among advertisers who are targeting that user. In addition to the advertiser’s chosen parameters, such as a bid or budget, the auction takes into account an ad relevance score, which is based on the ad’s predicted engagement level and value to the user.
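
To make that mechanism concrete, the sketch below is a purely illustrative Python model of such an auction, not Facebook’s actual system: a bid is combined with a relevance score built from predicted engagement and estimated value to the user. All class names, weights, and numbers here are assumptions made for the example.

```python
# Purely illustrative sketch of an ad auction that scores candidate ads for one
# user. The names, weights, and formula are assumptions, not Facebook's system.
from dataclasses import dataclass

@dataclass
class AdCandidate:
    advertiser: str
    bid: float                   # advertiser's bid for this impression
    predicted_engagement: float  # estimated chance the user engages (0 to 1)
    estimated_value: float       # estimated value/relevance to the user (0 to 1)

def relevance_score(ad: AdCandidate) -> float:
    """Blend predicted engagement and estimated value into one relevance number."""
    return 0.5 * ad.predicted_engagement + 0.5 * ad.estimated_value

def total_score(ad: AdCandidate) -> float:
    """Auction score: the bid weighted by the ad's relevance to this user."""
    return ad.bid * relevance_score(ad)

def run_auction(candidates: list[AdCandidate]) -> AdCandidate:
    """Return the candidate with the highest combined score."""
    return max(candidates, key=total_score)

if __name__ == "__main__":
    ads = [
        AdCandidate("Ad A", bid=1.00, predicted_engagement=0.08, estimated_value=0.6),
        AdCandidate("Ad B", bid=1.20, predicted_engagement=0.05, estimated_value=0.5),
    ]
    print("Winner:", run_auction(ads).advertiser)
```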

To determine what skew might be present in these algorithms, the researchers developed an auditing methodology for benchmarking the delivery of job ads, an area where U.S. law prohibits discrimination based on certain attributes. Title VII of the U.S. Civil Rights Act of 1964 allows organizations that advertise job opportunities to show preference only on the basis of bona fide occupational qualifications, the requirements necessary to carry out a job function.

In the course of experiments with a nearly $5,000 ad campaign budget, the researchers ran ads on Facebook with gender-neutral text and images across three categories:

  • A low-skilled delivery driver job for Domino’s or Instacart
  • A high-skilled software engineer job for Netflix or Nvidia
  • A low-skilled but popular job among a particular ad audience

Since their methodology compared two ads within each category, the researchers selected pairs of jobs at companies whose workforces they knew had different gender distributions. They also ran the ads on LinkedIn to compare the initial findings against a different platform’s delivery algorithms.

According to the researchers, the results show evidence of a statistically significant gender skew on Facebook. Across three campaign runs, the Domino’s ad was delivered to a higher fraction of men than the Instacart ad, mirroring the companies’ workforces: 98% of delivery drivers for Domino’s are male, while over 50% of Instacart drivers are female. Likewise, a higher fraction of women on Facebook saw the software engineering ad the researchers created featuring Netflix (where 35% of employees in tech-related positions are female) than the one featuring Nvidia (where 19% of all employees are female). LinkedIn showed no such skew in either case.
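
The statistical question at the heart of such an audit is whether the gender breakdown of one ad’s delivered audience differs from its paired ad’s by more than chance would allow. The Python sketch below shows one standard way to check that, a two-proportion z-test; the choice of test and the delivery counts are illustrative assumptions, not the paper’s reported data or exact method.

```python
# Illustrative skew check: did two paired, gender-neutral job ads reach
# significantly different fractions of men? The test choice and the counts
# below are assumptions, not the paper's reported data or exact method.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(men_a: int, total_a: int,
                          men_b: int, total_b: int) -> tuple[float, float]:
    """Return (z, two-sided p-value) for H0: both ads reached the same
    fraction of men."""
    p_a, p_b = men_a / total_a, men_b / total_b
    pooled = (men_a + men_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

if __name__ == "__main__":
    # Hypothetical delivery counts for two ads targeted at the same audience.
    z, p = two_proportion_z_test(men_a=620, total_a=1000, men_b=480, total_b=1000)
    print(f"z = {z:.2f}, two-sided p = {p:.4f}")  # a small p flags a non-chance skew
```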

“Facebook’s job ad delivery is skewed by gender, even when the advertiser is targeting a gender-balanced audience,” Korolova and coauthors wrote in the paper. “Our findings suggest that Facebook’s algorithms may be responsible for unlawful discriminatory outcomes.”

The researchers recommend several steps that might address this skew, including publishing more detailed targeting and delivery statistics, replacing ad hoc privacy techniques with rigorous approaches, and reducing the cost of auditing. They emphasize that privacy-preserving techniques such as differentially private data publishing, which aims to output aggregate information without disclosing any individual’s record, might strike a balance between auditability and privacy.

“We recommend ad platforms use approaches with rigorous privacy guarantees, and whose impact on statistical validity can be precisely analyzed, such as differentially private algorithms, where possible,” the researchers wrote. “Overall, making auditing ad delivery systems more feasible to a broader range of interested parties can help ensure that the systems that shape job opportunities people see operate in a fair manner that does not violate anti-discrimination laws. The platforms may not currently have the incentives to make the changes proposed and, in some cases, may actively block transparency efforts initiated by researchers and journalists; thus, they may need to be mandated by law.”
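
As a rough illustration of the kind of rigorous technique the researchers have in mind, the Python sketch below releases per-gender delivery counts through the Laplace mechanism, a standard building block of differential privacy. The epsilon value and the counts are assumptions made for the example, not figures Facebook or the researchers publish.

```python
# Sketch of differentially private release of aggregate delivery counts via
# the Laplace mechanism. Epsilon and the counts are illustrative assumptions.
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) as the difference of two exponential draws."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_release(counts: dict[str, int], epsilon: float) -> dict[str, float]:
    """Release a histogram of delivery counts with epsilon-differential privacy.
    Adding or removing one person changes a single bucket by at most 1, so the
    L1 sensitivity is 1 and Laplace noise with scale 1/epsilon suffices."""
    scale = 1.0 / epsilon
    return {group: count + laplace_noise(scale) for group, count in counts.items()}

if __name__ == "__main__":
    # Hypothetical per-gender delivery counts for a single job ad.
    raw = {"men": 620, "women": 380}
    print(dp_release(raw, epsilon=0.5))
```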
