Facebook Algorithm Shows Gender Bias in Job Advertisements, Study Finds
Researchers have found that certain types of job advertisements are displayed disproportionately to men or to women, calling into question the company’s progress in removing bias from its algorithms.
The study, conducted by researchers at the University of Southern California, found that Facebook’s systems were more likely to show users a job advertisement if their gender matched that gender’s concentration in the relevant position or industry. In tests conducted late last year, ads recruiting delivery drivers for Domino’s Pizza Inc. were disproportionately shown to men, while women were more likely to be shown ads recruiting shoppers for the grocery delivery service Instacart Inc.
According to the study, the imbalance also applies to job postings for highly qualified positions. Facebook’s algorithms were more likely to show women an ad for a technical job at Netflix Inc., which has a relatively high proportion of women in its technical workforce, than an ad for a similar job at Nvidia Corp., a graphics-chip maker with a higher percentage of male employees, based on data from federal labor reports.
The results point to “a platform whose algorithm learns and perpetuates the existing differences in employee demographics,” the paper said, noting that Facebook’s algorithms appeared to produce skewed results even when an employer intended to reach a demographically balanced audience.
Federal law prohibits discrimination based on gender, race, age, and other characteristics in advertising for housing, employment, and credit products. While the law’s application to behavioral advertising remains contested, the federal government has argued that ads must be distributed in a way that does not impair protected groups’ ability to see them.
The paper highlights Facebook’s difficulty in understanding and managing the societal impact of its content recommendation systems. Several large tech companies have teams working to investigate and find ways to remove bias from their algorithms.
“Our system takes many signals into account to try to serve people the ads that interest them most, but we understand the concerns raised in the report,” said Beth Gautier, a Facebook spokeswoman. “We’ve taken meaningful steps to address issues of discrimination in ads and have teams working on ads fairness today.”
The USC researchers also ran nearly identical tests on LinkedIn’s serving of job advertisements but found no evidence of gender bias on the Microsoft Corp.-owned platform. The researchers didn’t examine other websites that show job listings, saying that running tests across more job categories would have improved confidence in the results.
Aleksandra Korolova, a former Google scientist and one of the study’s authors, said that studying Facebook’s systems was difficult without access to company data, but that she was surprised Facebook hadn’t examined the skewed distribution of its job advertisements. “They’ve known about this for years, and it’s an important issue for society,” she said.
Ashvin Kannan, LinkedIn’s vice president of engineering, said the USC paper’s conclusions are consistent with LinkedIn’s own findings, but that the company remains watchful for bias in its systems.
Questions of discriminatory and illegal ad targeting have been cropping up on Facebook for years. A 2016 report by ProPublica found that the company allowed employers to exclude older workers from viewing job postings and landlords to exclude ethnic minorities from ads.
The U.S. Department of Housing and Urban Development sued Facebook in 2019 over alleged discriminatory advertising, claiming the company allowed landlords to exclude users with interests in topics such as Hispanic culture, mobility scooters, and hijabs from seeing housing listings. Facebook settled the lawsuit, saying it had removed such targeting categories and committing to work with HUD to address the agency’s other concerns.
Facebook also settled a lawsuit filed by the American Civil Liberties Union, agreeing to take steps that include investigating any potentially discriminatory outcomes produced by its algorithms.
While the research sheds light on problems at Facebook, the industry has so far been unable to find a lasting solution to the problem, said Piotr Sapiezynski, a computer scientist at Northeastern University who worked with the USC team on previous research into racial disparities in job-ad delivery.
“Until we figure out how to do this correctly, the short-term solution is to turn off relevance optimization for housing, loan, and job ads,” he said.
Write to Jeff Horwitz at Jeff.Horwitz@wsj.com
Corrections & Amplifications
Piotr Sapiezynski is a computer scientist at Northeastern University. An earlier version of this article incorrectly described him as a computer science professor. (Corrected on April 9)
Copyright © 2021 Dow Jones & Company, Inc. All rights reserved.