Facebook has already decided how you will vote – News @ Northeastern

According to a new study by a team of computer scientists, Facebook wields significant power over political discourse in the United States thanks to an ad delivery system that increases political polarization among users.
The study – released this week by researchers from Northeastern University, the University of Southern California, and the nonprofit technology organization Upturn – reveals for the first time that Facebook delivers political ads to its users based on the content of those ads and on the company’s own information about its users, and not necessarily based on the advertiser’s intended audience.
“We noticed that Facebook disproportionately delivers an ad to the users who [Facebook] thinks are okay with the ad, based only on its content,” says Alan Mislove, professor of computer science at Northeastern and one of the authors of the paper.
Mislove says the results have serious consequences for democracy in the United States. Facebook is one of the world’s biggest advertising platforms, and, as the study shows, its ad delivery system creates informational filter bubbles for its users: citizens are shown ads that reinforce their existing political beliefs and are excluded from seeing ads that challenge those beliefs.
Alan Mislove is Professor of Computer Science at the Khoury College of Computer Sciences. Photo by Matthew Modoono / Northeastern University
In a statement to the Washington Post, a Facebook spokesman downplayed the severity of the findings.
“The results, which show that ads about a presidential candidate are delivered to people in that candidate’s political party, should come as no surprise,” Facebook spokesman Joe Osborne told the Post. “Ads should be relevant to the people who see them. It is always the case that campaigns can reach their desired audiences with the right targeting, the right objective, and the right spend.”
But Mislove says this is an oversimplification.
“I don’t think most people understand the level of optimization in online advertising,” he says. “When Facebook optimizes ads for relevance, it is also optimizing Facebook’s profit margin.”
Like many of the largest digital companies, Facebook keeps its algorithms under lock and key. To understand how ads are delivered to users, Mislove and his colleagues – a team that included Northeastern doctoral students Muhammad Ali and Piotr Sapiezynski – posed as political advertisers.
The researchers spent more than $13,000 on a series of ad campaigns to test how Facebook delivers political messages.
They focused on creating advertising campaigns for US Senator Bernie Sanders, a Democrat, and President Donald J. Trump, a Republican. At the time of the experiment (early July 2019), the real Sanders and Trump campaigns had spent the most on Facebook advertising among the major candidates of both parties, so the researchers were confident that their relatively small advertising budget would not affect the electoral performance of either Sanders or Trump.
The researchers largely repurposed real ads from both campaigns to test Facebook’s ad delivery system, with a special focus on the target audience. They used public records from North Carolina and Facebook’s own demographic information to create specific audiences sorted by political party affiliation.
Facebook and other online advertising platforms offer advertisers a variety of tools for targeting audiences – a practice called “microtargeting,” which is being reconsidered by some of the largest media companies, including Twitter and Google. (The researchers note that Facebook is also considering changes to its policies.)
Microtargeting allows advertisers to target specific demographics and get their ad in front of exactly the people they want to see it. Advertisers can also choose among different objectives, such as showing the ad to the largest number of users, which is what the researchers chose for their ads.
One of the problems the researchers uncovered in their study is that such targeting options have relatively limited influence on audiences compared with Facebook’s internal system for determining the “relevance” of an ad, Mislove says.
This system, a proprietary algorithm that Facebook keeps secret, determines who sees a particular ad and who doesn’t, Mislove says. And to optimize an ad’s success, Facebook’s algorithm delivers it to the people it believes will be inclined to like it.
Such optimization isn’t limited to political ads, Mislove says. It is probably the same system that Facebook uses to determine the relevance of every ad on its platform.
Mislove says that isn’t a problem for marketing ads, or for political ads meant to raise money by appealing to a campaign’s grassroots supporters, but it is a problem for an ad meant to change the mind of a voter who isn’t already on board with the message.
In one case, the researchers found that when they targeted an audience of users whom Facebook labeled as “likely to engage with political content in the US (Liberal)” and an equal audience of users labeled “likely to engage with political content in the US (Conservative),” 60 percent of liberal users saw their Democratic ads while only 25 percent saw the Republican ads.
In another ad run, the researchers simultaneously pushed Sanders and Trump ads to a conservative audience. All other things being equal, the Trump ad was delivered to 21,792 conservative Facebook users and the Sanders ad to 17,964 conservative users – almost 20 percent fewer people.
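As a quick sanity check on the figures above – the two delivery counts are the ones reported in the article, while the percentage depends on which ad you take as the baseline – here is a minimal sketch of the arithmetic:

```python
# Delivery counts for identical targeting of a conservative audience,
# as reported in the study described above.
trump_deliveries = 21_792    # Trump ad
sanders_deliveries = 17_964  # Sanders ad, same audience and settings

gap = trump_deliveries - sanders_deliveries
fewer_vs_trump = gap / trump_deliveries      # shortfall relative to the Trump ad
more_vs_sanders = gap / sanders_deliveries   # surplus relative to the Sanders ad

print(f"Sanders ad reached {gap:,} fewer users")
print(f"{fewer_vs_trump:.1%} fewer than the Trump ad "
      f"(equivalently, the Trump ad reached {more_vs_sanders:.1%} more)")
```

Depending on the baseline, the skew works out to roughly 18 to 21 percent, in line with the article’s “almost 20 percent” characterization.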
The researchers also found that a political advertiser who wanted to bridge this ideological divide would have to pay more for the ad – in extreme cases, up to two or three times more, Mislove says.
When the researchers ran a neutral ad asking people to register to vote, with all other parameters held the same, it reached a much more balanced mix of liberal and conservative Facebook users.
For Mislove, the results illustrate a broader problem facing society today – the sheer impact that invisible and unregulated algorithms have on everything we do.
“Whether you’re browsing Facebook or using Google Maps, there is an algorithm optimizing everything you see online,” he says. “And there is very little accountability and very little transparency about how these algorithms determine what that optimization looks like. I’m thinking about how we can measure and test these things.”
For media inquiries, please contact Mike Woeste at m.woeste@northeastern.edu or 617-373-5718.