Why Everyone Is Worried About The Facebook Data Sold To Cambridge Analytica

Cambridge Analytica, a voter profiling company with a link to Donald Trump’s campaign, collected private information from over 50 million Facebook users without their permission, The New York Times reported on Saturday. The information wasn’t stolen; it was gathered via an app that asked for consent before accessing user data. But it was used for a purpose that was never transparently disclosed to users. They had been told it was for “academic purposes,” yet the Facebook data was sold to Cambridge Analytica, a foreign company funded by right-wing billionaire and Trump donor Robert Mercer, and then used to build psychographic and demographic profiles that help campaigns target voters.
Paul Grewal, Facebook’s vice president and deputy general counsel, told the Times that “this was a scam — and a fraud,” since the social media site was also unaware that the information had been sold to a third party, which violates Facebook’s Terms of Service. “Protecting people’s information is at the heart of everything we do,” Grewal stated. “No systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.” Nonetheless, Grewal said that “it’s a serious abuse of our rules,” and Facebook suspended Cambridge Analytica’s account. According to the Times, the harvested Facebook data included “details on users’ identities, friend networks and ‘likes.’” In a statement, Facebook said it asked Cambridge Analytica and Dr. Aleksandr Kogan, who created the app, to delete the data. The Times reported that the data still exists, unprotected, on Cambridge Analytica’s servers. In a series of tweets, Cambridge Analytica claimed it complied with Facebook’s request.
That might not seem like much information, but those small details help political strategists in big ways. Vox points out that in a 2016 speech, Alexander Nix, the CEO of Cambridge Analytica (whose board has included Steve Bannon), explained how this information could help politicians. “If you know the personality of the people you’re targeting,” Nix said, “you can nuance your messaging to resonate more effectively with those key groups.”
Specifically, psychographic data could help a politician by, for example, revealing that a certain part of the country is very interested in infrastructure, so the campaign can create more Facebook ads on the topic to target that audience. The candidate might even hop on a plane and give a speech about infrastructure in that town. It’s something the Trump campaign did throughout the 2016 election, which is why Special Counsel Robert Mueller has asked Cambridge Analytica to turn over internal documents. Though Kogan, who mined the Facebook data and fraudulently sold it to Cambridge Analytica, is Russian-American, Vox reports that, as of now, “no definitive evidence has emerged that connects Cambridge Analytica and the Trump campaign to Russia’s efforts to influence our election.”
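To see how this kind of targeting works in practice, consider a minimal Python sketch. Everything in it is hypothetical: the district labels, the interest signals, and the simple topic counts are all invented. Real psychographic models score personality traits from far richer data, but the basic logic of matching message to audience looks something like this.

```python
from collections import defaultdict

# Hypothetical interest signals: (region, topic) pairs inferred from
# harvested profile data. Real datasets would hold millions of records.
signals = [
    ("OH-12", "infrastructure"),
    ("OH-12", "infrastructure"),
    ("OH-12", "healthcare"),
    ("PA-07", "trade"),
    ("PA-07", "trade"),
    ("PA-07", "infrastructure"),
]

# Count how often each topic shows up per region.
topic_counts = defaultdict(lambda: defaultdict(int))
for region, topic in signals:
    topic_counts[region][topic] += 1

# Pick the dominant topic in each region and target messaging there.
for region, counts in topic_counts.items():
    top_topic = max(counts, key=counts.get)
    print(f"{region}: run ads about {top_topic}")
# -> OH-12: run ads about infrastructure
# -> PA-07: run ads about trade
```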
These revelations raise the question: Are expertly targeted social media ads the new normal when it comes to elections? Casey Lynn Fiesler, an assistant professor of information science at the University of Colorado Boulder, confirmed in a phone interview with Refinery29 that this is something we can expect to see more of in the future. Advertisers have long used psychographics, which Fiesler describes as “attitudes or psychological criteria” that can be inferred from what you’re doing, to sell products. Facebook has also been using your “likes” to target ads. “[Facebook] is very transparent about doing this,” she says. “You can go into your ad preferences and see how they’re categorising you.” The categories are based on things like which pages you like and which ads you click on. Fiesler says that Facebook knows she’s a knitter who likes Star Trek, and sends her ads based on those preferences.
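Fiesler’s knitting-and-Star-Trek example can be illustrated with a short Python sketch. The page names and the mapping from pages to interest categories below are invented for illustration; Facebook’s real taxonomy is proprietary and vastly larger, but the principle of turning likes into ad categories might look something like this.

```python
from collections import Counter

# Hypothetical mapping from liked pages to inferred interest
# categories. These entries are invented for illustration;
# Facebook's actual taxonomy is proprietary and far larger.
PAGE_TO_CATEGORY = {
    "Knitting Daily": "Crafts & Hobbies",
    "Star Trek": "Sci-Fi & Fantasy",
    "Star Trek: Discovery": "Sci-Fi & Fantasy",
    "Trail Runner Magazine": "Fitness",
}

def infer_ad_categories(liked_pages):
    """Tally the interest categories suggested by a user's likes."""
    counts = Counter(
        PAGE_TO_CATEGORY[page]
        for page in liked_pages
        if page in PAGE_TO_CATEGORY
    )
    # Strongest-supported categories first.
    return [category for category, _ in counts.most_common()]

# A user like Fiesler's example: a knitter who likes Star Trek.
likes = ["Knitting Daily", "Star Trek", "Star Trek: Discovery"]
print(infer_ad_categories(likes))
# -> ['Sci-Fi & Fantasy', 'Crafts & Hobbies']
```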
The problem here, says Fiesler, whose focus is research ethics, is context. People expect that Facebook will use their data to serve them ads, but not to help a political candidate win. Fiesler calls what Cambridge Analytica did an “expectation violation.” People are willing to allow access to their Facebook data “so they can take a quiz and find out what Game of Thrones character they are,” knowing the company behind the quiz is mining their data, and that of their Facebook friends, for advertising purposes. But in the case of Cambridge Analytica, which got its information second-hand from an app created by Kogan, who in turn got access with Facebook’s permission, that sense of safety was misplaced. “I think it’s safe to say that if people were told truthfully what it was for,” Fiesler said of the data being used for political gain, “people would have cared.”
If it’s true, as Facebook claims, that Kogan lied to the social media conglomerate about the purpose of his research, Fiesler said it’s hard to place too much blame on Facebook. There’s the question of how much ethical responsibility Facebook has to its users, but when someone else breaks the rules, it complicates the conversation. “Can we expect people to follow the rules?” Fiesler asks. “Or should a platform be doing something more serious to make sure rules are followed?”
After asking this, though, Fiesler says there is actually a more important conversation to be had in the wake of the Cambridge Analytica story, an even more complicated one about media literacy. “Saying this is a technological problem is too simplistic,” Fiesler says. “Facebook can come up with rules and algorithms that will detect fake news and flag it. You can put a Band-Aid on it, but it’s not going to fix the underlying problem: Why are people creating it? Why are people sharing it? Why are people believing it?”
These are really hard problems to solve, which is why it’s simpler to blame Facebook. The trouble is that targeting voters on social media is going to keep happening in politics, and it’s only going to get more advanced. It’s hard to stop psychographic advertising with rules and regulations, which is why the public has to learn how to identify it. Facebook can and should help in that process, but ultimately it’s up to users to learn how to detect it on their own.
It won’t be easy, but Fiesler says the public needs to start thinking of “more social solutions than tech solutions” to this problem, and it starts with being more protective of our personal data. “Think before you give access to your entire Facebook profile to an app so that you can find out what Game of Thrones character you are,” she says. “Not just for yourself, but for your friends, too.”