Navigating Big Data dilemmas

By Cheryl Cooky, Jasmine R. Linabary and Danielle Corple

From Big Data & Society

Facebook is yet again dominating news headlines, this time for its #10YearChallenge encouraging users to post pictures of themselves from 10 years ago and today. While ostensibly a fun and trivial exercise, the viral challenge has raised questions about whether Facebook itself initiated the game to gather data on its users, leading Zak Doffman to claim in Forbes, “The world of Big Data has clearly changed in the last year”.

Indeed, it has. In March 2018, Facebook was embroiled in a much more significant scandal. Years earlier, the consulting firm Cambridge Analytica had hired a researcher to harvest data from over 50 million Facebook users under the guise of academic research. Cambridge Analytica used the data to develop software for predicting voter behavior, which it sold to political parties to create targeted campaign ads. In a Congressional hearing on the data breach, Mark Zuckerberg, Facebook’s founder and CEO, apologized for the invasion of privacy, stating, “we didn’t do enough to prevent these tools from being used for harm … That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy.”

Although academics and civil liberties groups have long raised concerns about the ethics of Big Data technologies and companies, these events have thrust the conversation into the mainstream, demonstrating the urgency of addressing issues of privacy, access, and corporate control of data. The recent controversies surrounding Facebook highlight a central challenge for academic researchers: while users may produce “content”, they often relinquish control over how that content is used. For individuals vulnerable to surveillance, harassment, exploitation, or other forms of abuse, the ways researchers access and use their social media data may place them at further risk of harm.

Using our own research on #WhyIStayed as an example, we have examined some of these challenges for Big Data researchers. Corporate control of online content, and of social media content in particular, raises critical questions about knowledge production itself and who stands to benefit. For example, what gets researched and who does it represent? Who gets to conduct research, and in whose interests? What knowledges can even be produced when access and content are controlled by corporate entities? What symbolic violence might we enact if, as researchers, we simply adhere to “regulatory norms”, such as those of institutional review boards or user agreements? And who is most vulnerable to that violence?

To address these considerations, we have drawn attention to the importance of feminist ethics for Big Data social media research. We have argued that power, context, and subjugated knowledges – key tenets of feminist holistic reflexivity – must each be central considerations in conducting Big Data social media research. While our study, which involves victims/survivors of domestic violence, highlights particular risks and vulnerabilities, we believe that the practices of feminist holistic reflexivity we have discussed can help other researchers navigate ethical issues in Big Data social media research.


Article details

Navigating Big Data dilemmas: Feminist holistic reflexivity in social media research

Cheryl Cooky, Jasmine R. Linabary, Danielle J. Corple

DOI: 10.1177/2053951718807731

Big Data & Society


About

Cheryl Cooky, PhD, is an associate professor in the American Studies program and Women’s, Gender, and Sexuality Studies program at Purdue University. She is the author of No Slam Dunk: Gender, Sport and the Unevenness of Social Change (Rutgers University Press, 2018).

Jasmine R. Linabary, PhD, is an assistant professor in the Department of Communication and Theatre at Emporia State University. Her research centers on issues of safety and inclusive participation in digital and physical spaces.

Danielle Corple is a PhD candidate in the Brian Lamb School of Communication at Purdue University. She studies issues of vulnerability in online and offline organizations.