How Datafied Society Reinforces Harmful Biases

FIBER
Jun 4, 2018


By Hanke van der Lee

As citizens of a free society, we have a duty to look critically at our world.
But if we think we know what is wrong, we must act upon that knowledge. [Tony Judt]

Moderator Tijmen Schep presents a few central questions at the start of Coded Matter(s) #14 Big Bias

Today, big data and curated technology have an ever-increasing impact on our lives and thought processes. We are stuck in a way of thinking that feeds into the racial and gendered biases that continue to shape the world around us. The dependency on software and computers that were created to make our lives easier seems to trap us in a vicious cycle instead. The urge to reflect critically upon the effects of technology lingers in the corner. How could the arts serve us further in comprehending what is at stake in this datafied society? Coded Matter(s) #14 Big Bias revealed.

‘Embedded in every technology there is a powerful underlying idea,’ Mimi Onuoha said during Coded Matter(s). The purpose for which that power is used, however, differs. It made me reflect on how the recent Cambridge Analytica case exposed a vivid example of the power of datafication. The case of this tech company, one among many that make money by selling data, showed the real risk carried by the large-scale exchange and appropriation of mass data for advertising purposes. Institutions have gained the ability to watch our moves and influence our behaviour. Furthermore, Cambridge Analytica, during its heyday, set up stereotypical profiles in order to pinpoint different target audiences. Consumers were classified into focus groups, such as ‘high neuroticism and conscientious’, in order to predict their consumer behaviour: a shallow description of a human being, designed to meet as many commercial targets as possible through narrowed-down advertising. These so-called psychographics were used to segment people by personality and were combined with informational aspects such as demography. This lack of fluidity can be harmful to those who do not want to be reduced to biases and datafication.
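To make that reduction concrete, here is a minimal, purely illustrative sketch of psychographic segmentation. The field names, scores and thresholds are invented and this is not a description of Cambridge Analytica's actual pipeline; it only shows how continuous personality traits, combined with demographics, collapse a person into a coarse advertising bucket.

```python
# Hypothetical psychographic segmentation. All field names, scores and
# thresholds are invented; the point is only the shape of the reduction.

users = [
    {"name": "A", "age": 62, "neuroticism": 0.81, "conscientiousness": 0.74},
    {"name": "B", "age": 23, "neuroticism": 0.22, "conscientiousness": 0.35},
]

def segment(user):
    """Collapse a person into a coarse personality label plus an age band."""
    traits = []
    if user["neuroticism"] > 0.6:
        traits.append("high neuroticism")
    if user["conscientiousness"] > 0.6:
        traits.append("conscientious")
    age_band = "55+" if user["age"] >= 55 else "35-54" if user["age"] >= 35 else "18-34"
    return (", ".join(traits) or "baseline", age_band)

for user in users:
    print(user["name"], "->", segment(user))
# A -> ('high neuroticism, conscientious', '55+')
# B -> ('baseline', '18-34')
# Everything else about these people is discarded: that loss is the
# 'lack of fluidity' the paragraph above describes.
```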

Mimi Onuoha at Coded Matter(s) #14 Big Bias

As I said, the control over our bodies and lives seems to be placed outside ourselves. Web 2.0 has made it possible to incorporate data surveillance and to see what citizens are doing both offline and online. People are aware of the great lengths to which tech companies go in using their data. It made me think of the Panopticon prison model, designed by Jeremy Bentham, which can help us understand how institutionalized social control functions. The Panopticon is an ‘all-seeing’ architecture: an annular building with individual cells that are all controlled and watched from a central watchtower. The prisoners in the building cannot see the guard but are well aware that they can be watched 24/7. Because of this awareness of the guard’s gaze and the thought of being surveyed, without ever being sure that the guard is actually watching, the Panopticon creates the sense that no move can be made without the guards knowing. Thus, people will behave according to what is expected of them. Michel Foucault translated this model to the workings of modern society. The information that is gathered is used as a form of social control over citizens within neoliberal society (1). Those who are ‘good citizens’ do not risk being punished. However, this Panopticon system does not treat citizens equally; instead it has incorporated racial biases that date back to the colonial era. As supposedly colour-blind software has already revealed: discipline, power and agency differ across the racial spectrum.

Mimi Onuoha on Algorithmic Violence at Coded Matter(s) #14 Big Bias

So, what exactly makes these biases so harmful? And where do they come from to begin with? Institutions have been harmful for decades, whether consciously or subconsciously. As Karen S. Glover established in her research ‘Citizenship, hyper-surveillance, and double-consciousness: Racial profiling as panoptic governance’, the United States has incorporated racial construction within the governance of citizenship for years now (2). Furthermore, the non-profit organization the Sentencing Project has published research revealing the significant failure of the jurisdictional and criminal justice system with respect to the black community. Results from the report ‘Black Lives Matter: Eliminating Racial Inequity in the Criminal Justice System’ show how this system benefits Caucasians, while people of colour are grossly disadvantaged and do not get the required resources or legal support.

Apart from these considerations, the black community also far more often faces inordinate arrests and forced settlements such as low-quality parole conditions or unfavourable plea deals. This research shows how significantly the criminal justice system fails with regard to the equal treatment of people of colour (3). Thus, when we incorporate software that does not debunk these wrongful assumptions and fails to fulfil its supposed role as an objective machine, these racialized notions are reinforced. W.E.B. Du Bois’ work on the double consciousness of people of colour revealed how people who constantly live under the white gaze struggle with an internal conflict of not belonging, of being abject within a mostly white-constructed society. In effect, this leads to a feeling of inferiority within the black community. Glover emphasizes that the racial biases incorporated in our governance of citizenship have a causal effect and strengthen these feelings of inferiority within the double consciousness.

Zach Blas lays out in his lecture performance ‘Metric Mysticism’ how Silicon Valley mythologizes data to hide the bias embedded in data practices

In this light, it should come as no surprise that gender and racial biases are part of algorithms. Simply put: the input given to computers and their software is very likely to predict their output. Thus, when data is biased in itself and incorporated in software, it is inevitable that the outcome contains the same prejudiced aspects. Perpetuating these same wrongdoings in society through software could have detrimental effects on what emancipation movements have tried to establish over the years. We want to be regarded as actual human beings instead of commercial datasets that fit into one particular box. Such reduction takes away the ‘agency’ of human beings to control and live their bodies as they see fit. Moreover, if we want to build a society that moves towards more equality and fluidity, we need to seriously re-evaluate the roles that software and datafication play in the reinforcement of binaries. We need to consider the moral dimension that these programs, built by tech companies, carry in actual society and its social structures. It is up to the humans who program algorithms to reassess and review how data is presented to the users who interact with it. With that, our task as humans, to change what is wrong with our biased software in a world that has become increasingly dependent on technology, lies before us.
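The ‘biased input predicts biased output’ point can be illustrated with a deliberately naive sketch. The records and the ‘model’ below are invented; the neighbourhood field stands in for a proxy variable that correlates with race, which itself never appears as an input, yet the rule learned from past decisions reproduces the old disadvantage anyway.

```python
# Toy "bias in, bias out" sketch. The records and the naive model are
# invented; 'neighbourhood' acts as a proxy that correlates with race.

history = [
    # (merit_score, neighbourhood, approved) - past human decisions
    (0.9, "north", 1), (0.7, "north", 1), (0.5, "north", 1),
    (0.9, "south", 0), (0.7, "south", 0), (0.5, "south", 1),
]

def train(records):
    """'Learn' nothing more than the historical approval rate per neighbourhood."""
    counts = {}
    for _merit, hood, approved in records:
        total, yes = counts.get(hood, (0, 0))
        counts[hood] = (total + 1, yes + approved)
    return {hood: yes / total for hood, (total, yes) in counts.items()}

def predict(model, merit, hood):
    """Approve only if applicants 'like this' were usually approved before."""
    return model[hood] >= 0.5 and merit >= 0.5

model = train(history)
print(model)                          # roughly {'north': 1.0, 'south': 0.33}
print(predict(model, 0.9, "north"))   # True
print(predict(model, 0.9, "south"))   # False: equal merit, inherited disadvantage
```

Nothing in this sketch mentions race, yet the prediction differs by group because the historical data already did; that is the sense in which supposedly neutral software fails as an ‘objective machine’.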

Hanke van der Lee is an RMA student in Gender & Ethnicity at Utrecht University and is currently writing her thesis on the impact of Facebook news framing on Dutch political morale. In this research she traces the intersections of online identity shaping with the effects of algorithms and news framing on Facebook users’ perceptions, from a Gender Studies perspective. Additionally, the research examines how these online interactions shape our own attitudes towards current societal debates, focusing in particular on the case of the refugee crisis.

Bibliography
1. Green, Stephen. ‘A plague on the Panopticon: Surveillance and power in the global information economy.’ Information, Communication & Society 2/1 (1999): 29.

2. Glover, Karen S. ‘Citizenship, hyper-surveillance, and double-consciousness: Racial profiling as panoptic governance.’ In: Deflem, Mathieu and Ulmer, Jeffrey T. (eds.), Surveillance and Governance: Crime Control and Beyond (Volume 10), 2008: 242.

3. The Sentencing Project. ‘Black Lives Matter: Eliminating Racial Inequity in the Criminal Justice System.’ December 2014. <http://sentencingproject.org/wp-content/uploads/2015/11/Black-Lives-Matter.pdf>

For more information about FIBER’s event series Coded Matter(s) visit: http://codedmatters.nl/

Designer & researcher Femke Snelting explores in the Possible Bodies project how biases are embedded in medical imaging data


FIBER

Amsterdam-based platform and festival for audiovisual art, digital culture and electronic music.