Misbehaving Systems: In Conversation with Tega Brain

FIBER
12 min read · Aug 5, 2020


Eccentric engineering, the fantasy of neutrality and giving a language to uncertainty

Tega Brain, Julian Oliver & Bengt Sjölén — Asunder

FIBER Festival 2020 will take place from 24–27 September. With the festival theme of Instability we explore new ways of adapting to an age of planetary and societal changes. Leading up to the event, we are investigating the unstable layers of four subdomains: Becoming Unstable, Sensing Place, Radical Streams and Posthuman Territories. Within Posthuman Territories, we aim to investigate how artistic engineering can play a role in developing an inclusive and sustainable technological infrastructure. What invisible conflicts may emerge due to the changing landscape? Over the next few weeks, we begin our unstable journey by speaking with different artists whose work connects to these themes. Next in this series is Tega Brain (AU).

Tega Brain is an Australian-born artist and environmental engineer whose work examines issues of ecology, data systems and infrastructure. She has created wireless networks that respond to natural phenomena, systems for obfuscating fitness data, and an online smell-based dating service. Her work has been shown at the Vienna Biennale for Change and the Guangzhou Triennial, and in institutions like the Haus der Kulturen der Welt and the New Museum, among others. Her practice often involves building experimental infrastructures that enact different relations: thinking of infrastructures as negotiations, and designing in ways that produce mutualistic relationships. Her work takes the form of installations that prototype these ideas, using computational models and machine learning within environmental research, and questioning the perspectives and limitations that computation produces. Another part of her practice looks at computation in the realm of our social lives, examining data collection, surveillance, targeted content, and how algorithmic systems shape our experiences. These pieces exist as online interventions or participatory projects. We spoke with Tega about engineering’s role in creating a less extractive relationship between technology and the biological world.

Rhian Morris: You propose a form of ‘eccentric engineering’ to create misbehaving systems/technologies. What can we, as humans, learn from these surprising interactions with nonhumans?

Tega Brain: I think we experience technologies as misbehaving all the time! There’s this narrative that technology solves problems, that it augments some kind of human capacity, but so much of our interaction with technology is about fixing and maintaining systems when they don’t behave as expected.

In my practice a big source of inspiration is where systems leak or fail or misbehave. First of all, these moments reveal the logic and agenda of a particular technology, and they can also open a gap for thinking about how another world might be possible. If a system behaves differently, it opens up one’s imagination to a world that might accommodate this.

The example I talk about a lot is water infrastructure, where leaking is conventionally thought of as a failure; a lot of energy and effort goes into plugging up leaks and trying to monopolize water as a resource. And yet if you look at water leaks, they often support other ecologies. So a leak is a redistribution of resources and is often an opportunity for other species to thrive. A leak gives us a glimpse of a world where systems are deliberately designed so that a certain percentage of a resource is distributed to other species and lifeforms.

In my practice a big source of inspiration is where systems leak or fail or misbehave.

Leaks in water systems offer opportunities for redistributing resources among other ecosystems

When a system fails, it often redistributes resources or power in some way. Even information leaks like Snowden’s or the Panama Papers offer a potential redistribution of power in how information is available. In the space of surveillance systems, the argument goes that if these technologies fail for a particular group of people, we have to fix them and make them more accurate or more generically applicable, as in the case of face recognition technologies that still do not work well on non-white people.

But rather than trying to perfect these technologies, we actually need to be having a conversation about whether they should be built at all. Where is it appropriate for data-driven systems to be used? How do these systems redistribute power? By that I mean: do they help or hinder protest? Do they make decision-making clearer or more opaque? Do they lead to a more equitable distribution of resources or not? The goal should not be to create ideal or objective systems, because that’s not possible, but to question the rationale behind any emerging technology.

Despite the fact that Silicon Valley claims to be progressive, disruptive and innovative, emerging technologies are typically very conservative and reinforce the status quo. They augment inequalities and give leverage to those already in positions of power. I think art has an important role to play in questioning these logics and narratives, and in reminding us that computation is not the only way to organise and see the world.

The goal should not be to create ideal or objective systems, because that’s not possible, but to question the rationale behind any emerging technology.

RM: Your work focuses on a shift from infra to inter structures, to examine how we can design our systems to support species other than our own. How can such a shift in design affect our way of relating with the world?

TB: We don’t have a choice anymore. I use the term inter structure because infra means hidden from view, and in the Anthropocene, when we are all confronted with climate issues daily, there is no hiding our systems out of sight anymore. We can’t continue to send waste out into the ocean or atmosphere and pretend that it has gone away. That was a fantasy of the 20th century.

Tega Brain — Coin Operated Wetland

The word inter structure is an attempt to think about these systems as connections, as negotiations with other lifeforms, to think about them as active agents and as opportunities for designing relations. If COVID has shown us anything it’s that we can radically change our personal behaviour — the latest data is that global carbon emissions have dropped by 17% compared to this time last year, and this is good but not enough. We need this to be more like a 117% reduction, and that means addressing these issues at the infrastructure level. We can stop flying, but unless we redesign our systems, we’re not going to have a livable future.

A lot of these questions around utility, failure, glitch and materiality are very alive in art. Artists have been looking at these ideas for a long time and so this history gives a rich context for reflecting on how our systems work and where they don’t.

RM: Your project Asunder responds to a growing interest in the application of AI to critical environmental challenges by creating a fictional ‘environmental manager’ that proposes and simulates future alterations to the planet to keep it safely within planetary boundaries. In what way did this project highlight the fact that data-driven systems cannot depoliticise or neutralise decision-making?

TB: Asunder questions the techno-solutionism of artificial intelligence and of environmental engineering and geoengineering. There is a growing interest in using artificial intelligence and machine learning to automate decisions about environmental management and geoengineering, and I think we are going to see more discussion of this as the climate emergency gets worse.

Geoengineering is a broad and conflicted term, one which Holly Jean Buck writes about really well. What is geoengineering? A lot of people are comfortable with planting a trillion trees to draw down carbon, but they are not so comfortable with the carbon-sequestering machines Bill Gates is trying to build. However, most geoengineering proposals rely on algorithmic decision-making. If we were going to make a drastic intervention into earth’s climate system, it would not be a one-off event but a long, ongoing issue of maintenance, like city-wide garbage collection. There is this need to collect data and model earth’s climate in order to make decisions about how much one intervenes.

Tega Brain, Julian Oliver & Bengt Sjölén — Asunder

The project Asunder tackles these propositions by presenting an autonomous environmental manager. It selects different regions of the earth and manipulates satellite imagery to generate geoengineering scenarios. Each scenario is then modelled using a real open source climate model. The work questions what handing over power to computational processes would look like — and the answer is it looks very bizarre and apocalyptic. Most of the scenarios generated are economically or physically impossible or politically unpalatable for human societies. It’s one thing to design an AI for a self-driving car and a whole other to design AI for a planetary situation that involves 6 or 8 billion people.
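As a rough illustration of the kind of generate-and-simulate loop described above, here is a minimal sketch. Everything in it is a hypothetical stand-in: the region and intervention lists, the function names, and the single-number ‘planetary boundary’ checks are all invented for illustration and bear no relation to the project’s actual code or to a real climate model.

```python
import random

# Illustrative thresholds only; not the real planetary-boundaries figures.
BOUNDARIES = {"co2_ppm": 350.0, "land_use_change": 0.15}

REGIONS = ["Sahara", "Amazon Basin", "Greenland", "Murray-Darling Basin"]
INTERVENTIONS = ["afforestation", "solar farms", "wetland restoration",
                 "albedo modification"]

def generate_scenario(region):
    """Stand-in for the step where Asunder manipulates satellite imagery:
    here a scenario is just an intervention with an arbitrary intensity."""
    return {"region": region,
            "intervention": random.choice(INTERVENTIONS),
            "intensity": random.uniform(0.0, 1.0)}

def run_climate_model(scenario):
    """Stand-in for a real open-source climate model run; returns
    made-up indicator values that vary with the scenario intensity."""
    return {"co2_ppm": 415.0 - 80.0 * scenario["intensity"],
            "land_use_change": 0.10 + 0.20 * scenario["intensity"]}

def within_boundaries(indicators):
    """True only if every indicator stays inside its illustrative limit."""
    return all(indicators[k] <= BOUNDARIES[k] for k in BOUNDARIES)

if __name__ == "__main__":
    for region in REGIONS:
        scenario = generate_scenario(region)
        indicators = run_climate_model(scenario)
        verdict = ("within boundaries" if within_boundaries(indicators)
                   else "breaches boundaries")
        print(f"{region}: {scenario['intervention']} "
              f"(intensity {scenario['intensity']:.2f}) -> {verdict}")
```

In this toy version the two invented boundaries pull in opposite directions, so every scenario breaches at least one of them, which loosely echoes the point above: most machine-generated scenarios turn out to be impossible or unpalatable.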

RM: Data-driven systems are trained on data input by humans, and are therefore inevitably tainted by human political bias. Will it ever be possible to create an AI system which is neutral?

TB: Neutral is a fantasy. Every data collection, modelling and simulation process is done from a particular view which you can’t escape — there’s no all-seeing eye that will show you what the world is; it will always escape us.

Neutrality is connected to the idea of objectivity. There is a lot of feminist scholarship from thinkers like Donna Haraway that pushes back against this idea of complete objectivity and says that if we want to get towards a thoughtful discussion of reality, it’s about situatedness: knowing where you come from and which expectations you arrive with. This helps us see the context in which a certain data set is produced. For example, computing is a technology that has emerged mostly from west-coast US culture, and it’s written in English. There have been projects where people try to write programming languages in Chinese or Arabic, such as in the work of the artist and developer Ramsey Nasser, but at some point you have to implement in English because the operating systems are all written in English. So all of a sudden you have a bias there, and you can’t get around that unless you build a completely different stack (the term for the layers of hardware and software that make up contemporary computing platforms), which is a multi-decade undertaking.

RM: What do you think we can work towards with regard to AI, if not neutrality?

TB: Just because it’s not neutral doesn’t mean that it’s not powerful or useful. The climate denialists out there have shown us that critiques of technology can be used for any agenda: they have utilised doubt in the space of climate modelling, and they’ve leveraged this to create a situation where we are not making the changes to production that we should be making. The models that we have are all that we have, and we need to use the perspectives they provide to make decisions so that we can have a future.

Just because data-driven perspectives of the world are not neutral does not mean we should abandon all computing. Rather, we need to give a language to uncertainty and educate the public that uncertainty in a prediction doesn’t mean it’s not worth taking into account. AI is powerful when it’s used as one tool amongst many. We can look at the predictions or perspectives it offers, but we have to weigh them against other ways of knowing, like lived experience, narrative and storytelling.

The climate denialists out there have shown us that critiques of technology can be used for any agenda …they’ve leveraged this to create a situation where we are not making the changes to production that we should be making.
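One simple way to give a language to uncertainty is to report a prediction as a range over an ensemble of model runs rather than as a single number. The sketch below uses invented warming figures purely for illustration; it is not drawn from any real model output.

```python
import statistics

# Invented ensemble of warming projections (degrees C by 2100) from
# hypothetical model runs; the numbers are illustrative only.
ensemble = [2.1, 2.6, 3.0, 3.4, 2.8, 3.9, 2.4, 3.1]

mean = statistics.mean(ensemble)
spread = statistics.stdev(ensemble)

# Reporting a range communicates uncertainty without discarding the
# prediction: an uncertain estimate is still worth acting on.
print(f"Point estimate: {mean:.1f} C")
print(f"Likely range:   {mean - spread:.1f} to {mean + spread:.1f} C")
print(f"Full spread:    {min(ensemble):.1f} to {max(ensemble):.1f} C")
```

The point is not the arithmetic but the framing: the range is part of the prediction, not a reason to dismiss it.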

Tega Brain — Deep Swamp

RM: Your project Deep Swamp explores how a computer looks at and manages an environment, examining what a machine can and can’t see and how its training data shapes its view of reality. How did the three different wetlands in this installation demonstrate the problem of optimisation in relation to environmental engineering?

TB: Deep Swamp is a piece where there are a number of different wetlands in the gallery, and each wetland is managed by a software agent that is able to change the conditions in the wetland: create mist, change light levels and so on. Each system works by capturing photographs of its environment and analysing them with a computer vision model. Each model is trained on a different data set — one is trained on thousands of images of wetlands taken off Flickr, so its understanding of what a wetland should be is based on human photography of wetlands.

When we talk about bias: Flickr is an American company, and there’s a whole load of cultural factors that shape what perspective one sees on Flickr. The system takes a photo, assesses what percentage confidence it has that the image is a wetland based on its data set, and makes changes to try to get closer to that goal.

The second wetland has been trained only on images of landscape paintings from the history of western art; I took images from galleries and museums around the world. Its view of an environment is based solely on that canon of artists’ work. And the third one is trained to optimise for attention, so it looks for people within its photos, and if it sees people it reinforces those settings.
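As a rough sketch of the kind of classifier-driven feedback loop described above (not Deep Swamp’s actual code), the agent can be imagined as a simple hill-climber: perturb an actuator setting, ask the vision model how confident it is that the tank looks like a wetland, and keep the change only if confidence rises. The classifier below is a stub that returns a made-up confidence; in the installation this role is played by a model trained on Flickr photos, landscape paintings, or images of people.

```python
import random

def classify_wetland(settings):
    """Stub for the trained computer-vision model. A real agent would
    photograph the tank and return the model's confidence that the image
    is a wetland; here confidence is an invented function of the
    settings plus a little noise."""
    target = {"mist": 0.7, "light": 0.4, "water_level": 0.6}
    error = sum(abs(settings[k] - target[k]) for k in target)
    return max(0.0, 1.0 - error / 3.0) + random.uniform(-0.05, 0.05)

def adjust(settings):
    """Randomly perturb one hypothetical actuator, clamped to [0, 1]."""
    candidate = dict(settings)
    key = random.choice(list(candidate))
    candidate[key] = min(1.0, max(0.0, candidate[key] + random.uniform(-0.1, 0.1)))
    return candidate

# Hill-climbing loop: keep changes that raise the classifier's confidence
# that the tank "looks like a wetland", revert everything else.
settings = {"mist": 0.5, "light": 0.5, "water_level": 0.5}
confidence = classify_wetland(settings)
for _ in range(200):
    candidate = adjust(settings)
    score = classify_wetland(candidate)
    if score > confidence:
        settings, confidence = candidate, score

print(f"Final confidence: {confidence:.2f}, settings: {settings}")
```

Swapping the training data swaps the target the loop climbs towards, which is exactly the divergence the three wetlands make visible.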

Tega Brain — Deep Swamp

When the piece runs over a couple of months, the wetlands diverge into different settings based on these correlations; it’s a live experiment in how it plays out. Some of them live, some don’t, and with living things it’s hard to say why. Machine learning systems are not good at cause; they’re good at correlation, but they’re not a tool by which we can establish a clear cause-and-effect relationship. I’m not doing this as a scientific experiment: each time it’s in a different environment with different wetland species, and there are so many variables. For me this work is a public invitation to have conversations around the use of these sorts of methodologies: what they can and can’t do, and what their blind spots are.

RM: A systems view of the world, as reinforced by computational models, invokes possibilities of manipulation and control over our ‘ecosystems’. What are other options for ecological thinking, and how can we think-with the systems which seem to control us?

TB: It’s not a binary; we absolutely need the systems perspective, but there are also a lot of other ways that people relate to environments. I think the process and space of art values things like transformation through an encounter in space and aesthetic experience.

Anna Tsing’s research on matsutake mushrooms pushes against an engineering paradigm because these mushroom species cannot be cultivated. You can’t design conditions to produce them, so instead you have to forage for them, which means developing a literacy for their environment and for the other species they like to live with. That’s a really different way of engaging with an environment than an engineering perspective. One thing that I’ve recently started looking into is logics, or ways of engaging with environments, that are based on protocol rather than simulation.

For example, E.O. Wilson has this project called Half-Earth — his theory is that if we want a sustainable future, we should make sure that humans occupy only half of the earth, leaving the other half to other species. I think that’s a beautiful idea, as it doesn’t rely on a perfect view of the world or on optimising earth’s ecology; it just says: here’s the protocol, let’s stick to that. If we don’t think about systems, what other framework will we use? Tsing uses assemblage, which gives more space to indeterminacy, allowing for the fact that transformation always occurs in relationships. Whereas if you approach ecology through the perspective of science, the transformation of the observer is not accounted for. It is still a human-centred view: we are the ones looking at the world and making adjustments, rather than acknowledging that we are all looking at each other and are in constant transformation.

Half Earth — E.O. Wilson

It’s not a surprise that all the big foundations and oil companies are looking at carbon sequestration and geoengineering. They want to maintain the status quo; they’re not going to question themselves or their culture’s definition of what a good life is. How do we do that? Science and engineering are not equipped to think through that. It needs to be a conversation that includes everyone, and that includes other forms of knowing and engaging with the world.

Tega Brain will participate in the FIBER Festival 2020 hybrid conference on Friday 25 September. She will give a talk and take part in a Q&A. The conference will be live-streamed. Registration is open now: fiberfestival.nl

Tega Brain’s participation is supported by Het Nieuwe Instituut and is part of the online visitors programme 2020.

Interview conducted & edited by: Rhian Morris
