A new exhibition explores invisible data, from facial algorithms to satellite tracking as a return to Country
- Written by Kim Munro, Lecturer, University of South Australia
Review: Invisibility, MOD museum, Adelaide
Disinformation, algorithms, big data, care work, climate change, cultural knowledge: they can all be invisible.
In her New York Times bestseller, Weapons of Math Destruction (2016), subtitled “how big data increases inequality and threatens democracy”, mathematician and data scientist Cathy O’Neil unpacks the elusive algorithms of our everyday lives and how accurate, fair or biased they might be.
Algorithms hide behind the assumed objectivity of maths, but they very much contain the biases, subjective decisions and cultural frameworks of those who design them. Because scant detail is released about how these algorithms are created, O’Neil describes them as “inscrutable black boxes”.
This opacity is intentional.
In one of the upstairs galleries at the spacious MOD, we are greeted in large text as we enter: “what do algorithms think about you?”
Can an algorithm think, we ask? And, if so, what informs the decisions it makes about us?
Biometric Mirror is the work of science fiction artist Lucy McRae and computer scientists Niels Wouters and Frank Vetere. They built an algorithm to judge our personalities by asking 33,000 people to look at photographs of faces and come up with possible personality traits.
We don’t see who the photos are of or who is doing the evaluating – and therefore we don’t know what biases might be reproduced.
You are invited to gaze into a mirror which scans your face. From this scan, the algorithm creates a profile of your age, gender and personality, which appears as statistical data overlaid on your reflection.
When I look into the mirror, I am told I am neither trustworthy nor emotionally stable. The algorithm under-guesses my age by a few years, and I score highly for intelligence and uncertainty – an unhelpful combination.
Despite my doubts about the algorithm, I notice myself focusing on the more favourable data.
In this context, the data is benign. But facial recognition technology has been used to surveil and monitor activists, and has been responsible for thousands of inaccurate identifications by police in the UK.
Using data to illuminate cultural knowledge
In one of the more impressive works in the exhibition, contemporary data visualisation is used to illustrate Aboriginal forms of knowing and the intrinsic relationship between spatial awareness, Country and kinship.
Ngapulara Ngarngarnyi Wirra (Our Family Tree) is a collaboration between Angie Abdilla from design agency Old Ways, New, Adnyamathanha and Narungga footballer Adam Goodes and contemporary artist Baden Pailthorpe.
In every AFL game Goodes played, his on-field movements were recorded via satellites, which connected with a tracking device in the back of his jumper. These 20 million data points were then fused with data scans of a Red River Gum, or Wirra, to form an impressive data visualisation projected onto two large screens in a darkened gallery.
Here, Goodes’s data is returned to Country to form part of the roots of the tree as well as the swirling North and South Winds of his ancestors. The data is also translated into sound and amplified, inviting us to listen to what would otherwise be inaudible.
In a small room between the screens – or within the tree – drone footage of Adnyamathanha Country (the Flinders Ranges) plays alongside a retelling of the creation story in Adnyamathanha language.
What results is a synthesis of traditional Aboriginal knowledge with cutting-edge technology, revealing different ways of sensing space and time.
Read more: The land we play on: equality doesn’t mean justice
The power of the invisible
While it’s easy to focus on how the works in Invisibility use and expose technology, a few other exhibits down the corridors and hanging from the ceiling at MOD flesh out the concept.
Women’s Work celebrates the leadership of South Australian Aboriginal women with striking black and white photographs. Tucked away down the hall on the second level is Fostering Ties, a series of images drawing attention to children in foster care.
This exhibition foregrounds invisibility as a way to contend with our own blind spots, knowledge systems, biases and cultural frameworks.
What is invisible to us may not be to those from demographic, cultural or language groups that differ from ours.
Drawing attention to the invisible encourages us to shift our perspective. If we don’t have the answer to a problem, maybe another cultural perspective – or life form – does.
Invisibility is at MOD until November 2022.