
AI affects everyone – including Indigenous people. It’s time we have a say in how it’s built

  • Written by Tamika Worrell, Senior Lecturer in the Department of Critical Indigenous Studies, Macquarie University
The Conversation

Since artificial intelligence (AI) became mainstream over the past two years, many of the risks it poses have been widely documented. The technology fuels deepfake pornography, threatens personal privacy and accelerates the climate crisis; some people believe it could even lead to human extinction.

But some risks of AI are still poorly understood. These include the very particular risks to Indigenous knowledges and communities.

There’s a simple reason for this: the AI industry and governments have largely ignored Indigenous people in the development and regulation of AI technologies. Put differently, the world of AI is too white.

AI developers and governments need to urgently fix this if they are serious about ensuring everybody shares the benefits of AI. As Aboriginal and Torres Strait Islander people like to say, “nothing about us, without us”.

Indigenous concerns

Indigenous peoples around the world are not ignoring AI. They are having conversations, conducting research and sharing their concerns about the current trajectory of AI and related technologies.

A well-documented problem is the theft of cultural intellectual property. For example, users of AI image generators such as DeepAI can produce artworks in mere seconds that mimic Indigenous art styles and stories.

This demonstrates how easily someone using AI can misappropriate cultural knowledges. The generated images are assembled from large datasets of publicly available imagery to create something new, but they miss the storying and cultural knowledge present in our art practices.

AI technologies also fuel the spread of misinformation about Indigenous people.

The internet is already riddled with misinformation about Indigenous people. The long-running Creative Spirits website, which is maintained by a non-Indigenous person, is a prominent example.

Generative AI systems are likely to make this problem worse. They often conflate us with other Indigenous peoples around the world. They also draw on inappropriate sources, including Creative Spirits.

During last year’s Voice to Parliament referendum in Australia, “no” campaigners also used AI-generated images depicting Indigenous people. This demonstrates the role of AI in political contexts and the harm it can cause to us.

Another problem is the lack of understanding of AI among Indigenous people. Some 40% of the Aboriginal and Torres Strait Islander population in Australia don’t know what generative AI is. This reflects an urgent need to provide relevant information and training to Indigenous communities on the use of the technology.

There is also concern about the use of AI in classroom contexts and its specific impact on Indigenous students.

Looking to the future

Hawaiian and Samoan scholar Jason Lewis says:

We must think more expansively about AI and all the other computational systems in which we find ourselves increasingly enmeshed. We need to expand the operational definition of intelligence used when building these systems to include the full spectrum of behaviour we humans use to make sense of the world.

Key to achieving this is the idea of “Indigenous data sovereignty”. This would mean Indigenous people retain sovereignty over their own data, in the sense that they own and control access to it.

In Australia, a collective known as Maiam nayri Wingara offers important considerations and principles for data sovereignty and governance. They affirm Indigenous rights to govern and control our data ecosystems, from creation to infrastructure.

The National Agreement on Closing the Gap also affirms the importance of Indigenous data control and access.

This is reaffirmed at a global level as well. In 2020, a group of Indigenous scholars from around the world published a position paper laying out how Indigenous protocols can inform ethically created AI. This kind of AI would centralise the knowledges of Indigenous peoples.

In a positive step, the AI guardrails recently proposed by the Australian government highlight the importance of Indigenous data sovereignty.

For example, the guardrails include the need to ensure additional transparency and make extra considerations when it comes to using data about or owned by Aboriginal and Torres Strait Islander people, to “mitigate the perpetuation of existing social inequalities”.

[Image: Vote No and Vote Yes signs on a footpath. Opponents of the Indigenous Voice to Parliament used artificial intelligence to create online ads depicting Indigenous people during last year's referendum debate. Mick Tsikas/AAP]

Indigenous Futurisms

Grace Dillon, a scholar from a group of North American Indigenous people known as the Anishinaabe, first coined the term “Indigenous Futurisms”.

Ambelin Kwaymullina, an academic and futurist practitioner from the Palyku nation in Western Australia, defines it as:

visions of what-could-be that are informed by ancient Aboriginal cultures and by our deep understandings of oppressive systems.

These visions, Kwaymullina writes, are “as diverse as Indigenous peoples ourselves”. They are also unified by “an understanding of reality as a living, interconnected whole in which human beings are but one strand of life amongst many, and a non-linear view of time”.

So how can AI technologies be informed by Indigenous ways of knowing?

A first step is for industry to involve Indigenous people in creating, maintaining and evaluating the technologies – rather than asking them retrospectively to approve work already done.

Governments also need to do more than highlight the importance of Indigenous data sovereignty in policy documents. They need to meaningfully consult with Indigenous peoples to regulate the use of these technologies. This consultation must aim to ensure ethical AI behaviour among organisations and everyday users that honours Indigenous worldviews and realities.

AI developers and governments like to claim they are serious about ensuring AI technology benefits all of humanity. But unless they start involving Indigenous people more in developing and regulating the technology, their claims ring hollow.


Read more https://theconversation.com/ai-affects-everyone-including-indigenous-people-its-time-we-have-a-say-in-how-its-built-239605
