Haunting The Binary Code
In the age of artificial intelligence, where the digital simulacra converge with the hyperreal, native epistemologies emerge as elusive specters haunting the binary code. Within this landscape of algorithmic determinism and techno-capitalist hegemony, the wisdom of indigenous ways of knowing offers a disruptive force — a challenge to the very foundations of our hyper-mediated existence.

Consider the phenomenon of artificial intelligence (AI), where machines mimic human cognition and decision-making processes. In this realm of digital replication, indigenous epistemologies present a stark contrast — a reminder of the sacred interconnection between all beings, rooted in the wisdom of the land, the rhythms of ceremony, and the stories of ancestors. Yet, as AI systems proliferate, they perpetuate and amplify existing inequalities encoded in their training data, reflecting and reinforcing the biases of their human creators.
Take, for example, the use of AI in predictive policing, where algorithms analyze historical crime data to anticipate future criminal activity. In this scenario, indigenous communities, already marginalized by colonial legacies of over-policing and surveillance, become further disenfranchised as they are disproportionately targeted and criminalized. Trained on biased data, these systems perpetuate systemic racism, entrench cycles of injustice, and erode the trust between communities and law enforcement agencies.
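To make that feedback loop concrete, the sketch below is a deliberately simplified simulation, not a description of any deployed system: the area names, starting counts, and incident rate are all hypothetical. Patrols are allocated in proportion to recorded history, and only patrolled incidents enter the record, so an initial disparity in the data reproduces itself year after year even though the underlying rates are identical.

```python
# Simplified feedback-loop simulation (illustrative only): two areas with
# identical underlying incident rates, but area "A" begins with more
# recorded incidents. Each round, patrols follow the recorded history,
# and only patrolled incidents are observed and added to the record.

import random

random.seed(0)

TRUE_RATE = 0.10             # identical underlying incident rate in both areas
recorded = {"A": 5, "B": 1}  # area A starts with more recorded incidents
patrol_share = {"A": 0.5, "B": 0.5}

for year in range(10):
    # "Prediction": allocate patrols in proportion to recorded history.
    total = recorded["A"] + recorded["B"]
    patrol_share = {area: recorded[area] / total for area in recorded}

    # Incidents occur at the same true rate everywhere, but only the
    # patrolled fraction is observed and enters the record.
    for area in recorded:
        incidents = sum(random.random() < TRUE_RATE for _ in range(1000))
        observed = int(incidents * patrol_share[area])
        recorded[area] += observed

print(recorded)      # area A's record dwarfs area B's
print(patrol_share)  # patrols stay concentrated on A despite equal true rates
```

The point is not the arithmetic but the structure: the system never observes the lightly patrolled area well enough to correct its own skew, so the initial disparity is carried forward indefinitely.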
Moreover, the relentless march of AI-driven automation threatens to exacerbate the displacement of indigenous peoples from their ancestral lands. As traditional livelihoods are disrupted by technological advancements, communities face economic precarity and cultural erosion, further weakening their connection to the land and their sense of identity. In this context, indigenous epistemologies offer a framework for resilience — a call to reclaim sovereignty over land, language, and culture in the face of technological colonization.
Yet, within the realm of AI, the boundaries between reality and simulation blur, as the digital simulacra colonize our consciousness and shape our perception of the world. In this hyperreal landscape, native epistemologies offer a pathway to reconnection — a reminder of the inherent value of lived experience, relationality, and reciprocity.
But how do we navigate this terrain of hyperreality, where the lines between truth and fiction dissolve like pixels on a screen? How do we confront the specter of AI-driven technocracy, while remaining attuned to the subtle nuances of indigenous ways of knowing?
One approach is to engage in what might be called “hyperreality hacking”: a mode of subversion, indebted to Baudrillard’s account of simulation, that destabilizes dominant narratives and disrupts the logic of the simulacrum. For researchers and creatives, this entails interrogating the biases embedded within AI systems, challenging the techno-capitalist logics that underpin their development, and reimagining alternative futures rooted in indigenous ontologies.
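One concrete entry point for that interrogation is a simple audit of a system’s outputs: comparing error rates across groups. The sketch below assumes a generic binary classifier; the group labels, field names, and sample records are hypothetical placeholders, not any particular dataset or vendor API.

```python
# Minimal bias-audit sketch: compare false positive rates across groups
# in a classifier's output. Field names ("group", "label", "prediction")
# are hypothetical placeholders.

from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of dicts with 'group', 'label' (0/1), 'prediction' (0/1)."""
    fp = defaultdict(int)   # false positives per group
    neg = defaultdict(int)  # actual negatives per group
    for r in records:
        if r["label"] == 0:
            neg[r["group"]] += 1
            if r["prediction"] == 1:
                fp[r["group"]] += 1
    return {g: fp[g] / neg[g] for g in neg if neg[g]}

# A large gap between groups is one inspectable signal that the system
# encodes the disparities discussed above.
sample = [
    {"group": "A", "label": 0, "prediction": 1},
    {"group": "A", "label": 0, "prediction": 0},
    {"group": "B", "label": 0, "prediction": 0},
    {"group": "B", "label": 0, "prediction": 0},
]
print(false_positive_rates(sample))  # {'A': 0.5, 'B': 0.0}
```

An audit of this kind does not by itself decolonize a system, but it turns an abstract claim about encoded bias into a measurable, contestable artifact.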
Moreover, we must recognize that our actions within the realm of AI are inherently political, embedded within structures of power and privilege that shape the production and deployment of technology. Thus, we must commit to a praxis of decolonization — a process of unlearning, relearning, and co-creating knowledge in collaboration with indigenous communities.
References
- Baudrillard, Jean. Simulacra and Simulation. University of Michigan Press, 1994.
- Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press, 2019.
- Kukutai, Tahu, and John Taylor, eds. Indigenous Data Sovereignty: Toward an Agenda. ANU Press, 2016.
- TallBear, Kim. “Artificial Knowing: Gender and the Thinking Machine.” Signs: Journal of Women in Culture and Society 40, no. 2 (2015): 395–417.