Higher education is undergoing a noticeable transformation in the age of artificial intelligence (AI). Universities are grappling with fundamental questions: What does it mean to educate students in an AI-driven world? How can universities ensure that their teaching keeps pace with evolving AI technologies? And how can they ensure that education remains relevant in this changing landscape?
At the same time, universities face the challenge of making education more inclusive, accessible, diverse, and, importantly, decolonised. The rise of AI adds another layer of complexity, as it reshapes how knowledge is produced, shared, and valued within education.
While AI offers exciting possibilities for rethinking teaching and learning, it also raises critical questions. Generative AI tools have the potential to transform the way universities deliver education and support new forms of learning. Yet universities must also pause and reflect on how these technologies may reinforce the very systems that decolonial education seeks to challenge.
There is a growing concern that the design, development, and evolution of AI are grounded in historical and contemporary power structures shaped by colonialism. In many cases, AI systems reproduce unequal social, cultural, economic, political, and environmental dynamics that disadvantage certain regions and communities.
This blog series contributes to the urgent and ongoing conversation about resisting the coloniality of AI. Across five posts, contributors explore different dimensions of this issue and reflect on how higher education can respond critically and responsibly.
Professor Leon Tikly opens the series by examining how AI is far from a neutral technology. He argues that AI systems are shaped by global colonial power structures and often reproduce existing inequalities and biases. His reflections highlight why the project of decolonising AI is both necessary and meaningful for higher education.
In the second post, Professor Foluke Adebisi offers a thought-provoking exploration of how AI can reinforce the same racist and misogynistic structures that have historically shaped society. She calls for reclaiming our shared humanity in an increasingly automated world.
Professor Richard Watermeyer then turns attention to the economic logic that underpins both the development of AI and the operation of modern universities. He calls for greater transparency, stronger independent governance of institutions, and the development of critical AI literacy as essential steps toward meaningful decolonisation.
In the fourth blog, Dr Dave Lawson reflects on the growing presence of generative AI tools in teaching and learning. While recognising their potential, he cautions against uncritical use, as AI-generated outputs are rarely neutral and often amplify certain voices and knowledge systems over others. Dave suggests viewing AI as a “warped mirror”: one that reveals existing knowledge hierarchies and, in doing so, offers an opportunity to encourage deeper learning and reflection.
Finally, Dr Bunmi Isaiah Omodan explores how generative AI might support efforts to decolonise learning. He argues that by expanding access to diverse learning materials and encouraging collaborative learning practices, AI can be used as a critical learning partner that may serve to broaden perspectives.
Together, these contributions invite readers to pause, reflect, and engage critically with the ways AI is shaping education (and society), and to envision how we might develop, design, or use AI in more critical and decolonially informed ways.
View the other posts in this series here: https://bilt.online/category/decolonising/decolonising-ai/