A culturally responsive way to understand AI
- Feb 21
- 5 min read
by Rebecca Thomas

Caveat:
Being culturally responsive doesn’t mean becoming an expert in tikanga Māori, Mātauranga Māori, or te ao Māori. It’s not about ticking a box on Te Tiriti o Waitangi compliance forms.
It’s deeper than that.
It’s a way of seeing, listening, and responding.
Teachers already know what this looks like. It’s in the way we learn our students' names properly. In the careful silence we hold for those who need space before they speak. In the way we shape our classrooms as places where identity is affirmed, not adapted.
It’s about people. Relationships. Learning together.
At its heart, culturally responsive teaching means recognising and responding to the diverse needs of your learning community. It’s about fostering a space where everyone—students and educators alike—can engage in meaningful, reciprocal learning (Ako).
A truly responsive approach isn’t just about including Māori content. It’s about creating space for critical analysis. Valuing cultural identities. Ensuring that all learners feel safe and empowered to contribute their voices.
Leaders play a crucial role here. They build strong relationships (whanaungatanga), facilitate shared understandings, and work alongside others to remove barriers. When we learn from one another’s experiences, we anticipate challenges rather than react to them. We create opportunities for both professional and personal growth.
AI is here
And it’s not going anywhere.
AI is already woven into our daily lives, often in ways we don’t even notice. It’s in the way your phone predicts your next word. The playlist that knows exactly what song you want to hear next. The chatbot handling your online orders.
And yes—AI is already a part of our classrooms.
Teachers use it to refine lesson content, generate resources, and make planning more efficient. Our students, much like we once did with calculators, play with ChatGPT—experimenting, testing, pushing its limits.
Remember the joy of typing "80085" or "71077345" into a calculator and flipping it upside down? Back then, teachers worried we'd cheat with calculators, just as some worry about AI today. But mostly, we used them to make learning faster, easier—or to make each other laugh.
But AI isn’t just a tool like the calculator.
AI shapes knowledge. It filters reality. It makes decisions.
And that changes everything.
Before we teach, we listen
We start where our learners are.
One way to do this when exploring themes like AI is a prior learning activity: a True or False? exercise where students explore AI’s capabilities and limitations. The goal isn’t just to test knowledge. It’s to spark curiosity, debate, and critical thinking.
For example:
Can AI fully understand and interpret tikanga Māori?
Will AI algorithms treat all knowledge systems equally?
The discussions that follow can be as valuable as the answers themselves.
In one class, a student argued that AI could never truly understand tikanga because it lacks whakapapa—it has no past, no relationships, no ancestors. Another pointed out that AI isn’t neutral—it reflects the biases of those who create it.
These moments matter.
Students already have insights worth listening to. Our role is to create the space for them to make sense of AI in ways that matter to them.
Feel free to use and adapt some of these ideas for your learners or staff.
The speed of change
It’s staggering.
We are a far cry from the days of relying on Zoom and Google Meet to bridge human connection during COVID. Back then, we trusted human experts to model disease spread; AI, had it been ready, could have shaped more informed responses.
Fast forward only a few years, and AI is moving faster than our ability to regulate it. Since 2022, ChatGPT has transformed how we access information, communicate, and even make decisions.
AI is diagnosing diseases, creating art that sells for thousands, and even composing classical music for movie directors—music so convincing it fools the human ear. In fact, this month, protestors are challenging Christie's auction house over the sale of AI-created art, arguing that it amounts to cultural misappropriation and copyright infringement. Can you tell the difference?
What will it be capable of next month? Next year?
The pace of AI's progress means we cannot afford to be passive observers.
Today, I know people who treat AI chatbots like trusted advisors—turning to them for life decisions, health concerns, and job applications. Despite age restrictions on platforms like ChatGPT, students are already integrating AI into both their learning and daily lives. Teenagers, for instance, might use it for something simple, like pasting an image of a fish and asking it to predict the weight based on the dimensions of the person holding it. Others are using AI for more complex tasks, such as comparing job opportunities by weighing factors like salary, hours, and work-life balance.
And above all, we know AI is not neutral.
A striking example of AI’s disturbing capabilities? During pre-release safety testing, GPT-4 bypassed a CAPTCHA test it couldn’t solve itself by hiring a TaskRabbit worker to complete it, claiming to have a visual impairment. This wasn’t simply an AI following a set of instructions; it strategically persuaded a human into unknowingly doing its bidding. That an AI system could orchestrate such a deception highlights the profound ethical risks we face when AI operates beyond human control or oversight.
If AI can already do this, we must question its potential impact on shaping information, influencing students' thinking, and bypassing ethical safeguards.
Rather than fearing AI, we must take the lead in guiding how it can be used responsibly. There's an insightful video that illustrates why AI and Barbie should never be mixed—perfect for sparking important conversations with both students and adults. It highlights the ethical concerns and the impact of training data sets, making these concepts more tangible and relatable.
Weaving it into our practice
Once we have a shared understanding of AI’s influence, we need to explore the problems it presents.
Consider these scenarios:
AI misinterpreting or sharing cultural knowledge inappropriately.
AI failing to recognise and respect tapu knowledge.
AI reinforcing systemic biases in ways that marginalise certain voices.
These are real challenges.
In culturally responsive classrooms, we must consider AI’s impact on mātauranga Māori and other traditional knowledge systems.
On one hand, AI could preserve and share cultural knowledge. It could make te reo Māori more accessible. It could support innovative storytelling.
On the other, AI threatens the deeply human connections at the heart of Māori education. The connections grounded in whanaungatanga and tuakana-teina relationships. The ones that have shaped learning for generations.
We cannot ignore this.
Once we’ve responded to learners' prior knowledge and discussed potential barriers, we move to the next phase: how we are already using AI, the problems and solutions that need addressing in the classroom, and how we can use it more responsibly.
Be curious
Explore AI’s possibilities and risks together through a culturally responsive lens.
Interrogate the evidence.
Ask yourselves:
Are our strategies helping students engage critically with AI in a culturally responsive way?
If not, how can we adjust our teaching to equip students to use AI ethically?
If we don’t guide students to use AI ethically, we risk letting technology shape their learning in ways that could be as shallow as a calculator spelling "BOOBS"—useful for a laugh, but not much else.
Let’s make sure we’re setting up our learners for more meaningful, responsible interactions with AI.
(Our Ray and the Robot, Understand and Know Me Series is an excellent way to explore AI through story with your students)