AI not cheating partner, but ally in modern learning 
A graphical representation of artificial intelligence. PHOTO/Pexels

In his thought-provoking piece, ‘AI aiding cheating in higher education’ (PD Wikendi, Saturday, July 12, 2025), Dr Wycliffe Osabwa raises a critical alarm about the misuse of artificial intelligence by students who pass off AI-generated work as their own.

As a lecturer at Alupe University, he’s seen firsthand the suspiciously flawless assignments that betray a reliance on tools like ChatGPT, often lacking the depth or context he expects from his students.

His concerns about academic dishonesty and the risk of students outsourcing their thinking are valid and echo a broader unease among educators.

But while I share his desire for intellectual honesty, I believe it’s time to reframe AI not as a threat to higher education but as a transformative ally – one that, when guided by human ingenuity, can elevate learning to new heights. 

Osabwa’s observation that AI can produce technically correct but generic responses is spot-on.

I’ve seen it myself: a student’s essay that reads like it was written by a machine, polished but missing the spark of personal insight.

He’s also right to warn about AI’s limitations, like its potential to regurgitate misinformation from flawed datasets – a problem that can mislead novice learners.

These are real challenges, and the temptation for students to take shortcuts isn’t new; AI just makes it easier. But banning or fearing AI isn’t the answer.

Instead, we should embrace it as a tool that, when used responsibly, amplifies critical thinking and prepares students for a world where AI is ubiquitous. 

Consider how AI is already reshaping education for the better. Tools like Khan Academy use AI to tailor lessons to individual students, adjusting pace and content to match their needs.

For students in under-resourced areas, this can bridge gaps that traditional classrooms can’t.

Imagine a student in a rural college using AI to access high-quality explanations of complex concepts or to refine a rough draft by comparing it with AI-suggested edits.

This isn’t cheating – it’s learning through collaboration with a tool.

Dr Osabwa himself suggests teaching students to use AI for feedback or editing, and I agree. By inputting their own ideas and critically assessing AI's output, students hone their analytical skills, much like a writer revises a draft with an editor's notes. 

The key is teaching students to see AI as a partner, not a replacement. Prompt engineering – crafting clear, precise instructions for AI – is a skill Osabwa highlights, and it's one that's increasingly vital.

A student who learns to ask AI for specific, context-driven insights is practising critical thinking, not bypassing it.

For example, instead of asking AI to “write my essay,” a student could prompt it to “suggest three arguments for a policy based on Kenyan cultural values”.

The student then evaluates and builds on those suggestions, weaving in their own perspective.

This process mirrors how professionals in fields like journalism or marketing use AI to brainstorm or refine ideas while keeping human judgment at the core. 

Osabwa’s call for context-specific assignments is a brilliant strategy to curb misuse. Questions rooted in local experiences or classroom discussions are harder for AI to answer accurately, pushing students to engage deeply.

I’d add that tools like Turnitin are evolving to detect AI-generated content, giving educators a way to uphold integrity without stifling innovation.

But the real solution lies in redesigning education to embrace AI’s potential. Group discussions, in-person exams, or reflective essays that demand personal insight can coexist with AI tools that help students polish their work or explore new ideas. 

The writer comments on topical issues