
Beyond the blame: untangling UTME ‘mass failure’ mystery and charting way forward

Candidates share 'harrowing' experiences during 2023 UTME

It’s that time of the year again. UTME results season. As you probably already know, 78% of students who sat for the 2025 UTME scored below 200. And, as always, the nation is outraged. Everyone’s out to find answers—or, at the very least, someone or something to blame. Teachers. Technology. Exam timing. Social media. Parents. You name it.

This blame game is nothing new. I wrote about it last year when 77% of the 2024 candidates also scored below 200. In that article, I predicted that unless we addressed the deeper, systemic issues behind these results, we’d be right back here again in 2025. There are times you’re glad to be proven right. This is not one of them.

So, rather than repeating last year’s argument that we need to go deeper in addressing this, this time I want to focus on what we can actually do about it. I’m offering a starting point: a first step in tackling a very complex issue.

One thing you’ll quickly notice: every year, when you ask why students fail, you get wildly different answers depending on who you’re asking. Every time I’ve written about this publicly, responses pour in—each one delivered with a tone of absolute certainty about what the “real problem” is. Yet, those confident answers often contradict one another. Parents blame teachers. Teachers blame the system. Society blames parents. And on it goes. That alone should tell us something: the root cause of this recurring mass failure remains a mystery.


And that’s exactly where we need to start: by trying to untangle the mystery. Nothing brings clarity to a mystery quite like data. Comprehensive, well-analysed data. We need a deeper, more granular look into what really happens each year. A once-a-year release of a final score summary is not data. That’s a scoreboard. And it’s clearly getting us nowhere.

Yes, the failure problem is likely a complicated, multi-layered issue involving many stakeholders at once. That’s exactly why detailed data is essential. Each stakeholder needs to know which part of the problem is theirs to own, not based on public opinion or blame-swapping, but based on actual insights that only proper data can provide. With that kind of data, stakeholders can see clearly where they fit in and what specific role they need to play to turn things around.

For example, imagine if teachers had access to UTME results data broken down by subject, and even further into specific topic categories. They could see exactly where students are stumbling. How helpful would that be for lesson planning and exam prep?


Or imagine UTME results broken down by state and local government. Failure hotspots could be mapped, compared with high-performing areas, and the differences between them examined. What do the successful areas have that the failure hotspots don’t? That kind of information would be incredibly useful for government officials and education administrators when it comes to planning, budgeting, and infrastructure development.

What if we could have analytics that compare exam centre results with the technological and infrastructural conditions at those centres? We might uncover how much of a role poor facilities or system glitches play in student performance. With proper data analysis and research, the potential to pinpoint and address the real issues is limitless.
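To make the idea concrete, here is a minimal, purely illustrative sketch of what a first pass at this kind of analysis could look like, assuming a hypothetical candidate-level table with columns such as candidate_id, state, lga, subject and score. The file name, the column names and the per-subject threshold below are assumptions for illustration, not JAMB’s actual data format.

```python
# Purely illustrative sketch of the granular analysis proposed above.
# The file name, column names and thresholds are assumptions, not JAMB's real data layout.
import pandas as pd

# Hypothetical candidate-level results: one row per candidate per subject.
results = pd.read_csv("utme_results.csv")

# 1. Subject-level view for teachers: average score and share of weak scores per subject.
by_subject = results.groupby("subject")["score"].agg(
    mean_score="mean",
    weak_share=lambda s: (s < 50).mean(),  # assumed threshold for a weak subject score
)

# 2. Geographic view for planners: total score per candidate, then failure hotspots by LGA.
totals = results.groupby(["state", "lga", "candidate_id"])["score"].sum().reset_index()
hotspots = (
    totals.groupby(["state", "lga"])["score"]
    .agg(mean_total="mean", below_200_rate=lambda s: (s < 200).mean())
    .sort_values("below_200_rate", ascending=False)
)

print(by_subject)
print(hotspots.head(10))  # the ten local governments with the highest share scoring below 200
```

A sketch like this would already let a teacher see which subjects drag candidates down, and let an administrator see which local governments need attention first; richer versions could fold in exam-centre infrastructure records once they are linked to results.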

So the big question becomes: where do we get this kind of data?

The answer is simple: JAMB already has it. It holds all the raw data. What’s needed now is the analytical capacity to turn that data into real insights. If JAMB doesn’t currently have the technical talent to do this well, that points to a hiring need. Alternatively, it can seek external support.


Here’s one idea: data analytics is a fast-growing field among Nigerian youths. What if we created fellowship or internship programs specifically focused on using data to understand UTME trends? The Nigerian Economic Summit Group already runs something similar: the NESG fellowship, which drafts young people with policy-making expertise to support the NESG team. A comparable initiative focused on education data could make a real difference. If we’re serious, this can be done.

We can’t keep repeating this annual cycle of failure, outrage and forgetfulness. It’s time to break it. It’s time to act. And this could be the first real step in that direction.

Oluwatoyin is a STEM education doctoral researcher, social impact founder and education policy advocate. She writes from Nigeria and the United States. She can be reached at [email protected] or on LinkedIn here.



Views expressed by contributors are strictly personal and not of TheCable.
