Measuring What Matters
Shifting from Product to Process in the Age of AI
There is a quiet but profound shift underway in education. For decades, assessment has centred on products: the essay, the exam script, the final artefact. These outputs have served as proxies for learning, standing in for what we assume students know and can do.
Yet with the rise of generative AI, that assumption is increasingly fragile. When high-quality outputs can be produced with minimal human input, the traditional alignment between product and learning begins to break down.
The question is no longer simply “What did the student produce?”
It is becoming “What did the student do to produce it?”
Why effort and process now matter more than ever
At the heart of this shift is a construct that education has historically struggled to assess: effort.
Effort is not directly observable. It sits behind the scenes, entangled with:
time spent,
cognitive engagement,
persistence through difficulty,
and the strategies used to solve problems.
Research has consistently shown that performance reflects a combination of ability and effort, yet our assessment systems rarely distinguish between the two. A polished submission may signal deep understanding, or it may reflect tool-assisted production with minimal engagement.
In an AI-enabled environment, this ambiguity is amplified.
If assessment remains product-focused, we risk evaluating:
tool proficiency over learning,
output quality over intellectual engagement,
and in some cases, automation over authorship.
The limits of traditional assessment
Traditional assessment methods assume a relatively stable relationship between:
effort → learning → product
AI disrupts this chain.
Students can now:
generate essays, code, or reports rapidly,
iterate at a scale previously impossible,
and bypass much of the cognitive struggle that supports learning.
This does not make AI inherently problematic. It simply means that products alone no longer provide sufficient evidence of learning.
What does it mean to assess process?
A process-oriented approach shifts attention from what was produced to how it was produced.
This includes examining:
1. Time and engagement
How long did the student engage with the task?
Was the work distributed over time or completed in a single burst?
Digital environments now allow us to capture:
time-on-task,
interaction logs,
revision histories.
These are imperfect, but they offer insight into learning behaviours rather than just outcomes.
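As an illustrative sketch only (not tied to any particular platform), simple indicators such as the number of working sessions and total active time can be derived from timestamped interaction events. The event data, the 30-minute session gap, and the function name here are all assumptions for illustration.

```python
from datetime import datetime, timedelta

def session_stats(timestamps, gap=timedelta(minutes=30)):
    """Group timestamped interaction events into work sessions.

    Events separated by more than `gap` start a new session.
    Returns the number of sessions and the total active time.
    """
    times = sorted(timestamps)
    if not times:
        return 0, timedelta(0)
    sessions = 1
    active = timedelta(0)
    for prev, cur in zip(times, times[1:]):
        delta = cur - prev
        if delta > gap:
            sessions += 1    # long pause: a new working session begins
        else:
            active += delta  # time within a session counts as engagement
    return sessions, active

# Hypothetical log: edits spread over two sittings on different days
events = [
    datetime(2024, 5, 1, 9, 0),
    datetime(2024, 5, 1, 9, 20),
    datetime(2024, 5, 1, 9, 45),
    datetime(2024, 5, 2, 14, 0),
    datetime(2024, 5, 2, 14, 25),
]
n, total = session_stats(events)
print(n, total)  # 2 sessions, 1:10:00 of active time
```

A metric like this says nothing on its own about quality of thinking; it simply distinguishes work distributed over time from a single last-minute burst.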
2. Iteration and development
Learning is rarely linear. It involves:
false starts,
revisions,
refinements.
Assessing process means valuing:
drafts,
version histories,
evidence of improvement.
A final submission becomes one point in a trajectory, not the sole object of evaluation.
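One way to treat the submission as a point in a trajectory (a sketch, assuming drafts are available as plain text) is to quantify how much each version changed using a standard sequence-similarity measure:

```python
import difflib

def revision_trajectory(drafts):
    """Return the similarity ratio between each consecutive pair of drafts.

    Ratios near 1.0 indicate small refinements; lower ratios indicate
    substantial rewriting between versions.
    """
    return [
        difflib.SequenceMatcher(None, a, b).ratio()
        for a, b in zip(drafts, drafts[1:])
    ]

# Hypothetical draft history for illustration
drafts = [
    "Assessment should focus on products.",
    "Assessment should focus on process, not only products.",
    "Assessment should focus on process as well as products.",
]
print(revision_trajectory(drafts))
```

The numbers themselves matter less than the pattern: a history of genuine revision looks different from a single pasted-in final version.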
3. Cognitive effort
Effort is not just time; it is mental work.
Indicators include:
tackling challenging problems,
engaging with feedback,
demonstrating conceptual change.
Cognitive load theory reminds us that meaningful learning requires investment of mental effort, not avoidance of it.
4. Decision-making and justification
Students should be able to explain:
why they made particular choices,
how they evaluated alternatives,
where they used tools, including AI, and why.
This moves assessment towards thinking, not just producing.
Designing for visible effort
One of the most important insights from the research is this:
Effort is best assessed when it is made visible through design.
Rather than trying to detect effort after the fact, we can structure tasks so that effort leaves a trace.
Practical strategies include:
Staged submissions
Break tasks into components:
proposal,
draft,
feedback response,
final submission.
Each stage captures part of the process.
Process artefacts
Require students to submit:
planning notes,
annotated drafts,
reflection statements,
prompt logs when using AI.
This aligns with the principle that evidence of authorship is more robust than attempts at detection.
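A prompt log need not be elaborate. A minimal structured record, sketched below with field names that are illustrative assumptions rather than any standard, could capture the prompt, the tool, and what the student did with the output:

```python
import json

# Illustrative prompt-log entry: field names are assumptions, not a standard.
entry = {
    "timestamp": "2024-05-01T09:15:00Z",
    "tool": "generic-llm",
    "prompt": "Suggest counterarguments to my thesis on assessment.",
    "response_used": "partially",  # e.g. accepted / rejected / partially
    "student_note": "Kept point 2; rejected point 1 as off-topic.",
}
print(json.dumps(entry, indent=2))
```

The student note is the important part: it records a judgement about the AI output, which is exactly the critical engagement a process-oriented assessment wants to see.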
In-class or synchronous checkpoints
Short, controlled activities:
discussions,
oral explanations,
quick problem-solving tasks.
These provide snapshots of unassisted thinking.
Reflective components
Ask students to articulate:
what they found difficult,
how they overcame challenges,
what they would do differently.
Reflection reveals metacognition, a key component of effortful learning.
Rethinking grading: should effort count?
A common concern is whether effort should be graded directly.
The literature suggests caution.
Effort alone is not learning. However, process indicators can provide evidence of learning behaviours that lead to understanding.
A more balanced approach is to:
assess quality of thinking and development,
reward engagement with the process,
and ensure that outcomes still matter.
In other words, effort is not a replacement for achievement, but a lens through which achievement is interpreted.
AI as a partner in process, not a shortcut to product
An interesting paradox emerges.
The same technologies that disrupt product-based assessment can also enhance process-based assessment.
AI tools can:
support brainstorming and iteration,
provide immediate feedback,
help students explore alternatives.
The key is not to exclude AI, but to integrate it transparently into the learning process.
When students document:
how they used AI,
what they accepted or rejected,
how they verified outputs,
they demonstrate critical engagement, not passive reliance.
A shift in mindset
Moving from product to process is not simply a change in assessment design. It is a shift in what we value.
From:
correctness → thinking
completion → development
output → engagement
This aligns closely with long-standing educational principles, but AI has made the shift urgent rather than optional.
Concluding thought
If education is fundamentally about learning, then assessment must capture learning as it happens, not just what remains at the end.
In an AI-enabled world, the most meaningful question we can ask is not:
“Is this work original?”
but rather:
“What does this work reveal about the learner’s thinking, effort, and development?”
When we begin to answer that question with confidence, we move closer to assessing what truly matters.