17 March 2026
Inside the OCR NEA: Practical Strategies for Marking - CAS A-Level event
If you were unable to join us for the OCR A-Level NEA Marking online community meeting, don’t worry! You can catch up on all the content and a recording of the session below.
Demystifying the A-Level NEA: Practical Insights into Marking, Moderation and Student Success
Key Takeaways
- Start early and plan backwards from the May submission deadline to avoid unnecessary pressure.
- Concise, context-driven evidence is far more effective than lengthy, generic explanations.
- Regular progress checks help identify issues early, including misuse of AI.
- The mark scheme should guide both teaching and student structure, but it should not become a rigid template.
- Clear, measurable success criteria and stakeholder-focused design are key to high marks.
The session, led by Hazel Hatch (Senior Computer Science Consultant at the Harris Federation), offered a grounded and practical walkthrough of marking the OCR A-Level NEA. Drawing on extensive recent moderation and marking experience, Hazel focused on what actually makes a difference when assessing student work—and what teachers should prioritise in the classroom.
Working Backwards from the Deadline
A key theme throughout the session was time management. While the official deadline may be mid-May, Hazel stressed the importance of planning well in advance. Teachers need to allow time for:
- Internal marking and moderation
- Plagiarism and AI checks
- Student review requests (based on raw marks)
- Administrative processes with exams officers
Leaving everything until the final week creates unnecessary stress—not just for teachers, but for the wider school system.
Understanding the Rules: What You Can (and Can’t) Do
Hazel reinforced the importance of adhering to JCQ regulations. Notably:
- Students must be given raw marks before submission to allow time for review requests.
- Once marks are shared, students cannot improve their work further.
- Teachers should avoid giving overly structured support, such as templates or detailed corrective feedback.
Instead, the focus should be on guidance, not direction—helping students interpret the mark scheme rather than telling them what to write.
What High-Quality NEAs Actually Look Like
One of the most valuable parts of the session was a live walkthrough of a real student project. This illustrated a consistent message:
“Sharper is stronger.”
High-marking projects are not the longest—they are the most focused and relevant.
Key characteristics of strong NEAs:
- A clear, realistic problem context
- Strong identification and justification of stakeholders
- Evidence consistently applied to the project context
- Minimal reliance on generic textbook definitions
- A logical structure aligned (loosely) with the mark scheme
Hazel highlighted that weaker responses often include unnecessary theory (e.g. defining decomposition) rather than demonstrating how it is used in the specific project.
Using the Mark Scheme Effectively
Rather than treating the mark scheme as a checklist, Hazel recommended a holistic approach:
- Start at the top band and adjust downwards based on missing evidence
- Accept that not all criteria will apply equally to every project
- Look for quality of application, not just coverage
She also shared her use of RAG rating (Red, Amber, Green) during marking. This helps:
- Track evidence across sections
- Speed up final marking
- Provide clear justification for awarded marks
Although not required by exam boards, this approach has been positively received by moderators.
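For teachers who want to trial something similar, here is a minimal sketch of what a RAG tracker might look like in Python. The section names and statuses are hypothetical examples, not part of Hazel's materials or any exam board requirement.

```python
# Minimal RAG tracker sketch: maps mark scheme sections to a
# Red/Amber/Green status. Section names below are hypothetical.
rag = {
    "Analysis: problem context": "Green",
    "Analysis: stakeholders": "Green",
    "Design: algorithms": "Amber",
    "Testing: robustness": "Red",
    "Evaluation: limitations": "Amber",
}

def summarise(ratings):
    """Group sections by status so evidence gaps are easy to spot."""
    for status in ("Red", "Amber", "Green"):
        sections = [s for s, r in ratings.items() if r == status]
        if sections:
            print(f"{status}: {', '.join(sections)}")

summarise(rag)
```

Even a simple summary like this makes it quicker to see where evidence is thin before final marks are awarded.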
Common Misconceptions and Pitfalls
Several recurring issues emerged:
1. Too Much Writing
Students often assume more content equals higher marks. In reality, excessive writing without relevance weakens the submission.
2. Misunderstanding Features vs Success Criteria
- Features = what the system will do
- Success criteria = how success will be measured
Strong projects include quantifiable metrics (e.g. login attempts, load times).
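As a rough illustration of the distinction, a vague feature can be turned into a criterion that is directly testable. The threshold and function below are hypothetical examples, not taken from the session.

```python
# Feature (what the system will do): "the login page is secure".
# Success criterion (how success is measured): "an account locks
# after 3 failed login attempts". The threshold is a hypothetical example.
MAX_FAILED_ATTEMPTS = 3

def is_locked(failed_attempts):
    """Return True once the failed-attempt threshold is reached."""
    return failed_attempts >= MAX_FAILED_ATTEMPTS

assert not is_locked(2)  # below the threshold: login still allowed
assert is_locked(3)      # at the threshold: account locks
```

Because the criterion is quantified, a student can test it directly and cite the result as evidence.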
3. Overemphasis on Hardware Specifications
Students sometimes fixate on RAM or CPU requirements. Hazel suggested this should only be included if relevant to stakeholders and context.
4. Weak Computational Thinking Sections
Students frequently default to textbook definitions rather than applying concepts like decomposition or abstraction to their own system.
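As a sketch of what applying decomposition might look like in practice, consider a hypothetical booking system broken into named subproblems; all names here are illustrative, not taken from the session.

```python
# Decomposition applied to a hypothetical booking system: the top-level
# task is split into smaller, independently testable subproblems.
def check_availability(date, room):
    return True  # stub: would query the bookings store

def take_payment(amount, card):
    return True  # stub: would call a payment service

def send_confirmation(email, ref):
    print(f"Confirmation {ref} sent to {email}")  # stub

def make_booking(date, room, amount, card, email):
    """Top-level task composed from the decomposed subproblems."""
    if not check_availability(date, room):
        return None
    if not take_payment(amount, card):
        return None
    ref = f"{room}-{date}"  # stub booking reference
    send_confirmation(email, ref)
    return ref
```

Writing about how the project was broken into units like these is far stronger evidence than defining the term itself.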
Design, Testing and Evaluation: Thinking Ahead
Although time constraints limited deeper coverage of these later sections, several important insights emerged:
- Design should reflect analysis clearly, using diagrams such as DFDs or ERDs where appropriate
- Algorithms must be justified, not just presented
- Testing should include (see the sketch after this list):
  - Development testing
  - Post-development (end-user) testing
  - Functional, robustness, and usability checks
- Crucially, limitations identified earlier should feed into the evaluation, allowing students to reflect meaningfully on their solution
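As a rough sketch of the difference between functional and robustness checks, assuming a hypothetical input-validation function from a student project:

```python
import unittest

def parse_age(text):
    """Hypothetical validator: accepts whole numbers from 0 to 120."""
    value = int(text)  # raises ValueError for non-numeric input
    if not 0 <= value <= 120:
        raise ValueError(f"age out of range: {value}")
    return value

class TestParseAge(unittest.TestCase):
    def test_functional_typical_input(self):
        # Functional check: normal, expected data
        self.assertEqual(parse_age("17"), 17)

    def test_robustness_erroneous_input(self):
        # Robustness check: erroneous data raises a clear error
        with self.assertRaises(ValueError):
            parse_age("seventeen")

    def test_robustness_boundary_input(self):
        # Boundary check: extreme but valid values
        self.assertEqual(parse_age("0"), 0)
        self.assertEqual(parse_age("120"), 120)

if __name__ == "__main__":
    unittest.main()
```

Evidence of tests like these, run both during development and with end users, maps directly onto the testing strands of the mark scheme.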
AI and Academic Integrity
Hazel briefly addressed the increasing challenge of AI-generated content. The key strategy here is not just detection, but process tracking:
- Regular check-ins
- Observing students coding in real time
- Asking students to explain their work
This makes it easier to identify inconsistencies and ensure authenticity.
Next Steps
Questions to Reflect On
- Are my students writing about their project, or about computer science in general?
- Do I build in regular progress checks, or leave marking until the end?
- How clearly do my students understand the difference between features and success criteria?
- Am I guiding students with the mark scheme, or unintentionally constraining them?
- How confident am I in identifying AI-generated work?
Classroom Ideas and Activities
- Context Challenge: Give students a generic paragraph (e.g. defining decomposition) and ask them to rewrite it for their own project.
- Stakeholder Mapping Exercise: Students identify and justify at least three stakeholder groups for a given scenario.
- Success Criteria Workshop: Convert vague features into measurable success criteria.
- Mark Scheme Annotation Task: Provide an exemplar and ask students to identify where marks are achieved.
- Peer Explanation Activity: Students explain sections of their project verbally to test authenticity and understanding.
Further Resources
OCR A-Level Computer Science specification and NEA mark scheme