
16 September 2025

Teaching Code Critique Skills for the Gen AI Era - A-Level online event

Written by Computing at School

If you were unable to join us for the Problem Solving for Programming online community meeting, don't worry! You can catch up on all the content and a recording of the session below.

Helping Students Think Critically About Code in the AI Era

Key Takeaways

  • Generative AI often produces plausible but subtly flawed code, highlighting the need for students to critique rather than trust.

  • Problem-solving should emphasise code comprehension and clarifying vague requirements, not just code writing.

  • “Refute problems” ask students to disprove buggy code with counterexamples, strengthening critical thinking.

  • “Probeable problems” challenge students to probe ambiguous specifications, encouraging clarification skills.

  • These approaches can be adapted for school-level teaching and provide opportunities to align classroom practice with real-world software development.

Viraj Kumar, a lecturer at the Indian Institute of Science, joined our session to share his ideas on developing students’ problem-solving skills in programming—ideas he has been trialling with undergraduates but believes can be successfully adapted to school-level teaching.

The discussion began with a reflection on the capabilities and limitations of generative AI tools. While such tools can rapidly generate code that looks convincing, errors—often subtle ones—still occur. For beginners, these errors may be difficult to spot, so students risk accepting code as correct without question. Viraj argued that this makes code comprehension more important than ever.

A second challenge is that AI systems, much like real-world clients, often make assumptions or work with vague specifications. This prompted a focus on helping students not only read code critically, but also learn to ask clarifying questions when requirements are underspecified.

To support these goals, Viraj introduced two structured exercise types:

  • Refute problems: Instead of writing code, students are given a problem statement and a buggy solution. Their task is to find a counterexample that proves the code is incorrect. This encourages careful code reading and problem understanding (a sketch follows this list).

  • Probeable problems: Here, students are given deliberately vague tasks and must explore different interpretations by asking clarifying, doctest-style questions. The exercise mimics the reality of software development, where requirements are rarely perfectly specified (see the second sketch below).
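To make the refute format concrete, here is a minimal sketch in Python. The task, the `max_of_three` function, and its bug are illustrative inventions rather than examples from the session:

```python
# Refute problem (illustrative): students are told that max_of_three
# should return the largest of its three arguments, and are asked to
# find an input that proves this solution wrong.

def max_of_three(a, b, c):
    # Buggy solution: once a > b, the value of c is never considered.
    if a > b:
        return a
    elif b > c:
        return b
    return c

# One counterexample refutes the code: the largest of (2, 1, 3) is 3,
# but the function returns 2 because the first branch ignores c.
print(max_of_three(2, 1, 3))  # prints 2, not 3
```

Finding the counterexample requires students to trace the code and reason carefully about the problem, without writing any new code themselves.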

Both exercise types can be adapted to different contexts, whether using interactive platforms such as CodeCheck or simple paper-based worksheets. They are also relatively straightforward to design: refute problems can be based on common student errors, while probeable problems start with traditional tasks but deliberately omit detail.
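Here is a comparable sketch of a probeable problem, again with invented details: a deliberately vague task ("remove duplicates from a list") where each possible interpretation is written down as a doctest-style test case before any implementation is agreed:

```python
def remove_duplicates(items):
    """Remove duplicate values from items.

    Each doctest below is a probe: a question about the vague
    specification, expressed as a concrete test case.

    Probe 1: should the original order be preserved, or may the
    result be sorted?
    >>> remove_duplicates([3, 1, 3, 2])
    [3, 1, 2]

    Probe 2: what should happen with an empty list?
    >>> remove_duplicates([])
    []
    """
    seen = []
    for item in items:
        if item not in seen:
            seen.append(item)  # keep first occurrence, preserving order
    return seen

if __name__ == "__main__":
    import doctest
    doctest.testmod()  # the probes double as executable tests
```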

Discussion in the community highlighted both the opportunities and challenges of applying these ideas in the A-level classroom. While some felt assessment pressures might limit uptake, others saw strong potential for building students' critical thinking skills at A-level, T-level, or in other vocational contexts. The approaches could also sit naturally alongside existing topics such as testing, debugging, and development cycles.

Next Steps

Here are some reflective questions you might ask yourself:

  • Do my students spend more time writing code than understanding or questioning it?

  • How could I adapt buggy student solutions into refute problems for use in class?

  • What opportunities exist to present deliberately vague problem statements and support students in clarifying them?

  • Am I preparing students only for exams, or also for the ambiguity and critique required in real-world programming?

Example classroom exercises:

  • Provide a function with a subtle bug and ask students to find an input that proves it wrong.

  • Share a vaguely worded problem (e.g. “count digits in a number”) and ask students to generate test cases to clarify different possible interpretations (see the sketch after this list).

  • Run a class discussion comparing AI-generated solutions with student-written ones, focusing on spotting hidden assumptions.
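As an illustration of the second exercise, “count digits in a number” already hides at least one ambiguity. A possible sketch, with function names invented for clarity:

```python
# Two plausible readings of the vague task "count digits in a number".

def count_digits_ignore_sign(n):
    # Interpretation 1: count only the digits, so -305 has 3 digits.
    return len(str(abs(n)))

def count_digits_whole_string(n):
    # Interpretation 2: count every character of the written number,
    # so the minus sign in -305 makes 4.
    return len(str(n))

# The interpretations agree on 305 but disagree on -305, so one
# well-chosen test case is enough to expose the ambiguity:
print(count_digits_ignore_sign(-305))   # 3
print(count_digits_whole_string(-305))  # 4
```

A test case on which the interpretations disagree is exactly the kind of clarifying question the exercise is meant to provoke.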

Further Resources

CodeCheck platform – for creating and sharing refute and probeable problems.

CAS Online Community – continue the discussion with fellow teachers.

AQA Python resources on VS Code (demo coming soon – see CAS events page).

Event Recording