- Reviewing performance data in a post-exam review meeting, known as a rapid improvement panel (RIP), can help to avoid underperformance next time.
- The RIP should be made up of academy sponsors (if applicable), school governors and headteachers.
- Middle and senior leaders should prepare their data analysis reports in advance of the meeting and submit them to the panel for its consideration.
- The RIP panellists should interrogate the report, highlighting key strengths and weaknesses, and annotating pertinent questions and concerns.
- The meeting itself should focus on panellists’ questions rather than leaders’ presentations, in order to try to ascertain more fully the reasons for certain outcomes and trends.
The purpose of a post-mortem
Exam analysis meetings go by many names, most of them aptly funereal in tone, such as ‘post-mortems’ or ‘rapid improvement panels’ (RIPs). One by one, middle and senior leaders step forward, heads bowed reverently, to get a grilling from a grim reaper in the guise of academy sponsors, school governors and headteachers who form part of the post-exam review panel.
The primary purpose of these meetings is to interrogate a school’s summative performance data, celebrating success where it occurs (recognising departmental improvements as well as individual accomplishments) and questioning underperformance or significant deviations from predicted outcomes in the hope that the same mistakes can be avoided next year.
The panel meeting
Leaders should prepare their data analysis reports in advance of the meeting and submit them to the panel for its consideration. Panellists should interrogate the report, highlighting key strengths and weaknesses, and annotating pertinent questions and concerns.
The meeting itself should focus on panellists’ questions rather than leaders’ presentations in order to try to ascertain more fully the reasons for certain outcomes and trends. There is no doubt that leaders can present their findings in a positive light, but what is needed is an honest account of the facts and an appropriate level of challenge, leading to an agreed set of SMART actions rather than vague promises.
The exam analysis report
Exam analysis reports should not be too long or descriptive. Rather, they should be succinct and evaluative in nature. Panellists want to know the following:
- What are the headline results per subject per year group and per cohort/class?
- What was attainment like versus what was predicted?
- What was attainment like versus what was targeted/expected?
- What was progress like versus what was predicted?
- What was progress like versus what was targeted/expected?
- What value did each teacher add (often presented in terms of residual scores where any positive figure shows value was added)?
- How did different groups of students attain and progress in relation to all students, including boys and girls, students in receipt of pupil premium funding, students for whom English is an additional language, students with SEND, and so on?
- What interventions (wave 1, 2 and 3) were put in place, when and for which students?
- What effect did each intervention have, and what has been learnt about the value of each?
- What was the accuracy and quality of teacher assessment like?
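The value-added question above rests on a simple calculation: a pupil’s residual is the achieved grade minus the predicted grade, and a teacher’s value added is the average of their pupils’ residuals, with any positive figure indicating value was added. A minimal sketch, using entirely hypothetical pupil records on the 1–9 scale (the names and grades are illustrative, not real data):

```python
from statistics import mean

# Hypothetical pupil records: (teacher, predicted grade, achieved grade).
results = [
    ("Teacher A", 5, 6),
    ("Teacher A", 4, 4),
    ("Teacher B", 6, 5),
    ("Teacher B", 7, 7),
]

def residuals_by_teacher(records):
    """Mean residual (achieved minus predicted) per teacher.

    A positive figure indicates value was added; a negative one
    indicates pupils fell short of their predicted grades on average.
    """
    by_teacher = {}
    for teacher, predicted, achieved in records:
        by_teacher.setdefault(teacher, []).append(achieved - predicted)
    return {teacher: mean(resids) for teacher, resids in by_teacher.items()}

print(residuals_by_teacher(results))
# {'Teacher A': 0.5, 'Teacher B': -0.5}
```

In practice tools such as L3VA, ALPs or ALIS perform statistically weighted versions of this calculation, but the underlying logic is the same.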
Once all these meetings have concluded, the headteacher must collate and summarise the school’s performance in order to present it to the governing body, academy sponsors or executive head. At this stage it is worth the headteacher remembering that exam results are exactly that: results. They exist in the past tense and cannot be improved (with the exception of exam papers entered for re-marks, of course, though under recent reforms this practice will become less common).
The only point of an exam post-mortem is to ascertain the ‘cause of death’, so to speak, so that appropriate action can be taken in the future in order to benefit the living.
As such, what a school’s stakeholders really want to know while they’re reviewing exam results is what led to those results: what worked and what didn’t; what lessons have been or can now be learnt.
Accordingly, here is some advice for headteachers and senior leaders who face inquisitions this autumn term from their executive heads, sponsors and governors.
Top five tips
- Present your data clearly, succinctly and honestly. Don’t try to mask your data by combining various qualifications. Although it might feel like it, it is not a witch-hunt and you will gain nothing by being in denial or being defensive. Moreover, you will fool no one by massaging your data.
- Keep track of which interventions are given to which students and analyse their relative effectiveness in light of the outcomes. You need to demonstrate value for money, so you must evaluate the relative success of all your intervention strategies. This is not always easy but, in the case of the pupil premium in particular, it is important that you try because Ofsted and the DfE expect to see evidence of successful use of the pupil premium. Where you know a student has been in receipt of only one form of intervention, use him or her as a test case for comparing the effectiveness of that strategy against another.
- Identify which teachers achieved the highest value added scores (using L3VA, ALPs, ALIS, etc.). Decide how to employ your best teachers this year. This isn’t necessarily always with the top set or the C/D borderline class, especially now we have a 1–9 grading system and a focus on the progress of the majority not the attainment of the minority. Think creatively about each teacher’s particular skill-set and try to ‘think outside the box’ a little.
- Analyse how accurate your internal moderation proved to be. What more could be done to ensure that your teachers mark coursework and/or controlled assessments accurately? Also, analyse how accurately your teachers predicted their students’ outcomes and carry out a question-by-question analysis of the exam. Which questions proved the most difficult for students? What more can you do this year to better prepare students for those questions? What support do your teachers need to help them teach those aspects of the syllabus better? Ensure that all this self-flagellation leads to clear and SMART actions against which you and your staff can be held to account.
- What professional development do your teachers need to help them improve? What other actions need to be taken to improve the performance of your team? Do any formal procedures now need to be invoked in order to tackle endemic underperformance or malpractice? Did you, as the headteacher/senior leader, challenge your senior and middle leaders and their teachers? Did you do everything you could to keep track of the progress of every student and take appropriate actions to intervene when it was needed?
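The second tip above, demonstrating value for money, can be made concrete by comparing grade gains against cost per pupil for each intervention. A minimal sketch, assuming hypothetical interventions and costs (every name and figure below is illustrative, and it assumes each pupil received exactly one intervention so strategies can be compared cleanly):

```python
from statistics import mean

# Hypothetical records for pupils who each received exactly one
# intervention: (intervention, predicted grade, achieved grade).
records = [
    ("small-group tuition", 4, 6),
    ("small-group tuition", 5, 5),
    ("mentoring", 4, 5),
    ("mentoring", 5, 5),
]

# Illustrative cost per pupil, in pounds.
cost_per_pupil = {"small-group tuition": 300, "mentoring": 100}

def gain_per_hundred_pounds(records, costs):
    """Mean grade gain per £100 spent, per intervention."""
    gains = {}
    for intervention, predicted, achieved in records:
        gains.setdefault(intervention, []).append(achieved - predicted)
    return {i: mean(g) / costs[i] * 100 for i, g in gains.items()}

print(gain_per_hundred_pounds(records, cost_per_pupil))
```

On these invented figures, mentoring delivers more grade gain per pound even though tuition produces the larger raw gain, which is exactly the kind of distinction a panel probing value for money will want to see.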
Strike the right balance
When presenting to governors, aim for a balance between honest self-reflection and dogged determination to drive up standards. Be proud of your school and your staff and don’t be afraid to sing the praises of those who deserve it. But also be frank about failure where it exists and have a robust plan to tackle it.
Your key focus at all times must be on:
- Impact: What was the impact of the actions you took last year on student outcomes and what did that teach you?
- Action: What SMART actions will you take this year in order to improve student outcomes and what will the impact of those actions be next summer when you face the grim reaper again?
There are three tables in the toolkits section (provided by Tony Powell) for analysing results against targets (see the Toolkit box below). These can be used as stand-alone tables but are designed to be used sequentially, either to drill down or to scale up.
There are two features of the tables which will give schools a different perspective from analysing results against national averages or levels of progress.
- Achievement of target grades. Targets are normally set using tools such as those provided by Fischer Family Trust. Therefore, while some teachers may claim that the targets are over-ambitious, they will have been set using the same methodology and data for each pupil and will provide a common baseline.
- Pupil characteristics. The tables also contain information about pupil characteristics. These can be factored in when considering the performance of individual pupils, groups or subjects. This does not mean the underlying reasons for achievement are always linked to pupil characteristics, as will be obvious when all the tables are completed.
Some parts of the tables have been populated to provoke thought and discussion.
Use the following items in the Toolkit to put the ideas in the article into practice:
- Form – Key Stage 4: Pupil/teacher-level analysis (84.50 KB)
- Form – Key Stage 4: Subject-level analysis (83.5 KB)
- Form – Key Stage 4: All-subjects-level analysis (84.5 KB)
About the author
Matt Bromley is an experienced education writer, consultant, speaker and trainer. In a leadership career of more than 15 years, he was Group Director of a large FE college and multi-academy trust, acting Headteacher of one of the top five most improved schools in England, Deputy Headteacher of a small rural school, and Assistant Headteacher of a large inner-city school. He is the author of several best-selling books and regularly speaks at national and international conferences. You can find out more at www.bromleyeducation.co.uk and follow him on Twitter @mj_bromley.