Interpreting data for 2017 performance

Published: Tuesday, 02 January 2018

Tony Powell looks at three major changes in the way that a school’s academic performance is evaluated.

Summary

  • Analysing school performance (ASP) can be used to present a summary to governors and parents.
  • When using the ASP, schools should look at confidence intervals and compare their results with national areas of strength and weakness.
  • The inspection data summary report (IDSR) replaces the inspection dashboard as data becomes available.
  • The IDSR differs in many important ways, such as ‘areas to investigate’ instead of ‘strengths and weaknesses’.

There are now major differences in the ways in which a school’s academic performance is evaluated. These largely stem from the changes to the curriculum and, consequently, to assessment, which make it very difficult to draw robust comparisons with earlier years. They also reflect the understanding that the analysis and interpretation of some data must be very tentative.

This article looks at the three most important changes that schools need to be aware of:

  1. Analysing school performance (ASP)
  2. Inspection data summary report (IDSR)
  3. Guidance to inspectors.

Analysing school performance

RAISEonline provided a summary which most schools printed off and used on its own to analyse performance. An increasing number, but still not enough, used RAISE to dig down into results. It is still straightforward to print off a summary, but the ASP data service demands that schools use it interactively, and this will prove far more informative.

Stage 1: Understanding the broad picture

First, skim the headline results to get a feel for overall performance. Then go back and click on the help buttons, even if you think you are familiar with the information. Copy and paste the help text to create a guide for governors and parents. Think carefully about the information and follow up with the technical documentation if necessary. For example, consider this explanation of the confidence interval (CI):

Confidence interval

‘It is difficult to be certain how much of the progress score is down to the school and how much is down to the pupils. For example, the school may have scored higher with a different group of pupils or the pupils may have performed well at any school. The confidence interval reflects this uncertainty. If the confidence intervals for two schools overlap, then we can’t say for certain that the two progress scores for these schools are significantly different.’

The CI is commonly defined in statistical terms, but this explanation gives a much clearer insight into its importance. Think about it again after reading the later guidance to inspectors from Ofsted.
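
To make the overlap rule concrete, here is a minimal Python sketch. The progress scores and standard errors are invented for illustration, not real school data; ASP reports the interval directly, so the sketch only shows how the overlap test works.

    def confidence_interval(progress_score, std_error, z=1.96):
        """Return an approximate 95% confidence interval around a progress score."""
        margin = z * std_error
        return (progress_score - margin, progress_score + margin)

    def intervals_overlap(ci_a, ci_b):
        """True if two intervals overlap, i.e. the underlying scores
        cannot be called significantly different."""
        return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]

    # Illustrative figures only - not real school data
    school_a = confidence_interval(progress_score=1.8, std_error=0.9)
    school_b = confidence_interval(progress_score=-0.4, std_error=1.1)

    print(f"School A CI: ({school_a[0]:+.2f}, {school_a[1]:+.2f})")
    print(f"School B CI: ({school_b[0]:+.2f}, {school_b[1]:+.2f})")
    print("Overlap (not significantly different):",
          intervals_overlap(school_a, school_b))

Because the two intervals overlap, the sketch prints True: despite the apparent gap between the headline scores, we cannot say the two schools’ progress differs significantly.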

Stage 2: Digging deeper

Go through each of the ‘explore data in detail’ charts and note any groups with results below or above the national average, or markedly different from others in the school. Do not read too much into results for small groups, and look for overlapping characteristics such as SEN and prior attainment.
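
As a simple illustration of this kind of check, the sketch below flags groups whose results differ from national figures and warns where a group is too small to read much into. The group names, counts and scores are all hypothetical.

    # Hypothetical group-level data: (group, pupil count, school score, national score)
    groups = [
        ("Disadvantaged", 6, -1.2, -0.6),
        ("SEN support", 4, -2.0, -1.1),
        ("High prior attainment", 18, 0.9, 0.3),
    ]

    MIN_GROUP_SIZE = 10  # below this, treat any gap as anecdotal

    for name, count, school, national in groups:
        gap = school - national
        caution = " (small group - interpret cautiously)" if count < MIN_GROUP_SIZE else ""
        print(f"{name}: gap vs national {gap:+.1f}, {count} pupils{caution}")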

Now study the ‘pupil breakdown’ scatter plots; you can filter groups for comparison and identify individual pupils. Look for patterns across groups and subjects, but the best way to study data at this level is with the class teacher, SENCO and assessment coordinator.
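
A scatter plot of this kind is easy to reproduce for a staff discussion. The sketch below uses matplotlib with invented pupil-level figures (the prior attainment scores, progress scores and disadvantaged flags are all hypothetical) to show the filtering idea.

    import matplotlib.pyplot as plt

    # Hypothetical pupil-level data
    prior = [98, 101, 104, 107, 110, 96, 103]        # prior attainment (scaled score)
    progress = [-0.5, 0.8, 1.2, -2.5, 0.3, 1.9, -0.1]
    disadvantaged = [True, False, False, True, False, True, False]

    # Colour-code one group so clusters and outliers stand out
    colours = ["red" if d else "blue" for d in disadvantaged]
    plt.scatter(prior, progress, c=colours)
    plt.axhline(0, linestyle="--", color="grey")     # national average progress
    plt.xlabel("Prior attainment (scaled score)")
    plt.ylabel("Progress score")
    plt.title("Pupil breakdown (illustrative data)")
    plt.show()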

Stage 3: Question level analysis

Are there any national areas of strength and weakness? Are these mirrored in the school? In which areas did the school do better or worse than the national averages? Subject leaders and class teachers should use the summative data formatively (a sketch of the underlying comparison follows this list) by asking:

  • Was this aspect/skill taught sufficiently? – Did we spend enough time on it and do we need to amend the scheme of work?
  • Was this aspect/skill taught sufficiently well? – This relates to teaching and learning. Is this an area for CPD?
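
As flagged above, the underlying comparison is straightforward. This sketch uses hypothetical question-level percentages; the question labels and figures are invented for illustration.

    # Hypothetical question-level results: percentage of pupils answering correctly
    questions = {
        "Q1 (fractions)":   {"school": 62, "national": 71},
        "Q2 (place value)": {"school": 88, "national": 80},
        "Q3 (reasoning)":   {"school": 45, "national": 58},
    }

    for question, pct in questions.items():
        gap = pct["school"] - pct["national"]
        verdict = "below national" if gap < 0 else "at or above national"
        print(f"{question}: {gap:+d} percentage points ({verdict})")

Questions well below national are the natural starting point for the two questions above: was the aspect taught sufficiently, and was it taught sufficiently well?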

Inspection data summary report (IDSR)

This replaces the inspection dashboard as data becomes available. There are several important changes, and since the IDSR is specifically designed to support inspections, it reflects the new Ofsted procedures introduced in September 2017. Inspectors have been trained to interpret and use the new-style report.

  • The front page now shows an overview of results and whether the school is below the floor standard or meets the coasting definition. Instead of strengths and weaknesses, there are ‘areas to investigate’. These may include very positive or negative aspects and cover issues such as absence and exclusions.
  • The context pages follow immediately after the front page and show pupil characteristics, year group data and prior attainment. This indicates that a higher importance should be given to school context and the characteristics of pupils.
  • There are fewer detailed breakdowns of groups than in the inspection dashboard. The IDSR looks at trends and whether a group has been performing well or poorly over time, comparing the past three years with national figures.
  • Percentile ranks are shown for each year, with the school’s position placed in a quintile, each representing 20% of schools (see the sketch after this list). Red borders indicate results statistically significantly below the national average, while green borders indicate results statistically significantly above.
  • The IDSR includes scatterplots for key measures. These are intended to help identify clusters of pupils and outliers that may have affected average data. 
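
To illustrate the quintile idea referred to above, here is a minimal sketch mapping a percentile rank (where 1 is the highest-performing school nationally) to its quintile. The three-year trend of ranks is invented for illustration.

    def quintile(percentile_rank):
        """Map a percentile rank (1 = highest, 100 = lowest) to a quintile,
        each quintile covering 20% of schools."""
        return min((percentile_rank - 1) // 20 + 1, 5)

    # Illustrative three-year trend of percentile ranks
    for year, rank in [(2015, 34), (2016, 22), (2017, 18)]:
        print(year, f"percentile rank {rank}", f"-> quintile {quintile(rank)}")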

Ofsted guidance to inspectors

Schools should study the reference documents listed in ‘Further information’ to understand the major shift in Ofsted’s guidance on using data in inspection. 

The changes to the curriculum and assessment, together with the reformed GCSEs, make it much more problematic for schools and inspectors to evaluate attainment and progress and to make robust predictions. The March 2017 newsletter reinforces Ofsted training that warned inspectors about the dangers of assuming data can be interpreted in the same way as in the past.

For example, the guidance highlights that inspectors should not refer to ‘expected’ progress or ask schools to make predictions of progress, since schools are unable to do so reliably. The new guidance is that inspectors should not invest too much significance in data, but should regard it as part of the evidence base. In his blog in March 2017, Sean Harford sums this up in his advice that data is ‘a signpost not a destination for inspection’.

The inspection handbooks were updated in October 2017 and the advice on interpreting data is much more forceful and detailed. See, for example, paragraph 185 in the section 5 handbook.

‘Gaming the system’

Since her appointment, HMCI Amanda Spielman has been very clear about the importance she places on a broad and balanced curriculum. This is now a key focus: inspectors will review the design of the curriculum and consider whether it is based on the needs of the pupils or on trying to maximise results.

Further information 

Toolkit

Use the following item in the Toolkit to put the ideas in the article into practice:

About the author

Tony Powell is an experienced Additional Inspector and LA adviser. He writes extensively on education management, but his main work is in supporting schools to develop systems for self-evaluation, school improvement and continuing professional development.