School data: what’s the deal?

    Published: 14 March 2019


    Ofsted have declared that, from September 2019, inspectors will no longer ask to see a school’s internal attainment and progress data. So what is the state of play? Clare Hodgson, Assessment Adviser, shines a spotlight on data and its uses and misuses in schools.

    As an Assessment Adviser, people sometimes mistakenly assume that I must be some kind of number-crunching data guru, or that I love data. Not data in the broader sense of the word, but data in its number form, in all its statistical glory. The truth is that my relationship with school data has always been circumspect at best and, at times, downright murky. What I have learned over the 27 years that I have worked in schools and as an adviser is that school data can be both an illuminating lens and a hindrance. Ofsted, it seems, have learned this too. They have criticised the data machine that has built up as schools try to map the progress of pupils towards an often-shifting end goal. There is an irony here that cannot escape the long-serving educationalist. But that aside, just what does this missive mean for schools? What data should we pay attention to, how, and for what reason?

    Schools are awash with data. They have access to the IDSR (Inspection Data Summary Report), ASP (Analyse School Performance), the subscription service FFT Aspire and, in the public domain (so worth a look), the freely available ‘Compare School Performance’ service.

    Ofsted will still be using school performance data.  Outcomes are, of course, still important.  Examination success is presently the gateway to the next stage of life. Hence it pays to understand the data, and to use it as a starting point to launch investigations, analyse successes and setbacks, observe trends, and take actions. Useful questions when looking at the data include:-

    • What are the strengths?
    • How do results compare nationally, within the LA, and with similar schools?
    • Are there differences between the performance of different age groups within the school?
    • Are there differences between the attainment or progress of pupil groups?  e.g. gender, disadvantaged pupils (for whom the school receives Pupil Premium), different prior attainment groups, ethnic minority students, children with EAL (English as an Additional Language), children with SEND (Special Educational Needs and/or Disabilities) etc?
    • Are there differences in attainment between subjects?
    • Are there differences between teacher assessment and test results?
    • Are there trends over time? (How does this relate to national trends?)
    • Are the differences statistically significant? (How many children does this represent?)
    • Why did the results happen this year?
    • Was it expected?
    • What strategies contributed to this outcome?
    • How well have we done against our key priorities in the school development plan?
    • Are there any key lessons to be learnt from these results?
    • How should these results inform future priorities in our school development plan?
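    One of the questions above asks how many children a percentage figure actually represents, and the answer is often sobering arithmetic: in a small cohort, a single pupil can move a headline percentage by several points, so an apparent swing may represent only one or two children. A minimal sketch of this arithmetic (the cohort sizes below are hypothetical examples, not figures from any school):

```python
# Illustrative only: how many percentage points one pupil represents
# in cohorts of different (hypothetical) sizes.

def pupils_per_percentage_point(cohort_size: int) -> float:
    """Return how many percentage points a single pupil represents."""
    return 100 / cohort_size

for cohort in (15, 30, 60, 180):
    pts = pupils_per_percentage_point(cohort)
    print(f"Cohort of {cohort}: one pupil = {pts:.1f} percentage points")
```

    In a one-form-entry primary cohort of 30, for instance, one pupil is worth more than three percentage points, so a ten-point dip may describe just three children, and may not be statistically significant at all.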

    It is worth considering, too, that all the data in the IDSR, ASP and Compare School Performance is retrospective. Hence, another key question to ask is:-

    What does attainment and progress currently look like?

    Up until now, you may have turned to your internal progress and attainment data to answer this question. However, under the newly proposed Ofsted inspection framework, internal data will not be asked for, in recognition of the unreliability of much internal data, and because excessive ‘data drops’ can divert teachers’ time away from education. (Research shows that internal tracking is often distorted, for example when it is made to fit a tracking system rather than the needs of the subject, or when it is linked to teachers’ pay and performance.) Instead, the focus will be on the intent, implementation and impact of the school’s curriculum.

    Curriculum leaders therefore need a clearly defined curriculum, linked to clearly defined and modelled standards or a proficiency scale, and must be able to demonstrate how they know that pupils are taking on this new learning and achieving.

    If we can no longer point to our internal data to show that most children are making progress, how can we prove progress? How do we know it is happening?

    When talking about pupil progress, we are really talking about learning and how we know that it is taking place. This is not wholly quantifiable in terms of a number or grade, or even in terms of a lesson observation. Take one definition of learning as ‘a change in long-term memory’ (Kirschner, Sweller and Clark) and you can see that learning happens over time. A pupil may well understand the new learning in a lesson, yet forget it all that evening. Teachers need to be able to revisit key ‘learning’ to make sure that it ‘sticks’ and can be utilised. (‘Learning’ may well here cover content [knowledge], and also skills and processes.)

    Therefore, an Ofsted inspector will look at the curriculum, observe lessons, talk to pupils, and look in books to evidence progress (or lack of it). For example, a pupil may have applied a concept incorrectly in science, but later show that they have corrected their original misconception. In history, a repeated spaced quiz may show that a student has acquired an increasingly accurate historical knowledge of a topic, while their written answers could show a developing ability to manipulate grammar, vocabulary, and historical evidence to explain and analyse the causes of an event. Teaching staff (and pupils) should be able to articulate and illustrate what progress and attainment look like in each subject for each particular year group.

    The long and the short of it: be wary of data; use it to raise questions, not answer them.

    Be careful: data alone can lead us astray. It can be misleading and hinder learning. For example:-

    • Internal data may not be comparable between subjects or across different assessments and assessment types. It may not be particularly valid or reliable, depending on the teacher, the moderation procedures, and the teacher’s own level of assessment training and expertise.
    • Tests are only a snapshot in time. A child may have missed schooling at one stage, could have been learning English, or may simply have performed badly (or well) on the day, against expectations. Hence, data does not always tell the whole story.
    • Relying on data can entrench underachievement, or lead to unmanageable pressure for others. Being over-reliant on data can lead us to have low expectations of some children. For example, if a child achieved poorly at KS1 or KS2, their computer-generated targets or predictions for the next stage will be lower. Research has shown that if a teacher views a pupil as a ‘low achiever’, they will set less challenging work, and the pupil will fall further and further behind. Moreover, the child themselves will come to believe that they ‘can’t do {subject}’, often with devastating consequences for engagement and achievement. In the primary setting, pupils soon ascertain teacher expectations: ‘I’m not one of the pupils who do the ‘challenge’ sheets.’ (See ‘the Pygmalion effect’.)
    • Allied to the Pygmalion effect (above), data targets given to pupils can have a limiting effect: “I’m only expected to get a grade 4, so I’m doing fine as I’m on track for my target.” In my opinion, it would be more honest, and less damaging, to give pupils a range of probable estimates rather than a fixed target. A pupil can honestly be told, based on historic attainment, that 44% of pupils with the same starting point as themselves gained a grade 4, but 39% gained a grade 5 or higher; the outcome depends on the input of the student themselves. As teachers, we should, at the very least, be purveyors of hope, not forecasters of doom.
    • Furthermore, a focus on the data/examination results can lead to a narrowing or a skewing of the curriculum. What do we really want our pupils to learn and be able to do by the end of KS1, KS2, KS3 and by the end of KS4? What is our curriculum vision?

    When I was a young Head of History, my department development plans were full of actions that related to planning, teaching and learning across the department. These actions were not simply plucked from the air, but were put in place as a response to book scrutinies, lesson observations, an analysis of exams, and an understanding of the pupils and their difficulties in the classes in front of us. Later I was ashamed that my outcomes for each action merely stated what would be completed: a scheme of work re-written, a focus on revision methods for homework implemented, feedback strategies amended to prompt thought and action in the mind of the learner, or a literacy initiative rolled out. I hadn’t yet been told to ensure that each outcome must have a statistic beside it saying that results would rise by a certain amount. Yet within a year or two, our pass rates rose from 37% to 79%, while at the same time the numbers opting to take history tripled. The results were a by-product of enthusiasm, passion, and a relentless focus on teaching and learning. Focus on the learning, and the results look after themselves (which also eases the mental health issues associated with a performance culture). This is a much less stressful and more productive way to work.

    Consequently, my view is that teachers require CPD that supports their understanding of assessment, assessment for learning and responsive feedback. They also need time to moderate with other teachers in order to establish year group standards. We all need to be data-informed (rather than data-driven) and challenge assumptions that may be made about pupils and their potential based on the data alone. What does progress look like, not in terms of data, but in terms of subject knowledge, acquired skills and next steps? What gaps might exist for some of our pupils, and how can we fill these gaps in skills and knowledge? What is our curriculum provision, given our pupils? Is there a shared curriculum vision and implementation in each department and key stage?

    It is important too that staff have an understanding of both validity and reliability when considering assessment data. Further to this, data needs to be used with caution. It is only one source of evidence when considering pupil rates of progress and attainment, and it is only a starting point. Other, first-hand, sources of evidence come from talking to staff and pupils, observing lessons and looking in books; but an understanding of the importance of curriculum design, assessment for learning, in-class responsive teaching and cognitive science (how we learn) underpins an understanding of how to demonstrate progress.

    Above all, in my view, teachers and governors need to be able to ask questions focusing on how staff and pupils know what progress looks like, and how this learning can be evidenced in their subject area for each year group and key stage. The evidence is in front of us. Let’s bring out the books at parent evenings, not codes or long lists of highlighted descriptors. Teachers know which pupils are stuck, coasting, or improving. Let us have conversations with the pupils themselves, and wherever possible remember that the word assessment comes from the Latin ‘assidere’, meaning ‘to sit beside’. Let’s make learning real, in all its complexity.

    "Outcomes really matter, and the impact of a good curriculum, well-taught, is that pupils will achieve great outcomes.  They’ll gain qualifications that they can take into the next stage of life.  But in all of this, data should not be the only thing we look at.  And data should not be ‘king’."

    Matthew Purves, deputy director of Ofsted, 18 December 2018

    Matthew Purves will be the keynote speaker at our HfL Assessment Conference on 23rd September 2019 at HDC, Stevenage.


    For more details of our assessment team services and how they may help your school please visit:

    Primary assessment

    Secondary assessment
