The unvalidated Key Stage 2* data was released in ASP (Analyse School Performance) last week. So what’s new?
*[Edit (04/11/18) - KS1 and Phonics 2018 data now also released, with the exception of data for disadvantaged pupils, which will follow in due course.]
Well, apart from the fact that the DfE has moved the sign-in page to here (although you need to have successfully ‘migrated’ from the previous log-in system before you can access ASP from this page)**, we can see a couple of new reports and a change to the methodology behind the progress score calculation. Plus, there is a new data management facility, and exporting data to PDF just got easier.
** See also this page for logging in to other DfE sites: www.services.signin.education.gov.uk
* see Edit above
Firstly, let’s look at that methodological change - this is the decision to cap the progress scores of individual pupils (‘outliers’) where their scores are extremely low (i.e. negative numbers with a large magnitude).
The DfE’s KS2 school accountability document (Annex C) explains the methodology for determining how these caps have been set for each prior attainment group and provides (on page 38) a table displaying them all.
For example, for a pupil whose prior attainment score is 15 points (i.e. level 2b at KS1) the maths progress score cap is set at -13.03. That is to say, a child in this particular prior attainment group (PAG) cannot be awarded a progress score lower than -13.03 in maths. Elsewhere in the same document (pages 21-22) we see the national average scaled scores for each PAG. The national average maths scaled score for children in this example PAG is 102.3.
So, if a child with prior attainment of 15 ended up with a scaled score of, say, 85, this would (unadjusted) give them a progress score of -17.3 (85 minus 102.3). Instead, this will be capped at -13.03 (still a very low progress score, but better than it would have been).
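As a minimal sketch (assuming the cap simply acts as a floor on the unadjusted score, which is what the example above suggests), the calculation looks like this:

```python
# Hedged sketch of the capping rule described above; the figures are the ones
# quoted in the text for prior attainment group 15 (KS1 level 2b), maths.
def progress_score(scaled_score, pag_national_avg, pag_floor):
    """Progress = scaled score minus the PAG national average, floored at the cap."""
    unadjusted = scaled_score - pag_national_avg
    return max(unadjusted, pag_floor)

print(progress_score(85, 102.3, -13.03))             # capped: -13.03
print(round(progress_score(100, 102.3, -13.03), 2))  # uncapped: -2.3
```

A pupil scoring 100 is nowhere near the floor, so their score is unaffected; only the extreme negative outliers are pulled up to the cap.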
If your 2018 Year 6 cohort includes any children whose progress scores have been capped ('adjusted'), you will see that indicated in the headline KS2 progress figures, as shown below.
When clicking on the ‘explore data in detail’ link, you will see the unadjusted and adjusted progress scores for the cohort and for pupil groups. In the example below, the pupil with a capped score was a boy, so you can see that the progress score for males has been adjusted, as well as the overall score.
This example also demonstrates that the introduction of capping extremely low progress scores does not tend to have a very large effect on the cohort score. If a school had a small cohort and quite a few pupils with capped scores, there could be a greater effect, but in the case of the example I have shown (a one-form entry school) the effect was only to increase the progress score by 0.14. (Of course the effect becomes larger when you look at groups. If this child, with the adjusted score, was one of a very small group of disadvantaged pupils, the effect of the adjustment on that group's score would be much greater than on the cohort score.)
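The arithmetic behind that 0.14 is straightforward (a cohort of 30 is assumed here for a one-form entry school):

```python
# One pupil's cap raised their score from -17.3 to -13.03; spread across a
# cohort of about 30 pupils, the effect on the cohort average is small.
adjustment = -13.03 - (-17.3)   # +4.27 for that one pupil
cohort_size = 30                # assumed one-form entry
print(round(adjustment / cohort_size, 2))  # 0.14
```

Divide the same +4.27 by a group of, say, five disadvantaged pupils and the effect on that group's score is far larger, which is the point made above.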
It is also worth noting that only a very small number of children will have a capped score. For example, across the whole of Hertfordshire, in last year's Year 6 cohort (just over 13,000 pupils) only 104 pupils had an adjusted score in reading (less than 1% of the cohort), 223 in writing (1.7%) and 150 in maths (1.2% of the cohort).
It is also worth noting that, for pupils with very low prior attainment scores (below 6 points), this capping does not come into effect, so for many special schools there is unlikely to be any difference to the progress scores as a result of this methodology.
The easiest way to discover exactly which pupils have had their progress scores capped is to look in the Key Stage 2 additional reports in the report menu. Provided your ASP account has been set up to include ‘named access’, you will see a report called Key Stage 2 pupil list. Click into this and then, from the drop-down menu, select Table 3, 4 or 5 (for reading, maths and writing respectively). This shows the pupils’ names and data, including their progress scores (adjusted and unadjusted).
Three years of ‘new style’ data
Unbelievable as it may seem, we have now had three years of end-of-Key Stage assessments in this (not-so-new) world without levels. (Where does the time go?) So that has now made it possible to produce these two new reports - a school’s average results across the last three years and a three-year trend (or ‘time series’).
These figures are shown for headline cohort-level indicators only, not for groups of pupils.
A three-year average figure can be particularly useful for smaller schools, where cohorts can vary enormously - combining cohorts can create a more statistically meaningful sample size - although it’s a shame that three-year combined data is not available for groups, e.g. disadvantaged pupils. (NB FFT Aspire does do this.)
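A combined figure of this kind is effectively a pupil-weighted average of the cohort scores. A sketch, using made-up numbers for a small school:

```python
# Made-up example: three small cohorts as (pupil count, progress score).
# Weighting by cohort size is equivalent to averaging over all pupils.
cohorts = [(12, 1.8), (15, -0.9), (10, 0.4)]
pupils = sum(n for n, _ in cohorts)
combined = sum(n * score for n, score in cohorts) / pupils
print(pupils, round(combined, 2))  # 37 pupils combined
```

With 37 pupils instead of 10-15, the confidence interval around the combined score is correspondingly narrower, which is exactly why the three-year figure is more statistically meaningful.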
In the time series data, the DfE has (rightly) indicated, with a dotted line between 2017 and 2018, that the progress scores cannot be directly compared due to the change in methodology - i.e. the introduction of capping negative scores, discussed above. NB you can see the effect of this on the national average progress scores - normally (by definition) 0.00, but now (for 2018) showing as 0.03, because the capping of extremely low progress scores has slightly raised the average (see image below).
Although the scores themselves are not directly comparable, bearing in mind the amount of statistical noise in the progress calculation (represented by the size of the confidence interval) there was never really much sense in comparing small changes in progress score anyway. That is to say if a school’s progress score went up from, say, 0.9 to 1.1, but with a confidence interval of plus/minus 2 (typical for a cohort of around 30 pupils), it is meaningless to say that progress has improved, as there is so much overlap between the confidence intervals around these figures. (Progress could easily, in fact, have got worse.) For more discussion around progress scores and confidence intervals, see this earlier post.
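The overlap argument above can be sketched directly (a ±2 half-width is assumed, as is typical for a cohort of around 30):

```python
# Sketch: do the confidence intervals around two progress scores overlap?
def ci(score, half_width=2.0):
    """Confidence interval as (lower, upper) around a progress score."""
    return (score - half_width, score + half_width)

def overlap(a, b):
    """True if two (lower, upper) intervals overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

print(overlap(ci(0.9), ci(1.1)))  # True - the 0.2 'improvement' is noise
```

The intervals around 0.9 and 1.1 overlap almost entirely, so a rise of that size tells you nothing; only a change big enough to pull the intervals apart (such as a shift in the progress banding, discussed next) is worth reading anything into.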
What is worth looking at on the 3-year trend in progress scores is any change in the progress banding. It is clear in the example below that progress changed significantly (downwards) between 2016 and 2017, moving from ‘average’ to ‘below average’, but has now changed significantly upwards again, back to the ‘average’ range. The 2017 result would appear to have been a ‘blip’ in the school’s performance (for reasons that I’m sure a school could list) rather than the start of a downward trend.
Data Management and Exporting
I can't actually access the data management myself, as an adviser rather than a school user, so I can't comment on how easy it is to use, but according to this update from STA:
New data management functionality allows users to create a custom view by removing one or more pupils from their phonics, KS1, KS2, or key stage 4 (KS4) data to consider ‘what if’ scenarios.
Data management is available for the most recent year’s data held in ASP and will be updated when new data becomes available. There is also an improved functionality to export most files to PDF or Excel.
The export to PDF, for saving or printing, is a big improvement. From the same link, you are sometimes presented with the option to export to Excel instead. From what I can see, the Excel export seems to be available for any tables of data, e.g. data for groups of pupils. And the data actually exports quite nicely - no horrible formatting making it tricky to manipulate (which makes a nice change, thinking back to the export functionality in the early version of RAISEonline and the old FFT Live).
That just about wraps up this quick summary of what has changed in ASP. We await the data for Key Stage 1, phonics and EYFS - hopefully in the not-too-distant future - and for the IDSRs to be released at some point in November.