RMS for Girls Headteacher Kevin Carson on why newspaper league tables are not the wisest measure of school performance
Each year, newspapers publish ‘league tables’ of how UK schools performed in the summer’s GCSE and A-level examinations. Most headteachers are not fans, and that includes headteachers of schools at the top of these lists.
It’s not that we are concerned about where our school is positioned (and I write having worked at a couple of schools that were always placed highly in such lists). It is that, as Heads, we know that league tables based on a school’s raw grades (counting how many A*s were achieved at A level or 9s and 8s at GCSE) only really tell us how academically selective a school is. Schools that are more academically selective at age 11 and 16 will invariably be the schools that sit at the top of league tables of raw results each year. No s..t, Sherlock!
Many Heads see the publication of these tables as unhealthy and unhelpful, an insidious part of a culture that encourages parents to equate the ‘best’ schools with those that have the most A*s or 9s. It is a culture that has even, on occasion, led to less able but hardworking pupils not being entered for examinations, because withholding entries is the crudest lever to pull in order to climb such tables.
A more accurate, and healthier, way of assessing the academic progress of students in a school is ‘value added’. The Centre for Evaluation and Monitoring (CEM) was established at Durham University in 1983 and is the largest educational research unit in a UK university. CEM works with UK schools, colleges, education authorities and government agencies to provide scientifically grounded research that monitors schools’ academic performance.
“Value-added data enables schools to receive just as much credit for a less able pupil who achieves higher than expected grades”
Students sit baseline tests for CEM at ages 11, 14 and 16. Having amassed 40 years of this assessment data, CEM is able to give each child a scarily accurate predicted grade for every subject and can then standardise each school’s actual results in order to tell us how far above or below the predicted grade a student is. From this, CEM can provide evidence of how much value each school adds academically.
Every independent school that I have worked at over the past two decades uses CEM data internally to assess its own academic progress at both school and department level. It is a standard part of the department reviews written every September. From our CEM data this year, I can see that RMS sits at the 92nd percentile of all schools, and the 87th percentile of all independent schools, for value-added performance. In other words, we add more value academically than 9 out of 10 schools.
CEM’s value-added data enables schools and departments to receive just as much credit for a less able pupil who might have been expected to attain a Grade 4 but achieves a Grade 5 or 6 at GCSE as for a more able pupil expected to attain a Grade 7 who achieves a Grade 8 or 9. This is just how it should be, of course.
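For readers who like to see the arithmetic, the short Python sketch below illustrates the principle using the illustrative grades above. It is an assumption-laden sketch of the general idea only, not CEM’s actual methodology, which standardises results against decades of national baseline data.

# A minimal sketch of the value-added principle, not CEM's model: the pupils
# and grades are the illustrative ones above, and a real system standardises
# against national baseline data rather than taking a simple difference.

def value_added(predicted, achieved):
    # Credit is the gap between the grade achieved and the grade predicted
    # from baseline tests, regardless of how able the pupil is.
    return achieved - predicted

print(value_added(predicted=4, achieved=6))  # less able pupil: +2
print(value_added(predicted=7, achieved=9))  # more able pupil: +2, the same credit

# A school's overall figure is then, roughly, the average of these gaps,
# which can be compared with other schools and expressed as a percentile.
school_results = [(4, 6), (7, 9), (5, 5), (6, 8)]  # hypothetical (predicted, achieved) pairs
print(sum(a - p for p, a in school_results) / len(school_results))  # 1.5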
Value added is the most important metric because it reflects the extent to which all pupils are achieving their potential. Unfortunately, it is not what is published in your Sunday newspaper each year, so perhaps we should all be questioning what value is actually added by looking at those league tables of raw results.
RMS for Girls rmsforgirls.com