I’ve spent the majority of this academic year helping primary and secondary schools design their new assessment systems. Needless to say, the removal of levels has left most schools unsure of how best to replace the previous flawed but very well understood system. After working with more than 15 schools since the end of July, I want to summarise the key issues for most schools:

SIMS is very good at summative assessment. This shouldn’t really be a surprise because SIMS has always been used to record the overall end of term and end of key stage results rather than the day-to-day formative assessments. Where SIMS is less strong is with day-to-day formative assessment. Luckily, schools see formative assessment as a less urgent priority.


Photo Credit: JimmyMac210 – just returned home from hospital via Compfight cc

Age related expectations are the new levels. The majority of the new assessment paradigms are based around pupils’ understanding of each year of the national curriculum. Several years ago the early years foundation stage (EYFS) adopted the use of emerging, expected and exceeding to assess pupils’ learning at the end of reception, and the majority of systems are variations on the same theme. Hence: emerging, secure and exceeded; working towards, achieving, advanced; ready, expected, mastered; and so on.

Three becomes four, six or even nine. For many schools, being limited to only three possible outcomes for a pupil makes it difficult to show progress across the school year. Hence many schools introduce finer subdivisions in addition to the basic three, either by adding an extra grade (emerging, developing, secure and exceeding) or by adding plus grades (ready, ready+, expected, expected+, mastered, mastered+).
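However it is subdivided, this kind of scale is still just an ordered list of outcomes, so adding plus grades doesn’t change how two assessments are compared. Here is a minimal sketch, assuming a hypothetical six-point scale (ready through mastered+); it is not how any particular SIMS aspect or marksheet is actually configured.

```python
# A hypothetical six-point scale: the basic three outcomes with plus grades, lowest first.
SCALE = ["ready", "ready+", "expected", "expected+", "mastered", "mastered+"]

def compare(grade_a, grade_b):
    """Return negative, zero or positive depending on how grade_a ranks against grade_b."""
    return SCALE.index(grade_a) - SCALE.index(grade_b)

# A pupil assessed as "expected+" sits two steps above "ready+" on this scale.
print(compare("expected+", "ready+"))  # 2
```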

Can a Year 4 Pupil be Y5 Secure? This is one of the first questions I ask my schools. For many schools the answer will be no, classifying a pupil outside of his or her year group makes no sense in the new curriculum. But for other schools the answer will be an emphatic yes, because how else will the school measure the progress pupils make over the course of the year? Which brings us on to…

Measuring Progress. Some schools argue that with age related expectations there is no need to measure progress for individual pupils. Instead, they measure the progress of the cohort from the end of the previous year (80% of the pupils met age related expectations or higher) to this year (89% of the pupils met age related expectations or higher – hence the cohort’s progress is an increase of 9 percentage points). But other schools prefer to count jumps through age related expectations (e.g. if a pupil starts the year on emerging, reaches expected by spring and ends the summer on exceeding, they have made 2 jumps).
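To make the two approaches concrete, here is a minimal sketch with made-up pupil data and hypothetical grade names: the cohort measure compares the percentage of pupils at or above the expected grade year on year, while the jumps measure counts how many steps an individual pupil moves through the ordered scale.

```python
# Hypothetical three-point scale, ordered from lowest to highest.
SCALE = ["emerging", "expected", "exceeding"]

def percent_at_expected(grades):
    """Percentage of the cohort at or above 'expected'."""
    at_or_above = sum(1 for g in grades if SCALE.index(g) >= SCALE.index("expected"))
    return 100 * at_or_above / len(grades)

def jumps(start_grade, end_grade):
    """Number of steps a pupil moves through the scale over the year."""
    return SCALE.index(end_grade) - SCALE.index(start_grade)

last_year = ["expected", "emerging", "exceeding", "expected", "emerging"]
this_year = ["expected", "expected", "exceeding", "exceeding", "emerging"]

# Cohort view: change in the proportion meeting expectations, in percentage points.
print(percent_at_expected(this_year) - percent_at_expected(last_year))  # 20.0

# Pupil view: starting on 'emerging' and ending on 'exceeding' is 2 jumps.
print(jumps("emerging", "exceeding"))  # 2
```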

Setting Targets, Or Not. Initially I was surprised by how unimportant targets now seem to be for most schools. But I realised that targets are implicit in an age related system. Thinking about it, the target for a Y5 pupil has to be to reach or exceed the Y5 expectation. If a Y4 pupil exceeded their Y3 expectations, then their target must be to exceed their Y4 expectations too.
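If it helps to see that rule written down, this is a rough sketch (again with hypothetical grade names) of how an implicit target falls out of an age related system: every pupil’s baseline target is to reach the expectation for their year, and a pupil who exceeded last year’s expectation carries the higher target forward.

```python
# Hypothetical outcomes relative to a year's expectation, lowest to highest.
OUTCOMES = ["emerging", "expected", "exceeding"]

def implicit_target(previous_outcome):
    """A pupil's target for this year, derived from last year's outcome."""
    # Everyone is expected to at least reach the expectation for their year;
    # a pupil who exceeded last year is expected to exceed again.
    return max("expected", previous_outcome, key=OUTCOMES.index)

print(implicit_target("emerging"))   # expected
print(implicit_target("exceeding"))  # exceeding
```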

Conclusion. It’s early days for most schools, with the majority staying with levels this year, though many have decided to look again after Christmas. Lots of commercial consultants and software companies are advertising their own non-SIMS solutions, but many schools actively prefer a SIMS-based solution, recognising the advantages of ‘all their data, in one place’. As SIMS experts, we need to make sure that schools are aware that SIMS is flexible enough to cope with ‘assessment without levels’ and, importantly, that SIMS can provide assessment systems that can be tailored to each school’s changing requirements.