Working with data – the narrative in the numbers

I covet @Ashley_Loynton’s mug

It is a constant source of amusement and bafflement to my colleagues how much I love working with data. As an English specialist I’m not supposed to enjoy the spreadsheets and pie charts quite as much as I do. However, it’s my firm belief that data helps me to be a better teacher, and that my work with data as a member of SLT helps my school become a better school.

The principles of working with data

When I first started blogging, one of the first posts I read was Kev Bartle’s excellent ‘Doing’ Data as a member of SLT. In this post, Kev sets out some clear principles for the use of data. If you haven’t read it – read it now, then come back. Finished? Good.

I don’t disagree with anything Kev says in his post, as I’m sure you will see. But here are my own principles:

  1. Less is more – but don’t hide the rest
  2. Use hard and soft data equally
  3. Look for the narrative in the numbers
  4. Make the data work for you – don’t work for the data

Working with data as a classroom teacher

When I’m in the classroom I collect and use data that is relevant to me. Primarily, this is assessment data from my own subject. For each “proper” piece of assessed work completed, it’s essential that I keep track of each student’s attainment against the assessment objectives, which I do in a spreadsheet. This helps me plan future lessons and target my teaching effectively. If I can see that particular students need more help with a particular part of the assessment, I will try to meet that need. Essentially I’m looking for answers in the assessments: what have I missed? What didn’t I explain well enough? What didn’t stick? What should I focus on next? Nobody really looks at this data except me, but it makes my lessons better.
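For the technically minded, the kind of check my spreadsheet does could be sketched in a few lines of Python. Everything here – the students, the marks, the threshold – is invented for illustration; the real tracker is just a spreadsheet, but the logic is the same: find the assessment objectives where students fell short.

```python
# Illustrative sketch: one row per student, one score per assessment
# objective (AO), and a scan for the objectives that need re-teaching.
# Names, marks and the threshold are all invented for this example.

marks = {
    # student: {assessment objective: mark out of 10}
    "Student A": {"AO1": 7, "AO2": 4, "AO3": 8},
    "Student B": {"AO1": 6, "AO2": 5, "AO3": 3},
    "Student C": {"AO1": 8, "AO2": 3, "AO3": 7},
}

THRESHOLD = 5  # marks below this suggest the objective needs revisiting


def objectives_to_revisit(marks, threshold=THRESHOLD):
    """Return each AO with the students who fell below the threshold."""
    weak = {}
    for student, scores in marks.items():
        for ao, mark in scores.items():
            if mark < threshold:
                weak.setdefault(ao, []).append(student)
    return weak


print(objectives_to_revisit(marks))
# → {'AO2': ['Student A', 'Student C'], 'AO3': ['Student B']}
```

The objectives that surface most often across a class are the ones that shape the planning for the next few lessons.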

I could find out loads more about the students in that class. I could ask SIMS for their attendance going back to Year 7 and beyond, their attainment and achievements and attitudes and intervention history and every letter sent home since they joined the school. Sometimes, where I have a concern, I will have a look and see if there is a similar pattern in other subjects, but if I’m honest this is unusual. The data I collect and record from my classroom helps me to teach better; I use it to inform the whole-school data collection which happens three times a year.

Working with data as a member of SLT

I try to keep my teacher-data in mind when I put my SLT head on. Teachers all around the school have their own systems for tracking ongoing progress on top of what I am asking them to input into SIMS, so it really needs to be worthwhile. Therefore each whole-school data collection we do has three purposes. Firstly, to serve as a staging post and to report home to keep parents up-to-date. Secondly, to track academic attainment and progress through “working-at grades”. Thirdly, to track attitudes and behaviours for learning.

“Hard” data – academic attainment

Tessa Matthews wrote a depressing blog post a couple of days ago called “What should we do about kids who aren’t making progress?”. Depressing, at least in part, because it was completely recognisable in its honesty and, in part, because I am the member of SLT who sends out the emails and convenes the meetings to ask middle leaders the exact question posed in her title. Not, it has to be said, every third Wednesday, but I do ask the question – and I don’t apologise for it. I should be asking that question. For all that National Curriculum Levels are discredited and flawed, if a child at my school arrives in Year 7 with a Level 5 and reaches Year 9 still on a Level 5, we haven’t done enough for that child. It is one of the prime responsibilities of my job to spot those children who are at risk of not making progress and work with middle leaders and classroom teachers to find out what we can do differently, better, or additionally to ensure they do.

I try to take a “no excuses” approach with this process. By this I mean “she has a terrible home life – there’s no family support” might be an explanation as to why a Year 8 girl is not making expected progress, but it is not the end of the conversation. What can we, as a school, do to help her keep her studies on track as best she can? I echo John Tomsett’s sentiments in this regard:

I know that learning is not linear. I’m not worried if a child is a 5b in October and a 4a in January, but I will make sure that we keep an eye on that child’s progress.
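For anyone wondering how a “5b” and a “4a” get compared at all: sub-levels can be put on a simple points scale so that progress (or the lack of it) becomes a number. The scale and function names below are my own illustration, not an official conversion.

```python
# Illustrative sketch: map National Curriculum sub-levels onto a points
# scale so two snapshots can be compared. The scale values are invented
# for this example, not an official DfE conversion.

SUBLEVEL_OFFSET = {"c": 0, "b": 1, "a": 2}  # c lowest, a highest within a level


def level_points(level: str) -> int:
    """Convert e.g. '5b' into a single comparable score."""
    number, sub = int(level[:-1]), level[-1]
    return number * 3 + SUBLEVEL_OFFSET[sub]


def flag_no_progress(start: str, current: str) -> bool:
    """True if a student shows no measurable progress between snapshots."""
    return level_points(current) <= level_points(start)


print(flag_no_progress("5b", "5b"))  # True – Year 7 Level 5, Year 9 Level 5
print(flag_no_progress("5b", "4a"))  # True – the October-to-January dip
```

The point of the second example is exactly the quote above: a single “backwards” snapshot flags a child to keep an eye on, not a crisis.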

“Soft” Data – attitudes, self-image and disposition

We probably do more work with the “soft” data than anything else. Three times a year all teachers report on students’ attitudes, grading them V, G, S or U for homework, classwork, effort and organisation. The judgements and the categories are all clearly defined (see our Monitoring Reports Guidelines) to try and provide consistency. Thanks to a system piloted by Ashley Loynton we track how students’ attitudes to learning are improving or declining between monitoring points and in relation to their peers, and pastoral leaders and tutors work with students individually or in small groups to address issues or trends. This has been a revelation and I always look at the attitude analysis first from any reporting session.
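One way to see that movement between monitoring points at a glance is to put the V/G/S/U judgements on a numeric scale and look at the differences. This is my own sketch of the idea, not the actual system – the numeric mapping and the students are invented.

```python
# Illustrative sketch: track attitude-to-learning movement between two
# monitoring points. V/G/S/U grades come from the post; the numeric
# mapping and the student data are invented for this example.

GRADE_SCORE = {"V": 4, "G": 3, "S": 2, "U": 1}  # V best, U worst

autumn = {"Student A": "G", "Student B": "S", "Student C": "V"}
spring = {"Student A": "V", "Student B": "U", "Student C": "V"}


def attitude_changes(before, after):
    """Positive = improving attitude, negative = declining."""
    return {s: GRADE_SCORE[after[s]] - GRADE_SCORE[before[s]] for s in before}


changes = attitude_changes(autumn, spring)
declining = [s for s, delta in changes.items() if delta < 0]
print(changes)    # → {'Student A': 1, 'Student B': -1, 'Student C': 0}
print(declining)  # → ['Student B']
```

The declining list is where pastoral leaders and tutors start their conversations.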

We also use PASS (Pupil Attitude to Self and School) surveys where students complete an online questionnaire to give feedback on their:

  • Feelings about school: “connectedness” – whether pupils feel they belong to or are alienated from their school community
  • Perceived learning capability: how successful a young person feels about their learning
  • Self regard: long-term learner self worth
  • Preparedness for learning: whether learners feel they have the ‘tools to do the learning job’
  • Attitudes to teachers: pupil perceptions of their relationships with school staff
  • General work ethic: aspiration & motivation to succeed in life
  • Confidence in learning: pupils’ perseverance in the face of challenge
  • Attitude to attendance: how much a student wants to be in school
  • Response to curriculum demands: motivation to undertake and complete curriculum based tasks

Attitude data can help pick up “under the radar” issues

Whilst certainly open to flaws on its own, this data is an excellent way to quantify teacher perceptions and identify students who may have barriers to learning which would otherwise remain under the radar.

External data sources – working the filter

There is a wealth of data available. Aside from the internal assessment data mentioned above, Raise Online, DfE performance tables, Fischer Family Trust, CATs and ALPS are all in regular use at my school. We also have student and parent survey data, and health data from SHEU amongst many others. Any member of staff can access them – some of them are in the public domain and available online, and the others are in the staff drive of the school network. Many wouldn’t go looking for them, however, and my job as Deputy Head is to filter this mass of information, drawing out the key messages and making sure that the relevant staff get those messages without wading through irrelevance and drowning in the tide. This is a major responsibility of my role – I have to know it all.

I use a combination of FFT, CATs and teacher assessments to set targets (we call them Challenge Grades – I describe the process here). I prepare a Raise Online summary for Heads of Subject and for SLT and Governors. We use ALPS in discussions with sixth form teaching teams. I make sure to congratulate and praise publicly and privately where we have success, but primarily I’m looking for the gaps and holes. I use the same test as I use in my classroom – what data is relevant? What will help improve teaching? What have we missed? What should we focus on next?

It is the job of any senior leader to make sure that the data works for the school. I can’t bear the tales I hear of children being able to tell inspectors their FFT D targets – FFT is not intended to be used that way by their own definition. This is a classic case of the data working the school, rather than the school working the data. True leadership means getting the filtering right and funnelling the right data to the right people so that, as a school, we can ask the right questions. Because – as Kev Bartle says – data only ever asks questions. I like looking for the narrative in the numbers which frames and poses those questions, but the answers are always in the classrooms and the relationships and the teaching and learning and care that makes schools a success.

9 thoughts on “Working with data – the narrative in the numbers”

  1. Dear Chris,

    Another really good read. I like the layers in this and it reminds me of your post on progress which I have used with the staff and will probably quote again in the future. I’m not sure about the “5b in October and 4a in January” part. With individual assessments this is perfectly possible for a number of reasons but we had an issue of staff reporting summative grades based on single assessment pieces – when it happened it drove some parents mad as they couldn’t accept that their child had gone backwards one whole year between October and January. I blogged this and used it with staff – the two key elements in the discussion were about making summative assessment distributed and extensive. The blog came out of listening to Dylan Wiliam and was a light bulb moment for many staff. We’re looking at how we combine a series of assessments now to produce grades that are entered on SIMS to be used for tracking and reporting. Keep blogging.

  2. I take your point about the levels. The issue I’m trying to address here is that learning isn’t linear and – depending on the subjects – learners can slip back. I would like to move towards a more holistic assessment which looks at student attainment with a broader lens at each monitoring point, for example at a portfolio of work built up over the year. This model would show cumulative progress over time. I accept it’s easier to do that in my subject (English) than in some others, however, which is why the occasional one-off regression in progress still happens in our data without necessarily being something to worry about.
    Thanks again for your comment and vote of confidence!

  3. This is a really thought-provoking post. I’m curious about your pupil questionnaire. It seems very ‘teacher talk’ and almost contradictory to the rest of your message. How can you be sure that the students can actually read, understand and be honest about what you are asking? One of the issues my students face is the overwhelming support they get at KS2 to get their levels being taken away at KS3 as we don’t have the capacity to support them in the same way. The data then can be a noose around everybody’s neck.

    • The PASS questionnaire is well-briefed and supervised through our tutorial system. We explain why we are doing it and what we will use it for. We find that nearly all students complete it carefully and accurately, but I also recognise that any questionnaire response needs to be treated with some caution. I don’t think it’s “teacher talk” – if anything it’s another way to provide the students with a voice. I’d class it as a self-assessment tool for the students.

