Attitude Determines Altitude

Attitude determines altitude banner

As part of our growth mindset ethos this year, we have been working hard with students on their attitudes to learning in school. As David Didau has explained, “good behaviour is necessary for good teaching to take place,” and we completely agree. I have been working closely with Head of Year 11 Phil Edwards (@_philedwards on Twitter) to help the cohort get into the right mindset for success. One of the innovations we’ve tried is the publication of attitude grades under our “Attitude Determines Altitude” banner.

“Attitude Determines Altitude” was adopted by NASA’s education programme in America (see here) as a variant on Zig Ziglar’s quotation.

Of course, it’s not rocket science…except, in this case, it is! Aim too low – or get the attitude wrong – and you’ll crash and burn. Get the angle of ascent right, ignite the thrusters, and you’ll go into orbit.

At Chew Valley, we collect teacher assessments of student attitudes three times a year. We use a four-point scale – VGSU for Very Good, Good, Satisfactory, and Unsatisfactory – in four categories:

  • Behaviour
  • Classwork
  • Homework
  • Organisation

All the categories are underpinned with clear definitions issued with report guidance (view a copy here: Attitude Grades). The grades awarded are processed into a percentage score – if students were to achieve all V grades, they would get 100%, whereas all U grades would result in a 0% score. These scores are reported to parents (along with individual grades) and tracked at each reporting point so that trends can be identified. The most recent score is also included in student Key Performance Indicators in SIMS.

The advantage of tracking attitudes in this way is that it is possible to identify improvement and decline in student attitudes over time. Tutors are issued with a tracking spreadsheet which shows students’ attitude scores over time and their improvement or decline, as well as their relative position in the year group. This allows intervention to be targeted at students whose attitudes are declining, and the success of those who have improved to be celebrated.
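To make the percentage arithmetic concrete, here is a minimal sketch, assuming the four grades map evenly onto points (V=3, G=2, S=1, U=0) – only the endpoints (all V = 100%, all U = 0%) are fixed by the description above:

```python
# Minimal sketch of the attitude score calculation. Assumes even point
# spacing (V=3, G=2, S=1, U=0); only the 100%/0% endpoints are given above.

GRADE_POINTS = {"V": 3, "G": 2, "S": 1, "U": 0}

def attitude_score(grades):
    """grades: every V/G/S/U judgement a student received this reporting round."""
    points = sum(GRADE_POINTS[g] for g in grades)
    return 100 * points / (3 * len(grades))

print(attitude_score(["V"] * 16))            # all V grades -> 100.0
print(attitude_score(["V", "G", "S", "U"]))  # mixed grades -> 50.0
```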

This tracking process is well established and has been running for four years, but it has always been teacher-based. With Year 11, we have gone public. In November we published student attitudes on the Year 11 noticeboard, along with their rank order position in the year group according to that score. We debated the format for a long time! I was all for publishing a straight rank-order list from 1-200 to make it totally clear who was at the top and who was at the bottom. However, I was persuaded away from this as we worried that students would easily see who was surrounding them in that part of the table, and that this might create a sense of group identity and possibly negative reinforcement – “we’re the bottom of the table crew!”

Example attitude grades from first posting in November (anonymised)

Instead, we published the list in alphabetical order by tutor group. This made it easy for the students to find their own name and see where they stood in the rankings. The launch was carefully handled by Phil and his team of tutors, who made sure the message was mediated and that students were encouraged to improve their attitudes – and their position in the ranking!

Guide to attitude determines altitude published in November (original here)

Last week, we published the second attitude determines altitude scores on the noticeboard. These had been awarded following mock exams and results over Christmas. A few interesting trends emerged! In November, the highest score in the year was 98% (awarded to two students); in January there were three on 100% and thirteen altogether over 98%. If you scored exactly the same attitude in January as in November, your position in the rankings dropped. The rest of the year group was improving – staying the same wouldn’t cut it! Most impressively of all, some students had leaped up the rankings, with a dozen improving by 10% or more. Of course, some had also declined – this wasn’t a magic wand and it didn’t work for all! – but the response has been really positive. Above all, the average attitude score from this Year 11 cohort sits considerably higher than any other Year 11 cohort we have ever had – and the evidence from staffroom conversations and staff evaluations is that this reflects a reality in the classroom. Phil made the most of the publication by stoking a bit of inter-tutor-group rivalry.

On Friday, I asked a selection of the students what they thought of it. Here is some of what they said:

  • “When I saw how low I was, I knew I had to do something about it.”
  • “I think it’s good so you know where you stand.”
  • “My Mum was against it, but I’m not really bothered.”
  • “I would have worked harder anyway because the exams are so close. I’m not sure the board had anything to do with it.”
  • “When I saw how far I’d gone up, I was really pleased with myself.”

A mixed picture! This is an inexact science and we’re not conducting an RCT here. I don’t know if it’s our whole-school growth mindset ethos and focus on effort, the excellent leadership from the Head of Year and his team of tutors, the luck of the draw or the publication of effort grades on the board that is making the difference. But something is working! And when the scores went up last week, students gathered round, keen to check their position and progress. Conversations about attitudes to learning were happening between students. That’s got to be a good thing! Certainly was for 11H…

Tracking progress over time: flight paths and matrices

Everyone should already be familiar with the KS2-4 Transition Matrices. A staple of RAISEonline, they were the first thing our HMI asked me for in our last Ofsted inspection, and they are central to how inspectors judge the impact of a secondary school on progress in English and Maths.

Framework for KS2-4 Transition Matrices

And quite right too. It’s common for secondary teachers to bemoan the inaccuracy of KS2 levels, but like it or not, somehow those students got those levels in Year 6 and we need to add value during their time with us. Of course, the starting point (KS2 levels) and the end point (GCSE grades) are both in flux for the next few years, which renders the measurements somewhat uncertain (see my blog KS2, KS4, Level 6 and Progress 8 – who do we appreciate?), but the principle of measuring student performance on entry and exit to judge progress makes sense.

Over the past year we have been experimenting with progress flight paths which I found initially on Stephen Tierney’s @LeadingLearner blog. We are now using transition matrices based on our own version of progress flight paths to track progress in each year group and identify students who are at risk of not progressing over time. In this post I will outline the methodology we use; I’m happy to answer any questions in the comments or via my “contact me” page.

But we don’t have National Curriculum levels any more…

No, that’s true – and we don’t use them. As outlined in my post Assessment in the new national curriculum: what we’re doing, we have adapted our assessment criteria at KS3 to reflect GCSE criteria. All our language in reporting to parents and policy statements now refers to “Chew Valley Levels” to clarify our position. This way, we preserve some continuity for students and parents who are used to the levels system, but we create a consistent ladder of knowledge and skills to assess from Year 7 to Year 11. As GCSE grades change to numbers, we may well consider adjusting to a numerical assessment system across the school too, while maintaining the principle of a five-year continuous assessment scheme in each subject.

The flight paths

The flight paths we are using, based on the @LeadingLearner model, are set up as follows:

  • Expected progress: one sub-level of progress in each year
  • Better than expected progress: one and a half sub-levels per year
  • Outstanding progress: two sub-levels per year
  • World class progress: more than two sub-levels per year

Progress flight paths tabulated

The flight paths do not presuppose that progress over time is linear; this was my initial misunderstanding of the model. Rather, they show the trajectory of progress over time within which students need to perform if they are to reach or exceed the end-of-KS4 destinations outlined in RAISEonline. Creating marker points at the end of each year enables early identification of potential issues with progress. At Chew Valley we collect assessments three times in each academic year, all measured against the flight paths. At the first assessment point, only one short term into the year, a greater proportion of students might be lower on the flight paths, but over the course of the academic year teachers can focus their planning to ensure that those students who are at risk of falling behind have any issues addressed.
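To pin the banding down, here is a rough sketch in code, assuming progress is measured as sub-levels gained per year against the KS2 baseline – my reading of the model, not a definitive implementation:

```python
# Rough sketch of the flight-path banding described above. Assumes progress
# is measured as sub-levels gained per year since the KS2 baseline.

def flight_path_band(sublevels_gained, years_elapsed):
    rate = sublevels_gained / years_elapsed  # sub-levels per year
    if rate > 2:
        return "World class"
    if rate >= 2:
        return "Outstanding"
    if rate >= 1.5:
        return "Better than expected"
    if rate >= 1:
        return "Expected"
    return "Below expected"

print(flight_path_band(2, 1))  # Outstanding: two sub-levels in one year
print(flight_path_band(3, 2))  # Better than expected: 1.5 per year
print(flight_path_band(1, 2))  # Below expected: 0.5 per year
```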

Creating transition matrices from the flight paths

Using SIMS tracking grids, we have created transition matrices for each year within the curriculum. These can be populated with student names at each assessment point, and generated for teaching groups, gender groups, pupil premium cohort, or any other field within the SIMS dataset. Simply put, students are plotted in the grid with the row representing their KS2 prior attainment level and the column representing their current performance assessment. We will be able to adapt the row and column headings as the assessment systems change.
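As an illustration of the underlying structure (not the SIMS implementation itself), the same grid could be assembled in pandas; the names, levels and column headings here are invented for the example:

```python
# Illustrative pandas equivalent of the SIMS tracking grid; names, levels
# and column headings are invented for the example.
import pandas as pd

students = pd.DataFrame({
    "name":    ["Pupil A", "Pupil B", "Pupil C", "Pupil D"],
    "ks2":     ["4a", "4a", "5b", "5b"],   # prior attainment -> rows
    "current": ["5c", "5b", "5b", "6c"],   # latest assessment -> columns
})

# Each cell lists the students at that (prior, current) intersection,
# mirroring the populated grid distributed to teachers.
matrix = students.pivot_table(
    index="ks2", columns="current", values="name",
    aggfunc=lambda names: ", ".join(names),
)
print(matrix.fillna(""))
```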

Example tracking grid template in SIMS

Within the template, the fields are colour-coded to represent each of the flight paths (captured as a simple lookup after the list):

  • White = below expected
  • Green = expected progress
  • Blue = better than expected
  • Pink = outstanding
  • Yellow = world class
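The key above, expressed as the kind of lookup you would pair with the banding sketch earlier (purely illustrative):

```python
# Flight-path band -> cell colour, matching the key above and the band
# names used in the earlier banding sketch (illustrative only).
BAND_COLOURS = {
    "Below expected": "white",
    "Expected": "green",
    "Better than expected": "blue",
    "Outstanding": "pink",
    "World class": "yellow",
}

# e.g. BAND_COLOURS[flight_path_band(2, 1)] -> "pink"
```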

Once populated, the matrices are distributed to curriculum and pastoral leaders and, critically, class teachers. They enable at-a-glance identification of progress issues on an individual, cohort, prior-attainment bracket or group scale.

Example of a populated Y8 tracking grid with student names anonymised. Note the tabs across the bottom for teaching groups and subgroups.

When I was a Head of English, this is the data I would have wanted my SLT to be providing me with. As with all data work in my leadership role, I am trying to adhere to the principles I outlined in my post The Narrative in the Numbers, and to make the data as useful as possible to enable teachers in the classroom to do their job even better. By clicking on my class tab along the bottom of the spreadsheet I will be able to see at a glance which students in my group are progressing well, and which less well; then I will be able to plan what I’m going to do about it over the next few terms.

Transferability 

Currently this method is only applied to English and Maths. We have experimented with using an average KS2 points score to create a generic baseline and applying it to other subjects, but it throws up too many anomalies to be reliable or useful (which poses some interesting questions about the proposed Progress 8 methodology). However, it would be possible to apply this model from a Year 7 baseline assessment in any subject – the tools are there.

Working with data – the narrative in the numbers

I covet @Ashley_Loynton’s mug

It is a constant source of amusement and bafflement to my colleagues how much I love working with data. As an English specialist I’m not supposed to enjoy the spreadsheets and pie charts quite as much as I do. However, it’s my firm belief that data helps me to be a better teacher, and that my work with data as a member of SLT helps my school become a better school.

The principles of working with data

When I first started blogging, one of the first posts I read was Kev Bartle’s excellent ‘Doing’ Data as a member of SLT, in which he lays out some strong principles for the use of data. If you haven’t read it – read it now, then come back. Finished? Good.

I don’t disagree with anything Kev says in his post, as I’m sure you will see. But here are my own principles:

  1. Less is more – but don’t hide the rest
  2. Use hard and soft data equally
  3. Look for the narrative in the numbers
  4. Make the data work for you – don’t work for the data

Working with data as a classroom teacher

When I’m in the classroom I collect and use data that is relevant to me. Primarily, this is assessment data from my own subject. For each “proper” piece of assessed work completed, it’s essential that I keep track of each student’s attainment against the assessment objectives, which I do in a spreadsheet. This helps me plan future lessons and target my teaching effectively. If I can see that particular students need more help with a particular part of the assessment, I will try to meet that need. Essentially I’m looking for answers in the assessments: what have I missed? What didn’t I explain well enough? What didn’t stick? What should I focus on next? Nobody really looks at this data except me, but it makes my lessons better.
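A toy version of that spreadsheet logic, with the assessment objectives and marks invented purely for illustration:

```python
# Toy version of the mark-book spreadsheet; the assessment objectives
# (AO1-AO3) and marks are invented for illustration.
marks = {
    "Student A": {"AO1": 7, "AO2": 4, "AO3": 6},
    "Student B": {"AO1": 5, "AO2": 3, "AO3": 7},
    "Student C": {"AO1": 6, "AO2": 5, "AO3": 4},
}

# Average per objective: the weakest AO is the one to revisit next lesson.
objectives = ["AO1", "AO2", "AO3"]
averages = {ao: sum(s[ao] for s in marks.values()) / len(marks)
            for ao in objectives}
weakest = min(averages, key=averages.get)
print(averages, "-> focus on", weakest)  # AO2 is lowest here
```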

I could find out loads more about the students in that class. I could ask SIMS for their attendance going back to Year 7 and beyond, their attainment and achievements and attitudes and intervention history and every letter sent home since they joined the school. Sometimes, where I have a concern, I will have a look and see if there is a similar pattern in other subjects, but if I’m honest this is unusual. The data I collect and record from my classroom helps me to teach better; I use it to inform the whole-school data collection which happens three times a year.

Working with data as a member of SLT

I try to keep my teacher-data in mind when I put my SLT head on. Teachers all around the school have their own systems for tracking ongoing progress on top of what I am asking them to input into SIMS, so it really needs to be worthwhile. Therefore each whole-school data collection we do has three purposes: firstly, as a staging post to track progress and to report home to keep parents up-to-date; secondly, to track academic attainment and progress through “working-at grades”; and thirdly, to track attitudes and behaviours for learning.

“Hard” data – academic attainment

Tessa Matthews wrote a depressing blog a couple of days ago called “What should we do about kids who aren’t making progress?”. Depressing, at least in part, because it was completely recognisable in its honesty, and in part because I am the member of SLT who sends out the emails and convenes the meetings to ask middle leaders the exact question posed in her title. Not, it has to be said, every third Wednesday, but I do ask the question – and I don’t apologise for it. I should be asking that question. For all that National Curriculum Levels are discredited and flawed, if a child at my school arrives in Year 7 with a Level 5 and reaches Year 9 still on a Level 5, we haven’t done enough for that child. It is one of the prime responsibilities of my job to spot those children who are at risk of not making progress and work with middle leaders and classroom teachers to find out what we can do differently, better, or additionally to ensure they do.

I try to take a “no excuses” approach with this process. By this I mean “she has a terrible home life – there’s no family support” might be an explanation as to why a Year 8 girl is not making expected progress, but it is not the end of the conversation. What can we, as a school, do to help her keep her studies on track as best she can? I echo John Tomsett’s sentiments in this regard:

I know that learning is not linear. I’m not worried if a child is a 5b in October and a 4a in January, but I will make sure that we keep an eye on that child’s progress.

“Soft” Data – attitudes, self-image and disposition

We probably do more work with the “soft” data than anything else. Three times a year all teachers report on students’ attitudes, grading them V, G, S or U for homework, classwork, effort and organisation. The judgements and the categories are all clearly defined (see our Monitoring Reports Guidelines) to try and provide consistency. Thanks to a system piloted by Ashley Loynton we track how students’ attitudes to learning are improving or declining between monitoring points and in relation to their peers, and pastoral leaders and tutors work with students individually or in small groups to address issues or trends. This has been a revelation and I always look at the attitude analysis first from any reporting session.
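In spirit, that between-point tracking boils down to something like the following sketch (scores and names invented; the real system is a SIMS-fed spreadsheet):

```python
# Sketch of between-monitoring-point attitude tracking (data invented).
november = {"Pupil A": 82, "Pupil B": 74, "Pupil C": 91}
january  = {"Pupil A": 88, "Pupil B": 70, "Pupil C": 91}

changes = {name: january[name] - november[name] for name in november}

# Rank on the latest score, highest first, and flag decliners for follow-up.
for pos, name in enumerate(sorted(january, key=january.get, reverse=True), 1):
    flag = " (declining)" if changes[name] < 0 else ""
    print(pos, name, january[name], f"{changes[name]:+d}{flag}")
```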

We also use PASS (Pupil Attitude to Self and School) surveys where students complete an online questionnaire to give feedback on their:

  • Feelings about school: “connectedness” – whether pupils feel they belong to, or are alienated from, their school community
  • Perceived learning capability: how successful a young person feels about their learning
  • Self regard: long-term learner self worth
  • Preparedness for learning: whether learners feel they have the ‘tools to do the learning job’
  • Attitudes to teachers: pupil perceptions of their relationships with school staff
  • General work ethic: aspiration & motivation to succeed in life
  • Confidence in learning: pupils’ perseverance in the face of challenge
  • Attitude to attendance: how much a student wants to be in school
  • Response to curriculum demands: motivation to undertake and complete curriculum based tasks

Attitude data can help pick up “under the radar” issues

Whilst certainly open to flaws on its own, this data is an excellent way to quantify teacher perceptions and identify students who may have barriers to learning which would otherwise remain under the radar.

External data sources – working the filter

There is a wealth of data available. Aside from the internal assessment data mentioned above, RAISEonline, DfE performance tables, Fischer Family Trust, CATs and ALPS are all in regular use at my school. We also have student and parent survey data, and health data from SHEU, amongst many others. Any member of staff can access them – some are in the public domain and available online, and the others are on the staff drive of the school network. Many wouldn’t go looking for them, however, and my job as Deputy Head is to filter out the key messages and make sure that the relevant staff get those messages without wading through irrelevance and drowning in the tide. This is a major responsibility of my role – I have to know it all.

I use a combination of FFT, CATs and teacher assessments to set targets (we call them Challenge Grades – I describe the process here). I prepare a RAISEonline summary for Heads of Subject and for SLT and Governors. We use ALPS in discussions with sixth form teaching teams. I make sure to congratulate and praise publicly and privately where we have success, but primarily I’m looking for the gaps and holes. I use the same test as I use in my classroom: what data is relevant? What will help improve teaching? What have we missed? What should we focus on next?

It is the job of any senior leader to make sure that the data works for the school. I can’t bear the tales I hear of children being able to tell inspectors their FFT D targets – FFT is not intended to be used that way by their own definition. This is a classic case of the data working the school, rather than the school working the data. True leadership means getting the filtering right and funnelling the right data to the right people so that, as a school, we can ask the right questions. Because – as Kev Bartle says – data only ever asks questions. I like looking for the narrative in the numbers which frames and poses those questions, but the answers are always in the classrooms and the relationships and the teaching and learning and care that makes schools a success.