Saturday 20 April 2013

Giving summative tests formative impact

This one isn't quite so directly related to managing variability, but it is a useful approach and it certainly helps if the whole department is doing it!

The problem
Summative tests such as the completion of mock exam papers just result in a grade. The grade is really useful for departmental level tracking, but the grade alone doesn't give the students an indication of what they need to do to improve.

This is the classic formative vs summative conflict that has been discussed at length in various forums - I'm certainly not the first to encounter it. There is a wealth and breadth of work on this, such as "Inside the Black Box" and other work by Dylan Wiliam, among many others.

So the challenge becomes to balance the need to do the summative tests to help predict grades and assess the performance of students, while at the same time making the result more meaningful than just a grade or level so that the students get something useful out of the tests as well.

The approach
I aimed to give the department and students a framework to help them diagnose where their performance was strong or weak in a summative test, so that it can be used in a formative way.

Perhaps it's due to my background but Excel came to the rescue again!

I created a sheet that allows student performance on each question or sub-question of a test to be entered. (The sheet is a bit clunky and may not be the most elegantly presented, but it works.) This is a relatively coarse measure - full marks gets a green, some marks gets an amber, no marks gets a red - and it's subjectively assigned by the teacher (or possibly by the student). This can take a little time, but data for a full class of 32 can normally be entered in less than 20 minutes once you get into the swing of it. Once filled in it looks like this... (I've chosen one with plenty of colour on it!)
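The rating rule itself is simple enough to write down in a few lines. Here's a minimal sketch in Python of the same logic (the function and value names are my own illustration, not anything from the spreadsheet):

```python
def rag_rating(marks_awarded, marks_available):
    """The coarse rating rule used on the sheet:
    full marks -> green, some marks -> amber, no marks -> red."""
    if marks_awarded >= marks_available:
        return "green"
    if marks_awarded > 0:
        return "amber"
    return "red"

# Example: a 3-mark question
print(rag_rating(3, 3))  # green
print(rag_rating(1, 3))  # amber
print(rag_rating(0, 3))  # red
```

In the spreadsheet this judgement is made by eye rather than computed, which is part of why entry is quick - but the rule being applied is the same.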


Firstly, this is quite a visual guide for the teacher. In this example it is clear that Q5 & Q16 were quite well completed but Q6, Q11 and Q13 were completely unsuccessful.

This can then be used to create summary graphs to help further analysis at a teacher level, like this...
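If you wanted to reproduce that class-level summary outside Excel, counting the ratings per question is all it takes. A sketch, assuming the ratings are held as a student -> question -> colour mapping (my own representation, not the spreadsheet's layout):

```python
from collections import Counter

def question_summary(class_ratings):
    """Count green/amber/red per question across the whole class,
    ready to plot as a stacked bar chart."""
    summary = {}
    for ratings in class_ratings.values():
        for question, colour in ratings.items():
            summary.setdefault(question, Counter())[colour] += 1
    return summary

ratings = {
    "Alice": {"Q5": "green", "Q6": "red"},
    "Bob":   {"Q5": "green", "Q6": "amber"},
}
print(question_summary(ratings)["Q5"]["green"])  # 2
```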

However, this still doesn't really give anything back to the students (useful for the teacher though). To make the leap to formative feedback, the same sheet automatically creates a personalised feedback form for each student that looks like this:
The columns identifying "strong", "could improve" and "need to improve" correspond to the green, amber & red ratings on the first sheet.

Now this becomes more useful, as it clarifies the topic, sub-topic or specific skill that the student needs to work on. Note that the descriptions assigned to each question are teacher-generated and editable, to give whatever level of detail is needed.
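Generating the personalised form is then just a matter of grouping each student's questions by colour under the three headings. A hypothetical sketch, where the topic descriptions stand in for the teacher-edited column:

```python
def feedback_form(ratings, topics):
    """Group one student's question topics into the three feedback columns."""
    column_for = {"green": "Strong", "amber": "Could improve", "red": "Need to improve"}
    columns = {"Strong": [], "Could improve": [], "Need to improve": []}
    for question, colour in ratings.items():
        # Fall back to the question number if no description was entered
        columns[column_for[colour]].append(topics.get(question, question))
    return columns

form = feedback_form(
    {"Q5": "green", "Q6": "red"},
    {"Q5": "Solving linear equations", "Q6": "Circle theorems"},
)
print(form["Need to improve"])  # ['Circle theorems']
```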

In follow up lessons the students can then be guided to tasks that will help them to fill in knowledge in their weaker areas. Additionally the teacher has a clear record of where skills lie or are missing both at class and pupil level, and therefore has guidance for future planning.

A further development of this general idea to give a greater emphasis on self assessment involves giving students time to review tests alongside a structure to help them to reflect and identify strengths and weaknesses. We use the following sheets as a template for the students to complete as a review, and then encourage them to identify 2 strengths and 2 weaknesses.
Depending on the group this sheet can be used alongside or instead of the red/amber/green one.

We've been using all of the above across Key Stages 3 and 4, and also in selected applications in Key Stage 5.

Other uses
I've also used the red/amber/green sheet to give feedback on written assignments and on presentations - the objectives & assessment criteria for the assignments can replace the question topics.

Nothing new
I'm fully aware that this isn't a massive innovation; many teachers review tests at a question-by-question level. What I'm less sure of, though, is how many then use this information to form the basis of specific and personalised feedback to students.

I'm also aware that there are a great many schools where this type of thing isn't done at all for their summative testing, and as such they are missing an opportunity for some really useful feedback to both students and teachers.

The usefulness of these sheets and structures lies in the fact that they make it relatively easy to create good feedback that can be reflected on and acted upon in follow-up lessons.

Benefits - is it worthwhile?
This is one of many strategies we have been using in my department over the last 18 months.

At a basic level it has provoked useful and informed discussions with students about areas to improve. As well as being used for guidance in class, we have had students specifically take copies of these sheets home to use during independent study. Fundamentally the students like them and tell me and the department that they find them useful to shape their studies. If I saw no other benefit then this positive student message would be enough to encourage me that it was worthwhile. However we have a more tangible indication that the approach is working...

As part of a programme of regular mock exams with year 11 this feedback structure has allowed us to prepare almost the whole year group for early completion of their GCSEs in the March exams. Yes I know there are mixed views on early entry but our students were ready for these exams and the results they delivered prove this...

The results were published this week, and have already set a new school record for maths in terms of A*-C. Compared to other schools, we scored 17 percentage points higher than the average of similar schools with that exam board, and 25 percentage points higher than the average of all schools with them. With the remaining students completing exams in June, along with some students now looking to improve, we are likely to deliver in the region of a 5-10% improvement in headline results compared to last year.

I'll not claim that this feedback approach made all of the difference, but it was a contributing factor in amongst everything else.

Any thoughts?
I'd be keen to hear if anyone has another way to crack this nut, or if you have any comments or questions. Leave a comment or come and find me on twitter: @ListerKev.

8 comments:

  1. Looks fantastic. Just what I need. Would you consider sharing the excel spreadsheet?

  2. See here http://kevs-variability-thoughts.blogspot.co.uk/2013/08/a-toolbox-to-help-start-your-term.html - there is a link in that post...

  3. Really interesting blog post. My own attempt at doing something similar is here http://eviedblog.wordpress.com/2014/04/30/making-better-use-of-assessments/

    1. Thanks for the comments Austin - I'll check out your post :-)

  4. Hi Kev, this is a brilliant resource. Would you consider sharing. I couldn't find a link on this blog. Thanks.

    1. Sorry - very slow response, the link to the file is here:
      https://www.dropbox.com/s/7egqwu7dlpb07q9/Exam%20paper%20analysis%20sheet.xlsx?dl=0

  5. Hi Kevin, I think this is wonderful way of helping the students know the gaps in the learning and the areas they have learnt well. Just wondering how did you create the summary composite graph?

    1. The spreadsheet takes care of the graph (see link above) - many thanks for the comments - give me a shout if you have any other questions (via @ListerKev will get a faster response!)
