I was really downhearted when I read this article published by the Guardian on its Higher Education pages.

The author argues that the National Student Survey is a waste of time and is detrimental to learning and teaching.

Personally, I think it is a cause for celebration that students report such high levels of satisfaction and positive experiences. This is especially true this year, when the skies were meant to fall down upon us because the first generation of £9k fee payers were filling in the survey.

The NSS is a survey that draws in hundreds of thousands of respondents each year; for the results to be published at all, a response rate of over 50% must be achieved (and this threshold is massively exceeded every year). This provides reassurance that the data are robust and genuinely representative of student views about their experience.

To complain that the scores are clustering misses the point entirely and ignores the facts. Would we prefer it if more students were having an utterly miserable experience for the sake of having a wider range of scores? I doubt it.

Then there is the argument that a single survey is not appropriate because universities and the courses they provide are different. Of course they are! Universities are different, and the courses they deliver will vary in content, teaching and assessment methods. But the survey does not ask students to compare their experience with that of students at other universities, and it would be absurd to do so, because students (by and large) only have one experience to reflect on. The NSS, rightly, asks students a series of pointed questions about their experience of teaching, assessment, support, resources etc. at their institution. The questions are identical for all students in their final year, so the focus is on comparability of outcome, not input. It is perfectly possible for two courses in the same subject to be taught and assessed in completely different ways but for students to come out at the other end highly satisfied with their experience. It would be disastrous to try to bring about uniformity of input.

Another point to consider here is the ‘masking effect’ of the mean scores for institutions. These scores, which aggregate the satisfaction of all students, cover up the fact that there will be significant variation between departments, and even within departments between courses. This in itself is not a problem, but it does counter the suggestion that all this is pointless because there isn’t a wider gulf between the highest- and lowest-scoring universities. It also gives departments an opportunity to benchmark themselves against colleagues elsewhere in their university and against peer departments in other universities.

The suggestions in the article of changes to the curriculum that have been made because of the NSS seem very odd to me. If methods of assessment are being changed to move away from over-reliance on essays and exams, then I can only cheer it along as a good thing in itself. These traditional forms of assessment are tried and tested and have a very important part to play in universities. But why not add other methods of assessment into the mix? Ones that challenge students to develop their thinking and negotiate a final piece of work with a group of other students; ones that encourage and support them to present their research and ideas in new and innovative ways. I’ve just spent the summer at UCL reading through hundreds of external examiner reports. The examiners are all experienced, senior academics from other universities, and their feedback is routinely positive about new and innovative methods of assessment as a way of maintaining high standards and challenging students in new and interesting ways. Far from discouraging new methods of assessment, they actively encourage, support and promote them — and these are the opinions of other academics, not students.

Another criticism that is often made (though not in this Guardian article) is that the questions only ask about satisfaction and do not ask students to reflect on their teaching, learning and assessment experiences. I disagree. To take four questions as examples:

  • Staff have made the subject interesting (Q2 in the NSS)
  • The course is intellectually stimulating (Q4)
  • The criteria used in marking have been clear in advance (Q5)
  • Feedback on my work has helped me clarify things I did not understand (Q9)

Who would not want to know the answer to these questions? If, for example, only 30% of students agree that the course is intellectually stimulating, I think it’s very important to know that and to do something about it! Equally, if 70% of students agree with the statement “Feedback on my work has helped me clarify things I did not understand”, that’s a good score, but it still means that 30% of students are saying that feedback is not helping them, and that’s a worry.

No mass survey containing only 23 questions is going to be perfect. But I for one welcome this annual opportunity for students, en masse, to take a moment to reflect on their entire experience and tell us what it’s been like. It is an important feature of UK higher education and a critical part of the relationship we have with our students.

UPDATE: Andrew McRae has ‘fisked’ the Guardian piece with magisterial style on his blog.


4 responses »

  1. […] The National Student Survey should be abolished before it does any more harm. And so, while my mate Derfel has already led the defence, and although I’ve made some of the points below before, here we go […]

  2. I think one of the big problems is design of the questions. Take:
    Feedback on my work has helped me clarify things I did not understand (Q9)
Now if 30% of students give this a low score, perhaps that’s because they are being honest and just didn’t act on the feedback. Similarly, I know it’s possible for 100% of feedback to be given back in time, yet for a good percentage of students to say that it hasn’t been, either because they are not aware of the deadlines for return or because they have unrealistic expectations. The design of many of the questions in the NSS is weak, but on top of that a survey of this kind has to be interpreted, and attention has to be paid to how the respondents might have understood the question. Added to that, without capturing data on respondents — such as motivation, achievement, personal circumstances, etc. — the data the NSS is giving us probably isn’t good enough to tell us anything very helpful. That’s also why you see the clustering effect. It is probably useful for spotting big issues, but those are almost certainly already caught at local level in course evaluations and via local quality processes. I just don’t think the NSS really adds much to improving students’ actual education, and the year-on-year data back that up.

    • derfelowen says:

I think the key to the first point you make, David, is building an open and transparent relationship with your students. If you have departmental or university turnaround targets for the return of feedback, these should be communicated and shared widely with students so that they know and understand the rules. There is a separate question about promptness of feedback in the NSS.

I think the design of some of the questions is problematic. I have written before about the rather ambiguous wording of the questions in the academic support and personal development sections, and a couple of the learning resources questions are also dated.

I’m afraid I can’t accept that the NSS doesn’t help improve education. Sometimes it tells us things we already know, especially if there are good student representation systems and cultures in place. But even then it can give pointers to other departments, in or outside the university, where good practice can be picked up and shared.

  3. […] proposed ones all focus on specific aspects of the student experience. I’ve written about this before, so won’t go on about it […]
