
Late last year the Higher Education Funding Council for England put out a call for feedback from the sector about the National Student Survey. It was a combination of request for feedback on the existing NSS and ideas for the future.

The two key proposals for the future hinted at in the call did not come as much of a surprise, namely that the NSS should extend its remit to all elements of the student experience, not just the academic experience (i.e. to include accommodation, sports, students’ union etc.) and that the NSS questions should shift to focus on student engagement rather than satisfaction.

I’d like to blow a big raspberry at both proposals.

The first is quite easy to dismiss. Good intentions are behind this, I'm sure, but extending the remit of the NSS like this would introduce an unacceptable bias. Some Universities have far greater flexibility and freedom to provide and improve these extra-academic experiences for students. For example, London Universities have virtually no freedom to affect or control the accommodation experience of students, while smaller Universities can barely offer sporting facilities. It would also make the whole survey unfocussed and blur its impact.

The second is trickier, but should be more adamantly dismissed. I have built a substantial part of my career on student engagement and spent most of my working life seeking to advance and improve the role students can play in their academic community. So, at first sight I would have assumed that a national survey of student engagement was a good thing. But then a number of things have become clearer to me over time.

1) Student engagement is something you do, not something you measure.
Student engagement is first and foremost about getting into a mindset where you consider students to be peers who have something valuable to offer the academic community, whether in decision-making structures, shaping the research environment, or delivering and improving learning and teaching. What students have to offer is different to what academics have to offer, and different to what professional services staff have to offer, but it is valid and valuable. The only way to tap into it is by involving them.

2) Student engagement with their academic study is already being measured constantly.
It is true that the NSS does not measure the extent to which students have engaged with their programme of study; it only asks students to comment on how satisfied they are with how they have been taught and supported to learn. However, undergraduate students are constantly being tested on their academic engagement. Every single module they study is assessed in one way or another, and this is how you ascertain student engagement with academic study. If they have thrown themselves into the course, taken up the opportunities afforded to them, paid attention in lectures, done the reading for seminars, looked up items on the reading list, worked with their peers to prepare presentations and so on, then they will do well! If they haven't done any of that, then they are less likely to do well. Why would survey questions be a better proxy than this?

3) Stop trying to shift the blame!
I know that the current NSS asks 'service-oriented questions'; it only reflects the level of satisfaction with what a student has received, not what they have put in. I have been, and remain, a lifelong advocate of students putting in as much as they get out; but isn't it right that students who are paying £9,000 should be afforded the opportunity to provide feedback on their experience? It would be devastating if we were to create a survey that just shifted the blame for poor experiences onto students. It would only be a matter of time before somebody said, "Well, my programme is outstanding, but only 40% of students said they used the reading list, so it's their own fault they don't learn".

4) The current NSS questions aren’t perfect, but they empower students.
When else are students given an unambiguous opportunity to tell their University and the rest of the world what their experience has really been like? They aren't. Students are assessed on their academic performance throughout their time at a University (quite right too) and told directly how they are doing; why shouldn't they have an opportunity to do the same back?

5) The current questions are actionable!
Some of the current NSS questions are a bit out of date (q. 17) or a little obtuse (qs 19-21), but on the whole they ask direct questions, about the right sorts of things, in the right sort of way. If you receive 40% satisfaction on q. 2, "Staff have made the subject interesting", then you know there is a disconnect between the teaching practice and the students. There can be all sorts of reasons for this and all sorts of solutions, but the problem is clear. If 97% of students are satisfied on q. 16, "The library resources and services are good enough for my needs", then something has been done very well and should be scooped up as good practice to share elsewhere in the University. But what if we ask, "During the current academic year how often have you asked another student to help you understand course material?" and the response is "once"? What do you do? What does it mean? Is that good because the student hasn't needed help, or bad because the student isn't talking to their peers? If it's considered a bad thing, what do you do about it? And what is a prospective student meant to do with that information?

I might be wrong; these surveys of student engagement have done good business in the States and in Australia, so there is probably more to it than I am seeing.

I’m looking forward to seeing the outcome of the consultation.


2 responses

  1. Meehan, Katie says:

    Loving your blog!

  2. Andrew McRae says:

    My disagreement, Derfel, is perhaps because I'm coming at 'student engagement' from a slightly different angle. I see it as an index of a successful university and successful programmes. I also think that prospective students, at the point of application, would benefit from having information about levels of engagement. These can make a big difference to the quality of their own experiences: as much as – perhaps more than – the quality of lecturers. Engagement questions, in other words, could provide valuable information about the culture into which prospective students would be stepping.
    ‘Shifting the blame’? I can see how some academics might try to do this, if the questions were poorly framed. But I see student engagement not as the students’ responsibility, but as a crucial sign of how well the programmes are designed and taught, and how well the university is managing the student experience. So poorly engaged students = poor teaching. That would certainly be my starting assumption, anyway, as someone who spends a lot of time looking at NSS results. I also think that it’s a bit fanciful to think that, with a DVC breathing down a department’s neck after some poor NSS results, the academics could get away with saying: ‘Oh, that was all the students’ fault’. That’s not how the NSS works.
    So: ‘actionable’? I think that engagement questions would lead to actions – in fact, to perhaps more interesting and challenging processes of action than, say, ‘how prompt was my feedback?’ Those questions have served a purpose, but I’d like something more from the NSS as we move forward.
