This blog post is based on a short talk I gave at the NUS-HEA Student Engagement Conference in Nottingham on 12 June 2012. I was one of four panellists talking about feedback from students. I was asked to give a sceptical perspective on the National Student Survey.
Last year I wrote a short article on how lecturers could make use of the NSS as a tool for improving the quality of teaching and learning. I came up with 5 main tips.
These were:
- Don’t chase the ratings
- Don’t blame other people
- Look at the free-text comments (people often forget this section of the NSS exists).
- Compare with your institution’s own data
- Talk to your students
I put a positive spin on the NSS in that article. However, I may as well have said “Number 6: Ignore the National Student Survey”.
Since I have been asked to provide a sceptical viewpoint, here are three reasons to be sceptical of the NSS.
1. Questions are ambiguous
Consider Question 19: “This course has enabled me to present myself with confidence.”
What does this question mean? These are some of the ideas staff and students came up with in a project I worked on last year.
What this question might mean:
- Employability
- Doing oral presentations
- Feeling confident in person
- Interviewing skills
- Self-belief
- Able to express opinions without fear
- Able to challenge the opinions of others
- Not anxious
- Students can stand up for themselves
- Students are confident they will get a good job

Possible assumptions:
- Students were unable to present themselves with confidence at the beginning of the course
- Confidence comes from going on the course
- Presenting oneself with confidence is a good thing (some students might benefit from being less confident)
- A course which does not help students present themselves with confidence is not a good course
- The student who answers this question in the negative might have been better off doing a different course or studying at a different place

Other issues:
- Confidence might come from sources other than the course, e.g. student societies, increased age, work experience, time spent abroad
- Does a negative answer to this question suggest that the course was in any way inadequate?
- Some evidence of students thinking about L2 language confidence, but this question was for students of all disciplines
- Students who answer this in the negative are saying something bad about themselves
- Student anxiety or lack of confidence indicates poor teaching or course design
2. There is no significant difference between institutions at subject level
Measures of overall satisfaction will be included in the Key Information Set (KIS), which is about to be launched. The KIS is meant to help students choose a university. However, the NSS does not really help students differentiate between institutions. In 2010, 41 institutions returned scores for French. In response to Question 22, “Overall, I am satisfied with the quality of the course”, the highest score was 100% agreement and the lowest was 70%. Quite a big difference? Yes, but when you look at the confidence limits you see that the 70% institution could have been as high as 89%, while the real ‘score’ for the institution with 100% could have been as low as 83%. So, statistically speaking, the institution which came last might actually have a higher real score than the institution which came first!
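To make the overlap concrete, here is a minimal sketch in Python of how 95% confidence limits for a satisfaction score might be computed. It uses the Wilson score interval and a purely hypothetical cohort of 20 respondents per institution; the NSS calculates its own limits from the real cohort sizes, so the exact figures quoted above will not be reproduced, but the point about overlapping intervals still comes through.

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a proportion (default: 95%)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half_width = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return max(0.0, centre - half_width), min(1.0, centre + half_width)

# Hypothetical cohorts: the headline percentages hide the sample sizes behind them.
top = wilson_interval(successes=20, n=20)     # 100% agreement from 20 respondents
bottom = wilson_interval(successes=14, n=20)  # 70% agreement from 20 respondents

print(f"'100%' institution: {top[0]:.0%} to {top[1]:.0%}")       # roughly 84% to 100%
print(f"'70%' institution:  {bottom[0]:.0%} to {bottom[1]:.0%}")  # roughly 48% to 85%
# The two intervals overlap, so the league-table ordering is not statistically meaningful.
```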
I think that using the NSS to compare institutions is like trying to identify the best cyclist in one of those cycling races where the whole peloton crosses the finishing line together. One cyclist will be first across the line, but is he or she the best? In today’s stage of the Tour de France it will be one rider; the next time it happens it will be someone else. Oh, and they all get given the same time anyway, so it really doesn’t make any difference.
3. No use for quality enhancement
Those with an eye for NSS history will know that the survey was inspired by the Australian Course Experience Questionnaire (CEQ). The CEQ was designed as a performance indicator for assessing the quality of teaching in higher education. It is not very useful for improving the student learning experience, and I am not entirely sure whether it was meant to be. It can identify departments or institutions which have performance problems, but it does not contain any clues as to how teaching might be improved.
Conclusion: why the NSS’s weaknesses help improve the student learning experience
So, in conclusion, the NSS is ambiguous, of little help to potential students, and of little use in improving the student learning experience. Ironically, though, the NSS is actually quite useful for quality enhancement, not because it is useful in itself, but because lecturers spend a lot of time thinking and talking about how to respond to it. Most of the time they are saying what a bad survey it is. That’s what makes the NSS successful. It gets people talking about teaching and learning. And because they talk and think, they make improvements. Perhaps the NSS isn’t so bad after all.