Category Archives: National Student Survey

12 actions language lecturers are taking to engage with the National Student Survey.

I have just been looking back at the NSS project I was involved with at LLAS last year. The report concluded with 12 actions that colleagues from nine institutions were planning to take. Not everyone will agree with all of them, but I thought I would post them here for interest.

1. Using the NSS questions on first and second year questionnaires.
2. Encouraging students to make more use of timetabled advice and guidance sessions.
3. Providing a more comprehensive introduction to the library resources. One colleague plans to recommend making library sessions obligatory.
4. Informing Level 2 students about previous actions taken in response to the NSS.
5. Discussing ways in which the NSS can feed into broader staff development, including courses for early career teaching staff.
6. Promoting more staff use of discussion boards in the institution's VLE as a means of providing feedback.
7. Encouraging tutors on skills modules to put more emphasis on transferable skills.
8. Developing a better understanding between staff and students of staff availability.
9. Communicating assessment criteria more clearly in order to relieve pressure on office hours.
10. Harmonising teaching and assessment for different languages. Where there are exceptions, the case for them should be made to students.
11. Fostering a 'personal tutoring' culture in the department.
12. Promoting students' awareness of the importance of the NSS.

Canning, J. et al. (2011) Understanding the National Student Survey: investigations in languages, linguistics and area studies. Southampton, GB: LLAS (Centre for Languages, Linguistics and Area Studies), 13pp. Available from: LLAS website


Could Galvanic Skin Response (GSR) bracelets replace other methods of teacher evaluation?

I’ve been thinking a lot about student engagement since my trip to Nottingham a couple of weeks ago. The theme of that conference was ‘student engagement’ and my task was to speak about the National Student Survey. During the course of the discussion a student observed that student engagement is often seen as synonymous with doing surveys of students. And, as I often hear, students are getting all surveyed out.

The news that the Bill and Melinda Gates Foundation has given over $1 million to Clemson University to research “‘Galvanic’ bracelets that measure student engagement” was met with incredulity by Valerie Strauss in her blog for the Washington Post. Clemson University’s website describes the project thus:

Purpose: to conduct a pilot study to measure student engagement physiologically with Galvanic Skin Response (GSR) bracelets, which will determine the feasibility and utility of using such devices more broadly to help students and teachers.

According to Wikipedia (a source I only use to find out about things I know nothing about), Galvanic Skin Response (GSR) is also known as skin conductance:

Skin conductance, also known as galvanic skin response (GSR), electrodermal response (EDR), psychogalvanic reflex (PGR), skin conductance response (SCR) or skin conductance level (SCL), is a method of measuring the electrical conductance of the skin, which varies with its moisture level. This is of interest because the sweat glands are controlled by the sympathetic nervous system, so skin conductance is used as an indication of psychological or physiological arousal. There has been a long history of electrodermal activity research, most of it dealing with spontaneous fluctuations or reactions to stimuli.

Whilst I share Strauss’s scepticism (despite my total ignorance of this field of study), this project brings in another dimension to debates about measuring student engagement. Is there a ‘Brave New World’ in which teacher evaluation instruments will be replaced with student sweat analysis?  Will the perception of scientific objectivity appeal to policy makers?


Why the National Student Survey's shortcomings make it so useful.

This blog post is based on a short talk I gave at the NUS-HEA Student Engagement Conference in Nottingham on 12 June 2012. I was one of four panellists talking about feedback from students. I was asked to give a sceptical perspective on the National Student Survey.

Last year I wrote a short article on how lecturers could make use of the NSS as a tool for improving the quality of teaching and learning. I came up with five main tips.

These were:

  1. Don’t chase the ratings
  2. Don’t blame other people
  3. Look at the free-text comments (people often forget this section of the NSS exists).
  4. Compare with your institution’s own data
  5. Talk to your students

I put a positive spin on the NSS in that article. However, I may as well have said “Number 6: Ignore the National Student Survey”.

I have been asked to provide a sceptical viewpoint on the NSS. Here I will give three reasons to be sceptical.

1. Questions are ambiguous

Consider Question 19: The course has helped me to present myself with confidence.

What does this question mean? These are some of the ideas staff and students came up with in a project I worked on last year.

What this question might mean:

  • Employability
  • Doing oral presentations
  • Feeling confident in person
  • Interviewing skills
  • Self-belief
  • Being able to express opinions without fear
  • Being able to challenge the opinions of others
  • Not being anxious
  • Students can stand up for themselves
  • Students are confident they will get a good job

Possible assumptions:

  • Students were unable to present themselves with confidence at the beginning of the course
  • Confidence comes from doing the course
  • Presenting oneself with confidence is a good thing (some students might benefit from being less confident)
  • A course which does not help students present themselves with confidence is not a good course
  • A student who answers this question in the negative might have been better off doing a different course or studying at a different place
  • Confidence might come from sources other than the course, e.g. student societies, increased age, work experience, time spent abroad

Other issues:

  • Does a negative answer to this question suggest that the course was in any way inadequate?
  • Some evidence of students thinking about L2 language confidence, although this question was for students of all disciplines
  • Students who answer this in the negative are saying something bad about themselves
  • Student anxiety or lack of confidence indicates poor teaching or course design

2. There is no significant difference between institutions at subject level

Measures of overall satisfaction will be included in the Key Information Set (KIS), which is about to be launched. The KIS is meant to help students choose a university. However, the NSS does not really help students differentiate between institutions. In 2010, 41 institutions returned scores for French. In response to Question 22, “Overall, I am satisfied with the quality of the course”, the highest score was 100% agreement and the lowest was 70%. Quite a big difference? Yes, but when you look at the confidence limits you see that the 70% institution could have been as high as 89%. The real ‘score’ for the institution with 100% could have been as low as 83%. So, statistically speaking, the institution which came last might actually have a higher real score than the institution which came first!
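The confidence-limit point above can be sketched numerically. The post does not state the cohort sizes behind these percentages, so the sizes below are hypothetical, and the Wilson score interval used here is just one common way of putting confidence limits on a proportion (the NSS's published limits may well be computed differently):

```python
from math import sqrt

def wilson_interval(p_hat, n, z=1.96):
    """95% Wilson score confidence interval for a proportion p_hat
    observed in a sample of size n."""
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical cohort sizes: small French cohorts are plausible,
# but the real numbers are not given in the post.
top = wilson_interval(1.00, 15)     # institution reporting 100% agreement
bottom = wilson_interval(0.70, 25)  # institution reporting 70% agreement

print("100% institution:", top)
print("70% institution:", bottom)
# The lower limit of the 100% institution falls below the upper limit
# of the 70% institution, so the two intervals overlap and the ranking
# between them is not statistically meaningful.
```

With these (made-up) cohort sizes the 100% institution's interval reaches down to roughly 80%, while the 70% institution's reaches up past it, which is exactly the overlap the post describes.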

French overall satisfaction with confidence intervals (NSS, 2010)

I think that using the NSS to compare institutions is like trying to identify the best cyclist in one of those cycling races where the whole peloton crosses the finishing line together. One cyclist will be first across the line, but is he or she the best? In today’s stage of the Tour de France it will be one person. Next time this happens it will be someone else. Oh, and they all get given the same time anyway, so it really doesn’t make any difference.

3. No use for quality enhancement

Those with an eye for NSS history will know that the survey was inspired by the Australian Course Experience Questionnaire (CEQ). The CEQ was designed as a performance indicator for assessing the quality of teaching in higher education. It is not very useful for improving the student learning experience, and I am not entirely sure whether it was ever meant to be. It can identify departments or institutions which have performance problems, but it does not contain any clues as to how teaching might be improved.

Conclusion: why the NSS's weaknesses help improve the student learning experience

So, in conclusion, the NSS is ambiguous, of little help to potential students, and of little use in improving the student learning experience. Ironically, though, the NSS is actually quite useful for quality enhancement: not because it is useful in itself, but because lecturers spend a lot of time thinking and talking about how to respond to it. Most of the time they are saying what a bad survey it is. That’s what makes the NSS successful. It gets people talking about teaching and learning. And because they talk and think, they make improvements. Perhaps the NSS isn’t so bad after all.


Some questions that didn't make it onto the National Student Survey

Richardson, J., Slater, J. and Wilson, J. (2007) The National Student Survey: development, findings and implications. Studies in Higher Education 32(5), pp. 557-580:

This paper from 2007 reports on the piloting of the UK National Student Survey (NSS). The settled version of the questionnaire contains 22 questions, but the pilot contained 45 items. Interestingly, the pilot contained negatively worded questions (where agreeing indicated a negative experience). Questions on workload were dropped due to a lack of internal consistency.

  • To do well on this course you mainly need a good memory.
  • It was clear what standard was required in assessed work.
  • It was clear what I was required to attend, prepare, and to do throughout the course.
  • I found the overall workload too heavy.
  • I feel confident in the subject knowledge I acquired.
  • I had as much contact with staff as I needed.

Overall questions:

  • Overall I feel the course was a good investment.
  • I would recommend the course to a friend.

I wonder whether these would have been preferable to the "Overall, I am satisfied with the quality of my course" question, which tends to take precedence over all the others.

The second pilot contained the question “It has been difficult to answer many of the questions because of the variability of my experience” (interestingly, only 12% strongly agreed and 18.2% agreed).

I just thought it would be interesting to share the questions which did not make it.


In (sort of) defence of ratemyprofessors.com

A few weeks back the Times Higher published an article on student survey fatigue. Students fill in so many surveys, including the National Student Survey, module surveys and institution-wide surveys (coincidentally, some of the questions on the latter are similar or identical to those on the NSS). I suggested on Twitter that the ineffectiveness of current surveys means that we want to do more surveys to fix the shortcomings of the existing ones in order to give a ‘truer’ picture. Much of the reason that we are considering a national survey of language students stems from the shortcomings of the NSS outlined in our recent report. Should this new national survey go ahead, I’m sure that it too will have its own shortcomings.

Legg and Wilson’s recent paper in Assessment and Evaluation in Higher Education on the reliability of www.ratemyprofessors.com vis-à-vis in-course evaluations is interesting in its own right (their title, "RateMyProfessors.com offers biased evaluations", deserves an award for clarity). But at least one thing can be said in defence of www.ratemyprofessors.com: it is unambiguous about who the target of the evaluation is, namely the teacher. With most other surveys it is unclear whether students are being asked about the course, the teacher, the content, the university or the programme of study. The students don't know when they are answering the questions, and we don't know when we are analysing their answers.

See also: 4 ways to Avoid Survey Fatigue in Higher Education



Discard the irrelevant: Statistics don’t bleed, but our students do.

I have written a new article for the LLAS blog (in a personal capacity).


“Some rise by sin, and some by virtue fall.” (William Shakespeare, Measure for Measure)

Statistics are everywhere in education. We have the National Student Survey (NSS), the First Destinations Survey, newspaper league tables, and the Times World University rankings, among others. Universities are required to publish ‘Key Information Sets’ (KIS) from 2012. The KIS has data from the NSS (the higher the agreeing percentages the better), the cost of university accommodation (presumably the lower the better), fees (the lower the better), graduate employment rates (the higher the better), the percentage of assessment which is written exams (depends on the student) and the number of ‘contact’ hours (again, depends on the student). In short, if it can be measured, the data is out there. And if it can’t be measured, we’ll find a way to measure it anyway (research impact, anyone?). Add to all this the information that students get from visit days, Facebook, Twitter, online student forums and friends, and the phrase ‘information overload’ comes to mind. In his report Dimensions of Quality, Graham Gibbs warns us about that immeasurable factor, reputation, which can override any real measure of quality. I suspect that all this information only serves to make reputation all the more important.

Read the full article on the  LLAS news blog


NSS: What does "The course has helped me to present myself with confidence" mean?


The report from the LLAS Subject Centre National Student Survey project last academic year is now online. In the project we focused on eight of the 22 questions. Whilst many of the questions were found to be problematic, this one was especially difficult to unpack.

Question 19: The course has helped me to present myself with confidence.

From the report

When answering this question, many students initially thought about giving oral presentations.  It was also linked to employability and interviewing skills, but the question of whether this was about personal confidence or academic confidence was unclear. And where students reported an increase in confidence, was this down to the skills their course had given them, their year abroad, their work placements, or was it just part of being four years older?

One member of staff observed that the NSS is carried out at a time when students are at their most anxious, perhaps looking for work, perhaps worried about the future. In languages, it was suggested that this question might be thought about in the context of L2 competence or confidence in dealing with people from other cultures. “It’s a bit of a weird question,” said one student. “It really wants you to say ‘yes’, because if you say ‘no’, you’re saying something bad about yourself.”

Further thoughts

Some further thoughts follow. Some are a little pedantic, maybe, but that’s what happens when you start to unpack the question with students and lecturers.

What this question might mean:

  • Employability
  • Doing oral presentations
  • Feeling confident in person
  • Interviewing skills
  • Self-belief
  • Being able to express opinions without fear
  • Being able to challenge the opinions of others
  • Not being anxious
  • Students can stand up for themselves
  • Students are confident they will get a good job

Possible assumptions:

  • Students were unable to present themselves with confidence at the beginning of the course
  • Confidence comes from doing the course
  • Presenting oneself with confidence is a good thing (some students might benefit from being less confident)
  • A course which does not help students present themselves with confidence is not a good course
  • A student who answers this question in the negative might have been better off doing a different course or studying at a different place
  • Confidence might come from sources other than the course, e.g. student societies, increased age, work experience, time spent abroad

Other issues:

  • Does a negative answer to this question suggest that the course was in any way inadequate?
  • Some evidence of students thinking about L2 language confidence, although this question was for students of all disciplines
  • Students who answer this in the negative are saying something bad about themselves
  • Student anxiety or lack of confidence indicates poor teaching or course design
