Monthly Archives: June 2012

Edinburgh 2012 and Quality workshop

Last-minute preparations are underway for next week's conference in Edinburgh: the Language Futures: Languages in Higher Education conference 2012.

I will be facilitating an interactive workshop, along with my colleagues Alison Dickens and Laurence Georgin, as part of our EU-funded SPEAQ (Sharing Practice in Enhancing and Assuring Quality) project. We will be exploring how good quality teaching is understood by lecturers, students and quality managers.

My colleague Angela Gallagher Brett will be chairing a session on the Routes into Languages programme. Kate Borthwick will be presenting on FAVOR (Finding a Voice through Open Resources, a project aimed mainly at part-time teachers) and on Open lives, which is digitising research resources documenting the migration experiences of Spanish emigrés for open access.

Twitter hash tag: #llasconf2012

Could Galvanic Skin Response (GSR) bracelets replace other methods of teacher evaluation?

I’ve been thinking a lot about student engagement since my trip to Nottingham a couple of weeks ago. The theme of that conference was ‘student engagement’ and my task was to speak about the National Student Survey. During the course of the discussion a student opined that student engagement is often seen as synonymous with doing surveys of students. And, as I often hear, students are getting all surveyed out.

The news that the Bill and Melinda Gates Foundation has given over $1 million to Clemson University to research “‘Galvanic’ bracelets that measure student engagement” was met with incredulity by Valerie Strauss in her blog for the Washington Post. Clemson University’s website describes the project thus:

Purpose: to conduct a pilot study to measure student engagement physiologically with Galvanic Skin Response (GSR) bracelets, which will determine the feasibility and utility of using such devices more broadly to help students and teachers.

According to Wikipedia (a source I only use to find out about things I don’t know anything about), Galvanic Skin Response (GSR) is also known as skin conductance:

Skin conductance, also known as galvanic skin response (GSR), electrodermal response (EDR), psychogalvanic reflex (PGR), skin conductance response (SCR) or skin conductance level (SCL), is a method of measuring the electrical conductance of the skin, which varies with its moisture level. This is of interest because the sweat glands are controlled by the sympathetic nervous system, so skin conductance is used as an indication of psychological or physiological arousal. There has been a long history of electrodermal activity research, most of it dealing with spontaneous fluctuations or reactions to stimuli.

Whilst I share Strauss’s scepticism (despite my total ignorance of this field of study), this project brings in another dimension to debates about measuring student engagement. Is there a ‘Brave New World’ in which teacher evaluation instruments will be replaced with student sweat analysis?  Will the perception of scientific objectivity appeal to policy makers?

Why the National Student Survey's shortcomings make it so useful.

This blog post is based on a short talk I did at the NUS-HEA Student Engagement Conference in Nottingham on 12 June 2012. I was one of four panelists talking about feedback from students.  I was asked to give a sceptical  perspective on the National Student Survey.

Last year I wrote a short article on how lecturers could make use of the NSS as a tool for improving the quality of teaching and learning. I came up with 5 main tips.

These were:

  1. Don’t chase the ratings
  2. Don’t blame other people
  3. Look at the free-text comments (people often forget this section of the NSS exists).
  4. Compare with your institution’s own data
  5. Talk to your students

I put a positive spin on the NSS in that article. However, I might as well have added “Number 6: Ignore the National Student Survey”.

I have been asked to provide a sceptical viewpoint on the NSS. Here I will give three reasons to be sceptical.

1. Questions are ambiguous

Consider Question 19: This course has enabled me to present myself with confidence.

What does this question mean? These are some of the ideas staff and students came up with in a project I worked on last year.

What this question might mean:

  • Employability
  • Doing oral presentations
  • Feeling confident in person
  • Interviewing skills
  • Self-belief
  • Able to express opinions without fear
  • Able to challenge the opinions of others
  • Not anxious
  • Students can stand up for themselves
  • Students are confident they will get a good job

Possible assumptions:

  • Students were unable to present themselves with confidence at the beginning of the course
  • Confidence comes from doing the course
  • Presenting oneself with confidence is a good thing (some students might benefit from being less confident)
  • A course which does not help students present themselves with confidence is not a good course
  • The student who answers this question in the negative might have been better off doing a different course or studying at a different place
  • Confidence might come from sources other than the course, e.g. student societies, increased age, work experience, time spent abroad

Other issues:

  • Does a negative answer to this question suggest that the course was in any way inadequate?
  • Some evidence of students thinking about L2 language confidence, but this question was for students of all disciplines
  • Students who answer this in the negative are saying something bad about themselves
  • Student anxiety or lack of confidence indicates poor teaching or course design

2. There is no significant difference between institutions at subject level

Measures of overall satisfaction will be included in the Key Information Set (KIS), which is about to be launched. The KIS is meant to help students choose a university. However, the NSS does not really help students to differentiate between institutions. In 2010, 41 institutions returned scores for French. In response to Question 22, “Overall, I am satisfied with the quality of the course”, the highest score was 100% agreement and the lowest was 70%. Quite a big difference? Yes, but when you look at the confidence limits you see that the 70% institution could have been as high as 89%, while the real ‘score’ for the institution with 100% could have been as low as 83%. So, statistically speaking, the institution which came last might actually have a higher real score than the institution which came first!
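
For anyone who wants to play with the numbers, here is a rough sketch of the kind of interval calculation involved. It uses a Wilson score interval, a standard method for proportions (though not necessarily the exact method behind the published NSS confidence limits), and the respondent numbers are invented purely for illustration:

```python
import math

def wilson_interval(agree: int, total: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a proportion of respondents agreeing."""
    p = agree / total
    denom = 1 + z ** 2 / total
    centre = (p + z ** 2 / (2 * total)) / denom
    half_width = (z / denom) * math.sqrt(p * (1 - p) / total + z ** 2 / (4 * total ** 2))
    return max(0.0, centre - half_width), min(1.0, centre + half_width)

# Hypothetical cohort sizes, not the real NSS returns:
top = wilson_interval(20, 20)    # a department where all 20 respondents agreed (100%)
bottom = wilson_interval(7, 10)  # a department where 7 of 10 respondents agreed (70%)
print(f"100% department: {top[0]:.0%} to {top[1]:.0%}")        # roughly 84% to 100%
print(f"70% department:  {bottom[0]:.0%} to {bottom[1]:.0%}")  # roughly 40% to 89%
```

With cohorts that small the intervals are wide enough to overlap, which is exactly why ranking departments on the point estimates is so misleading.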

[Figure: French overall satisfaction with confidence intervals (NSS, 2010)]

I think that using the NSS to compare institutions is like trying to identify the best cyclist in one of those cycling races where the whole peloton crosses the finishing line together. One cyclist will be first across the line, but is he or she the best? In today’s stage of the Tour de France it will be one person; next time this happens it will be someone else. Oh, and they all get given the same time anyway, so it really doesn’t make any difference.

3. No use for quality enhancement

Those with an eye for NSS history will know that the survey was inspired by the Australian Course Experience Questionnaire (CEQ). The CEQ was designed as a performance indicator for assessing the quality of teaching in higher education. It is not very useful for improving the student learning experience, and I am not entirely sure it was ever meant to be. It can identify departments or institutions with performance problems, but it does not contain any clues as to how teaching might be improved.

Conclusion: why the NSS's weaknesses help improve the student learning experience

So, in conclusion, the NSS is ambiguous, of little help to potential students, and of little use in improving the student learning experience. Ironically, though, the NSS is actually quite useful for quality enhancement: not because it is useful in itself, but because lecturers spend a lot of time thinking and talking about how to respond to it. Most of the time they are saying what a bad survey it is. That’s what makes the NSS successful. It gets people talking about teaching and learning. And because they talk and think, they make improvements. Perhaps the NSS isn’t so bad after all.

Drupal: overcoming problems with unexpected page errors... Views, CTools, Panels

I had been frustrated by problems with Drupal's Chaos Tools (CTools), Views, Update and Panels modules on YazikOpen. I kept getting "unexpected page errors". Today I eventually found a solution online which worked for me.

  1. Went into the file manager on my webhost and moved (did not delete) the CTools module out of the sites/all/modules folder.
  2. Disabled all contributed modules.
  3. Moved the CTools module back into the sites/all/modules folder.
  4. Views, Panels and Update now work!

Very pleased about this as I can actually use the Views module now!
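
For anyone with shell access to the webroot rather than just the host's file manager, steps 1 and 3 amount to moving a folder out of sites/all/modules and back again. Here is a minimal sketch in Python; the paths are assumptions based on a standard Drupal 7 layout, and step 2 still has to be done through the admin interface (or drush):

```python
import shutil
from pathlib import Path

# Assumed locations for a standard Drupal 7 install; adjust DRUPAL_ROOT for your host.
DRUPAL_ROOT = Path("/var/www/yazikopen")        # hypothetical webroot
MODULES = DRUPAL_ROOT / "sites" / "all" / "modules"
HOLDING = DRUPAL_ROOT / "modules_on_hold"       # anywhere outside the modules folder

def move_aside(module: str) -> None:
    """Step 1: move a module folder out of sites/all/modules without deleting it."""
    HOLDING.mkdir(exist_ok=True)
    shutil.move(str(MODULES / module), str(HOLDING / module))

def restore(module: str) -> None:
    """Step 3: move the module folder back into sites/all/modules."""
    shutil.move(str(HOLDING / module), str(MODULES / module))

if __name__ == "__main__":
    move_aside("ctools")
    input("Disable the contributed modules in the Drupal admin UI, then press Enter...")
    restore("ctools")
```

Moving the folder aside rather than deleting it means nothing is lost if the fix turns out not to work.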

Some questions that didn't make it onto the National Student Survey

Richardson, J., Slater, J. and Wilson, J. (2007) The National Student Survey: development, findings and implications. Studies in Higher Education 32(5), pp. 557-580:

This paper from 2007 reports on the piloting of the UK National Student Survey (NSS). The settled version of the questionnaire contains 22 questions, but the pilot contained 45 items. Interestingly, the pilot contained negative questions (where agreeing indicated a negative experience). Questions on workload were dropped due to a lack of internal consistency. These are some of the pilot questions that did not make it into the final survey:

  • To do well on this course you mainly need a good memory
  • It was clear what standard was required in assessed work.
  • It was clear what I was required to attend, prepare, and to do throughout the course.
  • I found the overall workload too heavy
  • I feel confident in the subject knowledge I acquired
  • I had as much contact with staff as I needed

Overall questions:

  • Overall I feel the course was a good investment
  • I would recommend the course to a friend

I wonder whether these would have been preferable to the "Overall, I am satisfied with the quality of my course" question, which tends to take precedence over all the others.

The second pilot contained the question “It has been difficult to answer many of the questions because of the variability of my experience” (interestingly, only 12% strongly agreed and 18.2% agreed).

Just thought it would be interesting to share the questions which did not make it.
