Category Archives: surveys

Coventry maintains Number 1 position on the NSS Canning list

I've just calculated the 2015 scores for the Canning list. The full data will be uploaded soon.

The top 10 institutions for average score are a mix of large and smaller providers, perhaps demonstrating that institutions of all sizes can score well (or badly). Among the larger providers (minimum 20 courses), Coventry maintains its number 1 position.
Top 10 institutions for National Student Survey (2015), minimum 10 courses

Institution | Number of courses | Mean WSSS
City College Plymouth | 11 | 1550.1
Coventry University | 51 | 1502.4
The University of Surrey | 36 | 1489.7
North Lindsey College | 11 | 1456.9
Doncaster College | 14 | 1446.0
The University of Leeds | 72 | 1432.9
The University of Keele | 34 | 1427.0
Bangor University | 44 | 1423.8
Solihull College | 13 | 1423.4
University of Ulster | 57 | 1421.0


Top 10 universities for National Student Survey (2015), minimum 20 courses

University | Number of courses | Mean WSSS
Coventry University | 51 | 1502.4
The University of Surrey | 36 | 1489.7
The University of Leeds | 72 | 1432.9
The University of Keele | 34 | 1427.0
Bangor University | 44 | 1423.8
University of Ulster | 57 | 1421.0
Loughborough University | 34 | 1419.1
Edge Hill University | 29 | 1415.0
Nottingham Trent University | 47 | 1410.8
Bath Spa University | 22 | 1410.4

The National Student Survey and the development of the 'Canning list'

The National Student Survey (NSS) is now in its tenth year. Vice-Chancellors set targets by it, newspapers and magazines use it to create league tables and university strategies are framed around it. Expressing an off-message opinion on the NSS cost former HEA research director Lee Harvey his job.  

It surprises me how little has actually been written about the NSS in peer-reviewed journals. By ‘how little’ I don’t mean little as in nothing, but relatively little considering the big part it plays in the life of the British university.

I’ve long been a critic of the NSS. When it comes to enhancing the quality of teaching and learning, the NSS is a bit like using a screwdriver to put a nail into the wall. It is possible to get a nail into the wall using a screwdriver, but you’re better off using a hammer. If you don’t have a hammer then you might go ahead and use the screwdriver, at the risk of a bent nail, a broken screwdriver or an injured hand.

However, perhaps I’ve been a bit mean to the NSS over these past few years. For all the emphasis on ‘overall satisfaction’ and on assessment and feedback, you might think that there were only three or four questions on the NSS. The NSS also asks questions about course organisation, library resources and skills development, but we don’t hear much about these. What if we actually paid attention to some of these questions?

I took up this challenge in developing a new system of measurement for the NSS. I reasoned that if we considered all the questions and understood their relative importance, we could bring about a system where each UK course at each university could have a ‘score’ (in my paper I call this the Weighted Student Satisfaction Score, or WSSS). However, a raw score of, say, 1400 doesn’t tell you what it means in relation to other scores, so the scores are then normalised so that an average course scores 100 (the Weighted Student Satisfaction Quotient, or WSSQ). Over time it will be possible to trace changes in both absolute and relative scores. The system is fully outlined in the article and takes into account subject differences as well as providing a bonus for good response rates.
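As a rough sketch of how such a normalisation can work (the exact scaling is defined in the article, not here; this assumes an IQ-style transformation with mean 100 and an illustrative standard deviation of 15):

```python
from statistics import mean, stdev

def wssq(scores):
    """Convert raw WSSS values into quotient scores averaging 100.

    Illustrative sketch only: the real WSSQ calculation is defined in
    the paper; the 15-point standard-deviation scaling used here
    (IQ-style) is an assumption, as is the sample of raw scores below.
    """
    m = mean(scores)
    sd = stdev(scores)
    # Courses above the cohort mean score above 100, below it under 100
    return [round(100 + 15 * (s - m) / sd, 1) for s in scores]

raw = [1400.0, 1500.0, 1300.0, 1450.0]
print(wssq(raw))  # quotients centred on 100, preserving the ranking
```

Whatever scaling is chosen, the key property is that relative position is preserved while the quotient becomes comparable across years.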

The full ‘Canning list’ (a term used by David Law in his editorial) is available from my website.

Here are the 10 best courses in the whole UK according to the 'Canning list'.

Overall rank | Institution | Subject | Type | WSSS | WSSQ
1 | Hugh Baird College | Fine Art | Other undergraduate | 1884.7 | 141.0
2 | Somerset College | Drama | Other undergraduate | 1877.6 | 140.5
3 | Grimsby Institute of Further and Higher Education | Imaginative writing | First degree | 1869.7 | 140.0
4 | Plymouth College of Art | Cinematics and photography | Other undergraduate | 1861.6 | 139.4
5 | Petroc | Design studies | Other undergraduate | 1843.1 | 138.2
6 | University of Cumbria | Fine art | First degree | 1836.8 | 137.7
7 | Northumbria University Newcastle | Architecture | Other undergraduate | 1836.0 | 137.7
8 | Imperial College London | Geology | First degree | 1833.9 | 137.5
9 | University of Glasgow | Social policy | Other undergraduate | 1828.5 | 137.2
10 | NCG | Marketing | Other undergraduate | 1817.4 | 136.4

A couple of subject ranking examples

Physics and Astronomy top 10.

Rank | Institution | Type | WSSS | WSSQ
1 | Lancaster University | First degree | 1685.1 | 127.4
2 | Nottingham Trent University | First degree | 1663.6 | 126.0
3 | University of Birmingham | First degree | 1571.6 | 119.7
4 | University of St Andrews | First degree | 1517.9 | 116.0
5 | Keele University | First degree | 1511.7 | 115.6
6 | University of Hull | First degree | 1510.9 | 115.6
7 | University of Bath | First degree | 1494.0 | 114.4
8 | University of Sheffield | First degree | 1471.3 | 112.9
9 | University of Sussex | First degree | 1456.7 | 111.9
10 | University of Hertfordshire | First degree | 1452.7 | 111.6

French

Rank | Institution | Type | WSSS | WSSQ
1 | Coventry University | First degree | 1686.9 | 127.5
2 | Queen's University Belfast | First degree | 1667.9 | 126.2
3 | Northumbria University Newcastle | First degree | 1595.6 | 121.3
4 | University of Southampton | First degree | 1595.1 | 121.3
5 | University of Exeter | First degree | 1547.5 | 118.1
6 | University of St Andrews | First degree | 1507.7 | 115.4
7 | Newcastle University | First degree | 1501.1 | 114.9
8 | University of Leicester | First degree | 1496.1 | 114.6
9 | Edinburgh Napier University | First degree | 1486.6 | 113.9
10 | King's College London | First degree | 1482.6 | 113.6

Links

Article in Perspectives

Full dataset


CFP: LLAS 9th e-learning symposium, 23-24 January 2014

The Future is Now! So come and tell us about it…

Do you make innovative use of technology in language teaching and learning? Have you been experimenting with MOOCs and wish to share your experiences? Do you use social networking sites, virtual worlds or mobile technology with your language students? Are you engaging students in the creation or use of open educational resources? If so, then the LLAS community would like to hear from you!

LLAS, the Centre for Languages, Linguistics and Area Studies, welcomes proposals for presentations, workshops and posters at the 9th annual e-learning symposium, on 23-24 January 2014. Abstracts for proposed presentations or workshops should be no more than 400 words.

Topics may include, but are not limited to, the use in language teaching or research of:

  • social networking sites
  • mobile technology
  • MOOCs and open learning
  • blogs or wikis
  • open educational resources
  • virtual worlds, such as Second Life
  • virtual learning environments
  • online tools or courses
  • innovative online learning designs or environments
  • autonomous learning
  • blended learning
  • social media, e.g. micro-blogging (e.g. Twitter)
  • student-generated digital content


Submissions deadline: Friday, 4th October, 2013

Submissions to: llas@soton.ac.uk, using the downloadable submission form

More information on the event is available on the LLAS website.


Unistats and the information cult.

Having been going on to anybody who would listen (and those who wouldn’t) about the Key Information Sets for the past year or so, I actually managed to forget that today was the launch of the new Unistats website. Once there was talk of the ‘information age’, but now we have an ‘information cult’. In the information cult, if there is enough information about things we can make good and right choices. Back in the 1980s there was an advert for a bank which parodied its rivals: each time a customer asked a question, the bank employee would reply, “Here is a leaflet about it”. The point about their bank was that they actually answered your questions in person.

With the internet we have a gigantic worldwide “leaflet about it”, whatever “it” is. With the right information we can apparently make choices about which school to send our children to, which hospital to have our operation at, what car insurance to buy and which company is the cheapest for electricity this week. The launch of the new Unistats has been receiving a lot of coverage, mostly negative, on the Times Higher website. The KIS contains information on salary, the percentage of assessment which is coursework and scores from the National Student Survey, among other things.

As Roger Brown pointed out some months back, this is actually a moral issue. The idea that this information empowers potential students to make reasoned choices is very troubling. And, as Adam Child is quoted as saying in the article, like anything which is measured, universities (and any other organisations) will focus on what is measured rather than making improvements which really matter. And where there are numbers there are league tables.

Some have suggested that choosing a university is becoming like buying car insurance a la Compare the Market. This is nonsense. You can change your car insurance company, you can move house, you can change your spouse and you can even change your bank (allegedly we are statistically more likely to change our spouse than our bank). The time and money expense of university means a wrong choice can be disastrous. Like a pawn move in chess, it is made forever. In one of his books, Scott Adams, creator of the Dilbert comics, talks about the confusopoly, an economic system sustained on the collective ability of service providers to confuse consumers with complex pricing structures, tariffs and performance measures. Perhaps that is what we are coming to here with universities.


Glossary, Websites and Further Reading: Student information and surveys

NSS: National Student Survey. UK survey of final-year undergraduates, conducted annually since 2005. Results are published at institutional and disciplinary level within institutions if a minimum threshold of 23 students and a 50% response rate is met. http://www.thestudentsurvey.com/
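The publication rule above can be expressed as a simple check (a sketch only; the figures are those quoted in this glossary entry, and the function name is my own):

```python
def publishable(respondents: int, cohort: int) -> bool:
    """Return True if NSS results may be published for a group.

    Implements the rule quoted above: at least 23 respondents
    and at least a 50% response rate. A sketch for illustration,
    not the official calculation.
    """
    if cohort <= 0:
        return False
    return respondents >= 23 and respondents / cohort >= 0.5

print(publishable(30, 55))  # True: 30 >= 23 and 30/55 is about 55%
print(publishable(20, 25))  # False: 80% response rate, but only 20 respondents
```

Note that both conditions must hold: a small course with an excellent response rate can still fall below the 23-student threshold.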

PTES (Postgraduate Taught Experience Survey) and PRES (Postgraduate Research Experience Survey): annual surveys of finishing taught and research postgraduate students run by the Higher Education Academy, though not every institution participates every year. Findings are confidential to the individual institutions, though overall reports are published. http://www.heacademy.ac.uk/student-experience-surveys

Key Information Set http://www.keyinformationsets.com/

“Key Information Sets (KIS) are comparable sets of information about full or part time undergraduate courses and are designed to meet the information needs of prospective students. From September 2012 all KIS information will be published on the Unistats web-site and will also be accessed via a small advert, or ‘widget’, on the course web pages of universities and colleges. Prospective students will be able to compare all the KIS data for each course with data for other courses on the Unistats web-site.” Source: HEFCE http://www.hefce.ac.uk/whatwedo/lt/publicinfo/kis/

Higher Education Academy http://www.heacademy.ac.uk/student-experience-surveys

Further reading

Canning, J. et al. (2011) Understanding the National Student Survey: Investigations in Languages, Linguistics and Area Studies. Southampton: Subject Centre for Languages, Linguistics and Area Studies. Available from: http://eprints.soton.ac.uk/197699/

Child, A. (2011) The perception of academic staff in traditional universities towards the National Student Survey: views on its role as a tool for enhancement. MA Dissertation, Department of Education, University of York. Available from: http://etheses.whiterose.ac.uk/2424/1/Final_Thesis_Version.pdf

*Maringe, F. (2006). ‘University and Course Choice: Implications for Positioning, Recruitment and Marketing’. International Journal of Educational Management 20, 466–479.

Ramsden, P. et al. (2010) Enhancing and Developing the National Student Survey. London: Institute of Education. Available from: http://www.hefce.ac.uk/media/hefce/content/pubs/2010/rd1210/rd12_10a.pdf

Renfrew, K, et al. (2010) Understanding the Information Needs of Users of Public Information About Higher Education. Manchester: Oakleigh. Available from: http://www.hefce.ac.uk/media/hefce/content/pubs/2010/rd1210/rd12_10b.pdf

*Richardson, J .T. E. et al. (2007) The National Student Survey: development, findings and implications. Studies in Higher Education 32, 557-580.

*Richardson, J.T.E. (2005). Instruments for obtaining student feedback: a review of the literature. Assessment and Evaluation in Higher Education 30, 387-415

Surridge, P. (2009) The National Student Survey three years on: What have we learned? York: Higher Education Academy. Available from: http://www.heacademy.ac.uk/assets/documents/research/surveys/nss/NSS_three_years_on_surridge_02.06.09.pdf

Williams, J. et al. (2008) Exploring the National Student Survey: Assessment and Feedback Issues. York: Higher Education Academy. Available from: http://www.heacademy.ac.uk/assets/documents/nss/NSS_assessment_and_feedback_issues.pdf

*Subscriptions may be required. Other items are open access.

I have made a Word version of this list available in HumBox.


Some questions that didn't make it onto the National Student Survey

Richardson, J., Slater, J. and Wilson, J. (2007) The National Student Survey: development, findings and implications. Studies in Higher Education 32.5, pp. 557-580:

This paper from 2007 reports on the piloting of the UK National Student Survey (NSS). The settled version of the questionnaire contains 22 questions, but the pilot contained 45 items. Interestingly, the pilot contained negatively worded questions (where agreeing indicated a negative experience). Questions on workload were dropped due to a lack of internal consistency.

  • To do well on this course you mainly need a good memory.
  • It was clear what standard was required in assessed work.
  • It was clear what I was required to attend, prepare, and to do throughout the course.
  • I found the overall workload too heavy.
  • I feel confident in the subject knowledge I acquired.
  • I had as much contact with staff as I needed.

Overall questions:

  • Overall I feel the course was a good investment.
  • I would recommend the course to a friend.

I wonder whether these would have been preferable to the "Overall, I am satisfied with the quality of my course" question, which tends to take precedence over all others.

The second pilot contained the question “It has been difficult to answer many of the questions because of the variability of my experience” (interestingly, only 12% strongly agreed and 18.2% agreed).

I just thought it would be interesting to share the questions which did not make it.


In (sort of) defence of ratemyprofessors.com

A few weeks back the Times Higher published an article on student survey fatigue. Students fill in so many surveys, including the National Student Survey, module surveys and institution-wide surveys (coincidentally, some of the questions on the latter surveys are similar or identical to those on the NSS). I suggested on Twitter that the ineffectiveness of current surveys means that we want to do more surveys to fix the shortcomings of the existing surveys in order to give a ‘truer’ picture. Much of the reason that we are considering a national survey of language students stems from the shortcomings of the NSS outlined in our recent report. Should this new national survey go ahead, I’m sure that it too will have its own shortcomings.

Legg and Wilson’s recent paper in Assessment and Evaluation in Higher Education on the reliability of www.ratemyprofessors.com vis-à-vis in-course evaluations is interesting in its own right (their title "RateMyProfessors.com offers biased evaluations" deserves an award for clarity). But at least one thing could be said in defence of www.ratemyprofessors.com: it is unambiguous about who is the target of the evaluation, namely the teacher. With most other surveys it is unclear whether the students are being asked about the course, the teacher, the content, the university or the programme of study. The students don't know when they are answering the questions and we don't know when we are analysing their answers.

See also: 4 ways to Avoid Survey Fatigue in Higher Education

