Course evaluations yield low response rates, mixed emotions
Kenton Bird, director of the School of Journalism and Mass Media at the University of Idaho, said he’s seen and read through numerous course evaluations over the years and admits low student response rates pose a problem for the university.
He tends to view evaluations with a low response rate with some hesitation.
“When 10 out of 27 students respond online, it’s below 50 percent,” Bird said. “You have to be a little more skeptical of the validity of the responses. Particularly with the comments, you start to see patterns where multiple students mention some of the things that they liked or disliked.”
At the end of every semester, course evaluations open up on VandalWeb and students are encouraged to submit feedback on their classes and their instructors’ teaching abilities.
According to the Faculty Staff Handbook, student evaluations have two purposes — to assist individual instructors in improving their own teaching and to assist academic administrators in counseling instructors about their teaching. Additionally, evaluations are weighed as a factor in tenure, promotion and salary determinations.
Patricia Hartzell, Teaching and Advising committee chairwoman, said student response rates for course evaluations are important in reviewing evaluations because a higher number increases the validity of the feedback. She said response rates at UI range anywhere from 30-100 percent.
Bird said he sees a low response rate for a particular course as an opportunity to counsel the faculty member and work with him or her on strategies to improve, not as a means for handing down punishment.
“If I see students consistently are concerned about the clarity of the assignments or things being on the exam that weren’t necessarily covered in class, I may suggest to the instructor to take a look at their syllabus,” Bird said.
A different approach
After changing from a written paper evaluation process to the current web-based paperless evaluation in 2002, UI saw a dramatic decrease in student participation.
The UI College of Law’s average response rate for course evaluations hovers in the 70-100 percent range, according to Richard Seamon, associate dean for Faculty Affairs of the College of Law.
“The last time we did an average, we came up with something that was around 90 percent,” Seamon said. “The reason, I think, is that these are not online evaluations, we actually distribute hard copies.”
The College of Law uses a simple hardcopy evaluation form, with a few broad questions about the instructor’s presentation and knowledge of the material, as well as a space for suggestions for future classes and a comment section at the end.
Like Bird and Hartzell, Seamon said a low response rate may not be representative of the class. Conversely, a 90 percent response rate gives administrators confidence that they are hearing the majority view of students.
Seamon said the College of Law has not switched to an online version because of concern it would lower the response rate. He also said many law faculty believe the narrative, open-ended questions are more informative than numerical feedback. Still, Seamon said some faculty think a mixture of online and offline evaluations would be ideal.
Another option for both students and administrators is accessing course feedback from sites like ratemyprofessor.com and ratemyteachers.com, which are designed to let students evaluate college professors anonymously and help other students plan their course schedules.
Unknown to most, UI students have the ability to access numerical summaries from course evaluations about any professor on campus through the Institutional Research and Assessment office.
The numerical summaries are not displayed publicly, and an Access Request Form must be submitted to the office to obtain them. The process does not require a public records request.
Dean of the College of Engineering Larry Stauffer said he has no problem with displaying the numerical summaries publicly for students to see. He compared displaying the summaries to the contemporary way of buying items online.
“Anytime you go on Amazon, you can read the reviews,” Stauffer said. “When you get a recipe to cook something, you see what other people say. Why would we exclude student feedback from a course?”
Bird said he thinks displaying the summaries without the comments would lack context because the summaries would only show the average, not qualitative evaluations of the instructors. He said the responses would likely be uncharacteristic of the larger population because of statistical outliers.
“If you’ve got somebody reading the responses and you’ve got threes and some fours, and then all of a sudden you get zeroes and ones from a student or two,” Bird said, “a social scientist would disregard those responses, but here they get lumped in.”
Seamon said publishing summaries is a subject of controversy within faculties on campus. He said some professors support the idea because it promotes openness and transparency, while others fear public numerical summaries could encourage teachers to try to be popular instead of effective.
“Sometimes you want to have really high expectations for students, and some students can resist that a little bit — especially if they feel they’re pushed to work harder than they really want to do,” Seamon said.
Seamon said the argument in favor of publishing summaries is that students should have the ability to make an informed decision about what courses they take and what professors they select.
Jennifer Johnson-Leung, assistant professor in the Department of Mathematics, said she thinks publishing the numerical summaries is a terrible idea.
She said there have been studies showing that students’ snap judgments and biases play a big factor in the feedback they provide, and that those judgments and biases often don’t change from the first day of class.
“What you wear on the first day of class, what your accent is and your gender makes a huge difference in student evaluations scores,” Johnson-Leung said. “These things are measured very well in student evaluations — learning outcomes aren’t.”
Hartzell said the numerical scores students give their instructors don’t always provide a clear picture of whether someone is a good instructor. She said in some cases, it is more of a popularity contest and doesn’t effectively show an instructor’s strengths or weaknesses.
“This is why we changed it from a student evaluation of a course to a student feedback form,” Hartzell said. “I would argue if I were to look back on my education, the people that I gave high scores were not always the people that helped me learn the most.”
UI professors weigh in
In regard to how evaluations are utilized on the UI campus, Hartzell said she believes students can be unkind when they are given anonymity. Despite this, she said she thinks evaluations are a powerful tool for giving a faculty member direction on how to improve and change courses.
Director of General Education and UI professor Rodney Frey said he sometimes questions the effectiveness of evaluations because they are used as a factor in promotion and tenure.
He said he finds it interesting that course evaluations are left to the students and he wonders why UI doesn’t have peer evaluations — where one instructor evaluates another.
Hartzell said she feels many instructors — both senior and junior — don’t want to get marked down on their evaluations because it is a factor in tenure, promotion and merit. Hartzell fears some instructors may hold back on doing something in the classroom that can be perceived as too difficult or pushes the boundary too much. Hartzell said if that’s the case, it is a disservice to students.
Hartzell said she worries the system, at UI and beyond, may encourage easier classes while failing, in the long run, to maintain the quality of teaching the current job market requires. She said she would like to see a shift in the balance between how the evaluations are used and how instructors work in the classroom.
“They get out of the university with their degree in some discipline, but they can’t hold a job in that discipline,” Hartzell said. “They learned a little bit less, they weren’t pushed hard to achieve. We get really smart kids here, and we need to push and push, so when they leave the students are well prepared.”
Patrick Hanlon can be reached at email@example.com or on Twitter @pathanlonID