Student Evaluation Season

What Can We Do With Student Evaluations?

The 2019 Spring Course Evaluation window opened last week. Gulp.

Most of us have mixed feelings about course evaluations. On one hand, there’s value in surveying students about their perceptions of our teaching. For many of us, evaluations are our main source of information about students’ experiences in our courses. Students have a perspective on our teaching that no one else can have. As Maryellen Weimer points out, “the front of the room looks different when viewed from the desk.” Since our students’ learning is our goal, it’s essential for us to find out how effectively we’re reaching them.

On the other hand, there are plenty of shortcomings in the current process. Supiano and Berrett summarize three important objections faculty raise: Research suggests that course ratings are prone to bias, especially against faculty of color; students are unlikely to have the perspective to evaluate matters such as a faculty member’s knowledge of the field; and “these surveys are often used in tenure and promotion decisions and as proxies for really evaluating teaching and learning.”

Since the instrument is flawed but student feedback is essential, here are several ways we can take ownership of the process:

  • Seeking the feedback you need. It’s important to talk to your students about the survey, explaining its purpose and how you intend to use their feedback to improve your course. You can also ask them to comment on specific aspects of the course or your teaching. We like to ask students what helped them learn, or what would have helped them learn better. You may even want to create and distribute your own survey, one tailored to your course and designed to collect the feedback you value most and can actually use to improve your teaching.
  • Encouraging high response rates. The more student responses we get to these surveys, the better the quantitative and qualitative data we have to analyze, so it’s helpful to monitor response rates while the evaluation window is open, and remind students to complete the survey. You can also offer a bit of class time. Even with the electronic version, you can ask them to take out their phones and navigate to the survey. Then, you can leave the room so they can complete it.
  • Interpreting the results carefully. Since finding out how students responded to us, our teaching practices, course content, and the like can be an emotional, sometimes overwhelming experience, we recommend systematically organizing and analyzing the results, much as we would do with other types of data. First, remove irrelevant comments (like students’ opinions on your attire). Next, you can sort the comments into two categories (e.g., strengths and weaknesses), grouping together those that say nearly the same thing, and noting the most frequently made comment(s). You can also look to the written comments for insights on the responses to the Likert-scale questions. Syracuse University offers additional approaches to organizing and analyzing student comments in its guide to Interpreting and Using Student Ratings of Teaching Effectiveness.
  • Responding by making adjustments to our courses. When the dust settles, you can reflect on the semester yourself. How do your perceptions compare to students’? What concrete changes could you make to your course based on themes you found in the survey data? If we can give students specific examples of how we’ve adjusted our courses in the past, they’ll take our requests for genuine feedback more seriously.

If you’d like support interpreting and responding to student evaluations, we can help with everything from providing a safe space to vent about biased comments to working with you on a major revision to a course. Please get in touch; we look forward to hearing from you.