They’re boring, they take ages, and the evidence suggests they’re sexist. Is it time to ditch them?
However much exam stress plagues students, one thing is certain at the end of each semester: the untimely arrival of the Course Experience Survey (CES).
“I think it’s important, but I just didn’t have time,” said Connor Yee, a second-year biology student who didn’t complete the CES for any of his five classes this semester.
This is not an uncommon sentiment among UVic students: data collected by the Learning and Teaching Centre (LTC) from 2014 to 2017 shows an average CES response rate of approximately 44 per cent.
For something that fewer than half of students complete, the CES can carry substantial weight for some professors. For teaching professors and regular faculty members, the quantitative (multiple-choice) results are viewed by the professor and department chair, and are also a component in consideration for tenure, promotion, and merit. Written comments are viewed only by the professor, who can share them in the same way as the quantitative data, though not selectively.
Sessional instructors have both written and quantitative comments provided to them and their department chair. Deans and department chairs are made privy to the composite quantitative results of their faculty or department, and a “CES University Roll-up Report” is posted publicly on the LTC website for each term.
“Student feedback is crucial in evaluating teaching—it reflects students’ first-person accounts of teaching and of the course, it informs us about the learning environment that is created in an online or virtual classroom, and it gives some indication about the quality of teaching,” wrote Laurene Sheilds, acting executive director of the Division of Learning and Teaching Support and Innovation, the administrative unit responsible for the CES.
Students are typically given two weeks towards the end of the semester to complete the CES. This past semester, the survey was open from Nov. 17 to Dec. 1, the last day of classes. This timing has proven problematic for students who are eager to give feedback on their classes but are swamped with end-of-term assignments.
“I was a bit busy last week, trying to finish my final papers and stuff, so I didn’t get to finish all of them, [but] I try to prioritize which courses I need to review,” said Salena Dhillon, a second-year business student who completed two out of five surveys. “One of my profs was really good, so I definitely wanted to make them more noticeable, and the other one was really bad.”
To ensure that CES results don’t influence a professor’s grading, survey results are made available to instructors only after all grades have been processed and submitted.
“I wish we had better means of incentivizing students to fill them in, but it’s hard to imagine any such system that would be non-controversial,” wrote Simon Devereaux, an associate professor in History, in an email to the Martlet. “I sometimes worry that the response rate may be along the lines of Rate My Professor — that is, a sample that is heavily driven by the students who feel most positively (or negatively) compelled to reply, rather than the student body overall. All that said, however, one takes what one can get.”
Reuben Rose-Redwood, an associate professor in the Geography department, raised concerns about a more systemic problem: the repercussions of relying on student evaluations as a basis for awarding tenure and promotion.
“There are serious flaws with using CES as a measure of teaching quality, because there have been numerous studies suggesting that there are often systematic gender biases in course evaluation results,” Rose-Redwood said.
A 2014 study from the University of California at Berkeley, which looked at student course evaluations from universities in the U.S. and France, found that student evaluations of professors are “biased against female instructors by an amount that is large and statistically significant,” and that this bias impacts student assessment of “even putatively objective aspects of teaching, such as how promptly assignments are graded.”
However, Devereaux noted that the CES allows professors to engage with students in ways that may otherwise be inaccessible due to the constraints of the course. “I often don’t have enough one-on-one interactions with students for me to form an impression of what’s working and what isn’t. The CES reports can help a lot with this.
“I’ve always been (so far, at any rate!) pleasantly surprised at how generous most students are in their numerical scoring of my courses,” Devereaux wrote. “The most useful comments are the ones that are specific—noting a particular issue of an assignment, lecture, tutorial or grading practice that the student finds either troubling or helpful.”
Generally, professors said that written comments give them the most insight into students’ perceptions of a class. “In my experience, most students offer constructive feedback for improving courses. However, some students don’t complete the CES at all and others use it as an opportunity to anonymously rant against their professors,” said Rose-Redwood. “One year, a student said I should let my sideburns grow out!”
According to Susan Doyle, an assistant teaching professor in English, some of the comments submitted by students have had distressing repercussions for professors. “Often, there is one comment that is so negative and hurtful that it overshadows all the positive comments,” wrote Doyle in an email. “I have heard stories of instructors who became depressed for months over a single nasty comment.
“I don’t think students put much faith in the CES, and instructors, rightly, feel that the scores are not a reliable indicator of instructional quality,” added Doyle. “I have always felt that students find it easier to give higher [CES] scores, perhaps because it spares them having to justify a lower score.”
With such a reportedly flawed system, is it worth it for students to take time out of their busy end-of-semester schedules to prioritize the CES?
“It’s easy for me to say ‘yes’ — it’s not my time that’s being used up in filling in the CES, and I have more inherent interest in the future character of any given course than might any student who has just finished taking it,” responded Devereaux. “Basically, though, I would hope that our students would be public-spirited enough to take the time to do something helpful for their instructors and for the instructors’ future students.”
Sheilds acknowledged that there is room for improvement within the current CES system, but stood by the overall value of student feedback. “With the CES occurring at the end of a course — it means that students don’t always see the changes that instructors may make in their courses,” said Sheilds. “Their feedback is important to us, as instructors, as departments, and as an institution—to continue to enhance or improve courses — and ultimately student learning.”