Why the Benefits of Course Evaluations' Online Evaluations Outweigh the Disadvantage of Lower Response Rates


Peter Gold - University at Buffalo, College of Arts and Sciences

Notes on response rate

Course Evaluations provides timely diagnosis and feedback to faculty about their teaching and syllabi. Administrators find the evaluation data useful for identifying exceptional courses as well as teaching that needs further attention. When evaluations become both voluntary and online, participation drops if nothing else in the administration protocol changes. With preparation and encouragement, response rates rise. Careful analysis shows that response rates generally have little effect on the meaning and utility of student ratings: changing response rates do not change evaluation scores or comments because the distribution of responses does not change.
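As a minimal sketch of why this is so: if students respond at random from a fixed distribution of opinions, the observed mean rating barely moves as the response rate falls; only its sampling noise grows. The 200-student population, 1-5 scale, and rating distribution below are assumptions made for illustration, not Buffalo data.

```python
import random
random.seed(1)

# Hypothetical population of 200 students whose ratings on a 1-5 scale
# follow a fixed distribution (assumed for illustration).
population = random.choices([1, 2, 3, 4, 5], weights=[2, 5, 15, 40, 38], k=200)

def mean_at(rate):
    """Mean rating from a random subset responding at the given rate."""
    n = round(len(population) * rate)
    subset = random.sample(population, n)
    return sum(subset) / n

# Repeat the draw to see how much the observed mean moves at each rate.
for rate in (0.70, 0.43, 0.27):
    means = [mean_at(rate) for _ in range(1000)]
    avg = sum(means) / len(means)
    sd = (sum((m - avg) ** 2 for m in means) / len(means)) ** 0.5
    print(f"{rate:.0%} response: mean ~ {avg:.2f}, spread (sd) ~ {sd:.2f}")
```

Runs like this show the mean essentially fixed at every rate, with only a slightly wider spread at 27% than at 70%: lower response rates add noise, not bias, so long as the respondent pool is not systematically different.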


Faculty who encourage and expect students to evaluate their courses get higher response rates, as do faculty who tell students why participation matters and how evaluations from earlier students have improved their teaching. Anecdotally, faculty report regaining their former response rates over several semesters. Deans and chairs improve student response rates when they make it clear to both faculty and students that they value student input.


Long, complicated surveys hurt participation and rarely yield more helpful evaluations than shorter formats, because student data are impressionistic and imprecise. Surveys need both a multiple-choice format and a long-answer format. Reports to faculty that are long and complicated discourage engagement in the process. However, surveys are best left unchanged during the transition to online evaluation.


Response rates may start low but grow over time. Compare the response rates when in-class evaluations began with current participation to predict how online response rates are likely to grow.


Response rates may vary because of campus traditions, practices, and expectations. Course Evaluations offers options to match campus practices.

Weighing the advantages of online evaluation

  • Compare in-class and online student ratings for the same courses and teachers regardless of response rate. In a direct comparison of evaluations there was no systematic change in ratings, and changes attributable to response rates were small and possibly confounded by other effects. Individual ratings did not change with the transition to Course Evaluations.
  • Compare reports to faculty and note how quickly faculty may view and download their evaluation reports. After Course Evaluations closes to students, evaluation reports may be released to faculty in minutes rather than months. Reports are attractive, customized, and easy to read, and are designed to improve faculty attention and outcomes. Written comments are presented as an integral part of the survey report.
  • Compare the quantity and quality of the survey long answers. Online written comments are typically more detailed and more focused on teaching and learning, which makes them more useful to faculty. Faculty also read the comments sooner because they are included in the report and require no administrative handling.
  • Compare the rates of survey completion with the rates of student participation (defined as accessing the evaluation website). When the response rate is counted as the number of unique students who log into Course Evaluations, the rate of participation is higher than the percentage of surveys those students complete; a sketch of this distinction follows the list. Voluntary online surveys are selectively completed and returned, while in-class evaluations may be blank or incomplete.
  • Compare the safety and security of data archives for both surveys and long answers. Course Evaluations secures the files to prevent tampering. Accessibility and anonymity are enhanced, while op-scan and other paper formats usually require the forms and written comments to pass through many hands and file drawers.
  • Compare the flexibility of survey processes that allow different course formats, schedules, faculty, and disciplines to join a common process without high administrative costs. Course Evaluations accommodates the most complex academic arrangements.
  • Compare the cost of administrative support, training, classroom time, supplies, and the labor required to produce paper reports. A Course Evaluations license - which includes unlimited use, training, support, and upgrades - also reduces the time required of faculty, administrators, IT and IR staff, and assistants. Course Evaluations is surprisingly economical - in many schools it costs less than previous arrangements and requires fewer personnel because it makes optimal use of existing campus records. The 'data export' feature allows quick benchmarking and analysis of large data sets.
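As a sketch of the completion-versus-participation distinction raised above (and of the kind of analysis the 'data export' feature enables), the following computes both rates from a hypothetical CSV export. The file layout and the student_token and status column names are invented for illustration, not the product's actual schema.

```python
import csv

def rates(export_path, surveys_expected, students_enrolled):
    """Completion  = surveys submitted / surveys expected.
    Participation = unique students who logged in / students enrolled."""
    submitted = 0
    participants = set()
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):               # one row per survey event
            participants.add(row["student_token"])  # anonymized login token
            if row["status"] == "submitted":
                submitted += 1
    return submitted / surveys_expected, len(participants) / students_enrolled
```

With the case-study figures below, for example, 31,100 submitted surveys against the roughly 72,000 expected that the table's figures imply yields the reported 43% completion rate, even though 59% of students opened at least one survey.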

A Selection of Features in Course Evaluations

Course Evaluations …

  • Provides sophisticated comparisons, benchmarks, standardization where required, and an accessible and welcoming format to students and faculty.
  • Provides better and more accessible reports for those who have access on your campus - faculty, chairs, deans, directors, students.
  • Provides security and anonymity at a higher level than op-scan/in-class protocols. Sophisticated record keeping and logs provide the campus and administrator with excellent insight into security, student participation and curriculum.
  • For example, although the evaluations are anonymous, it is possible to compare respondents with non-respondents. Students with higher GPAs and higher grades are more likely to be the core respondents.
  • For example, it is possible to examine the relationship between disciplines, class sizes, courses, and grades with evaluation ratings.
  • For example, reminders to students may be personalized through a campus portal or Course Management System. Reminders to faculty may be customized to include an up-to-date list of non-respondents while the surveys remain, as always, anonymous; a schema sketch of how participation can be tracked separately from responses follows this list.
  • Provides options for collecting assessment data.
  • Is efficient and evaluates courses regardless of format and at multiple campuses.
  • May also be easily used for assessment, elections, non-course surveys, non-anonymous surveys, and other special instructional settings.
  • Is customized for your campus requirements.
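The non-respondent reminders and respondent/non-respondent comparisons mentioned in the list can coexist with anonymity when participation is logged separately from answers. Below is a minimal sketch of one such scheme with invented table and column names; it is not Course Evaluations' actual design.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE enrollment    (student_id TEXT, course_id TEXT);
CREATE TABLE participation (student_id TEXT, course_id TEXT);
CREATE TABLE responses     (course_id TEXT, rating INTEGER, comment TEXT);
""")
db.executemany("INSERT INTO enrollment VALUES (?, ?)",
               [("s1", "ENG101"), ("s2", "ENG101"), ("s3", "ENG101")])

def submit(student_id, course_id, rating, comment):
    # Two unlinked writes: the response row carries no student_id,
    # so answers cannot be traced back to an individual.
    db.execute("INSERT INTO participation VALUES (?, ?)", (student_id, course_id))
    db.execute("INSERT INTO responses VALUES (?, ?, ?)", (course_id, rating, comment))

submit("s1", "ENG101", 5, "Great discussions.")

# Reminder list for faculty: enrolled students with no participation record.
non_respondents = db.execute("""
    SELECT e.student_id FROM enrollment e
    LEFT JOIN participation p
           ON p.student_id = e.student_id AND p.course_id = e.course_id
    WHERE e.course_id = 'ENG101' AND p.student_id IS NULL
""").fetchall()
print([s for (s,) in non_respondents])  # -> ['s2', 's3']
```

Because the participation table records who responded but never what they said, it can also be joined against grade records to compare the GPAs of respondents and non-respondents, as noted above, without linking any student to a response.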

Comparing In-class Evaluations with Course Evaluations' Online Evaluations: A Case Study from the University at Buffalo, College of Arts and Sciences

A singular focus on response rates often leaves many other considerations unaddressed, including the evidence for usefulness, validity, and reliability despite lower response rates. The following table summarizes course evaluation protocols based on 15 consecutive semesters of course evaluations - 8 in class and then 7 online. In-class evaluations were administered on preprinted bubble sheets with spaces for (narrative) long answers. Online evaluation questions and forms were unchanged.


Course evaluations are used by: faculty for feedback at the end of the semester; deans and chairs for promotion and tenure dossiers; chairs, deans, and the provost for program reviews; and students for selecting courses.


Roll Out: The transition from in-class to online surveys (for faculty, administrators, and students) required:

  • continuing responsive communication with all participants throughout;
  • limiting changes in questions and formats;
  • benchmarking key indicators (avoiding 'assertions' and 'impressions');
  • involving campus leadership at several levels;
  • building interest through meetings and incentives;
  • using as many different notifications and reminders as possible;
  • reminding everyone of schedules and deadlines;
  • seeking the involvement and public support of faculty and administrators, including the provost, deans, chairs, IT, and IR;
  • using an incentive program;
  • keeping all participants informed about survey progress.




|                                                                                            | In-class      | Online        |
|--------------------------------------------------------------------------------------------|---------------|---------------|
| Response Rates                                                                             |               | lower         |
| % surveys completed & % students completing some or all online (F03 in-class & F04 online) | 70%           | 43% & 59%     |
| (S00 in-class & S07 online)                                                                | 27%           | 27% & 42%     |
| # surveys submitted (F03 in-class & F04 online; compares delivery method)                  | 33,400        | 31,100        |
| # surveys submitted (S00 in-class & S07 online; compares development over time)            | 19,400        | 21,028        |
| Benchmarked ratings (same instructors teaching same courses and key courses)               | benchmark     | unchanged     |
| % evaluated of active sections ('faculty response rate')                                   | 77%           | 100%          |
| Encouraging voluntary participation (penalties, incentives, and reminders)                 | none          | some          |
| Survey period (open to students for completion)                                            | 15 minutes    | 30 days       |
| Quality of long-answer (written) responses                                                 | very brief    | very good     |
| Time and costs                                                                             |               | less time & $ |
| Time for manager to prepare output for distribution                                        | 25 hours      | 8 hours       |
| Time to prepare survey by administrators in each of 31 departments of CAS                  | 4 hrs × 31    | none          |
| Time for each notification and reminder of students (18,000) and faculty (1,300)           | N/A           | 1 hour        |
| Time to manage database, server, course management system (IT and IR staff)                | days          | hours         |
| Time after close of survey to make reports available to faculty (manager & TA)             | 2 to 3 months | 1 day         |
| Time to prepare and deliver archival and summary reports to chairs (manager & TA)          | 4 months      | 2 weeks       |
| Cost of materials (bubble sheets, copies, CD files, supplies)                              | $8,000        | $100          |
| Cost of custom evaluation program vs. annual license fee and Verisign certificate          | $20,000       | $8,000        |
| Preparing and modifying survey questions, formats                                          |               | easy          |
| Adding department-specific, assessment-specific, course-specific, or non-course questions  | moderate      | easy          |
| Modifying presentation, format, faculty reports & comparisons                              | difficult     | easy          |
| User access, archives, security of data                                                    |               | high          |
| Protection and security (including long-answer 'narrative' responses)                      | on paper      | archive       |
| Anonymity of student responses ("Are responders and their responses linked?")              | handwriting   | protected     |
| Archive of prior evaluations                                                               | paper files   | online        |
| Correctable data errors and security features                                              | problematic   | few errors    |
| Student complaints ("too many reminders & surveys, anonymity, can't log in")               | few           | few           |
| Faculty complaints ("time, responsiveness, incentives, security")                          | few           | few           |
| Benchmarking key comparisons and utilizing campus data sources                             | difficult     | easy          |
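The 'benchmarked ratings' row above reflects a matched comparison: the same instructors teaching the same courses before and after the transition. Below is a minimal sketch of that computation, with invented ratings rather than the Buffalo data.

```python
# (course, last in-class mean, first online mean) - illustrative values only
pairs = [
    ("ENG101", 4.1, 4.0),
    ("HIS202", 3.8, 3.9),
    ("PSY150", 4.4, 4.4),
]
diffs = [online - in_class for _, in_class, online in pairs]
mean_diff = sum(diffs) / len(diffs)
print(f"mean change across matched sections: {mean_diff:+.2f}")
```

A mean change near zero across many matched sections is what the table's 'unchanged' entry summarizes; any residual differences were small and, as noted earlier, possibly confounded with other effects.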