From the CRLT Blog

Not just the same old drill: Student-authored test questions improve critical thinking

November 30, 2015

Faculty frequently name critical thinking as one of the most important goals for student learning. However, a key challenge to cultivating critical thinking is developing sufficiently complex assessments. This can be especially difficult in large classes, where many tests and quizzes are in a multiple-choice format.

In a recent study published in the Journal of Dental Education, a team of U-M faculty from the School of Dentistry (Carlos Gonzalez-Cabezas and Margherita Fontana), School of Public Health (Olivia Anderson) and the Center for Research on Learning and Teaching (Mary Wright) investigated a new approach to mitigate these challenges. This student-centered approach to testing asks students to work in teams to design their own multiple-choice questions.

In the study, students in an introductory Dentistry course (Cariology) were asked to work in groups to write higher-level thinking exam questions on various units of the class. Students were asked to avoid writing questions at the lower levels of Bloom’s Taxonomy (i.e., the abilities to remember and understand). Instead, they were encouraged to develop multiple-choice items that would test higher-order thinking skills (e.g., the abilities to evaluate and synthesize information). Student teams were given extra credit if their questions met that goal. All questions were posted on a Google Doc for the whole class to comment on and study. Instructors then used these questions on exams, with minor modifications, to encourage students to move beyond a surface-level understanding of the material.

To research the impact on student learning, the authors compared the cognitive level of the exam questions written by students to those written by instructors for a previous year’s class. They also compared student performance on the two types of exams, student-written versus instructor-written. Student experience with the exercise was assessed through a survey with a 70% response rate. Finally, the researchers compared the backgrounds of the two groups of students taking the course. College GPA, generally found to be the best predictor of dental school performance, was equivalent for the two groups.

Overall, students wrote higher cognitive level questions than the instructors did. Of the student-generated exam items, over two-fifths (42.2%) were found to assess high-level cognitive skills. In comparison, a smaller proportion (15.6%) of the instructor-authored items measured skills such as evidence-based decision making, synthesis, and evaluation.

Additionally, students performed as well or better on these more complex assessments. Those taking a student-generated test scored higher on the midterm (87% vs. 82%) and on the final (89% vs. 83%) than those taking an instructor-generated exam. The difference on the midterm was statistically significant.

In a survey about their experience with the student-generated exam exercise, over three-quarters (77%) of students reported that it helped them on exams, and a similar percentage (73%) reported that the assignment enhanced their critical thinking skills. One student noted, “I really learned well having to critically analyze the material myself in order to make questions.” Another remarked, “This exercise forced me to evaluate questions and review why they were right and wrong, instead of just taking a test…”

Why were these students more successful at creating higher-level exam questions than the expert instructors? The authors can only speculate, but it could be that the conditions under which the students wrote the questions (collaborative generation, with guidance and feedback) were very different from conditions in which many faculty typically write exam questions. Moreover, experts’ difficulty with articulating tacit knowledge is well documented, so it may be that novices are better able to identify new applications and evaluative perspectives for concepts.

For more information about the student-generated testing approach, see:

Gonzalez-Cabezas, C., Anderson, O. S., Wright, M. C., & Fontana, M. (2015). Exam questions developed by students lead to higher cognitive level of learning. Journal of Dental Education, 79(11), 1295-1304.