In intensive clinical courses, dental students frequently request study guides to help them organize and digest the deluge of content. Margherita Fontana and Carlos González-Cabezas, School of Dentistry, crowdsource this task via Google Docs as a learning activity that prepares students for exams. They assign groups of 10-15 students to each of ten major content areas.
Groups create their own Google Docs and work together to write the best possible exam questions (two per student) aligned with the learning objectives in the syllabus. To earn credit, questions must go beyond regurgitation of facts and require evidence-based application of key concepts. The instructors provide a few questions as models. Groups share their Google Docs with the instructors, who provide feedback. After students revise their questions, the instructors compile them in a new Google Doc shared with the entire class.
To motivate students, Fontana and González-Cabezas promise that if the questions meet the desired criteria, the majority of the exam will come from this pool (or from slightly edited versions of the questions). If the students' submissions do not cover the learning objectives, however, the instructors promise to write their own challenging exam questions on those topics. Overall, this approach fosters higher-order learning while also building a pool of potential exam questions for both current and future courses.
Does this strategy lead to higher levels of learning?
Fontana and González-Cabezas worked with CRLT to evaluate the impact of this teaching strategy. They found that:
- The student group-generated exams contained higher-level questions (on Bloom's taxonomy) than the instructor-generated exams.
- Students scored higher on the student group-generated exams.
- Did students do better because of a repetition effect (i.e., seeing the same questions earlier) or because of some other feature of the exercise? Ideally, this could be tested in the future using exams that include some student-generated questions from previous years, so students would see some student-generated questions again and others for the first time. Because this was the first year that Drs. Fontana and González-Cabezas ran the exercise, the evaluation instead relied on students' self-reports. Students rated "seeing exam questions before taking the exam" as the most useful part of the exercise, but its rating did not differ significantly from their ratings of the helpfulness of (a) getting instructor feedback, (b) getting extra credit for writing higher-level questions, and (c) using Google Docs to write and share questions. However, "working in teams to write the exam questions" was rated significantly lower.