Active Learning

Does Active Learning Work? A Review of the Research

This review examines the evidence for the effectiveness of active learning. It provides a definition of active learning and explores the types of active learning most frequently discussed in the engineering education literature. Readers outside of engineering will likewise find this source helpful for its concise definitions, literature review, and valuable questions that promote instructors’ understanding of active learning.

Classroom Activities for Active Learning

Actively engaging students motivates deeper thinking about course content, brings additional energy to the classroom, and helps an instructor pinpoint problem areas. This article summarizes current practices and gives practical suggestions for implementing active learning in a variety of disciplines. Topics covered include questioning techniques, small groups, whole-class involvement, and reading and writing exercises.

Thurnau Professor Alford Young: Engaging Students in Critical Thinking

Professor Alford Young, Arthur F. Thurnau Professor in Sociology and the Department of Afroamerican and African Studies (DAAS), discusses strategies for helping students develop the complex thinking skills central to learning in the social sciences. Using a variety of course materials and teaching strategies, Professor Young helps students develop their ability to ask good questions, examine their own assumptions, analyze course materials and social structures, and construct well-supported arguments.


Fall 2020 Course Evaluations: Creating Useful Questions

As we pass the middle of the term, instructors are asked to think about the course evaluations that students complete at the end of the term (November 19-December 9). By November 17, U-M instructors are invited to preview evaluation questions and create a few of their own if they wish. What principles or goals might guide you in that process?

In this blog post, we review the university-wide questions that appear on end-of-semester evaluations, as well as those added for Fall 2020 in particular, and we offer guidance on how to make the most of instructor-created questions. These principles can also be used to create questions for feedback that you collect at other times of the semester. In addition, this previous CRLT blog post provides strategies for increasing student response rates, and this Registrar's site contains details about the course evaluation process.

Revisiting Active Learning: Bridging the Gap Between What Students Perceive They Learn and What They Actually Learn

With the construction of dedicated active learning spaces across U-M’s campus, widespread professional development focused on active learning, and many instructors looking to increase student engagement, students are experiencing active learning more and more in their time at the University of Michigan. But how do students perceive this kind of instructional approach? Studies have indicated that the majority of students respond positively to active learning, and although resistance occurs, it occurs at relatively low levels (Finelli et al., 2018). However, a new study points to a potential aspect of students’ experiences of learning in such classrooms that instructors may want to address (Deslauriers et al., 2019). In short, while students in active learning classrooms learn more, they may feel that they have learned less.

The authors looked at students’ outcomes and their perceptions of learning in a large-enrollment introductory physics course (Deslauriers et al., 2019). While the study was performed in a STEM classroom, the researchers highlight ways in which its findings might also extend to non-STEM active learning classrooms. Students in the course were randomly divided into two groups: one received “active instruction (following best practices in the discipline),” while the other received “passive instruction (lectures by experienced and highly rated instructors).” The groups then switched instructional formats in a subsequent unit, to allow for comparison. Students in the active learning sections earned higher grades, suggesting they learned more. But in self-reported surveys, those students perceived that they had learned less than their peers in the lecture-based sections.

Creating useful student evaluation questions

As we approach the middle of the term, instructors are already being asked to think about the student evaluations of teaching that happen at the end of the term. In late October, U-M instructors will be invited to preview evaluation questions and create a few of their own questions if they wish. What principles or goals might guide you in that process?

In this blog post, we review the questions that are used University-wide on end-of-semester evaluations, and we provide guidance on how to make the most of the instructor-created questions. These question-writing principles can also be used to create questions for feedback that you collect at other times of the semester.

The current student evaluation of teaching has 10 required items (rated on a scale of “Strongly Agree” to “Strongly Disagree” for all but one, as noted below). Two of these will no longer be included as standard questions after 2020. These questions are:

CRLT Resources on Active Learning

Faculty and GSIs from across campus are invited to explore our newest resource dedicated to active learning. At CRLT, we work every day with instructors who are committed to engaging their students actively both inside and outside the classroom. As Michael Prince explains, “Active learning is generally defined as any instructional method that engages students in the learning process. In short, active learning requires students to do meaningful learning activities and think about what they are doing” (Prince, 2004). Research from Prince as well as a number of other sources (Freeman, 2014; Hake, 1998; Ruiz-Primo et al., 2011) indicates that actively engaging students increases learning outcomes across disciplines (Ambrose, 2010; Bonwell & Eison, 1991). This new resource showcases the diversity of active learning techniques used by instructors at U-M, from the humanities and arts to STEM, from small seminars to large lectures, to demonstrate not just what active learning is, but how it works in classrooms right here on campus.

The website includes:

  • Reflecting on Your Practice: We designed this inventory to help you identify areas in which active learning could be used in your classroom and to suggest opportunities to build on strategies that you already use.
  • Implementing Active Learning: Integrating active learning can be beneficial to student learning, but it does come with some challenges. We share tips and techniques for integrating active learning strategies while avoiding common pitfalls.
  • U-M Faculty Examples: In these brief case studies, U-M instructors share how they use active learning in specific courses, as well as how they have refined their approach over time. From large Physics lectures to small Screenwriting seminars, these examples span the range of complexity and diversity of approaches to engaging students.
  • Resources: This section includes discipline-specific resources and the research that speaks to the efficacy of active learning.
  • Share your Example: Are you using active learning in your classroom? We would love for you to share your examples, as we hope to grow this resource to include even more voices in the conversation.

Gauging Learning with “Muddiest Point” Student Understanding Checks

Faculty Name: Joanna Millunchick
Course: Materials Science and Engineering 220
  • Muddiest Point Evaluation
  • Video Resources
  • Lecture Replay

Introduction

Joanna Millunchick noticed that students were entering her MSE 220 introductory course with widely different levels of preparation and prior knowledge, and that preparation directly impacted course performance. She devised a system of “muddiest point” understanding checks to determine which concepts were unclear, and then created screencasts to supplement lectures.

Active Learning in the Course

At the end of each lecture, students submit a quick Google Form survey explaining which, if any, concepts from the lecture or homework they found unclear. Millunchick then looks for concepts where 30% or more of the students express confusion, and creates a screencast to illustrate the idea. This can take the form of an extra homework problem with the solution narrated on screen, or a few lecture slides explaining a foundational concept that had been referenced in lecture. Students can access these videos via the course website, and refer to them while completing homework or reviewing for exams.
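
For instructors who want to automate this kind of tally, the sketch below shows one way the responses might be processed. It is a minimal illustration, not Millunchick’s actual workflow: it assumes the Google Form responses have been exported as a CSV file (here called lecture_responses.csv) with a hypothetical “Muddiest concept” column, and it flags any concept named by at least 30% of respondents.

# Minimal sketch (an assumption, not the instructor's actual workflow) of tallying
# muddiest-point responses exported from a Google Form as a CSV file.
import csv
from collections import Counter

THRESHOLD = 0.30  # flag concepts named by 30% or more of respondents


def flag_muddy_concepts(csv_path, column="Muddiest concept"):
    """Return (concept, share) pairs for concepts at or above THRESHOLD."""
    counts = Counter()
    total = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += 1
            answer = row.get(column, "").strip()
            # Skip blank answers and students who report nothing was unclear.
            if answer and answer.lower() != "none":
                counts[answer] += 1
    if total == 0:
        return []
    return [(concept, n / total)
            for concept, n in counts.most_common()
            if n / total >= THRESHOLD]


if __name__ == "__main__":
    # Print concepts that may warrant a follow-up screencast.
    for concept, share in flag_muddy_concepts("lecture_responses.csv"):
        print(f"{concept}: {share:.0%} of students found this unclear")

In practice, free-text answers rarely name a concept in identical wording, so a real tally would likely group similar responses (or use a multiple-choice concept list on the form) before applying the 30% threshold.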