Gauging Learning with “Muddiest Point” Student Understanding Checks

Faculty Name: Joanna Millunchick
Course: Materials Science and Engineering 220
Course Materials:
  • Muddiest Point Evaluation
  • Video Resources
  • Lecture Replay


Joanna Millunchick noticed that students were entering her MSE 220 introductory course with widely varying levels of preparation and prior knowledge, and that preparation directly impacted course performance. She devised a system of "muddiest point" understanding checks to determine which concepts were unclear, and then created screencasts to supplement lectures.

Active Learning in the Course

At the end of each lecture, students submit a quick Google Forms survey explaining which concepts, if any, from the lecture or homework they found unclear. Millunchick then looks for concepts where 30% or more of the students expressed confusion, and creates a screencast to illustrate the idea. This can take the form of an extra homework problem with the solution narrated on screen, or a few lecture slides explaining a foundational concept that had been referenced in lecture. Students can access these videos via the course website and refer to them while completing homework or reviewing for exams.

Challenges and Solutions

While students could raise their hands and ask for clarification during lecture, they were often reluctant to do so, and stopping to explain foundational concepts could take time away from course content. The muddiest point check allows students to express confusion privately and then take steps to supplement their understanding outside of class time, guiding their own learning without cutting into lectures. When the original system of handing out index cards became too cumbersome and resource intensive, Millunchick moved to an online form to collect these responses, further anonymizing the process.

Changes in Instruction

The screencasts allow Millunchick to stay focused on the content of the lecture, knowing that she will provide opportunities and resources for students to learn foundational concepts necessary to understand the lecture material. Rather than pausing midstream to explain a point of confusion, she can create (or reference previously created) screencasts with more preparation and planning to best address the question or misunderstanding.

Benefits for Students

After engaging the help of CRLT to design and facilitate analysis, Millunchick had data showing that regular use of the screencasts was associated with a statistically significant improvement in course outcomes. The effect was particularly marked for students who entered the course with little preparation or prior knowledge, but screencast use was positively and significantly correlated with outcomes for all students who used them.