Learning Analytics

David Niemi, Vice President of Measurement and Evaluation at Kaplan, Inc., will present on learning analytics at Kaplan.

Sound learning analytics based on valid metrics give us new leverage to study and improve learning and other outcomes. This presentation focuses on how Kaplan uses learning analytics to identify the best learning paths for individual students; identify and implement approaches to help struggling students; test principles of learning and instruction in authentic learning environments at scale; and provide stronger evidence of the impact of new approaches and tools on learning and other key outcomes.

Sponsored by the Provost's Task Force on Learning Analytics, the Student Learning and Analytics at Michigan (SLAM) Seminar is a year-long speaker series. Lunch will be provided. For more information about learning analytics at U-M and to view videos and slides from the 2012-2013 SLAM series, click here.

For information about the 2011-12 Symposium on Learning Analytics at Michigan series, click here.

Event Information
Dates: 
Fri, 02/08/2013 - 12:00pm - 1:30pm
Presenter(s): 
Dr. David Niemi, Vice President of Measurement and Evaluation, Kaplan, Inc.
Eligible for Certificate: 
Not eligible for Certificate

Joanna Mirecki Millunchick, U-M Professor of Materials Science and Engineering, will discuss her learning analytics research on screencasting in her materials science class, MSE 220. Because the course draws students with widely varying degrees of background knowledge about course concepts, Professor Millunchick developed screencasts to offer diverse students opportunities to review lecture topics and learn at a pace appropriate to their needs. This session will describe the screencasts and learning analytics research about them. This project was awarded a 2012 Provost's Teaching Innovation Prize.

Please note that this SLAM will be held on North Campus at the Robert H. Lurie Engineering Center's Johnson Rooms (3rd Floor). Lunch will be provided. 


Event Information
Dates: 
Fri, 01/25/2013 - 12:00pm - 1:30pm
Location (Room): 
Johnson Rooms, 3rd Floor, Lurie Engineering Center (North Campus)
Presenter(s): 
Dr. Joanna Mirecki Millunchick, Professor of Materials Science and Engineering, U-M College of Engineering
Eligible for Certificate: 
Not eligible for Certificate

At this session, Tristan Denley, Provost of Austin Peay State University, will discuss "Degree Compass," which uses predictive analytics techniques, based on grades and enrollment data, to make individualized course recommendations for students.
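The source does not describe Degree Compass's actual algorithm, only that it predicts from grades and enrollment data. As a purely illustrative sketch, one simple approach in that spirit is to predict a student's likely grade in each candidate course from the grades earned by past students with similar GPAs, then recommend the highest-predicted courses. All names, data, and the similarity rule below are assumptions, not the real system:

```python
# Hypothetical grade-based course recommender in the spirit of Degree Compass.
# The toy data and the GPA-band similarity rule are illustrative assumptions.

from statistics import mean

# Past enrollment records: (student GPA at enrollment, course, grade points earned).
HISTORY = [
    (3.8, "MATH 101", 4.0), (3.8, "HIST 210", 3.3),
    (2.4, "MATH 101", 2.0), (2.5, "HIST 210", 3.7),
    (3.6, "MATH 101", 3.7), (2.6, "HIST 210", 3.3),
]

def predict_grade(student_gpa, course, history, band=0.5):
    """Average the course grades of past students with a similar GPA (+/- band)."""
    similar = [g for gpa, c, g in history
               if c == course and abs(gpa - student_gpa) <= band]
    return mean(similar) if similar else None

def recommend(student_gpa, courses, history, top_n=1):
    """Rank candidate courses by predicted grade, highest first."""
    scored = [(c, predict_grade(student_gpa, c, history)) for c in courses]
    scored = [(c, p) for c, p in scored if p is not None]
    return sorted(scored, key=lambda cp: cp[1], reverse=True)[:top_n]

print(recommend(2.5, ["MATH 101", "HIST 210"], HISTORY))
```

A production system would of course use far richer features and a validated statistical model; the sketch only shows the shape of the recommendation step.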

Lunch will be provided. 


Event Information
Dates: 
Fri, 01/18/2013 - 12:00pm - 1:30pm
Presenter(s): 
Dr. Tristan Denley, Provost, Austin Peay State University
Eligible for Certificate: 
Not eligible for Certificate

"For assessment to be successful, it is necessary to put aside the question, 'What’s the best possible knowledge?' and instead to ask, 'Do we have good enough knowledge to try something different that might benefit our students?'"

-Blaich, C. F., & Wise, K. S. (2011). From gathering to using assessment results: Lessons from the Wabash National Study (NILOA Occasional Paper No.8). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.


Key Definitions & Frameworks

Data sources that are useful to consider in assessing student learning include:

  1. Evidence of learning outcomes

    Direct measures of learning


How can we use available data about students to fine-tune our instruction and facilitate their learning? Thanks to the Learning Analytics Task Force and SLAM lecture series, this question is getting lots of attention on campus this year. Some especially innovative answers are provided by 2012 TIP winners Tim McKay, David Gerdes, and August Evrard (pictured below, left to right), whose "Better-Than-Expected" (BTE) project used analysis of large data sets to support student learning in introductory physics courses. 

[Photos: Dr. Evrard, Dr. Gerdes, Dr. McKay]

The three Arthur F. Thurnau professors analyzed data from 48,579 U-M intro physics students over 14 years to generate models for predicting student success in these gateway courses. Correlating data concerning students' preparation (e.g., standardized test scores, prior U-M GPA, previous coursework), background (gender, socioeconomic status, etc.), and progress through the courses (homework grades, exam scores, class participation, etc.), the BTE team discovered that prior academic performance was a significant indicator of success in the introductory courses. In effect, students' progress through the semester was largely determined by their starting point. The team realized that, in order to develop the learning potential of all students, they needed to move away from a "one-size-fits-all" instructional model.
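The article does not specify the BTE team's statistical models, only that prior academic performance predicts gateway-course success. A minimal sketch of that kind of analysis, using a toy one-variable least-squares fit on invented data (the numbers below are illustrative, not the study's):

```python
# Toy illustration of predicting a gateway-course grade from prior GPA.
# The BTE team's real models and data are not reproduced here; this shows
# only the general technique of fitting a predictor to past outcomes.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Invented records: (prior U-M GPA, final intro-physics grade points).
prior_gpa = [2.0, 2.5, 3.0, 3.5, 4.0]
phys_grade = [1.8, 2.4, 2.9, 3.4, 4.0]

a, b = fit_line(prior_gpa, phys_grade)
predicted = a + b * 3.2  # expected grade for a student entering with a 3.2 GPA
```

A positive slope here is the toy analogue of the team's finding that where students start largely determines where they finish, which is exactly what motivated tailored coaching rather than one-size-fits-all instruction.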

Enter E2Coach. With support from the Gates Foundation, the group built an Electronic Expert Coaching system, which they launched across all intro physics courses in January 2012.
