The University's Learning Analytics Task Force has recently announced a new Fellows Program, which will bring together faculty, staff, graduate students, and postdocs for a semester-long collaborative study of Learning Analytics. If you're considering applying for this opportunity, or are just curious what "Learning Analytics" means, read this guest post by Natalie Sampson, a Public Health Ph.D. student and Graduate Student Instructional Consultant at CRLT.
As academics, many of us think a lot about assessment in the classroom. How do we best assess our students' learning? How can we be sure they are "getting it"? In this information age of "big data," Learning Analytics is an emergent field that tackles these core pedagogical questions.
You may have heard the term "Learning Analytics" (LA) around campus but still wonder what it refers to. According to the Society for Learning Analytics Research (SoLAR), LA is "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs." In practice, LA projects might include:
- Tracking how often students log into CTools to see which resources they do or don't download, and assessing whether these online interactions correlate with better grades. Are those lecture videos you posted worth your time and energy? (For an example of a productive response to such a question, read about how U-M Engineering professor Joanna Mirecki Millunchick assessed her supplemental screencasts.)
- Using course or registrar data to identify "at-risk" students, with the aim of improving student success. Are non-majors faring worse in your course than students majoring in related disciplines? What resources could improve their learning? (For an example of productive responses to these kinds of questions, read about the U-M Physics Department's ECoach project.)
- Developing an integrated database of information (e.g., admissions data, demographics, course performance, course management system use) about students in your department to make evidence-based decisions about curriculum.
We must, of course, approach these data responsibly. As in other types of studies, LA requires that scholars consider how data are collected and how much transparency they afford their study participants, who in the case of LA are their students. And because LA is generally intended to inform some form of intervention, there is potential for unintended consequences: an early-warning system designed to flag "at-risk" students could instigate stereotype threat or other biases that affect teaching and learning. While such quantitative data can be exceedingly revealing, we must take care not to reduce students to mere numbers.
Still, with these cautions in mind, it makes sense to use existing data to design interventions in your classroom or department to ensure that students are, in fact, "getting it." If your interest is piqued, consider attending one of this year's SLAM sessions at CRLT, reviewing the materials from previous SLAM presentations posted on the CRLT website, or applying to be a Learning Analytics Fellow. Additionally, the Provost's Learning Analytics Task Force will be awarding grants for faculty, staff, and students to engage in learning analytics projects; more information about the Task Force and its grants is available online.
Natalie Sampson, MPH
Graduate Student Instructional Consultant