Collecting Assessment Data
This page introduces assessment definitions and frameworks and lists resources that may help guide assessment projects. To read about CRLT's assessment, curriculum, and learning analytics services, visit our services page. To connect with us about a potential project, submit a consultation request.
Key Definitions & Frameworks
Data sources that are useful to consider in assessing student learning within an academic program, major or degree include the following:
Evidence of learning outcomes
Direct measures of learning allow students to demonstrate their learning so that faculty can assess how well a program's students are meeting the expected level of proficiency for skills or knowledge. Examples include capstone projects, papers, standardized tests, observations of students in a clinical setting, and quiz questions aligned with key areas of required knowledge.
- Embedded assessments are direct measures of student learning that serve both as a course requirement for students (i.e., a normal work product like a paper, quiz, or design project) and as part of a program's assessment processes. They are most frequently collected from required courses, capstone courses, or key classes where a student must demonstrate mastery of a specific learning objective important to the department.
Indirect measures of learning gather students' perceptions of and satisfaction with their learning. Common examples are focus groups and surveys of students and/or alumni.
Multiple methods of assessment pair direct and indirect methods. These assessments are valuable because:
- Research indicates that students are not always able to self-assess their learning accurately, so relying on indirect measures alone may yield inaccurate results.
- Some outcomes (e.g., attitudes) can be assessed only through surveys, interviews or focus groups.
- Indirect measures (e.g., student dissatisfaction with some aspect of their learning experience) can help explain results seen through collection of direct measures (e.g., low student performance on a key learning outcome).
Background information about the students in the course or curriculum
For example, what academic preparation do students bring into a course or academic program? What are the demographics of students in a class or program? What are their career or post-U-M educational aspirations?
Documentation of the learning experience
Materials can document the specific learning experiences students have had. From students, this might include evidence of engagement with key learning resources (e.g., videos or documents in an LMS). From instructors, it could include instructor reflections, syllabi, lesson plans, or assignments.
Resources
The following are short articles, sample instruments and other resources where you can learn about direct and indirect assessment approaches.
Assessing Engaged Learning Outcomes
The papers below address approaches for assessing student learning outcomes that are often embedded in curricula and courses:
- Development and Assessment of Student Social/Civic Responsibility and Ethical Reasoning by Samantha K. Hallman, 2016.
- Development and Assessment of Collaboration, Teamwork, and Communication by Stephanie M. Kusano, Amy J. Conger, and Mary C. Wright, 2016.
- Development and Assessment of Self-Agency and the Ability to Innovate and Take Risks by Stephanie M. Kusano, Mary C. Wright, and Amy J. Conger, 2016.
- Development and Assessment of Student Creativity by Samantha K. Hallman, Mary C. Wright, and Amy J. Conger, 2016.
- Development and Assessment of Intercultural Engagement by Stephanie M. Kusano, Amy J. Conger, and Mary C. Wright, 2016.
Indirect Evidence of Learning
Large Surveys Used at U-M
- The Office of Budget and Planning lists several surveys, including UMAY, which assesses the undergraduate experience.
- The First Destination Profile from the University Career Center provides data on where LSA, Public Policy, and SMTD students go after graduation.
U-M Alumni and Exit Surveys
- Sample exit surveys from U-M departments (from May 2009)
External validated student surveys (some are fee-based)
- Assessing Intercultural and Global Competence: Survey and Measurement Tools (From the Fall 2010 Provost's Seminar on Teaching)
- Guide to Surveys Used in the Wabash National Study of Liberal Arts Education (49 institutions, including the University of Michigan, participated in this project).
- For other topics, the U-M library has an online guide for finding validated tests and measures.
Focus groups
- Focus groups allow students to collectively hear other students' experiences and reflect on achievement of key learning goals for a course, curriculum, or educational innovation.
Direct Evidence of Learning
Rubrics are commonly used as an assessment tool for papers or projects.
- For examples of rubrics developed by expert teams, see AAC&U VALUE rubrics for evaluating core knowledge, skills and attitudes.
- For information about rubric development, see CRLT Occasional Paper No. 24.
- An example of a project that uses this type of assessment data is "Teaching Close Reading Skills in a Large Lecture Course" by Theresa Tinkle, Daphna Atias, Ruth McAdams, & Cordelia Zukerman, English Language and Literature, LSA.
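Once papers or projects have been scored against a rubric, a program typically summarizes performance by criterion to see where students are strongest and weakest. The sketch below illustrates that aggregation step; the criterion names, the 1-4 scale, and the scores themselves are made-up examples, not taken from any particular rubric.

```python
from statistics import mean

# Hypothetical rubric scores: each paper rated 1-4 on three criteria.
# Criteria and scale are illustrative, not from a specific rubric.
scores = [
    {"thesis": 3, "evidence": 2, "organization": 4},
    {"thesis": 4, "evidence": 3, "organization": 3},
    {"thesis": 2, "evidence": 2, "organization": 3},
]

def criterion_means(scores):
    """Average each rubric criterion across all scored papers."""
    criteria = scores[0].keys()
    return {c: round(mean(s[c] for s in scores), 2) for c in criteria}

print(criterion_means(scores))
# e.g. a low "evidence" mean would flag that criterion for curricular attention
```

A summary like this can be computed per course, per cohort, or per year to track whether a targeted outcome improves after a curricular change.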
Quiz questions, linked to specific key learning objectives
- An example of a project that uses this type of assessment data is "Online Quiz Use in a Large Lecture Genetics Course" by Patricia J. Wittkopp & Lisa Sramkoski, Molecular, Cellular and Developmental Biology, LSA.
Concept inventories
- Concept inventories are reliable and valid instruments designed to measure students' knowledge of key concepts in a field. Often, they can be used to make comparisons in student learning over time (e.g., a student's performance at the beginning and end of a course) or between students at different universities. They are most commonly used in science, math, and engineering.
- Examples of concept inventories from engineering have been collected by the Foundation Coalition.
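When a concept inventory is given as a pre-test and post-test, a common way to summarize the change is Hake's normalized gain: the improvement achieved as a fraction of the improvement possible, g = (post − pre) / (100 − pre), with scores as percentages. The sketch below computes it; the pre/post scores shown are invented for illustration.

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain: fraction of the possible improvement achieved.

    Scores are percentages (0-100); undefined when the pre-test is already 100.
    """
    if pre_pct >= 100:
        raise ValueError("pre-test score leaves no room for gain")
    return (post_pct - pre_pct) / (100 - pre_pct)

# Illustrative (made-up) class averages on a concept inventory:
pre, post = 40.0, 70.0
print(normalized_gain(pre, post))  # 0.5: half of the possible gain was realized
```

Because the denominator adjusts for where students started, normalized gain makes classes with different pre-test scores easier to compare than raw score differences do.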
Background information about the students in a course or curriculum
Atlas enables LSA faculty to create course-specific data reports on a range of topics related to student performance, such as: enrollment and grade histories of students in a given course, enrollment and grade connections between courses, and relations of course grades to pre-college measures (ACT/SAT scores and AP exams).
Documentation of the learning experience
Common measures to document learning activities include:
- Syllabi
- Instructor reports of key instructional activities
- Metrics of engagement with course materials via LMS data
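LMS engagement data typically arrives as a log of view events that must be aggregated before it says anything useful. The sketch below shows two simple summaries, views per student and distinct viewers per resource; the tuple-based log format and file names are hypothetical, since real LMS exports vary by platform.

```python
from collections import Counter

# Hypothetical LMS event log: (student_id, resource) view events.
# Real exports differ in format; this is a simplified sketch.
events = [
    ("s1", "lecture3.mp4"), ("s1", "syllabus.pdf"),
    ("s2", "lecture3.mp4"), ("s1", "lecture3.mp4"),
]

def views_per_student(events):
    """Count how many resource-view events each student generated."""
    return Counter(student for student, _ in events)

def viewers_per_resource(events):
    """Count how many distinct students opened each resource."""
    seen = {}
    for student, resource in events:
        seen.setdefault(resource, set()).add(student)
    return {resource: len(students) for resource, students in seen.items()}

print(views_per_student(events))
print(viewers_per_resource(events))
```

Summaries like these can document, for example, whether students actually engaged with a key resource before an assessed assignment.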
Contact CRLT at [email protected] if you would like to consult further about utilizing our assessment services.