Institutionalizing Assessment as Part of Introductory Course Reform in Physics

Academic Year:
2009 - 2010 (June 1, 2009 through May 31, 2010)
Funding Requested:
$9,720.00
Overview of the Project:
The Department of Physics is undertaking significant reform of all of its large introductory physics courses. These reforms include modernization of content and major changes in the structure of the classes. This Whitaker proposal seeks funds to establish a solid framework for assessment in advance of these reforms, and to use these assessment tools during the next two years. In the long term, we will use these tools to continuously monitor how our courses affect students, building more substantive and quantitative assessment into the fabric of the Department's teaching.
Final Report Fields
Project Objectives:
The first-stage proposal, “Institutionalizing Assessment as Part of Introductory Course Reform in Physics”, was granted funds in February 2010. Its goal was to put in place both a system and a culture of assessment around the large introductory courses we teach in physics. This step was motivated by a series of major reforms taking place in the structure and content of these courses, especially the elimination of small discussion sections and the move to a new calculus-based Physics for the Life Sciences course.

Report on Progress from the Whitaker I Proposal

The first step in our proposal was to combine historical information about our students obtained from MAIS databases with information about their performance internal to our classes drawn from the LSA SAMS system. This step is complete. We now have combined student background data (high school performance and test scores, Michigan performance before this class, etc.) with detailed internal course grade information for 23,310 introductory physics students over an eight-year period. In a separate step, we have also added information on which students participated in Science Learning Center study groups for one of these years.

We are currently working to fully analyze these historical data with several specific questions in mind. First, we would like to learn how to identify at-risk students early, so that we might intervene and offer them a path to success in these courses. Second, we would like to better understand the long-standing small performance gap between male and female students in introductory physics courses (Figure 1). These historical analyses will form the core of the Honors senior thesis project for physics concentrator Kate Miller, who intends to become a high school physics teacher.

Figure 1: Mean course grade versus GPA for 23,310 students in introductory physics displays a 0.26-point lower mean grade for female (red) versus male (blue) students.

Third, we would like to explore what factors lead to varied outcomes for students who enter the class with very similar profiles. What do the unusually successful students do? What do the unusually challenged students do? This question is difficult to address solely with these historical data, and a part of this Whitaker II proposal involves gathering additional data for this purpose.

A second major goal of the Whitaker I proposal was to establish a system for administering community-standard pre- and post-test assessment tools across all of our introductory physics courses. This step is now complete. We have implemented standard assessment tools from the physics education community, the Force Concept Inventory (FCI, for first-semester students) and the Basic Electricity and Magnetism Assessment (BEMA, for the second semester), across all of our introductory physics courses. The tests are administered through the CTools Test Center mechanism and are overseen centrally by the Physics Office of Student Services. Running these independently of individual instructors is essential to making their application stable and permanent going forward.

Results from the first applications of the FCI and BEMA pretests in each of our introductory courses are appended below. They already reveal two things. First, students entering our different introductory courses differ significantly in average preparation: medians (out of a maximum of 30) are 10.2, 12.4, 16.7, and 23.1 in Physics 125, 135, 140, and 160. Students are, on average, sorting themselves into a graded series of course levels.
More important for us, though, the distributions of pretest scores in these courses are all very broad. Students come to all of our courses prepared at very different levels. Fully recognizing this, and understanding better what to do about it, is a high-priority item for us going forward. FCI and BEMA post-tests will be administered during the last week of the Fall term.

Our final goal for the Whitaker I proposal was to establish a system that would allow us to make this kind of data available in real time, as it appears. To do this, we need to learn how to extract MAIS data within the department, so that we no longer rely on collaboration with Rob Wilke in the Dean’s office to obtain the data. We are just getting started on this task, and expect to have it in place for the 2011/12 academic year.
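For illustration, the following is a minimal Python/pandas sketch of the pretest summary described above, assuming a hypothetical flat export of scores from the CTools Test Center; all file and column names are invented.

    import pandas as pd

    # Hypothetical export: one row per student pretest attempt,
    # with columns course, test ("FCI" or "BEMA"), and score (0-30).
    scores = pd.read_csv("pretest_scores.csv")

    # Median and quartiles by course and test; the wide interquartile
    # ranges are what reveal how broadly prepared the incoming students are.
    summary = scores.groupby(["course", "test"])["score"].describe(
        percentiles=[0.25, 0.5, 0.75])
    print(summary[["count", "25%", "50%", "75%"]])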
Project Achievements:
Establishing a System for Monitoring the Impacts of Reform on Diverse Student Groups

To assess the impact of course reforms, we need to know how every student is transformed by participation in our classes. Each student arrives with a particular record of preparation and prior performance, and leaves having accomplished some fraction of the goals we have set; we need to measure how our classes carry each incoming student to their ultimate performance. Over the last two years, we have been exploring the mapping from incoming student measures to class performance, as measured by final grade. For each student who enters a class at Michigan, many quantitative measures of prior performance are available: standardized test scores (SAT and ACT), Advanced Placement test scores in various subjects, and grades in previously completed courses at Michigan. The connections we are uncovering are supported by high statistics (Figure 1), and features beyond overall GPA (the dominant effect) are also apparent, particularly SAT/ACT Math performance.

To enrich our analysis of student performance in our classes, we are in the process of augmenting final grade information (which, we note, is all the academic performance information available within the MAIS student data systems) with internal information drawn from each class. This includes exam scores for three midterms and a final, classroom participation (i>Clicker) grades, and credit obtained for online homework. It is possible for us to add this information to our assessment because it is collected systematically in the LSA Student Assignment Management System (SAMS). SAMS was built in Physics in 1999 (with a grant from the LSA IT Committee) and has since grown to be used by many LSA departments with large multi-section courses, such as Chemistry, Astronomy, and Geology. The services SAMS provides will be migrated to CTools within the next two years, so our ability to perform this type of analysis will be maintained and, we hope, simplified.

This data collection effort was the primary activity proposed in the Whitaker I grant. As a measure of the faculty commitment to this activity, consider that all instructors of our introductory course sequences, 10 full-time faculty and four lecturers, participated in four meetings over the summer months to plan the migration of our courses to the new delivery model. All agreed to support the actions we’re taking to improve assessment.
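To make the data collation concrete, the following minimal Python/pandas sketch shows the kind of join and comparison involved; the file and column names are hypothetical, and the actual MAIS and SAMS extracts are structured differently and handled through University systems.

    import pandas as pd

    # Hypothetical extracts: one row per student (MAIS background data)
    # and one row per student-course record (SAMS internal grade data).
    background = pd.read_csv("mais_background.csv")    # student_id, incoming_gpa, act_math, sex, ...
    courses = pd.read_csv("sams_course_records.csv")   # student_id, course, term, final_grade, ...

    # Attach each student's incoming profile to every course record.
    merged = courses.merge(background, on="student_id", how="left")

    # Mean final grade by sex within narrow incoming-GPA bands: the
    # comparison underlying the gap shown in Figure 1.
    merged["gpa_band"] = merged["incoming_gpa"].round(1)
    by_sex = merged.pivot_table(index="gpa_band", columns="sex",
                                values="final_grade", aggfunc="mean")
    print(by_sex.assign(gap=by_sex["M"] - by_sex["F"]))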
Continuation:
Extending and Enriching Empirical Assessment

Tools that map the incoming preparation of the student to their ultimate performance in the course provide a powerful new approach to assessing what happens in our classes. This is an important tool, but it does not provide the whole story. For example, the data contain no information on student perceptions and attitudes. How do students feel about the subject of physics? What aspects do they find most difficult? Which resources do they find most helpful to their learning? There is every reason to suspect that students in the different sequences will respond differently. For one thing, students in the Science & Engineering courses tend to be nearly a year younger than their counterparts in the Life Sciences sequences (Figure 2). Yet these younger students tend to hold more positive attitudes toward physics than their older counterparts, as measured by their response to the E&E question “I had a strong desire to take this course”. How should our teaching methods be adjusted to accommodate a diverse set of student attitudes?

Figure 2: Enrollment histories of the Science & Engineering-oriented sequences (left) and Life Sciences-oriented sequences (right), disaggregated by academic level. The Life Science courses attract mostly sophomores and juniors, whereas first-years and sophomores dominate the Science & Engineering courses. Fall 2010 total enrollment is ~2400 students.

We seek funds to expand our quantitative analysis of how the incoming student maps to ultimate performance, and to enrich it with student attitudinal and perception data obtained from surveys and focus groups. This will allow us to go beyond our Whitaker I goal of developing a system to compare the performance of our students in traditional and reformed courses, by helping us understand differential benefits among sets of students characterized by their incoming record. Among the activities we will undertake are:

1. Establish an early warning system for at-risk students. We wish to extend our data collection to include the assessment exam given to all students by the Mathematics Department. These data, joined with ACT/SAT Math and other student information, are very likely to improve the correlation with expected final course grade. This knowledge will help us identify students with a high likelihood of failure (C– or lower), whom we can then follow up with for advising; a sketch of such a model appears at the end of this section.

2. Conduct resource use and usefulness surveys. Following the lead of Brian Coppola and the IDEA Institute, we will survey students to understand which resources they use regularly and which they find most useful. Along with informing resource management decisions, the results of these surveys will help us improve the way we advise students. In particular, we can identify the resources that correlate with improvement over predicted grade, and share this knowledge with our students. Potentially, finer-grained analysis will allow us to identify specific sets of resources for different types of students.

3. Conduct focus group studies to reveal the role of student attitude. Our grade prediction model is accurate to half a letter grade, but this leaves a significant chance that an incoming B student may exit with an A or a C grade. What are the more successful students doing? Do they simply work harder? Or might it be that they tend to “think like a physicist”, or identify themselves as members of a larger “geek culture”?
We will engage with CRLT staff to undertake focus group studies aimed at understanding how similar students (in terms of predicted grade) differ in their engagement with and perception of physics. We budget for a total of eight focus groups, two for each of our four large-enrollment courses. We will coordinate with CRLT on selection methods to maximize our ability to inter-compare groups across these courses.

4. Integrate our growing Learning Assistant cohort into physics education research. We are taking a number of steps to grow the number of physics concentrators who wish to become teachers. Our Interdisciplinary Physics (IP) major is an excellent vehicle for this, and our new Learning Assistant (LA) program offers a means to build a student cohort interested in teaching. The combination of the new, flexible IP degree with the growing involvement of our students in teaching has led at least five of our undergraduates to pursue secondary teaching since 2006. With Charles Dersham (CRLT/School of Education), we are preparing a proposal for support from PhysTEC, a coalition of institutions that support model physics teacher preparation programs. We will enlist members of the LA community to build reporting tools and to perform statistical analysis of the large databases we are assembling with Whitaker support.

Some of these steps we have already begun. For example, we are in the process of collecting survey results from a midterm evaluation that includes questions about resource use and usefulness.

What is needed to accomplish our goals is people. We have conducted all our work on this topic so far using a model inspired by our physics research groups: oversight and guidance are provided by experienced faculty, who work in close coordination with undergraduate and graduate researchers. Big projects are tackled over the summers; during the school year, progress is more gradual. This research model for educational projects plays to our strengths and has worked very well for us in recent years.

Funds from this Whitaker grant will be used to fund focus group studies managed by CRLT staff, as well as to support two or three undergraduate students during the summer of 2011. These students will put in place the tools required for regular data collation, along with a core suite of analysis tools that provide quick reporting functions along the lines of the LSA ART system. These tools will then be available for monitoring and comparison as we continue our course reform process. During this time, we very much hope to involve a student in the new Master’s Degree in Postsecondary Science Education program being jointly run by the IDEA Institute and the LSA science departments. This project could provide excellent material for a Master’s thesis.
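As a concrete illustration of the early-warning idea in activity 1 above, the following minimal sketch uses an ordinary least-squares regression as a stand-in for whatever predictive model the analysis ultimately supports; all file and column names are invented for illustration.

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Hypothetical merged dataset from the MAIS/SAMS collation described above.
    data = pd.read_csv("merged_student_data.csv").dropna(
        subset=["incoming_gpa", "act_math", "math_placement", "final_grade"])

    X = data[["incoming_gpa", "act_math", "math_placement"]]
    y = data["final_grade"]  # on a 4.0 scale

    model = LinearRegression().fit(X, y)
    data["predicted_grade"] = model.predict(X)

    # Flag students whose predicted grade is C- (1.7) or lower
    # for early advising follow-up.
    at_risk = data[data["predicted_grade"] <= 1.7]
    print(f"{len(at_risk)} students flagged for early follow-up")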

Source URL: https://crlt.umich.edu/node/85657