Commentary

Using Math Diagnostics to Inform Course Placement in Community Colleges

Authors
Tatiana Melguizo
University of Southern California
Federick Ngo
University of Southern California

Given that community colleges are open-access institutions serving students with a wide range of skills, they need some means of identifying students’ readiness for college-level work. This typically happens via placement testing during college matriculation. We estimate that about 80 percent of all incoming California community college students are placed in developmental/remedial courses through this assessment and placement process (based on our calculations from the CCCCO Data Mart and data from a large urban community college district).

Low completion rates, coupled with concerns about the accuracy of commonly used placement tests such as the ACCUPLACER and COMPASS, have prompted calls for reform. Studies have demonstrated that as many as one quarter of students may be misassigned to their math courses, compelling policymakers and community colleges to seek alternative placement tools and practices in an effort to improve math remediation outcomes. There is limited placement policy research to inform these practitioner decisions, resulting in continual experimentation that may or may not benefit students.

We examined the impact of two types of placement policy experimentation on community college student outcomes: switching from a math diagnostic to the ACCUPLACER, and raising placement cutoffs. We first examined two colleges that switched from the Mathematics Diagnostic Testing Project (MDTP) to the ACCUPLACER to make placement decisions. The MDTP, developed jointly by the University of California and California State University, is a diagnostic tool that provides skill-specific information about students’ math backgrounds. The ACCUPLACER, developed by the College Board, is a computer-adaptive test that identifies student math skill using an algorithm that responds to student performance. Second, we examined student outcomes in a third college that raised its test score cutoffs by seven points. The goal of the study was to compare the effects of math remediation under each of these policy contexts. Specifically, we used quasi-experimental methods to identify the impact of remediation under each policy and the change in that impact following placement policy experimentation.

The findings indicate a possible advantage to using diagnostics to inform placement decisions in developmental math. All else constant, the two community colleges that switched from the MDTP diagnostic to a computer-adaptive test experienced a larger negative impact of remediation, with fewer students at the margin of the placement cutoffs enrolling and moving on to the next course in the developmental math sequence. Our supplementary analyses also showed higher proportions of severe placement errors following the switch, suggesting that more students were incorrectly placed. Modestly raising placement cutoffs had no significant effects.

We make the following policy recommendations for placement in developmental math:

  • We suggest further consideration of the MDTP as a placement tool. It is reassuring that the new Common Assessment Initiative being piloted includes diagnostic components. The skill-specific information from diagnostics can be incorporated into placement policies to improve math placement decisions and also be used by faculty to tailor instruction in math courses.
  • We suggest that colleges experiment with lowering placement cutoffs instead of raising them. Our related research indicates that the negative consequences of over-placement are smaller than those of under-placement, which lengthens already long developmental education sequences.
  • We suggest using regression discontinuity as a means to evaluate cutoffs and the impact of placement decisions; a minimal sketch of such an analysis follows this list.
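
To make the last recommendation concrete, the sketch below shows a minimal sharp regression discontinuity analysis in Python. It assumes student-level records with a placement test score and a binary outcome such as completing the next math course; the column names, the cutoff of 50, and the 10-point bandwidth are illustrative assumptions rather than details from our study.

```python
# Minimal sketch of a sharp regression discontinuity (RD) around a placement
# cutoff. Hypothetical columns: "test_score" (placement score) and
# "completed_next_course" (1 if the student completed the next math course).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def rd_estimate(df: pd.DataFrame, cutoff: float, bandwidth: float):
    """Local linear RD: the coefficient on `above` is the estimated jump
    in the outcome for students just above vs. just below the cutoff."""
    # Center the running variable at the cutoff and keep a window around it.
    d = df.assign(score_c=df["test_score"] - cutoff)
    d = d[d["score_c"].abs() <= bandwidth].copy()
    # Placement into the higher course: score at or above the cutoff.
    d["above"] = (d["score_c"] >= 0).astype(int)
    # Fit separate linear trends on each side of the cutoff,
    # with heteroskedasticity-robust standard errors.
    model = smf.ols(
        "completed_next_course ~ above + score_c + above:score_c", data=d
    ).fit(cov_type="HC1")
    return model.params["above"], model

# Simulated data standing in for district records, with a true 0.10 jump
# in completion probability at a cutoff of 50.
rng = np.random.default_rng(0)
scores = rng.uniform(20, 80, 5000)
p = np.clip(0.2 + 0.006 * scores + 0.10 * (scores >= 50), 0, 1)
data = pd.DataFrame({
    "test_score": scores,
    "completed_next_course": rng.binomial(1, p),
})

effect, fit = rd_estimate(data, cutoff=50, bandwidth=10)
print(f"Estimated jump in completion at the cutoff: {effect:.3f}")
```

In practice, an analyst would also choose the bandwidth with a data-driven selector, check for manipulation of scores around the cutoff, and test sensitivity to the functional form before drawing conclusions about a placement policy.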

The full study, “How Can Placement Policy Improve Math Remediation Outcomes? Evidence From Experimentation in Community Colleges,” is forthcoming in Educational Evaluation and Policy Analysis.

Suggested citation: Melguizo, T., & Ngo, F. (2016, January). Using math diagnostics to inform course placement in community colleges [Commentary]. Policy Analysis for California Education. https://edpolicyinca.org/newsroom/using-math-diagnostics-inform-course-placement-community-colleges