Programs with more components that address a broad range of student needs tend to work better, but research has not clearly identified which components are most critical for success.
Efforts to reduce procedural complexity and to proactively communicate with students about required tasks can produce impacts that are impressive, given their relatively low cost. At the same time, these impacts are often modest in absolute terms, suggesting that such strategies have a role to play but are unlikely to lead to dramatic improvements in college access and success among those who most need additional supports. Instead, these strategies may be most effective when they are integrated into comprehensive systems of support.
One of the most valuable resources that students have for navigating to and through college is an informed, involved adult providing guidance. While more affluent students often have such an adult through family connections, coaching and advising interventions aim to provide similar guidance to lower-income students.
Colleges and universities are complex organizations, with administrative units often operating in silos. Providing students with comprehensive and orchestrated supports can require many different units to communicate and collaborate in new ways. Even when implemented with fidelity, programs that appeared promising in some contexts do not always yield similarly positive results when replicated in other contexts.
Most U.S. high school graduates pursue some form of post-secondary education. However, many exit with no degree or credential.1 For example, among those who entered college in 2014 as first-year, full-time bachelor’s degree-seeking students, 64% completed their degree within six years. Completion rates are even lower at two-year institutions. Students from wealthier backgrounds are more likely to access college, and conditional on enrollment, they are also more likely to complete.2
Differences in degree attainment by socioeconomic status are a particular concern, given that college credentials have been linked to improvements in employment, earnings, health, and other socially important outcomes. Furthermore, many students must borrow to afford college, which compounds the problem of degree noncompletion. If students from low-income backgrounds disproportionately assume college debt without the labor market payoff of a college credential, this has the potential to exacerbate income inequality.
The barriers to completion take multiple forms and are tied to decisions about where to apply and enroll. First, financial barriers can hinder students if they or their families are unable or unwilling to pay the costs of receiving a college education.3 Second, the college landscape is enormously complex, and students who lack access to information and support in navigating their college options may enroll in institutions that are not the best fit given their needs and interests.4 Third, behavioral barriers can hinder student success.5 For example, once enrolled, students may be overwhelmed by the range of academic paths available and have a hard time making informed choices about their courses and major. Fourth, students may lack sufficient academic preparation for college-level coursework.6 Finally, from a psychological perspective, students may feel socially marginalized and that they do not belong if they perceive themselves to be outside the majority identities on campus.7
These barriers to college success—financial, informational, behavioral, academic, and psychological—are not mutually exclusive and can interact. The overall complexity that students must navigate to enroll in and complete college is a dominant theme emerging from research on barriers to college access and success and their potential solutions. Given this reality, improving student outcomes may require alleviating challenges in multiple domains simultaneously.8 We review programs that support students’ college success by addressing one or more of these barriers. Although not the main focus of this chapter, because there is overlap in the approaches, we also discuss findings from some programs designed to increase college enrollment rather than completion; many of these studies also examine persistence as an outcome. Other chapters separately cover research on the impacts of student grants and scholarships, student loans, and the Federal Work-Study Program.
There is great variety in the college success efforts that we discuss below. Here, we highlight several key dimensions along which programs vary:
Program goal and duration: Some programs begin in high school and support students to cross the threshold of college or to attend a “higher-quality” institution. Other programs support students exclusively once enrolled. Still others begin in high school and continue into college.
Relationship with educational institutions: Some programs are run by colleges themselves, while others are run by outside organizations and offer support beyond what colleges provide.
Programmatic targeting: Some college success efforts aim to reach and are open to all students enrolled in an institution, whereas others target specific populations and set criteria for participation.
Programmatic intensity and structure: The lightest-touch efforts involve increased communication to students to encourage engagement with campus offices and services that are important for their continued enrollment and success. The most intensive programs offer a set of structured supports, and they shape how and where students spend their time by requiring and/or incentivizing take-up of these supports.
Understudied aspects of college success efforts: The most prominent goal of higher education is to build human capital and improve labor market outcomes. However, most studies examine the effects of interventions on educational outcomes such as enrollment, persistence/credit accumulation, and completion.9 Relatively few studies of college success efforts track participants all the way through college and into the labor market, leaving open questions about the labor market payoffs of improved college-going outcomes for students targeted by college success programming.
Policy considerations: Policymakers who are considering the implementation or expansion of college-going supports should carefully consider the student populations targeted and the specific barriers to college success that they face. Program designers should take seriously the complex organizational environment of colleges and universities and the additional administrative structures that may be needed for collaboration and orchestration of work across administrative units within organizations. Program designers should also carefully attend to the constraints that students face and how to incentivize participation in college success efforts or how to build college-going supports seamlessly into students’ college experience. Finally, program designers should take into account the financial sustainability of any programmatic effort, given the often high costs per student of the multifaceted programs that we describe here.
More successful programs give structure to students’ college experiences and often build in incentives and/or requirements to encourage take-up of program services. Programs with more components that address a broad range of student needs tend to work better, but research has not clearly identified which components are most critical for success.
Among post-secondary success programs that have strong evaluations, a few quite comprehensive interventions that address a range of barriers that college students face have produced the largest effects on degree completion. The exemplar in this category is the City University of New York Accelerated Study in Associate Programs (CUNY ASAP) and expansions of that program to community colleges in Ohio (Ohio ASAP). ASAP programs provide students with financial resources, structured academic pathways, and a range of direct support services. Baseline completion rates among the populations served by these programs were just under 20%, and the intervention increased completion by 22 and 18 percentage points for the CUNY and Ohio programs, respectively.10,11
Longer-term follow-up shows that these programs not only reduced the time to completion but also increased degree attainment overall12 and, in the case of Ohio ASAP, earnings13 (the effects of CUNY ASAP on earnings have not been studied). A similar model for four-year college students was also evaluated at CUNY’s John Jay College. That program, Accelerate, Complete, Engage (ACE), increased bachelor’s completion in five years by 12 percentage points, compared to a control group completion rate of 57%.14 There has not been a longer-term follow-up of ACE (not enough time has passed); hence, it is not yet clear whether the program increased degree completion overall or only reduced time to degree.
A few comprehensive interventions target increased attainment of credentials aligned with local labor market demand, including some associate degrees as well as shorter-term certificates. These interventions typically serve nontraditional students facing multiple barriers to completing a program; such interventions could be considered college success or job training programs. For example, Project Quality Employment Through Skills Training (QUEST) in San Antonio, Texas, offers financial aid, remedial instruction, counseling, referrals, skills training, and job placement assistance to adults seeking training for jobs in healthcare, information technology (IT), manufacturing, and trades. This program increased the completion of any healthcare certificate by 26 percentage points (compared to a control mean of 42%) after six years and earnings by more than $5,000 after nine years.15
The Valley Initiative for Development and Advancement (VIDA) offers financial support, full-time enrollment in degree programs, counseling, wraparound services, and a 16-week College Prep Academy, all aligned with local labor market needs to help low-income adults gain employment in higher-paying sectors. VIDA increased credential completion by 12 percentage points after three years (compared to a control mean of 54%). While the program did not increase earnings after three years, this amount of time may be too soon to see any effects.16
Other comprehensive programs show promise, but their effects have been somewhat smaller than those described above. For example, One Million Degrees (OMD) is a comprehensive program that provides financial aid, skill-building workshops, advising, and coaching to community college students in the Chicago area, supporting them in earning their first college degree. OMD increased the three-year completion of associate degrees by 8 percentage points, compared to a baseline completion rate of 39%. However, the effects of OMD were largely driven by students who enrolled in the program directly from high school, for whom there was a large participation effect: compared to the control group, students in the program were more likely to ever enroll in community college. Since most of the intervention components operate after enrollment, this raises questions about whether the comprehensive, in-college supports were the driving mechanism underlying the impacts on completion.17
Not all comprehensive programs have been effective. For example, Detroit’s Promise Path served students participating in the Detroit Promise program, supplementing the financial aid provided by the Promise scholarship with comprehensive coaching, including additional financial incentives for meeting with the coach, but it did not increase completion (though it did significantly increase the number of credits completed).18 Washington's Integrated Basic Education and Skills Training (I-BEST) program promotes enrollment in and completion of credentials that are tightly linked to specific occupations with a comprehensive set of supports. A long-term evaluation did not find any detectable effects of the program on the completion of credentials requiring more than one year to complete or on earnings after six years. However, the study did not enroll as many participants as hoped, so the estimates are imprecise.19 The effects of Upward Bound, a relatively expensive comprehensive college access program that offers a range of services including tutoring and counseling, have been the subject of debate, but in any case, they are not consistently large across subgroups and sites.20
Comprehensive models cost more than some colleges are willing or able to spend. In fact, the Ohio ASAP sites have struggled to sustain the program, as discussed under key finding #4, and consistent fidelity of implementation may be difficult to maintain. The Scaling Up College Completion Efforts for Student Success (SUCCESS) program, an intervention being evaluated at five colleges, is an effort to implement the principles of ASAP at lower cost by offering students personalized coaching, financial incentives for meeting program requirements, data-driven tracking of participation and progress, and strategies to support full-time enrollment. To date, the program has produced only modest improvements in student progress, though it was implemented during the disruption caused by COVID-19. Furthermore, there is some evidence of larger effects at sites where the program was implemented with more fidelity.21
The authors of the ASAP evaluation note that the program was more expensive than less comprehensive programs on a per-student or per-credit basis but less expensive on a per-degree basis.22 Furthermore, even among participants in the most successful completion programs, often half or more do not complete. This reality raises important questions about the value of credits that do not lead to a degree, and it points to the need to ensure that the student loan and accountability system protects students who complete some post-secondary education without obtaining a certificate or degree.
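The per-student versus per-degree distinction can be made concrete with a bit of arithmetic. The sketch below uses entirely hypothetical dollar figures and graduation rates (not the actual ASAP evaluation numbers) to show how a program that costs more per enrolled student can still cost less per degree produced when it substantially raises completion:

```python
# Hypothetical illustration of why a costlier program can be cheaper per degree.
# All dollar figures and rates below are invented for illustration; they are
# not the actual ASAP evaluation estimates.

def cost_per_degree(cost_per_student, graduation_rate):
    """Total spending divided by degrees produced, per enrolled student."""
    return cost_per_student / graduation_rate

# Usual services: cheaper per student, but few students finish.
usual = cost_per_degree(cost_per_student=30_000, graduation_rate=0.20)

# Comprehensive program: ~50% more spending per student,
# but completion roughly doubles.
comprehensive = cost_per_degree(cost_per_student=45_000, graduation_rate=0.42)

print(f"Usual services: ${usual:,.0f} per degree")
print(f"Comprehensive:  ${comprehensive:,.0f} per degree")
```

Under these invented numbers, the comprehensive program spends 50% more per student yet produces each degree for roughly a third less, which is the pattern the ASAP evaluators describe.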
Informational barriers and procedural complexity can hinder students’ college access and success. For example, many students do not complete the Free Application for Federal Student Aid (FAFSA) each year because they do not think that they will qualify for financial aid, do not know about the FAFSA, or do not know how to complete it.23 Young adults can face significant challenges in navigating the higher education landscape. This is especially so, given that, from a developmental perspective, young adults can be comparatively present focused, impulsive, averse to seeking help, and inexperienced in handling complex tasks.24
Numerous light-touch interventions have been implemented and tested to address these concerns. These efforts do not involve building new support structures or programming. Rather, they typically aim to improve students’ college access and success by reducing procedural complexity, encouraging students to complete required tasks, and/or increasing students’ take-up of existing supports and resources that are already available but are often underutilized. These light-touch efforts typically do not include personalized help from program or campus staff. As a result, they can be relatively inexpensive to implement and scale.
Such efforts can help students navigate the administrative tasks associated with college-going.25 Offering college entrance exams during the regular school day and filing the FAFSA together with the annual tax filing are two examples of how reducing procedural costs and complexity can increase the completion of key college-going tasks, with subsequent effects on college enrollment and persistence.26 Similarly, clear, personalized communication about eligibility for scholarship aid dramatically increased application and attendance rates at the University of Michigan.27
In addition to reducing procedural complexity, proactively communicating with, supporting, and “nudging” students toward task completion can be an effective approach to encouraging students to complete certain processes.28 Several studies have documented the success of low-touch, nudge interventions that support students through the specific administrative steps required to apply to and enroll in college, to file or refile financial aid applications, and to take up basic needs supports.29 However, these positive impacts have not been replicated in all contexts.30
Despite this general promise, subsequent efforts to scale and/or expand such strategies have revealed important limitations.31 First, earlier studies tested the efficacy of proactive outreach to drive students’ completion of discrete, well-defined tasks. In contrast, subsequent efforts using similar strategies to shape more sustained student behaviors yielded less promising results. For example, in a set of studies focused on goal setting, student mindset, and success coaching—processes with different foci, more complex components, and longer time horizons—the authors emerge with a pessimistic view of the potential for nudge strategies to improve academic outcomes.32 This work, together with other studies of nudging in post-secondary contexts,33 points to the conclusion that proactive outreach and support can improve students’ completion of discrete, time-bound tasks but is less effective on its own at sustaining behavioral change over time.
Second, proactive outreach is generally more effective for motivating completion of college-related tasks when it comes from a trusted, local source with which students have a preexisting relationship. One pair of experiments targeting high school students supports this observation, finding that outreach framed as coming from a student’s own high school counselor does more to increase students’ completion of key college-going tasks than outreach from a more distal sender, such as the College Board.34 Similarly, in large-scale, state and national efforts to bolster FAFSA filing and reapplications, the impacts on FAFSA submission and completion are modest to null.35 In both of these studies, outreach was framed as coming from a centralized entity with which students had no affiliation. The authors of one of these studies conclude that scaling of such efforts needs to happen “locally,” institution by institution, rather than “globally” through a centralized entity.36 As an example of this premise, both email and text-based outreach integrated into undergraduate courses have shown promise for improving student academic performance.37
Relatedly, local organizations (e.g., schools, colleges, and counseling organizations) are better positioned to nudge students because they have better insight into the contexts in which students operate, the tasks that they need to complete, the required timing of those tasks, and students’ status with regard to completing required processes. Hence, local organizations can better craft tailored, relevant outreach. By leveraging administrative data, organizations can target communication only to students for whom specific messages are relevant. For example, when Georgia State University (GSU) was implementing a text-based chatbot to communicate with students intending to enroll at the university, it used data on students’ status with regard to completing required administrative processes to target communication on specific tasks (e.g., submitting the final high school transcript, attending orientation, completing loan counseling) only to those students who had not yet completed them.38
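The targeting logic in the GSU example can be sketched in a few lines. The code below is illustrative only: the task names, message text, and student records are hypothetical, and a real system would pull completion status from institutional administrative data rather than an in-memory dictionary.

```python
# Illustrative sketch of data-driven targeted outreach, in the spirit of the
# GSU chatbot example. Task names, messages, and records are hypothetical.

REQUIRED_TASKS = ["final_transcript", "orientation", "loan_counseling"]

MESSAGES = {
    "final_transcript": "Reminder: please send your final high school transcript.",
    "orientation": "Sign up for a new-student orientation session.",
    "loan_counseling": "Complete your required loan entrance counseling.",
}

def pending_messages(student_record):
    """Return outreach messages only for tasks the student has not completed."""
    done = student_record["completed"]
    return [MESSAGES[t] for t in REQUIRED_TASKS if not done.get(t, False)]

# A student who has submitted a transcript but not yet attended orientation
# or completed loan counseling receives only the two relevant reminders.
student = {"name": "A. Lopez", "completed": {"final_transcript": True}}
for msg in pending_messages(student):
    print(msg)
```

The design point is the filter in `pending_messages`: by consulting each student's task status before sending, the institution avoids blanketing all students with irrelevant reminders, which is what makes locally held administrative data valuable for this kind of outreach.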
In summary, efforts to reduce procedural complexity and to proactively communicate with students with reminders about required tasks can produce impacts that are impressive, given their relatively low cost. At the same time, these impacts are often modest in absolute terms, suggesting that such strategies have a role to play but are unlikely to lead to dramatic improvements in college access and success among those who most need additional supports. In short, theories of action that rely on students to permanently change their behavior without institutions changing their systems, structures, and supports will likely fall short. Instead, these light-touch strategies may be most effective when they are integrated into comprehensive systems of support that account for the several dimensions along which students can falter in achieving college success.39
Many college interventions seek to provide less advantaged students with the type of college guidance and support that more advantaged students frequently receive. The most affluent students often have highly engaged parents who hound them about attending class, going to office hours, and doing their homework—or who even complete tasks on behalf of their child. College coaching and mentoring interventions try to match less advantaged students with adults or near-peers who can provide similar information, advice, encouragement, reminders, and help completing important tasks. In one field experiment evaluating college access coaches, the authors argue this point, noting that “our mentoring program appears to substitute for (…) parental or teacher time and encouragement.”40
Both in-person and virtual coaching modalities have been evaluated in different student contexts at both the access and completion margins via randomized controlled trials. Evidence consistently shows positive effects of in-person coaching at the college access/entry margin (often with downstream effects on completion), with more mixed results for virtual advising and for coaching after college entry.
In-person, college access/entry: Several nonprofits operate college advising programs that have successfully increased college enrollment. One of the most effective coaching interventions is Bottom Line. In Bottom Line, full-time, professional advisors work with a caseload of 50-60 students and meet one-on-one with each student for approximately an hour per month starting at the end of the students’ junior year of high school and, if students attend a “partner college,” continuing throughout college. Students randomly assigned to Bottom Line advising were significantly more likely to attend any college and were more likely to attend a four-year college.41 Students assigned to Bottom Line were also more likely to earn a bachelor’s degree in four years; the authors argue that this finding is most likely a result of students attending more well-resourced institutions, though the ongoing support after enrollment may also contribute to higher completion rates.42
The nonprofit College Forward uses a similar model of individualized advising starting during students’ junior year of high school and persisting after enrollment, and it has produced similarly large increases in college-going and persistence.43 Other variants of in-person advising include College Possible, which did not increase college enrollment overall but shifted students from two- to four-year colleges and to more selective four-year institutions.44
Integrating college coaching into the high school experience can also be effective. One study in Michigan developed a college planning module that teachers administered during existing 12th-grade classes (e.g., integrated into Senior English or during homeroom). The program did not increase enrollment overall, but it did shift who enrolled in college. Enrollment among lower-income, high-achieving students increased, while enrollment among higher-income, lower-achieving students declined. This compositional change led to overall higher persistence rates three years after the students graduated from high school, suggesting that some of the lower-achieving students who did not enroll in college would have been likely to drop out.45 Another study of college planning courses in high schools with historically low rates of college-going also found increases in enrollment.46 Efforts to expand college-going by providing high schools with near-peer counselors dedicated to helping students navigate the college application process initially found small effects on enrollment among low-income students, but there were no effects in subsequent years. This could indicate that high school advising does not work as well when it is not woven into the curriculum, but the authors note other potential explanations for the declining treatment effect, including the possibility that more services were available in the control group schools over time.47
Virtual advising for college entry: Attempts to leverage virtual or remote mentors and coaches to support students have not been as successful as their in-person counterparts. Several interventions have found no effect of remote advising on overall college enrollment, though some have affected where students applied, modestly shifted enrollment toward more well-resourced institutions, and increased students' sense of being supported in their college search.48,49,50
Coaching for completion: Supporting students’ persistence through coaching after they enroll has proven more challenging. One program with demonstrated success is InsideTrack, a for-profit company that contracts with colleges to offer students individualized advising for approximately a year; research shows that its advising model significantly increases persistence and degree attainment.51 The InsideTrack model involves mostly remote coaching conducted via phone calls, email exchanges, and virtual meetings. The professional nature of InsideTrack (similar to the training for Bottom Line on the access margin) may help explain the program’s positive effects. The program is also customized for each campus, and coaches have access to information about students’ progress.
Monitoring Advising Analytics to Promote Success (MAAPS) is an advising intervention that increases the number of college advisors on a campus and encourages them to engage in more proactive, data-driven outreach to students. When MAAPS was implemented with high fidelity, students were more likely to complete credits and earn higher grades.52 However, as discussed in regard to key finding #4, the implementation of these promising models across different contexts has been difficult and, in the case of MAAPS, did not replicate the initial findings. Some colleges have integrated college persistence coaching into coursework (similar to the Michigan high school college access course model) focused on teaching time management. However, these efforts have not had large effects on academic performance or persistence.53,54
Requiring or incentivizing tasks increases completion: Institutions often struggle to increase students’ take-up of services, limiting the overall effectiveness of these interventions. Students lead busy and complex lives and are often unlikely to seek out resources on their own. When an intervention component is promoted as required or incentivized, students are more likely to complete the task or process. While many college student success courses have not significantly affected student outcomes, one version that strongly emphasized that students were required to take the course increased performance, indicating that stronger messaging about the required nature of a college coaching program will increase students’ engagement with the program and improve students’ outcomes.55 Incentives to use services can also increase engagement with discrete tasks, though the downstream effect on enrollment and persistence is not clear. For example, in the CollegePoint advising model, the use of financial incentives significantly increased students’ likelihood of completing tasks such as meeting with advisors and applying for colleges, though ultimately it did not affect whether or where students enrolled.56 Similarly, the Virtual Student Outreach for College Enrollment (VSOURCE) intervention offered students gift cards for completing milestone tasks (e.g., registering for and taking the SAT), and it increased students’ completion of the tasks. However, it did not have an effect overall on college enrollment.57 As the CollegePoint study authors posit, it may be that completing these pre-college tasks is necessary but insufficient to ensure enrollment, given the many structural factors often outside of students’ control, such as institutional admissions, college affordability, and financial aid policy.58
Colleges and universities are complex organizations, with administrative units often operating in silos. Providing students with comprehensive and orchestrated supports can require many different units to communicate and collaborate in new ways. Even when implemented with fidelity, programs that appeared promising in some contexts do not always yield similarly positive results when replicated in other contexts.
The most optimistic evidence on sustaining and scaling comprehensive student support programs comes from evaluations of ASAP, which is described in more detail above (see key finding #1). The program was successfully expanded within CUNY and successfully adapted to a four-year degree context, and versions of it were successfully replicated in Ohio. Notably, however, only one of the three colleges that participated in the Ohio replication study continuously maintained its ASAP-based program.59 One of the two colleges that discontinued the program after the demonstration evaluation has recently reinstated it after a successful fundraising campaign led by the college’s leadership to ensure sufficient funding.60
InsideTrack’s student coaching model (see key finding #3 above) has also been successfully scaled.61 Since 2001, it has served over 3 million students across both two- and four-year institutions.62 Its unique approach—delivering high-touch support from highly trained professional coaches via telephone and electronic communication—keeps costs low and is highly scalable compared to other comprehensive student support programs.
Project QUEST continued beyond its demonstration phase and has served over 10,000 students since 1992.63 One replication, VIDA, which focuses on adults enrolled in any program at partner community colleges, has shown promising results,64 and another, Capital IDEA, which targets students in registered nursing associate degree programs in Austin, Texas, is being evaluated in an RCT. Both replications adapted Project QUEST’s programmatic approach to support student populations training for employment in different areas of specialization.
The evidence for scaling other comprehensive approaches is less encouraging. First, the MAAPS program, which featured technology-enhanced proactive advising, successfully increased students’ credit success rates and cumulative GPAs at GSU, but it failed to replicate GSU’s positive effects at 11 other universities. This failure to replicate has been attributed to local adaptations and early implementation challenges such as advisor turnover, low student engagement, and poor integration with other advising infrastructure.65
Second, the Stay the Course program was designed with replication in mind and showed promising experimental results at one campus of Tarrant County College.66 However, an attempt to replicate it at a larger scale at the same community college two years later faced significant implementation challenges and yielded null effects on student outcomes. The program was discontinued after just one year.67 Despite this setback, two additional preregistered RCT evaluations are still ongoing.
Third, the SUCCESS program had no positive effects on persistence or credit accumulation after one year of implementation at seven community colleges in California, Indiana, New Jersey, and Ohio.68 Program implementation diverged from the SUCCESS model mostly due to adaptations during the COVID-19 pandemic and varied by college and semester. These adaptations, coupled with differing responses to external factors and the unique challenges faced by various student cohorts at each college, likely contributed to the lack of significant impact.
Evidence from implementation studies suggests several key barriers to scaling multifaceted student support programs with fidelity. First, cost is a major challenge. Comprehensive, multiyear advising programs that combine additional financial aid with other student supports are expensive, and many contexts lack the political will or resources to fund such programs.69
Second, local adaptation may be needed to tailor any program model to the unique resources, existing structures, and student needs present in a particular context. However, such adaptation can compromise implementation fidelity and, ultimately, program effectiveness.70 Established models offer varying degrees of flexibility. A recent meta-analysis indicates that the effectiveness of multifaceted student support programs increases with more intervention components, greater emphasis on full-time enrollment, and higher rates of student usage of advising and tutoring.71 These findings suggest that local adaptations must be carefully balanced against implementation fidelity to achieve the desired outcomes.
Finally, the complexity of these programs can require significant institutional change, including changes to structured and informal communication, collaboration, and data sharing across administrative offices and academic departments, and sometimes even across campuses or state agencies.72 Established organizational structures and cultures complicate the development of the new processes that holistic student support programs require. Multiple stakeholders must contribute to core programmatic decisions, such as those about the program model, target population, staffing, timeline, sustainability, data management, and evaluation. Such work requires extensive planning and coordination among academic affairs, advising, student services, financial aid, and other administrative units.73 In other words, implementing multifaceted student support programs is a complex endeavor, and sustained institutional leadership plays a crucial role in driving the organizational change often needed for successful execution.
These challenges are reflected in the emerging qualitative evidence on strategies for scaling successful programs. Expansion of One Million Degrees at City Colleges of Chicago benefited from a unified vision, investment in a proven model, and a collaborative approach to advising in which One Million Degrees staff partnered closely with the colleges’ advising departments.74 At Bronx Community College, scaling ASAP required articulating its role in student success, improving communication across the institution, and adjusting core college functions such as admission and enrollment, developmental education placement, and course scheduling.75 Evidence from the Detroit Promise Path’s expansion suggests that leveraging early success to secure funding and stakeholder support may facilitate effective scaling.76
Future research should explore whether effective comprehensive support programs can be successfully scaled across various contexts. Given limited resources, it is also important to identify the most critical components of these programs and develop cost-effective, adaptable models. Consistent reporting of key intervention features and contextual characteristics will help researchers synthesize evidence and inform policy action.
Irwin, V., Zhang, J., Wang, X., Hein, S., Wang, K., Roberts, A., ... & Purcell, S. (2021). Report on the Condition of Education 2021. NCES 2021-144. National Center for Education Statistics.↩︎
Ma, J. & Pender, M. (2023). Education Pays 2023: The benefits of higher education for individuals and society. College Board.↩︎
Dynarski, S., Page, L., & Scott-Clayton, J. (2023). College costs, financial aid, and student decisions. In E. A. Hanushek, S. Machin, & L. Woessmann (Eds.), Handbook of the Economics of Education (Vol. 7, pp. 227-285). Elsevier.↩︎
Smith, J., Pender, M., & Howell, J. (2013). The full extent of student-college academic undermatch. Economics of Education Review, 32, 247-261.↩︎
Dynarski, S., Nurshatayeva, A., Page, L. C., & Scott-Clayton, J. (2023). Addressing nonfinancial barriers to college access and success: Evidence and policy implications. In E. A. Hanushek, S. Machin, & L. Woessmann (Eds.), Handbook of the Economics of Education (Vol. 6, pp. 319-403). Elsevier.↩︎
Reber, S. & Smith, E. (2023). College enrollment disparities: Understanding the role of academic preparation. Brookings.↩︎
Walton, G. M., Murphy, M. C., Logel, C., Yeager, D. S., Goyer, J. P., Brady, S. T., ... & Krol, N. (2023). Where and with whom does a brief social-belonging intervention promote progress in college? Science, 380(6644), 499-505.↩︎
For previous reviews, see
Domina, T. (2009). What works in college outreach: Assessing targeted and schoolwide interventions for disadvantaged students. Educational Evaluation and Policy Analysis, 31(2), 127-152.
Swail, W. S., & Perna, L. W. (2002). Pre-college outreach programs. In W.G. Tierney & L.S. Hagedorn (Eds.), Increasing access to college: Extending possibilities for all students (pp. 15-34). SUNY Press.
Page, L. C., & Scott-Clayton, J. (2016). Improving college access in the United States: Barriers and policy responses. Economics of Education Review, 51, 4-22.
Dynarski, S., Nurshatayeva, A., Page, L. C., & Scott-Clayton, J. (2023). Addressing nonfinancial barriers to college access and success: Evidence and policy implications. In E. A. Hanushek, S. Machin, & L. Woessmann (Eds.), Handbook of the Economics of Education (Vol. 6, pp. 319-403). Elsevier.↩︎
Harris, D. N. (2013). Applying cost-effectiveness analysis to higher education. In A. Kelly & K. Carey (Eds.), Stretching the Higher Education Dollar (pp. 45-66). Harvard Education Press.↩︎
Scrivener, S., Weiss, M. J., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling graduation rates: Three-year effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for developmental education students. MDRC.↩︎
Sommo, C., Cullinan, D., Manno, M., Blake, S., & Alonzo, E. (2018). Doubling graduation rates in a new state: Two-year findings from the ASAP Ohio demonstration. MDRC.↩︎
Miller, C., & Weiss, M. J. (2022). Increasing community college graduation rates: A synthesis of findings on the ASAP model from six colleges across two states. Educational Evaluation and Policy Analysis, 44(2), 210–233.↩︎
Hill, C., Sommo, C., & Warner, K. (2023). From degrees to dollars: Six-year findings from the ASAP Ohio demonstration. MDRC.↩︎
Miller, C., & Weiss, M. J. (2022). Increasing community college graduation rates: A synthesis of findings on the ASAP model from six colleges across two states. Educational Evaluation and Policy Analysis, 44(2), 210–233.
Scuello, M., & Strumbos, D. (2024). Evaluation of Accelerate, Complete, Engage (ACE) at CUNY John Jay College of Criminal Justice: Final report. MDRC.↩︎
Miller, C., & Weiss, M. J. (2022). Increasing community college graduation rates: A synthesis of findings on the ASAP model from six colleges across two states. Educational Evaluation and Policy Analysis, 44(2), 210–233.↩︎
Rolston, H., Copson, E., Buron, L., & Dastrup, S. (2021). Valley Initiative for Development and Advancement (VIDA): Three-year impact report. PACE.↩︎
Martinson, K., Cho, S.-W., & Gardiner, K. (2018). Washington State’s Integrated Basic Education and Skills Training (I-BEST) program in three colleges: Implementation and early impact report. PACE.↩︎
Ratledge, A., Sommo, C., Cullinan, D., O’Donoghue, R., Lepe, M., & Camo-Biogradlija, J. (2021). Motor City Momentum: Three years of the Detroit Promise Path program for community college students. MDRC.↩︎
Martinson, K., & Glosser, A. (2022). Washington State’s Integrated Basic Education and Skills Training (I-BEST) program: Six-year impact report. PACE.↩︎
Seftor, N. S., Mamun, A., & Schirm, A. (2009). The impacts of regular Upward Bound on post-secondary outcomes seven to nine years after scheduled high school graduation: Final report. US Department of Education.↩︎
Ratledge, A., Sommo, C., Cullinan, D., O’Donoghue, R., Lepe, M., & Camo-Biogradlija, J. (2021). Motor City Momentum: Three years of the Detroit Promise Path program for community college students. MDRC.
Sommo, C., Slaughter, A., Saunier, C., Scrivener, S., & Warner, K. (2023). Varying levels of SUCCESS. MDRC.↩︎
Scrivener, S., Weiss, M. J., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling graduation rates: Three-year effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for developmental education students. MDRC.↩︎
Bahr, S., Sparks, D., & Hoyer, K. M. (2018). Why didn't students complete a Free Application for Federal Student Aid (FAFSA)? A detailed look. Stats in Brief. NCES 2018-061. National Center for Education Statistics.↩︎
Casey, B.J., Jones, R.M., & Somerville, L.H. (2011). Braking and accelerating of the adolescent brain. Journal of Research in Adolescence, 21(1), 21–33.
Castleman, B.L. & Page, L.C. (2015). Summer nudging: Can personalized text messages and peer mentor outreach increase college going among low-income high school graduates? Journal of Economic Behavior and Organization, 115, 144–160.
Steinberg, L. (2008). A social neuroscience perspective on adolescent risk-taking. In K.M. Beaver (Eds.), Biosocial theories of crime (pp. 78–106). Taylor & Francis.
Steinberg, L., Cauffman, E., Woolard, J., Graham, S., & Banich, M. (2009). Are adolescents less mature than adults? Minors’ access to abortion, the juvenile death penalty, and the alleged APA “Flip-Flop”. American Psychologist, 64(7), 583–594.↩︎
Bohns, V. K., & Flynn, F. J. (2010). “Why didn’t you just ask?” Underestimating the discomfort of help-seeking. Journal of Experimental Social Psychology, 46(2), 402-409.↩︎
Dynarski, S., Nurshatayeva, A., Page, L. C., & Scott-Clayton, J. (2023). Addressing nonfinancial barriers to college access and success: Evidence and policy implications. In E. A. Hanushek, S. Machin, & L. Woessmann (Eds.), Handbook of the Economics of Education (Vol. 6, pp. 319-403). Elsevier.↩︎
Bettinger, E., Long, B.T., Oreopoulos, P., & Sanbonmatsu, L. (2012). The role of application assistance and information in college decisions: results from the H&R Block FAFSA experiment. Quarterly Journal of Economics, 127(3), 1205-1242.
Hurwitz, M., Smith, J., Niu, S. & Howell, J. (2015). The Maine question: How is 4-year college enrollment affected by mandatory college entrance exams? Educational Evaluation and Policy Analysis, 37(1), 138-159.
Hyman, J. (2017). ACT for all: The effect of mandatory college entrance exams on postsecondary attainment and choice. Education Finance and Policy, 12(3), 281-311.↩︎
Dynarski, S., Libassi, C., Michelmore, K., & Owen, S. (2021). Closing the gap: The effect of reducing complexity and uncertainty in college pricing on the choices of low-income students. American Economic Review, 111(6), 1721-1756.↩︎
Thaler, R.H. & Sunstein, C.R. (2008). Nudge: Improving decisions about health, wealth, and happiness. Yale University Press.↩︎
Avery, C., Castleman, B.L., Hurwitz, M., Long, B.T., & Page, L.C. (2021). Digital messaging to improve college enrollment and success. Economics of Education Review, 84, 102170.
Goldrick-Rab, S., Clark, K., Baker-Smith, C., & Witherspoon, C. (2021). Supporting the whole community college student: The impact of nudging for basic needs security. Hope Center for College, Community, and Justice.
Castleman, B. L., Meyer, K. E., Sullivan, Z., Hartog, W. D., & Miller, S. (2017). Nudging students beyond the FAFSA: The impact of university outreach on financial aid behaviors and outcomes. Journal of Student Financial Aid, 47(3), 2.
Castleman, B. L., & Page, L. C. (2015). Summer nudging: Can personalized text messages and peer mentor outreach increase college going among low-income high school graduates? Journal of Economic Behavior and Organization, 115, 144-160.
Castleman, B. L., & Page, L. C. (2016). Freshman year financial aid nudges: An experiment to increase FAFSA renewal and college persistence. Journal of Human Resources, 51(2), 389-415.
Hyman, J. (2020). Can light‐touch college‐going interventions make a difference? Evidence from a statewide experiment in Michigan. Journal of Policy Analysis and Management, 39(1), 159-190.↩︎
Linkow, T., Miller, H., Parsad, A., Price, C., & Martinez, A. (2021). Study of college transition messaging in GEAR UP: Impacts on enrolling and staying in college. Institute of Education Sciences.
Page, L. C., Meyer, K., Lee, J., & Gehlbach, H. (2023). Conditions under which college students can be responsive to nudging. EdWorkingPapers.↩︎
Oreopoulos, P., & Petronijevic, U. (2019). The remarkable unresponsiveness of college students to nudging and what we can learn from it (No. w26059). National Bureau of Economic Research.↩︎
Page, L. C., Meyer, K., Lee, J., & Gehlbach, H. (2023). Conditions under which college students can be responsive to nudging. EdWorkingPapers.
Bettinger, E. P., Castleman, B. L., Choe, A., & Mabel, Z. (2022). Finishing the last lap: Experimental evidence on strategies to increase attainment for students near college completion. Journal of Policy Analysis and Management, 41(4), 1040–1059.
Clark, D., Gill, D., Prowse, V., & Rush, M. (2020). Using goals to motivate college students: Theory and evidence from field experiments. The Review of Economics and Statistics, 102(4), 648–663.↩︎
Avery, C., Castleman, B.L., Hurwitz, M., Long, B.T., & Page, L.C. (2021). Digital messaging to improve college enrollment and success. Economics of Education Review, 84, 102170.
Gurantz, O., Howell, J., Hurwitz, M., Larson, C., Pender, M., & White, B. (2020). A national‐level informational experiment to promote enrollment in selective colleges. Journal of Policy Analysis and Management, 40(2), 453-479.↩︎
Bird, K. A., Castleman, B. L., Denning, J. T., Goodman, J., Lamberton, C., & Rosinger, K. O. (2021). Nudging at scale: Experimental evidence from FAFSA completion campaigns. Journal of Economic Behavior and Organization, 183, 105-128.
Page, L. C., Sacerdote, B. I., Goldrick-Rab, S., & Castleman, B. L. (2023). Financial aid nudges: A national experiment with informational interventions. Educational Evaluation and Policy Analysis, 45(2), 195-219.↩︎
Bird, K. A., Castleman, B. L., Denning, J. T., Goodman, J., Lamberton, C., & Rosinger, K. O. (2021). Nudging at scale: Experimental evidence from FAFSA completion campaigns. Journal of Economic Behavior and Organization, 183, 105-128.↩︎
Carrell, S. E., & Kurlaender, M. (2023). My professor cares: Experimental evidence on the role of faculty engagement. American Economic Journal: Economic Policy, 15(4), 113-141.
Meyer, K., Page, L. C., Mata, C., Smith, E., Walsh, B. T., Fifield, C. L., Eremionkhale, A., Evans, M., & Frost, S. (2023). Let’s chat: Leveraging chatbot outreach for improved course performance. EdWorkingPapers.↩︎
Page, L. C., & Gehlbach, H. (2017). How an artificially intelligent virtual assistant helps students navigate the road to college. AERA Open, 3(4), 1-12.↩︎
Anzelone, C., Weiss, M., Headlam, C., & Alemany, X. (2020). How to encourage college summer enrollment: Final lessons from the EASE project. MDRC.
Gallego, F., Oreopoulos, P., & Spencer, N. (2023). The importance of a helping hand in education and in life (No. w31706). National Bureau of Economic Research.
Ortagus, J. C., Tanner, M., & McFarlin, I. (2021). Can re-enrollment campaigns help dropouts return to college? Evidence from Florida community colleges. Educational Evaluation and Policy Analysis, 43(1), 154-171.
Phillips, M., & Reber, S. (2022). Does virtual advising increase college enrollment? Evidence from a random-assignment college access field experiment. American Economic Journal: Economic Policy, 14(3), 198–234.↩︎
Carrell, S., & Sacerdote, B. (2017). Why do college-going interventions work? American Economic Journal: Applied Economics, 9(3), 124-51.↩︎
Barr, A. C., & Castleman, B. L. (2021). The Bottom Line on college advising: Large increases in degree attainment. EdWorkingPapers.↩︎
Ibid.↩︎
Castleman, B. L., Deutschlander, D., & Lohner, G. (2020). Pushing college advising forward: Experimental evidence on intensive advising and college success. EdWorkingPapers.↩︎
Avery, C. (2013). Evaluation of the College Possible program: Results from a randomized controlled trial. National Bureau of Economic Research.↩︎
Hyman, J. (2023). College counseling in the classroom: Randomized evaluation of a teacher-based approach to college advising. EdWorkingPapers.↩︎
Oreopoulos, P., & Ford, R. (2019). Keeping college options open: A field experiment to help all high school seniors through the college application process. Journal of Policy Analysis and Management, 38(2), 426–454.↩︎
Bettinger, E. P., & Evans, B. J. (2019). College guidance for all: A randomized experiment in pre-college advising. Journal of Policy Analysis and Management, 38(3), 579–599.↩︎
Sullivan, Z., Castleman, B. L., Lohner, G., & Bettinger, E. (2021). College advising at a national scale: Experimental evidence from the CollegePoint initiative. EdWorkingPapers.↩︎
Gurantz, O., Pender, M., Mabel, Z., Larson, C., & Bettinger, E. (2020). Virtual advising for high-achieving high school students. Economics of Education Review, 75, 101974.↩︎
Phillips, M., & Reber, S. (2022). Does virtual advising increase college enrollment? Evidence from a random-assignment college access field experiment. American Economic Journal: Economic Policy, 14(3), 198–234.↩︎
Bettinger, E. P., & Baker, R. B. (2014). The effects of student coaching: An evaluation of a randomized experiment in student advising. Educational Evaluation and Policy Analysis, 36(1), 3–19.↩︎
Alamuddin, R., Rossman, D., & Kurzweil, M. (2018). Monitoring Advising Analytics to Promote Success (MAAPS): Evaluation findings from the first year of implementation. Ithaka S+R.↩︎
Rutschow, E. Z., Cullinan, D., & Welbeck, R. (2012). Keeping students on course: An impact study of a student success course at Guilford Technical Community College. MDRC.↩︎
Culver, K. C., & Bowman, N. A. (2020). Is what glitters really gold? A quasi-experimental study of first-year seminars and college student success. Research in Higher Education, 61(2), 167–196.↩︎
Scrivener, S., Sommo, C., & Collado, H. (2009). Getting back on track: Effects of a community college program for probationary students. MDRC.↩︎
Bird, K. A., & Castleman, B. L. (2024). Do financial incentives increase the impact of national-scale educational programs? Experimental evidence from a national college advising initiative. EdWorkingPapers.↩︎
Phillips, M., & Reber, S. (2022). Does virtual advising increase college enrollment? Evidence from a random-assignment college access field experiment. American Economic Journal: Economic Policy, 14(3), 198–234.↩︎
Bird, K. A., & Castleman, B. L. (2024). Do financial incentives increase the impact of national-scale educational programs? Experimental evidence from a national college advising initiative. EdWorkingPapers.↩︎
Ratledge, A., & Wavelet, M. (2021). Improving college graduation rates with multifaceted student support programs: Here's what institutions and state agencies need to know. MDRC.↩︎
Mowreader, A. (2024). Funding student success: Creating sustainable student supports. Inside Higher Ed.↩︎
Bettinger, E. P., & Baker, R. B. (2014). The effects of student coaching: An evaluation of a randomized experiment in student advising. Educational Evaluation and Policy Analysis, 36(1), 3-19.↩︎
InsideTrack. (n.d.). Students we serve. InsideTrack.↩︎
Project QUEST. (n.d.). Homepage. Project QUEST.↩︎
Rolston, H., Copson, E., & Gardiner, K. (2017). Valley Initiative for Development and Advancement: Implementation and early impact report. PACE.↩︎
Alamuddin, R., Rossman, D., & Kurzweil, M. (2019). Interim findings report: MAAPS advising experiment. Ithaka S+R.↩︎
Evans, W. N., Kearney, M. S., Perry, B., & Sullivan, J. X. (2020). Increasing community college completion rates among low‐income students: Evidence from a randomized controlled trial evaluation of a case‐management intervention. Journal of Policy Analysis and Management, 39(4), 930-965.↩︎
Dawson, R. F., Kearney, M. S., & Sullivan, J. X. (2020). Comprehensive approaches to increasing student completion in higher education: A survey of the landscape. National Bureau of Economic Research.↩︎
Sommo, C., Slaughter, A., Saunier, C., Scrivener, S., & Warner, K. (2023). Varying levels of SUCCESS. MDRC.↩︎
Ratledge, A., & Wavelet, M. (2021). Improving college graduation rates with multifaceted student support programs: Here's what institutions and state agencies need to know. MDRC.↩︎
Mayer, A., & Tromble, K. (2022). Comprehensive approaches to student success: An evidence-based approach to increasing college completion. MDRC.↩︎
Weiss, M. J., Bloom, H. S., & Singh, K. (2023). What 20 years of MDRC RCTs suggest about predictive relationships between intervention features and intervention impacts for community college students. Educational Evaluation and Policy Analysis, 45(4), 569-597.↩︎
CUNY ASAP. (2020). Inside ASAP: A resource guide on program structure, components, and management. CUNY ASAP.↩︎
Morton, T., Headlam, C., & Spencer, B. (2021). Evidence to practice: Scaling up postsecondary student success strategies. MDRC.↩︎
Chandler, J. W., & Franz, L. (2023). Scaling comprehensive supports to equitably get students to the finish line: Lessons from City Colleges of Chicago and One Million Degrees. The Institute for College Access & Success.↩︎
Cormier, M. S., & Raufman, J. (2020). Scaling ASAP: How expanding a successful program supported broader institutional change at Bronx Community College. Community College Research Center.↩︎
Ratledge, A., Sommo, C., Cullinan, D., O'Donoghue, R., Lepe, M., & Camo-Biogradlija, J. (2021). Motor City Momentum: Three years of the Detroit Promise Path program for community college students. MDRC.↩︎
Meyer, Katharine, Lindsay Page, Sarah Reber and Aizat Nurshatayeva (2025). "College Access (and Success) Programs," in Live Handbook of Education Policy Research, in Douglas Harris (ed.), Association for Education Finance and Policy, viewed 04/11/2025, https://livehandbook.org/higher-education/college-access/college-access-and-success-programs/.