Interpreting the Program's Impacts

There are several important factors to consider when interpreting the program's impacts from this study. First, this study looks at one enhanced student services program implemented at two community colleges. The randomized experimental design of this study leads MDRC to believe that the impact estimates have high internal validity.2 That is, it can be said with confidence that, at these specific locations, the program services had a positive effect on registration and credits earned during the second program semester and a small positive effect on registration during the first postprogram semester — impacts that did not persist through later semesters. However, since the study was conducted at only two colleges, claims regarding how well this type of program might work at other colleges can be made only with low confidence and a fairly high degree of speculation — that is, the external validity, or generalizability, of this study is low.

As the number of rigorous studies on the effectiveness of enhanced student services at community colleges increases, so will the confidence with which findings from this line of research can be generalized. Currently, very few studies of community college student services use an experimental design, the gold standard for making causal claims about program effectiveness. Because such studies are rare, and because internal validity is a necessary (though not sufficient) precursor to external validity, the research presented here provides important evidence on the general effectiveness of enhanced student services, even though this evidence carries a high degree of uncertainty. With this in mind, offered below are some interpretations, implications, and suggested ideas for future research, based on this study.

The main finding from this study is that the Opening Doors program that operated at Lorain County Community College and Owens Community College had positive impacts during the second program semester (and, to a lesser extent, during the first postprogram semester); however, these impacts were, for the most part, not maintained in later postprogram semesters. This result can be interpreted in at least two ways, both of which MDRC believes to be reasonable.

One interpretation is that the Opening Doors program in Ohio did not work well. The initial positive effects of the program disappeared over time, and by the second postprogram semester the academic success of the program group students and the control group students was virtually indistinguishable. Furthermore, the program did not produce meaningful impacts on the cumulative academic outcomes measured over the three-year study period. This finding may suggest that while there is a perceived need for enhanced student services, the duration and set of services offered in the program at Lorain and Owens do not constitute a useful policy lever for improving students' academic success at community colleges.

An alternative interpretation of the study's results is that the program was successful: it boosted registration during the second semester in which program services were provided and, to some extent, during the semester after the program ended. While the positive impacts were not sustained, the two-semester program could be considered successful as a first step toward longer-term academic success. The question then becomes: What, if anything, can be done to sustain or even increase the temporary positive program impacts? Here, speculation is the only option — speculation that suggests interesting areas for future research on enhanced student services.

One explanation for the program's short-term success and lack of long-term success is that the one-year duration of the program was simply too short. Many who advocate for enhanced student services view them as an ongoing need, not a temporary, one- or two-semester need. It is plausible that in order for enhanced student services to lead to sustained impacts, program efforts must be sustained. Since the registration drop-off rate is so dramatic during the first year (as depicted in Figure 5.1), it was reasonable to wonder whether a strong one-year program might "nip in the bud" the persistence problem; however, such was not the case in this study. Given this finding, it seems reasonable to ask whether a similar intervention implemented for a longer duration would lead to the desired long-term impacts.

While increasing the program's duration is one possible way to boost the program's long-term impacts, it may also be worth exploring more comprehensive approaches to enhanced student services. The program studied in Ohio was robust and implemented with good fidelity to the design; however, in order to see larger and more enduring impacts, more comprehensive student services strategies may be required. Although some additional services were provided, the studied program focused mainly on enhanced academic counseling, which is one of several key student services. While academic counseling may address some of the barriers to students' persistence, it alone may not be enough. A more comprehensive approach to enhanced student services could also include enhanced academic supports, such as tutoring, remedial assistance, and training in time management and study skills. A more comprehensive approach might also offer enhanced supplemental services, such as on-campus child care and transportation assistance.

Still, it is important to consider that enhanced student services alone are unlikely to address all of the barriers to student academic success. Student services are unlikely to have a large impact on several areas that may be critical to students' academic success, such as financial assistance or the activities that go on within the college classroom. It is possible that in order for enhanced student services to have a substantial effect on community college students, they need to be offered in conjunction with other reforms that significantly reduce the financial burden of attending community college (that is, more considerable reforms than the modest

2 Campbell and Stanley (1963).