City College of San Francisco - California Competes


THEME II

and assist with information literacy skills,” but also seeks to help students “apply study and life management skills toward the realization of academic, vocational and personal goals” and “communicate assertively with members of our diverse communities.” Clearly, the initial efforts of the Student Development Division to define skills and competencies have produced a wealth of goals that range from clearly defined cognitive skills to ambitious efforts to transform students’ views of themselves and their place in society.

Measuring the outcomes: Assessing effectiveness. While the instructional programs are grounded in testing and assessment (at least as they relate to individual student achievement), student services have not traditionally been in the business of formal outcomes assessment. This is not to say that there are no traditional measures of student success related to student development. Transfer rates usually correlate directly with the extent to which the College provides effective counseling and other support services to students with four-year transfer aspirations. Specific retention strategies, such as early alert programs and “wrap-around” counseling and support services for targeted student populations, can be credited with specific improvements in student outcomes. However, measuring a department’s success in developing students’ “confidence in their academic abilities” or their “ability to advocate for themselves” presents significant challenges. In fact, even the assessment of more traditional learning skills like “extrapolating prior knowledge to new situations” (CSCD) and students’ ability to conduct “informational interviews that will assist them in clarifying interests and goals” (New Student Counseling) can be difficult given the way student support services are delivered to and used by students. While a counselor may engage a student in an “informational interview” or the application of a previous experience to a new problem, the counselor does not give a test that establishes the individual student’s mastery of these skills, nor does the counselor accumulate longitudinal data on many students repeating this process as evidence of the efficacy of the counseling office.

The responses of the 12 service areas to the initial SLO inventory suggest that our Student Development Division is well aware of these challenges and is committed to developing useful assessment tools. The assessment tools identified thus far fall into three categories: information provided directly by students, e.g., surveys, anecdotal evidence, etc.; indirect measures of effectiveness; and direct measures of outcomes. CSCD provides a constructive framework for the use of surveys: “Ask students directly what counselors did for them—how they were affected and how counseling services supported and enhanced student learning.” Thus, if a counseling department wants to assess whether its services have contributed to students’ “confidence in their academic abilities,” surveys may provide useful data and point toward improved outcomes. Similarly, DSPS might survey students and faculty to assess whether its students are demonstrating “the ability to advocate for themselves.”

Survey information relies upon students accurately reporting the impact of a particular service on an outcome. Indirect measures attempt to use student behaviors (and the statistical analysis of those behaviors) and other related outcomes as a reflection of student learning. For example, Admissions and Records (A&R) establishes the students’ “ability to read, comprehend and interpret … the content of the College Catalog” as a major learning outcome. The indirect measure of this outcome might be “a decrease in the number of students being denied graduation due to incomplete coursework as a result of not reading and comprehending the CCSF graduation requirements and utilizing the [soon-to-be-operational] Degree Audit [program].” Similarly, DSPS wants its student clients “to anticipate needs and make requests for accommodation in a timely manner” and proposes using the DSPS office’s records on documented requests as a tool for assessing improvements in students’ timeliness and evidence that students are improving their ability to “advocate for themselves” and “fully participate in the college community,” two of the DSPS program’s most significant SLOs.

