
THEME II

In an effort to facilitate student progress, the Koret project has experimented with allowing students who successfully complete the linked courses to skip over the next level of composition (English 92) and move directly into English 94. An assessment of the success of these students in English 94 has shown mixed results. Some students are clearly ready for the more advanced class, while others clearly need the additional development provided by English 92. At the same time, the effort to assess the outcomes of the Koret project itself has been difficult and not as productive as anticipated. That assessment has used both traditional outcomes measures (e.g., success rates for students in Koret classes compared to students in regular sections of English 9 and 90) and extensive focus groups of students, faculty, and even tutors involved in the project. While the faculty and students clearly value the Koret experience and believe it is having a positive effect, the hard data is not as convincing. The combination of the need to better assess the readiness of individual students for more advanced classes and the desire to develop more accurate skills-based assessments of the Koret program itself has led to the development of a pilot portfolio review project (see the Theme I essay for more information about this project).

The second pilot assessment project involves English 96, the last course in the sequence before the transfer-level 1A. One of the primary goals of the curriculum redesign has been to orient the pre-collegiate reading and writing courses to specific skills required for academic success as students transition into both the transfer-level writing courses and their other college-level studies. The Department has developed expectations for the types of readings that will be used at each course level in an effort to integrate the teaching of reading and writing and to ensure that students develop their abilities to master the type of reading assignments they will encounter in college-level classes across disciplines. Therefore, in Spring 2005, the Department piloted a common reading assessment in nine sections of English 96 to assess the students’ independent reading comprehension, using a combination of objective questions and essay responses. This is a very focused and limited assessment that the Department hopes it can use to more effectively determine the various levels of comprehension that can be expected of students as they prepare to enter the transfer-level sequence, and, eventually, to help the Department develop interventions throughout the sequence to improve those skills and measure those outcomes.

Are grades enough?

During the workshops conducted on the new accreditation standards, many faculty and administrators questioned the need for expanded assessments related to the SLO paradigm, stating that grades given by faculty are adequate for assessment. Grades are certainly one of the tools faculty can use in assessing instructional effectiveness. However, grades are the evaluation of progress by individual students, not an overall assessment of instructional effectiveness and aggregate student learning. Grades can be used as a basis for SLO assessment (in addition to other types of assessment) when they are coupled with strategies for improvement. In the discussion of the need for other types of assessment, however, the conversation inevitably came around to “show me the evidence.” That is to say, if we are going to become actively engaged in a major effort to assess learning outcomes beyond the methods we currently use, said the workshop participants, then you are going to have to convince us that this effort will produce meaningful outcomes other than simple compliance with the Accrediting Commission’s new standards. In some ways, the workshop participants were asking for evidence of effectiveness in much the same way the new standards require institutions to provide evidence of effectiveness. As the College-wide discussion about outcomes assessment continues, it will be important for faculty to guide professional development opportunities that focus on learning about different ways to assess outcomes and how that can improve teaching and learning. The experience of the English Department offers some evidence of the benefits for certain types of outcomes and for the programs promoting those outcomes.

