Institutional Plan for the Assessment of Student Learning Outcomes

Approved – January 2010
Revised/Updated – Dec. 2004; Dec. 2005; Dec. 2006; Dec. 2007; Jan. 2010

About UMUC

University of Maryland University College (UMUC) is the largest public university in the United States. As one of the 11 degree-granting institutions of the University System of Maryland, this global university specializes in high-quality academic programs tailored to working adults. UMUC has earned a worldwide reputation for excellence as a comprehensive virtual university and, through a combination of classroom and distance-learning formats, provides educational opportunities to 90,000 students. The university is proud to offer highly acclaimed faculty and world-class student services to educate students online, throughout Maryland, across the United States, and in 27 countries around the world. UMUC serves its students through undergraduate and graduate degree and certificate programs, noncredit leadership development, and customized programs. For more information regarding UMUC and its programs, visit www.umuc.edu.

Accreditation Statement

University of Maryland University College is accredited by the Commission on Higher Education of the Middle States Association of Colleges and Schools, 3624 Market Street, Philadelphia, PA 19104 (267-284-5000), one of the six regional accrediting agencies recognized by the U.S. Department of Education. UMUC is governed by the University System of Maryland Board of Regents and certified by the State Council of Higher Education for Virginia. UMUC is a constituent institution of the University System of Maryland.

Nondiscrimination

UMUC is committed to ensuring that all individuals have equal access to programs, facilities, admission, and employment without regard to personal characteristics not related to ability, performance, or qualifications as determined by UMUC and/or University System of Maryland policy or by federal, state, or local authorities, in accordance with UMUC Policy 40.30 Policy and Procedures on Affirmative Action, Equal Opportunity, and Sexual Harassment. UMUC does not discriminate against or harass any person because of race, religion, color, creed, gender, marital status, age, national origin, ancestry, political affiliation, mental or physical disability, sexual orientation, or veteran status (including Vietnam-Era veterans). All inquiries regarding UMUC's Nondiscrimination Statement or compliance with applicable statutes and regulations should be directed to the director of Diversity Initiatives, Office of the President, UMUC, 3501 University Boulevard East, Adelphi, MD 20783-8000 (phone 800-888-UMUC, ext. 7940).

Table of Contents

1. Introduction ............ 1
2. History and Acknowledgments ............ 2
3. Rationale ............ 3
   3.1 Definition
   3.2 Philosophy
   3.3 Culture of Outcomes Assessment
4. Conceptual Framework ............ 5
   4.1 Guiding Principles
   4.2 Curricular Alignment
       4.2.1 Institutional-Level Learning Outcomes
       4.2.2 School-Level Learning Outcomes
       4.2.3 Program-Level Learning Outcomes and Course Objectives
   4.3 Outcomes Assessment Methodology
       4.3.1 Institutional-Level Outcomes Assessment
       4.3.2 Program-Level Outcomes Assessment
       4.3.3 Developing and Evaluating Outcomes Assessment Measures
5. Implementation ............ 9
   5.1 Data Collection
   5.2 Analysis and Dissemination of Results
   5.3 Closing the Loop: Applying the Results of Assessment
6. Reporting ............ 11
   6.1 Annual Reporting Cycle
   6.2 Three-Year Reporting Cycle
   6.3 Five-Year Reporting Cycle
   6.4 Academic Program Reviews
   6.5 Summary: Overview of Data Flow and Reporting Cycles
7. Roles and Responsibilities ............ 14
   7.1 Learning Outcomes Assessment Steering Committee
   7.2 Offices and Individuals
8. Learning Assessment Goals and Timelines ............ 16
   8.1 Short-Term Planning for Undergraduate Assessment
   8.2 Short-Term Planning for Graduate Assessment
   8.3 Long-Term Planning for Outcomes Assessment at UMUC

www.umuc.edu   i

List of Tables

Table 1: Rationale for Learning Outcomes Assessment at UMUC ............ 4
Table 2: Guiding Principles for Learning Outcomes Assessment at UMUC ............ 5
Table 3: Institutional-Level Learning Outcomes: Student Learning Expectations (SLEs) ............ 5
Table 4: Assessment of Graduate and Undergraduate SLEs ............ 6
Table 5: Additional Undergraduate SLEs ............ 6
Table 6: School of Undergraduate Studies Hallmarks ............ 6
Table 7: Program-Level Assessment of SLEs ............ 7
Table 8: Sample Learning Outcomes Assessment Measures ............ 8
Table 9: Criteria for Review of Assessment Tools ............ 8
Table 10: Examples of Changes Made as a Result of Outcomes Assessment ............ 10
Table 11: Reporting Cycles for Learning Outcomes Assessment ............ 13
Table 12: Overview of Roles and Responsibilities for Learning Outcomes Assessment ............ 14
Table 13: Long-Term Learning Outcomes Assessment Goals (2009–14) ............ 16

Appendices

Appendix A: Alignment of School-Level Outcomes and SLEs ............ A-1
Appendix B: Assessment Plans for the School of Undergraduate Studies ............ B-1
Appendix C: Assessment Plans for the Graduate School of Management and Technology ............ C-1
Appendix D: Example of a Program Assessment Plan—Undergraduate ............ D-1
Appendix E: Example of a Program Assessment Plan—Graduate ............ E-1
Appendix F: Example of a Program Assessment Report—Undergraduate ............ F-1
Appendix G: Example of a Program Assessment Report—Graduate ............ G-1
Appendix H: ETS Proficiency Profile (EPP) Implementation Plan ............ H-1

1. Introduction

This Institutional Plan for the Assessment of Student Learning Outcomes establishes a roadmap for all activities related to student learning outcomes assessment at University of Maryland University College (UMUC). The plan clarifies the university's rationale for undertaking outcomes assessment and provides coordination for the broad range of learning assessment activities carried out by the university's two major academic units: the School of Undergraduate Studies and the Graduate School of Management and Technology.

Based on a conceptual framework that defines institution-wide student learning outcomes, the plan describes how these outcomes are assessed across the university and within degree programs. Each step of the assessment process is covered: definition of learning outcomes and alignment of the curriculum; design of assessment tools; collection and analysis of data; dissemination of results; and continuous improvement of curricula, instruction, and the assessment process itself.

This plan builds upon what is already a flourishing institutional culture of learning outcomes assessment at UMUC—a commitment to ongoing and systematic assessment shared by faculty, administrators, and other key stakeholders. The goals of student learning outcomes assessment are directly informed by the university's mission, core values, and strategic plan, as well as by the missions of the undergraduate and graduate schools and by the needs of academic degree programs and UMUC faculty. The university is further committed to meeting external reporting requirements of the Maryland Higher Education Commission and the Middle States Commission on Higher Education.

UMUC's Institutional Plan for the Assessment of Student Learning Outcomes will be reviewed and revised, as appropriate, each year. The most up-to-date approved version of the plan will be made available to the UMUC community online.

2. History and Acknowledgments

The first Institutional Plan for the Assessment of Student Learning Outcomes consolidated years of dedicated work by UMUC faculty, administrators, and staff to establish ongoing and systematic learning outcomes assessment across the institution. At the heart of the 2003 plan were the same seven competency areas that have informed institutional assessment efforts ever since: written communication, critical thinking, quantitative reasoning, scientific literacy, information literacy, and technological fluency, as well as specialized disciplinary knowledge and skills. In 2006, the university's assessment activities, including its Institutional Plan, were recognized by the Middle States evaluation team, which noted in its report that UMUC "has clearly articulated academic performance expectations…the resulting evidence from these assessments is systematically used to improve teaching and learning throughout the institution" (Report to the Faculty, Administration, Regents, Students of University of Maryland University College, p. 32).

While serving as a roadmap for institution-wide assessment activities, the Institutional Plan is itself an evolving document, subject to continuous improvement based on the results of learning outcomes assessment. Since 2003, the plan has undergone a number of revisions, including the current version. The current plan incorporates the conclusions of an unusually thoroughgoing effort by administrators and faculty in both schools to "assess the assessment" at UMUC—to review the philosophy, approach, and methods of all outcomes assessment activities to date.

Responding to this wide-ranging review of the assessment process, the university has instituted a series of organizational changes designed to support enhanced school- and program-level leadership of the learning outcomes assessment process. The Provost's Office is responsible for ensuring the appropriate direction, emphasis, and support for learning outcomes assessment institution-wide. The School of Undergraduate Studies and the Graduate School of Management and Technology are responsible for the design and implementation of outcomes assessment plans. Technical support for assessment activities in the schools is provided by the Office of Evaluation and Assessment, a unit within the Office of Institutional Planning, Research, and Accountability. Overall coordination of learning outcomes assessment university-wide is facilitated by the Learning Outcomes Assessment Steering Committee, which consists of the provost; the graduate and undergraduate deans; the vice president and associate vice president of Institutional Planning, Research, and Accountability; and a representative of the Faculty Advisory Council. Further details about roles and responsibilities are provided in Section 7.

The present plan owes a special debt to members of the Faculty Advisory Council. On June 21, 2009, a plenary session of the council discussed a draft of the current plan and shared ideas with the provost on a variety of matters, including the role of faculty in implementing the plan and the potential impact of assessment activities on student learning. Council members also made specific recommendations for structural improvements to this document; their ideas and suggestions have been incorporated in the current revision. Efforts such as these to enhance collaboration and communication in outcomes assessment address a suggestion by the Middle States evaluation team in its 2006 report that the university incorporate additional strategies for building "an assessment culture with buy-in from the full range of constituencies" (p. 33).

3. Rationale for Learning Outcomes Assessment

Why has UMUC made outcomes assessment an institutional priority? The university's answer to this question flows from its mission, core values, and strategic plan. These broader imperatives inform the institution's well-established culture of learning outcomes assessment: an environment that demonstrates an attitudinal and organizational commitment to student learning, outcomes assessment, and continuous improvement.

3.1 Definition

Learning outcomes are measurements of what students know or can accomplish by the time they graduate. Learning outcomes assessment is accordingly the systematic process of comparing measured outcomes against clearly stated goals for the knowledge, skills, habits of mind, and values that students should acquire during their academic career. Institution-wide assessment is a continuous cycle comprising a variety of activities, including curricular mapping, data collection, analysis, interpretation, reporting, and application of assessment results to both the improvement of instruction and refinement of the assessment process itself.

3.2 Philosophy

Guided by institutional as well as undergraduate and graduate school mission statements, UMUC has undertaken learning outcomes assessment to support and ensure student success. The university is committed to ensuring that all students, regardless of their chosen degree program, receive systematic instruction across the curriculum in a set of institution-wide competency areas (see Section 4.2.1). The university further commits itself to assessing student achievement in each of these common competency areas and applying the results of outcomes assessment to continuous improvement of instruction and curriculum. Accordingly, all processes and activities within the outcomes assessment cycle should be designed with the ultimate goal of "closing the loop"—applying assessment results to the continuous improvement of teaching and learning (see Section 5.3).

3.3 Culture of Outcomes Assessment

Continuous improvement is implicit in the institutional mission to "offer top-quality educational opportunities to adult students in Maryland, the nation, and the world, setting the global standard of excellence in adult education." Ensuring and supporting student success is the primary mission of UMUC and the end toward which ongoing assessment activities and metrics are directed. This imperative is given specific focus in UMUC's Strategic Plan, which prioritizes efforts to "constantly improve our quality" and to "maximize student success and serve the entire field of distance education for adults by serving as a leader both nationally and internationally." Systematic learning outcomes assessment provides the vehicle for identifying teaching and learning strategies that address these goals.

School-level mission statements further support the development of an institutional culture that values assessment. The School of Undergraduate Studies is "committed to meeting undergraduate students' needs for lifelong learning by providing innovative delivery of high-quality educational programs, ensuring substantive and relevant curricula, and recognizing the value of experiential learning." Ensuring continuing innovation, quality, and relevance in curricula and programs requires a strong feedback loop at multiple points in students' academic careers and, therefore, requires a methodology that addresses the needs of a diverse student body. The mission of the Graduate School of Management and Technology is to equip graduates not only with disciplinary knowledge but also with the ability "to apply what they study to their professions and their daily lives." The forms of assessment in graduate degree programs accordingly emphasize both the acquisition of knowledge and the ability to apply it appropriately in a variety of situations.

In addition, the institution-wide commitment to learning outcomes assessment is informed by UMUC's adherence to regulatory processes that mandate three-year reports for the Maryland Higher Education Commission as well as five- and ten-year progress reports on learning outcomes assessment for the Middle States Commission on Higher Education. Efficiencies are possible in these external reporting cycles because Maryland has adopted the Middle States Commission's definitions of student learning competencies for the following areas: writing, quantitative reasoning, critical analysis, information literacy, technological fluency, and scientific literacy. UMUC has accordingly adopted the same competency areas as the basis of its institutional-level student learning outcomes—common areas of student achievement defined for all degree programs.

To summarize, the four factors informing an institutional culture of assessment at UMUC are shown in Table 1 below. The institutional-, school-, and program-level learning outcomes referred to in Table 1 are discussed in Section 4.2. These intended student learning outcomes provide specific goals and structure for all learning assessment activities at UMUC.

Table 1: Rationale for Learning Outcomes Assessment at UMUC

4. Conceptual Framework

4.1 Guiding Principles

UMUC has identified guiding principles to ensure that learning outcomes assessment is systematic, sustained, and meaningful. Summarized in Table 2 below, these principles articulate the values and expectations for enhancing and institutionalizing a learning outcomes assessment culture and are intended to inform all discussions pertaining to learning outcomes assessment across the institution.

Table 2: Guiding Principles for Learning Outcomes Assessment at UMUC

- All UMUC administrators, faculty, and staff have a role in ensuring the academic success of UMUC students.
- Every UMUC degree program is responsible for the development and assessment of student skills in specific and identified areas of learning.
- Assessment of students is an integral and unconditional component of effective instruction at UMUC.
- Assessing student learning using reliable and effective methodologies is the collective responsibility of every UMUC faculty member.
- UMUC assessment results are used in a documented way to inform curriculum review and design.
- UMUC stakeholders, including internal and external constituents, are routinely apprised of learning outcomes results.

4.2 Curriculum Alignment

4.2.1 Institutional-Level Learning Outcomes

UMUC has developed four levels of student learning outcomes: institutional-level, school-level, and program-level outcomes, as well as course objectives common to all sections of a given course.

Institutional-level learning outcomes are based upon the educational missions of UMUC's undergraduate and graduate programs. Institutional-level outcomes also reflect competency areas identified in Standard 12 of the Middle States Commission's Characteristics of Excellence in Higher Education (2006 edition) and adopted by the Maryland Higher Education Commission as the basis of mandatory reporting categories within the triennial assessment report required of the state's higher education institutions.
Previously known as Core Learning Areas, UMUC's institutional-level learning outcomes have been renamed Student Learning Expectations (SLEs). Definitions of these seven institutional-level SLE areas are provided in Table 3 below. These common areas provide a structure for institution-wide learning outcomes assessment. What they collectively say is that all UMUC students, regardless of their degree program, will be instructed and assessed in six common learning areas: written communication, critical thinking, quantitative reasoning, scientific literacy, information literacy, and technology fluency (abbreviated COMM, THIN, QUAN, SCIE, INFO, and TECH, respectively). In addition, the university expects that all students demonstrate competence in their chosen field of study (designated by the seventh SLE area, SPEC).

While SLE definitions and other basic terms and methods of learning outcomes assessment are shared across the institution, differences in mission and in students served necessitate some differences in how the schools approach the assessment process.
Table 3: Institutional-Level Learning Outcomes: Student Learning Expectations (SLEs)

- Written Communication (COMM): Produce writing that meets expectations for format, organization, content, purpose, and audience.
- Technology Fluency (TECH): Demonstrate an understanding of information technology broad enough to apply technology productively to academic studies, work, and everyday life.
- Information Literacy (INFO): Demonstrate the ability to use libraries and other information resources to effectively locate, select, and evaluate needed information.
- Critical Thinking (THIN): Demonstrate the use of analytical skills and reflective processing of information.
- Quantitative Reasoning (QUAN)*: Demonstrate the application of mathematical and numerical reasoning skills.
- Scientific Literacy (SCIE)*: Demonstrate the ability to understand key concepts and principles of the natural, social, and behavioral sciences and to apply these principles appropriately within personal lives.
- Content/Discipline-Specific Knowledge (SPEC): Demonstrate knowledge and competencies specific to program or major area of study.

*Not assessed in the graduate school.

Specifically, assessment in the School of Undergraduate Studies covers the full complement of SLE areas, reflecting the broader reach of four-year undergraduate degrees, which include a common set of general education requirements as part of all specialized degree programs. By contrast, the Graduate School of Management and Technology, focused on more narrowly specialized professional programs, omits assessment of scientific literacy and quantitative reasoning. These differences in undergraduate and graduate practice are summarized in Table 4 below.

Table 4: Assessment of Graduate and Undergraduate SLEs

4.2.2 School-Level Learning Outcomes

School-level outcomes are those additional competency areas, over and above the institutional SLEs, that either the undergraduate or graduate school decides to embed and assess across all degree programs. At present, only the School of Undergraduate Studies has identified such outcomes beyond the institutional SLEs. Reflecting its mission to produce graduates who are not only competent in their areas of study but also well prepared to be responsible citizens in a global society, the undergraduate school has designated history and ethics as additional key competency areas:

Table 5: Additional Undergraduate SLEs

- Historical and Cultural Perspectives (HIST): Knowledge of diverse cultures and historical periods
- Ethics (ETH): Understanding of and ability to apply frameworks for ethical decision making

Internally, the School of Undergraduate Studies uses the term "hallmarks" to refer collectively to the institutional-level SLEs plus the two additional school-level outcomes. In its communications with students, the School of Undergraduate Studies refers to "the hallmarks of the educated person," adopting the term "hallmark" to convey that these areas distinguish UMUC's approach to learning and program design. The undergraduate school prefers the term "hallmark" for communications with students as a means of emphasizing educational quality and characteristics and minimizing what may be perceived by some students as technical terminology or acronyms. However, both "hallmarks" and "SLEs" share the same definition and have a common meaning.

A summary illustration of the undergraduate hallmarks, including the two school-level outcomes and the seven institutional-level SLEs, is provided in Table 6 below.

Table 6: School of Undergraduate Studies Hallmarks

4.2.3 Program-Level Learning Outcomes and Course Objectives

An initial step in designing an assessment plan for a particular degree program involves curriculum alignment: mapping institutional- and school-level intended learning outcomes to program-level outcomes and, in turn, to course objectives within those programs. Program-level objectives are informed not only by SLEs, school-level outcomes, and the mission of each program but also, in many cases, by external standards either mandated or recommended by accrediting bodies or scholarly societies. To the extent that they have been aligned with SLEs, program-level outcomes provide a means by which the diverse range of assessment activities undertaken at the course level can be compared, analyzed, and summarized in internal and external reports.
Appendices B and C summarize how current program outcomes for the School of Undergraduate Studies and the Graduate School of Management and Technology are aligned with SLEs.

Each undergraduate and graduate degree program has developed a five-year assessment plan that includes a curricular alignment piece as the basis for assessing the SLEs and school-level outcomes. To design an assessment tool for the writing SLE, for example, the undergraduate psychology program would first map this SLE to a corresponding program outcome for competency in writing. Next, from within the psychology program, a required course would be selected that includes a course objective emphasizing writing skills. Every undergraduate and graduate program has such a mapping exercise.
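As an illustrative sketch only, the alignment chain described above (SLE to program-level outcome to required course and course objective to assessment tool) can be represented as a simple lookup table. The course numbers, outcome wordings, and tool descriptions below are hypothetical and are not drawn from an actual UMUC assessment plan; only LIBS 150 appears elsewhere in this document.

```python
# Hypothetical sketch of a curricular alignment map: each (program, SLE)
# pair is traced to a program-level outcome, a required course whose
# objective emphasizes that outcome, and the assessment tool used.
ALIGNMENT_MAP = {
    ("PSYC", "COMM"): {
        "program_outcome": "Write clear, well-organized psychology papers",
        "course": "PSYC 495",  # hypothetical required course
        "course_objective": "Produce a literature review meeting "
                            "disciplinary writing standards",
        "assessment_tool": "Research paper scored with a writing rubric",
    },
    ("PSYC", "INFO"): {
        "program_outcome": "Locate and evaluate scholarly sources",
        "course": "LIBS 150",
        "course_objective": "Select and evaluate needed information",
        "assessment_tool": "LIBS 150 final exam",
    },
}

def alignment_for(program: str, sle: str) -> dict:
    """Return the entry tracing an SLE down to a concrete assessment tool."""
    return ALIGNMENT_MAP[(program, sle)]
```

A real assessment plan would carry one such entry per assessed SLE per program; the point of the sketch is only that the curricular map is a traceable chain from institutional expectation down to a specific assessment tool.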

…majors, or in both a course required for the major and a general education course. For example, in the accounting major, program-level assessment of quantitative reasoning occurs within a course required for the major as well as a general education course; in the history major, by contrast, program-level assessment of QUAN would occur only in a general education mathematics course.

For every program in the graduate school, all SLEs measured are assessed at the program level within required courses for the degree.

Whichever model is followed, all SLEs measured by undergraduate or graduate programs are assessed at the program level at least once every three years. The frequency of program-level assessment as well as of reporting on the results of assessment is discussed in Section 5.2. Specific reporting schedules are provided in Section 6 and Appendices B and C, which cover undergraduate and graduate assessment plans, respectively.

4.3.3 Developing and Evaluating Outcomes Assessment Measures

With the support and consultation of Institutional Planning, Research, and Accountability, the schools and programs develop and deploy a variety of methods to assess student learning outcomes. Methods are selected to incorporate both direct and indirect assessment tools. Examples of direct and indirect assessment tools used in the undergraduate and graduate schools are identified in Table 8 below.

Table 8: Sample Learning Outcomes Assessment Measures

Institutional level:
  ETS Proficiency Profile (EPP): COMM, QUAN, THIN
Programmatic, general education course exams (undergraduate only):
  LIBS 150 Final Exam: INFO
  BIOL 101 Final Exam: SCIE
  Fluency in Technology Exam: TECH
  MATH 106 and 107 Exams: QUAN
Programmatic:
  Capstone Course Projects, Course Examinations, Course Assignments (e.g., Research Papers, Essays): All SLEs

Institutional-level assessment measures, such as the EPP exam, require formal review by Institutional Planning, Research, and Accountability and approval by the Provost's Office.
Program-level assessments must also be submitted to Institutional Planning, Research, and Accountability for review and technical assistance. Guiding this review process is the following set of criteria regarding the design and use of effective assessment tools:

Table 9: Criteria for the Review of Assessment Tools

Criterion #1: Some standardized assessments will be used for the assessment of student learning. A standardized assessment is defined as a test constructed using standard administration procedures. Examples of standardized assessments include, but are not limited to, final examinations, case studies, research papers, or norm-referenced/criterion-referenced tools.

Criterion #2: Scoring procedures for standardized assessments must be well documented and uniformly applied. Documentation of reliability and validity of scores is required for norm-referenced and criterion-referenced assessments. A scoring rubric is required for all other forms of standardized assessments.

Criterion #3: Any assessment used to report student learning must demonstrate a clear linkage to the SLE(s), program-level learning outcomes, or course learning objectives.

Criterion #4: Any assessment used to report student learning must provide quantifiable and concrete results directly linked to the SLE(s), program-level learning outcomes, or course learning objectives.

Criterion #5: Any assessment used to report student learning must adhere to a collaboratively developed action plan for the implementation of the assessment. The action plan must address the administration of the assessment(s), provide an implementation timeline, identify the students to be assessed, and detail the use of student learning results at the school or program level.

Criterion #6: The implementation of an assessment instrument must provide a documented process for continuous feedback that connects content, instructional practice, and the results of student learning. Evidence must be provided that documents how the iterative feedback process has been (or will be) used to revise, refine, and/or enhance the appropriate curriculum and the assessment instrument.

Upon request, Institutional Planning, Research, and Accountability supports the work of individual degree programs by conducting a review of any planned assessment tool. Following review of an assessment tool, the office will prepare an analysis that details findings and recommendations specific to the tool.
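To make the review process concrete, the six criteria in Table 9 can be read as a documentation checklist applied to each submitted assessment tool. The following is a hypothetical sketch, not an actual UMUC review instrument; all field names are invented for illustration.

```python
# Hypothetical checklist sketch for Table 9: a submission describing an
# assessment tool is checked for the documentation each criterion requires.
REQUIRED_FIELDS = {
    1: "standardized_administration_procedures",  # Criterion 1
    2: "scoring_documentation",       # Criterion 2: rubric or reliability/validity evidence
    3: "outcome_linkage",             # Criterion 3: link to SLE/outcome/objective
    4: "quantifiable_results_plan",   # Criterion 4: concrete, quantifiable results
    5: "implementation_action_plan",  # Criterion 5: timeline, students, use of results
    6: "feedback_process",            # Criterion 6: documented continuous feedback
}

def review(submission: dict) -> list:
    """Return the criterion numbers the submission fails to document."""
    return [num for num, field in REQUIRED_FIELDS.items()
            if not submission.get(field)]

# Example: a submission missing an action plan and a feedback process.
incomplete = {
    "standardized_administration_procedures": "Final exam, common timing rules",
    "scoring_documentation": "Shared scoring rubric",
    "outcome_linkage": "COMM SLE, program writing outcome, course objective",
    "quantifiable_results_plan": "Rubric scores reported per section",
}
```

Calling `review(incomplete)` flags criteria 5 and 6, mirroring how a reviewer would identify the missing action plan and feedback documentation.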

5. Implementation

5.1 Data Collection

At the institutional level, the standardized EPP exam is administered online and results are compiled for UMUC as a service of ETS (for additional discussion of EPP implementation plans, see Appendix H). Further analysis and interpretation of EPP results is provided by Institutional Planning, Research, and Accountability, which distributes an annual summary report on institutional-level assessment to the university community (see Section 6 for additional discussion of this and other reporting cycles).

Collection of program-level outcomes assessment data begins with the five-year Program Assessment Plans developed for all undergraduate and graduate degree programs. As was previously discussed (Sections 4.2.3 and 4.3.2), each plan includes a curricular map that identifies, for each SLE, a corresponding program outcome, course objective, and assessment tool. The variety of tools used is suggested by Table 8. Program-level assessment data thus takes a variety of forms, depending on the tool employed.

Collecting data at the program level currently requires a labor-intensive process, involving considerable coordination among program directors, faculty, and other staff. For example, the assessments that the School of Undergraduate Studies currently administers in its general education courses (LIBS 150, BIOL 101, and IFSM 201) require manual gathering and scanning of final exam forms from multiple course sections. Other types of program-level data collection involve downloading results from WebTycho, the university's proprietary online learning platform.

To improve the efficiency and reduce the staff burden of program-level data collection, the Learning Outcomes Assessment Steering Committee and Institutional Planning, Research, and Accountability are currently investigating outcomes assessment management solutions that would provide a central location for the gathering and storing of learning outcomes assessment documentation and data.
With such an assessment management system, program directors and other assessment administrators would be able to upload data from a variety of sources. Depending on the system adopted, assessment administrators would have the capacity to pull results directly from multiple online course gradebooks, upload or input external assessment data directly, compile scores produced by common assessment rubrics, administer custom tests and surveys, and develop and score electronic portfolios.

5.2 Analysis and Dissemination of Results

With the support of Institutional Planning, Research, and Accountability, all institutional- and program-level outcomes assessment data is aggregated, analyzed, and shared with undergraduate and graduate faculty and administrators. Data from the institutional-level EPP exam is gathered and analyzed twice a year, and a summary report is produced annually by Institutional Planning, Research, and Accountability. Program-level assessment data is gathered, analyzed, and reported in the annual Program Assessment Reports, prepared for each undergraduate and graduate degree program. In addition, external and internal reports on outcomes assessment activity are produced on three- and five-year cycles. A complete discussion of data flow and formal reporting cycles is provided in Section 6.

Open and ongoing dialogue across the university is crucial for the meaningful application of outcomes assessment results. At the program level, online discussions of assessment results are conducted via the WebTycho course sections that have been constructed for each discipline. These sections may include smaller "study groups" of selected faculty members; such study groups allow for targeted applications of assessment results or the development of assessment tools specific to a particular course or courses within a program.

Annual faculty meetings, held separately for undergraduate and graduate personnel, provide a more intensive venue for dialogue and discussion of learning outcomes.
Such meetings provide an opportunity for the university provost and deans to discuss outcomes assessment plans and results with faculty across many programs. These meetings include breakout sessions in which program directors can discuss outcomes assessment plans and results with faculty members in particular disciplines.

The university also maintains a Web site with information and resources on learning outcomes assessment at UMUC.
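Sections 5.1 and 5.2 describe gathering rubric scores from multiple course sections and aggregating them for SLE-level analysis and reporting. As a purely illustrative sketch of that aggregation step (the record layout, program names, scores, and the 2.5 benchmark below are hypothetical examples, not UMUC's actual data or schema), the basic computation might look like the following:

```python
# Hypothetical sketch: aggregating program-level rubric scores gathered
# from multiple course sections into a per-SLE summary of the kind a
# summary report might contain. All data here is invented for illustration.
from collections import defaultdict
from statistics import mean

# Each record: (program, SLE code, course section, score on a 1-4 rubric)
records = [
    ("Legal Studies", "COMM", "LGST 204-01", 3.0),
    ("Legal Studies", "COMM", "LGST 204-02", 2.5),
    ("History",       "HIST", "HIST 157-01", 3.5),
    ("History",       "HIST", "HIST 157-02", 2.0),
]

def summarize_by_sle(records, benchmark=2.5):
    """Group scores by SLE; report count, mean, and % meeting the benchmark."""
    by_sle = defaultdict(list)
    for _program, sle, _section, score in records:
        by_sle[sle].append(score)
    summary = {}
    for sle, scores in by_sle.items():
        met = sum(1 for s in scores if s >= benchmark)
        summary[sle] = {
            "n": len(scores),
            "mean": round(mean(scores), 2),
            "pct_meeting_benchmark": round(100 * met / len(scores), 1),
        }
    return summary

print(summarize_by_sle(records))
```

The same grouping could be keyed on program rather than SLE to feed the annual Program Assessment Reports; an assessment management system of the kind under investigation would essentially automate the collection of such records from course gradebooks.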

5.3 Closing the Loop: Applying the Results of Assessment

A phrase commonly used in the literature of learning outcomes assessment, "closing the loop," emphasizes the importance of not merely gathering, analyzing, and reporting learning outcomes data, but also acting upon those results to improve student learning. At an institutional level, such actions might include changes in the administration of outcomes assessment. For example, the organizational changes discussed previously (Section 2) were undertaken in 2009 based upon prior experience with outcomes assessment that suggested the need for strengthened school- and program-level leadership of the process. A further example of closing the loop at an institutional level is represented by the university's new approach to implementing the EPP standardized exam (see Appendix H). The current revision of UMUC's Institutional Plan also represents an instance of closing the loop at an institutional level.

Documenting such instances of program-level closing-the-loop activities is included in the formal reporting procedures described in Section 6. Sharing examples of successful applications of outcomes assessment helps to encourage similar applications within and across programs. An awareness of the ultimate ends to which outcomes assessment is directed also helps to minimize "compliance mentality": gathering assessment data for its own sake. Accordingly, the timetables developed for each Program Assessment Plan include not only the development of assessment tools and the gathering of data but also the semester and year in which the results of a particular assessment event will be applied.

A summary of possible closing-the-loop activities is provided in Table 10 below. The chart is not intended to be exhaustive, merely illustrative of possible ways in which to apply the results of outcomes assessment.

It is in closing the loop at the program level, however, that perhaps the most immediate impacts on pedagogy and curriculum occur.
Following the analysis and dissemination of results, faculty and program administrators need to ensure that these results are applied to improvements in pedagogy and curricula. For example, program-level assessments of information literacy have over the past several years been applied to changes in both curricula and pedagogy. Revisions to the content of the online library skills course (LIBS 150) have been implemented to address deficits in student performance revealed by a program-level assessment. To help support faculty teaching this course, the interface to a quiz-results database has been improved so that instructors get a snapshot of student performance on assessments across a section and can adjust their instruction and provide feedback to individual students accordingly.

Actions for Closing the Loop

Programmatic level
• Changes to assessment plan: revising intended learning outcomes; collecting and analyzing additional data to corroborate institutional-level outcomes assessment data
• Changes to curricula and pedagogy: revising course objectives; revising course sequence; revising course content; modifying frequency or schedule of course offerings; implementing additional training
• Changes to academic processes: revising advising standards or processes; identifying or creating activities beyond the classroom related to student learning

Institutional level
• Changes to assessment plan: implementing organizational changes in outcomes assessment administration; changing data collection methods
• Changes to curricula and pedagogy: highlighting and disseminating successful applications of assessment findings
• Changes to academic processes: identifying/changing/creating communication and feedback methods; improving technology to support learning outcomes assessment

Table 10: Examples of Changes Made as a Result of Outcomes Assessment

6 Reporting

6.1 Annual Reporting Cycle

Each school adheres to a formal reporting process that summarizes school- and program-level assessment activities. Learning outcomes assessment activities—including closing-the-loop activities—are documented in internal and external reports produced on one-, three-, and five-year intervals. These reports are disseminated to stakeholders across the institution as appropriate and support a culture of learning outcomes assessment.

Fundamental to all subsequent reports are the annual Program Assessment Reports, one of which is written for each degree program. Examples of annual undergraduate and graduate Program Assessment Reports are provided in Appendices F and G, respectively. The annual Program Assessment Reports provide information on instruments, historical data collection, corroborating assessments, benchmarks, analysis of data, use of results, and a timeline for implementation of changes based on data. Program Assessment Reports document assessments of the institutional- and school-level outcomes as well as assessments of program outcomes related to specialized disciplinary knowledge or skills. Program directors are responsible for compiling these reports and are supported in these efforts by the assessment coordinators in the School of Undergraduate Studies and the Graduate School of Management and Technology and by Institutional Planning, Research, and Accountability personnel. Each annual report is shared with the undergraduate and graduate deans as well as faculty within that discipline.

A second form of annual reporting is the Institutional-Level Assessment Report. Written by Institutional Planning, Research, and Accountability personnel, this summary report provides an overview and analysis of all EPP standardized exam results from the preceding academic year. Institutional-Level Assessment Reports are structured around the three SLE competency areas covered by the EPP exam: writing, critical thinking, and quantitative reasoning.
The report is shared with administrators and faculty across all programs in the School of Undergraduate Studies or the Graduate School of Management and Technology.

The purpose of a third form of annual reporting, the SLE Summary Report, is to document all program-level assessments for each SLE area in an academic year. For each SLE corresponding to one of the mandated competency areas, this report synthesizes and analyzes the program-level assessments conducted in a given year. Where meaningful correlations can be made, the SLE Summary Reports will also include a discussion of institutional measures of SLE areas. These reports will be developed by the undergraduate and graduate assessment coordinators, supported by Institutional Planning, Research, and Accountability personnel and by designated assessment leads within each of the schools. They are key documents in the external reporting requirements described in Sections 6.2 and 6.3.

6.2 Three-Year Reporting Cycle

Beginning August 2, 2004, the Maryland Higher Education Commission required Maryland institutions of higher education to submit triennial reports on assessment activities. To promote efficiency among institutions within the University System of Maryland, the commission has adopted for the triennial Student Learning Outcomes Assessment Report the same competency areas identified in Standard 12 of the Middle States Commission's Characteristics of Excellence in Higher Education (2006 edition). These include written and oral communication skills, quantitative reasoning, critical thinking, scientific literacy,

information literacy, and technological fluency. (Given the nature of the online learning environment, UMUC has received a temporary waiver for assessing student learning in oral communication skills.)

The Student Learning Outcomes Assessment Report has specific requirements in terms of form and content. For each competency area, the report should provide an institutional-level definition, a discussion of methodology and measures, and documentation of the ways in which outcomes assessment has been applied to the improvement of teaching and learning.

At the conclusion of each three-year cycle, UMUC's report will be informed by a corresponding three-year SLE Summary Report (one report each for the undergraduate and graduate schools). The purpose of these reports is to compile the results of three years' worth of institutional-level and program-level assessment activity, as recorded in the annual SLE Summary Reports. Like the annual SLE Summary Reports, the triennial reports will synthesize, for each of the SLE areas, the results of program- and institutional-level assessments. The three-year summary reports will give particular emphasis to closing-the-loop activities. These reports will be prepared by the undergraduate and graduate assessment coordinators, with the support of Institutional Planning, Research, and Accountability personnel.

6.3 Five-Year Reporting Cycle

In 2011, five years after its successful decennial evaluation by the Middle States accreditation team, UMUC is required to submit an interim Periodic Review Report.
According to the commission, this report is a "retrospective, current, and prospective analysis of an institution since its last evaluation … [including] a description of how the institution responded to any recommendations made by the institution in its own self-study report, by the visiting team that evaluated the institution, and by the Commission."

Among its recommendations in the 2006 report, the Middle States evaluation team included one directly related to learning outcomes assessment:

UMUC is engaged in multiple activities to assess student learning. The university has committed substantial resources to assessment and there is broad-based commitment to use these assessments to enhance student learning. We recommend that UMUC report over the coming years to the UMUC community selected findings from these assessments and begin to develop and report on the outcomes of these assessments, how and where they are analyzed, and how the conclusions drawn from the assessments can be used to improve current practices, programs and services.

In addition to formal recommendations, an accreditation team may also make suggestions.
While the Periodic Review Report is structured to directly and fully address recommendations, the narrative should also address suggestions. Three suggestions with regard to learning outcomes assessment were made by the evaluation team:

The team suggests that UMUC consider clear metrics related to student learning in the balanced scorecard under development.

The team suggests consideration of an electronic portfolio within WebTycho as an additional tool for assessing student learning.

The team suggests that UMUC continue to make explicit linkages, in public, transparent and redundant ways with its constituencies, between assessment activities, the resulting data, and the decisions for improvement based on assessment in general and of student learning in particular.

The above recommendations and suggestions, as well as the university's own recommendations in its 2006 Self-Study, will be addressed in UMUC's 2011 Periodic Review Report. The substance for learning outcomes assessment reporting in the 2011 report will derive from the university's triennial report submitted to the Maryland Higher Education Commission in 2010. In addition, updates for 2011 will be provided by the annual SLE Summary Reports and the Institutional-Level Assessment Report (EPP exam analysis) that will be produced in the spring of 2011.

A visual overview and summary of the relationship among these various reporting cycles is provided at the end of this section.

6.4 Academic Program Reviews

Also on five-year cycles (although not necessarily coinciding with the Periodic Review Report) are the academic program reviews completed by each undergraduate and graduate academic program.
The full academic program review is a comprehensive assessment, covering enrollment and graduation rates, faculty demographics, grade distributions, course evaluations, and other measures relevant to the overall health and quality of an academic program. Included as well is a report from an external reviewer not affiliated with UMUC.

Learning outcomes assessment has become an increasingly prominent part of the academic program review process. The annual Program Assessment Reports discussed previously in Section 6.1 provide the basis not only for institutional-level reporting on SLEs but also for the five-year summary and analysis provided in the academic program reviews. The focus of the academic program reviews, however, will be those assessment activities most directly related to students' mastery of the specialized content of their disciplines.

• Establish a communication process that demonstrates support of the learning outcomes assessment initiative to faculty and students
• Develop and implement a curricular mapping process that ensures institutional-level student learning outcomes are embedded across all degree programs
• Ensure program-level and school-level reporting on instances of closing-the-loop activities
• Provide school-level faculty development opportunities that act on the results of assessment and further support continuing, effective learning outcomes assessment

The deans are further responsible for ensuring that learning outcomes assessment findings are incorporated into academic program reviews and other curriculum development processes. For each school, the respective dean appoints a senior administrator responsible for the general coordination of learning outcomes assessment activities. These administrators are charged with building a team of program directors and faculty to ensure coordinated direction and strategies for assessment activities within each of the schools. These administrators communicate learning outcomes assessment activities and progress to the deans and faculty, track assessment efforts and resources, and serve as the key points of contact for assessment questions within each school or division.

Numerous other school-level personnel and faculty are involved in learning outcomes assessment; additional roles within the graduate and undergraduate schools are described in school-level assessment plans.

Institutional Planning, Research, and Accountability

Each academic unit is ultimately responsible for learning outcomes assessment in its respective unit. However, there are also offices at the institutional level that exist in support of learning outcomes assessment.
The most prominent of these is Institutional Planning, Research, and Accountability, which serves as a collaborative partner in learning outcomes assessment—providing a framework for assessment and technical assistance in the learning outcomes assessment process.

The Office of Evaluation and Assessment within Institutional Planning, Research, and Accountability has primary responsibility for learning outcomes assessment. Personnel in this office analyze data from the EPP and share results with the university community. Additionally, this office supports each stage of the assessment process as it is implemented by the School of Undergraduate Studies and the Graduate School of Management and Technology.

Among the many forms of support provided to the schools by Institutional Planning, Research, and Accountability are the following:
• Ensure reliability and validity of assessment tools and measures
• Support personnel engaged in assessment activities by providing expertise related to effective student learning assessment
• Design data-driven projects and, where possible, share responsibilities related to the collection, analysis, and interpretation of data

Faculty Advisory Council

Consisting of 18 elected members, the Faculty Advisory Council represents all faculty, including librarians, in UMUC Adelphi, UMUC–Asia, and UMUC–Europe. The council advises the provost on a variety of matters of concern to faculty, including the Institutional Plan for the Assessment of Student Learning Outcomes.

Assessment Liaison, Office of the Provost

The designated assessment liaison works directly with assessment coordinators and other personnel across the institution to support the assessment process.
As appropriate and as needed, the liaison for assessment facilitates communication between the schools and university administration regarding assessment matters, maintains and updates the Institutional Plan and other documentation housed on the Assessment Web site, and assists as needed in meeting institutional reporting requirements.

Undergraduate and Graduate Assessment Coordinators

Assessment coordinators within the School of Undergraduate Studies and the Graduate School of Management and Technology work directly with school personnel and faculty in the design and implementation of assessment instruments. They collaborate with academic departments to manage, support, and document learning outcomes assessment activity within the schools. Assessment coordinators assist program directors in the design as well as the implementation of Program Assessment Plans. Working with school personnel as well as with the assessment liaison in the Office of the Provost, assessment coordinators are directly responsible for producing the one- and three-year SLE Summary Reports (see Section 6 for a discussion of reporting).

8 Learning Assessment Goals and Timelines

In addition to individual Program Assessment Plans (see Section 4.2.3), both the undergraduate and graduate schools have developed school-level plans and schedules.

8.1 Short-Term Planning for Undergraduate Assessment

One- and five-year outcomes assessment schedules for all undergraduate programs are detailed in three charts available in Appendix B. These schedules will be used to ensure that all SLEs assessed by the School of Undergraduate Studies are measured in each program within a three-year cycle.

The first two charts in Appendix B are focused on assessment during the period summer 2009 to spring 2010. During this period, all undergraduate programs will assess at least one program-level outcome, with additional outcomes being assessed in subsequent semesters. Chart 1 is a broad overview showing the breakdown of program-level assessment by SLE category. Chart 2 provides a breakdown by program, showing the specific program outcome to be assessed and the planned assessment tool. Finally, the third chart in Appendix B extends the timeline to four years out, showing for each SLE which programs anticipate a program-level assessment in a required course for the major. Undergraduate programs without such an assessment for a particular SLE must ensure that this SLE is covered during the same period by an assessment taken by their majors in a required general education course. Individual Program Assessment Plans will be updated twice a year, fall and spring, to reflect changes as a result of the outcomes assessment process and to ensure that all SLEs are assessed at the program level at least once every three years.

8.2 Short-Term Planning for Graduate Assessment

Short-term targeted goals guide the annual cycle of learning assessment activities. Currently, the graduate assessment liaisons are working regularly with faculty members to develop rubrics and course assignments to assess the SLEs covered by the graduate school.
The graduate school will conduct assessments every spring semester. Data gathering and analysis will be done during the subsequent summer semester. In conjunction with these activities, each fall semester the graduate school will conduct a review of the findings and of changes to course syllabi, curricula, and programs based on the findings of the previous spring. No changes to the assessment tools will be made until a complete cycle (three rounds) of assessment is completed. The short-term learning assessment goal for the graduate school is to begin the first implementation of assessments in the spring 2010 semester. Further details of past as well as short- and long-term assessment of each SLE are presented in Appendix C.

Graduate Program Assessment Plans will be updated twice a year, fall and spring, to reflect changes as a result of the outcomes assessment process.

8.3 Long-Term Planning for Outcomes Assessment at UMUC

Broad, long-term goals are projected to guide future learning outcomes assessment activities. Long-term learning assessment goals are subject to change.
The broad goals for 2009–14, described in Table 13 below, are refined annually for incorporation into revisions of the Institutional Plan.

Long-Term Learning Outcomes Assessment Goals (2009–14)
1. Conduct twice-annual (freshmen in fall; seniors in spring) assessment of undergraduate-level learning in communication, quantitative reasoning, and critical thinking using the EPP exam.
2. Continue development of measures for, and conduct annual assessment of, undergraduate-level learning in all program outcomes, with full coverage of all outcomes within the 2009–14 cycle.
3. Conduct biannual assessment of undergraduate-level learning in information literacy, technology fluency, and scientific literacy.
4. Conduct annual assessment of graduate-level learning in the Graduate School of Management and Technology.
5. Monitor all Program Assessment Plans for implementation, including data collection, analysis, reporting, and use of results.
6. Investigate opportunities to build practices and processes into WebTycho (the university's proprietary online learning management system) and/or other electronic formats (e.g., an assessment management system such as TK-20) to ease gathering, storing, and analyzing learning outcomes assessment data.

Table 13: Long-Term Learning Outcomes Assessment Goals (2009–14)

Appendix A

Alignment of School-Level Outcomes and SLEs

Alignment of School-Level Outcomes and SLEs

Written Communication (COMM)

Expected Outcome (Graduate): Graduates will be able to
1. Produce writing samples that meet expectations for content and purpose.
2. Develop a clearly articulated and original thesis and/or main idea consistent with expectations of content and purpose.
3. Organize ideas in clear and sequential paragraphs that logically reinforce the main idea.
4. Incorporate sufficient use of appropriate research, supporting evidence, and relevant sources.
5. Use language and tone appropriate in a written document.
6. Critically evaluate information and/or data within boundaries established by a main idea.
7. Display sound grammar, spelling, and appropriate conventions.
8. Produce an acceptably researched and documented extended essay, thesis, or dissertation.

Expected Outcome (Undergraduate): Graduates will be able to
1. Communicate effectively to a target audience.
2. Use expected conventions of format and organization in writing.
3. Use credible reasoning and evidence in communication.
4. Satisfy standards of writing style and grammatical correctness.
5. Produce an acceptably researched and documented extended essay.
6. Incorporate sufficient use of appropriate research, supporting evidence, and relevant sources.

Technology Fluency (TECH)

Expected Outcome (Graduate): Graduates will be able to
1. Explain the generic nature and uses of technologies, both physical and information technologies, for competitiveness.
2. Exhibit technical and managerial competencies in employing, integrating, and managing technologies within organizations to achieve competitive edge.
3. Apply technology in a manner most appropriate to context and discipline.
4. Maintain knowledge of current and new trends in technologies.
5. Adapt to current and new trends in technologies.

Expected Outcome (Undergraduate): Graduates will be able to
1. Identify the basic parts and functions of computers, information systems, networks, and the relationships between data and information in the computer environment.
2. Analyze issues faced by information system professionals, including security, ethical, and privacy problems.
3. Explain the issues in managing information systems.
4. Effectively use the Internet to find, evaluate, and present information.
5. Create simple word processing documents, spreadsheets, databases, and presentations.

Information Literacy (INFO)

Expected Outcome (Graduate): Graduates will be able to
1. Formulate viable and subject-relevant research questions.
2. Use appropriate investigative methods and information retrieval systems.
3. Evaluate the scholarly merits of sources against a set of supportable criteria.
4. Compare new knowledge with prior knowledge to determine the value added, contradictions, or other unique characteristics of information.
5. Use information in ethical and legal ways to support or refute research hypotheses.
6. Cite subject-expert authors and scholarly sources in the respective field of study.
7. Locate relevant books, journals, articles, and scholarly Web sites to support research activities.

Expected Outcome (Undergraduate): Graduates will be able to
1. Determine the nature and extent of information needed.
2. Access needed information effectively and efficiently.
3. Evaluate information and sources critically.
4. Individually, or as a member of a group, use information effectively to accomplish a specific purpose.
5. Understand the economic, legal, and social issues surrounding the use and access of information.

Alignment of School-Level Outcomes and SLEs

Quantitative Reasoning (QUAN)

Expected Outcome (Graduate): Graduates will be able to
1. Perform quantitative operations relative to the chosen field of study.
2. Use skills involved in data collection and interpretation for the purpose of describing phenomena, creating hypotheses, and analyzing results.
3. Interpret data in graphical, tabular, or abstracted form for the purpose of summarizing results, revealing underlying trends, and communicating key meanings.
4. Evaluate evidence and assertions based on quantitative information and reasoning for the purposes of prediction, decision making, and problem solving as well as determining risk and uncertainty.
5. Recognize the limitations of mathematical and statistical methods when creating and critiquing quantitative reasoning.

Expected Outcome (Undergraduate): Graduates will be able to
1. Demonstrate understanding of basic quantitative and mathematical principles.
2. Interpret quantitative information and ideas.
3. Communicate quantitative information effectively.

Critical Thinking (THIN)

Expected Outcome (Graduate): Graduates will be able to
1. Demonstrate the ability to integrate disparate concepts, theories, academic disciplines, and contexts (e.g., social, ethical, legal, cultural, global).
2. Connect skills and knowledge from multiple sources, academic disciplines, and experiences.
3. Apply theory to practice and vice versa in various settings.
4. Utilize diverse and possibly contradictory information and points of view.
5. Understand, synthesize, and critically evaluate issues and information contextually so as to make informed decisions and judgments in the conduct of personal, professional, and civic life.
6. Show evidence of linkage between affective and cognitive ("head, hand, and heart") learning in their management education.

Expected Outcome (Undergraduate): Graduates will be able to
1. Make linkages or connections between diverse facts, theories, and observations.
2. Use reasoning to recognize, develop, defend, and criticize arguments and other persuasive appeals.
3. Distinguish among assumptions, emotional appeals, speculations, and defensible evidence.
4. Weigh support for conclusions to determine how well reasons support conclusions.
5. Develop credible responses to complex questions.

Scientific Literacy (SCIE)

Expected Outcome (Graduate): N/A

Expected Outcome (Undergraduate): Graduates will be able to
1. Describe basic concepts, principles, and common themes of the natural, social, and behavioral sciences.
2. Cite the strengths and limitations of scientific methods of inquiry.
3. Form relevant, reasonable, and sound scientific questions.
4. Think critically to recognize and weigh alternative scientific evidence, explanations, and conclusions.

Alignment of School-Level Outcomes and SLEs

Historical and Cultural Perspectives (HIST)

Expected Outcome (Graduate): N/A

Expected Outcome (Undergraduate): Graduates will be able to
1. Recognize basic characteristics of important historical periods and cultures.
2. Use appropriate historical and cultural evidence to form relevant, reasonable, and objective conclusions.
3. Demonstrate understanding of key historical and cultural perspectives.
4. Demonstrate an understanding of diversity in a global context.

Ethics (ETH)

Expected Outcome (Graduate): N/A

Expected Outcome (Undergraduate): Graduates will be able to
1. Explain how their decisions and actions affect the human and physical environment.
2. Explain the philosophy, techniques, and ethical decision making involved in making choices.

Content/Discipline (SPEC)

Expected Outcome (Graduate): N/A

Expected Outcome (Undergraduate): Graduates will be able to
1. Apply principles and theories in the discipline.
2. Conduct research appropriate to the discipline.
3. Integrate principles and theories of the discipline for evaluation and application.

Appendix B

Assessment Plans for the School of Undergraduate Studies

Undergraduate assessment plans include the following:
• Overview of Undergraduate Program-Level Assessment of SLE Areas, Spring 2010
• Summary Plans for Undergraduate Program-Level Assessment of SLE Areas, 2009–10
• Undergraduate Program-Level Assessments by SLE (All Programs), 2010–14

Academic programs, by Student Learning Expectation assessed (spring 2010):
• COMM: Academic Writing, Fire Science, Humanities, Legal Studies
• TECH: Computer and Information Science, Computer Information Technology, Computer Science, Computer Studies, Information Systems Management
• INFO: Asian Studies, Government and Political Science, Humanities, Management Studies, Psychology
• QUAN: Finance
• SCIE: Biotechnology, Environmental Management, Gerontology, Laboratory Management
• THIN: Accounting, Criminal Justice, English, Gerontology, Government and Political Science, Human Resource Management, Legal Studies, Marketing
• ETH*: Global Business and Public Policy, Information Assurance, Legal Studies
• HIST*: History

*Not included in external reporting.

Chart 1: Overview of Undergraduate Program-Level Assessment of SLE Areas, Spring 2010

Undergraduate Program-Level and SLE Assessment, 2009–10

Program/Major | SLE(s) | Course/Outcome | Tool
• Academic Writing | COMM | WRTG 101, Outcome 1 | Final exam question w/rubric
• Accounting | THIN | ACCT 495, Outcome 5 | Project w/rubric
• Asian Studies | INFO | ASTD 160, Outcome 2 | Exam question w/rubric
• Biotechnology | SCIE | BIOL 362, Outcome 1 | Exam question w/rubric
• Biotechnology | SCIE | BIOL 486A/486B, Outcome 2 | Being developed
• Business Administration | SPEC | BMGT 364, Outcome 3 | Final exam question w/rubric
• Communication Studies | SPEC | COMM 300, Outcome 2 | Common assignment w/rubric
• Communication Studies | TECH | COMM 379A, Outcome 5 | Being developed
• Computer and Information Science | TECH | CMIS 102, Outcome 1 | Final exam question
• Computer and Information Science | THIN, TECH | CMIS 141/CMIS 310 | Being developed
• Computer Information Technology | TECH | CMIT 265, Outcome 1 | Scenario-based essay question w/rubric

Undergraduate Program-Level and SLE Assessment, 2009–10 (continued)
Program/Major – SLE(s) – Course/Outcome – Tool – Status codes (SU/FA/SP)
Computer Science – TECH – CMSC 230, Outcome 1 – Project w/rubric – DA DM
Computer Science – TECH – CMSC 130, Outcome 2 – Being developed – DA DM
Computer Studies – TECH (2) – CMST 103, Outcome 1 & Outcome 2 – Final exam question w/rubric
Criminal Justice – THIN – CCJS 100, Outcome 2 – Paper w/rubric – DA DM
Criminal Justice – THIN – CCJS 100, Outcome 3 – Being developed – DA DM
Emergency Management – SPEC – HMLS 495, Outcome 5 – Research paper w/rubric
English – THIN – ENGL 303, Outcome 1 – Final exam question w/rubric
Environmental Management – SCIE – ENMT 301, Outcome 1 – Essay w/rubric – DA DM
Environmental Management – SCIE – ENMT 495 – Being developed – DA DM
Finance – QUAN – FINC 330, Outcome 1 – Common exam w/rubric
Fire Science – COMM – HMLS 495, Outcome 2 – Research paper w/rubric
Gerontology – THIN/SCIE – GERO 302 – Final exam essay question w/rubric
Global Business and Public Policy – ETH – BMGT 392, Outcome 2 – Research paper w/rubric
Government and Political Science – THIN – GVPT 444 – Final exam essay question w/rubric
Government and Political Science – INFO – GVPT 480, Outcome 5 – Being developed – DA DM
History – HIST – HIST 157, Outcome 1 – Final exam question w/rubric
History – HIST – HIST 309, Outcome 2 – Being developed – DA DM IA
History – HIST – HIST 116, Outcome 3 – Being developed – DA DM
Homeland Security – SPEC – HMLS 495, Outcome 7 – Research paper w/rubric
Humanities – INFO, COMM – PHIL 315 – Assigned essay w/rubric
Human Resource Management – THIN – HRMN 495 – Final paper w/rubric – DA DM
Information Assurance – ETH – IFSM 310 – Research paper w/rubric – DA DM
Information Systems Management – TECH – IFSM 300, Outcome 1 – Final exam question – DA DM
Investigative Forensics – SPEC – CCJS 320, Outcome 1 – Paper w/rubric

Undergraduate Program-Level and SLE Assessment, 2009–10 (continued)
Program/Major – SLE(s) – Course/Outcome – Tool – Status codes (SU/FA/SP)
Investigative Forensics – SPEC – CCJS 425, Outcome 2 – Being developed – DA DM
Laboratory Management – SCIE – NSCI 301 – Exam question – DA DM
Laboratory Management – SCIE – BIOL 486A/486B, Outcome 2 – Being developed – DA DM
Legal Studies – ETH – LGST 204, Outcome 2 – Memorandum w/rubric – IA CD AD
Legal Studies – THIN – LGST 204, Outcome 5 – Memorandum w/rubric – IA CD AD
Legal Studies – COMM – LGST 204, Outcome 6 – Memorandum w/rubric – IA CD AD
Legal Studies – COMM – LGST 201, Outcome 8 – Being developed – IA CD AD
Management Studies – INFO – BMGT 364, Outcome 2 – Essay w/rubric – DA DM
Marketing – THIN – MRKT 495, Outcome 1 – Case study w/rubric – DA DM
Psychology – INFO – PSYC 305, Outcome 3 – Final exam question w/rubric
Social Science – SPEC – SOCY 100, Outcome 1 – Common exam – DA DM

Chart 2: Summary Plans for Undergraduate Program-Level Assessment of SLE Areas, 2009–10
Codes: Develop Assignment = DA; Define Measures = DM; Implement Assignment in Course = IA; Collect Data = CD; Analyze Data = AD; Report Data = RD; Determine and Implement Needed Changes = DI

Prior Learning/Portfolio and Cooperative Education (COOP) will also be assessed. Because of the format of these programs, they do not fit consistently with the model used for the assessment of all other programs. However, each of these programs has developed desired outcomes and is continuing to work on effective models for assessment.

The chart below shows the schedule for all undergraduate program-level assessments for 2010–14. In each term, the listed programs are assessed across the SLE areas (COMM, TECH, INFO, QUAN, SCIE, THIN, and ETH).

Summer or Fall 2010: Asian Studies; Computer and Information Science; Computer Science; English; Environmental Management; Finance; Information Assurance; Information Systems Management; Investigative Forensics; Legal Studies; Political Science.

Spring 2011: Asian Studies; Computer and Information Science; Computer Information Technology; Computer Science; Computer Studies; Emergency Management; Homeland Security; Legal Studies; Marketing; Political Science; Writing.

Summer or Fall 2011: Accounting; Biotechnology; Business Administration; Communication Studies; Computer and Information Science; Computer Science; Environmental Management; Human Resource Management; Humanities; Information Assurance; Information Systems Management; Investigative Forensics; Legal Studies; Social Sciences.

Spring 2012: Biotechnology; Communication Studies; Computer Information Technology; Computer Studies; English; Environmental Management; Fire Science; Gerontology; Global Business and Public Policy; Homeland Security; Human Resource Management; Humanities; Laboratory Management; Marketing; Psychology; Writing.

Summer or Fall 2012: Accounting; Asian Studies; Business Administration; Computer and Information Science; Computer Science; Criminal Justice; Finance; Humanities; Information Assurance; Information Systems Management; Investigative Forensics; Laboratory Management; Legal Studies; Management Studies; Social Sciences.

Spring 2013: Communication Studies; Computer and Information Science; Computer Information Technology; Computer Science; Criminal Justice; Emergency Management; Fire Science; Humanities; Marketing.

Summer or Fall 2013: Biotechnology; Computer Information Technology; English; Finance; Gerontology; Humanities; Information Assurance; Information Systems Management; Investigative Forensics; Laboratory Management; Psychology.

Spring 2014: Asian Studies; Biotechnology; Business Administration; Computer Information Technology; Environmental Management; Fire Science; Global Business and Public Policy; Humanities; Laboratory Management.

Chart 3: Undergraduate Program-Level Assessments by SLE (All Programs), 2010–14

Appendix C
Assessment Plans for the Graduate School of Management and Technology

Plans for Graduate Assessment by SLE

SLE: Written Communication
Past Assessment: A standardized rubric, entitled the GSMT Standardized Writing Rubric, was developed and piloted in 2006. Assessment of graduate students began in spring 2007 using the standardized rubric. Results were reported in August 2007.

Master – Accounting & Financial Management: Round 1: FIN 610, Short Paper, scored using GSMT COMM Rubric; Round 2: ACCT 612, Research Paper, scored using GSMT COMM Rubric; Round 3: FIN 660, CFO Research Paper, scored using GSMT COMM Rubric.
Master – Accounting & Information Technology: Round 1: ACCT 610, Short Paper, scored with GSMT COMM Rubric; Round 2: ACCT 614, Short Paper, scored with GSMT COMM Rubric; Round 3: ACCT 665, Short Paper, scored with GSMT COMM Rubric.
Master – Financial Management & Information Systems: Round 1: FIN 610, Short Paper, scored using GSMT COMM Rubric; Round 2: FIN 645, Short Paper, scored using GSMT COMM Rubric; Round 3: MSFS 670, Short Paper, scored using GSMT COMM Rubric.
Master – Business Administration (MBA): Round 1: AMBA 610, GSMT COMM Rubric; Round 2: AMBA 640, GSMT COMM Rubric; Round 3: AMBA 670, GSMT COMM Rubric.
Master – Environmental Management: Round 1: UCSP 611, Final Exam; Round 2: ENVM 670; Round 3: ENVM 643.
Master – Distance Education: Round 1: OMDE 603, Essay and/or Writing Task; Round 2: OMDE 606, Project; Round 3: OMDE 670, Essay and/or Writing Task.
Master – Health Care Administration: HCAD 600, Research Paper.

SLE: Technology Fluency
Past Assessment: No previous assessment.

Master – Accounting & Financial Management: Round 1: MGMT 640, Homework in Excel, scored using modified TECH Rubric; Round 2: FIN 630, Homework in Excel, scored using modified TECH Rubric; Round 3: FIN 660, Simulation Homework Assignment, scored using modified TECH Rubric.
Master – Accounting & Information Technology: Round 1: ACCT 610, Homework in Excel, scored using modified TECH Rubric; Round 2: ACCT 614, Homework in Access, scored using TECH Rubric; Round 3: ACCT 665, Homework in Excel, scored using modified TECH Rubric.
Master – Financial Management & Information Systems: Round 1: MGMT 640, Homework in Excel, scored using modified TECH Rubric; Round 2: FIN 630, Homework in Excel, scored using modified TECH Rubric; Round 3: MSFS 670, Homework in Excel, scored using modified TECH Rubric.
Master – Business Administration (MBA): Round 2: Course-identified rubric and assignment; Round 3: Course-identified rubric and assignment.
Master – Environmental Management: Round 2: ENVM 649, Exam; Round 3: ENVM 670, Term Project.
Master – Distance Education: Round 1: OMDE 603, Essay and/or Writing Task; Round 2: OMDE 606, Project; Round 3: OMDE 670, E-portfolio.
Master – Health Care Administration: HCAD 600.

Plans for Graduate Assessment by SLE

SLE: Information Literacy
Past Assessment: A standardized final examination, the UCSP 611 Final Exam, was selected as the tool to measure institutional-level student learning. The assessment tool was developed and piloted in 2006. Assessment of graduate students began in spring 2007. Results were reported in October 2007.

Master – Accounting & Financial Management: Round 1: UCSP 611 Final Exam or ACCT 608 Article Briefing Paper, scored using GSMT Modified INFO Rubric; Round 2: ACCT 665, Short Paper, scored using GSMT Modified INFO Rubric; Round 3: MSAF 670, Article Briefing Paper, scored using GSMT Modified INFO Rubric.
Master – Accounting & Information Technology: Round 1: UCSP 611 Final Exam or ACCT 610 Short Paper, scored using GSMT Modified INFO Rubric; Round 2: ACCT 608, Article Briefing Paper, scored using GSMT Modified INFO Rubric; Round 3: ACCT 665, Article Briefing Paper, scored using GSMT Modified INFO Rubric.
Master – Financial Management & Information Systems: Round 1: UCSP 611 Final Exam or FIN 615 Short Paper, scored using GSMT Modified INFO Rubric; Round 2: FIN 630, Short Paper, scored using GSMT Modified INFO Rubric; Round 3: MSFS 670, Short Paper, scored using GSMT Modified INFO Rubric.
Master – Business Administration (MBA): Round 1: UCSP 611, Final Exam; Round 2: Rubric and Assignment from OA Departments; Round 3: Rubric and Assignment from OA Department.
Master – Environmental Management: Round 1: UCSP 611 Final Exam or ENVM 646 Assignment; Round 2: ENVM 643, Assignment; Round 3: ENVM 649, Term Paper.
Master – Distance Education: Round 1: UCSP 611, Final Exam; Round 2: OMDE 606, Project; Round 3: OMDE 670, Essay and/or Writing Task.
Master – Health Care Administration: UCSP 611 Final Exam or HCAD 600 Research Paper.

SLE: Critical Thinking
Past Assessment: No previous assessment. School-level outcomes in the graduate school for critical thinking emphasize the importance of integrating a variety of skills from SLE areas as well as knowledge from a variety of disciplinary areas. Consequently, critical thinking will not be assessed in Round 1 for any of the programs: given the emphasis on integrating learned skills, there is nothing to integrate at the outset of the program. After three or more courses have been taken, by Round 2, this SLE will be assessed.

Master – Accounting & Financial Management: Round 2: FIN 614, Group Project, scored using Custom Rubric; Round 3: FIN 660, Simulation, scored using Custom Rubric.
Master – Accounting & Information Technology: Round 2: ACCT 614, Group Project, scored using Custom Rubric; Round 3: MSAT 670, Group Project, scored using Custom Rubric.
Master – Financial Management & Information Systems: Round 2: FIN 614, Group Project, scored using Custom Rubric; Round 3: MSFS 670, Group Project, scored using Custom Rubric.
Master – Business Administration (MBA): Courses, assignments, and rubrics are being identified for Rounds 2 and 3.
Master – Environmental Management: Round 2: ENVM 646, Assignment; Round 3: ENVM 670, Term Project.
Master – Distance Education: Round 2: OMDE 606, Project; Round 3: OMDE 670.
Master – Health Care Administration: HCAD 670, Case.

Plans for Graduate Assessment by SLE

SLE: Content
Past Assessment: No previous assessment. Each department is developing a rubric to assess program content knowledge.

Master – Accounting & Financial Management: Round 1: MGMT 640, Final Exam; Round 2: FIN 630, Final Exam; Round 3: MSAF 670, Final Exam.
Master – Accounting & Information Technology: Round 1: ACCT 610, Final Exam; Round 2: ACCT 614, Final Exam; Round 3: MSAT 670, Final Exam.
Master – Financial Management & Information Systems: Round 1: MGMT 640, Final Exam; Round 2: FIN 615, Final Exam; Round 3: MSFS 670, Group Project Assignment, scored using Custom Rubric.
Master – Business Administration (MBA): Assignments and rubrics embedded in three courses at each of three rounds.
Master – Environmental Management: Round 1: ENVM 643, Assignment; Round 2: ENVM 646, Assignment; Round 3: ENVM 670, Term Project.
Master – Distance Education: Round 1: OMDE 603, Essay and/or Writing Task; Round 2: OMDE 606, Project; Round 3: OMDE 670, Essay and/or Writing Task.
Master – Health Care Administration: HCAD 670, Case.

Appendix D
Example of a Program Assessment Plan in the School of Undergraduate Studies

Five-Year Program Assessment Plan
Five-Year Assessment Cycle Planning (Tentative Schedule)

In completing this schedule, please keep in mind that all of your program outcomes should be assessed at least once during the five-year cycle so that, when a program/discipline comes up for review, you can report on the learning for all of the goals. Every goal should be listed at least once, but not every goal needs to be assessed every year.

Date: February 26, 2009
Program/Discipline Name: Computer Information Technology
Program Mission: The computer information technology major prepares students to enter or advance in information technology fields where certification of knowledge level is commonly considered in hiring and promotion decisions. The major is designed to combine the benefits of a traditional college education with the benefits of hands-on training in state-of-the-art computer technology. Students become technically competent but also learn to write well-organized and clear memoranda and reports. The curriculum integrates technical skill with communication skills, superior general education requirements, and breadth of knowledge in computer information technology fields, particularly networking.

Student Learning Outcomes:
1. Develop solutions to network administration problems.
2. Apply effective information research techniques.
3. Use effective communication.
4. Critically analyze multifaceted network problems and develop solutions (THIN).
5. Articulate ethical issues related to the impact of information technology on contemporary issues (ETH).

Schedule (Su09–Sp14), by outcome and course:
Outcome 1 – CMIT 265: DA DM IA CD AD RD DI (two full cycles across the five years)
Outcome 2 – CMIT 368: DA DM IA CD AD RD DI
Outcome 3 – CMIT 368: DA DM IA CD AD RD DI
Outcome 4 – Capstone: DA DM IA CD AD RD DI
Outcome 5 – Capstone and CMIT 321: DA DM IA CD AD

Field Codes: Develop Assignment = DA; Define Measures = DM; Implement Assignment in Course = IA; Collect Data = CD; Analyze Data = AD; Report Data = RD; Determine and Implement Needed Changes = DI
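The field codes above describe a fixed pipeline that every outcome moves through across the cycle. As a minimal illustration (not part of the plan itself; the function and variable names are hypothetical), the cycle can be modeled in Python and a scheduled sequence of stages checked for correct ordering:

```python
# Illustrative sketch: the assessment-cycle field codes modeled as an
# ordered workflow, with a helper that checks whether a plan's recorded
# stage codes occur in the cycle's order.

ASSESSMENT_STAGES = [
    "DA",  # Develop Assignment
    "DM",  # Define Measures
    "IA",  # Implement Assignment in Course
    "CD",  # Collect Data
    "AD",  # Analyze Data
    "RD",  # Report Data
    "DI",  # Determine and Implement Needed Changes
]

def stages_in_order(recorded):
    """Return True if the recorded stage codes follow the cycle's order."""
    positions = [ASSESSMENT_STAGES.index(code) for code in recorded]
    return positions == sorted(positions)

# Outcome 1 (CMIT 265) as scheduled in the plan above:
plan = ["DA", "DM", "IA", "CD", "AD", "RD", "DI"]
print(stages_in_order(plan))          # True
print(stages_in_order(["IA", "DA"]))  # False: implementation before development
```

A check like this is only a bookkeeping aid; the substantive work remains in the assignments, rubrics, and data analysis that each code stands for.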

Appendix E
Example of a Program Assessment Plan in the Graduate School of Management and Technology

Master of Science in Accounting and Financial Management (MSAF)
Management, Accounting, and Finance Department
Program Assessment Plan
Program Outcomes and Learning Assessment Criteria
Spring 2009

CONTENTS
Program Overview
Program of Study
Development of Program Outcomes
Program Outcomes
Alignment of Program Outcomes with Learning Objectives and Assessment Methods

PROGRAM OVERVIEW
The Master of Science in Accounting and Financial Management Program is designed for students who want to gain a comprehensive understanding of the financial reporting process, the impact of financial reporting on the financial markets, and the use and analysis of financial information for better decision making. Graduates possess academic depth in both the fields of accounting and finance upon completion of coursework in financial and managerial accounting, taxation, financial management of operations, capital markets, investments, global finance, corporate ethics, and e-commerce. The resulting competencies will enable graduates to pursue management positions of increasing responsibility, with potential to assume the role of chief financial officer. Additionally, successful completion of the program may satisfy the education requirement for candidacy for the Certified Public Accountant examination.

PROGRAM OF STUDY
Core Courses
UCSP 611 Introduction to Graduate Library Research Skills (0 credit hours)
ACCT 608 Fraud Detection and Accounting Ethics (3 credit hours)
ACCT 610 Financial Accounting (3 credit hours)
ACCT 612 Auditing Process (3 credit hours)
ACCT 613 Federal Income Taxation (3 credit hours)
ACCT 665 Special Topics in Accounting (3 credit hours)
FIN 610 Financial Management in Organizations (3 credit hours)
FIN 620 Capital Markets, Institutions, and Long-Term Financing (3 credit hours)
FIN 630 Investment Valuation (3 credit hours)
FIN 645 Behavioral Finance (3 credit hours)
FIN 660 Strategic Financial Management (3 credit hours)
MGMT 640 Financial Decision Making for Managers (3 credit hours)
MSAF 670 Financial Management and Accounting Capstone (3 credit hours)

DEVELOPMENT OF PROGRAM OUTCOMES
The list below identifies the curricular influences that support the program outcomes specific to the Master of Science in Accounting and Financial Management Program.

Sources/Resources Providing Curricular Foundation for Program Outcomes:

Student Learning Expectations of the UMUC Graduate School of Management and Technology – UMUC degree programs are required to embed identified institutional SLEs into each degree program. The SLEs for the Graduate School of Management and Technology are Written Communication (COMM), Information Literacy (INFO), Technology Fluency (TECH), Program Content Knowledge (KNOW/SPEC), and Integrative Skills (THIN). The expanded definition for each Student Learning Expectation was considered in creating the respective program outcome. (Document: UMUC Institutional Plan for the Assessment of Student Learning Outcomes)

American Institute of Certified Public Accountants (AICPA) – The AICPA writes the Certified Public Accountant examination and suggests curricula that will prepare students for entry into the public accounting profession. (http://www.aicpa.org)

American Finance Association (AFA) – The AFA is an academic organization that publishes The Journal of Finance, a leading resource on the application of finance theory. (http://www.afajof.org)

Financial Education Association – This professional association of finance academicians has as its mission to enhance the teaching of financial education and improve the collegiate financial education experience and curriculum. (http://www.fea.sju.edu)

International Federation of Accountants (IFAC) – The IFAC is a pioneer in discussions of distance education in accounting and suggests standards that should be met by such programs. (http://www.ifac.org)

State Board of Public Accountancy of Maryland – The Maryland State Board of Public Accountancy sets education requirements for eligibility for the Certified Public Accountant examination in Maryland.

PROGRAM OUTCOMES
The program outcomes for the Master of Science in Accounting and Financial Management Program are delineated below. They describe the expectations for all graduates of the program.

COMM – Use effective communication to express clearly ideas about accounting and financial management in a professional manner and tone.
INFO – Identify problem areas in accounting and financial management and information sources useful in forming resolutions to such problems.
TECH – Utilize technology and information systems in the financial management of organizations to share access to information and improve the quality of decision making enterprise-wide.
KNOW (SPEC) – Integrate accounting and financial management concepts, principles, and applications into a coherent, structured system for analysis and resolution of problems within organizations.
THIN – Evaluate complex problems and ethical situations found in accounting and financial management and potential appropriate solutions within the context of the internal structures and external institutions that influence multinational organizations in ways that evidence problem solving, systems thinking, critical thinking, ethical behavior, sensitivity to diversity, teamwork, and quantitative reasoning.

The accompanying grid maps each outcome to the courses in which it is assessed (ACCT 608, ACCT 610, ACCT 612, ACCT 665, FIN 610, FIN 630, FIN 645, FIN 660, MGMT 640); the number under each course column indicates the round in which the specific SLE will be assessed.

PROGRAM OUTCOMES
Master of Science in Accounting and Financial Management Program: Summary Schedule by Rounds

COMM – Use effective communication to express clearly ideas about accounting and financial management in a professional manner and tone. Round 1: FIN 610; Round 2: ACCT 612; Round 3: FIN 660.
INFO – Identify problem areas in accounting and financial management and information sources useful in forming resolutions to such problems. Round 1: ACCT 608; Round 2: ACCT 665; Round 3: MSAF 670.
TECH – Utilize technology and information systems in the financial management of organizations to share access to information and improve the quality of decision making enterprise-wide. Round 1: MGMT 640; Round 2: FIN 630; Round 3: FIN 660.
KNOW (SPEC) – Integrate accounting and financial management concepts, principles, and applications into a coherent, structured system for analysis and resolution of problems within organizations. Round 1: MGMT 640; Round 2: FIN 630; Round 3: MSAF 670.
THIN – Evaluate complex problems and ethical situations found in accounting and financial management and potential appropriate solutions within the context of the internal structures and external institutions that influence multinational organizations in ways that evidence problem solving, systems thinking, critical thinking, ethical behavior, sensitivity to diversity, teamwork, and quantitative reasoning. Round 1: ACCT 610; Round 2: FIN 645; Round 3: FIN 660.

ALIGNMENT OF PROGRAM OUTCOMES WITH LEARNING OBJECTIVES AND ASSESSMENT METHODS
The following grid aligns the program outcomes of the Master of Science in Accounting and Financial Management Program with learning objectives from the designated program coursework and with the specific methods used to assess student learning within the degree program.

COMM – Use effective communication to express clearly ideas about accounting and financial management in a professional manner and tone.
Round 1: FIN 610 – Identify the functions of financial management in public, private, and not-for-profit organizations. Assessment: Short Paper, scored using GSMT COMM Rubric.
Round 2: ACCT 612 – Explain the meaning of Generally Accepted Auditing Standards and distinguish the types of assurance and attestation provided by independent auditors. Assessment: Research Paper, scored using GSMT COMM Rubric.
Round 3: FIN 660 – Analyze the financial management policies and practices of selected organizations and recommend methods for improving them. Assessment: CFO Research Paper, scored using GSMT COMM Rubric.

INFO – Identify problem areas in accounting and financial management and information sources useful in forming resolutions to such problems.
Round 1: ACCT 608 – Analyze the different types of fraud and their causes in a variety of contexts. Assessment: Article Briefing Paper, scored using GSMT Modified INFO Rubric.
Round 2: ACCT 665 – Identify the diversity in global accounting practices, the difficulties such diversity causes, and the pros and cons of international harmonization of accounting standards. Assessment: Short Paper, scored using GSMT Modified INFO Rubric.
Round 3: MSAF 670 – Assess the strengths and weaknesses of corporate strategies from recent financial performance. Assessment: Article Briefing Paper, scored using GSMT Modified INFO Rubric.

TECH – Utilize technology and information systems in the financial management of organizations to share access to information and improve the quality of decision making enterprise-wide.
Round 1: MGMT 640 – Perform a financial analysis of a for-profit firm using the organization's financial statements and conducting a financial ratio analysis. Assessment: Homework in Excel, scored using modified TECH rubric.
Round 2: FIN 630 – Apply valuation principles using various comparables techniques, such as multiples of earnings and revenues. Assessment: Homework in Excel, scored using modified TECH rubric.
Round 3: FIN 660 – Analyze the financial management policies and practices of selected organizations and recommend methods for improving them. Assessment: Simulation Homework Assignment, scored using modified TECH rubric.

KNOW (SPEC) – Integrate accounting and financial management concepts, principles, and applications into a coherent, structured system for analysis and resolution of problems within organizations.
Round 1: MGMT 640 – Apply financial and nonfinancial information to both short-term and long-term management decisions in for-profit, nonprofit, and government organizations. Assessment: Final Exam.
Round 2: FIN 630 – Assess the impact of managerial and financial decision making on the value of the firm. Assessment: Final Exam.
Round 3: MSAF 670 – Apply the course material and the concepts, principles, and tools developed in the financial management and accounting courses to support the valuation estimate of a given firm. Assessment: Final Exam.

THIN – Evaluate complex problems and ethical situations found in accounting and financial management and potential appropriate solutions within the context of the internal structures and external institutions that influence multinational organizations in ways that evidence problem solving, systems thinking, critical thinking, ethical behavior, sensitivity to diversity, teamwork, and quantitative reasoning.
Round 1: ACCT 610 – Articulate issues currently debated in modern accounting research and demonstrate the ability to critique relevant research based on professional accounting literature. Assessment: ACCT 610 – Group Project, scored using Custom Rubric.
Round 2: FIN 645 – Apply the effects of potential biases with the use of valuation heuristics to real-world scenarios provided by the instructor. Assessment: FIN 614 – Group Project, scored using Custom Rubric.
Round 3: FIN 660 – Apply the concepts and tools developed in the track courses to the solution of financial management problems that arise in cases and/or simulations in this course. Assessment: Simulation, scored using Custom Rubric.

Appendix F
Example of an Undergraduate Program Assessment Report: Social Sciences (2009)

Learning Outcomes Assessment Report
Program: Social Sciences
Course: Sociology 100
Term: Fall 2009

I. Program Learning Outcome Assessed
Explain basic foundational concepts of sociology.

II. Methods
A 50-item, 4-option, multiple-choice test was used. The test was a common final exam. The exam had three test forms: A, B, and make-up. The analyses below are for Form A. The sample sizes for Form B (n = 18) and the make-up (n = 37) were too small to allow for meaningful analysis. Form A reflects both CBT and paper-and-pencil data.

III. Results
• The sample size was 152. The minimum score in the sample was 18 out of 50. The maximum score in the sample was 50 out of 50. The mean score was 35.03 (SD = 5.92).
• For this test, coefficient alpha is .774. This test has a good level of reliability. No items should be removed based on the reliability analysis.
• Based on the item difficulty analysis, Items 7, 8, 12, 16, 17, 18, 20, 25, 33, 40, and 43 are very easy; they have a proportion correct of 90% or higher. That means that 90% or more of the students who took the exam answered these items correctly. Items 39, 50, and 4 are difficult; they reflect a proportion correct of 20% to 30%.
• Based on the item discrimination indices, special attention should be paid to Items 5, 16, 45, 1, 2, 6, 10, 12, 13, 15, 20, 21, 23, 32, 34, 35, 37, 40, and 43 because their discrimination level is very low, and in some cases zero. The only item with a high level of discrimination is Item 47. Generally, these items do not perform well in discriminating between students who know the material and those who do not.
• The items in this test that have fully functioning distracters are Items 19, 21, 23, 28, 37, 38, 44, 45, 47, 48, and 49. For all other test items, the distracters should be reviewed, and item writers should change the unused distracters into more feasible options.
• See attached Item Analysis Report for details.

IV. Next Steps
• The primary suggestion for improving the test is to rewrite the distracters that are chosen less than 5% of the time so that they are more feasible options.
• Remember to define a benchmark using the results, against which future results can be compared to determine whether desired improvements in learning are being reached.
• Remember to disseminate results, reports, recommendations, and changes to appropriate audiences (e.g., faculty).

V. Program Additions

Learning Outcomes Assessment Report: Program Comments and Actions
Social Sciences Program
Submitted by Darlene A. Smucny, Ph.D.
5/3/2010

I. Summary
The Learning Outcomes Assessment Report of the Item Analysis (psychometric analysis and suggestions for content review) was reviewed by the Academic Director (Darlene Smucny) and the SOCY 100 Exam Chair, and the document was posted in the Faculty Resource classroom (BEHS 999) for general discussion. To better determine which areas of the course may pose more difficulty to students, or are less clear, the multiple-choice test items were mapped to course goals and objectives (Table I, Table II).
Preliminary observations of the mapping reveal that the three most difficult items all mapped to course objective #1, while the very easy questions mapped to a number of the objectives (#1, 3, 4, 5, 7, and 8). The Academic Director and the Exam Chair will work to improve the discrimination of the multiple-choice test items by rewriting the distracters, particularly for items identified as having low discrimination (Table II). The test items will be further reviewed in terms of basic sociological concepts, specifying beyond the course goals and objectives. Working with the faculty, the Program will identify the benchmarks of student learning for this assessment.
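The difficulty and discrimination indices described in the report can be computed directly from a scored response matrix (1 = correct, 0 = incorrect). The sketch below uses synthetic data and hypothetical function names, not the actual SOCY 100 exam data:

```python
# Illustrative item analysis on synthetic data.
# Difficulty = proportion of students answering the item correctly;
# discrimination = Pearson correlation between the item score and the
# total score on the remaining items (corrected point-biserial).

def item_difficulty(scores, item):
    """Proportion of students answering the item correctly (p-value)."""
    return sum(row[item] for row in scores) / len(scores)

def item_discrimination(scores, item):
    """Corrected point-biserial discrimination index for one item.
    Assumes the item column is not constant (otherwise the correlation
    is undefined)."""
    col = [row[item] for row in scores]
    rest = [sum(row) - row[item] for row in scores]
    n = len(col)
    mc, mr = sum(col) / n, sum(rest) / n
    cov = sum((c - mc) * (r - mr) for c, r in zip(col, rest))
    sc = sum((c - mc) ** 2 for c in col) ** 0.5
    sr = sum((r - mr) ** 2 for r in rest) ** 0.5
    return cov / (sc * sr)

# Synthetic responses: 6 students x 4 items (1 = correct).
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
    [1, 0, 0, 0],
    [0, 1, 1, 1],
]
print(item_difficulty(scores, 0))                    # 0.833...
print(round(item_discrimination(scores, 3), 2))      # 0.24
```

By the cutoffs used in the report, an item answered correctly by 90% or more of students would be flagged as very easy, and a near-zero discrimination value would flag an item that fails to separate students who know the material from those who do not.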

II. Tasks, Action Items, and Timeline

Task: Improve discrimination of multiple-choice test items. Action Item: Work with Exam Chair in rewriting distracters for items with low discrimination. Target Date: Spring 2010.
Task: Determine how sociology course goals and objectives, and then concepts, map to item analysis results. Action Item: Map course goals and objectives, and then more specific concepts, to results of item analysis; identify possible problem areas in the course and discuss with instructors. Target Date: Summer 2010.
Task: Define benchmarks of student learning for this assessment. Action Item: Work with SOCY faculty to define benchmarks. Target Date: Summer 2010.
Task: Use assessment results to improve teaching practices in SOCY 100. Action Item: Work with SOCY working group to prepare guidelines for best practices for SOCY 100 instruction ("course guide"). Target Date: Academic Year 2010–11 (as part of the SEGUE curricular redesign process in the School of Undergraduate Studies).

Table I: SOCY 100 Course Objectives
1. identify the main components of a sociological perspective as contrasted with a common-sense perspective
2. apply key theoretical frameworks to such institutionalized spheres as family, work, and the economy
3. explain sociologically how social structures and cultures serve both to facilitate as well as to constrain people's actions and interactions
4. identify components of social stratification as rooted in such characteristics as race, gender, class, and age
5. identify social, cultural, and economic differences among developed and newly developing countries
6. describe the importance of population growth and decline for society
7. summarize current patterns of global stratification
8. apply skills in critical thinking as well as research and writing to topics important to your own outlook and perspective

Question     Question     Correct    Mapped to    Difficulty    Item
(Version A)  (Version B)  Response   Objective                  Discrimination
 1           28           B          3                          low
 2           29           D          1                          low
 3           30           B          1,3
 4           31           D          1            difficult
 5           32           A          3                          low
 6           33           C          8                          low
 7           34           B          3            Very easy
 8            1           B          4            Very easy
 9           38           A          3
10            2           D          3                          low
11           13           B          5,7
12           14           B          8            Very easy     low
13           15           B          3                          low
14           16           B          4
15           17           B          2                          low
16           18           A          8            Very easy     low
17           19           C          4            Very easy
18           20           B          1            Very easy
19           21           B          8
20           22           C          1            Very easy     low
21           23           A          1                          low
22           24           A          3
23           26           D          3                          low
24           47           C          5,7
25           48           C          3            Very easy
26           27           C          3
27           35           D          6
28           36           B          4
29           37           A          3
30           39           A          1
31           40           D          8
32           41           A          3                          low
33           42           A          5,7          Very easy
34           43           D          5,7                        low
35            6           A          6                          low
36            7           B          3
37            5           C          3                          low
38            3           A          1
39            4           C          1            difficult
40            8           B          3            Very easy     low
41            9           B          1
42           10           C          1
43           11           B          1            Very easy     low
44           12           A          5,6,7
45           44           C          8                          low
46           45           A          3
47           46           B          1                          high
48           49           B          3
49           50           A          3
50           25           A          1            difficult
Table II: Mapping of Test Items to Course Goals and Objectives, and General Summary of Item Analysis

Item Analysis Report
Course: Sociology 100
Test: Sociology 100 Final Exam
Semester: Fall 2009

Section One: Psychometric Analysis

Part I: Background
Summary statistics, reliability calculations, item difficulty indices, item discrimination indices, and distributions of students' item responses are presented for the Sociology 100 Final Exam. The exam had three test forms—A, B, and make-up. The analyses below are for Form A. The sample sizes for Form B and the make-up were too small to allow for meaningful analysis. Form A reflects both CBT and paper-and-pencil data. The analysis was conducted using Classical Test Theory (CTT). A review of the test items' contents was also conducted. Combining the psychometric analysis with the content review, recommendations are made for improving the test's psychometrics by making modifications to the test content.

Part II: Summary Statistics
The sample size was 152. The minimum score in the sample was 18 out of 50. The maximum score in the sample was 50 out of 50. The mean score was 35.03 (SD = 5.92). Table 1 shows the summary statistics.

Count (n)            152
Minimum              18
Minimum Possible     0
Maximum              50
Maximum Possible     50
Median               34.50
Mean                 35.03
Standard Deviation   5.92
Table 1: Summary Statistics

Part III: Reliability (Coefficient Alpha)
Coefficient alpha estimates the internal consistency reliability of test scores. That is, combined overall, it is a measure of how much consistency there is among the items when measuring the students' knowledge. The values of coefficient alpha can range from zero to one. The closer the reliability coefficient is to one, the better the reliability. High reliability is especially desirable when the goal of the test is to measure a single dimension of knowledge, because the magnitude of coefficient alpha indicates the level of consistency among test items that measure similar content. For interpreting the value of coefficient alpha, the common rule of thumb is that when alpha is somewhere between 0.60 and 0.70, the test scores can be regarded as having acceptable reliability.
When coefficient alpha is above 0.80, the test scores can be regarded as highly reliable.

For this test, coefficient alpha is .774. This test has a good level of reliability.

Sometimes coefficient alpha can increase by removing a test item or items. Table 2 shows the overall alpha, identifies each test item, what alpha would be without that item, and what the change in alpha would be if that item were removed. Removing any single item would change alpha only trivially; for most items, removal would decrease reliability. No items should be removed based on the reliability analysis.

Item   Alpha without Item   Change in Alpha by Removing Item
       (Alpha = .774)
 1     .772                 -0.00191
 2     .773                 -0.00141
 3     .769                 -0.00474
 4     .768                 -0.00598
 5     .776                  0.00242
 6     .772                 -0.00187
 7     .771                 -0.00278
 8     .772                 -0.00245
 9     .772                 -0.00229
10     .773                 -0.00082
11     .767                 -0.00674
12     .773                 -0.00055
13     .773                 -0.00074
14     .771                 -0.00299
15     .774                  0.00008
16     .774                  0.00046
17     .770                 -0.00397
18     .770                 -0.00353
19     .772                 -0.00234
20     .772                 -0.00174
21     .775                  0.00116
22     .770                 -0.00399
23     .773                 -0.00146
24     .769                 -0.00498
25     .769                 -0.00490
26     .769                 -0.00472
27     .770                 —
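The alpha and alpha-if-item-deleted figures in Table 2 can be reproduced directly from a scored response matrix. The following is a minimal sketch, assuming item responses have already been scored 0/1 in a students-by-items NumPy array; the function names are illustrative, not part of any particular reporting tool:

```python
import numpy as np

def cronbach_alpha(scores):
    """Coefficient alpha for an (n_students, n_items) matrix of 0/1 item scores."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(scores):
    """Alpha recomputed with each item removed in turn (cf. Table 2)."""
    return [cronbach_alpha(np.delete(scores, i, axis=1))
            for i in range(scores.shape[1])]
```

A positive entry in the change column of Table 2 corresponds to `alpha_if_deleted(scores)[i]` exceeding the overall alpha.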

Item   Alpha without Item   Change in Alpha by Removing Item
       (Alpha = .774)
28     .769                 -0.00468
29     .771                 -0.00330
30     .769                 -0.00504
31     .769                 -0.00523
32     .774                 -0.00015
33     .770                 -0.00403
34     .774                 -0.00011
35     .773                 -0.00095
36     .766                 -0.00794
37     .774                  0.00027
38     .765                 -0.00921
39     .764                 -0.00969
40     .772                 -0.00151
41     .765                 -0.00890
42     .768                 -0.00641
43     .773                 -0.00084
44     .768                 -0.00609
45     .777                  0.00307
46     .769                 -0.00522
47     .759                 -0.01521
48     .768                 -0.00639
49     .768                 -0.00555
50     .768                 -0.00576
Table 2: Coefficient Alpha if a Single Item is Eliminated

Part IV: Item Difficulty
Next, the level of difficulty of each item was determined. To do this, a statistical measure called the difficulty index was used. The difficulty index (also called a p-value) represents the proportion of students who answered the item correctly out of the total number of students who answered the item. That is, item difficulty is calculated as the percentage correct for the total group of students on each item. Therefore, an item's difficulty or p-value can range from 0 to 1. The lower the p-value, the more difficult the item, because fewer students (a smaller proportion) answered it correctly. Based on the item difficulty analysis here, Items 7, 8, 12, 16, 17, 18, 20, 25, 33, 35, 40, and 43 are very easy; they have a proportion correct of 90% or higher. That means that 90% or more of the students who took the exam answered these items correctly. Items 39, 50, and 4 are difficult; they have proportions correct of 0.28, 0.24, and 0.38, respectively. It is the responsibility of the item writer to decide whether the difficulty level of the items is what they intended to achieve.
Table 3 presents the item difficulty levels for all items.

.00:  (none)
.10:  (none)
.20:  Items 39, 50
.30:  Item 4
.40:  Items 2, 3, 9, 11, 28, 49
.50:  Items 5, 6, 21, 23, 34, 38, 45, 47
.60:  Items 1, 19, 26, 29, 30, 44, 48
.70:  Items 36, 41, 42
.80:  Items 10, 13, 14, 15, 22, 24, 27, 31, 32, 37, 46
.90:  Items 7, 8, 12, 16, 17, 18, 20, 25, 33, 35, 40, 43
Table 3: Item Difficulty

Part V: Item Discrimination
In psychometrics, the discrimination index indicates how well a test item differentiates between two groups of scorers on the test – a group of high scorers and a group of low scorers. One would expect that the high scorers (on the overall test) would answer a particular item correctly more often than the low scorers. That is, it is probable that a student who scores high on the overall test answers any given item correctly. Item discrimination is calculated based on the correlation between a single item and overall test scores. The higher the discrimination index, the better the item discriminates between students who know the material and those students who do not know the material. A low discrimination index means that the test item does not have enough power to discriminate between students with low overall test performance and students with high overall test performance. Based on the item discrimination indexes, special attention should be paid to Items 5, 16, 45, 1, 2, 6, 10, 12, 13, 15, 20, 21, 23, 32, 34, 35, 37, 40, and 43, because their discrimination level is very low, and in some cases zero. The only item with a high level of discrimination is Item 47. Generally, these items do not perform well in discriminating between students who know the material and those who do not know the material. Item discrimination for all items is shown in Table 4.
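The difficulty and discrimination indices described above follow directly from the same scored 0/1 matrix. A sketch under the same assumptions as before; note the report does not state whether its discrimination index was corrected for item overlap, so the corrected item-total correlation used here is an assumption:

```python
import numpy as np

def difficulty(scores):
    """p-value per item: the proportion of students answering correctly."""
    return scores.mean(axis=0)

def discrimination(scores):
    """Item-total correlation per item, with the item removed from the
    total so an item is not correlated with itself."""
    totals = scores.sum(axis=1)
    return np.array([np.corrcoef(scores[:, i], totals - scores[:, i])[0, 1]
                     for i in range(scores.shape[1])])
```

For example, an item answered correctly by 90% or more of students would fall into the "very easy" .90 row of Table 3, regardless of how well it discriminates.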

.00:  Items 5, 16, 45
.10:  Items 1, 2, 6, 10, 12, 13, 15, 20, 21, 23, 32, 34, 35, 37, 40, 43
.20:  Items 3, 4, 7, 8, 9, 14, 17, 18, 19, 22, 24, 26, 27, 28, 29, 30, 31, 33, 44, 49, 50
.30:  Items 11, 25, 36, 38, 39, 41, 42, 46, 48
.40:  (none)
.50:  Item 47
.60:  (none)
.70:  (none)
.80:  (none)
.90:  (none)
Table 4: Item Discrimination

Part VI: Distribution of Students' Responses per Item
Table 5 below presents the percentage of students who chose each answer option for each item. The last column in Table 5 lists the incorrect answer options created by the item writers (also called distracters) that need to be reviewed. These distracters are flagged because less than 5% of all students chose them. From a psychometric perspective, if a distracter attracts only a very small group of students or no students, it is not performing its function. Distracters should be analyzed, as they can influence the quality of the item. It is important to make distracters feasible, so that a student must really know (not just guess at) the answer in order to get it correct. If nobody selects an answer option, even if it is wrong (a distracter), then the distracter is not distracting anybody, which means it is not a feasible option and should be changed. For example, Option A for Item 7 was not selected by any students, and the same is true of Option D for Item 8. The items in this test that have fully functioning distracters are Items 19, 21, 23, 28, 37, 38, 44, 45, 47, 48, and 49.
For all other test items, the distracters should be reviewed, and the item writers should change the unused distracters to be more feasible options. Table 5 also has two columns toward the right that indicate the difficulty index and the discrimination index for each item.

Item      A    B    C    D    Difficulty  Discrimination  Distracters to be Reviewed
Item 1    5%  63%  32%   1%   0.63        0.19            D
Item 2    2%  14%  36%  48%   0.48        0.18            A
Item 3    2%  41%   4%  53%   0.41        0.26            A and C
Item 4    4%  37%  22%  38%   0.38        0.29            A
Item 5   58%   9%   4%  29%   0.58        0.08            C
Item 6   13%  12%  49%  26%   0.49        0.19            A
Item 7    0%  91%   3%   5%   0.91        0.21            A and C
Item 8    1%  94%   5%   0%   0.94        0.21            A and D
Item 9   43%  32%  22%   3%   0.43        0.20            D
Item 10   1%  11%   1%  86%   0.86        0.14            A and C
Item 11  28%  48%  20%   3%   0.48        0.30            D
Item 12   3%  96%   0%   1%   0.96        0.11            A, C, and D
Item 13   5%  86%   1%   9%   0.86        0.13            C
Item 14   0%  84%  15%   1%   0.84        0.21            A and C
Item 15  15%  80%   4%   1%   0.80        0.12            C and D
Item 16  98%   1%   1%   0%   0.98        0.03            B, C, and D
Item 17   3%   3%  93%   1%   0.93        0.28            A, C, and D
Item 18   3%  95%   1%   1%   0.95        0.27            A, C, and D
Item 19  20%  62%  10%   9%   0.62        0.20
Item 20   3%   1%  95%   1%   0.95        0.18            A, B, and D
Item 21  54%   7%  33%   6%   0.54        0.12
Item 22  84%   6%   7%   3%   0.84        0.24

Item      A    B    C    D    Difficulty  Discrimination  Distracters to be Reviewed
Item 23  18%  19%   7%  55%   0.55        0.18
Item 24   1%  18%  80%   1%   0.80        0.27            A and D
Item 25   3%   4%  90%   3%   0.90        0.29            A, B, and D
Item 26  34%   0%  63%   3%   0.63        0.26            B and D
Item 27   3%   9%   5%  84%   0.84        0.23            A
Item 28  34%  44%  14%   8%   0.44        0.26
Item 29  66%   3%  16%  14%   0.66        0.22            B
Item 30  60%   5%  32%   3%   0.60        0.26            D
Item 31   3%  11%   7%  80%   0.80        0.27            A
Item 32  84%  15%   1%   1%   0.84        0.12            C and D
Item 33  90%   1%   0%   9%   0.90        0.26            B and C
Item 34  19%   9%  16%  55%   0.55        0.15            C
Item 35  97%   0%   3%   0%   0.97        0.14            B, C, and D
Item 36   3%  76%  16%   4%   0.76        0.35            A and D
Item 37   6%   5%  81%   9%   0.81        0.11
Item 38  54%  20%  16%  11%   0.54        0.36
Item 39   9%   3%  28%  61%   0.28        0.39            B
Item 40   5%  94%   1%   0%   0.94        0.16            C and D
Item 41  13%  75%  11%   2%   0.75        0.37            D
Item 42  11%   4%  71%  14%   0.71        0.30            B
Item 43   0%  98%   0%   2%   0.98        0.15            A, C, and D
Item 44  62%  11%   5%  22%   0.62        0.29
Item 45   5%  35%  56%   5%   0.56        0.07
Item 46  88%   5%   1%   6%   0.88        0.29            C
Item 47  34%  54%   5%   7%   0.54        0.50
Item 48  10%  62%  13%  16%   0.62        0.30
Item 49  47%  19%  22%  11%   0.47        0.28
Item 50  24%  18%  54%   3%   0.24        0.29            D
Table 5: Distribution of Student Responses for Each Item

Table 6 presents distributions of students' response options in more detail than Table 5. In Table 6, the distributions of students' responses correspond to test items and to students' different ability levels. The ability groups are determined based on students' total scores on the exam. As you can see for Item 1, there is an "upper" ability level, a "2nd" ability level, a "3rd" ability level, and so forth down to the "lower" ability level. With the proportion of answer options selected divided into ability groups, the test writer can see which distracters are working (drawing in the students who don't know the correct answer) and which distracters are not working (nobody selects them, or students who score high on the test seem to select the same wrong answer).
If all or many of the best test takers (i.e., high scorers) select a particular distracter, this indicates that there may be a double-keyed item (there are actually two correct answers to the item) or that the answer key being used to score the test is wrong. Items with responses in bold indicate an unusual pattern of responding based on ability level. For example, on Item 2 the 3rd (41%) and 4th (30%) highest ability groups chose D (the correct answer) less frequently than the lowest ability group (43%).
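The distracter screening in Table 5 and the ability-group breakdown in Table 6 can be sketched together. This is a hypothetical helper, assuming raw answer choices 'A' through 'D' and an answer key as inputs; the 5% threshold is the one stated in Part VI, and the five-way split by total score mirrors the ability groups above:

```python
import numpy as np

def distracter_report(choices, key, n_groups=5, flag_below=0.05):
    """For each item: the overall share of each option, the distracters chosen
    by fewer than `flag_below` of students (cf. Table 5), and the option
    shares within each ability group formed from total scores (cf. Table 6)."""
    choices, key = np.asarray(choices), np.asarray(key)
    totals = (choices == key).sum(axis=1)                   # each student's total score
    groups = np.array_split(np.argsort(-totals), n_groups)  # upper ... lower
    report = []
    for i in range(choices.shape[1]):
        shares = {opt: float((choices[:, i] == opt).mean()) for opt in "ABCD"}
        review = [opt for opt, s in shares.items()
                  if opt != key[i] and s < flag_below]
        by_group = [{opt: float((choices[g, i] == opt).mean()) for opt in "ABCD"}
                    for g in groups]
        report.append({"shares": shares, "review": review, "by_group": by_group})
    return report
```

A distracter whose share rises in the upper groups of `by_group` is the warning sign described above: a possible double-keyed item or a miskeyed answer.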

Item and Ability Level: proportion choosing options A B C D, by group (upper | 2nd | 3rd | 4th | lower)

Item 1    0.03 0.83 0.13 0.00 | 0.07 0.60 0.33 0.00 | 0.03 0.72 0.25 0.00 | 0.03 0.53 0.40 0.03 | 0.10 0.43 0.47 0.00
Item 2    0.00 0.03 0.23 0.73 | 0.07 0.03 0.37 0.53 | 0.03 0.16 0.41 0.41 | 0.00 0.23 0.47 0.30 | 0.00 0.27 0.30 0.43
Item 3    0.00 0.60 0.00 0.40 | 0.00 0.60 0.07 0.33 | 0.00 0.34 0.03 0.63 | 0.03 0.33 0.07 0.57 | 0.07 0.20 0.03 0.70
Item 4    0.00 0.10 0.23 0.67 | 0.00 0.13 0.40 0.47 | 0.03 0.59 0.06 0.31 | 0.13 0.53 0.07 0.27 | 0.03 0.47 0.33 0.17
Item 5    0.77 0.00 0.00 0.23 | 0.53 0.10 0.00 0.37 | 0.59 0.00 0.13 0.28 | 0.50 0.30 0.03 0.17 | 0.50 0.07 0.03 0.40
Item 6    0.07 0.07 0.77 0.10 | 0.10 0.13 0.47 0.30 | 0.22 0.06 0.53 0.19 | 0.20 0.10 0.40 0.30 | 0.03 0.23 0.30 0.43
Item 7    0.00 0.93 0.00 0.07 | 0.00 1.00 0.00 0.00 | 0.00 0.94 0.03 0.03 | 0.00 0.97 0.00 0.03 | 0.00 0.73 0.13 0.13
Item 8    0.00 1.00 0.00 0.00 | 0.00 0.97 0.03 0.00 | 0.00 1.00 0.00 0.00 | 0.03 0.83 0.13 0.00 | 0.03 0.90 0.07 0.00
Item 9    0.67 0.17 0.17 0.00 | 0.50 0.27 0.20 0.03 | 0.28 0.41 0.31 0.00 | 0.43 0.37 0.17 0.03 | 0.27 0.40 0.27 0.07
Item 10   0.00 0.07 0.00 0.93 | 0.00 0.07 0.00 0.93 | 0.00 0.13 0.00 0.88 | 0.00 0.17 0.03 0.80 | 0.07 0.13 0.03 0.77
Item 11   0.07 0.87 0.07 0.00 | 0.27 0.50 0.23 0.00 | 0.31 0.41 0.22 0.06 | 0.33 0.37 0.27 0.03 | 0.43 0.27 0.23 0.07
Item 12   0.00 1.00 0.00 0.00 | 0.00 1.00 0.00 0.00 | 0.00 0.97 0.00 0.03 | 0.07 0.90 0.00 0.03 | 0.07 0.93 0.00 0.00
Item 13   0.07 0.93 0.00 0.00 | 0.00 1.00 0.00 0.00 | 0.09 0.81 0.00 0.09 | 0.03 0.73 0.03 0.20 | 0.03 0.80 0.03 0.13
Item 14   0.00 0.97 0.03 0.00 | 0.00 0.83 0.17 0.00 | 0.00 0.88 0.13 0.00 | 0.00 0.90 0.10 0.00 | 0.00 0.63 0.33 0.03
Item 15   0.07 0.93 0.00 0.00 | 0.20 0.80 0.00 0.00 | 0.22 0.72 0.06 0.00 | 0.13 0.83 0.00 0.03 | 0.13 0.70 0.13 0.03
Item 16   1.00 0.00 0.00 0.00 | 1.00 0.00 0.00 0.00 | 0.97 0.03 0.00 0.00 | 0.93 0.03 0.03 0.00 | 1.00 0.00 0.00 0.00
Item 17   0.00 0.00 1.00 0.00 | 0.03 0.00 0.97 0.00 | 0.00 0.00 1.00 0.00 | 0.03 0.00 0.93 0.03 | 0.07 0.13 0.77 0.03
Item 18   0.00 1.00 0.00 0.00 | 0.00 1.00 0.00 0.00 | 0.03 0.97 0.00 0.00 | 0.03 0.97 0.00 0.00 | 0.10 0.80 0.07 —

Item and Ability Level: proportion choosing options A B C D, by group (upper | 2nd | 3rd | 4th | lower)

Item 19   0.17 0.77 0.03 0.03 | 0.10 0.70 0.13 0.07 | 0.25 0.66 0.06 0.03 | 0.30 0.53 0.10 0.07 | 0.17 0.43 0.17 0.23
Item 20   0.00 0.00 1.00 0.00 | 0.03 0.00 0.97 0.00 | 0.03 0.03 0.94 0.00 | 0.00 0.00 0.97 0.03 | 0.10 0.00 0.90 0.00
Item 21   0.70 0.03 0.27 0.00 | 0.50 0.07 0.40 0.03 | 0.63 0.06 0.28 0.03 | 0.50 0.17 0.27 0.07 | 0.37 0.03 0.43 0.17
Item 22   1.00 0.00 0.00 0.00 | 0.87 0.03 0.10 0.00 | 0.78 0.13 0.09 0.00 | 0.90 0.00 0.00 0.10 | 0.63 0.13 0.17 0.07
Item 23   0.13 0.03 0.00 0.83 | 0.20 0.17 0.13 0.50 | 0.13 0.38 0.00 0.50 | 0.20 0.17 0.13 0.50 | 0.27 0.20 0.10 0.43
Item 24   0.00 0.00 1.00 0.00 | 0.00 0.10 0.90 0.00 | 0.00 0.22 0.78 0.00 | 0.00 0.23 0.70 0.07 | 0.03 0.33 0.63 0.00
Item 25   0.00 0.00 1.00 0.00 | 0.00 0.00 1.00 0.00 | 0.03 0.03 0.84 0.09 | 0.03 0.03 0.93 0.00 | 0.10 0.13 0.73 0.03
Item 26   0.13 0.00 0.87 0.00 | 0.43 0.00 0.57 0.00 | 0.25 0.00 0.75 0.00 | 0.37 0.00 0.60 0.03 | 0.53 0.00 0.37 0.10
Item 27   0.03 0.07 0.00 0.90 | 0.00 0.03 0.00 0.97 | 0.03 0.00 0.06 0.91 | 0.03 0.13 0.03 0.80 | 0.03 0.23 0.13 0.60
Item 28   0.07 0.77 0.13 0.03 | 0.40 0.47 0.10 0.03 | 0.41 0.38 0.16 0.06 | 0.33 0.37 0.13 0.17 | 0.47 0.23 0.20 0.10
Item 29   0.83 0.03 0.13 0.00 | 0.77 0.03 0.10 0.10 | 0.69 0.00 0.19 0.13 | 0.57 0.07 0.13 0.23 | 0.47 0.03 0.23 0.27
Item 30   0.83 0.00 0.17 0.00 | 0.67 0.03 0.30 0.00 | 0.63 0.00 0.34 0.03 | 0.50 0.10 0.40 0.00 | 0.37 0.13 0.40 0.10
Item 31   0.00 0.00 0.00 1.00 | 0.00 0.10 0.10 0.80 | 0.00 0.09 0.03 0.88 | 0.10 0.13 0.07 0.70 | 0.03 0.20 0.17 0.60
Item 32   0.93 0.07 0.00 0.00 | 0.83 0.13 0.00 0.03 | 0.91 0.09 0.00 0.00 | 0.80 0.20 0.00 0.00 | 0.70 0.27 0.03 0.00
Item 33   0.97 0.00 0.00 0.03 | 1.00 0.00 0.00 0.00 | 0.91 0.00 0.00 0.09 | 0.90 0.00 0.00 0.10 | 0.73 0.03 0.00 0.23
Item 34   0.07 0.07 0.17 0.70 | 0.17 0.00 0.20 0.63 | 0.13 0.09 0.19 0.59 | 0.30 0.13 0.13 0.43 | 0.30 0.17 0.13 0.40
Item 35   1.00 0.00 0.00 0.00 | 1.00 0.00 0.00 0.00 | 0.97 0.00 0.03 0.00 | 0.97 0.00 0.03 0.00 | 0.90 0.00 0.10 0.00
Item 36   0.00 0.97 0.03 0.00 | 0.03 0.90 0.07 0.00 | 0.03 0.78 0.19 0.00 | 0.03 0.73 0.20 0.03 | 0.07 0.43 0.33 —

Item and Ability Level: proportion choosing options A B C D, by group (upper | 2nd | 3rd | 4th | lower)

Item 37   0.00 0.03 0.90 0.07 | 0.07 0.00 0.87 0.07 | 0.09 0.06 0.72 0.13 | 0.03 0.10 0.83 0.03 | 0.10 0.03 0.73 0.13
Item 38   0.93 0.03 0.00 0.03 | 0.70 0.20 0.03 0.07 | 0.44 0.19 0.28 0.09 | 0.33 0.23 0.27 0.17 | 0.30 0.33 0.20 0.17
Item 39   0.07 0.00 0.63 0.30 | 0.07 0.03 0.33 0.57 | 0.13 0.03 0.22 0.63 | 0.10 0.07 0.13 0.70 | 0.07 0.00 0.07 0.87
Item 40   0.00 1.00 0.00 0.00 | 0.00 1.00 0.00 0.00 | 0.06 0.91 0.03 0.00 | 0.07 0.93 0.00 0.00 | 0.13 0.87 0.00 0.00
Item 41   0.00 1.00 0.00 0.00 | 0.03 0.93 0.03 0.00 | 0.16 0.75 0.09 0.00 | 0.23 0.60 0.17 0.00 | 0.20 0.47 0.23 0.10
Item 42   0.07 0.00 0.90 0.03 | 0.00 0.00 0.93 0.07 | 0.09 0.00 0.66 0.25 | 0.17 0.07 0.63 0.13 | 0.23 0.13 0.43 0.20
Item 43   0.00 1.00 0.00 0.00 | 0.00 1.00 0.00 0.00 | 0.00 1.00 0.00 0.00 | 0.00 0.93 0.00 0.07 | 0.00 0.97 0.00 0.03
Item 44   0.87 0.10 0.00 0.03 | 0.73 0.07 0.03 0.17 | 0.66 0.09 0.03 0.22 | 0.43 0.17 0.13 0.27 | 0.40 0.13 0.07 0.40
Item 45   0.00 0.30 0.70 0.00 | 0.03 0.37 0.53 0.07 | 0.09 0.38 0.50 0.03 | 0.00 0.30 0.63 0.07 | 0.10 0.40 0.43 0.07
Item 46   1.00 0.00 0.00 0.00 | 0.93 0.00 0.00 0.07 | 0.94 0.00 0.03 0.03 | 0.80 0.07 0.00 0.13 | 0.70 0.20 0.03 0.07
Item 47   0.07 0.93 0.00 0.00 | 0.17 0.77 0.00 0.07 | 0.44 0.44 0.06 0.06 | 0.43 0.43 0.07 0.07 | 0.60 0.13 0.13 0.13
Item 48   0.00 0.87 0.10 0.03 | 0.13 0.73 0.10 0.03 | 0.09 0.59 0.16 0.16 | 0.13 0.50 0.13 0.23 | 0.13 0.40 0.13 0.33
Item 49   0.77 0.17 0.00 0.07 | 0.70 0.07 0.13 0.10 | 0.44 0.28 0.13 0.16 | 0.10 0.23 0.53 0.13 | 0.37 0.20 0.33 0.10
Item 50   0.43 0.07 0.50 0.00 | 0.40 0.17 0.43 0.00 | 0.22 0.16 0.59 0.03 | 0.13 0.33 0.50 0.03 | 0.03 0.20 0.67 0.10
Table 6: Distribution of Responses per Item and Ability Groups of Students

Part VII: Average Total Scores by Ability Groups

Ability Group   n     M     Avg. %   SD    MIN   MDN   MAX
Upper           30    43.6  87%      2.1   41    43    50
2nd             30    38.3  77%      1.3   36    38    41
3rd             32    34.7  69%      0.8   33    35    36
4th             30    31.8  64%      1.0   30    32    33
Lower           30    26.8  54%      2.8   18    28    30
Whole Group     152   35.0  70%      5.9   18    35    50
Table 7: Average Total Scores by Ability Group

Section Two: Suggestions for Content Review

While the reliability of the test is good, the test items are generally very easy and lack discrimination. There is a clear issue with unused distracters for many of the test items. Reviewing the unused distracters for each item, and revising their content so that they are more feasible options, should increase the difficulty of the items and improve their discrimination. The primary suggestion for improving the test is to re-write the distracters that are used less than 5% of the time, such that they are more feasible options.

Appendix G
Example of a Graduate Program Assessment Report: Master of Science in Accounting Technology (2010)

Learning Outcomes Assessment Report
Program: MSAT
Courses: ACCT 608 and ACCT 610
Term: Spring 2010

I. Program Learning Outcomes Assessed
• Critical Thinking (THIN) – ACCT 608
• Technology Fluency (TECH) – ACCT 610
• Content Knowledge (KNOW) – ACCT 608
• Written Communication (COMM) – ACCT 610

II. Methods
• Critical Thinking: n = 40. The GSMT standard Critical Thinking rubric was used. The rubric uses 5 criteria scored on a scale of 1 to 4. A principal components analysis revealed that the rubric measured a single dimension that accounted for 80.8% of the variability in the scores. That is evidence to support the construct validity of the scores. Likewise, the internal consistency (reliability) of the scores was high: α = .94.
• Technology Fluency: n = 28. The GSMT standard Technology Fluency rubric was used. The rubric uses 4 criteria scored on a scale of 1 to 4. A principal components analysis revealed that the rubric measured two dimensions that accounted for 63% of the variability in the scores. Criteria 1 and 3 combine into one factor, while Criteria 2 and 4 combine into another factor. That is evidence that the rubric is measuring two separate constructs, rather than a single construct, "Technology Fluency." The internal consistency (reliability) of the scores, whether the criteria are combined as one construct or separated into the two identified factors, is extremely low; the maximum is α = .07, which, as would be expected, reflects all criteria combined as one. See Section IV below.
• Content Knowledge: n = 40. A Content Knowledge rubric for accounting was used. The rubric uses 4 criteria scored on a scale of 1 to 4. A principal components analysis revealed that the rubric measured a single dimension that accounted for 83.7% of the variability in the scores. That is evidence to support the construct validity of the scores. Likewise, the internal consistency (reliability) of the scores was high: α = .93.
• Written Communication: n = 62. A Written Communication rubric was used. The rubric uses 5 criteria scored on a scale of 1 to 4.
A principal components analysis revealed that the rubric measured a single dimension that accounted for 52.3% of the variability in the scores. That is weak evidence in support of the construct validity of the scores. The internal consistency (reliability) of the scores was high: α = .77. See Section IV below.

III. Results

Critical Thinking
• Criterion 1 – Conceptualization: Observes and describes given information (data, ideas, or concepts) in relation to the context of the question or assignment.
• Criterion 2 – Analysis: Analyzes given information in a logical and organized manner to examine how ideas are developed and interconnected. Identifies embedded hypotheses, biases, causalities, and conclusions.
• Criterion 3 – Synthesis: Incorporates own analyses with information or evidence drawn from other resources or prior learning to connect key concepts in a coherent way.
• Criterion 4 – Conclusion: Integrates previous criteria and own perspective(s) to formulate conclusion(s) or new hypotheses that are appropriate to the context of the question or assignment.
• Criterion 5 – Evaluation: Evaluates own conclusions or new hypotheses by considering the issues of reliability, need for further support, and implications within a larger context.

Descriptive Statistics
Criterion          n    Min  Max  M      SD
Conceptualization  40   1    4    3.43   .78
Analysis           40   1    4    3.20   .90
Synthesis          40   1    4    3.03   1.0
Conclusion         40   1    4    2.87   .97
Evaluation         40   1    4    2.64   1.10
Total              40   4    20   15.16  4.31

Technology Fluency
• Criterion 1 – Technology Mastery: Creates accurate electronic documents and/or materials (i.e., technology-enhanced presentations) with technologies appropriate for the assignment or context, as evidenced in the layout, formatting, and accuracy of documents/presentations.
• Criterion 2 – Information Retrieval: Utilizes technology to research, evaluate, inform, and communicate information retrieved from appropriate resources.
• Criterion 3 – Virtual Collaboration: Engages in electronic collaboration (email, online conferences, chats, or web meetings), as appropriate to the assignment or context.
• Criterion 4 – Technology Management: Shows consideration of legal, ethical, privacy, and security laws that may apply when using technology, handling and exchanging information, and working virtually, as appropriate for the assignment or context.

Descriptive Statistics
Criterion               n    Min  Max  M      SD
Technology Mastery      28   3    4    3.46   .51
Information Retrieval   28   2    4    3.36   .62
Virtual Collaboration   28   1    4    3.61   .99
Technology Management   28   3    4    3.75   .44
Total                   28   10   16   14.18  1.39

Content Knowledge
• Criterion 1 – Conceptual Understanding: Demonstrates an understanding of discipline theories and concepts presented in the course and relevant to the assignment criteria.
• Criterion 2 – Theory Application: Exhibits an ability to apply theories to practical problems within the context of the assignment.
• Criterion 3 – Experience-based Understanding: Shows evidence of personal practice or application of theories and concepts presented in the course, in ways relevant to the assignment.
• Criterion 4 – Knowledge Integration: Integrates current learning with prior learning to further demonstrate a comprehensive understanding of the subject matter.

Descriptive Statistics
Criterion                       n    Min  Max  M      SD
Conceptual Understanding        40   2    4    2.98   .70
Theory Application              40   1    4    2.73   .75
Experience-based Understanding  40   1    4    2.35   .89
Knowledge Integration           40   1    4    2.63   .74
Total                           40   6    16   10.68  2.82

Written Communication
• Criterion 1 – Context/Purpose: Considers the audience, purpose, and the circumstances surrounding the writing assignment(s).
• Criterion 2 – Content/Ideas/Support: Articulates and supports main idea(s) consistent with context and purpose.
• Criterion 3 – Organization: Uses logical sequencing, including introduction, transitions between paragraphs, and summary/conclusion, to develop main idea(s) and content.
• Criterion 4 – Sources: Incorporates use of and identifies sources and/or research, according to APA and/or instructor guidelines.
• Criterion 5 – Word Usage/Grammar/Spelling/Punctuation: Uses wording, grammar, spelling, and punctuation accurately and correctly.

Descriptive Statistics
Criterion                                n    Min  Max  M      SD
Context/Purpose                          62   2    4    3.63   .61
Content/Ideas/Support                    62   1    4    3.48   .70
Organization                             62   2    4    3.34   .70
Sources                                  62   2    4    2.98   .22
Word Usage/Grammar/Spelling/Punctuation  62   1    4    3.23   .78
Total Score                              62   10   20   16.66  2.27

Overall
Among all of the outcomes assessed, Content Knowledge was the weakest for the students (M = 10.68, SD = 2.82, with a maximum possible score of 16). Technology Fluency was the strongest outcome for the students, with a mean of 14.18 (SD = 1.39), where the maximum possible score was 16. Written Communication had a mean of 16.66 (SD = 2.27), with a maximum possible score of 20. The weakest areas within Written Communication were Content/Ideas/Support and Word Usage/Grammar/Spelling/Punctuation. Critical Thinking had a mean of 15.16 (SD = 4.31). The weakest areas within Critical Thinking were Conclusion and Evaluation.

IV. Next Steps
• In terms of measurement quality, the Critical Thinking and Content Knowledge rubrics are excellent. However, for all rubric scoring, a multi-rater-per-student assignment method is recommended to establish inter-rater reliability.
• The Technology Fluency rubric should be carefully reconsidered in terms of content to ensure that the rubric measures a single dimension, rather than two dimensions. This should help build the validity and reliability of the scores. Content development advice can be provided by Institutional Planning, Research, and Development, if the program wants that support.

• The Written Communication rubric should also be carefully reconsidered, to improve the amount of variance accounted for in the scores (i.e., reduce the amount of error in the scores). One approach is likely to help: clarify any wording or language within the rubric that may be unclear to the raters using it. The individual raters should first do this. Then, through a norming session, which Institutional Planning, Research, and Accountability can run with the faculty, a more consistent interpretation of the rubric can be assured. This should help build the validity and reliability of the scores.
• In terms of learning weaknesses, the Content Knowledge area is weakest. The best first steps to improve the scores are to ensure that the course objectives align to the program outcome and, subsequently, to the measure being used to address that outcome. A misalignment in any of these three parts can lower scores. Additionally, steps should be taken to ensure that the course content addressing each relevant objective is sufficiently covered and learned prior to the outcomes assessment. This should be done through a series of formative/in-class assessments that build from the course objectives up (aggregate) to the program outcome being measured.
• Disseminate results to all appropriate stakeholders (e.g., faculty, directors, chairs) to identify changes needed to improve measures and student learning, and to establish benchmarks using the results above, so future results can be compared to determine if improvements in learning are being reached.
• When reporting data, each student's scores should include the student's EMPLID. This will allow for the alignment of other student data (within the university's data warehouse) with their learning outcomes assessment scores, to help establish validity.
• Use multiple raters for each student assignment to establish inter-rater reliability.

V. Program Additions
• The program will add its comments and intended actions, with a timeline, to complete this report.
This will likely include, at minimum, addressing the information in Section IV and anything additional the program does. When complete, the final report should go to John Aje, copy Dan.
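The dimensionality check described in the Methods section (a principal components analysis of rubric scores, with the first component's share of variability used as evidence of a single dimension) can be sketched as follows. This is an illustrative reconstruction only; the report does not specify the software or exact procedure used, so treat the helper below as an assumption:

```python
import numpy as np

def rubric_dimensions(scores):
    """Share of score variability explained by each principal component of
    the correlation matrix of an (n_students, n_criteria) rubric-score
    matrix. A dominant first share (e.g., 80.8% for Critical Thinking)
    suggests the rubric measures a single dimension."""
    corr = np.corrcoef(scores, rowvar=False)   # criteria x criteria correlations
    eigvals = np.linalg.eigvalsh(corr)[::-1]   # eigenvalues, largest first
    return eigvals / eigvals.sum()
```

Two comparably large leading shares, as reported for the Technology Fluency rubric, would instead point to two separate constructs.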

Appendix H
ETS Proficiency Profile (EPP) Implementation Plan

OVERVIEW
One measure being used to assess three SLEs (i.e., critical thinking, written communication, and quantitative reasoning) in the School of Undergraduate Studies is the ETS Proficiency Profile, created and administered by Educational Testing Service (ETS). The EPP was formerly named the Measure of Academic Proficiency and Progress (MAPP) and, earlier, the Academic Profile. Though the name of the test has changed over its approximately 20-year history, the content and psychometric properties have not changed.

EPP CONTENT
The EPP is a standardized test that uses multiple-choice items to measure four constructs that are typically considered components of a general education curriculum. The content areas include writing skills (i.e., the written communication SLE), mathematics (i.e., the quantitative reasoning SLE), critical thinking (i.e., the critical thinking SLE), and reading (not an SLE). These four areas are measured in the context of three content areas—humanities, social science, and natural science. That is, the questions on writing skills, mathematics, critical thinking, and reading use content (e.g., reading passages) from the traditional distribution areas of a general education curriculum (i.e., humanities, social science, and natural science).

From the MAPP User's Guide (2007):

Reading questions measure students' ability to:
1. interpret the meaning of key terms
2. recognize the primary purpose of the passage
3. recognize explicitly presented information
4. make appropriate inferences
5. recognize rhetorical devices

Writing questions measure students' ability to:
1. recognize the most grammatically correct revision of a clause, sentence, or group of sentences
2. organize units of language for coherence and rhetorical effect
3. recognize and reword figurative language
4. organize elements of writing into larger units of meaning

Critical Thinking questions measure students' ability to:
1. distinguish between rhetoric and argumentation in a piece of nonfiction prose
2. recognize assumptions
3. recognize the best hypothesis to account for information presented
4. infer and interpret a relationship between variables
5. draw valid conclusions based on information presented

Mathematics questions measure students' ability to:
1. recognize and interpret mathematical terms
2. read and interpret tables and graphs
3. evaluate formulas
4. order and compare large and small numbers
5. interpret ratios, proportions, and percentages
6. read scientific measuring instruments
7. recognize and use equivalent mathematical formulas or expressions

In addition to the cognitive dimensions measured by the EPP, the following demographic information is also gathered by ETS: (1) completed credit hours, (2) transfer credit, (3) portion of general education completed, (4) communication language, (5) enrollment status, (6) type of program, (7) age, (8) major, (9) number of courses taken, (10) race, (11) gender, and (12) hours spent working.

ADDING CONTENT TO EPP
In addition to the content created by ETS for the EPP, UMUC can add up to 50 of its own multiple-choice questions and nine demographic questions. These questions can be used, either as a stand-alone group or in conjunction with another internal measure, to assess general education constructs not otherwise measured.

UMUC is adding the following demographic questions:
1. Marital Status, with the answer options "Single," "Separated," "Divorced," "Widowed," and "Married."
2. Income, with the answer options "$0 to $25,000," "$25,001 to $50,000," "$50,001 to $75,000," "$75,001 to $100,000," and "Over $100,000."
3. Department, with the answer options "Business and Professional Programs (BAPP)," "Communications, Arts, and Humanities (COMM)," "Computer Information Systems and Technology (CITE)," and "Social, Behavioral, Natural, and Mathematical Sciences (SCIP)."

UMUC is also adding the following self-assessment questions to serve as an indirect measure of student learning. That is, students will self-assess their ability in each area.

1. How would you rate your current level of writing skills, including recognizing word choice and agreement among basic grammatical elements; combining multiple, simple sentences into complex sentences; and discriminating between complex written elements?
4 = Excellent (I can perform all of these writing skills.)
3 = Good (I can perform most, but not all, of these writing skills.)
2 = Fair (I can perform some of these writing skills.)
1 = Poor (I can perform few or none of these writing skills.)

2. How would you rate your current level of math skills, including solving problems involving whole numbers, fractions, and percents; word problems, algebraic equations, ratios, and graphs; and complex problems involving exponents and square roots?
4 = Excellent (I can perform all of these math skills.)
3 = Good (I can perform most, but not all, of these math skills.)
2 = Fair (I can perform some of these math skills.)
1 = Poor (I can perform few or none of these math skills.)

3. How would you rate your current level of critical thinking skills, including recognizing flaws and inconsistencies in a written argument, evaluating alternative explanations in an argument, and determining the relevance of information in drawing conclusions?
4 = Excellent (I can perform all of these critical thinking skills.)
3 = Good (I can perform most, but not all, of these critical thinking skills.)
2 = Fair (I can perform some of these critical thinking skills.)
1 = Poor (I can perform few or none of these critical thinking skills.)

EPP VALIDITY AND RELIABILITY
Studies of the Academic Profile and MAPP have demonstrated the content, construct, and predictive validity of the test. In studies conducted by ETS, the abbreviated form of what is now the EPP has a reliability coefficient of .77 for the overall test.

EPP ABBREVIATED FORM
The abbreviated form of the EPP is being used at UMUC. The abbreviated form has 36 questions.
Due to the matrix-sampling approach employed by ETS with the abbreviated form of the EPP, at least 50 test takers must be included in the sample. The abbreviated form can be completed in one 40-minute session.

EPP ADMINISTRATION

Proctoring
UMUC will administer the abbreviated EPP online without a proctor.

Overview of Sampling
All student participants in the EPP test will be recruited according to demographic criteria and quotas using a stratified sampling procedure to obtain a representative sample based on (1) age, (2) gender, (3) ethnicity, (4) military status, and (5) residency (in-state/out-of-state).

Entering-Level Students: The goal is to obtain an overall sample of 300 to 500 participants. For the entering-student sampling, the population is all entering students who are new to UMUC. From this population, a randomly selected sample broadly representative of the population will be taken. Representation of the population in the sample will be ensured by comparing demographic information on age, gender, ethnicity, military status, and residency.

Exiting/Graduating Students: Each year UMUC will identify those students who have applied for graduation in their third spring semester and take a subset of those who have completed 60 or more credits at UMUC. This will comprise the population from which the sample can be drawn. The goal will be to sample 100 to 200 exiting-level students. As with the entering-level students, a randomly selected sample broadly representative of the entire exiting population will be taken. Representation of the population in the sample will be ensured by comparing demographic information on age, gender, ethnicity, military status, and residency. Additionally, the measurement of the sampled exiting students will be done on a department-by-department basis, with different departments assessed each spring. Therefore, identifying students' majors is also necessary to identify the students for the sample.
The intention is to start with the largest department (by head count), which is Business and Professional Programs, followed by Computer Information Systems and Technology; Communications, Arts, and Humanities; and Social, Behavioral, Natural, and Mathematical Sciences.
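The stratified sampling procedure described above can be sketched in code. This is an illustrative sketch only, assuming proportional allocation across strata; the function name, data layout, and rounding rule are assumptions, not part of the plan.

```python
import random
from collections import defaultdict

def stratified_sample(students, keys, target_size, seed=0):
    """Draw a sample whose strata proportions mirror the population's.

    students: list of dicts with demographic fields
    keys: fields defining the strata, e.g. ("age_band", "gender",
          "ethnicity", "military_status", "residency")
    target_size: desired overall sample size (e.g. 300-500 entering students)
    """
    rng = random.Random(seed)

    # Group the population into strata by the chosen demographic keys.
    strata = defaultdict(list)
    for s in students:
        strata[tuple(s[k] for k in keys)].append(s)

    sample = []
    n = len(students)
    for members in strata.values():
        # Proportional allocation: each stratum contributes a share of the
        # sample equal to its share of the population (rounded).
        quota = round(len(members) / n * target_size)
        sample.extend(rng.sample(members, min(quota, len(members))))
    return sample
```

Representativeness can then be checked, as the plan states, by comparing the sample's demographic distribution against the population's.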

Logistics of Sampling
UMUC, in collaboration with a specialized survey research center, will recruit, schedule, and compensate participants to take part in the EPP. Letters and e-mails will be used to ask students to volunteer for the sample. To recruit sampled students, an informational letter about UMUC's EPP testing will be mailed. This letter will be for information purposes only, alerting students that they may be selected to take the EPP test and that they should watch for an e-mail about taking it. The letter will also explain the purpose of the test and the rewards for participation. Students who are selected for the final sample will receive a subsequent e-mail with instructions on how to log in to the ETS Web site for the EPP test and how to complete the test.

Incentives for Participation
Because participation is voluntary, which is necessary for sampling at the student level, a program of incentives will be used to encourage selected students to take the test and to approach it seriously. First, any student completing the EPP will receive a $50 gift card. Given that individual students cannot be classified according to EPP proficiency levels (i.e., criterion-based scoring) using the abbreviated test form, overall scaled scores will be used to determine the level of reward for each student. Therefore, the percentile rank of each student relative to all EPP test takers at UMUC will be considered in determining his or her reward level.
The reward levels will be as follows: (1) a score in the top 25 percent will earn a $150 gift card to the retailer of the participant's choice, and (2) a score in the top 26 percent to 50 percent will earn a $100 gift card to the retailer of the participant's choice.

SCORE REPORTING
With the online test, scores can be reported immediately. However, UMUC has decided to withhold students' scores and inform students later what their scores are, whether or not they received an award, how much their award is (if applicable), and what assistance is available to help them improve performance.

The EPP has the following norm-referenced scores: a total score, skills subscores (critical thinking, reading, writing, and mathematics), and context subscores (humanities, social sciences, and natural sciences). The skills and context subscores are not available for individual students with the abbreviated form of the EPP.

To have enough data to report for each demographic subgroup with the abbreviated form, there must be 50 participants in each subgroup. The subgroup classifications included in the EPP are completed credit hours, transfer credit, portion of general education completed, communication language, enrollment status, type of program, age, major, number of courses taken, race, gender, and hours spent working. However, UMUC's primary concern is to meet the sampling requirements for the subgroups it has identified: (1) age, (2) gender, (3) ethnicity, (4) military status, and (5) residency (in-state/out-of-state), as described above. The sampling approach being implemented will accomplish this.

USING EPP DATA
The data from the EPP can generally be used in one of two ways: for longitudinal analysis and for cross-sectional analysis. UMUC will use the data in both of these ways. With an administration to entering students in every third fall term and an administration to graduating students in every third spring term, a cross-sectional comparison of entering students to graduating students will be conducted to determine the skill levels gained by each graduating class. This type of analysis is commonly referred to as "value-added" analysis and can be used for internal purposes, such as curricular change. To maximize the validity of the results from this analysis, students from the entering and graduating samples will be matched on a selected set of characteristics. Matching will help control for intervening variables outside of the UMUC curriculum. In addition, longitudinal studies will be conducted year after year to compare graduating classes. This will determine whether there are gains in graduate performance over time and help indicate whether changes are needed in the undergraduate general education curriculum that would lead to improved EPP scores.

For the EPP, both norm-referenced scores (scaled scores) and criterion-referenced scores (proficiency classifications) are used. The norm-referenced scores are used to compare the scores of one student or group of students to another, or to compare the same student or group at different points in time. These scores can be used for benchmarking or for determining value-added learning gains by conducting a longitudinal or cross-sectional study.
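The percentile-based reward tiers described under Incentives for Participation amount to a simple mapping from a participant's scaled score to a gift-card amount. The sketch below is illustrative only: the function name and tie handling are assumptions, as is the treatment of the $150/$100 tier award as superseding the base $50 completion card (the plan does not say whether the awards stack).

```python
def reward_tier(score, all_scores):
    """Map an EPP scaled score to a gift-card amount.

    Percentile rank is computed relative to all EPP test takers at UMUC,
    per the plan: top 25 percent -> $150, top 26-50 percent -> $100,
    any completer -> $50.
    """
    # Percentile rank: percentage of test takers scoring strictly below.
    below = sum(1 for s in all_scores if s < score)
    percentile = 100 * below / len(all_scores)

    if percentile >= 75:   # top 25 percent
        return 150
    if percentile >= 50:   # top 26 to 50 percent
        return 100
    return 50              # base award for completing the test
```

For example, against a pool of 100 distinct scores, the highest scorer falls in the top 25 percent and would receive the $150 card, while a median scorer would receive $50.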
