
Understanding the Impact of Skill Acquisition: Relating Diagnostic Assessments to Measurable Outcomes

Jonathan Templin
The University of Georgia

National Council for Measurement in Education, 2008


Outline

● Discussion of how Cognitive Diagnosis Models (CDMs) can aid in determining the best path to follow when teaching students.
● Relating CDMs to other important information.
  ✦ Example end-of-grade test.
● CDMs used as formative tests:
  ✦ How CDMs can be related to other tests.
  ✦ Types of outcomes that can be expected.
  ✦ Using information from CDMs to improve student performance.


Attributes Measured by Test

● Given the setting for our example, the benchmark test created was built to measure five basic skills in mathematics:
  1. Operate with algebraic expressions (polynomial, rational, complex fractions) to solve problems.
  2. Use the composition and inverse of functions to model and solve problems.
  3. Use quadratic functions and inequalities to model and solve problems.
  4. Create and use best-fit mathematical models of linear, exponential, and quadratic functions to solve problems involving sets of data.
  5. Use equations and inequalities with absolute value to model and solve problems.


Hypothetical Testing Cycle

[Diagram of the hypothetical testing cycle]


Potential Benefits to CDMs

● By constructing the benchmark test to measure attributes with CDMs, many benefits can be realized:
  ✦ Student information comes from a profile of skills mastered.
  ✦ The skills needed to perform well on the EOC test can be measured directly.
  ✦ When developed as part of a curriculum, the benchmark test can lead directly to the EOC test.
  ✦ Such development can allow educators to understand how much gain in an EOC test score is associated with students' mastery of any given skill underlying the content of the test.
● This talk presents an example of what could be provided by linking a benchmark test with an EOC test.


Linking CDMs to Outcome Measures

● To link the diagnostic benchmark test and the EOC test, a joint calibration of both tests is needed:
  1. Administer the EOC items simultaneously with the benchmark items.
     ✦ Examinees should take both tests at the same time.
     ✦ This will enable us to understand how certain skills relate to the scores on the EOC test.
     ✦ The EOC test does not have to be new.
     ✦ Linking does not have to occur at the item level (the total test score can be used).
  2. Determine the skill patterns of all examinees.
  3. Relate the skill patterns to the EOC test score.


Demonstrating the Potential of CDMs

● To demonstrate the potential of CDM analysis estimates to inform teachers, we include a hypothetical example.
● We take the 25-item benchmark test (described in the previous slides) as the test where examinees' knowledge states are assessed.
  ✦ The same five skills are measured by the test.
  ✦ Skill acquisition could be considered meeting mastery standards.
● We link the attributes of the test to the general ability measured by the EOC exam.


Example Assumptions

● We made several assumptions in constructing this example.
● The biggest concerns the association between skill mastery and EOC ability:

  Skill   Proportion of Masters   Correlation with EOC Ability
  1       0.70                    0.81
  2       0.60                    0.81
  3       0.50                    0.63
  4       0.40                    0.63
  5       0.30                    0.45

● The higher the correlation between a skill and ability, the greater the impact the skill has on the EOC test score.
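The assumptions in the table above can be realized with a simple latent-variable sketch. The generating model here is an assumption (the slide does not specify one): each skill gets a latent mastery propensity correlated with EOC ability, and an examinee masters the skill when the propensity exceeds a threshold set to hit the target proportion of masters.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)
N = 10_000

# Targets from the assumptions table
props = np.array([0.70, 0.60, 0.50, 0.40, 0.30])   # proportion of masters
corrs = np.array([0.81, 0.81, 0.63, 0.63, 0.45])   # correlation with EOC ability

theta = rng.standard_normal(N)                      # EOC ability
noise = rng.standard_normal((N, 5))
# Latent propensity for each skill, correlated with ability at the target level
latent = corrs * theta[:, None] + np.sqrt(1 - corrs**2) * noise
# Threshold so that the target proportion of examinees falls above it
cuts = np.array([NormalDist().inv_cdf(1 - p) for p in props])
mastery = (latent > cuts).astype(int)               # N x 5 mastery profiles

print(np.round(mastery.mean(axis=0), 2))            # close to props
```

The simulated mastery proportions land within sampling error of the targets; this is one of many models consistent with the two columns of the table.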


End of Course Test Development

● We constructed a hypothetical 50-item EOC test designed to mimic a minimum competency exam.
● The test was created using a two-parameter item response model.
  ✦ Item difficulties ranged between -1.5 and -0.5.
  ✦ Item discriminations ranged between 0.5 and 1.5.
● We then superimposed a cutscore for proficiency of 32.
  ✦ This is the point at which 60% of the population is considered proficient.
● The following results come from the joint calibration process.
  ✦ Here approximated using a simulation.
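A minimal sketch of this test construction, assuming the item parameters are drawn uniformly from the quoted ranges (the slide gives only the ranges, not the draws) and a standard normal ability distribution:

```python
import numpy as np

rng = np.random.default_rng(11)
N, J = 10_000, 50

# Hypothetical 2PL item parameters drawn from the ranges on the slide
b = rng.uniform(-1.5, -0.5, J)          # item difficulties
a = rng.uniform(0.5, 1.5, J)            # item discriminations
theta = rng.standard_normal(N)          # examinee abilities

# Two-parameter logistic model: P(correct) = 1 / (1 + exp(-a * (theta - b)))
p = 1 / (1 + np.exp(-a * (theta[:, None] - b)))
scores = (rng.random((N, J)) < p).sum(axis=1)

proficient = scores >= 32               # the slide's superimposed cutscore
print(f"proportion proficient: {proficient.mean():.2f}")
```

With these easy items (all difficulties below average ability), most examinees score well above chance, and the proportion proficient comes out in the neighborhood of the slide's 60%, though the exact figure depends on the parameter draws.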


End of Course Test: Score Distribution

[Histogram of EOC test scores (0 to 50), showing the proportion of examinees at each score; the cutscore separates Not Proficient (< 33) from Proficient examinees]


Descriptive Statistics of Skill Patterns

● To introduce the following slides, we first present basic descriptive statistics: how we would expect an examinee with a given skill pattern to perform on the EOC test:

  Skill Pattern   Expected Score
  [00000]         22.9
  [00001]         26.0
  [00011]         29.3
  [00111]         31.4
  [01111]         34.8
  [11111]         41.9


Gain by Mastery of Each Attribute

● The difference in test score between masters and non-masters of a skill can be quantified:

  Skill   Gain in Score   Ability Correlation
  1       2.61            0.81
  2       2.50            0.81
  3       1.15            0.63
  4       1.19            0.63
  5       0.75            0.45

● The higher the correlation between skill and ability, the higher the gain in test score.
● No correlation would lead to no expected increase in EOC score from mastery of a skill.
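The qualitative pattern above can be checked with a crude simulation that combines the assumed mastery model with a hypothetical 2PL EOC test. Note the caveat: the raw masters-versus-non-masters differences computed here are much larger than the slide's gains, because skills are correlated with one another through ability; the slide's values come from the joint calibration, which isolates each skill's contribution. The sketch only reproduces the ordering: higher skill-ability correlation, bigger score difference.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(3)
N, J = 10_000, 50

# Ability and correlated skill mastery (same assumed generating model as before)
props = np.array([0.70, 0.60, 0.50, 0.40, 0.30])
corrs = np.array([0.81, 0.81, 0.63, 0.63, 0.45])
theta = rng.standard_normal(N)
latent = corrs * theta[:, None] + np.sqrt(1 - corrs**2) * rng.standard_normal((N, 5))
cuts = np.array([NormalDist().inv_cdf(1 - p) for p in props])
mastery = latent > cuts

# Hypothetical 2PL EOC test in the parameter ranges from the slides
b = rng.uniform(-1.5, -0.5, J)
a = rng.uniform(0.5, 1.5, J)
prob = 1 / (1 + np.exp(-a * (theta[:, None] - b)))
scores = (rng.random((N, J)) < prob).sum(axis=1)

# Raw masters-vs-non-masters score difference for each skill
diffs = np.array([scores[mastery[:, j]].mean() - scores[~mastery[:, j]].mean()
                  for j in range(5)])
print(np.round(diffs, 2))
```

The differences decrease from skill 1 to skill 5, mirroring the slide's point that gain tracks the skill-ability correlation.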


Pathways to Proficiency

● But perhaps the most important information is in the form of a path a student can follow that would most quickly lead to proficiency on the EOC test.
● Specifically, each student is assessed to have mastered a set of the skills on the benchmark test.
● The pathway tells the student and the teacher the sequence of skills to learn next that will provide the biggest increase in test score.
● This mechanism may help teachers decide which skills to focus on when teaching a course.
  ✦ Balances time spent on instruction with impact on test score.
  ✦ Provides a practical implementation of CDMs in today's classroom testing environment.
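The pathway logic can be sketched as a greedy search: at each step, learn the unmastered skill with the largest expected score gain. The per-skill gains below come from the gain table; the additive expected-score model is a simplifying assumption of this sketch, since a real application would use the expected scores from the joint calibration, which need not be additive.

```python
GAINS = [2.61, 2.50, 1.15, 1.19, 0.75]  # per-skill gains from the gain table
BASE = 22.9                              # expected score for pattern 00000

def expected_score(pattern):
    """Expected EOC score under a simplifying additive-gain assumption."""
    return BASE + sum(g for g, m in zip(GAINS, pattern) if m)

def fast_path(pattern):
    """Greedy pathway: always learn the unmastered skill with the largest gain."""
    pattern = list(pattern)
    path = ["".join(map(str, pattern))]
    while 0 in pattern:
        # Pick the unmastered skill whose mastery raises the score the most
        j = max((k for k, m in enumerate(pattern) if m == 0),
                key=lambda k: GAINS[k])
        pattern[j] = 1
        path.append("".join(map(str, pattern)))
    return path

print(" -> ".join(fast_path([0, 0, 0, 0, 0])))
# 00000 -> 10000 -> 11000 -> 11010 -> 11110 -> 11111
```

Under these hypothetical gains, a student starting from 11000 follows 11000 -> 11010 -> 11110 -> 11111, matching one of the fast paths shown on the next slides.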


Proficiency Road Map

[Network diagram (drawn in Pajek) connecting all 32 skill patterns, from '00000' to '11111', showing the possible mastery transitions between patterns]


Fast Path to Proficiency

● Student A: 00000 → 01000 → 11000 → 11010 → 11110 → 11111
● Student B: 00001 → 01001 → 11001 → 11011 → 11111
● Student C: 00010 → 10010 → 11010 → 11110 → 11111
● Student D: 11000 → 11010 → 11110 → 11111

[Plot of each path's expected EOC test score (roughly 20 to 45), with the cutscore separating Not Proficient from Proficient]


Harder Paths to Proficiency

● Student A: 00000 → 00001 → 00101 → 00111 → 01111 → 11111
● Student B: 01000 → 01001 → 01011 → 01111 → 11111

[Plot of each path's expected EOC test score (roughly 20 to 45), with the cutscore separating Not Proficient from Proficient]

● Some learning paths are less efficient at maximizing EOC test scores.
● A teacher could balance the time it takes to teach a given skill with the expected gain in proficiency standing at the end of the course.
● This information comes directly from the joint analysis linking the benchmark test with the EOC test.


Concluding Remarks

● CDMs provide a wealth of information that can be useful in helping students achieve and become proficient.
● Such methods can help turn benchmark tests into opportunities to better understand the areas where students need the most help.
● Tailored learning paths can be created, streamlining instruction by informing teachers of the skills students need to know.
● The impact of skill acquisition can be measured and observed, and can help schools meet AYP demands.
● CDMs may help teachers understand how teaching to the curriculum may lead to direct increases in EOC test scores.
● They could also be linked to other instruments (e.g., NAEP, SAT, or ACT).


Acknowledgments

● Robert Henson
● Terry Ackerman
● John Willse
● Deborah Bartz
● The University of North Carolina at Greensboro

Thank you.
