Are DIBELS and Running Records effective tools for guiding ...

development plans in the area of language arts for all students. Running Records were collected in September, January, and May, accompanied by DIBELS at the same intervals for benchmark purposes, and all students who were not benchmarked received progress monitoring to determine the success of these interventions. This research study, conducted at Douglas Elementary, used Rigby Running Records and DIBELS to examine the use of data in selecting interventions, pacing instruction, and guiding classroom teachers.

Literature Review

Assessment used to adapt teaching to meet student needs, known as formative assessment, may be considered the most important assessment practice educators use. Teachers use the results of this assessment approach proactively to differentiate instruction and better support student learning (Baker & Smith, 2001).

Research indicated that running records and Dynamic Indicators of Basic Early Literacy Skills (DIBELS), both formative assessments, are effective tools for guiding classroom teachers toward choosing interventions for students at risk of reading failure (Hebert, 2004).

Connie R. Hebert (2004) wrote that teachers use running records to form their reading groups, to guide them, and to make instructional decisions for individual students and the class as a whole. Hebert went on to say that running records are extremely valuable because they impact instructional decisions such as grouping, text selection, verbal prompts, areas of concern, and next steps in instruction.

In a study done in 2004, John Ross found that teachers in effective schools are more likely to use classroom assessments like running records to diagnose reading difficulties and to guide instruction. Assessing children's reading progress is key to moving them along at the proper developmental rate. The information gained from the analysis of running records helps teachers select the appropriate books for the child's reading level (Baker, Simmons, & Kame'enui, 1997).

DIBELS measures were designed to assess three areas of early literacy: phonological awareness, the alphabetic principle, and fluency with connected text. These measures link together to form an assessment system of early literacy development that allows educators to readily and reliably determine student progress and plan instruction accordingly (Adams, 1990; National Reading Panel, 2000; National Research Council, 1998; Simmons & Kame'enui, 1998).

Research Design

The study was established to determine whether there was an adequate correlation between DIBELS and Rigby Running Records data, and whether these data provided information that would assist teachers in the general education classroom. Another goal of the study was to showcase the use of data in facilitating instruction and student achievement.
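As a rough illustration of the kind of relationship the study looked for, the sketch below pairs each student's DIBELS Oral Reading Fluency score with a numeric Rigby Running Records level and computes a correlation coefficient. The student values are hypothetical, and the study itself reports benchmark percentages rather than this particular statistical procedure.

```python
# Minimal sketch (not the study's actual analysis): correlate hypothetical
# DIBELS Oral Reading Fluency scores with Rigby Running Records levels.

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical January benchmark data for ten students:
# DIBELS ORF (words correct per minute) and Rigby Running Records level.
dibels_orf = [42, 55, 61, 38, 70, 48, 66, 90, 35, 58]
rigby_level = [14, 16, 18, 12, 20, 15, 19, 24, 11, 17]

r = pearson_r(dibels_orf, rigby_level)
print(f"Correlation between DIBELS ORF and Rigby level: r = {r:.2f}")
```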
During the study, DIBELS information was collected at benchmark periods in September, January, and April. Progress monitoring was conducted every two weeks for students who were at risk and had not reached benchmark in the areas outlined in DIBELS. Running Records information was collected in September, January, and April. Running Records were available for teachers to use as needed to assist with differentiating and guiding individual lessons for students.

One teacher per grade level agreed to participate in the study. These teachers received information every two weeks, received guidance from the reading interventionist, and participated in meetings at all benchmark dates. Each teacher participating in the study was given a survey at midyear and another at the conclusion of the academic year.


The survey asked how teachers used the data and addressed feasibility, accessibility, the ease with which the assessments were given, and the teachers' overall use of the information.

At each benchmark data point, a collaborative team tested all 407 students at Douglas Elementary. This was done to develop consistency in scoring and to maintain the integrity of the data. The goal of the project was to determine the appropriate use of DIBELS and Running Records in driving student reading instruction, and the results appear to demonstrate a relationship between DIBELS use in the regular classroom and the diagnosis of appropriate interventions. Teachers who participated in this research project worked within the STEPS (Students and Teachers Emphasizing Personalized-education Strategies) model as developed by Douglas Elementary staff. The model, rooted in best practice, takes a response-to-intervention framework, using data to determine the best interventions based on student achievement. The interventions embody the five components of reading: fluency, comprehension, vocabulary, phonics, and phonemic awareness.

Data Analysis and Interpretation

The following graphs were created by the Douglas Elementary grant team. Students in all grades, first through fifth, were entered, and their baseline achievement on the Rigby Running Records assessments in September was recorded. The remaining benchmark periods, January and again April, were added to determine whether each child had made a year's worth of growth in the areas of comprehension and fluency as measured with the running records assessment. The red line on each graph indicates where students should be at the end of the academic year. Any marks noted below the X axis are students who dropped a reading level or demonstrated no growth during that period.

The assessments documented that students were not reading at grade level according to the Rigby Running Records, which assess students' decoding skills, the types of errors made, and their comprehension. As a collective group, the integration, implementation, and delivery created a consistent system for administering the Rigby Running Records, which tend to be subjective in nature. As these protocols and interventions were put in place during the school year, student achievement gains can be noted throughout all grade levels. One intervention providing large gains in achievement was the use of the Read Naturally Lab, which focuses on comprehension; providing in-house training and in-services to all classroom teachers facilitated this growth and consistency throughout the building.

The yearly gains for Douglas Elementary show tremendous progress in Running Records. Students began the year instructionally lower than anticipated. During the course of the grant project, teachers met biweekly to gain an understanding of student achievement as the assessments were completed. These meetings were facilitated by the reading intervention specialist. Percentages were calculated as Below Grade Level (BGL), at Grade Level (GL), and Above Grade Level (AGL).
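A minimal sketch of that tally is shown below. The end-of-year Rigby level expectations and the student records are hypothetical placeholders; the study's actual cut points and data are not reproduced here.

```python
# Minimal sketch: classify hypothetical running-record results as Below,
# At, or Above Grade Level and report the percentage in each category.
from collections import Counter

# Hypothetical end-of-year Rigby level expectation per grade (placeholder values).
GRADE_LEVEL_EXPECTATION = {1: 16, 2: 20, 3: 24, 4: 26, 5: 28}

# Hypothetical benchmark results: (grade, Rigby level read instructionally).
students = [(1, 12), (1, 16), (1, 18), (2, 17), (2, 20),
            (3, 24), (3, 22), (4, 27), (5, 28), (5, 25)]

def classify(grade, level):
    expected = GRADE_LEVEL_EXPECTATION[grade]
    if level < expected:
        return "BGL"   # Below Grade Level
    if level == expected:
        return "GL"    # At Grade Level
    return "AGL"       # Above Grade Level

counts = Counter(classify(grade, level) for grade, level in students)
total = len(students)
for label in ("BGL", "GL", "AGL"):
    print(f"{label}: {100 * counts[label] / total:.0f}%")
```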
Through the interpretation and comparison of the data, the focus turned to a curriculum issue that appeared prevalent in the language arts delivery in kindergarten and first grade, as Table 1 indicates below.


Table 1. Rigby Running Records results, September 2007 (percentage of students Below, At, or Above Grade Level)

Grade           BGL    GL     AGL
First Grade     80%    15%    5%
Second Grade    95%    3%     2%
Third Grade     75%    25%    0%
Fourth Grade    79%    13%    8%
Fifth Grade     67%    33%    0%

As data collection examined student achievement in all grades (1-5), students who entered second grade during the 2007-2008 school year demonstrated a weaker grasp of story comprehension, fluency, and decoding skills. The number of students at or above grade level decreased dramatically. Teachers were provided assistance from the reading intervention specialists and received support from the LIFT program (Learning is Fun Together), which builds self-esteem and confidence under the direction of a mentor who facilitated a lesson plan directed by the reading intervention specialist.

Although notable gains were witnessed, the collaborative grant team determined that the transition from narrative to expository text hindered the achievement of learners on the assessment, as 43 percent of all third-grade students achieved benchmark in DIBELS Oral Reading Fluency, while 29 percent were in the some-risk category. Also noted is the number of students performing Below Grade Level on the Rigby Running Record benchmark assessments in September. Anecdotal notes indicated that students struggled with the retelling of the expository literacy pieces but excelled in the final narratives provided. The transition from story elements to main idea and supporting details needed to recap an expository piece hindered learners from mastering the running record assessment. This provided a new focus for the learning community for the 2008-2009 school year.

The growth in third, fourth, and fifth grades reflects continued practice with the retelling of expository pieces in language arts. This aligns with our DIBELS scores: in fourth grade, 58 percent of students demonstrated proficiency in oral reading fluency, and students scored significantly higher at the end of the benchmark cycle in April. This can also be noted in the data showing that 59 percent of all fifth-grade students passed the oral reading fluency portion of the DIBELS assessment. The results of the DIBELS testing thus assist in predicting student achievement on the running records assessment as well.

Table 2 illustrates this progression and growth through the comparison of January and April scores across all grade levels. Especially important is the percentage of change between grade levels, as a larger portion of students were at or above grade level on running records in grades three, four, and five.
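Before turning to the compiled DIBELS results in Table 3, the short sketch below shows how individual Oral Reading Fluency scores might be binned into status categories like those referenced above. The cut scores and sample scores are placeholders for illustration only, not the published DIBELS benchmarks for any grade or season.

```python
# Minimal sketch: bin Oral Reading Fluency scores into status categories.
# The cut scores below are hypothetical placeholders, not DIBELS benchmarks.
AT_RISK_BELOW = 70           # hypothetical lower cut score (wcpm)
BENCHMARK_AT_OR_ABOVE = 92   # hypothetical benchmark cut score (wcpm)

def orf_status(words_correct_per_minute):
    if words_correct_per_minute >= BENCHMARK_AT_OR_ABOVE:
        return "benchmark"   # roughly the "established" band in Table 3
    if words_correct_per_minute >= AT_RISK_BELOW:
        return "some risk"   # roughly the "emerging" band
    return "at risk"         # roughly the "deficit" band

scores = [104, 88, 65, 93, 71, 59]  # hypothetical spring scores for six students
for wcpm in scores:
    print(wcpm, orf_status(wcpm))
```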


Using DIBELS information as a tool, the compiled results show the following percentages, as outlined in the table below:

Table 3. DIBELS results by grade and measure (percentage of students Established, Emerging, or Deficit)

Grade          Measure   Established   Emerging   Deficit
Kindergarten   PSF       40%           36%        24%
Kindergarten   NWF       36%           26%        38%
First          PSF       97%           3%         0%
First          NWF       64%           33%        3%
First          ORF       69%           15%        16%
Second         ORF       68%           15%        18%
Third          ORF       43%           29%        29%
Fourth         ORF       58%           27%        15%
Fifth          ORF       59%           23%        18%

(PSF = Phoneme Segmentation Fluency; NWF = Nonsense Word Fluency; ORF = Oral Reading Fluency.)

Teachers who operated in conjunction with the grant project found DIBELS information useful in formulating appropriate interventions and programs for individual students in their classrooms, as indicated by the results collected via the teacher survey. Teachers received coaching from the reading intervention specialist, biweekly meetings to disaggregate data, and professional resources to encourage the use and understanding of the data collected. The data provided valuable insight into individual student achievement in reading and the appropriateness of the reading interventions employed, and helped develop an accountability model.

One teacher per grade level participated in the grant project. These teachers received the Rigby Running Record data, DIBELS scores, and assistance from the reading intervention specialist in using the data to differentiate and drive their classroom instruction.

The survey was conducted in the fall and in the spring. Growth and progress are noted in all areas of using the DIBELS data. Teachers found the information to be useful and convenient, and reported that it provided adequate and valuable information for meeting the needs of their students. Over the course of the year, these teachers collaborated on data with our reading intervention specialist to prescribe the appropriate intervention based on the area of reading where the student demonstrated the greatest need. The teacher survey data are represented in Tables 4, 5, and 6.


Table 4. Teacher survey responses, fall versus spring: "Do you find DIBELS data to be useful?" (Very / Somewhat / Not Useful).

Teachers at Douglas Elementary had no previous experience with the DIBELS assessment prior to the fall of 2007. Teachers at each grade level were provided with an in-service to highlight the various parts of the assessment, the data results as documented in the DIBELS software, and how to provide interventions based upon the results. As highlighted in Table 5, teachers who participated in the grant completed all progress monitoring during the weeks between benchmark periods. The process provided reliability of results, a comparison of student achievement data, and consistency among the staff members providing testing.

Table 5. Teacher survey responses, fall versus spring: "How convenient is it to use DIBELS within your classroom?" (1 Very, 2 Somewhat, 3 Neutral, 4 Not Really, 5 Not at All).


Table 6. Teacher survey responses, fall versus spring: "How often do you examine your DIBELS data to guide instruction in Language Arts?" (Daily; once per week; two to three times per month; once every other week; once per month; not at all).

During the past year, classroom teachers came to understand the purpose and use of the DIBELS assessments in providing differentiated instruction to meet the needs of all learners in the classroom. Through collaborative efforts with the reading intervention specialist, classroom teachers began focusing dialogue and collaboration days around student data, as highlighted through DIBELS achievement scores. This information translated into action, and results for students improved drastically, as teachers in a professional learning community discovered the goal and purpose of assessment.

As a result of our findings on teacher use, support, and information, the Douglas Elementary school community will be using DIBELS as a screening tool in the fall of 2008 because of its effectiveness and convenience.

References

Baker, S. K., Simmons, D. C., & Kame'enui, E. J. (1997). Vocabulary acquisition: Research bases. In D. C. Simmons & E. J. Kame'enui (Eds.), What reading research tells us about children with diverse learning needs: Bases and basics. Mahwah, NJ: Erlbaum.

Bean, R. M., Cassidy, J., Grumet, J. E., Shelton, D. S., & Wallis, S. R. (2002). What do reading specialists do? Results from a national survey. The Reading Teacher, 55, 736-744.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7-74.

Chapman, J. W., Tunmer, W. E., & Prochnow, J. E. (2001). Does success in the Reading Recovery program depend on developing proficiency in phonological-processing skills? A longitudinal study in a whole language instructional context. Scientific Studies of Reading, 5(2), 141-176.


Clay, M. M. (2000). Running records for classroom teachers. Auckland: Heinemann.

Fawson, P. C., Ludlow, B. C., Reutzel, D. R., Sudweeks, R., & Smith, J. A. (2006). Examining the reliability of running records: Attaining generalizable results. Journal of Educational Research, 100, 113-126.

Fountas, I. C., & Pinnell, G. S. (1996). Guided reading: Good first teaching for all children. Portsmouth, NH: Heinemann. Retrieved October 18, 2007, from Rigby PM Resources database.

Good, R. H., Simmons, D. C., & Kame'enui, E. J. (2001). The importance and decision-making utility of a continuum of fluency-based indicators of foundational reading skills for third-grade high-stakes outcomes. Scientific Studies of Reading, 5, 257-288.

Limbrick, L. (1999, September). The literacy debates: What are the issues in New Zealand? Paper presented at the British Educational Research Association Annual Conference, Brighton, September 2-5, 1999. Retrieved November 23, 2007, from http://eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/19/68/d7.pd

Pearson, P. D. (2001). Life in the radical middle: A personal apology for a balanced view of reading. In Reading researchers in search of common ground. Newark, DE: International Reading Association.

Pinnell, G. S., Lyons, C. A., DeFord, D. E., Bryk, A. S., & Seltzer, M. (1994). Comparing instructional models for the literacy education of high-risk first graders. Reading Research Quarterly, 29(1), 9-39.

Pressley, M., Hilden, K., & Shankland, R. (2005). An evaluation of end-grade-3 Dynamic Indicators of Basic Early Literacy Skills (DIBELS): Speed reading without comprehension, predicting little. East Lansing, MI: Literacy Achievement Research Center, technical report. Retrieved January 16, 2008, from http://www.msularc.org/symposium2005/pressley_paper.pdf

Ramaprasad, A. (1983). On the definition of feedback. Behavioral Science, 28(1), 4-13.

Ross, J. A. (2004). Effects of running records assessment on early literacy achievement. The Journal of Educational Research, 97, 186-194.

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119-144.

Shanahan, T. (2001). Assessment and differentiation on instructional differentiation. Retrieved November 28, 2007, from wvde.state.wv.us/reading/document/T.Shanahan.ppt

Shanahan, T. (2003). Research-based reading instruction: Myths about the National Reading Panel report. The Reading Teacher, 56, 646-655. Retrieved December 16, 2007, from www.lib.ncsu.edu/web_root/collection/available/etd-05172006-163207/unrestricted/etd.pdf

Taylor, B. M., Pearson, P. D., Clark, K., & Walpole, S. (2000). Effective schools and accomplished teachers: Lessons about primary-grade reading instruction in low-income schools. Elementary School Journal, 101(2), 121-166.

Traub, R. E., & Rowley, G. L. (1991). Understanding reliability. Educational Measurement: Issues and Practice, 10, 37-45.
