Smart & Good High Schools - The Flippen Group

CHAPTER 4: The Professional Ethical Learning Community

PELC 2: Align practices.

Promising Practice 3 (PELC 2.3): Examine existing research on educational practices that contribute to desired outcomes.

Suppose, for example, that a high school decides that the creation or continued use of advisories is an aligned practice, consistent with its goal of excellence (since advisories potentially create a strong bond with an adult who can mentor a student's academic development), and with its goal of ethics (since advisories potentially promote mutual respect and support among peers as well as positive adult-student relations). The next question should be: Is there existing research demonstrating that advisories really do contribute to these outcomes?

The answer to that question is yes. A number of studies have found advisories to be effective in easing the transition to high school, strengthening teacher-student relations, increasing student achievement, reducing failing grades, decreasing drop-outs, and improving relations with parents.20

Wherever possible, a school should search the empirical literature for what it tells us about the effectiveness of any given practice.

PELC 2: Align practices.

Promising Practice 4 (PELC 2.4): Engage in a continuous cycle of research-based action and reflection in order to assess effectiveness and plan next steps.

Research-based action and reflection (Re-BAR) is, in essence, practitioner action research. It is a process of data-based reflection on the impact of a particular practice in a particular school setting. This kind of practitioner research is needed to answer the question, "To what extent is our school's implementation of a given practice effective in achieving our intended outcomes?"

Continuous school improvement must be assessed on the basis of results rather than intentions.
—RICHARD DUFOUR AND ROBERT EAKER

For example, we know there is research showing the effectiveness of a practice such as advisories. But, as our school observations and interviews confirmed, advisories vary widely in focus, frequency, and effectiveness. One student told us, "We need to do something in our school to help the not-so-good advisories become more like the good ones."

Therefore a school should ask, How effective are the advisories in our school, in the particular way we are implementing them? Would they be more effective if they were more frequent? Less frequent, but more focused on goals related to the development of performance character and moral character and the eight strengths of character?

One could get data on perceived advisory effectiveness ("What's working? What needs improvement?") from formative evaluation (e.g., surveys of all advisors and students and sample interviews), make revisions based on the feedback, then repeat the survey in an action research cycle.

One could also do an action research study comparing the performance of, for example, students who had advisories with matched students who did not; or students who had advisory every day versus those who had it only once a week; or students whose advisory focused systematically on performance character and moral character goals versus those advisories that lacked such a focus.

In one large school we visited, faculty had begun to use "common assessments" (exams standardized across different instructors of the same course). We view this practice as a good illustration of a Re-BAR (research-based action and reflection) around a curriculum issue. The common assessments procedure in this school compares student performance on the same test in different sections of the same course (e.g., U.S. History, World Civilization). The purpose of this comparison is to enable team members to use the results to refine their teaching strategies in order to improve student performance. The chair of the history department explained how this process works:

Our department agreed to do four common tests a year—a midterm and a final each semester. After the office scores the test and gives us the data, we meet and look at the performance of each of our sections. We all see the department mean for the test and subtests, and we each see our own section scores. Looking at how your own students per-
