PISA Under Examination - Comparative Education Society in ...
by a group of working-class girls in a disadvantaged inner-city school in a large urban area in the Republic of Ireland. The study comprised a visit to the school on the day following the administration of the 2009 PISA test and included focus group interviews with three groups of students and the principal. The thematic analysis of the interview and focus group transcripts revealed three themes: (1) the intensity of the testing process was too high, and most students, especially those with special educational needs, felt overstretched by the amount, content, and difficulty of the reading test items; (2) some children simply ticked boxes in order to complete the test in time, which has implications for the validity of some of the responses to test items; (3) students complained about too many personal questions and a lack of anonymity in the student questionnaire, which was designed to collect data on a number of background variables including family and home circumstances. In his conclusion Mac Ruairc highlights the need for a more proactive approach to student support and a more nuanced model of assessment in future PISA tests to take account of social class differences.
In her contribution, Marie Duru-Bellat analyses the ability of PISA data to assess the quality of education systems. The author begins by discussing why PISA data are so appealing to policy-makers despite their limitations. In her analysis, Duru-Bellat points out that PISA data are so attractive because, rather than assessing conformity to academic knowledge, PISA gives a concrete picture of 15-year-old students' performance in subjects or exercises that are supposed to be relevant for daily life ("life skills"). In addition, PISA data, even if imperfect and questionable, are very helpful in highlighting differences in educational outcomes across countries. According to Duru-Bellat, the misuses and limitations of PISA become obvious when PISA data are used for benchmarking and when countries are ranked as a result of cross-country comparisons: "The core problem with benchmarking is that benchmarks are set using the most readily available data" (p. 154). Since PISA data are readily available, they are used as if there were no other relevant indicators of the quality of an education system (e.g. equity), which is of course highly questionable. However, indicators are isolated pieces of information, which, according to Duru-Bellat, are not sufficient for assessing a whole 'system'. For the comprehensive assessment of a whole education system, evaluation is far more useful than indicators, because evaluation requires "the combination of indicators and most of all, the more qualitative interpretation of their meaning" (p. 155). In her conclusion, Duru-Bellat points out that her criticism, which is focused on the misuse of PISA data for benchmarking processes, should not lead us "to renounce processes that evaluate education systems based on their output" (p. 157). Student output is and remains an important factor in assessing the quality of education systems. However, according to Duru-Bellat, it needs to be supplemented by additional data: "it is important not to limit oneself to measurement of student achievement but rather to include measurements of system characteristics such as coverage, financing (public/private) and tracking (early/comprehensive tracking, types of student groups etc.)" (p. 156).