
REPORT ON THE 2004 SCHOLARSHIP

TO THE DEPUTY STATE SERVICES COMMISSIONER

BY THE REVIEW TEAM

LED BY DOUG MARTIN

29 April 2005

CROWN COPYRIGHT

ISBN 0-478-24491-6


TABLE OF CONTENTS

Background
Review Process and Methodology
Context
Overview of Findings
Discussion of Findings
   1.1 Whether the roles and responsibilities of the various agencies involved were clearly defined
   1.2 Whether the identification and addressing of issues arising in designing and implementing the new system were adequate
   1.3 Whether the processes to set and moderate the standards for the qualification were adequate
      Coordination of roles
      Setting the standards
      Pre-Moderation (Understanding the Standards)
      Professional Development
      Monetary Awards
   1.4 Whether the administration of the examinations against those standards, including review of the processes for setting, moderating, and marking the examination was adequate
      Paper Setting
      Marking and Moderation (of marking and marks)
      Patterns of Entry
   1.5 Whether communication and guidance to teachers and students concerning New Zealand Scholarship were adequate
   1.6 Any other matter which the Review Team considers relevant to the foregoing questions
Recommendations
Annex 1: Terms of Reference for Review of the New Zealand Qualifications Authority by the State Services Commissioner
Annex 2: Organisations consulted during the course of this Review


Background

1 In February 2005 the Associate Minister of Education, Hon David Benson-Pope, asked the State Services Commissioner to review the adequacy of the setting and management of the 2004 Scholarship and, more widely, the performance of the New Zealand Qualifications Authority (NZQA) in respect of school sector qualifications [CAB Min (05) 5/14]. The Associate Minister of Education requested the review under Section 11 of the State Sector Act 1988. This request was in response to surprise at the degree of variability of results between subjects for the 2004 Scholarship and at the fact that relatively few Scholarships were awarded.

2 The Deputy State Services Commissioner, Tony Hartevelt, on behalf of the State Services Commissioner, asked Doug Martin of Martin Jenkins and Associates to lead the Review. Dr Alan Barker, an independent education and public management consultant based in Hong Kong, assisted Mr Martin. They were supported by staff of the State Services Commission. The team has approached the Review in two parts: the implementation of the 2004 Scholarship, and the performance of NZQA in relation to secondary school qualifications. This report covers the implementation of the 2004 Scholarship only.

3 Concurrent with this Review, the Associate Minister of Education, Hon David Benson-Pope, convened a group of education practitioners and experts to consider the shape of the 2005 Scholarship. This group (the Scholarship Reference Group¹) reported to the Associate Minister in mid-March 2005, and Cabinet subsequently considered and adopted 25 of its 26 recommendations on 29 March 2005. The Ministry of Education (the Ministry) remains responsible for setting the Scholarship standards, NZQA remains responsible for operating the external assessment and awarding the Scholarships, and the Ministry of Social Development distributes the monetary awards.

4 This report, by focussing on the public sector management aspects of Scholarship, supports the revised policy settings arising from the work of the Scholarship Reference Group.

5 Part one of the Terms of Reference asks for a review of the 2004 Scholarship with particular regard to:

1.1 whether the roles and responsibilities of the various agencies involved were clearly defined

1.2 whether the identification and addressing of issues arising in designing and implementing the new system were adequate

1.3 whether the processes to set and moderate the standards for the qualification were adequate

1.4 whether the administration of the examinations against those standards, including review of the processes for setting, moderating, and marking the examination, was adequate

1.5 whether communication and guidance to teachers and students concerning New Zealand Scholarship were adequate, and

1.6 any other matter which the Review Team considers relevant to the foregoing questions.

¹ To be distinguished from the earlier Scholarship Reference Group, which met in 2001 and 2002. See paragraphs 36 and 37.


6 The full terms of reference are attached as Annex 1. This Review, which has concentrated principally on implementation matters, makes recommendations under each of the relevant Terms of Reference (TOR). General observations by the Review Team are included in the overview of findings and in the discussion under TOR 1.6 below.

Review Process and Methodology

7 The approach taken by the Review Team has involved identifying the principal weaknesses in the implementation of the 2004 Scholarship, the lessons that can be drawn from these, and the changes necessary to ensure that the 2005 Scholarship is implemented smoothly.

8 In order to understand and analyse the implementation process for the 2004 Scholarship, the Review Team interviewed a wide range of external stakeholders as well as a number of internal managers at both NZQA and the Ministry of Education. The Review Team also met with the Education Review Office. The Team developed structured questions that were made available to those organisations it met (a list of the organisations is attached as Annex 2).

9 The Ministry and NZQA provided a large amount of written material to the Review Team. In addition, the Review Team invited comment from students, teachers and members of the public. Advertisements seeking comment were placed in the Education Gazette and the Education Review on 11 and 21 March 2005 respectively. Teachers sent in 22 individual or collective (whole school) comments, and six stakeholder groups sent in comments. The Chief Executive of NZQA wrote to all 4,500 students who entered the 2004 Scholarship, on behalf of the Review Team, inviting comment on their experience; 220 emails were received from students in reply.

10 On 31 March 2005 the Review Team ran a workshop to test preliminary findings with key stakeholders whom the Team had previously interviewed. Members of the team met separately with some stakeholder groups that were not able to attend, to ensure that they had an opportunity to provide feedback on the preliminary findings. The Review Team has also engaged with the NZQA Board, presenting its initial findings on 7 April 2005. The feedback on the initial findings indicated to the Review Team that it had captured the essence of the problem. This report builds on those findings and makes appropriate recommendations to address the issues for the 2005 Scholarship.

Context<br />

11 Whilst, as the report notes, there have been significant difficulties with the 2004 Scholarship, it is important to view these within the context of the much wider reforms to teaching, learning and assessment in New Zealand’s senior secondary schools, which have been progressively implemented over the previous decade or more. These reforms, which reflect changes in the nature and role of the senior secondary school since the 1970s, are directed at providing all students with the opportunity to succeed and to make successful transitions into tertiary study. The National Certificate of Educational Achievement (NCEA), as set out in the 1998 Cabinet paper, was a starting point, representing a huge and unprecedented change in teaching, learning and assessment outcomes.

12 The development of the new approach to senior secondary school qualifications was contentious, and the policy that was agreed by Cabinet in 1998 represented a compromise between widely polarized positions on senior school assessment and qualifications (see discussion under TOR 1.6). NCEA provides for external and internal assessment at three levels, as well as the externally assessed Scholarship at Level 4 (in 2004). In total this amounts to more than 150,000 candidates and approximately 1.9 million individual examination booklets, a massive undertaking. In short, the implementation of NCEA required a major, complex change management process. This included three years of professional learning and development managed by the Ministry, the instituting of significant new business processes by NZQA, and extensive relationship management with schools by NZQA.

13 Furthermore, the implementation of NCEA and Scholarship, commencing in 2002, has been undertaken over a very short timeframe, necessary in order to provide a coherent pathway for students through senior secondary school. The implementation of Level 1 NCEA was particularly difficult, with a major industrial dispute with secondary school teachers complicating communication and discussion on Level 1 implementation in the early months of 2002. In contrast, the implementation of Level 2 NCEA was relatively smooth, although there were some technical difficulties with the posting of results on the NZQA website.

14 When it came to implementing Level 3 NCEA and the 2004 Scholarship in the same year, there was insufficient appreciation of the need to differentiate between the two assessments. In the event, implementation of Level 3 NCEA, which has a much wider impact on students than Scholarship, was the main focus of attention, with Scholarship essentially being implemented in its slipstream. The strategic risks, both policy and implementation, associated with Scholarship received insufficient attention.

15 It is important to appreciate that the implementation of NCEA is in its early stages and will continue to evolve. The magnitude of the change is such that the transition to better learning, teaching and assessment will continue over at least the next five years. The focus of attention and debate should not merely be on approaches to assessment. The Review Team notes that to this point the change has been implemented reasonably well overall.

Overview of Findings<br />

16 The overall finding is that the difficulties experienced with the 2004 Scholarship resulted from inadequacies in both policy advice and implementation.

17 The Government received inadequate advice on the policy risks associated with the 2004 Scholarship. The principles on which Scholarship was to be based were established in 1998, with the more detailed aspects of policy being developed through consultation with key stakeholders over the succeeding five years. This was a long process during which officials failed to adequately take stock of the policy settings and provide the government with an explicit analysis of the implications for the outcome of the 2004 Scholarship. A stocktake and analysis might well have led to a further refinement in the policy settings for the 2004 Scholarship, along the lines of those recently approved by the government. Instead, there was a ‘drift’ into implementation without an adequate analysis of the strategic policy risks.

18 The result was that the expectations of Ministers were not aligned with those of officials in the Ministry of Education and NZQA, and the outcome of the 2004 Scholarships came as a surprise to both Ministers and the wider public. For example, officials saw the variability of results between subjects as a consequence of the approach to Scholarship, rather than as a significant risk that could undermine the credibility of the examinations.


19 In a similar vein, the government was not provided with adequate advice on the strategic risks associated with implementing both the high-stakes Level 3 NCEA and Scholarship in the same year, 2004. Officials were focused on operational risks and lost sight of the higher-level implementation risks which impacted on outcomes. Strategies to mitigate these risks were not identified and put into effect in the approach to the 2004 Scholarship to ensure a fair result for students.

20 As it transpired, the risks were significant. For example, Scholarship was still being implemented part way through the normal examination cycle; the registered standards were not available until December 2003, delaying the writing of exemplars; only one exemplar was produced for each subject; new subjects were being examined with no previous examination history at Scholarship level; and there was no dedicated professional development for teachers on Scholarship, notwithstanding the different nature of a one-off, externally assessed examination which needed to identify top students. These risks were not adequately taken into account in the approach to the 2004 Scholarship. The consequence was that teachers and some students were not well prepared for the 2004 Scholarship, and the outcome was unfair to those students. This may also have affected the entry of students into some University courses and their eligibility for subsequent award opportunities.

Discussion of Findings<br />

21 The detailed findings of the Review Team are discussed below under each of the Terms of Reference (TOR) 1.1 – 1.6.

1.1 Whether the roles and responsibilities of the various agencies involved were clearly defined

22 In a formal sense the roles and responsibilities of the Ministry of Education (the Ministry) and NZQA are clearly defined, with the Ministry being responsible for the provision of policy advice, the development of standards and professional development, and NZQA for operational matters including external assessment. Nonetheless, external stakeholders commented consistently to the Review Team that they were unclear as to who was leading the change and at what point the focus shifted from policy to implementation. A compounding factor was that some significant policy issues were addressed during the implementation phase, with NZQA taking the lead on these, sometimes without Ministry involvement (see paragraph 32). This lack of clarity added to the perception, notably by teachers and schools, that they “were being shuffled from one agency to the other, and neither agency was taking responsibility for the issues they raised.”

23 The Ministry established a dedicated NCEA project team, the Qualifications Development Group (QDG), in early 1999. This group, which was based in the Ministry and drew on considerable expertise from both agencies, wrote the standards, produced resource material (e.g. exemplars) for Levels 1, 2 and 3, and undertook initial planning for implementation, including addressing emerging issues. The QDG also managed communications and professional development, which was largely contracted out to Colleges of Education, which in turn ran several ‘Jumbo’ days involving all secondary teachers for years 11, 12, and 13. The group was disestablished in 2001 when tagged implementation funding came to an end.

24 One key stakeholder told the Review Team that the disestablishment of the Qualifications Development Group was a significant loss to the implementation of NCEA, as teachers no longer had a single point of entry into the system where they could take their concerns. Instead responsibility was split between the two agencies. This resulted in a noticeable communication problem for the 2004 Scholarship. The Review Team also believes that the disestablishment of the QDG weakened the implementation focus for Scholarship from 2001.

25 Following the disestablishment of the Qualifications Development Group, responsibility for NCEA implementation oversight (but not actual implementation) fell primarily to the Joint Overview Group (JOG) of senior and mid-level managers from the Ministry and NZQA. However, JOG’s mandate was fluid, involving principally information exchange and discussion of operational policy matters. Throughout 2004, the focus of JOG was on broader matters associated with NCEA and a number of wider issues like the draft Secondary Education Work Programme. Scholarship received little attention.

26 At the explicit request of Ministers in 2002 the two agencies adopted a number of measures to reduce role confusion.² Again, the focus on Scholarship was weak and confusion over roles remained for external stakeholders. The lack of well-focused implementation for Scholarship drawing on expertise from both agencies contributed to the drift into implementation without adequate analysis of the strategic implementation risks.

27 The work of the Qualifications Development Group, the Joint Overview Group, and the other initiatives noted in paragraph 26 did, however, significantly improve the climate of cooperation and collaboration between the Ministry and NZQA.

28 The Review Team concludes that a dedicated implementation team should be established by NZQA for the 2005 Scholarship, and that this team should include people with academic expertise in the area, senior practitioners, and key representatives from the Ministry of Education.

29 Given the magnitude of the change from the old Scholarship arrangements to the new, which involved introducing a separate standards-based examination at Level 4 for 27 subjects, some of which had never been tested at Scholarship level before, the Review Team would have expected there to have been:

• a formal handover of completed policy specifications from the Ministry to NZQA, with supporting documentation that clearly explained the policy to be implemented and the expected outcomes, and

• a comprehensive implementation plan for Scholarship developed by NZQA, with an analysis of implementation risks and strategies to deal with them, a communications plan, and a plan to monitor progress.

30 Neither eventuated. The absence of this documentation meant that the identification and addressing of issues in the design and implementation of Scholarship was inadequate (see 1.2 below), and the confusion over which agency was leading the change continued for external stakeholders.

31 The Review Team concludes that for 2005 and beyond the policy settings should be formally communicated to the NZQA Board in a letter from the Associate Minister of Education, together with an expression of the expected policy outcomes. NZQA should, in turn, develop a detailed implementation plan that is formally approved by the Associate Minister. At the same time, both agencies should aim for a seamless implementation process whereby NZQA is involved in work that the Ministry leads and the Ministry is involved in work that NZQA leads, so that the value of the combined expertise and different perspectives is maximised. This seamlessness will require better feedback loops between officials and the sector, and a preparedness to take on board sector concerns.

² These included joint publication of NCEA Update, enhancement of the integrated NCEA website, and joint chairing of the Leaders’ Forum from May 2003.

32 As noted above, a number of important policy issues surfaced during the implementation phase, such as the number of subjects to be examined; how to fairly manage language examinations for native and non-native speakers of languages; and the allocation of monetary awards. There appears to have been a degree of role confusion in settling these matters. NZQA saw some of these issues as operational in nature and did not always involve the Ministry in their resolution. Some were in fact substantive policy issues, albeit with important operational dimensions, and both agencies should have been involved in the process of advising the Minister. The protracted consideration of these issues during the implementation phase also created uncertainty in the sector about the nature of the new Scholarship.

33 The degree to which NZQA had dedicated operational policy capacity to help resolve these policy issues is also relevant. NZQA originally had substantial policy capacity when it was established. This capacity was progressively reduced from the mid-1990s because it created tensions with the Ministry’s overarching responsibility for policy. The result was that NZQA was left primarily as an operational agency. The Review Team believes that this situation, coupled with the loss of staff experienced in dealing with the policy and operational risks of running competitive examinations, contributed to the drift into implementation, and to the insufficient analysis and communication of potential risks.

34 The Review Team notes that the Ministry has the infrastructure and institutional capability to set up and run effective professional development. This should be utilised to provide professional development for the 2005 Scholarship. NZQA needs to provide input and expertise to the materials to be produced for students and teachers.

35 In light of the above discussion, the Review Team has also come to the view that there is an immediate need for the Ministry to provide active leadership to help coordinate and integrate the various parts of the sector. In particular, the Ministry needs to ensure that for the 2005 Scholarship, the Leaders’ Forum, the Ministry (policy and monitoring roles) and NZQA are all part of a coherent whole that leads to a consistent approach and results.

1.2 Whether the identification and addressing of issues arising in designing and implementing the new system were adequate

36 Cabinet agreed in 1998 to the principles of a new National Certificate of Educational Achievement (NCEA), including a new standards-based Scholarship. Details of the new 2004 Scholarship were worked out over the next five years, principally through a range of sector-based consultative groups. These included:

• Scholarship Working Group – reported to the Leaders’ Forum in July 2000

• Scholarship Reference Group³ – met from February 2001 to November 2002

• Secondary Principals’ and Leaders’ Forum – re-convened in May 2003 with a much wider brief

³ This was the first of two different groups referred to as “the Scholarship Reference Group”. See paragraph 3.


37 The Scholarship Reference Group (SRG) comprised representatives drawn from the sector, including members of the New Zealand Education and Scholarship Trust (NZEST), which ran a private and independent scholarship examination from 1992 to 2003. The NZEST representatives withdrew from the SRG in November 2002 and the SRG did not meet again after that. This withdrawal removed the main voice of opposition to standards-based assessment and weakened the capacity of the consultative process to confront and resolve some critical differences.

38 At the conclusion of the consultation process, the Government was not provided by the Ministry with a stocktake of the policy settings, including an explicit analysis of the implications of those policy settings for the outcome of the 2004 Scholarship. Given the extended period over which the policy evolved, and with the benefit of hindsight, a stocktake of this nature was warranted, and might well have resulted in a further refinement of the policy settings along the lines approved in March 2005 by the Government for the 2005 Scholarship.

39 The absence of a stocktake meant that Ministers, the public, and students were unpleasantly surprised by the results of the 2004 Scholarship. The surprise related both to the relatively few scholarships awarded (leading to a decision to subsequently create a new category of awards known as Distinction Awards) and to the extent of variability between subjects that was regarded as acceptable.

40 It was clear that in 2004 the variability of results would be more transparent than ever before, because the results of standards-based assessment are not scaled to achieve comparability across subjects and across years. This was a very new way of assessing and reporting results, and NZQA and the Ministry should have given far greater attention to preparing the education community and the public to understand it.

41 Variation in the results of subjects from year to year has a number of causes, only one of which is the annual change to examination questions. Other causes lie, for instance, in the curriculum, how it is interpreted, and whether it is relevant; the standard of teaching (across the system and also at a particular school); candidate selection; and whether some parts of the curriculum receive attention while others do not. The results of the new 2004 Scholarship could not, of course, be compared with results from previous years because the standard was new.

42 Variation is also affected by the success of moderation employed before marking (such as exemplars and teacher professional development), which aims to establish a common understanding among teachers, examiners and markers; and by moderation during marking, when schedules and marking practices are adjusted to modify unexpected outcomes. The unknown composition of the candidature for the 2004 Scholarship also contributed to variation and is discussed below in paragraph 92. There are also other seemingly minor but nevertheless influential factors, such as whether papers are timetabled in close succession for some students while other students have time between papers to refresh and do further preparation.

43 The combination of these factors may outweigh the variation caused by changes in questions from year to year, although the latter is normally the most prominent source of variation.

44 The public, however, has a reasonable expectation that the assessment system will deliver fairness. Most people interviewed thought it unreasonable that students should be disadvantaged by matters beyond their control, such as inconsistency in preparing students, extra-hard questions, an inadequate curriculum, or an overly compressed examination timetable.



45 The Government was not provided with adequate advice on the strategic risks, including variability, associated with implementing high stakes Level 3 NCEA and the new Scholarship in the same year, 2004. There was no appreciation of the need to differentiate between Level 3 NCEA and Scholarship. The focus was on detailed operational issues common to both, rather than on the high-level strategic risks around the implementation of Scholarship.

46 The outcome of that attention was that the business processes developed to implement Level 3 NCEA and Scholarship in 2004 were comprehensive and efficient. They adequately addressed the detailed operational risks associated with managing external assessment for that cohort of students.

47 As it turned out, however, the strategic implementation risks were significant. For example:

• the registered standards were not available until December 2003, even though the Ministry made ‘final form’ standards available from mid-2003, effectively delaying the production of exemplars
• for NZQA, the effective implementation of Scholarship commenced six months into the normal examination cycle, resulting in very compressed timeframes
• only one exemplar per subject was produced
• new subjects were added to Scholarship without any previous history of examination
• there was no dedicated professional development for teachers of Scholarship, and
• the first year of significant change is more difficult for teachers and students than succeeding years, even with exhaustive preparation (which was not possible in 2004).

48 Teachers and students therefore could not possibly, or uniformly, have been as well prepared for the 2004 Scholarship as was desirable, and this lack of preparedness was not adequately taken into account by officials. As a result, the outcome was unfair on some students, who did not perform as well as they might otherwise have done.

1.3 Whether the processes to set and moderate the standards for the qualification were adequate

Coordination of roles

49 As discussed under TOR 1.1, the Ministry was responsible for setting the standards for the 2004 Scholarship. NZQA participated in the development of the standards and contributed expertise and judgement from an assessment perspective. Once the standards were written and signed off by the Ministry, NZQA played the lead role in registering the standards, which involved further checks on their clarity and assessability. This reflected firm demarcations between the two agencies, separating the role of standards-setting from the role of assessing students against those standards.

50 Separating these two roles is common for many standards-based qualifications. Professions, technologies and industries typically set their standards, and appropriate assessment instruments are subsequently devised. This division of roles is not so clear for a traditional ranking examination for school-aged students, because the assessment instruments (the examination questions and marking schedule) are, to a degree, inseparable from the standards. The standards are embedded in the examination. This holds true even for the new 2004 Scholarship, which aimed to set an explicit, stand-alone standard for each subject.

51 Setting standards for school students, based on national curriculum statements, which are deliberately developmental in nature, presents a different challenge from setting standards in industry, the professions and most tertiary courses. School students are developing and growing, and the standards must acknowledge a wide base of participating talent and developing achievement. They require the interpretation of experienced teachers.

52 The Ministry and NZQA recognised this complexity, and did not ignore the links between curriculum and assessment. The standards for the 2004 Scholarship subjects, the subsequent examination papers, the marking schedules and the marking itself were in the hands of a reasonably tight core of people in the secondary and tertiary education community. These dedicated people, on whom the assessment system relies, were frequently past examiners and markers of the University Bursary or NZEST examinations. A number of them served on the Ministry’s standards-setting panels and brought their experience and knowledge to the task.

53 They did, however, need firm guidance and direction. A degree of inflexibility in the roles of the two agencies seems to have impeded the coordination and resolution of issues needed to implement a new examination with new standards, and a new way of assessing and ranking. This inflexibility manifested itself most obviously at the working level; it is likely rooted in history and is difficult to untangle. While differences of view amongst professionals are to be expected, the lesson to be drawn is that the issues debated in both organisations need to be referred to senior managers for resolution rather than languishing at lower levels. The leaders of the respective organisations have not been fully successful in ensuring that substantive issues are escalated for resolution, and this requires urgent and sustained attention.

54 The National Assessment Facilitators in NZQA should have been fully involved in setting the 2004 Scholarship standards instead of being at the margins, if only to reduce the time it later took to refine the standards and make the examinations operationally possible. More realistic knowledge of the length and demands of assessment operations would have informed the content of the standards, kept them workable for a one-off competitive assessment, and kept the standards-setting on target so that support material could be generated in time for the school year. Equally, the Ministry should have been kept informed of, and involved in, the assessment details instead of being kept at a distance by NZQA.

55 One point where the relationship broke down was professional development. It seems that at the time the Scholarship examination standards were being written, the writers developed early exemplars. The Ministry felt the quality of these exemplars was uneven, but they were a resource and were given to NZQA. Further, the Ministry felt that the Jumbo days offered the chance to stage-manage some of this exploratory and developmental material. However, NZQA was reluctant to use the material because it wished to have accurate assessment exemplars based on firmly agreed subject standards, knowing that teachers rely closely on past experiences and examples when preparing students for competitive examinations. NZQA subsequently contracted the development of new exemplars. The two agencies did not seem able to resolve their differences. The outcome was that the booklets for the professional development Jumbo days contained little material to help embed the Scholarship examination standards.



56 Most important of all, better coordination between the Ministry and NZQA would have surfaced, and brought to resolution, the different understandings and interpretations, held by the various parties, of how standards-based assessment identifies top students. As it was, these differences continued unresolved throughout implementation. What should have been a seamless and integrated implementation process became fragmented, and, as we have noted under TOR 1.1, teachers and interested parties were uncertain which agency they should be dealing with.

57 The fact that the Ministry did not hand over to NZQA a clear, documented policy specification for the 2004 Scholarship meant that there was subsequently no benchmark for monitoring implementation. The subject standards were delivered to NZQA without a shared understanding of what they really represented and how they would be applied in a high stakes, competitive examination. Equally, NZQA did not seek to clarify uncertainties, or challenge decisions that committed it to unrealistic timelines.

Setting the standards

58 The standards for the 2004 Scholarship in each subject were based on a broad, generic descriptor established for all subjects by the Ministry’s Scholarship Working Group in 2002 4:

• Learners who have been awarded Scholarship in a subject will have demonstrated:
- High level critical thinking, abstraction and generalisation; and
- The ability to integrate, synthesise and apply knowledge, skills, understanding and ideas to complex situations.
• Depending on the learning area, a range of the following will have also been displayed:
- comprehensive content knowledge (breadth & depth)
- effective communication
- original or sophisticated solutions, performances or approaches
- critical evaluation
- flexible thinking in unfamiliar/unexpected contexts.

Assessment is restricted to the content of the Level 3 achievement standards for that subject, derived from Level 8 of the New Zealand Curriculum or their equivalent, but the skills and understanding required will meet the Scholarship criteria.

59 In 2003 this generic descriptor guided the panels which set the specific standards for each 2004 Scholarship subject. Notes were added to each subject standard, clarifying the scope of application of the standard within the subject, but these notes and guidance varied from subject to subject. For instance, Accounting provided some detail on the body of law, regulations and conventions the standard covered. This set a boundary for the Accounting examination standard. On the other hand, Biology had less specific guidance for its standard. In part, the difference between the two subjects reflects their nature – Biology is a more open-ended discipline than Accounting. But these differences possibly contributed to the uneven performance of students in the new 2004 Scholarship.

4 It was confirmed by the 2005 Scholarship Reference Group and agreed by Cabinet in March 2005, when it was renamed, more accurately, an ‘Outcome Statement’.

60 The different boundaries for subjects may also have posed particular issues for the sciences. Whereas recent University Bursary examinations emphasised the calculation of correct answers, the new 2004 Scholarship additionally asked for commentary on and explanations of those answers.

61 This experience highlighted two related problems: the level at which the standard was pitched, and how realistic that level was. Setting an examination paper requires clarity about what is expected of students at a certain age (or a certain level of learning). The issue is whether the standards address a level that the best students can realistically reach, or whether they are aspirational and aim at a level that the ideal student ought to reach. It is not clear which approach the chief examiners and markers put into practice in the 2004 Scholarship, or whether each subject had the same notion of the standard.

62 With the benefit of hindsight, the Ministry and NZQA should have confirmed a common understanding of what was involved in the new Scholarship examination. NZQA staff should then have been more flexible in their application of standards-based assessment. Less reliance should have been placed on the subject standard, and more guidelines provided to the examiners and markers to assist them to interpret the standard.

63 The Review Team supports the recommendation of the Scholarship Reference Group report that student performance needs to be assessed against an agreed assessment schedule that allows markers to discriminate between students.

64 The preparation for developing the standards for the new 2004 Scholarship required special attention. The addition of new subjects with no supporting case history, wider curricula than before, and a new approach to high stakes assessment all contributed to implementation risk.

65 The standards for the individual subjects were not registered until December 2003, and the assessment exemplars were consequently delayed until the next year, 2004, when teaching and the preparation of students were already underway. This squeezed the implementation as far as teachers and students were concerned.

66 It is noticeable from the minutes of the Joint Overview Group (JOG) of the Ministry and NZQA that NZQA was not comfortable with the logistics of running four parallel sets of examinations. While the minutes record debate, no formal advice was forwarded to the Minister.

67 The 2004 Scholarship was one of four cycles of external assessment (along with internal assessment in the three NCEA levels), amounting to more than 150,000 candidates and 1.9 million external assessment booklets. This was a huge undertaking, and the fact that the great majority of it went well deserves praise.

Pre-Moderation (Understanding the Standards)

68 There are two stages of moderation. Pre-moderation occurs when the teaching community works collectively to understand the standard and achieve a common interpretation (a process sometimes referred to as the socialisation of assessors). It happens through a range of activities, in particular formal professional development occasions where teachers exchange assessments, and informal accessing of websites or item banks for common assessment tasks and marked examples.



69 Pre-moderation contrasts with post-moderation, where the marking of assessments generates common understanding, and adjustments are made to marking on the basis of that shared understanding. Standards-based assessment places a lot of emphasis on pre-moderation, whereas norm-referenced assessment places more emphasis on post-moderation (see paragraphs 99 to 101). In the event, pre-moderation activities for the 2004 Scholarship proved insufficient.

70 The generic descriptor and the standard for each subject were designed to avoid overloading the teaching of NCEA with new content. They were also designed to fit with the model of standards-based assessment. However, both the generic descriptor and the subject standards were very general and provided little prescription or definition for teachers and students.

71 Traditional examination systems normally revolve around the prescription for each subject, a document which sets out which elements of the curriculum (the broader body of knowledge) will be assessed and broadly how that assessment will occur. The prescription is a key document for examinations because teachers teach to it and students learn from it – which is one reason why the educational benefit of examinations is questioned. In the case of the new 2004 Scholarship, however, teachers and students had little prescription, and the interpretation of the subject standard was more open and tentative than before. They had little to pin down the standards and to prepare for a competitive examination.

72 This open interpretation may be more faithful to learning than preparing students for an examination event, but it is not realistic for a competitive examination in which substantial money is involved and where the reputation of some schools is felt to be at stake. It departed too far from past practice to be understood and accepted by teachers without very careful preparation and communication.

73 In the absence of an examination prescription, teachers and students were even more dependent on the examination exemplars. However, these were few in number (one for each subject), and they were completed and posted on the website late in the examination cycle because the subject standards were not completed until late in 2003. Some website exemplars were posted in 2003, but others were not posted until 2004, when the teaching year had begun. The caution was understandable to the extent that, had exemplars preceded the standards, NZQA would have repeated the implementation mistake of Level 1 NCEA, where teachers were irate that the early exemplars differed from the external assessments finally set. The 2004 Scholarship involved higher stakes. Unfortunately, this situation left little time for teachers and students.

Professional Development

74 Professional development is an important part of pre-moderation, particularly for standards-based assessment, and the Ministry invested substantially in running several Jumbo days. However, everyone agrees that the professional development in the Jumbo days focused on the NCEA and the first year of Level 3. The aim was to build a common understanding of the internally assessed standards. There was no focus on externally assessed standards, and the 2004 Scholarship was an external assessment.

75 In most situations, this focus on NCEA for professional development would be rational. NCEA is the main qualification for the great majority of senior secondary students, and this was the first year of Level 3. In comparison, the 2004 Scholarship was a one-off competitive examination for a small minority of students. But the reality was that the Scholarship had iconic status for some of the education community because it challenged the most gifted students, attracted large monetary rewards, and was a high priority for many elite schools which jostle to be winners. The education community collectively took its eye off the ball.

76 The Review Team stresses ‘collectively’ because the responsibility for professional development is a shared one. The schools and some of their representative bodies have much of the management control. Although Colleges of Education are funded to provide professional development, it seems that few approaches were made by schools to the Colleges to request 2004 Scholarship training. This is understandable, as schools could reasonably expect to receive advice from the Ministry, either directly or through sector forums, on the availability of professional development. In spite of this, it appears from the Scholarship results that some schools were able to prepare their students well from the material provided.

77 The assessment exemplars posted were marking schedules without marked (mock or real) student work. NZQA acknowledges that this was a thin resource, but felt that demands for more could not be met from its limited physical resources within the available timeframe. The exemplars would also have to be developed by the same small group of overburdened chief examiners and markers who were setting the examinations for the first time. No item bank or previous examples could be called upon. Unfortunately, neither the Ministry nor NZQA formally drew attention to the implementation risk this posed.

78 As noted in TOR 1.1, ownership at management level of professional development for the 2004 Scholarship was unclear. Whilst the Ministry has formal responsibility for professional development, this work in the context of a competitive examination is unlike development for the standards-based NCEA: it focuses on the examination itself, not the curriculum or teaching. Standards are set in practice, not by documentation, and therefore NZQA should have been deeply involved in the professional development for Scholarship. The agencies understood this, but it was not translated into action.

Monetary Awards

79 The policy decision to attach substantial monetary rewards to performance in the 2004 Scholarship was finalised in a Cabinet paper in late 2003, with the Minister announcing the awards in December 2003. The introduction of monetary awards raised the stakes for the 2004 Scholarship and may have prompted some candidates to enter without a realistic chance of achieving Scholarship. The Cabinet paper included, for costing purposes, assumptions about the number of Scholarships, but the policy parameters meant that NZQA felt unable to give guidance about the desired success rates.

1.4 Whether the administration of the examinations against those standards, including review of the processes for setting, moderating, and marking the examination, was adequate

Paper Setting

80 To write the papers, NZQA advertised for Chief Examiners and used established practices to ensure the papers were considered by external experts. Because the 2004 Scholarship was new, two assistants were employed to provide assessment materials to the Chief Examiner. Otherwise the processes followed the well-oiled precedents of the University Bursary and School Certificate examinations.



81 An independent checker (sometimes two) sat the examination as a candidate and provided feedback on timing, language and instructions. The National Assessment Facilitators (NAF) in NZQA considered the checkers’ comments and provided their own feedback. The NZQA editors also checked the papers for examination layout, language and terminology. Pre-testing of some elements of the examination questions was used in limited circumstances. There were, on average, three further editorial checks, one further NAF check, and two further examiner checks before final sign-off. This was a thorough process.

82 Past resources were not particularly useful for paper setting because the 2004 Scholarship was a new, hybrid form of assessment. However, many of the Chief Examiners were University Bursary examiners and their broad experience came with them. In some cases, such as English, the NZEST examinations proved useful. In other cases, such as the sciences, they did not. For example, NZEST did not have a requirement to provide written explanations of abstract, underlying concepts.

83 There may have been some influence from the similarity of Level 3 NCEA and the 2004 Scholarship. For the four Level 3 NCEA externally assessed English standards, students had to write three essays and ten short answers. For Scholarship, they had to write three essays. The Scholarship essays were framed around very similar questions and identical source material – the same literature as studied for Level 3. Overall, the Scholarship paper appeared similar to the Level 3 paper, and students were expected to answer very similar starter quotes in similar ways. This would have provided some comfort and familiarity.

84 However, the four Level 3 externally assessed Biology standards required students to write 39 reasonably short answers in response to 12 questions. For Scholarship, they had to write two essays and three short answers to three questions. The Scholarship answers had to respond to six pages of unseen resources, each resource being two solid pages of text and diagrams. For Level 3, the unseen resources were in bite-sized portions, most about a third of a page in length. This brief analysis suggests that Scholarship was an unexpected challenge for Biology students and their teachers.

Marking and Moderation (of marking and marks)

85 Panel Leaders and Markers were recruited from both the tertiary and secondary sectors. Most subjects had one Panel Leader and one Panel Marker. Those with larger numbers of candidate entries had up to three additional markers appointed. Seventeen of the 52 Panel Leaders and Panel Markers were from the tertiary sector; the rest were from the secondary sector. Every panel had tertiary input through markers, material developers, material critiquers or independent checkers.

86 Little guidance was given, however, on how to conduct and rank the assessments, and each subject took its own approach. There were variations in the process by which Scholarships and Outstanding Scholarships were awarded which may have exceeded the natural differences between subjects. For example, English had three essay questions. The marking panel applied a minimum yardstick of two answers at the Scholarship standard plus one Not Achieved to award an overall Scholarship. By contrast, Media Studies, which also had three essay questions, required the Scholarship standard in all three questions before it gave the overall Scholarship award. This can be explained as acceptable subject variation, but in our view a more standardised approach would have improved professional and public confidence.

87 Standard-setting meetings using a sample of initial scripts were held. The papers were trial marked to see ‘what the candidates had done to the paper’ and to detect any rogue or ambiguous questions not detected earlier in the paper-setting exercise. The draft marking schedule was then adjusted in light of what the markers found and debated. This is standard practice and it was thoroughly conducted. However, the fact remains that the unusual results that appeared, showing wide variations between subjects, did not set off an alarm in NZQA.

88 The explanation for this seems to be that NZQA was aware that there would be variation, but was not aware that the nature of this variation would be a surprise to nearly everyone else. The 2004 approach can be contrasted with NZQA’s previous experience with unscaled School Certificate, where marks from the first 10% of papers were entered on to a database and the potential mean found. A decision was then made whether the aggregate results were within ‘professional public tolerances’. Intervention took place from time to time in the subjects with significant numbers of entries, though some with a small number of entries would have been left alone.

89 As already mentioned, the business processes that NZQA employed were highly efficient. These were driven by very intense pressures to deliver the results as rapidly as possible. The Examination Centre Managers coordinated the collection of the candidates’ scripts and forwarded them to regional hubs, from where they were sent to markers. Markers then sent in results and returned the scripts to candidates via a hub. The drive and tempo behind these processes meant that an analytical pause, a step back to interpret the wider impact and meaning of the results, did not occur. In the end, the overall picture from all the subjects was not apparent until results were in the pipeline for processing and mailing to candidates. By then it was too late.

90 Major time pressure at two points is difficult for NZQA to manage. On the one hand, the schools exert pressure for the examinations to happen as late as possible in the school year, to maximise teaching time and the discipline that impending assessment applies to the students. On the other hand, schools, tertiary institutions, parents and the public want results to be out as soon as possible. Added to this is the operational ‘given’ that marking has to be over by Christmas, because markers are less reliably available afterwards.

91 This matter will have to be tightly managed in 2005, because the Scholarship examinations are scheduled to be split into two halves, some before the NCEA assessments and some after. Security arrangements will need to be increased for papers to remain secure in the Examination Centres for a month; alternatively, the papers will have to be sent out twice over that period. In addition, the period of time between the last examination and Christmas is shortened by a minimum of four days. NZQA is exploring an arrangement whereby the Examination Centre Managers send the scripts directly to markers rather than to the regional hubs for distribution. This makes good business sense, but a note of caution is that business processes may take precedence over sound and measured assessment practices. The Review Team emphasises that more time, not less, needs to be available to NZQA to manage its responsibilities and to ensure fair results.

Patterns of Entry<br />

92 The numbers and characteristics of the candidates were unknown for the new 2004 Scholarship. Publicity stressed the elite nature of the examination, but the combination of monetary awards and entry requirements that did not cost students any more than the NCEA fees may have contributed to higher than expected numbers of clearly unsuitable candidates in some subjects. In other subjects there was a contrasting pattern of low percentages of entries from the eligible cohort. The reasonably high correlation that would be expected between ‘Excellence’ NCEA results and 2004 Scholarship performance did not occur uniformly. There is also some evidence that not all the best students entered the 2004 Scholarship, whereas students who in the past entered the University Bursary examinations were automatically eligible for scholarship grades (higher than an ‘A’ grade) and possible monetary awards which were given to top category students.5 (Note that the NZEST scholarship examination remained separate and independent and was not government funded – see paragraph 37.) This pattern of elective entry further unbalanced the results.

1.5 Whether communication and guidance to teachers and students concerning New Zealand Scholarship were adequate

93 The Review Team specifically invited comment from teachers and from students on their views about how well they felt prepared for the 2004 Scholarship (see Review Process and Methodology). The universal response was that they did not feel well prepared.

94 For teachers this was caused by a combination of the deficiencies in professional development and the compressed timeframes already discussed. The Post Primary Teachers’ Association, in a submission to the Review Team, stated, “there appears to have been a general unawareness in the profession of what the Scholarship exams would require.”

95 A summary of comments from students highlights the following points:

• The main focus in 2004 was on preparation for the introduction of Level 3 NCEA.

• The difference between the old scholarship and the new was not clear to parents, students and the public, e.g. the requirement to write well, especially in the sciences.

• The information teachers received about Scholarship was patchy due to the lack of professional development, early problems for some schools in accessing the exemplars on the website, not receiving what little material was available, or not being sure how to interpret what they did receive.

• The students were dependent on their teachers for guidance on whether or not to enter Scholarship. This led to some top students not entering and some students who were less capable entering, spurred by the motivation of a monetary award, resulting in a variable quality of entrants. This contrasted with previous practice where scholarship was awarded on the basis of University Bursary examination performance. In other words, entry was automatic.

• The basis on which results were graded as Outstanding, Scholarship or Not Achieved was not understood. Students wanted more transparency. This was compounded by students receiving their exam scripts back with “unintelligible marking” on them.

96 Despite the above, many students commented that the level of the Scholarship exams was about right, even those who did not achieve a Scholarship. There were some outliers, like Calculus, which students found very difficult. These comments are consistent with the Review Team’s findings as discussed in TOR 1.1 - 1.4.

5 Up until 2003 the term ‘Scholarship’ was used to denote both subject grades and monetary awards.

97 The way any implementation agency approaches communication about change is critical to building confidence with key stakeholders. In terms of communicating this significant change to the sector, as one stakeholder put it, “you need to understand what it is you are trying to do. You need to agree key messages, which are simple and straightforward, and you need to keep saying them.” The Review Team agrees.

1.6 Any other matter which the Review Team considers relevant to the foregoing questions

98 There are three matters that the Review Team believes are worth commenting on briefly. They need to be seen in the spirit of moving forward and implementing the 2005 Scholarship successfully. They are:

• Continuous improvement – implementation needs to be a process whereby the implementation team makes constant adjustments to the way it does things, to ensure a successful outcome. Any rigid adherence to an ideal (if this did indeed occur) will get in the way of a successful outcome.

• Feedback loops – sector representatives who participate in the Leaders’ Forum need to be kept informed of policy and implementation decisions and issues in a continuous way. The Ministry and NZQA need to dedicate resource specifically to keeping the sector and its key stakeholders consistently well informed as well as involved in its consultative forums.

• Building sector capacity in external assessment – the Scholarship and the wider NCEA call constantly on the same small pool of experienced teachers in the tertiary and secondary sectors. The Review Team believes that there is an urgent need to build up this capacity and a related need to review the conditions under which these teachers are engaged, to facilitate development.

99 A final word on the debate about assessment. Norm-referenced assessment is at one end of the assessment spectrum and standards-based (or criterion-referenced) assessment lies at the other. Norm-referencing usually applies norms at the stage of assessment results and reporting, using an inherent understanding of standards to set the mean scores appropriate for an age group or cohort. Standards-based approaches tend to apply norms at an earlier, pre-assessment stage, e.g. when setting students’ achievement standards.

100 These are often seen as opposing positions, though in fact it is more constructive to see them as complementary, not exclusive, and differing largely at the point where the norms are applied. Norms and standards in fact apply continuously throughout the whole education and assessment process. When teachers use exemplars they apply both norms and standards. So too do markers when they adjust schedules after assessing an initial selection of students’ scripts. There has been an international movement to focus more on standards-based assessment because of the balance of advantage it offers to teaching and learning. The intention of standards-based assessment is that teachers have better knowledge of a student’s success or failure before it is too late to do anything about it, students have improved knowledge of their targets, teachers improve their professional assessment skills, and deficiencies in the wider educational system are known.

101 While both assessment approaches belong to the same spectrum, the reality is that the new 2004 Scholarship and NCEA were introduced in a climate of polarisation. This has affected the way in which the sector and the public perceive this major transition in secondary school education. These two approaches need not and should not be opposing, and it is incumbent on both agencies, but notably the Ministry, to provide leadership in ensuring that the new approach to teaching, learning and assessment is implemented in a way that works for New Zealanders. For both agencies, and particularly NZQA, there needs to be a strong internal culture which welcomes different opinions about assessment so that robust policies and practices are implemented.

Recommendations<br />

The Review Team recommends that:<br />

In respect of 1.1:<br />

1 The Ministry takes the lead in ensuring that all the activities (involving government agencies as well as sector consultation groups) associated with the 2005 Scholarship are properly coordinated and deliver a consistent approach.

2 The NZQA Board receives clear specifications and expectations relating to Cabinet’s decisions on the Scholarship Reference Group report [CAB Min (05) 11/2], in a letter from the Associate Minister of Education.

3 The NZQA Board engages with the Minister and the Ministry on any concerns or lack of clarity related to leadership of implementation of the 2005 Scholarship.

4 NZQA develops a comprehensive implementation plan for the 2005 Scholarship and beyond in response to the letter from the Associate Minister of Education. The plan should be approved by the Associate Minister. This plan should cover, but not be limited to:

• resources required

• timelines and milestones

• an analysis of risks and risk mitigation strategies

• IT integration matters

• a professional development strategy including exemplar development

• a single communication strategy (see recommendations 17 and 20)

• progress reporting including exception reporting for milestones not met

• plans for establishing the two independent external groups recommended by the SRG and approved by Cabinet on 29 March 2005.

5 NZQA sets up as a matter of urgency a dedicated implementation team for the 2005 Scholarship, with representation from the Ministry of Education, which has strong feedback loops with the sector and which is responsive to concerns raised.

6 Operational policy matters and risks relating to the 2005 Scholarship and beyond be clearly identified by NZQA in consultation with the Ministry and conveyed to the Associate Minister of Education. NZQA, in consultation with the Ministry, should lead a process for resolution of these matters. This should include the timing of the examinations and release of the results (see recommendation 16).

In respect of 1.2:<br />

7 The Ministry of Education provides the Associate Minister with an analysis of the risks associated with the revised policy settings for the 2005 Scholarship and strategies to mitigate those risks, prior to the Associate Minister writing to the NZQA Board to communicate the government’s expectations around the implementation of the 2005 Scholarship.

8 NZQA and the Ministry of Education develop a programme to build capability in the secondary sector to widen the already stretched pool of examiners and markers.

In respect of 1.3:<br />

9 NZQA establishes a small team of National Assessment Facilitators (NAFs) specifically for the 2005 and 2006 Scholarships, or ensures that NAFs have an explicit component of their work dedicated to the Scholarships.

10 The Ministry of Education involves NZQA and sector expertise actively in the continued revision of the Scholarship standards.

11 NZQA, with input from the Ministry of Education, revises the 2005 papers to ensure the questions can adequately distinguish a range of student performance, and establishes marking schemes that allow transparent differentiation of achievement and transparent ranking.

12 NZQA, with input from the Ministry of Education, establishes conventions that provide consistent guidance to marking panels of different subjects on how to award Scholarships.

13 NZQA ensures the development of multiple exemplar resources for teachers to use in the 2005 school year, including some with marked students’ scripts.6

14 The Ministry of Education conducts two professional development days in 2005 and two in 2006, tailored specifically to the needs of Scholarship.

In respect of 1.4:<br />

15 The implications of the boundaries of tolerance set for each subject in the 2005 Scholarship, as agreed by Cabinet following the SRG report,7 should be clearly communicated to Ministers.

16 NZQA provides urgent advice to the Associate Minister of Education about the potential for an earlier start to the examinations and a later release of results than occurred for the 2004 Scholarship. Timing and resources should allow for the start and completion of script marking to reasonably identify issues and, if necessary, for expert advice to be obtained and consultation with Ministers to occur. A delay should be possible in the release of results and the return of examination scripts to students until issues are satisfactorily resolved.

17 A concerted information campaign, as part of the communications strategy, be conducted in 2005 to set realistic professional and public expectations for the release of results and to provide reassurance that the issues that emerged in 2004 have been adequately understood and addressed.

18 NZQA ensures a conservative, risk-averse approach to the 2005 Scholarship, so that people and systems are prepared for the outcomes and can implement appropriate management strategies.

6 It may be possible to use the guinea pig scripts that were retained from 2004.

7 Cabinet has agreed to a range of 2-3% for each subject, with an annual variation of +/- 1%. For small subjects the variation can extend to +/- 1-5 candidates.

19 An examination “hot-line” between agencies and the office of the Associate Minister of Education be maintained over the Christmas holiday period.

In respect of 1.5:<br />

20 NZQA, with the support of the Ministry of Education, develops a communication strategy based on a shared understanding of the implementation. Key messages need to be simple, direct and clear, and ensure effective and consistent communication to multiple stakeholders.

In respect of 1.6:<br />

21 The Ministry of Education and NZQA together encourage an active internal and external debate on assessment that engages and earns the respect of the education community. This debate should encompass the full spectrum of assessment expertise and well-researched international models.

Annex 1

New Zealand Qualifications Authority

Terms of Reference for Review by the State Services Commissioner

Introduction<br />

The New Zealand Qualifications Authority ("NZQA") is a Crown entity established under Part XX of the Education Act 1989 "to establish a consistent approach to the recognition of qualifications in academic and vocational areas" (s 247).

NZQA's functions include the administration of the National Certificate of Educational Achievement (NCEA) and other school, trade and vocational assessment. In 2004, NZQA instituted a new Level 4 qualification, New Zealand Scholarship, to be the basis for the allocation of university scholarship awards.

Background to Review<br />

New Zealand Scholarship was introduced at the same time as NCEA Level 3. The new Level 4 stand-alone qualification was designed to enable the award of scholarships on the basis of examination. Previously, scholarships had been awarded to the top 3 or 4 percent in University Bursary examinations.

Students sat the New Zealand Scholarship examinations in November 2004; results were mailed to them in January 2005 and later made available on the NZQA website. The results showed a much smaller number of scholarships awarded than expected and widely differing achievement rates in different subject areas. The results gave rise to public concern about the administration of the new qualification, and its fairness to otherwise high achieving students.

The Associate Minister of Education has asked the State Services Commissioner to review the adequacy of the setting and management of the 2004 Scholarship examinations and, more widely, the performance of NZQA. The Minister's request is made under section 11 of the State Sector Act 1988.

Terms of reference<br />

Scholarship examinations<br />

1 To review the 2004 New Zealand Scholarship examinations with particular regard to:

1.1 whether the roles and responsibilities of the various agencies involved were clearly defined;

1.2 whether the identification and addressing of issues arising in designing and implementing the new system was adequate;

1.3 whether the processes to set and moderate the standards for the qualification were adequate;

1.4 whether the administration of the examinations against those standards, including review of the processes for setting, moderating, and marking the examination, was adequate;

1.5 whether communication and guidance to teachers and students concerning New Zealand Scholarship were adequate;

1.6 any other matter which the reviewer considers relevant to the foregoing questions; and

2 To make recommendations concerning these matters.

NZQA’s Performance<br />

3 Given the purpose and role of the NZQA to provide quality assured qualifications, to review how well the NZQA is undertaking its role in respect of the school sector qualifications, with particular attention to:

3.1 the governance, monitoring and reporting of NZQA performance;

3.2 the adequacy of NZQA capability, including planning, systems and processes to undertake its role;

3.3 the nature and extent of the involvement of, and responsiveness to, key stakeholders in the decision-making of the NZQA;

3.4 whether and if so how the NZQA is providing professional leadership in examination processes;

3.5 any other matter which the reviewer considers relevant to the foregoing question.

4 To make recommendations concerning these matters.

Reviewer<br />

The State Services Commissioner will appoint a suitable person or persons to carry out the review and to report to him, pursuant to section 25 of the State Sector Act.

Under section 25 of the State Sector Act, the Commissioner and persons he appoints under section 25(2) of the State Sector Act have the same powers to summon witnesses and to receive evidence as are conferred on a Commission of Inquiry by the Commissions of Inquiry Act 1908.

Timing<br />

The reviewer will report and make recommendations on:

• the questions concerning the introduction of the New Zealand Scholarship qualification by 29 April 2005, and

• the performance of the NZQA by 31 July 2005.

The two parts of the review have separate reporting dates to enable any findings and recommendations concerning the New Zealand Scholarship examinations to be taken into account, as far as will be possible, in the administration of the 2005 examinations.

VA (Tony) Hartevelt

Deputy State Services Commissioner

Annex 2<br />

Organisations consulted during the course of this Review<br />

Ministry of Education

New Zealand Qualifications Authority

Education Review Office

Independent Schools of New Zealand (Tim Oughton)

Institutes of Technology and Polytechnics of New Zealand (Jim Doyle)

New Zealand Catholic Education Office (Pat Lynch)

New Zealand School Trustees’ Association (Ray Newport)

New Zealand Vice Chancellors’ Committee (Professor Roger Field)

Post Primary Teachers’ Association (Debbie Te Whaiti, Kate Gainsford)

Post Primary Teachers’ Association Principals’ Council (Don McLeod)

School Principals’ Association of New Zealand (Graham Young)

A selection of Secondary School Principals

Chief Examiners

University Assessment Academics

Auckland University

Manukau Institute of Technology
