Report to the Minister of Education,<br />
Ms GNM Pandor, MP,<br />
on The Senior Certificate Examination<br />
2005
Contents<br />
Page<br />
Foreword 3<br />
Chapter 1: Overview of the 2005 Senior Certificate Examination Report 5<br />
Chapter 2: The moderation of question papers 11<br />
Chapter 3: The moderation of Continuous Assessment (CASS) 21<br />
Chapter 4: The monitoring of the Conduct of the Senior Certificate Examination 42<br />
Chapter 5: The moderation of marking 60<br />
Chapter 6: The standardisation of 2005 Senior Certificate results 73<br />
Chapter 7: Conclusion 78<br />
Foreword<br />
<strong>Umalusi</strong> has approved the release of all the results of this year’s Senior<br />
Certificate assessments as well as the assessments of the General<br />
Education and Training Certificate for adults and vocational education<br />
and training. The Minister of Education will be informed of this decision.<br />
At its meeting this morning, the executive committee of <strong>Umalusi</strong>’s Council concluded, from reports submitted by its own moderators and monitors, as well as those of the national and provincial departments of education and the two independent assessment bodies, that the examinations in all three sectors were conducted in line with policy and regulations and that the results were reliable, valid, fair and credible.<br />

The examination system is massive. There has been a significant increase in the number of candidates taking examinations across the three sectors. Assessment bodies administer this system very smoothly and it has reached an admirable level of maturity. There was open disclosure to <strong>Umalusi</strong> of various relatively minor irregularities, which were reported on a daily basis. <strong>Umalusi</strong> was vigilant in following up these wrongdoings and can report that assessment bodies have, to the best of our knowledge, handled them efficiently and in line with policy.<br />
<strong>Umalusi</strong> is the quality assuror in the general and further education and training bands of the national qualifications framework. It applies various quality assurance methods such as:<br />
• moderation of question papers;<br />
• monitoring of the conduct of examinations;<br />
• moderation of marking of scripts (centralised and on-site);<br />
• moderation of continuous assessment (CASS); and<br />
• standardisation of the marks in accordance with agreed statistical and educational principles.<br />

This has been a year of review and improvement of <strong>Umalusi</strong>’s quality assurance processes. A comprehensive evaluation of all assessment bodies was conducted. Together with the results of the research into the standard of Senior Certificate examinations and the comparability of examinations in vocational education and training with the Senior Certificate, <strong>Umalusi</strong> was able to interrogate, revise and improve all its quality assurance of assessment processes and procedures. The Council is also seriously engaged with the issue of standards in the examinations.<br />

This year, as a result of last year’s research, <strong>Umalusi</strong> paid particular attention to the cognitive challenge of question papers, resulting in a higher level of papers for 2005. The examination has become less predictable and this enhances its reliability and validity. However, the Council is still troubled by the time allocated for some examination papers where there is a greater emphasis on source-based and interpretive questions.<br />

2005 is a special year for the Senior Certificate Examination. This year’s cohort are the first to have completed their general education phase (i.e. up to Grade 9) through outcomes-based education (OBE). They then had to go back to the old curriculum system from Grade 10. Although difficulties could have been expected, the system appears to have coped satisfactorily, thus easing concerns for the next two years. Thereafter, the new National Senior Certificate is due to be written in 2008.<br />

The monitoring of examinations has also received particular attention from <strong>Umalusi</strong>. Three key stages of the examination were monitored: the state of readiness of assessment bodies, the conduct of the examination, and the results phase. Measures have been put in place to ensure assessment bodies have the required systems to monitor examinations.<br />

Furthermore, the quality assurance of the marking of Senior Certificate papers has been greatly improved through the implementation of centralised moderation of marking for the six national subjects: Accounting, Mathematics, Physical Science, Biology, History and English Second Language. Assessment bodies were required to send a sample of scripts to the Council. This sample was then moderated by the organisation’s external moderators and feedback was provided on a daily basis to the assessment bodies.<br />

<strong>Umalusi</strong> remains concerned about continuous assessment in all three sectors. In the forthcoming year the quality assurance processes for these assessments will receive particular attention. This also goes for the standard of examinations in practical subjects.<br />

<strong>Umalusi</strong> takes this opportunity to thank all its stakeholders for their co-operation and support in each of its quality assurance processes.<br />
John Pampallis, Chairperson<br />
21 December 2005<br />
Chapter 1<br />
Overview of the 2005 Senior Certificate Examination Report<br />
1. Introduction<br />
<strong>Umalusi</strong> reports on the standard of the Senior Certificate Examination to the Minister of Education on an annual basis. In this regard, <strong>Umalusi</strong> reports on each of the quality assurance of assessment processes and procedures which together ensure a credible Senior Certificate Examination. These processes ensure that all aspects of the examination are put through rigorous quality checks. This enhances confidence that the Examination meets the required standards. <strong>Umalusi</strong> has very carefully considered all the public concerns about standards in this examination. In this regard, <strong>Umalusi</strong> conducted research towards the end of 2004 to determine quality and standards in this examination. The findings of this research have been fed directly into the quality assurance processes used by <strong>Umalusi</strong>. The tools for moderation of question papers have been reviewed and sharpened as a result of this research. Other processes, like moderation of continuous assessment, moderation of marking, as well as the monitoring of the conduct of the Senior Certificate Examination, have all been strengthened by the findings of the research.<br />

<strong>Umalusi</strong> judges the quality and standard of the Senior Certificate Examination by determining the level of adherence to policy in implementing examination-related processes, the cognitive challenge of examination question papers, the appropriateness and weighting of content in question papers in relation to the syllabus, the quality of presentation of examination question papers, the efficiency and effectiveness of systems, processes and procedures for the monitoring of the conduct of the Senior Certificate Examination, the quality of marking, as well as the quality and standard of internal quality assurance processes within the assessment body.<br />

Chapter 1 of this report outlines the purpose of the report and its scope, and briefly discusses the quality assurance processes used by <strong>Umalusi</strong> to ensure that the Senior Certificate Examination meets the required standards. The second chapter reports on the findings of the moderation of question papers, including the standard of the question papers. Chapter 3 outlines the findings from the moderation of continuous assessment. The fourth chapter discusses the findings from <strong>Umalusi</strong>’s monitoring of the conduct of the Senior Certificate Examinations. Chapter 5 discusses in brief the details of the moderation of marking. The next chapter reports on the standardisation of Senior Certificate results, and the seventh and final chapter summarises the findings of the quality assurance of the 2005 Senior Certificate Examination and makes some recommendations for improvement.<br />
2. Purpose<br />

The purpose of this report is to present <strong>Umalusi</strong>’s quality assurance of the 2005 Senior Certificate Examination with respect to the following:<br />
• The salient findings from the external moderators’ reports, which are synthesised, analysed and used to make judgements on the standard of the Senior Certificate Examinations.<br />
• The quality and standard of continuous assessment across assessment bodies.<br />
• The quality and standard of marking the Senior Certificate Examination among assessment bodies.<br />
• The efficiency and effectiveness of processes for the conduct of the Senior Certificate Examinations within assessment bodies.<br />
• The moderation of marks during the standardisation process.<br />
• The recommendations for the improvement of assessment processes.<br />

3. Scope of the report<br />

This report covers all five quality assurance of assessment processes used by <strong>Umalusi</strong> to ensure that the Senior Certificate Examination is of the required standard; namely, moderation of question papers, moderation of continuous assessment, moderation of marking, monitoring the conduct of the Senior Certificate Examination, as well as the moderation of examination marks. The report covers each of these processes in a different chapter. Each chapter captures the salient findings with respect to each of the processes, highlights some problem areas and ends by offering recommendations for improvement. The report does not cover these areas in great detail but highlights salient observations on each of the quality assurance processes.<br />

4. Quality assurance of assessment processes used by <strong>Umalusi</strong><br />

A brief outline of <strong>Umalusi</strong>’s quality assurance of assessment processes and procedures used for the Senior Certificate Examination will help the reader to understand the rigour and extent of <strong>Umalusi</strong>’s quality assurance function. This section provides an overview of the processes that are used to ensure the quality of the Senior Certificate Examination.<br />
<strong>Umalusi</strong> is responsible for the quality assurance of the Senior Certificate.<br />
In keeping with its mandate, <strong>Umalusi</strong> focuses on the quality assurance of<br />
the external examinations and the school-based continuous assessment<br />
(CASS) which leads to the attainment of the Senior Certificate. The<br />
quality assurance activities undertaken by <strong>Umalusi</strong> include moderation of<br />
question papers, monitoring the conduct of the examinations, moderation<br />
of marking, standardisation of results, and verification and moderation of<br />
school-based continuous assessment.<br />
4.1 Moderation of question papers<br />

In order to accomplish this function, <strong>Umalusi</strong> utilises the services of external moderators who are highly qualified and experienced professionals in their respective subjects. The moderation process focuses on ensuring that question papers are of an acceptable standard, cover the appropriate content as prescribed in the syllabus, and are presented in a professional manner. Moderators are required to consider the following criteria:<br />
• Adherence to policy<br />
• Content coverage<br />
• Cognitive challenge<br />
• Technical criteria<br />
• Language usage<br />
• Quality and standard of internal moderation.<br />

4.2 Monitoring the conduct of examinations<br />

Over the last four years, <strong>Umalusi</strong> and its predecessor, SAFCERT, have engaged in rigorous and extensive monitoring of the Senior Certificate Examination. The monitoring focuses on three main aspects:<br />
• Auditing the assessment bodies’ monitoring systems.<br />
• Monitoring their state of readiness to administer the Senior Certificate Examination.<br />
• Monitoring the administration and conduct of the Senior Certificate Examination.<br />

With regard to monitoring the administration and conduct of the examination, <strong>Umalusi</strong> uses uniform criteria and an elaborate monitoring instrument. The criteria focus on the following:<br />
• Management of examination and marking centres;<br />
• Delivery of question papers and collection of scripts;<br />
• Invigilation and suitability of examination centres;<br />
• Security and storage of scripts;<br />
• Credibility of markers;<br />
• Training of markers;<br />
• Checking of marked scripts;<br />
• Transfer of marks to mark sheets; and<br />
• Accommodation of markers.<br />
4.3 Moderation of the marking process<br />
<strong>Umalusi</strong> moderates the marking of scripts by deploying external<br />
moderators to marking centres during the marking process, and also by<br />
moderating a sample of marked scripts after the release of the results.<br />
External moderators are deployed to the marking centres to ensure that:<br />
• all the systems and processes that relate to marking are in place and effective;<br />
• the memoranda are correctly interpreted;<br />
• the standard of marking and internal moderation of scripts is maintained across all examining bodies and throughout the marking process; and<br />
• the product of marking is a true reflection of the performance of individual candidates.<br />

4.4 Moderation of continuous assessment<br />

In 2001, school-based continuous assessment (CASS) marks (or year marks) were included in the Senior Certificate Examination, counting 25% of the final mark in all subjects. Inclusion of year marks was not new to the Senior Certificate Examination; under the previous dispensation, year marks counted 50% towards the final mark in the Natal Education Department and 33% in the Transvaal Education Department. <strong>Umalusi</strong>’s approach to the verification and moderation of CASS looks at specific subjects and all aspects relating to CASS in that subject. The evaluation focuses on:<br />
• the input into the CASS system;<br />
• the process of CASS implementation; and<br />
• the assessment outcome.<br />
A sample of nine subjects was identified for evaluation and samples of<br />
portfolios were selected from each of the assessment bodies. Subject<br />
experts evaluated these in accordance with the agreed criteria.<br />
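The 25% CASS weighting described in section 4.4 can be illustrated with a short sketch. The straight weighted average below is an assumption for illustration only; the report itself states just that CASS counts 25% of the final mark, with the examination making up the remainder.

```python
# Illustrative sketch (not Umalusi's actual implementation) of combining a
# CASS (year) mark with a written examination mark, given the 25% weighting
# that has applied to all subjects since 2001.

CASS_WEIGHT = 0.25  # CASS counts 25% of the final mark

def final_mark(cass_mark: float, exam_mark: float) -> float:
    """Combine a CASS mark and an examination mark (both out of 100)."""
    return CASS_WEIGHT * cass_mark + (1 - CASS_WEIGHT) * exam_mark

# e.g. a candidate with CASS 70 and examination 60:
combined = final_mark(70, 60)  # 0.25*70 + 0.75*60 = 62.5
```

Under the earlier dispensations mentioned in the text, the same calculation would simply use a weight of 0.50 (Natal) or 0.33 (Transvaal).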
4.5 Statistical moderation of Senior Certificate results<br />

<strong>Umalusi</strong> standardises both the examination marks and the CASS scores presented by the different schools in the country. Standardisation is necessary to address the variation in the standard of question papers and marking that may occur from year to year and across examining bodies.<br />

Statistical moderation has been an integral part of the Senior Certificate Examination under both the JMB and SAFCERT. While some of the details have changed over the years, the basic process has remained the same. This comprises the following:<br />
• Norms are established for each examination subject (Higher Grade and Standard Grade) conducted by an examination authority, separately for each of these authorities.<br />
• An authority’s examination results (in the form of mark distributions and plotted ogives) are sent to the SAFCERT/<strong>Umalusi</strong> statistics team (usually the evening before the statistical moderation meeting). A preliminary discussion is held regarding the adjustments to be recommended at the meeting. (A similar meeting is held by the Examinations Committee of the authority.)<br />
• A statistical moderation meeting takes place, at which the examination results for each subject (Higher Grade and Standard Grade) are discussed and adjustments (including no adjustment) are agreed upon.<br />

The statistics team meets in the new year to review the results and the adjustments.<br />

Statistical moderation of examination marks consists of comparisons between the current mark distributions and the corresponding average distributions over the last three years. Standardisation meetings take place between the completion of marking and publication of results. These meetings are attended by a team from <strong>Umalusi</strong>’s Statistics Working Group and, in the case of the provincial examinations, by a contingent from the examinations section of the relevant province’s education department. The meeting for the national examinations is attended by representatives from the National Department and from all the provincial departments.<br />

Statistical moderation of CASS is undertaken per institution and per subject. The mean and standard deviation of the examination mark (from the written paper) are used in this process. After the examination scores have been standardised, the mean of the examination score of a particular subject at a particular centre is compared to the mean of the CASS score. If the mean of the CASS score is within a certain range of the examination mean, the CASS mean is accepted as is. If the mean of the CASS score is either too low or too high, it is brought within a certain range of the examination mean.<br />
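The CASS moderation rule described above (accept the CASS mean if it lies within a certain range of the standardised examination mean, otherwise bring it within that range) can be sketched as follows. The tolerance value and the uniform shift applied to every candidate are illustrative assumptions; the report does not specify the actual range or the adjustment method used.

```python
# Illustrative sketch of per-centre, per-subject CASS moderation. The
# 10-mark tolerance and the uniform linear shift are assumptions for
# illustration, not Umalusi's published parameters.

def moderate_cass(cass_scores, exam_scores, tolerance=10.0):
    """Shift a centre's CASS scores so that their mean lies within
    `tolerance` marks of the (already standardised) examination mean."""
    cass_mean = sum(cass_scores) / len(cass_scores)
    exam_mean = sum(exam_scores) / len(exam_scores)
    diff = cass_mean - exam_mean
    if abs(diff) <= tolerance:
        return list(cass_scores)  # CASS mean accepted as is
    # Bring the CASS mean to the edge of the accepted range,
    # shifting every candidate's score by the same amount.
    shift = diff - tolerance if diff > 0 else diff + tolerance
    return [min(100.0, max(0.0, s - shift)) for s in cass_scores]

# A centre whose CASS marks sit well above its examination marks
# (mean 85 vs. 62) is pulled down to within 10 marks of the exam mean:
adjusted = moderate_cass([80, 85, 90], [60, 62, 64], tolerance=10.0)
```

Note that only the centre's mean is constrained; the relative ranking of candidates within the centre is preserved by the uniform shift.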
Chapter 2<br />
Moderation of question papers<br />
1. Introduction<br />

<strong>Umalusi</strong> moderates question papers to ensure that the standard is comparable across all assessment bodies, and that the question papers are fair, valid, reliable and appropriate.<br />

The research into the standard of the Senior Certificate Examination conducted by <strong>Umalusi</strong> in 2004, as well as the report to the Minister of Education in the same year, observed that the standard of the Senior Certificate Examination papers may have been compromised by the fact that the question papers set did not cater for all the cognitive levels. Thus for 2005 the moderation criteria were improved to ensure that question papers set would test all the cognitive levels, from the lower-order to the higher-order skills.<br />

In order to maintain public confidence in the examination system, the question papers must be seen to be relatively:<br />
• fair;<br />
• reliable;<br />
• representative of an adequate sample of the curriculum;<br />
• representative of relevant conceptual domains; and<br />
• representative of relevant levels of cognitive challenge.<br />

For this reason external moderators are required to carefully moderate the question papers on behalf of <strong>Umalusi</strong>, recommend improvements and finally approve the question papers. External moderators then report comprehensively on their findings, so that <strong>Umalusi</strong> can evaluate the quality of question papers set for the Senior Certificate Examinations.<br />

2. Purpose of the chapter<br />

The purpose of this chapter is to extract salient findings from the external moderators’ reports, synthesise and analyse these, and make judgements on the standard of the Senior Certificate Examinations. Furthermore, the chapter highlights problems that potentially compromise the quality of the question papers set for the Senior Certificate Examinations. The chapter finally makes recommendations for improvement of the standard of question papers.<br />
3. Scope of the moderation practice<br />

<strong>Umalusi</strong> moderated 66 of the subjects offered at Grade 12, six of which are set at national level and the rest by different assessment bodies. Composite reports were received from external moderators on each of the question papers moderated. These reports commented on the question papers in their original state, as initially submitted to <strong>Umalusi</strong>’s external moderators, as well as on their status after intervention by the external moderators. This report does not focus on all 66 subjects moderated, but draws out issues that were observed and that are, in most cases, common to the 66 subjects moderated.<br />

For the purpose of this report, therefore, the focus will be on the following 20 subjects:<br />

National Question Papers<br />
• Accounting HG and SG<br />
• Biology Paper 1 and Paper 2 HG and SG<br />
• English Additional Language Papers 1, 2 and 3 HG and SG<br />
• History Paper 1 and Paper 2 HG and SG<br />
• Mathematics Paper 1 and Paper 2 HG and SG<br />
• Physical Science Paper 1 and Paper 2 HG and SG<br />

Popular subjects (currently assessment body question papers but to become national subjects in 2006):<br />
• Afrikaans Primary and Additional Language<br />
• Agricultural Science<br />
• Business Economics<br />
• Economics<br />
• Geography<br />

Assessment Body Question Papers:<br />
• Art<br />
• Computyping<br />
• English Primary Language<br />
• Home Economics<br />
• Hotelkeeping and Catering<br />
• IsiZulu Primary and Additional Language<br />
• IsiXhosa Primary and Additional Language<br />
• Technical Drawing<br />
• Travel and Tourism<br />
4. Approach to moderation of question papers<br />

<strong>Umalusi</strong> appoints external moderators who are qualified subject specialists and experts in the field of assessment to moderate question papers. For the 2005 SCE question papers <strong>Umalusi</strong> appointed 64 external moderators who were responsible for 66 subjects. Each of these moderators was responsible for one subject, except for two moderators who moderated two subjects each. The moderators moderated all the grades (Standard and Higher). Where a subject has more than one paper (Paper 1, 2 or 3), there was one external moderator per paper, except in English Second Language, where one moderator moderated all three papers.<br />

<strong>Umalusi</strong> used different approaches for moderating national and assessment body question papers. For assessment body question papers (including back-up papers), assessment bodies sent question papers to moderators, who would then moderate and write reports to both <strong>Umalusi</strong> and the assessment bodies. National question papers were moderated at a central venue, where the panel of examiners was available to address issues raised by the external moderator.<br />

For both these approaches, external moderators used very detailed and specific criteria set by <strong>Umalusi</strong> when moderating question papers. The criteria are reviewed and improved yearly to ensure that they accommodate all aspects pertaining to the quality and standard of the question papers.<br />

The criteria used by external moderators to moderate the question papers covered the following aspects:<br />
• Standard of the question papers<br />
• Coverage of core syllabus<br />
• Cognitive levels assessed<br />
• Presentation of question papers<br />
• Internal moderation of question papers<br />
• Strengths observed from question papers<br />
• Weaknesses observed from question papers<br />
• Overall impression of the question papers<br />
5. Findings<br />
The key issue of concern to <strong>Umalusi</strong> when question papers are moderated is the standard of the question papers. Standard is a very relative term, and <strong>Umalusi</strong> moderates question papers on the premise that standard depends broadly on the following aspects:<br />
• Adherence to policy<br />
• Levels of challenge (cognitive demand)<br />
• Quality of the presentation of question papers<br />
• Quality of internal moderation<br />

The report discusses the standard of the question papers in general, concentrating on the quality of the question papers particularly as they were prior to external moderation. The report then unpacks the four aspects bulleted above.<br />

5.1 Content coverage (Adherence to Policy)<br />

This criterion requires the moderators to establish whether the examination question paper accurately represents the content specified in the syllabi for both Higher Grade and Standard Grade. It also seeks to discover whether key content areas in each subject were covered in the examination question paper. Furthermore, the criterion examines whether the question paper consists of constructs that cover crucial subject-specific knowledge. Finally, the weighting of questions in relation to areas of subject content is another focus of the criterion.<br />

Moderators for the national subjects Mathematics, Biology, Physical Science, History and Accounting observed that content coverage with regard to all the aspects mentioned above was quite adequate for the 2005 examination question papers. History question papers, in particular, were reported to be of a far better quality than in previous years.<br />

The English Second/Additional Language moderators reported that the content covered in 2005 was in line with the relevant syllabus requirements for this year.<br />

National question papers for 2005 reflect the syllabus requirements relatively accurately.<br />
5.2 Cognitive skills<br />

The purpose of this criterion is to establish what conceptual constructs are being tested by the examination question paper. It also seeks to find out the challenge or difficulty level of the examination question paper. In addition, the criterion investigates the extent to which the examination question paper is in line with the best and latest developments in the teaching of that particular subject. Lastly, it examines the extent to which the examination question papers differentiate between Standard Grade and Higher Grade.<br />

The testing of cognitive skills refers to the challenge or difficulty levels at which examination questions are pitched. This is central to the standard of the examination. The overwhelming finding by the moderators in the majority of subjects – Mathematics, Biology, History, Physical Science, Accounting and English Second/Additional Language – was that the examinations contained a satisfactory weighting of questions that required the deployment of higher-order thinking skills.<br />

In History, moderators found that differentiation into Higher Grade and Standard Grade was clear and in line with the guidelines. The same was the case with Biology. Physical Science moderators similarly observed that there was a very clear distinction between the Higher Grade and the Standard Grade examination question papers. This, they stated, could be determined through content selection and the weighting of items demanding high-level thinking skills.<br />

The moderators for the national subjects – History, Biology, English Second/Additional Language, Physical Science, Accounting and Mathematics – found that the 2005 examination papers were set in line with the latest developments in the teaching of these subjects, at least in the Higher Grade.<br />

The History moderators applauded the 2005 Higher Grade examination question paper in the following observation: “Three years ago we set ourselves a goal to achieve – we are there now. I’m proud of the standard of this paper.”<br />

The foregoing discussion points to a general trend towards setting more questions that are regarded as cognitively more demanding.<br />
5.3 Technical criteria<br />

In judging the technical aspects of the examination question papers, the moderators looked at:<br />
• the organisation of the paper;<br />
• technical details like the cover page, layout, numbering and mark allocation; and<br />
• the quality of illustrations, graphs, tables, and other graphics.<br />

The majority of the subject moderators – History, Biology, Mathematics, Physical Science, Accounting and English Second/Additional Language – observed a significant improvement in the technical aspects of the examination question papers. They noted vastly improved layouts compared to previous question papers. On the whole, the examination papers were more user-friendly.<br />

5.4 Internal moderation<br />

The standard and quality of internal moderation has been very low for many years and has not improved at all. English Second/Additional Language, Biology and Physical Science question papers had to be moderated a number of times before they were signed off by moderators.<br />

5.5 Question papers set by assessment bodies<br />

The standard and quality of question papers set at assessment body level seems to be on par with that of the national papers. Moderators noted improvements in the following subjects:<br />
• Computer Studies<br />
• Afrikaans Primary, First and Additional Language<br />
• Sepedi Additional and Third Language<br />
• IsiXhosa Primary Language HG<br />
• Xitsonga Primary and Additional Language<br />
• Art<br />

The Biology Paper 1 set by the IEB was a cause for concern because it did not comply with national policy.<br />
5.6 Back-up question papers<br />

Assessment bodies are required to set back-up question papers to be used in case of emergency. The standard and quality of back-up question papers continues to be poor. Some papers had to be returned to assessment bodies because of their poor standard.<br />

6. Strengths<br />

6.1 It is worth noting that, on the whole, the quality of question papers set has improved greatly over the years. Particularly noteworthy is the quality of questions set in terms of the cognitive constructs of the subjects, and the proportionate assessment of all the cognitive levels.<br />

6.2 There continue to be assessment bodies that strive to produce question papers of a good standard. The following examples are worth highlighting:<br />

6.3 The Physical Science Paper 1 papers from Western Cape and Northern Cape were singled out by moderators mainly due to:<br />
• effective contextualisation of questions;<br />
• “fresh” questions; and<br />
• all-round accuracy and balance of the papers.<br />

6.4 The Physical Science Paper 2 papers from Free State, the IEB and Gauteng, as well as the national paper, were of a good standard. The Free State HG paper is the best example of OBE infusion ever; it was very innovative and original.<br />

6.5 In Mathematics Paper 1, the IEB, North West province, Western Cape and Gauteng overall set good papers. The North West HG paper and the Northern Cape SG paper can serve as exemplars of good papers, in the sense that they contain, in general, the essential components that are looked for in a paper.<br />

7. Areas of concern<br />

7.1 Late submission of question papers for first moderation. Assessment bodies still continue to send papers as late as September. The Biology HG papers from Limpopo, and the Biology SG and HG papers from Western Cape, were sent in September. This puts the external moderators under time pressure to approve papers they would ordinarily have rejected.<br />

7.2 Not all assessment bodies submit question papers for re-moderation. When a question paper is required for second or subsequent moderation, the external moderator indicates this on the report, and such question papers should be re-submitted to the external moderator.<br />

7.3 Assessment bodies submit incomplete question papers for external moderation.<br />

7.4 The assessment body back-up papers do not match the quality and standard of the national papers. As indicated earlier in the report, this impacts negatively on the quality and standard of question papers submitted for external moderation.<br />

7.6 The majority of question papers had to be moderated more than once, and this again is due to the poor quality and presentation of question papers. The following are examples of question papers that had to be moderated more than once:<br />

7.7 North West and Mpumalanga SG and HG Physical Science papers.<br />

7.8 The Mpumalanga English Primary Language SG and HG papers<br />
were moderated twice.<br />
external moderators attribute this to the competency of examiners.<br />
This is because the best moderators are attracted to set national<br />
papers. Assessment bodies have a tendency to use questions over<br />
and over again, making the question paper too straightforward<br />
and predictable.<br />
7.9 The Art paper from the following assessment bodies had to be moderated<br />
twice: North West HG paper 1, Gauteng SG and HG,<br />
Mpumalanga SG and HG, Eastern Cape SG and HG, Limpopo SG<br />
and HG and Western Cape SG and HG.<br />
7.5 Poor quality or even absence of internal moderation of question<br />
papers. This matter cannot be overemphasised as it impacts so<br />
18
8. Recommendations<br />
Based on the analysis of the twenty reports from external moderators, the<br />
following recommendations are made:<br />
8.1 Assessment bodies must submit question papers to external<br />
moderators well before the set due date of 30 April.<br />
8.2 The interactive meetings between the external moderators, examiners<br />
and internal moderators, which used to be hosted by <strong>Umalusi</strong> but have<br />
now been taken over by the DOE, need to be reinstated, as this is the<br />
only forum where crucial issues relating to the setting, quality and<br />
standard of papers can be addressed.<br />
8.3 In cases where new appointments are made, all the necessary<br />
examination information should be made available to the new<br />
appointees to allow for continuity.<br />
8.4 Examiners should revise question papers taking into consideration<br />
the comments from external moderators and resubmit the revised<br />
copies on time.<br />
8.5 Examiners and internal moderators should keep abreast of<br />
international developments in their subjects, as this would ensure that<br />
new developments are infused into the papers. Trends in subjects are<br />
constantly changing and it is important that these developments are<br />
evident in the question papers set. They do, however, have to be<br />
communicated to teachers so that they teach in accordance with the<br />
latest developments in the subjects.<br />
8.6 Examiners and internal moderators should ensure that question papers<br />
are edited and in print-ready form prior to forwarding them for<br />
external moderation. Question papers submitted for external moderation<br />
should, in all circumstances, be typed and NOT handwritten.<br />
8.7 Question papers should be accompanied by typed internal<br />
moderators’ reports that indicate clearly that internal moderation has<br />
been done. These reports must be presented in the report format<br />
prescribed by <strong>Umalusi</strong>, with all the necessary information.<br />
8.8 All assessment bodies should align question papers with the<br />
national guidelines with regard to syllabus coverage, time allocation,<br />
topic allocation, language and formatting.<br />
8.9 The standard of the assessment body back-up papers needs to be<br />
looked into and seriously improved. Examiners should avoid<br />
repetition of questions and the lifting of questions from past papers.<br />
8.10 In addition to the above recommendations, an ideal situation<br />
would be to have all question papers set at national level, as this<br />
would ensure the same standard and quality of question papers set.<br />
8.11 The IEB should be required to set two papers of 2 hours’ duration<br />
(instead of one 3-hour paper) to bring it in line with the national<br />
papers with immediate effect.<br />
8.12 Examiners and internal moderators should use a checklist or<br />
analysis grid to ensure that the syllabus content is correctly<br />
covered, the spread and variety of questions is satisfactory, and<br />
that the levels of cognitive demand are assessed proportionally.<br />
9. Conclusion<br />
It is very clear from the report that the standard of the national papers has<br />
improved significantly. Likewise, that of papers set by assessment bodies is<br />
steadily improving. A great deal of thought goes into their preparation,<br />
and because the stakes are so high, the panels strive for zero-defect. The<br />
standard of the assessment body back-up question papers, however, is low and<br />
continues to deteriorate. The quality of questions set is a clear indication<br />
that no creativity is applied, and the quality of internal moderation is<br />
also a serious concern.<br />
External moderators have, however, worked very hard to ensure that all<br />
the moderated and finally approved question papers were of an<br />
acceptable standard. As a result, <strong>Umalusi</strong> is confident that the<br />
standard and integrity of the Senior Certificate Examination<br />
was not compromised by the quality of the finally approved question papers.<br />
Chapter 3<br />
Moderation of Continuous Assessment (CASS)<br />
1. Introduction<br />
The aim of internal assessment is twofold: to offer learners an alternative<br />
chance to demonstrate their competence and to assess those skills that<br />
cannot be assessed through traditional examinations. However, problems<br />
regarding its reliability continue to dog continuous assessment. As a result,<br />
<strong>Umalusi</strong> has built into the statistical moderation of continuous assessment<br />
a tolerance range that enhances confidence in the scores that<br />
accrue from this component.<br />
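The report does not spell out the mechanics of this statistical moderation, but the idea of a tolerance range can be illustrated with a hedged sketch: a school's CASS average is compared with its examination average, and CASS marks are adjusted only when the gap exceeds the tolerance. The function name, the 10-point tolerance and the shift rule below are illustrative assumptions, not Umalusi's published procedure.

```python
def moderate_cass(cass_marks, exam_marks, tolerance=10):
    """Illustrative statistical moderation of CASS scores.

    If the mean CASS mark differs from the mean examination mark by
    more than `tolerance` percentage points, shift every CASS mark by
    the excess so the remaining gap equals the tolerance.  (The
    tolerance value and the shift rule are assumptions for
    illustration only.)
    """
    cass_mean = sum(cass_marks) / len(cass_marks)
    exam_mean = sum(exam_marks) / len(exam_marks)
    gap = cass_mean - exam_mean
    if abs(gap) <= tolerance:
        return list(cass_marks)  # within tolerance: accept the marks as-is
    # Shift marks down (or up) so the mean gap equals the tolerance,
    # clamping each mark to the 0-100 range.
    shift = gap - tolerance if gap > 0 else gap + tolerance
    return [min(100, max(0, m - shift)) for m in cass_marks]
```

For example, a school whose CASS marks average 30 points above its examination marks would have every CASS mark pulled down by 20 points under this sketch, leaving a 10-point gap.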
The Senior Certificate Examination consists of two components: one<br />
external and the other internal. Internal Assessment, or Continuous Assessment<br />
(CASS), constitutes 25% of the final mark for the Senior Certificate<br />
Examination. The written external examination, on the other hand, makes up<br />
75% of the final mark. Marks for these components are presented separately,<br />
but are combined to form the final mark for certification purposes. <strong>Umalusi</strong><br />
has the responsibility to ensure that scores presented for the purposes of<br />
continuous assessment are relatively reliable. For this reason, <strong>Umalusi</strong> subjects<br />
this component to its quality assurance regime.<br />
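As a hedged illustration of how the two components combine (assuming a simple weighted sum of percentage marks; rounding to the nearest whole mark is an assumption, as the report does not specify a rounding rule):

```python
def final_mark(cass_percent, exam_percent):
    """Combine CASS (25%) and external examination (75%) marks.

    Both inputs are percentages; the weighting follows the 25/75
    split described in the report.  Rounding to the nearest whole
    mark is an illustrative assumption.
    """
    return round(0.25 * cass_percent + 0.75 * exam_percent)

# e.g. a learner with 60% for CASS and 48% in the examination:
# 0.25 * 60 + 0.75 * 48 = 15 + 36 = 51
```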
This chapter will outline the scope of <strong>Umalusi</strong>’s moderation of continuous<br />
assessment, proceed to discuss the approach to the moderation of continuous<br />
assessment and then table the salient findings from this exercise.<br />
Furthermore, the chapter will highlight the strengths within assessment<br />
bodies with regard to their implementation of continuous assessment. It<br />
will also identify problem areas and suggest solutions.<br />
Internal assessment is set, marked and graded at site level. This makes it<br />
absolutely necessary for <strong>Umalusi</strong> to put in place measures to<br />
standardise internal assessment. In order to standardise internal assessment,<br />
<strong>Umalusi</strong> sets down directives. These directives include defining the<br />
composition of internal assessment, the respective responsibilities of key role<br />
players and the presentation of internal assessment, as well as moderation<br />
procedures. The moderation procedures in place at <strong>Umalusi</strong>, and how they<br />
are deployed in the quality assurance of continuous assessment, are discussed<br />
in detail later in this chapter.<br />
2. Purpose of this chapter<br />
The purpose of this chapter is:<br />
• To report on the quality and standard of continuous assessment within<br />
assessment bodies.<br />
• To identify problem areas in the implementation of continuous<br />
assessment.<br />
• To recommend solutions to the problems identified.<br />
3. Scope of the moderation of Continuous<br />
Assessment<br />
<strong>Umalusi</strong> moderates continuous assessment across all nine provincial<br />
departments of education and the two independent assessment bodies. For<br />
the 2005 moderation exercise, 21 CASS portfolios per subject per assessment<br />
body were selected from the eleven assessment bodies.<br />
A team of 14 moderators was deployed to various assessment bodies to<br />
undertake this task. Table 1 below shows the subjects, the number of schools<br />
and the portfolios that were moderated across the assessment bodies.<br />
Table 1: Samples for moderation<br />
ASSESSMENT BODY SUBJECT NO. OF SCHOOLS NO. OF PORTFOLIOS<br />
BCVO Accounting 7 21<br />
Eastern Cape History 7 21<br />
Biology 7 21<br />
Free State English Second Language 7 21<br />
History 7 21<br />
Gauteng Physics 7 21<br />
IsiZulu 7 21<br />
IEB Accounting 7 21<br />
KwaZulu-Natal IsiZulu 7 21<br />
Afrikaans Second Language 7 21<br />
22
ASSESSMENT BODY SUBJECT NO.OF SCHOOLS NO.OF PORTFOLIOS<br />
Limpopo History 7 21<br />
Afrikaans Second Language 7 21<br />
Mpumalanga History 7 21<br />
English Second Language 7 21<br />
North West Physics 7 21<br />
History 7 21<br />
Northern Cape Physics 7 21<br />
Biology 7 21<br />
Western Cape Mathematics 7 21<br />
The schools were selected on the basis of their overall performance in the<br />
Grade 12 examinations of 2004, and included:<br />
• Average performing schools – pass rate of 50%<br />
• Low performing schools – pass rate of 20% and below<br />
Each school was required to provide one portfolio from each of the<br />
following mark ranges:<br />
• 80% and above<br />
• 50 – 60%<br />
• Below 40%<br />
4. Approach to the moderation of Continuous<br />
Assessment<br />
Moderators were deployed to assessment bodies for a period of three<br />
days to moderate continuous assessment portfolios. <strong>Umalusi</strong>’s<br />
approach to the moderation of CASS portfolios comprises three<br />
stages, namely:<br />
• A pre-moderation session;<br />
• The moderation of portfolios; and<br />
• A post-moderation session.<br />
Pre-moderation sessions were held during which assessment body officials<br />
involved in providing support to teachers and ensuring the proper and<br />
effective implementation of CASS were interviewed on the following<br />
aspects:<br />
• Compliance with policy<br />
• Teacher training on CASS<br />
• Quality of internal moderation<br />
Moderators then proceeded with the moderation of portfolios. This<br />
required a re-mark of the portfolio, after which moderators were required<br />
to pronounce judgment on the quality of the portfolio.<br />
Post-moderation sessions were held with the same officials that<br />
attended the pre-moderation session, during which <strong>Umalusi</strong>’s moderator<br />
highlighted both the strengths and areas of concern that were identified<br />
during the moderation session.<br />
5. Findings<br />
In order to judge the standard of continuous assessment, <strong>Umalusi</strong> uses a<br />
set of criteria to assess portfolios. These criteria are as follows:<br />
• Adherence to national policy and guidelines.<br />
• The appropriateness and standard of the assessment tasks being<br />
developed within assessment bodies.<br />
• The degree of standardisation within the assessment body.<br />
• The extent and quality of internal moderation.<br />
• Teacher development with regard to CASS implementation.<br />
• The reliability of CASS scores.<br />
This chapter will discuss general findings, proceed to identify<br />
strengths, highlight problem areas and suggest solutions through<br />
recommendations.<br />
24
5.1 Compliance with policy<br />
On the whole, assessment bodies adhere to policy and other<br />
guidelines when implementing continuous assessment, although there<br />
are instances of provincial/assessment body peculiarities.<br />
With the exception of a few isolated cases, which will be highlighted<br />
under the section entitled areas of concern, CASS is being<br />
implemented in line with policy requirements. There are some slight<br />
modifications within each assessment body. Assessment bodies are<br />
also taking pains to ensure that they provide teachers with as<br />
much support and guidance as possible on how to interpret policy so that<br />
it is correctly and effectively implemented. The CASS policy is supported by<br />
subject-specific guidelines, which are developed with a view to<br />
providing teachers with the necessary guidance on what is required of<br />
them and how to implement CASS.<br />
Certain assessment bodies go to great lengths to ensure that these guidelines<br />
are comprehensive and in-depth. The Free State Department of<br />
Education, for instance, has developed a detailed and in-depth guideline<br />
that assists teachers in formulating literature questions. Furthermore,<br />
information regarding the management of portfolios, i.e. timeframes,<br />
the process approach to writing, supervised and unsupervised pieces,<br />
shorter pieces, the presentation of learners’ portfolios, the safekeeping of<br />
portfolios, the recording of assessment, educators’ master portfolios, penalties,<br />
and the completion of forms and worksheets, as well as moderation procedures,<br />
is dealt with.<br />
In the Northern Cape, the CASS guidelines for higher and standard<br />
grades have been separated. Each booklet has exemplars of a<br />
range of assessment activities and provides teachers with guidance on<br />
how to design tasks covering topics from every aspect of the syllabus<br />
and the weighting for the various cognitive levels that need to be assessed,<br />
which enables teachers to integrate CASS with normal teaching<br />
and learning.<br />
The North West Department of Education has revised its CASS policy and<br />
guideline documents based on recommendations made by <strong>Umalusi</strong><br />
during 2003/4, and has put systems in place to make the implementation<br />
of CASS succeed in the province. Monitoring and moderation<br />
instruments have also been designed and officials have been appointed<br />
to manage, monitor and moderate CASS.<br />
25
5.2 The appropriateness and standard of the<br />
assessment tasks<br />
In the main, the standard of test items and examination papers is<br />
appropriate in that they cater for learners in Grade 12 and cover<br />
syllabus requirements.<br />
This can also be attributed to the high reliance on past question papers.<br />
The challenge seems to surface when teachers are required to set their<br />
own assessment tasks such as assignments, projects, etc. It is at this point<br />
that the distinct difference in ability levels becomes apparent.<br />
From the Afrikaans portfolios in KwaZulu-Natal it was evident that all schools<br />
write provincially standardised tests and examinations, and these tasks<br />
are therefore appropriate for learners in Grade 12. Assessment tasks are also<br />
based on syllabus requirements. Levels of difficulty are addressed in the<br />
different topics and forms of writing, e.g. discursive essays are more<br />
challenging than narrative essays.<br />
In the Free State, the standard of English Second Language tasks is<br />
appropriate, as subject specialists have developed them at provincial<br />
level. The tests and examination tasks in this section are moving away from<br />
discrete language questions to a more communicative and contextual<br />
approach (language in action, which is in keeping with modern trends in<br />
language teaching).<br />
In KwaZulu-Natal and Gauteng, in most instances the standard of the<br />
IsiZulu tasks is suitable for Grade 12 learners. In the language usage<br />
section, comprehension texts are suitable and the tasks cover a range of<br />
cognitive levels in that, while some simple questions may require a<br />
one-word answer, others may demand evidence of the learner’s insight<br />
into the implied causes of, say, a character’s behaviour. In the Afrikaans<br />
portfolios there is evidence that levels of difficulty are being addressed in<br />
the different topics and forms of writing.<br />
Projects presented by the IEB were commendable. One contained a task<br />
that required learners to analyse the financial situation of and make a<br />
presentation on a real-life company, whilst the other required learners to<br />
choose a public company and compile a report on the financial statements.<br />
Biology tasks from the Eastern Cape revealed that most tasks concentrate<br />
on tests and examinations and assess low cognitive ability levels, as will<br />
be discussed later in the report.<br />
The standard and appropriateness of assessment tasks set for Afrikaans in<br />
Limpopo were in certain instances up to standard and based on syllabus<br />
requirements. In History, assessment tasks were to a large extent of a<br />
poor quality, as is reflected in the areas of concern.<br />
5.3 Standardisation within the assessment body<br />
Assessment bodies generally have in place processes and procedures<br />
for the standardisation of the implementation of continuous assessment.<br />
In Physical Science, in an attempt to ensure a relative degree of<br />
standardisation across the province, Gauteng has all schools in the province<br />
do five common tasks, namely a homework assignment, two practical tasks<br />
and two examinations.<br />
In the North West, the Physical Science team has taken its first steps<br />
towards standardising CASS across the regions by prescribing the eight<br />
experiments for practical work and setting a common September paper.<br />
In addition to this, it has a CASS moderation policy and a policy for<br />
adjusting CASS marks.<br />
English portfolios from the Free State Department of Education revealed<br />
that there is consistency in the standard of assessment tasks across schools<br />
and the district, since many schools rely on examples of tasks from the<br />
assessment body.<br />
The standard setting process and the meetings which are held in the<br />
Western Cape at the beginning of each year with the curriculum advisors<br />
are commendable. All aspects of CASS are dealt with during this session.<br />
Curriculum advisors then cascade this information to teachers. Of note is<br />
that even though districts or schools set their own tasks, the standard is<br />
consistent with the requirement for Grade 12.<br />
5.4 The extent and quality of teacher development<br />
There is evidence of teacher development for the implementation<br />
of continuous assessment within assessment bodies, but the quality and<br />
rigour vary.<br />
The nature and quality of teacher training, in which the successful<br />
implementation of CASS lies, is an aspect for which certain assessment<br />
bodies appear to have no clear strategy, as will be highlighted<br />
under areas of concern as the report progresses. There are provinces<br />
that are making attempts to ensure that teachers are equipped to<br />
undertake the task of implementing CASS through standard setting<br />
meetings, cluster sessions that are held at least once per term and<br />
workshops, whilst in other provinces a two-hour session (for the entire<br />
year) is held with teachers. In most instances, training is confined to<br />
communicating the contents of the policy and guideline documents.<br />
Practical, hands-on experience, for example on how to develop an<br />
appropriate project/investigation in Mathematics and how to assess<br />
the task, is lacking. It is precisely for this reason that many teachers<br />
either shy away from this activity or extract from a past examination<br />
paper a task that they deem appropriate to be used as a project/<br />
investigation. The consequence of this is that the task is often<br />
inappropriate.<br />
Gauteng provides its teachers with on-going support during school<br />
visits and cluster meetings.<br />
In the Northern Cape, the support and guidance given to teachers<br />
by subject advisors and their role in internal moderation is<br />
noteworthy. There are 107 schools in the province that offer<br />
Biology at Grade 12. Despite the fact that they are widespread,<br />
each school is visited once a year and poor performing schools twice.<br />
There is also evidence that the quality of work in these poor<br />
performing schools has improved as a result of the additional support<br />
provided to them.<br />
In the Free State, not only are the guidelines for English Second Language<br />
in-depth, but of note is the amount of effort put into ensuring that teachers<br />
understand and have a good grasp of the contents of the CASS policy<br />
and guideline documents. This is done through the hosting of workshops and<br />
regular site visits by learning facilitators, more commonly known as<br />
subject advisors. Learning facilitators are required to provide reports to<br />
the CASS co-ordinator, who uses these reports to identify schools where<br />
further support is required.<br />
Educators in the North West underwent an intensive two-day<br />
training session early in the year during which they were trained on the<br />
following:<br />
• The different CASS components<br />
• Weighting of CASS components<br />
• Examples of CASS components (for standard setting)<br />
• Minimum requirements of CASS components<br />
• Contents of educators’ and learners’ portfolios<br />
• Guiding participants on the pace-setters<br />
• The meaning of continuous internal moderation conducted by senior<br />
management teams at school/learning site level<br />
• The quality of CASS activities.<br />
From this it is evident that much effort and planning went into the<br />
organisation thereof.<br />
5.5 Internal moderation of internal assessment<br />
There is evidence to show that assessment bodies do moderate<br />
continuous assessment. However, the quality and rigour vary.<br />
There is adequate evidence in both learners’ and educators’ portfolios<br />
from the Free State to suggest a rigorous approach to the moderation of<br />
portfolios at regional and provincial level. Educators’ portfolios revealed<br />
that the CASS mark sheets were audited. Facilitators ensure that there is<br />
compliance with policy and provide evaluative inputs on the various<br />
aspects of CASS, e.g. facilitators moderate the choice of topics in the<br />
extended, short and transactional writing and provide feedback to the<br />
educators.<br />
The Western Cape has three moderation sessions per year, each of three<br />
to four hours’ duration. In each case feedback is given to teachers and<br />
the curriculum advisors ensure that learners are assessed relatively consistently.<br />
There is clear evidence from the comprehensive reports available that<br />
portfolios have been thoroughly moderated at school level and by the<br />
subject advisor. Moderation involves checking that criteria have been<br />
met and that standards are maintained.<br />
In Mpumalanga, internal moderation occurs periodically at site level,<br />
whilst in other cases moderation seems to occur just before portfolios are<br />
made available to curriculum implementers. The cluster system of<br />
moderation is being implemented in more areas than in previous years,<br />
and specially appointed teachers have been designated the position of<br />
cluster leader.<br />
In KwaZulu-Natal, moderation of Afrikaans CASS portfolios takes place<br />
quarterly at cluster level and once, at the end of the year, at provincial<br />
level. Educators moderate each other’s learner and educator portfolios.<br />
All learners’ portfolios are taken to the meeting. 10% of the portfolios,<br />
taken from across the whole rating scale, are moderated, and not fewer<br />
than five per school. Certain prescribed mark sheets and annexures are filled in at<br />
these meetings to show that moderation has taken place. These forms are<br />
kept in the co-ordinators’ and educators’ files.<br />
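The KwaZulu-Natal sampling rule described above (10% of portfolios, but not fewer than five per school) can be sketched as follows; the function name is illustrative, and capping the sample at the school's total number of portfolios is an assumption for very small classes that the report does not address.

```python
import math

def moderation_sample_size(num_portfolios):
    """Portfolios to moderate per school under the 10%-but-at-least-
    five rule reported for KwaZulu-Natal Afrikaans CASS moderation.

    The cap at num_portfolios (for schools with fewer than five
    portfolios) is an illustrative assumption.
    """
    return min(num_portfolios, max(5, math.ceil(num_portfolios / 10)))

# e.g. 120 portfolios -> 12 moderated; 30 portfolios -> 5 (the floor applies)
```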
5.6 The reliability of the assessment outcome<br />
The continuous assessment scores for 2005 were, to a large extent,<br />
reliable.<br />
To ascertain whether marks awarded to learners were reliable, moderators<br />
were required to re-assess tasks. In general, moderators found that there<br />
was a high degree of correlation between the mark/s awarded by the teacher,<br />
the internal moderator (where applicable) and the external moderator.<br />
Instances where problems were picked up in relation to the reliability of the<br />
assessment outcome will be discussed in detail under areas of concern.<br />
6. Areas of concern<br />
6.1 Adherence to policy<br />
a. The Biology exemplar document developed by the Northern Cape is<br />
lacking in the following respects:<br />
• There are no activities that assess experimental design (skills<br />
30-38 listed in the National Guideline Document)<br />
• A four-point rubric is still being used for the assessment of graphs,<br />
while the rubric used in the written papers at national level is on<br />
a 10/11-point scale<br />
• Guidance regarding the allocation of marks for the captions for<br />
diagrams and units for quantitative answers is absent<br />
• The questions in the classwork activities are not graded into<br />
ability levels, nor is there any guidance on how teachers can<br />
grade the questions themselves<br />
• One of the activities includes details of the dark phase of<br />
photosynthesis, which falls outside the scope of the syllabus.<br />
b. In Biology in the Eastern Cape, the following deviations from the<br />
national policy are apparent:<br />
• The weighting for classwork set in the national policy is 50%,<br />
while the Eastern Cape Education Department has set it at 30%.<br />
While a reduction in the weighting of this component is a step in<br />
the right direction, it needs to be discussed at national level so that<br />
implementation can be standardised for the entire country to<br />
ensure fairness. In this circumstance it probably disadvantages<br />
learners of the Eastern Cape Education Department.<br />
• The policy lacks prescriptions in terms of topic and type for the<br />
various forms of assessment for the purposes of standardisation.<br />
However, it does indicate in general terms, for example, that<br />
practicals should be assessed by way of at least two worksheets<br />
and at least three practical experiments. But it lacks detail<br />
with respect to the differences between a worksheet and an<br />
experiment or how these should be employed.<br />
• The weighting of the practical work component is also reduced,<br />
from 20% to 10%. This could have an advantageous effect on<br />
the CASS marks of the learners from the Eastern Cape.<br />
• An additional formal test in the form of the June examination is<br />
prescribed. This could increase the total average of the examination<br />
marks, thus making it unfair when compared with other<br />
assessment bodies.<br />
• The policy attempts to distinguish between the various<br />
components of CASS. However, while it indicates the size and<br />
extent of control of the tasks, it does not indicate the variety or<br />
the outcomes/skills to be assessed. The negative effect of this<br />
omission was revealed in the portfolios, in that almost every task<br />
in the portfolio was a test-type activity. This goes against the<br />
rationale for continuous assessment.<br />
c. The Beweging vir Christelik-Volkseie Onderwys (BCVO) continues to use<br />
Gauteng’s policy and guideline documents, despite <strong>Umalusi</strong> having<br />
brought to its attention the need to develop its own documentation<br />
based on national policy.<br />
d. Teachers in the North West are making great efforts to satisfy internal<br />
policy requirements. However, it seems as though many schools are<br />
falling short as a result of an overload in the number of<br />
portfolio pieces teachers and learners are required to complete.<br />
6.2 Standard of assessment tasks<br />
should not be there, i.e. there is still too much summative assessment.<br />
CASS is therefore dominated by tests in this province.<br />
a. A critique of the assessment tasks developed in the Free State is that in<br />
many aspects of Literature questions are language/textual based and<br />
do not allow for learners to express various points of view.<br />
d. Biology portfolios from Gauteng revealed that there is no<br />
differentiation between tasks given to higher and standard<br />
grade learners.<br />
b. The national policy stipulates that process writing should be<br />
encouraged. However, in the Free State, there was little evidence of<br />
process writing in either of the districts as manifested in the learners’ files.<br />
e. In the Eastern Cape, Biology tasks in general assessed low cognitive<br />
ability levels. Most tasks consisted of items, which contained<br />
multiple-choice questions, biological terms, matching, providing labels<br />
c. The purpose of practical work in Physical Science is to assess learners’<br />
science investigative skills. However, in the North West, Rustenberg and<br />
Bergsig High schools combine content and attitudes with science skills<br />
and this inclination is not appropriate to the purpose of practical work. In<br />
the case of Sewagodimo Secondarty School, theoretical practicals are<br />
used but they are just content tests. Practical work should be assessed with<br />
rubrics but this was not done in spite of the provincial subject policy that<br />
prescribes the use of rubrics. Therefore, in the case of Sewagodimo SS,<br />
and functions to diagrams. Classwork focused primarily on objectivetype<br />
questions, which assessed the recall of knowledge. Most projects<br />
were a very simplistic exercise for Grade 12 learners, such as<br />
collecting and sticking pictures of different foods. Practical work is not<br />
investigative in nature, but merely required learners to access<br />
information. Very little or no attention is paid to the principles of<br />
investigations. Most of the tasks are invalid, as it does not assess the<br />
relevant skills as indicated in the policy document.<br />
assessment tasks are largely summative and dominated by tests. In the<br />
case of Rustenberg and Bergsig there is performance-based tasks in<br />
practical work but there is also content and attitude testing included that<br />
32
f. The following was noted in Accounting portfolios from the Independent<br />
Examination Board (IEB):<br />
• There was very little creativity shown, and most class tests are<br />
exercises taken from books or old papers. This amounts to mere<br />
repetition, and the learner’s context is not considered. Tests and<br />
examinations are to a large extent a regurgitation of old<br />
question papers. This was found in portfolios from Thomas<br />
Moore College, Redhill School, The King’s Court and<br />
St Andrew’s College.<br />
• Only one school, namely Penryn College, submitted SG files, and<br />
it is noted that many of the tests were the same for HG and SG.<br />
This does not give the HG learners the necessary extension, as in<br />
this case most of the tests can be regarded as being of an SG<br />
level. As an example, one learner achieved 29% overall, yet<br />
most of the class test marks were in the 80% range. While it is<br />
normal to achieve a higher mark in class tests, the range seems<br />
excessive. The standard of tests must be very carefully<br />
considered. There must also be progression over the year and<br />
while studying a particular section.<br />
• Assessment tasks set by the following schools lacked higher-order<br />
questions:<br />
- Thomas Moore College<br />
- Penryn College<br />
- The King’s Court<br />
g. An evaluation of Accounting portfolios from the BCVO revealed that:<br />
• The tests presented are based on past papers or revision<br />
booklets. There is no creativity shown and very little variation in the<br />
types of questions set. The same topics are re-examined in the<br />
controlled test, class test, June exam and prelims. Ratio analysis<br />
seldom features in the various forms of assessment, an<br />
issue that needs to be addressed. Even projects were confined to<br />
tasks/topics that teachers had “lifted” from past examination papers.<br />
• There is little evidence of the use of higher-order questions, in<br />
which learners are required to solve problems, discuss and<br />
explain concepts, and apply knowledge to situations that are<br />
unfamiliar to them. A misconception seems to exist as to<br />
what constitutes problem solving: a mathematical calculation is<br />
just that, and not necessarily problem solving. Teachers also need<br />
to relate class work to the real world in which the learners live.<br />
• Approximately a third of the prelim papers’ marks (107/300) went<br />
to questions that required a great deal of insight and were<br />
inappropriate for SG learners. These learners did not fare well in<br />
these questions. While high standards should always be striven<br />
for, this must not be at the expense of the learners, particularly<br />
standard grade learners. The teacher, however, is to be<br />
congratulated on showing initiative in setting questions and making<br />
the learners engage in thinking exercises. In comparison with other<br />
schools, though, these papers were of a much higher standard.<br />
h. In History in the North West, there are still instances where:<br />
• Source-based questions are based on a single source<br />
(Lot Mashiane Secondary). A single-source exercise is good<br />
practice for introducing learners to source-based work, and is<br />
particularly useful in the junior GET band. However, at Senior<br />
Certificate level learners are required to handle, with some<br />
measure of competence, questions based on a group of sources,<br />
and to be competent in dealing with more complex and<br />
skills-orientated questions.<br />
• There is no differentiation between higher and standard grade<br />
tasks, e.g. at Mmanotshe Moduane High School and Abel<br />
Motshoane High School. At the former school, the moderator<br />
was unable to ascertain which marking instrument was used for<br />
the essays (higher or standard grade), and at the latter the<br />
questions posed were limited to cognitive levels 1 and 2.<br />
• Learners need considerable assistance with the History<br />
‘survival skills’. They battle with the writing of suitable<br />
introductions and conclusions in essay questions. They also need<br />
to be taught how to develop and sustain a line of argument,<br />
especially in HG. These are the aspects for which they are<br />
rewarded or penalised during the marking of essays.<br />
i. In the History portfolios from Limpopo, it was noted that assessment<br />
tasks continue to be largely summative in nature, and do not<br />
differentiate between higher and standard grade, despite <strong>Umalusi</strong><br />
highlighting this to the assessment body over the past few years. No<br />
effort seems to have been made to assist teachers in producing tasks<br />
that are formative in nature through the provision of exemplars that are<br />
developed at provincial level. There is also no planned focus on<br />
historical skills identified in the National Guideline document. The<br />
moderator pointed out the following with regard to the tasks:<br />
• Essay writing<br />
- In general there is no concentration on the application of skills,<br />
but rather on content.<br />
- No regard is given to argumentative-type essays.<br />
• Source-based exercises<br />
- There is a general reliance on old question papers, without taking<br />
the annual adaptations into account. This reflects a lack of<br />
understanding of the different levels of source-based questions.<br />
• The research work (assignment)<br />
- Lack of provincial guidance.<br />
- Lack of understanding of the skills that need to be assessed.<br />
• Standardised tests and trial exams<br />
- No clearly planned structure regarding the skills that were assessed.<br />
• The trial examination, which is provincially set and moderated,<br />
has serious flaws, e.g. almost no differentiation between HG and SG;<br />
vague questions in Paper 1 SG Q4.1.3 and Q4.2.4; and a marking<br />
guideline that does not answer the question posed, e.g. Paper 1 SG<br />
Q4.1.3 and Q4.1.5, and Paper 1 HG Q6.1.5 and Q6.1.6.<br />
• Of serious concern are the assessment tasks developed for research<br />
work. There is clear evidence of a lack of understanding of:<br />
- what historical research is.<br />
- how to use sources and how to acknowledge them.<br />
- how to compile a rubric for research work.<br />
j. In Limpopo, the following was found in Afrikaans:<br />
• Not all prescribed writing styles were assessed in Afrikaans. The focus<br />
appeared to be mainly on the narrative style. Descriptive, reflective,<br />
expository, argumentative and discursive modes of writing were only<br />
found in portfolios from Ellisras High School and Harry Oppenheimer<br />
Agricultural School.<br />
• In the shorter pieces of writing (functional writing), the main focus is on<br />
the letter. Very few included diary entries, reports, obituaries, dialogues,<br />
speeches, etc.<br />
• There is still a heavy reliance on past examination papers.<br />
• The standard of the texts used for comprehension passages is often<br />
inappropriate. Teachers at Ellisras High School tended to select texts that<br />
were too complex for second-language learners, whilst those at Tabudi<br />
Secondary selected texts that were too easy.<br />
• The standard of assessment tasks was in most instances below<br />
expectations. Questions tended to require learners merely to recall<br />
or regurgitate information. This was evident even in the Literature<br />
questions.<br />
k. From the Afrikaans portfolios in KwaZulu-Natal it was evident that,<br />
in Comprehension and Literature, the questions tend to be geared<br />
to testing content and are generally not reflective of the different<br />
cognitive levels.<br />
6.3 Standardisation within the assessment body<br />
a. In Limpopo, for Afrikaans, no attempt is being made at<br />
provincial level to ensure a relative degree of standardisation of<br />
assessment tasks across the assessment body. This can be gauged<br />
from the inconsistency in the standard of assessment tasks across<br />
schools and districts, as can be seen in the following:<br />
• At Segolola High School, a language test was set out of a total of 10<br />
marks, which was then converted to a mark out of 80, whilst learners at<br />
Waterberg High School were required to write a full test out of 80 marks.<br />
• Some teachers require learners to write essays on non-challenging<br />
topics, e.g. at Tabudi Secondary School learners were required to write<br />
on ”My skool is mooi”, whilst others provide extremely challenging<br />
topics such as “Die boksies is vir skoene – nie vir mense nie”.<br />
6.4 Internal moderation of CASS portfolios<br />
a. In contrast to the Afrikaans portfolios from KwaZulu-Natal, the IsiZulu<br />
portfolios reflect a lack of internal moderation, despite the provincial<br />
policy’s requirement that portfolios be moderated at school,<br />
cluster and provincial level.<br />
b. During the pre-moderation discussion held for Afrikaans in<br />
Mpumalanga, officials acknowledged that internal moderation was in<br />
most cases inadequate. This was confirmed in the Afrikaans<br />
Additional Language report for Provincial CASS Moderation Grade<br />
12, 2005, which stated that internal moderation of CASS was in most<br />
cases inadequate. District moderators did not check whether the totals<br />
for each section and the mark allocations were correct. Where<br />
moderation is done, it is restricted to a mere audit. No information<br />
could be found with regard to the prescribed frequency of internal<br />
moderation and feedback mechanisms to educators.<br />
c. Evidence in the learners’ History portfolios in the North West<br />
suggests that moderation is rather cursory. There is no evidence of<br />
remarking of learners’ work. The only evident difference between the<br />
educator’s marking and that of the internal moderators was the use<br />
of an ink of a different colour: one would see a mark in green<br />
pen next to one in red. It appeared that the intention was limited<br />
to giving the impression of moderation without actually coming to<br />
grips with the task. Provincial moderation is also confined to<br />
being more of a verification process, apparently necessitated by the<br />
pressure to submit final CASS marks to Head Office.<br />
d. The History portfolios in Limpopo also showed up inadequacies in<br />
internal moderation processes. In all the portfolios moderated, it was<br />
evident that portfolios had merely been audited.<br />
e. In the Eastern Cape, moderation is supposed to take place at site level<br />
once a term and at cluster level at least three times per year. However,<br />
in the Biology portfolios, proper moderation of educator tasks and<br />
learner performance is almost non-existent at site level. This is revealed<br />
by the lack of any evidence of moderation in the educators’ portfolios<br />
as well as in the learner portfolios. As far as cluster moderation is<br />
concerned, some portfolios have cluster reports in them. These,<br />
however, are checklists, which merely audited the number of pieces.<br />
There is very little in the way of constructive, qualitative feedback to<br />
individual educators with respect to the quality, appropriateness and<br />
variety of the tasks.<br />
6.5 Reliability of the assessment outcome<br />
a. In the Western Cape, detailed memorandums are available for<br />
Mathematics. In instances where the memo did not provide all alternative<br />
answers, teachers were able to assess candidates appropriately.<br />
However, in the rubrics for certain tasks, too much weighting was<br />
given to aspects such as submission within the stipulated time,<br />
method, neatness, etc.; mathematical aspects had almost the same<br />
weighting as these other aspects.<br />
b. In the Northern Cape, for Biology, there is virtually no variance in the<br />
marks awarded by teachers, the internal moderator and the external<br />
moderator, which indicates that teachers have a sound knowledge of<br />
the application of marking guidelines, whether these are traditional<br />
marking memoranda or the relatively new rubrics, rating scales and<br />
checklists.<br />
c. In the North West, the History educator at Mmanotshe Moduane High<br />
School exhibited competence in using the marking instrument for<br />
History essays, especially when applied to better performing learners.<br />
However, at the lower end of the performance scale, the educator<br />
tended to allocate higher marks to under-performing learners. This<br />
appeared to be a deliberate strategy to maintain a better average<br />
mark in the end. This is a phenomenon that has been identified in a<br />
current CASS research project in which <strong>Umalusi</strong> is engaged.<br />
Teachers seem to be overly strict on higher performing learners, whilst<br />
learners at the bottom end of the scale are advantaged.<br />
d. In Limpopo, in certain instances there is a marked difference between<br />
the mark allocated by the History educator and that of the <strong>Umalusi</strong><br />
moderator, which indicates a serious lack of understanding amongst<br />
certain teachers of:<br />
• The application of the matrix<br />
• The application of a rubric<br />
• Global marking<br />
• Levels marking<br />
• How to arrive at a mark through the writing of comments<br />
Marks are therefore unreliable, as is evident in the following instances:<br />
Name of school | District/region | Name of educator | Name of learner | HG/SG | Mark awarded by educator | Mark awarded by UMALUSI moderator/verifier<br />
Mbhangazeki High School | Mopanie | M.W. Baloyi | S. Chabalala | HG | 186/400 | 139/400<br />
Mbhangazeki High School | Mopanie | M.W. Baloyi | C. Mashimbye | HG | 130/400 | 88/400<br />
Mbhangazeki High School | Mopanie | M.W. Baloyi | M.O. Nkuna | HG | 186/400 | 167/400<br />
Mbhangazeki High School | Mopanie | M.W. Baloyi | J. Makhubele | HG | 141/400 | 120/400<br />
Letshele High School | Bohlabela | S. Moropane | G.M. Ndlovu | HG | 178/400 | 146/400<br />
Letshele High School | Bohlabela | S. Moropane | N. Mogane | HG | 168/400 | 132/400<br />
e. Still in Limpopo, History marking guidelines are often incomplete in<br />
that they:<br />
• do not provide alternative answers.<br />
• do not indicate a mark allocation.<br />
• do not identify the skills needed.<br />
• do not provide guidelines on how evidence should be used.<br />
• do not provide evidence of how to apply the rubric.<br />
7. Recommendations<br />
7.1 Policy<br />
Assessment bodies are urged to ensure that their CASS policy and<br />
guideline documents do not in any way deviate from the minimum<br />
requirements set by national policy and guidelines. In instances where<br />
assessment bodies feel the need to increase the number of assessment<br />
tasks, the impact thereof on the teacher, the learner, and on teaching<br />
and learning needs to be considered. With the current national<br />
requirements for CASS, teachers already feel overburdened, and such<br />
a decision should not adversely affect the implementation of CASS.<br />
Currently, all assessment bodies’ policies state that moderation and<br />
monitoring will take place at site, cluster/regional and provincial level. This<br />
is the ideal, but in reality, given various constraints ranging from a lack of<br />
human resources to long distances between schools, it is not being<br />
implemented. Assessment bodies are urged to revisit these policies on an<br />
annual basis so that what is set out in theory is practicable and<br />
implementable in practice.<br />
7.2 Appropriateness and standard of the assessment tasks<br />
The over-reliance on past examination papers should be clamped down<br />
on by assessment bodies. As discussed during the CASS feedback<br />
sessions held with assessment bodies in 2004, exemplars that cover a<br />
range of forms of assessment need to be developed centrally by<br />
subject specialists and experienced teachers. These should then be<br />
internally moderated and distributed to schools to assist teachers in the<br />
implementation of CASS.<br />
7.3 Standardisation across assessment bodies<br />
The National Department of Education needs to take the lead in this<br />
regard. Clearly defined standards, together with appropriately developed<br />
and assessed tasks, need to be developed by a team of subject<br />
specialists (representatives from the various assessment bodies) at<br />
national level. <strong>Umalusi</strong>’s moderators should then moderate these.<br />
Once these tasks have been finalised, they should be distributed to<br />
assessment bodies, which can use them during teacher development<br />
sessions and for the development of exemplars at assessment body level.<br />
Assessment bodies also need to put measures in place to ensure that<br />
schools within and across districts/regions develop assessment tasks that<br />
are relatively standardised. District/regional officials as well as subject<br />
advisors should be tasked with this, so that there is not too much<br />
deviation in the standard of tasks developed for CASS across the<br />
assessment body.<br />
7.4 The extent and quality of internal moderation and<br />
teacher development<br />
As mentioned under “Policy”, in most cases internal moderation of CASS<br />
is not effective and lacks rigour. Qualitative feedback is lacking and,<br />
where it does take place, it is too late to make an impact on the current<br />
cohort of learners. Assessment bodies need to ensure that internal<br />
moderation takes place at the various levels, i.e. site, district/regional<br />
and provincial, on a continuous basis. Moderation should involve the<br />
re-marking of a selection of assessment tasks (in both teachers’ and<br />
learners’ portfolios) with a view to determining the standard and<br />
appropriateness thereof. Moderation should also at all times be<br />
undertaken by subject specialists, based on criteria that have been<br />
developed at provincial level and that are aligned to those used by<br />
<strong>Umalusi</strong>.<br />
Qualitative feedback to both teachers and learners needs to take place<br />
immediately after moderation. A moderation report must be generated<br />
after each moderation session, highlighting strengths and areas of<br />
concern in relation to the criteria used for moderation. Measures need<br />
to be put in place at site/district/regional/provincial level to ensure that<br />
the feedback is indeed implemented.<br />
Focused teacher development programmes around the following are<br />
required:<br />
• Implementation of CASS and its integration into normal teaching and<br />
learning.<br />
• The difference between forms of assessment and types of assessment.<br />
• Assessment in an outcomes-based education context (the use and<br />
application of rubrics, scoring, etc.).<br />
7.5 The reliability of the assessment outcome<br />
Once teachers have been trained in the aspects related to CASS, and<br />
the internal moderation process has been stepped up, the marks awarded<br />
to learners for CASS tasks will be more realistic and reliable.<br />
8. Conclusion<br />
The proper and effective implementation of CASS continues to pose a<br />
challenge to the teachers, subject advisors and other officials within<br />
assessment bodies who are responsible for it. The standard of<br />
assessment tasks across the country also continues to vary and, in<br />
certain cases, CASS marks remain unreliable, as is evident in this<br />
report.<br />
Chapter 4<br />
Monitoring of the Conduct of the Senior Certificate Examination<br />
1. Introduction<br />
<strong>Umalusi</strong> began monitoring the Senior Certificate in 2000 and<br />
started playing a verification role in 2003. Assessment bodies are<br />
required to monitor the administration of examinations closely and report<br />
to <strong>Umalusi</strong> on a daily basis. <strong>Umalusi</strong> deploys monitors to<br />
assessment bodies’ head offices, examination centres and marking centres<br />
to verify the contents of the reports submitted to <strong>Umalusi</strong>. The<br />
following phases of the Senior Certificate Examination were monitored:<br />
• The design phase, which focuses on the state of readiness of the<br />
assessment bodies to administer the examination;<br />
• The conduct of examinations, which includes the writing and marking<br />
process; and<br />
• The capturing, processing and release of results, which includes the<br />
capturing of marks, standardisation and the release of results.<br />
From time to time, <strong>Umalusi</strong> reviews its quality assurance processes.<br />
<strong>Umalusi</strong> reviewed the monitoring instruments, and these were<br />
further revamped during the training, where the trainees made their<br />
input. <strong>Umalusi</strong> also reviewed its approach to monitoring, and a new<br />
approach was adopted for the monitoring of the Senior Certificate<br />
Examination in 2005.<br />
Monitors, together with the assessment body representatives, were trained<br />
at the <strong>Umalusi</strong> offices on 31 August 2005, where<br />
all assessment bodies were represented. However, it became clear to<br />
<strong>Umalusi</strong> that, in some assessment bodies, the representatives sent<br />
for training are not the officials responsible for monitoring.<br />
Furthermore, they do not provide feedback to the responsible<br />
people when they get back. This became evident when <strong>Umalusi</strong><br />
reminded assessment bodies of information and documentation pertaining<br />
to monitoring.<br />
2. Purpose<br />
The purpose of monitoring the administration of the Senior Certificate<br />
Examination within the various assessment bodies is to:<br />
• establish the effectiveness of the systems in place at<br />
assessment body level for the registration of candidates and the<br />
appointment of examiners, moderators, invigilators and markers;<br />
• ensure that there are appropriate security measures in place for the<br />
safekeeping of the examination material;<br />
• establish the state of readiness of the assessment bodies to administer<br />
the October/November 2005 Senior Certificate Examination;<br />
• ensure that processes related to the administration and conduct of the<br />
Senior Certificate Examination are credible;<br />
• ensure that marking is done in a fair manner and is of an appropriate<br />
and acceptable standard;<br />
• ensure that the processing and capturing of results does not advantage<br />
or disadvantage learners.<br />
3. The scope<br />
The monitoring exercise extends across the eleven assessment bodies,<br />
namely the nine provincial departments of education, the Independent<br />
Examination Board (IEB) and the Beweging vir Christelike Volkseie<br />
Onderwys (BCVO).<br />
All the phases of the examination were monitored during the 2005 Senior<br />
Certificate Examinations. <strong>Umalusi</strong> selected a sample of two subjects<br />
per assessment body, which were tracked from the monitoring of the<br />
design phase through to the capturing and processing of results. The<br />
sampling of subjects was informed by the 2004 Senior Certificate<br />
Examination monitoring reports: subjects that had experienced problems<br />
were tracked. Some assessment bodies had no problems in 2004, but<br />
the same route was taken to maintain uniformity and to ensure that<br />
proper systems are in place.<br />
The monitoring exercise started on 09 September 2005 and<br />
ended on 19 December 2005, after the standardisation meetings<br />
were finalised.<br />
4. The approach<br />
<strong>Umalusi</strong> deployed 23 monitors to monitor the conduct of the Senior<br />
Certificate Examination. The deployment of monitors took the size of<br />
each province into consideration. For example, Limpopo, the Eastern<br />
Cape, KwaZulu-Natal and Gauteng were each allocated three monitors,<br />
whereas the other provinces were allocated two monitors each. Each<br />
monitor was required to monitor two schools a day over three days;<br />
however, the convenors were allocated two days to monitor the writing<br />
phase, as one day was reserved for monitoring the design phase.<br />
The table below indicates the subjects monitored across assessment<br />
bodies. During the design phase, the monitoring focused on the quality<br />
processes with regard to the editing, translation (where applicable) and<br />
printing of the following question papers:<br />
ASSESSMENT BODY | SUBJECTS<br />
1. Mpumalanga | Geography; Accounting; Economics<br />
2. Eastern Cape | Economics; Technical Drawing<br />
3. Limpopo | Business Economics; Travel and Tourism<br />
4. Western Cape | Geography; Computer Studies<br />
5. Gauteng | Geography; Business Economics<br />
6. KwaZulu-Natal | Business Economics; IsiZulu First Language<br />
7. Northern Cape | Home Economics; Technical Drawing<br />
8. North West | Setswana; Hotel Keeping and Catering<br />
9. Free State | Sesotho; Geography<br />
10. IEB | Mathematics; Business Economics<br />
11. BCVO | Computer Science<br />
<strong>Umalusi</strong> staff were also deployed to assessment bodies to shadow<br />
monitors as they monitored the conduct of examinations. The purpose was<br />
to gain an understanding of what monitors actually do when they monitor<br />
schools, and also to provide guidance and support where necessary.<br />
Furthermore, <strong>Umalusi</strong> requested monitoring plans from assessment<br />
bodies, which were used to verify that monitoring was taking place. The<br />
selection of the sample of examination centres to be monitored during the<br />
2005 Senior Certificate Examination was based on the 2004 Senior<br />
Certificate Examination monitoring reports and the 2005 monitoring plans.<br />
Examination centres that reported irregularities in the 2004 examinations<br />
were also targeted. During the monitoring of marking, a sample of subjects<br />
that reported irregularities during the writing phase was selected, and<br />
monitors were required to establish the extent and severity thereof.<br />
Assessment bodies and <strong>Umalusi</strong> monitors were required to monitor<br />
the following phases of the examinations:<br />
4.1 The design phase<br />
During this phase, assessment bodies were required to complete<br />
self-evaluation instruments and write reports on their state of readiness.<br />
These reports were then submitted to <strong>Umalusi</strong>, and monitors were<br />
deployed to assessment bodies’ head offices to verify the information in<br />
the reports. During the verification process, monitors were required to<br />
track two subjects with regard to the processes relating to the setting,<br />
moderation, typing, editing, proofreading, translating (where applicable)<br />
and approval of question papers.<br />
4.2 The conduct of examinations (including marking)<br />
With regard to monitoring the conduct of examinations, assessment<br />
bodies were responsible for the daily monitoring of the examinations.<br />
Daily reports were submitted to <strong>Umalusi</strong> with the purpose of assisting<br />
<strong>Umalusi</strong> to react promptly to issues that need urgent attention. A<br />
sample of examination and marking centres was visited by <strong>Umalusi</strong><br />
monitors to verify that monitoring takes place at assessment body level.<br />
4.3 The capturing, processing and release of results<br />
Monitors visited each assessment body’s capturing centre for a day to<br />
ensure that marks had been correctly captured. The same subjects that<br />
were tracked during the design phase were verified.<br />
5. The findings<br />
5.1 The design phase<br />
5.1.1 General findings<br />
The following are the general findings of the monitoring process:<br />
a. Registration of candidates in all assessment bodies was done in<br />
February/March, and preliminary schedules were sent to schools by<br />
the end of March. All assessment bodies had completed registrations by<br />
June and, as indicated in the reports, no amendments or new registrations<br />
were allowed after June. By September all assessment bodies had<br />
finalised registration, and timetables had been sent to schools.<br />
It is interesting to note that assessment bodies are trying to make<br />
registration as easy and efficient as possible. For example, the<br />
Western Cape Department of Education has introduced an Internet<br />
registration system; the majority of schools in the province have<br />
access to the Internet and have therefore registered their learners online.<br />
Schools that do not have Internet access registered their learners<br />
manually by filling in the registration forms. The Northern Cape<br />
Department of Education has developed a system whereby learners are<br />
registered by computer, and schools without computers had to<br />
request assistance from neighbouring schools. The Limpopo<br />
Department of Education has outsourced data capturing to<br />
private companies.<br />
b. Assessment bodies have criteria for the appointment of examiners,<br />
internal moderators and markers. All assessment bodies train the<br />
personnel involved in the administration of examinations. However, the<br />
training of examiners and internal moderators is not done intensively in<br />
all assessment bodies.<br />
c. Training of examiners and internal moderators is done at assessment<br />
body level. However, the training is not intensive. Markers are trained on<br />
the first day of the marking session, and the training takes the form of a<br />
memorandum discussion. Assessment bodies do not have training manuals<br />
for the above-mentioned personnel, except for North West. The Western<br />
Cape Department of Education does not have a training programme for<br />
examiners, but provided evidence of the PowerPoint presentation that<br />
was used during the training. Limpopo, the Independent Examination<br />
Board, the Beweging vir Christelike Volkseie Onderwys and the Western<br />
Cape are in the process of developing training manuals.<br />
d. The processes relating to the setting of question papers went well in all<br />
assessment bodies except the Western Cape, where there was a delay<br />
in the setting of the History HG paper. The assessment body made<br />
arrangements with <strong>Umalusi</strong> for late submission for external moderation.<br />
5.1.2 Areas of concern<br />
The following are areas of concern that emerged from the reports:<br />
a. Training of examiners and internal moderators is not viewed in a serious<br />
light. This is shown by the fact that assessment bodies do not have<br />
training manuals for such personnel. No training of examiners is done<br />
in the Northern Cape, as the assessment body alleges that the people<br />
employed in these positions are experienced and have the necessary<br />
knowledge and skills for setting question papers.<br />
A major concern is that this aspect was noted and reported on in the<br />
2004 Senior Certificate Examination, and assessment bodies have<br />
done nothing to address it.<br />
b. The reports submitted by assessment bodies to <strong>Umalusi</strong> are sketchy<br />
and thus do not provide the detailed information required by the<br />
instruments.<br />
c. The Limpopo Department of Education failed to submit a detailed<br />
monitoring plan; the plan that was submitted covered the national<br />
subjects only. The reason cited for this was that assessment body<br />
monitors monitored schools that were convenient for them.<br />
5.1.3 Recommendations<br />
a. That assessment bodies send the people responsible for monitoring to<br />
the monitor training conducted by <strong>Umalusi</strong>, or ensure that the people<br />
sent for training provide feedback to those responsible for monitoring.<br />
b. That assessment bodies adhere to the timelines set for the submission of<br />
reports, and that the prescribed report formats are used.<br />
c. That all assessment bodies develop generic training manuals for<br />
examiners, internal moderators and markers.<br />
d. That the training of examiners and internal moderators be viewed in a<br />
serious light.<br />
e. That assessment bodies adhere strictly to <strong>Umalusi</strong>’s requirement to<br />
submit detailed monitoring plans.<br />
5.2 The conduct of examinations<br />

5.2.1 General findings<br />

The following are the general findings of the monitoring of the conduct of<br />
examinations:<br />

a. Chief invigilators and invigilators are appointed in writing. In most<br />
assessment bodies, school principals are appointed as chief<br />
invigilators, except in the Western Cape, where prominent community<br />
members are appointed as chief invigilators and they in turn appoint<br />
invigilators. The invigilators have to be over 23 years of age and they<br />
must not have any of their family members writing the Senior Certificate in<br />
the year they are appointed.<br />

b. In most of the assessment bodies, question papers are distributed by<br />
district or circuit officials every morning and answer scripts are collected<br />
by the same officials at the end of the examination session, except in the<br />
Western Cape and Northern Cape, where the examination material is<br />
delivered to schools by courier service in batches, thrice and<br />
twice respectively. In the Northern Cape, answer scripts are collected<br />
by courier service on a weekly basis, whereas in the Western Cape<br />
the courier company collects scripts on scheduled dates. In<br />
KwaZulu-Natal and Gauteng, question papers and answer scripts are<br />
collected from nodal points and answer scripts are returned to the<br />
same points at the end of the examination session. The Gauteng<br />
Department of Education has a system in place to account for every<br />
answer script that is issued to an examination centre. Chief invigilators<br />
are required to return all unused and spoilt answer scripts to the nodal<br />
points and sign for them.<br />

c. The chief invigilators open the question papers in front of candidates<br />
before the commencement of the examination. After the chief invigilator<br />
opens the question papers, invigilators hand them out to candidates. In<br />
most examination centres, the attendance register is circulated amongst<br />
candidates whilst the examination is in process. The invigilators check<br />
question papers with candidates to confirm that all candidates have the<br />
correct question papers and that all question papers have all their pages;<br />
however, not all chief invigilators or invigilators do this.<br />

d. Invigilators at most centres were vigilant during the examination;<br />
however, at Emmanuel Christian School and Dzimauli High School<br />
invigilators were seated, and the reason provided for this was that very<br />
few candidates were writing the paper.<br />

e. Most examination centres have seating plans; however, in Limpopo, at<br />
Techni-Ven, there was no seating plan and the reason provided for this<br />
was that not all candidates had turned up for the examination. This<br />
raises a concern because, should an irregularity be suspected, there<br />
will be no seating plan to refer to in order to justify or dispute the<br />
occurrence of the irregularity.<br />

f. Examination centres are kept clean, except at some centres where it was<br />
reported that the rooms were untidy. The centres that were reported to<br />
be untidy in Limpopo are the Dzimauli High School and Techni-Ven<br />
centres. In Gauteng, Immaculata Combined School was also found to<br />
be untidy. In Limpopo, candidates at Techni-Ven were allowed to bring<br />
cellphones into the examination room but were required to switch them<br />
off. The discrepancy was brought to the attention of the chief<br />
invigilator and he acknowledged that this was an error on his side.<br />

g. Generally, venues used as marking centres were of a good<br />
standard. The furniture at the marking centres was suitable, and<br />
communication facilities at the marking centres were available and<br />
in working order.<br />

h. The appointment of markers is done on an annual basis. The<br />
markers must have passed the subject for which they have applied<br />
to mark at second-year level and must have taught the subject at<br />
Grade 12 level in the past three years. Examination administration<br />
assistants are also appointed across the assessment bodies, and<br />
their duties are to ensure that all questions in the scripts are marked,<br />
that marks are correctly carried over to the margins and to<br />
the cover, and that the computation of marks is done correctly.<br />

i. The training of markers focuses mainly on the discussion of the<br />
memorandum across all assessment bodies. The chief markers and<br />
senior markers report at the marking centre a day before the markers<br />
to check the number of scripts against the mark sheets and sort the<br />
scripts. Thereafter, a sample of scripts is marked by the team,<br />
followed by a discussion of the memorandum during which new<br />
answers that are correct but are not included in the memorandum<br />
are highlighted.<br />
In the Western Cape, Gauteng, Northern Cape and Limpopo, one script<br />
that is to be used for the training of markers is photocopied and<br />
markers are required to mark the copied script. The memorandum and<br />
marking process are then discussed using the script as the focus of the<br />
discussions. In Limpopo and North West, markers are required to<br />
prepare marking memoranda beforehand so that their memoranda<br />
can be compared with the one developed by the examiners and<br />
refined by the chief markers and the senior markers.<br />

j. Marking is done per question in all assessment bodies. This proves<br />
to be good practice, as markers are less strained in trying to get<br />
acclimatised to the whole memorandum. Each group of markers<br />
responsible for marking the same questions has a senior marker<br />
monitoring their marking. Each group is made up of experienced and<br />
novice markers, and a lot of guidance is given to the novice markers.<br />

k. The security at marking centres is very tight across assessment bodies;<br />
however, the monitor who visited the Pietersburg High School marking<br />
centre found the security not to be as tight as at other centres.<br />
There were no guards at the storeroom, and markers moved in and<br />
out of the Biology HG P2 marking rooms without being searched. At<br />
the monitoring stage, marking was completed and some boxes of<br />
scripts were left in the centre unattended. At the University of<br />
Witwatersrand JCE in Gauteng, there was security at each exit point<br />
and the markers were required to identify themselves with cards if they<br />
went beyond the security point. Everyone leaving the marking centre<br />
was searched to ensure that no scripts were being taken out of the<br />
centre. This was found in Mpumalanga too.<br />

l. The internal moderators spent about six hours a day at the marking<br />
centre, depending on the distance they had to travel. Internal<br />
moderation is generally done at three levels: by the senior marker, the<br />
chief marker and the internal moderator. A sample of scripts is<br />
moderated by the internal moderator and feedback is provided<br />
immediately to the chief markers.<br />

5.2.2 Areas of concern<br />

a. It is disturbing to note that the noise level at some examination<br />
centres is very high during the writing of examinations. At Kheto<br />
Nxumayo, Techni-Ven and Tshikhuthula high schools in Limpopo, the<br />
noise level was high. Also, in the Western Cape, at Mitchell’s Plain Islamic<br />
High School, Isaiah Christian School, John Ramsay Secondary School,<br />
Isilimela Secondary, Masiphumele Secondary and Kylemore Secondary<br />
School, the noise level was found to be very disruptive.<br />

b. The collection of scripts at most examination centres leaves much to<br />
be desired. Invigilators require candidates to leave their answer<br />
scripts on the desks so that they can be collected in numerical order.<br />
The practice was observed in Limpopo at Mareka Senior Secondary<br />
School and at Kheto Nxumayo, Manyangannna, Tshikuthula and Dayimani<br />
High Schools. The practice was also observed at WEM School in<br />
Mpumalanga. This is unacceptable because candidates may remove<br />
scripts from the examination room as they leave.<br />

c. Unused and spoilt answer scripts were left lying around in the<br />
examination room at Kheto Nxumayo in Limpopo. Candidates may<br />
use these answer sheets to scribble notes on the back and use<br />
them when writing other papers.<br />

d. At Rosebank House Damelin, the invigilators were uncertain about the<br />
procedure to be followed in the oral component of the Italian and<br />
Spanish question papers.<br />

e. In Mpumalanga, at Dumezizwe, Bankfontein and Eastdene,<br />
candidates were allowed to write examinations without identity<br />
documents bearing their photos. Invigilators relied only on letters<br />
of admission and/or timetables for identification purposes. This is<br />
not sufficient for the identification of part-time candidates, as this allows<br />
for ghostwriters. Also, in the Western Cape, because invigilators are<br />
not teachers employed at the schools, it is necessary that all<br />
candidates carry identity documents, as invigilators do not know<br />
the candidates.<br />

f. At Laudium Secondary School in Gauteng, a full-time candidate had<br />
forgotten to take along his identity document to school and was<br />
required to go home to collect it fifteen minutes prior to the<br />
commencement of the examination. The candidate contacted his parents<br />
telephonically and the identity document was delivered instead. The chief<br />
invigilator cautioned against such practice in future, as it unsettles the<br />
candidate before the commencement of the examination and could result<br />
in the candidate being late for the paper.<br />

g. In the Western Cape, full-time candidates do not bring identity<br />
documents to the examination room and the invigilators do not know<br />
the candidates. This may encourage candidates to behave in an<br />
irregular manner.<br />

h. The chief invigilator at Forbes Grant Secondary Senior School was<br />
not present at the school and had not delegated the responsibility<br />
to anyone. Invigilators at the mentioned centre were also not<br />
appointed in writing.<br />

i. The training of markers does not address how markers may detect<br />
irregularities or how to handle them.<br />

j. No irregularity registers were kept at the marking centres, as required<br />
by <strong>Umalusi</strong>’s Directives for Reporting of Irregularities.<br />

k. At Sarel Cilliers in KwaZulu-Natal, some markers arrived late, after<br />
training was completed, and were not trained.<br />

l. The marking rooms used at Tshwane North College and the University of<br />
Witwatersrand JCE were small and could not host all markers of the<br />
same subject. This called for a number of rooms to be used, and thus<br />
scripts had to be moved from one room to the other. The safety of<br />
scripts was not guaranteed at Tshwane North College, because the<br />
scripts were placed outside the marking room for one of the marking<br />
personnel to take them to the next marking room.<br />

6. Recommendations<br />

6.1 That assessment bodies ensure that both full-time and part-time<br />
examination centres have seating plans.<br />

6.2 That chief invigilators ensure that no scripts are left unattended on desks<br />
towards the end of the examination session.<br />

6.3 That senior management at schools control the noise levels at<br />
examination centres while the examination is in process.<br />

6.4 That an irregularity record book be kept at examination centres,<br />
circuit offices, district offices and head offices. An entry must be<br />
made therein by the person to whom the irregularity report is<br />
provided, and the date on which the irregularity is reported must be<br />
indicated. This will assist in tracking the process followed and<br />
the levels at which the irregularity is reported.<br />
6.5 Candidates in the Western Cape must be required to take<br />
identity documents along to the examination room, as invigilators are<br />
members of the community who do not know them.<br />

6.6 Chief invigilators should under no circumstances deny full-time<br />
candidates access to the examination room because they do not<br />
have their identity documents. According to the Policy on the Conduct of<br />
the Senior Certificate Examination, the invigilator may admit full-time<br />
candidates at a school on production of an admission letter only.<br />

6.7 That assessment bodies thoroughly train markers on all aspects<br />
related to marking.<br />

6.8 That assessment bodies keep the irregularity register at the marking<br />
centres and that the register bear the signature of the departmental<br />
official to whom the irregularities were reported.<br />

6.9 That assessment bodies ensure that all markers attend training to ensure<br />
consistent marking.<br />

7. The capturing and processing of results<br />

7.1 General findings<br />

a. Marks of candidates are captured per question and the computer program<br />
then adds all the marks. If the total written on the script<br />
tallies with that provided by the computer, the capturers are assured<br />
that the computation has been done correctly.<br />

b. Assessment bodies do not rely only on the mark sheets for the<br />
capturing of candidates’ marks. This is good practice, because<br />
computation errors can be detected and corrected during the<br />
capturing process. However, the Beweging vir Christelike Volkseie<br />
Onderwys (BCVO), Free State and North West capture marks<br />
directly from the mark sheets.<br />

c. Assessment bodies appoint and train data capturers. The capturing<br />
process does not take longer than five days. A double-capture<br />
system is used to ensure accuracy.<br />
d. Data capturers are required to sign confidentiality forms before they<br />
can commence with the capturing process. The security at capturing<br />
venues is very tight and no one is allowed to pass through<br />
security without producing positive identification.<br />

e. The capturers can access the data only via their user identity and<br />
password. This makes it difficult for unauthorised people to access the<br />
data and, if data is tampered with, the system will assist in tracing<br />
the person who might have tampered with it. To ensure that<br />
capturing is of a high quality, the Free State Department of Education had<br />
two sessions of capturing, where the first group, which captured marks from<br />
08h00 to 14h00, was replaced by another group that captured from<br />
14h00 to 20h00. This helps to ensure that the capturers are not so<br />
overcome by fatigue that they commit errors.<br />

f. The Services Information Technology Authority (SITA) captures the<br />
adjustments for all assessment bodies. Free State uses GIJIMA, which is<br />
an agent of the Services Information Technology Authority. These<br />
adjustments are then sent to the assessment bodies. <strong>Umalusi</strong><br />
personnel check the adjustments to ensure they have been implemented<br />
according to the agreements made during the standardisation meeting.<br />
Errors are communicated to SITA, and SITA in turn sends these corrections<br />
to the assessment bodies.<br />

7.2 Areas of concern<br />

a. Some assessment bodies rely solely on the mark sheets for the<br />
capture of marks. This is a cause for concern, as some of the scripts<br />
might have computation errors that cannot be detected at any stage of<br />
the capturing process.<br />

7.3 Recommendation<br />

a. All assessment bodies must use the double-capture system to ensure<br />
the accuracy of candidates’ marks.<br />

8. Irregularities<br />

<strong>Umalusi</strong>, as part of its mandate of ensuring that the required standards<br />
in the Senior Certificate Examination are met, monitors the conduct of<br />
this examination. Monitoring the extent to which assessment bodies<br />
handle examination irregularities in accordance with national policy is<br />
part of <strong>Umalusi</strong>’s broad function of monitoring the conduct of<br />
examinations. With respect to monitoring irregularities, <strong>Umalusi</strong> utilises<br />
the following strategies:<br />

• Deployment of national teams to monitor the conduct of examinations.<br />
• Use of a common monitoring instrument across all assessment bodies.<br />
• Daily reports from assessment bodies.<br />
• Teleconferences with assessment body officials.<br />

8.1 Findings<br />

In general, assessment bodies deal effectively, efficiently and quickly with<br />
irregularities that are defined as “technical” in the regulations. Part of<br />
the reason for this is that there are very clear procedures outlined in the<br />
regulations that assessment bodies must follow in handling this type of<br />
irregularity. Irregularities in this category are fairly easy to deal with.<br />
Furthermore, the recent establishment of the National Examinations<br />
Irregularities Committee (NEIC) has helped both to expedite the process<br />
of dealing with irregularities and to create a structured manner of dealing<br />
uniformly with irregularities. It has also put pressure on assessment bodies<br />
to settle irregularities speedily.<br />

The nature of irregularities in 2005 followed an established trend of<br />
similar kinds of irregularities that are reported to <strong>Umalusi</strong> on a yearly basis,<br />
and they include the following:<br />

• Non-delivery of question papers.<br />
• Candidates writing without positive identification as defined in the<br />
regulations.<br />
• Candidates leaving the examination room before the stipulated time.<br />
• Power failures affecting subjects like Typing, Computyping and<br />
Computer Studies.<br />
• Candidates changing grades at the time of writing.<br />
• Bomb scares.<br />
• Disruptions.<br />
• Rain (especially in KwaZulu-Natal and the Eastern Cape).<br />
• Errors in question papers.<br />
• Negligence by invigilators, such as confusing examination starting times.<br />
• Opening the wrong question paper.<br />
• Use of crib notes.<br />
• Ghost candidates.<br />
There were also irregularities of a more serious nature which the<br />
assessment bodies could not finalise quickly, because they required more<br />
investigation time or were, for one reason or another, out of the<br />
hands of the assessment body concerned. Nonetheless, some of them<br />
were successfully resolved by the assessment body before the<br />
standardisation process. These irregularities include the following:<br />

a. A question paper was written prior to the scheduled date at<br />
Kuruman Prison within the Northern Cape Department of Education.<br />
This irregularity was successfully resolved by the assessment body.<br />
The security and integrity of the examination were not compromised,<br />
as the paper did not leak.<br />

b. The Animal Husbandry paper in Limpopo was discovered to be a<br />
replica of the March 2005 paper. This irregularity was successfully<br />
handled by the assessment body, which rescheduled the examination<br />
for another date and administered another paper.<br />

c. The IsiXhosa Literature paper in the KwaZulu-Natal Department of<br />
Education contained setworks that were not prescribed for 2005. The<br />
assessment body called the examination off and applied the <strong>Umalusi</strong><br />
concessions policy to compensate candidates.<br />

d. A Computer Studies teacher in the Western Cape Education<br />
Department tampered with the discs containing the candidates’<br />
answers in order to adjust their answers. This was picked up during<br />
marking and the added answers were disregarded. Action is in<br />
progress against the said teacher.<br />

e. There were alleged leakages of the Mathematics Higher Grade Paper<br />
2 in the Gauteng as well as the KwaZulu-Natal Education Departments.<br />
Neither assessment body could establish any leaks. Both cases are being<br />
investigated by the police.<br />

f. The Beweging vir Christelike Volkseie Onderwys’ chief marker<br />
arrived at the marking centre after marking had commenced. She<br />
did not provide any reason, nor did she report that she would be<br />
late. When she arrived, she marked scripts from a centre where she<br />
offers extra Biology classes. She was dismissed immediately after<br />
this was discovered, and the scripts were remarked. It was<br />
discovered that candidates from that centre had been awarded marks<br />
they did not deserve.<br />
g. The Independent Examinations Board reported that there was<br />
possible candidate assistance at Volkskool Vryburg in the<br />
Afrikaans 1st Language HG paper, Biology SG Paper 1,<br />
Computyping SG and Afrikaans 2nd Language HG Paper 2. Other<br />
irregularities reported pertain to School-Based Assessment, and these<br />
were dealt with by the assessment body.<br />

9. Strengths<br />

9.1 The inception of the National Examinations Irregularities Committee<br />
has strengthened the ability of assessment bodies to deal effectively<br />
with irregularities of all kinds.<br />

9.2 Assessment bodies are dealing with technical irregularities quickly,<br />
efficiently and in accordance with the regulations.<br />

9.3 There is a general improvement in the level of compliance with the<br />
requirement to submit daily reports to <strong>Umalusi</strong> and to submit reports<br />
on all irregularities that take place at the various stages of the<br />
examination.<br />

9.4 There is general compliance among assessment bodies with the<br />
requirement to establish Provincial Examinations Irregularities<br />
Committees.<br />

9.5 There is a significant reduction in the incidence of leakages of<br />
examination question papers. In fact, none of the alleged<br />
leakages could be established to be true.<br />

10. Areas of concern<br />

10.1 Some assessment bodies do not comply satisfactorily with the<br />
requirement to submit daily reports to <strong>Umalusi</strong>. The consolidated<br />
reports submitted to <strong>Umalusi</strong> were not in the prescribed format. A<br />
case in point is the Limpopo Department of Education.<br />

10.2 Some technical irregularities that are fairly straightforward remain<br />
unresolved for too long.<br />

10.3 Some assessment bodies have not complied with the requirement<br />
to establish Provincial Examinations Irregularities Committees.<br />
10.4 Alleged question paper leakages do not seem to be resolved<br />
convincingly in the shortest possible time. There is a general<br />
tendency to palm these off to the police.<br />

11. Recommendations<br />

11.1 Assessment bodies are urged to adhere strictly to <strong>Umalusi</strong>’s<br />
Directives for Reporting Irregularities by reporting in accordance<br />
with the requirements thereof.<br />

11.2 Assessment bodies should put in place mechanisms to ensure the<br />
quick and efficient resolution of technical irregularities.<br />

11.3 Provincial Examinations Irregularities Committees must be<br />
established in all provinces to facilitate the quick and efficient handling<br />
of irregularities.<br />

11.4 Assessment bodies need to put in place strategies to investigate<br />
question paper leakages in as short a time as possible.<br />

11.5 As per the Regulations for the Conduct, Administration and<br />
Management of Assessment for the Senior Certificate, paragraph<br />
31 (l) and (m), candidates are required to submit their answer<br />
sheets and any other aid issued to them before they leave the<br />
examination room. The invigilator is required to tick off the name<br />
of the candidate on the mark sheet to confirm the presence<br />
or absence of that candidate. Chief invigilators and<br />
invigilators must adhere to the above-mentioned regulation, as<br />
there were cases where candidates left the examination room<br />
with their answer scripts.<br />

12. Conclusion<br />

From the monitors’ and assessment bodies’ reports, it is evident that all<br />
assessment bodies have systems in place to ensure the smooth running of<br />
examinations. The reports allow <strong>Umalusi</strong> to confidently declare the results<br />
of the 2005 Senior Certificate Examinations credible, despite the irregularities<br />
that were reported. However, assessment bodies must view the need for<br />
the training of examiners and internal moderators in a serious light.<br />
<strong>Umalusi</strong> has in place the required quality assurance measures to<br />
monitor irregularities. The bulk of the irregularities reported in 2005 were<br />
of a general nature, consistent with the types of irregularities that<br />
have been reported over the years. There were, however, some serious<br />
irregularities that were reported and conclusively dealt with by assessment<br />
bodies. On the whole, therefore, the assessment bodies have handled the<br />
irregularities that occurred in 2005 with a fair measure of success. The<br />
establishment of the National Examinations Irregularities Committee and<br />
its provincial counterparts has strengthened the arm of assessment bodies in<br />
dealing with irregularities.<br />
Chapter 5<br />
Moderation of marking<br />
1. Introduction<br />
The moderation of marking is of critical importance, as it determines, to a<br />
large extent, the standard and quality of marking<br />
and ensures that marking is conducted in accordance with agreed<br />
practices.<br />

<strong>Umalusi</strong> moderates the marking of scripts by deploying external<br />
moderators to marking centres during the marking process; for<br />
nationally examined subjects this process takes place at <strong>Umalusi</strong>’s offices<br />
in Pretoria. External moderators are deployed to the marking centres to<br />
ensure that:<br />

• the memorandum is correctly interpreted;<br />
• consensus is reached on memorandum interpretation;<br />
• relevant additions and alternatives to be included in the<br />
memorandum are accepted and finalised;<br />
• all markers have a clear understanding of the memorandum,<br />
in order to ensure the accurate and consistent allocation of marks to<br />
candidates;<br />
• the standard of marking and internal moderation of scripts is<br />
maintained across all examining bodies and throughout the<br />
marking process;<br />
• positive marking practices are reinforced;<br />
• all the systems and processes that relate to marking are in place<br />
and effective;<br />
• immediate feedback during the marking process is provided, so that<br />
learners are not unfairly advantaged/disadvantaged; and<br />
• <strong>Umalusi</strong> moderators gain an understanding of the standard and<br />
appropriateness of the examination question paper and the<br />
common problems experienced by candidates.<br />

Moderators report comprehensively on their findings, so that <strong>Umalusi</strong> can<br />
evaluate the marking process for the Senior Certificate Examination and<br />
take the necessary steps to ensure the quality and validity of this particular<br />
aspect of the examination process.<br />
2. Approach and scope of the moderation of marking<br />

<strong>Umalusi</strong> engages in the following during the moderation of marking:<br />

• Pre-marking/memorandum discussion sessions;<br />
• Centralised moderation of marking; and<br />
• On-site moderation of marking.<br />
2.1 Pre-marking/Memorandum discussion session<br />

For papers set by assessment bodies, moderators are deployed to the<br />
various assessment bodies whilst memorandum discussions are in progress. In the<br />
case of the nationally examined papers, the memorandum is finalised at a<br />
national memorandum discussion that takes place at a central venue organised<br />
by the DoE.<br />

Moderators are required to ensure that the final memorandum is appropriate,<br />
does not unfairly advantage or disadvantage candidates, and allows for various<br />
alternatives where possible.<br />

The final memorandum is then signed off by the external moderator prior<br />
to it being sent to the assessment bodies.<br />

2.2 Centralised Moderation of Marking<br />

For this year, the six nationally examined subjects were selected for this<br />
process. Assessment bodies were required to submit a sample of 20 HG<br />
and 20 SG scripts to the <strong>Umalusi</strong> offices following the commencement of<br />
marking. The following table provides the range of the sample of scripts<br />
selected for moderation:<br />

Symbol: Number of scripts<br />
A: 1<br />
B: 1<br />
C: 3<br />
D: 3<br />
E: 4<br />
F: 2<br />
FF: 2<br />
G: 2<br />
GG: 1<br />
H: 1<br />
The subjects selected for moderation were:<br />
Seven of the nine public assessment bodies participated in this process. The<br />
Subject<br />
1. English Additional Language<br />
No. of moderators<br />
Western Cape and Gauteng Provinces were excused because their marking<br />
dates were outside <strong>Umalusi</strong>’s planned dates for the moderation of marking.<br />
However, Gauteng sent their Physical Science scripts to <strong>Umalusi</strong> as they were<br />
P1 + 3, HG and SG 3<br />
2. Mathematics P1 and P2,<br />
HG and SG 4<br />
3. Physical Science P1 and P2,<br />
HG and SG 4<br />
4. Biology P1 and P2, HG and SG 4<br />
5. History P1 and P2, HG and SG 4<br />
6. Accounting, HG and SG 3<br />
marking at the same time as <strong>Umalusi</strong>’s programme began.<br />
2.3 On-site Moderation of Marking<br />
Moderators were deployed to marking centres across all assessment<br />
bodies to moderate a sample of 20 HG and 20 SG marked scripts.<br />
Recommendations were provided to the Chief Marker and/or the Internal<br />
Moderator, who in turn ensured that the recommendations were implemented.<br />
Moderation of the nationally examined subjects took place over five days,<br />
from 28 November to 2 December 2005. Moderators remarked,<br />
in certain cases, whole scripts, and in others remarked a sample of<br />
questions only. Feedback in the form of reports was sent to the<br />
assessment bodies on a daily basis, in which recommendations for<br />
improving marking were made. Assessment bodies<br />
were required to implement this feedback immediately where necessary.<br />
Eleven moderators were deployed to all eleven assessment bodies for<br />
eleven subjects. The process took place over two days.<br />
The table on the following page provides details of the assessment<br />
bodies and subjects moderated.<br />
Assessment body Subject<br />
BCVO Woodwork<br />
IEB Geography<br />
Eastern Cape IsiXhosa Primary Language<br />
Free State Computer Studies<br />
Gauteng IsiZulu Additional Language<br />
KZN Business Economics<br />
Limpopo TshiVenda Primary & Additional Language<br />
Mpumalanga IsiNdebele Primary & Additional Language<br />
North West Needlework<br />
Western Cape Biblical Studies<br />
2.4 Deployment of staff members<br />
In addition to the deployment of moderators, <strong>Umalusi</strong> deployed staff<br />
members to specific marking centres to ensure that recommendations<br />
were being implemented, and also to monitor the process followed<br />
by assessment bodies, with a view to identifying good practice and<br />
providing guidance to assessment bodies in need thereof.<br />
3. Purpose of the report<br />
The purpose of this report is to provide a synthesis of the seventeen<br />
reports received by <strong>Umalusi</strong> on the moderation of marking for the six<br />
nationally and eleven provincially examined subjects, in order to highlight<br />
strengths and areas of concern related to marking among the various<br />
assessment bodies.<br />
The following points served as the criteria for the moderation of marking:<br />
• Standard of marking;<br />
• Adherence to the marking memorandum;<br />
• Consistency and accuracy in the allocation of marks;<br />
• Evidence of internal moderation and the rigour thereof; and<br />
• Competency of the markers.<br />
4. Findings<br />
4.1 Nationally examined subjects<br />
Generally, markers adhered to the memorandum and the standard of<br />
marking was good. The marking guideline agreed upon at the central<br />
memorandum discussion was adhered to. Although there were some<br />
inconsistencies in mark allocation, these could be ascribed to errors made by<br />
markers and moderators. The variation in mark allocation was small and<br />
made no significant difference. Many of these errors were picked up<br />
during the internal moderation process. Many of the inconsistencies<br />
occurred at the beginning of the marking session, which points to the<br />
importance of standardisation before marking commences.<br />
With the exception of question 5 in Accounting HG, there were no<br />
changes to the memorandum during the marking process. All assessment<br />
bodies were notified of these additions.<br />
There was remarkable consistency in the marks allocated. However,<br />
there was some inconsistency in the allocation of marks to the extended<br />
piece of writing in English Paper 3 in KwaZulu-Natal. There was a<br />
tendency towards the bunching of marks, with markers not recognising<br />
exceptionally good pieces of writing and overrating mediocre attempts.<br />
For Biology P1 HG, it was observed that the final examination question<br />
paper did not quite represent the final version of the moderated paper.<br />
The following was observed:<br />
• In question 2.2, the grey shading in the graph and key did not print<br />
clearly in some question papers in all provinces.<br />
• In question 5.1.5, the mark allocation was changed from 4 to 2.<br />
• In question 5.1.7, the mark allocation was changed from 2 to 4.<br />
• In KwaZulu-Natal, the layout of the paper was changed. As a result,<br />
part of question 3.1 appeared at the foot of page 12 and was<br />
repeated at the beginning of page 13, with a new number. This<br />
altered the numbering of subsequent questions.<br />
These observations necessitated the following changes:<br />
• All candidates were given full marks (3) for question 2.2.4.<br />
• The mark allocations for questions 5.1.5 and 5.1.7 were adjusted so<br />
that no candidate was disadvantaged.<br />
In Physical Science Papers 1 and 2, changes were made to the<br />
original memo and these were accepted. It was, however, noted that the<br />
chief examiners and internal moderators had not received ample time to<br />
hold pre-discussion sessions and to mark a sample of scripts, as the<br />
memo discussion was held two days after the paper had been written.<br />
For paper 2, recommendations made on the question paper were<br />
implemented except in the following cases:<br />
• Question 8.2 – the question was too difficult for SG. The external moderator<br />
had recommended that it be replaced, but this did not happen.<br />
• The external moderator had also requested that question 8.2.2 be<br />
replaced because of a lack of clarity.<br />
• Question 4.1 – it was suggested that the molecule be drawn as a<br />
straight chain and not a bent chain. This also did not happen.<br />
• In the HG paper, the number of one (1) mark questions was not<br />
substantially reduced as recommended. The external moderator had<br />
called for a reduction because these questions tend to make the paper<br />
too long.<br />
In Limpopo a number of issues were identified which had the<br />
potential of compromising candidates. The entering of totals was<br />
extremely poor. In HG alone, six questions were unmarked and<br />
markers were not entering marks on mark sheets. Several errors in<br />
totals were found. The quality of internal moderation was very poor.<br />
A number of cases were observed where the Chief Examiner<br />
awarded candidates zero when they deserved full marks. In some<br />
cases candidates were awarded full marks when many steps were<br />
missing from their responses. Some candidates were awarded marks for<br />
responses that were not marked.<br />
In Mpumalanga there was strong evidence that the moderator was<br />
not familiar with the marking memo, and also that the markers were not<br />
competent.<br />
In Mathematics HG Paper 2, it was noted that the marking memo was<br />
generally adhered to in all provinces. The marking memo submitted by<br />
North West for HG included five changes to the mark allocation of some<br />
questions and varied from the approved national memo. Four of these<br />
changes were approved, as they were consistent with the notes included<br />
in the final memo.<br />
In Mathematics Paper 1 SG the following was observed:<br />
• In KwaZulu-Natal, it was observed that in some cases markers did not<br />
append their initials/signatures to marked questions. Markers tended<br />
to award learners more marks than they deserved.<br />
In the Mathematics SG Paper 2, it was observed that the marking memo<br />
was generally adhered to, with the exception of Mpumalanga, where it<br />
was found that the memo was not strictly adhered to. The deviation of<br />
marks ranged from –1 to +13.<br />
In Physical Science Paper 2, amendments were made mostly to the<br />
SG memo. Question 5.1.1 was asked in both HG and SG, therefore<br />
amendments were also made to the HG memo for this question. These<br />
amendments were considered necessary because candidates were either<br />
penalised for correct answers, or else the correct answers, not being in<br />
the memo, were ignored.<br />
In Physical Science Paper 1 it was observed that four of the eight<br />
provinces did not include the marking memorandum used, which made<br />
it much more difficult to ascertain whether marking was conducted in<br />
accordance with the final memo.<br />
In general there was consistency in the allocation of marks and accuracy<br />
of totals in both grades, and the internal moderator, chief marker and<br />
senior markers did a good job of detecting inconsistencies and sorting<br />
them out in the moderation process. There were, however, inconsistencies in<br />
Limpopo HG – questions were marked correctly but marks were not<br />
allocated, and marks were incorrectly transferred to the front cover and to the<br />
computerised sheets.<br />
In both the SG and HG, checking and control of marks and totals was<br />
the most significant weakness noted. Only two provinces had visible and<br />
consistent controls in place, while some appeared to have none at all. In<br />
one province there were a number of errors in this respect, some of them<br />
committed by the moderator. As best as could be ascertained from<br />
the limited evidence, moderation ranged from reasonable to excellent.<br />
All provinces showed evidence of internal moderation, with all the<br />
questions being moderated by the internal moderator. It was clear that<br />
all provinces had internal moderation procedures in place.<br />
In English Additional Language Papers 1 and 3, HG and SG, there was<br />
general consensus that the question papers and memoranda had been<br />
well constructed and that there were no ambiguities.<br />
Overall, the Eastern Cape and Western Cape produced the best<br />
marking and controls.<br />
Judging from the candidates’ responses to questions, the question papers<br />
were deemed to be fair. The questions were clear and easily understood,<br />
and the candidates had a fair opportunity to exhibit their knowledge. The<br />
papers were of an appropriate standard. In English Second Language<br />
HG P1, the candidates’ responses revealed that levels of proficiency in<br />
English were generally low. In Accounting, it was noted that many HG<br />
candidates could not answer the ratio/theory type of questions. The<br />
formulae were often not even known.<br />
There was evidence of internal moderation in that senior markers<br />
moderated their groups and, in turn, the deputy chief marker moderated<br />
his/her allocation of scripts. All internally moderated scripts were marked<br />
‘moderated’ and signed by the internal moderator.<br />
5. Provincially examined subjects<br />
Generally, the marking memorandum agreed upon by the chief marker,<br />
senior markers and markers was followed strictly. However, some new<br />
alternatives from candidates’ responses were also considered and added<br />
to the existing memorandum. For example, in IsiZulu Additional Language<br />
P3 HG in Gauteng, in question 1 more detail was added under each of the<br />
listed sub-categories for the letter, e.g. word division and spelling under<br />
language, paragraphing under style, and relevance to the theme of the letter<br />
under content. In question 2.2, the topic of the letter (for 1 mark) was<br />
added. In question 3.3, an example of what an acceptable Curriculum<br />
Vitae may look like was added. In questions 1.3 and 1.4, the<br />
learner was required to also give the topic of the essay.<br />
6. Strengths<br />
6.1 The memo discussions are constructive, with the chief markers<br />
participating freely and in good spirit.<br />
6.2 The chief markers are conscientious about sticking to the<br />
memorandum during marking.<br />
6.3 It is clear that all assessment bodies have internal moderation<br />
procedures in place during the marking session.<br />
6.4 The external moderators found no evidence of any irregularities in<br />
the sample drawn from the provinces.<br />
6.5 The marking process was well controlled.<br />
6.6 The chief marker was supported immediately by the external<br />
moderator on issues of marking.<br />
6.7 Candidates’ responses were included as alternative answers to the<br />
memorandum.<br />
6.8 Accuracy in the calculation and recording of marks improved as the<br />
chief marker became vigilant.<br />
6.9 Marks for each question were entered in pencil on the cover of<br />
the answer sheet and re-written in red pen only after being<br />
verified by the Examination Assistant and the Chief Marker.<br />
7. Areas of concern<br />
7.1 The external moderators were not convinced that every examining<br />
body devoted sufficient time to training before markers began marking;<br />
this issue requires urgent attention.<br />
7.2 The bunching of marks was a problem in some cases, and<br />
this could probably also be traced to insufficient training at the<br />
beginning of the marking session. An uncertain marker tends to take<br />
the safe route and avoids awarding very high or very low marks.<br />
7.3 Chief markers and internal moderators need to take the task of the<br />
Examination Assistants very seriously, as several cases were found<br />
where the adjusted mark had not been captured in the total. Every<br />
script should be checked for computational errors. This issue was<br />
prevalent in the Limpopo and Free State provinces in English,<br />
Physical Science and History.<br />
7.4 The level of proficiency in English is generally low.<br />
7.5 In Accounting the following exceptions were noted:<br />
- The theory (HG and SG) and ratio and analysis (HG only) markers were<br />
not familiar with this type of questioning, and thus candidates were<br />
often penalised.<br />
- Theory questions/opinions also needed to be marked in terms of<br />
the candidate’s calculations. Some conclusions of candidates were<br />
absolutely correct in terms of their working and were not given<br />
full credit.<br />
- Totals and sub-totals were also method marked, with an inspection<br />
on the part of the markers. This was often not adhered to – either<br />
every total was simply marked or, in other cases, these marks were not<br />
awarded.<br />
- Method marks for the final answers in the ratio calculations were<br />
subject to at least one of the items reflected in the ratio<br />
being correct. This was not always adhered to, and candidates<br />
were awarded the method marks with no item being accurate.<br />
- The concept of foreign items was not consistently applied.<br />
- Markers did not always mark the calculations/reconciliation in<br />
question 2 HG and question 4 SG fairly.<br />
- There is a lack of consistency in the use of different coloured pens.<br />
For example, in Biology in one script you find red, green and<br />
black pens being used by the marker, senior marker and chief<br />
marker respectively. In another script, from the same province,<br />
green would represent the highest level of moderation. In North<br />
West province, internal moderators used blue pens. This should<br />
be discouraged, since most learners write in blue.<br />
- In Physical Science Paper 1, it was reported that the memo<br />
discussion meeting was held two days after the paper was written.<br />
This afforded members very little time to carry out pre-discussion<br />
marking, and in most cases members had no opportunity to do any<br />
marking as they could not source any scripts at all.<br />
- The memorandum submitted for external moderation is, in many<br />
cases, not the finalised version. In Physical Science Paper 2 SG,<br />
the memo submitted to the external moderator was not complete.<br />
Thus the first day of the discussion meeting was devoted to finalising<br />
and preparing the memo.<br />
7.6 In isiNdebele, in Mpumalanga, the external moderator mentioned<br />
that out of the 34 moderated scripts a number of computational<br />
errors were identified, e.g.<br />
- Script No. B0000001947771 was given 32 instead of 37<br />
- Script No. B0000001947750 was given 63 instead of 68<br />
- Script No. B0000001947711 was given 37 instead of 53<br />
- Script No. B0000001947775 was given 55 instead of 65<br />
- Script No. B84041757856083 was given 49 instead of 52<br />
In the HG P2, candidates performed poorly, especially in questions<br />
1, 2 and 3. These questions were based on poetry. Indications are<br />
that candidates were either not properly taught or did not<br />
understand poetry at all. In other instances candidates did not<br />
read or understand instructions and sub-instructions on the question<br />
paper.<br />
7.7 In the Free State, Computer Studies SG P2, one centre (Tweeling) had<br />
a problem with stiffy drives and printers. Several of the scripts were<br />
incomplete. Evidence was found in two centres of possible<br />
irregularities. Printouts of several candidates with identical incorrect<br />
answers were found, as well as cases where the core of an incorrect<br />
answer was identical to that of another candidate, although some of<br />
the surrounding material (labels, names, exam numbers, etc.) was<br />
different. The similarities in one of these cases were of such a nature<br />
that the possibility of guidance in the large group cannot be excluded.<br />
7.8 In the North West, the external moderator for Needlework and<br />
Clothing observed that the number of facts that the candidates<br />
were to give in their responses was not always specified. For<br />
example, “Why should garments be lined?” [5]. The five stands for<br />
the number of marks, and this could confuse learners and lead them<br />
to giving five reasons.<br />
7.9 In Gauteng, IsiZulu Additional Language P3 HG, there was a<br />
tendency to award marks somewhat too generously. For example, in<br />
question 2.2 (letter) a learner was allocated 5 out of 7 marks for<br />
structure despite the fact that his/her letter contained no salutation, no<br />
conclusion and no paragraphing. In another instance, in question 2.3,<br />
a learner was awarded 15 out of 20 marks for dialogue despite the<br />
fact that his/her entire answer was given in essay form.<br />
8. Recommendations<br />
8.1 More time should be spent on training and standardisation so that<br />
markers are well equipped before marking commences. The<br />
photocopying of a sufficient number of specimen papers should be<br />
budgeted for. The training session should continue until an<br />
acceptable level of standardisation has been achieved.<br />
8.2 The variation in mark allocation should be kept to a minimum.<br />
Chief markers and internal moderators should play a crucial role in<br />
this regard.<br />
8.3 Examination assistants, under the supervision of the senior markers,<br />
must check every single script for accuracy of totals and the<br />
transferring of marks to the cover page.<br />
8.4 In order to assist second language learners, bilingual answer books<br />
should be avoided. Answer books should have two separate<br />
versions – one for English and one for Afrikaans. English and<br />
Afrikaans headings should not be combined in one column.<br />
8.5 Memo discussion meetings should not be held too soon after the<br />
papers have been written.<br />
8.6 Questions that require candidates to give a certain number of facts<br />
should always clearly indicate how many facts/items are required.<br />
8.7 Sophisticated language should be avoided, as language is a<br />
serious problem for English Second Language learners.<br />
8.8 The selection and appointment of markers in both Limpopo and<br />
Mpumalanga needs to be reconsidered.<br />
8.9 Markers need to ensure that they apply the method of consistent<br />
accuracy in the marking of all questions.<br />
8.10 Internal moderators should remain at the marking centre for the<br />
complete duration of the marking so as to give the necessary<br />
support to the chief marker.<br />
8.11 Curriculum implementers/subject advisors should also be part of<br />
the marking process as monitors.<br />
8.12 Assessment bodies must ensure that they have sufficient examination<br />
assistants for checking computational issues.<br />
8.13 Examiners should strive to implement changes suggested by the<br />
external moderator.<br />
9. Conclusion<br />
Monitoring of marking centres by <strong>Umalusi</strong> staff revealed that administrative<br />
and logistical arrangements were commendable. Venues were found to be<br />
conducive to marking, and there was sufficient security in and around<br />
the marking centres. Marking space was adequate. No unauthorised<br />
person was allowed into the marking venue.<br />
On the whole, the process of moderation of marking ensured that the<br />
standard of marking and moderation was appropriate, and there was great<br />
improvement at all levels of marking.<br />
Chapter 6<br />
Report on the standardisation of the 2005 Senior Certificate results<br />
1. Introduction<br />
<strong>Umalusi</strong> standardises both the examination marks and the CASS scores<br />
presented by the different schools in the country. Standardisation of the<br />
examination marks is necessary to address the variation in the standard<br />
of the question papers and marking that may occur from year to year and<br />
across examining bodies. Through the standardisation and the other<br />
quality assurance processes, <strong>Umalusi</strong> aims to ensure that the Senior<br />
Certificate Examination yields results that are comparable across years<br />
and across examining bodies. The standardisation of examination marks<br />
is based on the principle that when the standards of examinations from one<br />
year to the next are equivalent, and they are taken by a sufficiently large<br />
body of candidates, then their statistical mark distributions should correspond.<br />
The same should hold across different examining bodies and<br />
across different subjects, taking into account any historical differences that<br />
may exist in the schooling of candidates across examining bodies and<br />
any intrinsic differences that may exist between subjects, and between the<br />
higher and standard grade levels.<br />
Continuous assessment (CASS) is the assessment that is conducted<br />
internally by a school or college. This includes class tests, term/controlled<br />
tests, class work, assignments, practical work, projects, etc. In<br />
order to ensure that every mark submitted by an institution is a reliable and<br />
valid indication of the performance of the learner, <strong>Umalusi</strong> carries out a<br />
statistical moderation of the CASS marks. As the assessment and<br />
moderation capacities of the schools improve, the emphasis on statistical<br />
moderation of CASS marks will hopefully be reduced.<br />
2. Principles of standardisation<br />
Statistical moderation of examination marks consists of comparisons between<br />
the current mark distributions and the norms for the corresponding subjects.<br />
Pairs analysis and quantile graphs are also used in the process. The pairs<br />
analysis compares the mean marks in two subjects taken by the same group<br />
of candidates, whereas the quantile report compares the percentiles of these<br />
two distributions. These analyses are based on the principle that, as a group,<br />
the performances of the same candidates in two related subjects (taken at the<br />
same level) should show close correspondence. On the basis of all these<br />
comparisons, marks are either not adjusted or they are adjusted upwards or<br />
downwards by specific amounts over defined mark ranges.<br />
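The pairs analysis and quantile comparison described above can be sketched as follows. This is an illustrative reconstruction only, not <strong>Umalusi</strong>’s actual implementation: the candidate identifiers, the marks and the use of deciles in the quantile report are all invented for the example.<br />

```python
# Illustrative sketch of a pairs analysis: compare the performance of the
# SAME group of candidates in two related subjects. Data are hypothetical.
import statistics

def pairs_analysis(marks_a, marks_b):
    """Difference in mean marks over the candidates who took both subjects.

    marks_a, marks_b: dicts mapping candidate id -> percentage mark.
    """
    common = sorted(marks_a.keys() & marks_b.keys())
    mean_a = statistics.mean(marks_a[c] for c in common)
    mean_b = statistics.mean(marks_b[c] for c in common)
    return mean_a - mean_b

def quantile_report(marks_a, marks_b, n=10):
    """Compare the two distributions quantile by quantile (here: deciles)."""
    common = sorted(marks_a.keys() & marks_b.keys())
    qa = statistics.quantiles((marks_a[c] for c in common), n=n)
    qb = statistics.quantiles((marks_b[c] for c in common), n=n)
    return [round(a - b, 1) for a, b in zip(qa, qb)]

# Hypothetical marks for candidates who wrote both Mathematics and Physical Science
maths = {"c1": 55, "c2": 40, "c3": 72, "c4": 63, "c5": 48}
science = {"c1": 50, "c2": 38, "c3": 70, "c4": 60, "c5": 45}
print(pairs_analysis(maths, science))   # mean difference over the common group
```

A large or systematic gap between the two subjects’ means or quantiles would flag one of the distributions for possible adjustment.<br />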
The following are some of the rules that were employed in the<br />
standardisation of the 2005 examination results:<br />
• No adjustments in excess of 10%, either upwards or downwards,<br />
would be applied, except in exceptional cases.<br />
• Generally, no upward adjustments to above the norm, or downward<br />
adjustments to below the norm, would be applied.<br />
• Generally, no marks would be adjusted downwards to below 75 marks<br />
in the case of standard grade subjects, or to below 100 marks in the<br />
case of higher grade subjects. This is to ensure that no candidate fails a<br />
subject because of a downward adjustment to the raw marks.<br />
Three categories of norms were used in the standardisation of the 2005<br />
Senior Certificate results:<br />
• Norms for provincially set question papers<br />
These are norms based on the average of the raw mark distributions<br />
of a particular province in a subject over the last five years.<br />
• Norms for nationally set question papers<br />
These norms were based on the average of the raw mark distributions<br />
over the years 2001 to 2004, since the first five subjects have been<br />
written nationally. For the sixth subject, History, which was written<br />
nationally for the first time in 2003, the raw mark distributions for<br />
2003 and 2004, and the adjusted mark distribution for 2004, were<br />
used as a guideline for the adjustments.<br />
• National norm for first languages<br />
All first languages were standardised against a pre-determined<br />
national norm. A 3% distinction rate and a 7% failure rate were the<br />
norm for English and Afrikaans, First Language, Higher Grade. For<br />
English and Afrikaans, First Language, Standard Grade, a 0,3%<br />
distinction rate and a 2% failure rate was the norm. A 0,5%<br />
distinction rate and a 7% failure rate was the norm for all African<br />
languages, First Language. However, the 0,5% was not used as the<br />
upper limit if the distinction rate exceeded this, provided it was not<br />
above 3%. Furthermore, in cases where the first language results<br />
have, in recent years, shown significant departures from the national<br />
norm, it has not been strictly applied. The Statistics Working Group<br />
will be recommending to Council that the national norms be replaced<br />
by historical norms from 2006.<br />
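A minimal sketch of how the capping and flooring rules above could apply to a single candidate’s mark is given below. The 10% cap and the 75/100-mark floors come from the report; the function itself, its parameters and the assumption of a 300-mark maximum are hypothetical illustrations, not <strong>Umalusi</strong>’s actual procedure.<br />

```python
# Sketch of the 2005 standardisation rules applied to one raw mark.
# The rules (10% cap, 75/100-mark floors) are stated in the report; this
# implementation and the max_mark default are assumptions for illustration.

def adjust_mark(raw, proposed_shift, grade, max_mark=300):
    """Apply a proposed adjustment (in marks) subject to the 2005 rules.

    - The shift may not exceed 10% of the maximum mark, up or down.
    - A downward shift may not take a passing mark below 75 (SG) or
      100 (HG), so no candidate fails purely because of standardisation.
    """
    cap = round(0.10 * max_mark)
    shift = max(-cap, min(cap, proposed_shift))   # clamp to +/-10%
    adjusted = raw + shift
    floor = 100 if grade == "HG" else 75          # pass-protection floor
    if shift < 0 and raw >= floor:                # only guard downward moves
        adjusted = max(adjusted, floor)
    return adjusted

print(adjust_mark(105, -20, "HG"))  # clamped by the 100-mark floor -> 100
print(adjust_mark(150, 45, "SG"))   # shift capped at 30 (10% of 300) -> 180
```

In practice the adjustment is defined over whole mark ranges of the distribution rather than per candidate, but the same caps and floors apply.<br />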
2.1 Standardisation of CASS marks<br />
Statistical moderation of CASS marks is undertaken per subject and<br />
per institution. The mean and standard deviation of the standardised<br />
examination marks (written paper) and those of the CASS marks are<br />
used in this process. The mean and standard deviation of the CASS<br />
marks are compared, per subject and per institution, with those of the<br />
standardised examination marks of the same candidates. Generally,<br />
the CASS marks are adjusted (up or down) to a mean of 5% above<br />
that of the standardised examination marks and to a corresponding<br />
standard deviation. If the CASS mean falls within a certain range<br />
above the examination mean plus 5%, then the CASS marks may be<br />
accepted as is, or adjusted downwards towards this mean. If the<br />
standard deviation of the CASS marks is too low (relative to that of the<br />
adjusted examination marks), indicating that the teacher was unable to<br />
distinguish between candidates’ performances and give a good<br />
spread of CASS marks, then the CASS marks are rejected and each<br />
candidate in the class is assigned a new CASS mark of exactly 5%<br />
above his or her examination mark.<br />
Finally, the adjusted CASS mark of each candidate is combined with the<br />
adjusted examination mark in the ratio of 25:75, i.e. CASS constitutes<br />
25% of the final mark of the candidate.<br />
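The CASS moderation rule above can be sketched numerically as follows. This is a simplified reconstruction under stated assumptions: the target mean (examination mean plus 5) and the 25:75 weighting come from the report, but the rejection threshold used here (a CASS standard deviation below half the examination standard deviation) is an invented illustrative value, not <strong>Umalusi</strong>’s actual criterion.<br />

```python
# Sketch of CASS statistical moderation for one class (one subject at one
# institution). Target mean and 25:75 weighting follow the report; the
# min_sd_ratio rejection threshold is an assumed value for illustration.
import statistics

def moderate_cass(cass, exam, min_sd_ratio=0.5):
    """Return adjusted CASS marks for a class.

    cass, exam: lists of percentage marks for the same candidates.
    """
    target_mean = statistics.mean(exam) + 5        # 5% above the exam mean
    target_sd = statistics.pstdev(exam)            # "corresponding" spread
    cass_mean = statistics.mean(cass)
    cass_sd = statistics.pstdev(cass)
    if cass_sd == 0 or cass_sd < min_sd_ratio * target_sd:
        # Spread too narrow: reject CASS; each candidate gets exam mark + 5.
        return [e + 5 for e in exam]
    # Linear rescale of the CASS marks onto the target mean and SD.
    return [target_mean + (c - cass_mean) * target_sd / cass_sd for c in cass]

def final_mark(adj_cass, adj_exam):
    """Combine CASS and examination marks in the ratio 25:75."""
    return 0.25 * adj_cass + 0.75 * adj_exam
```

For a class where the teacher gave every candidate nearly the same CASS mark, the rejection branch fires and each candidate simply receives his or her examination mark plus 5, as the report describes.<br />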
3. Standardisation meetings<br />
In contrast to previous years, when the standardisation meetings were<br />
convened and chaired by the assessment bodies (the National and<br />
Provincial Departments of Education, the IEB and BCVO), for 2005 they<br />
were convened and chaired by <strong>Umalusi</strong>. There were a total of 12<br />
meetings, the first being the national meeting to consider the six common<br />
subjects (at higher and standard grade), which took place at the Sheraton<br />
Hotel and was chaired by the Chairperson of <strong>Umalusi</strong>’s Council. All the<br />
provincial meetings and those with the IEB and BCVO were held at<br />
<strong>Umalusi</strong>’s offices. Members of <strong>Umalusi</strong>’s Council chaired the meetings.<br />
<strong>Umalusi</strong>’s Statistics Working Group responsible for the mark adjustment<br />
process attended the national meeting as one team, but split into two<br />
teams of three members each for the other meetings. The latter meetings<br />
were organised in two parallel streams, with one team of statisticians<br />
attending each. In this way all the meetings could be completed in three<br />
days, instead of four days as in previous years.<br />
3.1 The national standardisation meeting<br />
The meeting was conducted in a cordial spirit, and after a debate on the<br />
results of each subject, the statistical adjustments were agreed upon by<br />
both the assessment body and <strong>Umalusi</strong>.<br />
In contrast to previous years, the 2005 raw marks were all lower than<br />
their four-year averages, and History had its lowest marks to date. As a<br />
result, upward adjustments were applied to all subjects. In most cases<br />
the upward adjustments were relatively modest and below the allowed<br />
maximum of 10%, except for Physical Science HG, where this maximum<br />
increase was applied to the raw marks in the range 30% to 62%. For<br />
Mathematics HG, fairly high upward adjustments were required to get the<br />
mark distribution on to the norm, whereas for Mathematics SG the<br />
maximum upward adjustment was less than 3%.<br />
3.2 Assessment body meetings<br />
The meetings with all the assessment bodies were conducted in a<br />
congenial spirit. The table on page 77 summarises the adjustments that<br />
were effected in each of the provinces for subjects with enrolments greater<br />
than 500.<br />
“Computer adjustment” means that the marks were adjusted upwards or<br />
downwards by the computer to be brought in line with the norm (or as far<br />
as possible within the 10% limit). “Raw and computer adjustment”<br />
generally means that marks were adjusted upwards on to the norm and,<br />
when the norm was below the raw marks, the latter were retained. The<br />
last two columns generally mean that marks were adjusted either not as<br />
far as the norm or, rarely, to beyond it.<br />
3.3 Standardisation of CASS marks<br />
CASS marks were programmatically adjusted in accordance with the<br />
principles agreed for CASS standardisation. A detailed analysis of the<br />
CASS marks will be carried out early in the new year to establish whether<br />
or not the reliability of these marks is improving. However, one<br />
indicator of the quality of CASS is the percentage of CASS marks that<br />
have low standard deviations and are therefore rejected.<br />
5. Recommendations for 2005<br />
As mentioned earlier, the national norms for first languages are no longer seen<br />
to be relevant, and the Statistics Working Group will be recommending that<br />
they be replaced by historical norms, as is the case with other subjects.<br />
6. Conclusion<br />
The meetings took place in a good spirit, and while there were sometimes<br />
differences of opinion between the <strong>Umalusi</strong> team and the team from the<br />
examination authority, satisfactory decisions were reached in all cases.<br />
The Statistics Working Group is satisfied that the final marks (whether raw<br />
or adjusted) represent a fair reflection of the candidates’ performance,<br />
while maintaining the standard of the Senior Certificate Examination.<br />
Table of Adjustments for 2005<br />
Assessment body Raw Computer adjustment Raw and Computer adjustment Downward adjustment Upward adjustment<br />
KwaZulu-Natal 27 13 6 1 5<br />
Mpumalanga 10 6 9 0 1<br />
Gauteng 11 0 12 2 1<br />
North West 17 19 1 6 6<br />
Northern Cape 22 14 1 0 1<br />
Eastern Cape 8 10 8 0 2<br />
Western Cape 63 4 4 5 3<br />
Limpopo 5 9 11 6 0<br />
Free State 40 19 1 0 0<br />
IEB 51 0 0 3 14<br />
BCVO 22 0 0 0 8<br />
Total 276 94 53 23 41<br />
Chapter 7<br />
Conclusion<br />
<strong>Umalusi</strong> has, in the last year and a half, been grappling with the issue of<br />
standards in the Senior Certificate Examination. This serious attempt to<br />
determine the quality and standard of the Senior Certificate Examination<br />
came about partly as a result of <strong>Umalusi</strong>’s own internal review processes<br />
but also from public concerns about the standards of this examination.<br />
The rising pass rates were interpreted to mean that standards in the<br />
examination were declining.<br />
The processes used by <strong>Umalusi</strong> to quality assure examinations are<br />
the following:<br />
1. Moderation of question papers<br />
2. Moderation of continuous assessment<br />
3. Monitoring the conduct of examinations<br />
4. Moderation of marking<br />
5. Standardisation of assessment outcomes<br />
<strong>Umalusi</strong> conducted an investigation in 2004 to determine the standard<br />
and quality of the Senior Certificate Examination. This study yielded very<br />
valuable lessons for <strong>Umalusi</strong>. Among other things, it pointed out the need<br />
to strengthen processes in place for the quality assurance of assessment.<br />
It also underscored the need to give attention to the cognitive challenge<br />
in question papers. This research culminated in concerted efforts to<br />
sharpen, streamline and strengthen the quality assurance of assessment<br />
processes used by <strong>Umalusi</strong>. In addition to all these efforts, <strong>Umalusi</strong><br />
conducted an intensive evaluation of all assessment bodies’ Examinations<br />
and Assessments Directorates in 2005. This has given <strong>Umalusi</strong> detailed<br />
knowledge and insight into each of the 11 assessment bodies’ systems<br />
and processes related to examinations and assessment.<br />
As a result of all the efforts that <strong>Umalusi</strong> has put into sharpening the tools<br />
it uses to monitor standards, the 2005 examinations were conducted in<br />
line with relevant policies and regulations and their quality and standard<br />
were beyond reproach. The cognitive demand in examination question<br />
papers in general, and the nationally set ones in particular, increased<br />
considerably. Question papers included more questions requiring higher<br />
order thinking skills. However, <strong>Umalusi</strong> still has concerns about the<br />
standard and quality of back-up papers and the quality and rigour of<br />
internal moderation of question papers in general.<br />
<strong>Umalusi</strong> also reviewed and strengthened its processes and procedures for<br />
the quality assurance of continuous assessment (CASS). As a result of this,<br />
78
an improvement was observed in the quality and standard of continuous<br />
assessment. <strong>Umalusi</strong>, however, remains concerned about the quality of<br />
CASS tasks and the level of internal moderation.<br />
<strong>Umalusi</strong> also moderated the marking of the Senior Certificate<br />
Examinations. This was done in two ways: centralised moderation and<br />
on-site moderation. With centralised moderation of marking, which was<br />
conducted for the six national subjects, assessment bodies send a sample<br />
of 20 scripts representing a spectrum of candidate attainment in all<br />
grades in all the selected subjects. With on-site moderation, external<br />
moderators visit the marking centre during the marking period and<br />
moderate a sample of scripts there. Both processes feed immediately and<br />
directly into the marking process. In 2005 marking was conducted<br />
smoothly and was of the required standard.<br />
The strongest link in the system is the quality of the management of the<br />
administration of examinations. The system generally runs smoothly in<br />
spite of the examination system being such a massive operation. The<br />
ability of the assessment bodies to make this huge system work is truly<br />
admirable.<br />
The handling of irregularities within assessment bodies has improved. This<br />
is partly due to the fact that in 2005 <strong>Umalusi</strong> issued the Directives for the<br />
Reporting of Irregularities which, among other things, require assessment<br />
bodies to report irregularities to <strong>Umalusi</strong> within 48 hours. Furthermore,<br />
<strong>Umalusi</strong> held weekly teleconferences with officials within the assessment<br />
bodies to monitor irregularities and other examination-related issues. In<br />
addition to this, daily reports were received from assessment bodies.<br />
<strong>Umalusi</strong> also watched media reports carefully and followed them up with<br />
assessment bodies. On the whole, assessment bodies handled the minor<br />
irregularities that occurred in line with the regulations.<br />
The 2005 standardisation process was a welcome “meeting of minds”.<br />
<strong>Umalusi</strong>’s standardisation principles are now well understood and<br />
accepted by assessment bodies. They also apply the same principles when<br />
considering subjects for adjustment. Both the national and provincial<br />
standardisation processes were conducted in a collegial spirit and the<br />
outcomes were educationally justifiable.<br />
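The standardisation adjustments discussed in this report can be illustrated with a purely hypothetical sketch: a subject’s mean is compared with a historical norm and a capped shift is proposed. The norm values and the 10-mark cap below are assumptions for illustration; the report does not describe <strong>Umalusi</strong>’s actual statistical rules.<br />

```python
# Illustrative sketch of a standardisation decision: compare this year's
# subject mean with a historical norm and propose a capped adjustment.
# The cap and the example norms are hypothetical, not Umalusi's procedure.
def propose_adjustment(current_mean: float, historical_mean: float,
                       cap: float = 10.0) -> float:
    """Shift needed to bring the current mean towards the historical
    norm, limited to plus or minus `cap` marks."""
    shift = historical_mean - current_mean
    return max(-cap, min(cap, shift))

print(propose_adjustment(48.0, 52.5))   # prints 4.5 (upward adjustment)
print(propose_adjustment(70.0, 55.0))   # prints -10.0 (capped downward)
print(propose_adjustment(55.0, 55.0))   # prints 0.0 (raw marks accepted)
```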
The 2005 Senior Certificate Examination, therefore, satisfied all <strong>Umalusi</strong><br />
requirements for a reasonably fair, valid and reliable examination.<br />
79
37 General Van Ryneveld Street Persequor<br />
Technopark • Pretoria • South Africa<br />
Tel: +27 (12) 349 1510 • Fax: +27 (12) 349 1511