
Inspecting the Foundations - Umalusi


In June and October 2008, for example, Umalusi standardized all 23 learning areas examined by the DoE and the six learning areas examined by the IEB. Umalusi only standardizes if more than 80% of the results have been captured and are thus available for the standardization process. In both examination sessions in 2008, all subjects were available for standardization. In the June examination, the raw marks were accepted for 16 of the learning areas, while the raw marks for nine of the learning areas were accepted for the November exam.

In the statistical moderation process for the 2008 GETC: ABET, comparisons between the current mark distributions and the mark distributions of the previous years since 2001 are used as the basis of standardization decisions. Pairs analyses are also used to compare the mean marks in two learning areas taken by the same group of candidates. These analyses are based on the principle that, as a group, the performances of the same candidates in two related learning areas (taken at the same level) should show close correspondence. On the basis of all these comparisons, together with qualitative reports from chief markers and internal and external moderators, marks are either not adjusted, or are adjusted upwards or downwards by specific amounts over defined mark ranges. The major rules employed in the standardization of examination results are as follows:

• No adjustments in excess of 10%, either upwards or downwards, are applied, except in exceptional cases; and
• In the case of the individual candidate, the adjustment effected should not exceed 50% of the mark obtained by the candidate.

An illustrative sketch of how these two caps interact is given at the end of this section.

Umalusi and the DoE have agreed to each hold separate pre-standardization meetings before coming together for the standardization meeting itself. These meetings are used by Umalusi to interrogate the statistics supplied by the DoE. The DoE uses the pre-standardization meetings to draft its proposals for adjustments, while Umalusi drafts provisional responses to probable requests for adjustment. This process of holding preparatory meetings to review the standardization proposals of the other party appears to be providing greater stability in the last of the critical quality assurance processes prior to the confirmation of results.

4.3.3 THE STATE OF ASSESSMENT FOR ADULTS AT NQF LEVEL 1 GETC

It is now eight years since the implementation of the first ABET Level 4 examinations, quality assured by Umalusi, and there are definite indications that the assessments in most of the learning areas are improving.
The written examination still forms the core of the whole examination because of its relatively reliable nature, and, though well conducted, it still does not always receive the rigorous attention it deserves from the assessment bodies. The internal moderation of some of the question papers remains questionable, which impacts negatively on the standard.

The reliability of the internal assessment component of the examination remains a matter of ongoing concern, but there are signs of improvement in the quality of the tasks, and in the structure and presentation of portfolios. Overall, though, the implementation and management of the SBA tasks remains at an unacceptable level. For that reason, building and expanding the capacity of the current corps of adult educators must be seen as one of the key priorities for ensuring effective growth and stability in the sector. The professional development and conditions of service of these ‘foot soldiers’ should be addressed as a matter of urgency.

Some areas of concern, raised in the 2008 Umalusi report on the GETC exams, are worth reiterating. In certain learning areas, such as Applied Agriculture and Agricultural Technology; Economic and Management Sciences; and Mathematical Literacy, Mathematics and Mathematical Sciences, ABET candidates continue to perform singularly poorly. The DoE has agreed that the standard of teaching in these learning areas is often poor (the reasons for this situation are raised in Section 4.5 below).
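To make the two capping rules described above concrete, the sketch below shows one way they might be applied to a single candidate's mark. It is a minimal, hypothetical illustration, not Umalusi's actual standardization system: the function name apply_adjustment, the 0-100 percentage scale, and the proposed_shift parameter are assumptions introduced purely for the example.

```python
# Hypothetical sketch of the two capping rules stated in the report.
# Assumptions (not from the report): marks are percentages from 0 to 100,
# and proposed_shift is the adjustment (in percentage points) proposed at the
# standardization meeting for the relevant mark range.

def apply_adjustment(raw_mark: float, proposed_shift: float) -> float:
    """Apply a standardization adjustment subject to the two stated caps."""
    # Rule 1: no adjustment in excess of 10%, upwards or downwards
    # (the exceptional cases the report allows are not modelled here).
    shift = max(-10.0, min(10.0, proposed_shift))

    # Rule 2: for an individual candidate, the adjustment effected may not
    # exceed 50% of the mark that candidate obtained.
    per_candidate_limit = 0.5 * raw_mark
    shift = max(-per_candidate_limit, min(per_candidate_limit, shift))

    # Keep the adjusted mark within the valid 0-100 range.
    return max(0.0, min(100.0, raw_mark + shift))


if __name__ == "__main__":
    # A raw mark of 12% with a proposed shift of +9 points is limited to
    # +6 points (50% of 12), giving 18%.
    print(apply_adjustment(12.0, 9.0))   # 18.0
    # A proposed shift of +15 points is first capped at +10 points.
    print(apply_adjustment(60.0, 15.0))  # 70.0
```

In this reading, the blanket 10-point cap is applied first and the per-candidate 50% limit second, so that low-scoring candidates receive proportionally smaller adjustments; the report does not specify the order of application, and this is only one plausible interpretation.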
