Inspecting the Foundations - Umalusi

Because the GETC for adults has an extensive internal assessment component – it constitutes 50% of the final mark – Umalusi also undertakes to verify that the SBA is of a suitable standard. In 2008, the moderation of internal assessment was conducted in two selected learning areas – Language, Literacy and Communication: English and Mathematical Literacy – in each of the nine provincial DoEs as well as the IEB. Umalusi's decision to moderate the internal assessment of these learning areas was motivated by the decline in recent years in the results obtained by learners in them: both learning areas are fundamental to the teaching and learning process, and it is therefore necessary to focus on them to improve overall standards and pass rates. The two learning areas were also selected on account of their high enrolments. Budgetary constraints prevented Umalusi from moderating the internal assessment of other learning areas.

The purpose of the moderation of SBAs is to ensure that they comply with the national guidelines and to establish the scope and extent of their reliability. Umalusi also undertakes to verify that the assessment bodies' internal moderation of the SBAs has taken place and was of a suitable standard. Umalusi believes it is important to report back to the assessment body concerned – and to the Minister – on the quality of SBA. The moderation process is also intended to identify problem areas in the implementation of the SBAs and to recommend solutions to the problems identified.

The three-part process of the SBA moderation is described more fully in the 2008 report, as are the findings. The most important finding is still that the standard of the SBAs varied from province to province, from district to district and from centre to centre, suggesting that the assessment bodies are far from sharing a common understanding of the prerequisite level for teaching and learning at GETC level. While most of the assessment bodies have provincial policy documents on internal assessment that outline the minimum requirements for internal assessment and moderation processes, there is still a huge gap between policy and practice. When these policy or guideline documents were requested by the external moderators, most centre managers, internal moderators and departmental officials did not have them at hand.
In addition, the monitoring and evaluation provided by provincial and district officials is in most cases not effective and does not give appropriate support to new educators.

In 2008, all provincial departments of education used the nationally-set SBA tasks, which were not externally moderated by Umalusi. The use of the nationally-set tasks brought a small improvement in the quality of some of the tasks in some learning areas, but some tasks had to be totally reworked before they were given to educators for implementation. No definite directive was given to the provinces in terms of the implementation of these tasks, and the provinces were at liberty to implement the tasks as they were, or to subject them to pre-moderation processes. This left the majority of learners at the mercy of the provinces, districts and centres. Some provinces checked the tasks and made the necessary changes, but others did not. Many learners were therefore exposed to these tasks complete with the original mistakes, which had an adverse effect on their ability to prepare for the examination. So, while a wide variety of assessment tasks was used, in most cases the tasks contained numerous mistakes. Furthermore, little or no training is given to educators on the purpose of these assessment tools, which means they are inconsistently applied.

Umalusi also annually monitors the ABET Level 4/GETC examination to ensure that it conforms to the established standards that define quality examinations. To this end, Umalusi verifies all the preparatory arrangements for the examination. It also uses a variety of approaches to monitor the writing of the examination. Finally, Umalusi ensures that all procedures for aggregating scores and for moderating, computing and capturing final results are strictly adhered to. Collectively, all the monitoring approaches, methods and procedures ensure a credible examination.

In 2008, the examination monitoring exercise extended across the ten assessment bodies, namely the nine provincial bodies and the IEB. Even though the scope of the monitoring exercise was very limited due to budgetary constraints, Umalusi's approach to monitoring the examination entailed
