
Response to Assessment of Generic Skills Discussion Paper

Thank you for the opportunity to provide an institutional response to the Assessment of Generic Skills discussion paper. QUT has chosen to respond selectively to several questions posed within the document.

What factors should guide the design of performance measurement instruments to assess generic skills?

For several years, QUT has argued to the Commonwealth Government that the assessment of institutional excellence within performance schemes, and any accompanying allocations of reward funding, should not be executed with a level of precision that exceeds the ability of the metrics to differentiate performance given validity, error and other limitations. This principle is paramount regardless of the measures ultimately chosen because performance measurement is inherently a flawed process. All direct and indirect performance metrics have limitations that prevent conclusive comparison of outcomes and full understanding of the multiple causal factors of excellence. Error, artefact, reliability, validity, utility, context and social value are constant challenges.

National higher education performance assessment in recent years has largely depended on self-report survey feedback and government datasets as sources of evidence. QUT believes that a national performance system will be more robust if it encourages multiple sources of performance evidence and does not leave itself vulnerable to a weakness in any one methodology or approach. It is in this context that the potential exists for the Collegiate Learning Assessment (CLA), or a similar direct assessment of generic skills, to provide a new and different approach to assessing the provision of generic skills, and to offer insight into any institutional value-add to the generic competencies of students at the end of their degrees.

The sector should not rush the implementation, but rather recognise that this is a relatively new paradigm in performance assessment within Australian higher education that may have limited discriminatory value if poorly executed. It is anticipated that it will take a couple of years for the process to 'bed down' and for any initial model to be trialled and refined, even given the opportunity for Australia to draw upon the experiences of other countries utilising the CLA and similar approaches. The Commonwealth Government should plan accordingly and with consideration of AHELO experiences.

QUT has provided further feedback on appropriate principles for the development of performance measurements in its response to the Development of Performance Measurement Instruments in Higher Education Discussion Paper.

What level of student participation is desirable and for what purposes? What design features or incentives would encourage student participation?

A direct assessment instrument needs to be authentic and deep to be a reasonable test of selected generic skills. The CLA appears to take this seriously, but the length and sophistication of the instrument necessitate a significant investment of time by students, despite little intrinsic motivation or relative advantage. Recognition of the CLA by employers may, in time, help increase the value of individual CLA outcomes to students.


A comment on discipline-specific assessments

QUT would like to highlight a project that may offer leverage to any attempts to include discipline-specific assessments in the development of a broader assessment of generic skills. The 'Sector-wide model for assuring final year subject and program achievement standards through inter-university moderation' project, led by Kerri-Lee Krause and Geoff Scott, involves eight universities and aims to produce resources capable of supporting inter-institutional moderation for assuring final-year subject and program achievement standards. As stated in the project application:

This project will yield a validated, robust approach for assuring subject achievement standards through inter-university moderation in common final year subjects across disciplines. It will also trial approaches for moderating and assuring program achievement standards, building on ALTC discipline standards. Project resources will assist universities to implement sustainable, self-regulatory moderation processes for monitoring subject and program standards.

Is ‘value-add’ an appropriate measure of generic skills?

Delimiting any institutional value-add to the generic skills of students through a national system is challenging at best, and arguably spurious in a worst-case scenario whereby the approach is implemented without adequate modelling and fine-tuning. QUT considers that value-added skill assessment has the potential to be a ‘reasonable’ measure provided several issues, including the following five, are considered:


Firstly, the benefit of using ATAR/entry-rank-type entry scores as a proxy for baseline competencies is questionable. These scores are a summation of prior academic performance across a diversity of disciplines, and the relative contributions of generic versus discipline-specific skills to that single score are unknown and not readily determined.

Secondly, adult learning does not occur only in educational institutions. Individuals continue to develop generic competencies during their degree, partially through the educational institution but also as part of general learning from daily experiences. Therefore, challenges in separating value-add attributable to the university experience from value-add attributable to personal growth are inherent in the model.

Thirdly, the role of previous institutions and the learner in establishing lifelong, self-directed learning capability further muddies the ability to attribute value-add to a current institution. How much of the value-add is attributable to lifelong learning skills developed prior to university studies (e.g. in high school) or through self-development?

Fourthly, the value-add model appears founded upon very simplistic interpretations of the development of generic skills and competencies. The concepts of adult learning and education have progressed far beyond the simple models of quanta of knowledge transferred via instruction that held sway in the middle of last century. The depth and breadth of literature concerning adult education, lifelong learning and deeper learning is essential context for a thoughtful implementation. While the CLA and similar approaches are situated within literature acknowledging more recent constructions of learning, use of the CLA within any performance measurement system would arguably result in a misalignment between these richer underpinnings and its use to quantify systemic value-add.

Fifthly, the CLA is designed to measure critical thinking, analytic reasoning, problem solving and written communication skills. If the purpose of the value-add assessment is to determine the extent to which individuals have extended generic skills competencies that are valued for societal and workplace contribution, then the applicability of these skills to workplace success needs to be demonstrated. Validating the design with industry is essential.



Despite these shortcomings, QUT considers that the CLA is a more reasonable and direct test of a select number of generic skills than the old Graduate Skills Test. The approach of the former, in which respondents solve a complex problem drawing on a range of artefacts, is considered more informative than the multiple-choice approach of the latter (which has formed part of AHELO). The CLA is better suited to development and improvement purposes. Arguably, prior efforts to calibrate the CLA instrument will help support meaningful comparison of cohorts to provide insights for improvement.

Ultimately, directly measuring generic skills does not mean that the sum of outputted skills minus inputted skills is a direct measure of added value. Rather, it is simply a variation in scores between different points in time that needs to be interpreted against a range of confounding factors. As a concluding comment, QUT would like to include the following quote from Educational Testing Service President and CEO Kurt Landgraf as a summation of the role of value-add assessment in an overall performance model:

Results from value-added models should not serve as the primary basis for making consequential decisions. Other measures must be included in any fair and valid teacher evaluation system. (Banta & Pike as cited in Klein et al., 2007)

QUT believes that the system will benefit from ongoing robust discussion within the sector and a willingness to review the success of the measure in future years.

Klein, S., Benjamin, R., Shavelson, R., & Bolus, R. (2007). The Collegiate Learning Assessment: Facts and fantasies. Evaluation Review, 31(5), 415–439.

Dr Sam Nielsen
Director, Reporting and Analysis
Queensland University of Technology
GPO Box 2434, Brisbane, Qld 4001
Email s.nielsen@qut.edu.au
Tel 07-3138 5085
