chapters 1-4 - University of Queensland

Evaluation and the Pedagogy-Space-Technology Framework

Who are the evaluators?

Analysis of the literature (reflected in the colloquium) finds that the most prolific evaluators are librarians. Library staff members are active managers of their spaces and are constantly engaged with their users or clients. In general, librarians are committed to ongoing evaluation, often with highly developed and standardised instruments. It is worth pointing out, however, that part of the goal in this case is the evaluation of library services, with only a portion relating to issues of space design. Nevertheless, in seeking to understand changing usage patterns, librarians are often in the position of having long-term data upon which to draw. This provides solid empirical evidence for change.

Interesting differences in perspective are evident in the evaluations of space performed by other groups. Some of the papers presented here are authored by space builders (both architects and technologists), who have a vested interest in improving their solutions as briefs become more standardised. Academics who study tertiary teaching practice are also becoming more active, along with a smattering of independent researchers.

When should space be evaluated?

It is not surprising that, since the majority of evaluative studies presented in this collection were conducted post occupancy and over a relatively short term, we can be beguiled into thinking of ‘evaluation’ only in this narrow sense.
However, other views of evaluation were raised during the discussion sessions that find support in the papers.

Deborah Terry (UQ Deputy Vice-Chancellor, Teaching and Learning) argued strongly for proper evaluation techniques to be applied to projects at the proposal stage in order to facilitate proper competitive funding decisions. This kind of evaluation of space looks to answer not just the question of how we should design and build space, but also where we should build and whether we should build at all. This view found broad agreement, with contributors pointing also to the role of projects such as UQ ACTS (Advanced Concept Teaching Space) and Stanford’s Wallenberg Hall as experimental spaces whose ongoing evaluation was designed to inform the institution’s decision making regarding pedagogy, space and technology design across a range of projects. Mitchell, Winslett and Howell (Queensland University of Technology) present a comprehensive plan for evaluation in an evolving and experimental space (Lab 2.0) that takes this approach. Both Graves (Griffith University) and Lee (Swinburne) present perspectives on pre-build evaluation, and Andrews and Powell (UQ) sought to illuminate the ways in which issues uncovered in evaluation were directly applied to succeeding projects of the same genre.

Others argued for longer-term evaluation of space to address ongoing environmental concerns. This prompts a broader question: why are mature teaching spaces not subject to regular and sustained evaluation in the same way as more closely managed learning spaces such as libraries?

What should be evaluated?

Before moving to the question of methodology, it is instructive to review the goals of the evaluation. There are a variety of motivators for undertaking evaluation.
Some appraisals appear to have the goal of validating a newly completed project and, by extension, arguing for the creation of similar spaces. Simpler measures such as head counts and multiple-choice user satisfaction questions are often the mainstay of these surveys. By contrast, research projects or design studies aimed at informing ongoing development typically strive to uncover more detail, both by targeting empirical measures and by probing with open-ended questioning and focus groups.

Usage is of course a fundamental measure. It would be a brave or foolish project manager who would argue success in the face of meagre occupancy, so gate numbers will always have a place. However, understanding patterns of use over time is increasingly recognised as useful. Do patterns change hour by hour, or shift notably between the early, mid and later weeks of semester? Geographic patterns, such as understanding where users have been prior to entering the evaluated space and where they are headed, can also be immensely useful in campus planning. Several of the studies presented attempted to gain a more sophisticated insight into such patterns of use, and this should be seen as an axis of study which is growing in importance.

NEXT GENERATION LEARNING SPACES