

effect of the evaluation on the dynamics of execution of a project (with the bureaucratic syndrome).

The approach we have decided to adopt in this document is to concentrate as much as possible on the dimension of the evaluation that is related to the substantive value (effectiveness) of the knowledge that is generated (in order to assess the effectiveness of the approach), and to fall back on the other dimensions when this first dimension cannot be adequately covered.

2.2 Evaluating What?

2.2.1 Levels of evaluation

Donald Kirkpatrick, who started his work on learning evaluation in 1959, proposes four levels of evaluation of training programmes (Kirkpatrick, 1996). Level 1 measures whether the learners liked the training (Did learners like it?); Level 2 measures whether the learning objectives have been met (Did learners learn?); Level 3 measures the transfer of skills back to the job (Are they using it?); Level 4 measures the impact on the business (Did it matter?).

We believe that this model of evaluation, which addresses both the efficiency and the effectiveness aspects of training programmes, can easily be transposed to the evaluation of systems in general, and in particular to systems that include a strong technical component (as is the case for the Ontologging system).

We will therefore rely, for the evaluation, on the four Kirkpatrick levels, plus two additional levels (technicalities and economics) that appear to be useful for evaluating technical systems:

• Level 0: Technicalities. Does the approach/system perform technically well?
• Level 1: Users’ acceptance. Do they like the approach/system?
• Level 2: Assessment. Does the knowledge management approach/system function?
• Level 3: Transfer. Is the approach/system used?
• Level 4: Impact. Measures the impact in supporting the organizational processes (Does it deliver substantive value to the organization?)
• Level 5: Economics. Does the approach/system perform economically well?
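
Purely as an illustration (nothing of the kind appears in this deliverable; the names and the 1-to-5 scale below are our own assumptions), the six levels could be encoded as a simple evaluation rubric in Python:

    from dataclasses import dataclass, field
    from enum import IntEnum

    class EvalLevel(IntEnum):
        # The six evaluation levels: Kirkpatrick's four plus two extensions.
        TECHNICALITIES = 0  # Does the approach/system perform technically well?
        ACCEPTANCE = 1      # Do users like it?
        ASSESSMENT = 2      # Does the knowledge management approach function?
        TRANSFER = 3        # Is the approach/system used?
        IMPACT = 4          # Does it deliver substantive value to the organization?
        ECONOMICS = 5       # Does it perform economically well?

    @dataclass
    class EvaluationRecord:
        # One (score, evidence) pair per level; the 1-to-5 scale is assumed.
        scores: dict = field(default_factory=dict)

        def rate(self, level: EvalLevel, score: int, evidence: str) -> None:
            self.scores[level] = (score, evidence)

    record = EvaluationRecord()
    record.rate(EvalLevel.TECHNICALITIES, 4, "sub-second response in load test")

Recording evidence alongside each score keeps the six judgments traceable rather than reducing the evaluation to a single number.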

Level 0: Does the system perform technically well? This level reflects the performance of the system from a technical perspective. It covers elements such as speed, scalability, reliability, (technical) flexibility, simplicity (capability to evolve) and openness (ability to interoperate with other systems).
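
As a sketch of how the speed and reliability elements of this level might be probed, the following Python fragment times repeated calls to a hypothetical query_system stand-in (any operation of the system under test could take its place; the function is ours, not part of Ontologging):

    import statistics
    import time

    def query_system() -> None:
        # Hypothetical stand-in for one operation of the system under test.
        time.sleep(0.01)

    def probe_speed_and_reliability(runs: int = 100) -> None:
        latencies, failures = [], 0
        for _ in range(runs):
            start = time.perf_counter()
            try:
                query_system()
                latencies.append(time.perf_counter() - start)
            except Exception:
                failures += 1  # every error counts against reliability
        print(f"median latency: {statistics.median(latencies):.3f}s")
        print(f"reliability:    {(runs - failures) / runs:.1%}")

    probe_speed_and_reliability()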

Level 1: Users’ acceptance (do they like it)? This level reflects the perception of the system by the user and is sometimes called the “smile sheet”. It is important to note that a good perception by the user does not guarantee that the system is useful (for instance, very nice-looking graphics or gizmos will please the user, but will not contribute to his performance).
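
A smile sheet typically reduces to Likert-scale items; the following minimal aggregation sketch (the question labels and 1-to-5 scale are assumed, not taken from the deliverable) shows how such responses could be summarized:

    import statistics

    # Hypothetical smile-sheet answers on a 1 (dislike) to 5 (like) scale.
    responses = {
        "look and feel": [5, 5, 4, 5],
        "ease of use": [4, 5, 3, 4],
        "perceived usefulness": [3, 2, 3, 3],
    }

    for question, scores in responses.items():
        print(f"{question}: mean {statistics.mean(scores):.1f}")

A high “look and feel” score combined with a low “perceived usefulness” score is exactly the gap described above: the smile sheet alone cannot establish that the system contributes to performance.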
