

Concerning the quantitative evaluation of aspects that are more semantic web oriented (semantic domain definition, semantic knowledge capture, and semantic navigation), much less previous experience is available than for evaluation according to an Information Retrieval perspective.

We can first point to some work in this direction related to semantic disambiguation by Missikoff, Navigli and Velardi (2002), which measured the performance of different disambiguation heuristics, and more generally to all the work on ontology building, learning, and population that has developed over recent years. However, the perspective of these approaches remains very technical, and is quite far from the more semantic web oriented and cognitive vision that was adopted by the Ontologging users and that we would like to promote in this project.

A probably better perspective to explore is the domain of Organizational Memory research (Abecker et al. 2003), which is more in line with higher-level knowledge management concepts, but for which evaluation still needs more elaboration (see for instance (Weinberger, Teeni and Frank 2003) for some work in this direction, such as evaluating the completeness of a manual knowledge population).

Finally, an even more promising (but also more far-fetched) approach is that of knowledge management emphasizing cognitive and social factors (Thomas, Kellogg, and Erickson, 2001). Along this line is all the work related to social translucence, which relies on tools that quantitatively monitor the knowledge activity of a whole community. Obviously, monitoring activity via the usage log provides a source of quantitative data that could easily be exploited (and actually is exploited by the Ontologging OKE) for quantitative evaluation, in particular of the usage of the system beyond retrieval alone.
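
As an illustration only, the following minimal Python sketch shows how per-user and per-action activity counts could be derived from such a usage log. The log format, the column names (timestamp, user, action), and the file name are assumptions made for this sketch; they are not the actual Ontologging OKE log schema.

    import csv
    from collections import Counter

    def activity_counts(log_path):
        """Count actions per user and per action type in a CSV usage log."""
        per_user = Counter()
        per_action = Counter()
        with open(log_path, newline="") as f:
            # Assumed columns: timestamp, user, action (hypothetical schema).
            for row in csv.DictReader(f):
                per_user[row["user"]] += 1
                per_action[row["action"]] += 1
        return per_user, per_action

    if __name__ == "__main__":
        users, actions = activity_counts("usage_log.csv")  # hypothetical file name
        print("Most active users:", users.most_common(5))
        # Quantify usage beyond retrieval alone, as discussed above.
        print("Non-retrieval actions:",
              sum(n for a, n in actions.items() if a != "retrieve"))

Counts of this kind could then feed indicators such as activity per community member over time, in the spirit of social translucence.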

4.4.3 Some prospective operationalization of “SMART” quantitative evaluation

Let us now indicate some directions for how we could build a quantitative evaluation for the Ontologging project. The illustrations given later will try to follow the SMART principle and be Specific, Measurable, Attainable, Relevant and Time-bound.

4.4.3.1 Ontology building

The first quantitative evaluation could relate to Ontology building, and could assess the capacity of the Ontology approach and tools to support the elaboration of the domain Ontology.

Different quantitative indicators could be used here:

• The time to design or redesign the Ontology. How many hours, days, or weeks would be necessary to design the main Ontology, or some sub-ontology?

• The complexity of the designed Ontology. What is the level of complexity of the Ontology being elaborated (number of concepts, number of properties associated with each concept, level of nestedness, i.e. how deep the inheritance hierarchy is)? A sketch of how such structural indicators could be computed is given after this list.

• Quality of the resulting Ontology (redundancy, ambiguity, etc.).
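
As an illustration of the structural indicators above, the following Python sketch computes concept counts, property counts, and inheritance depth over a toy in-memory representation of an Ontology. The representation (parent and property dictionaries) and the sample data are assumptions made for this sketch, not the data model of the actual Ontologging tools.

    def depth(concept, parents):
        """Depth of a concept in the inheritance hierarchy (a root has depth 0)."""
        d = 0
        while parents[concept] is not None:
            concept = parents[concept]
            d += 1
        return d

    def complexity_metrics(parents, properties):
        """Structural complexity indicators for a toy ontology representation."""
        n_concepts = len(parents)
        n_properties = sum(len(props) for props in properties.values())
        return {
            "concepts": n_concepts,
            "properties": n_properties,
            "avg_properties_per_concept": n_properties / n_concepts,
            "max_inheritance_depth": max(depth(c, parents) for c in parents),
        }

    # Toy example (hypothetical data): a four-concept hierarchy.
    parents = {"Thing": None, "Document": "Thing",
               "Report": "Document", "Person": "Thing"}
    properties = {"Thing": [], "Document": ["title", "author"],
                  "Report": ["deliverableId"], "Person": ["name"]}
    print(complexity_metrics(parents, properties))

Indicators like these are cheap to compute repeatedly, which makes them suitable for the Measurable and Time-bound aspects of the SMART principle; qualitative aspects such as redundancy and ambiguity would still require expert review.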
