Innovation and Ontologies

Ontology Engineering 213

1.1.2.3 Gómez-Pérez’ Work on Evaluation

One of the more interesting research issues in ontology design is the formal evaluation of the created ontology. (Gómez-Pérez, Juristo & Pazos, 1995)

With various contributions since 1995, Asunción Gómez-Pérez explores general questions on evaluation, e.g. which elements should be evaluated and by which set of criteria. She also lists possible types of errors made when domain knowledge is structured in an ontology (Gómez-Pérez, 1995; Gómez-Pérez, Juristo & Pazos, 1995; Gómez-Pérez, 2001; Gómez-Pérez, 2004). The following paragraphs give a structured overview of these aspects.

The general questions guiding an ontology evaluation, introduced in 1995 as ‘primary ideas’, give an overview of the reasons for and content of ontology evaluation (Gómez-Pérez, 1995). Although far from being a guideline, Gómez-Pérez’ general questions are essential for any evaluation.

Guiding Questions for Ontology Evaluation

Why?
To guarantee end users the correctness and completeness of the ontologies’ definitions, documentation and software.

What?
• intermediate or final definitions
• set of definitions
• documentation
• software environment

When?
• iterative process
• performed during and between phases of the lifecycle
• early detection of wrong, incomplete, or missed definitions

How?
standard techniques

Where?
anywhere, e.g. in the lab (technical evaluation) or at the end user’s location (assessment)

Who?
• development team (technical properties of the definitions)
• other development teams (technical properties)
• end users (utility within a given organization or by other software agents)

Against what?
• ontology evaluation: against a frame of reference (a set of competency questions, requirements, or the real world)
• software evaluation: against its requirements

table 56 Questions guiding evaluation (based on Gómez-Pérez, 1995)
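The idea of evaluating an ontology against a frame of reference such as competency questions can be illustrated with a minimal sketch. The data model and question format below are invented for illustration and are not taken from Gómez-Pérez’s work; the check merely tests whether the ontology’s vocabulary covers what each question needs.

```python
# Illustrative sketch: checking a toy ontology against competency
# questions. All class/relation names here are hypothetical examples.

ontology = {
    "classes": {"Innovation", "Product", "Process"},
    "relations": {("Innovation", "improves", "Product"),
                  ("Innovation", "improves", "Process")},
}

# Each competency question names the vocabulary needed to answer it.
competency_questions = [
    {"question": "Which innovations improve a product?",
     "needs_classes": {"Innovation", "Product"},
     "needs_relations": {"improves"}},
    {"question": "Who funds an innovation?",
     "needs_classes": {"Innovation", "Funder"},
     "needs_relations": {"funds"}},
]

def coverage(onto, cq):
    """True if the ontology vocabulary suffices to answer the question."""
    rel_names = {name for _, name, _ in onto["relations"]}
    return (cq["needs_classes"] <= onto["classes"]
            and cq["needs_relations"] <= rel_names)

for cq in competency_questions:
    status = "answerable" if coverage(ontology, cq) else "NOT covered"
    print(f"{cq['question']}: {status}")
```

A real evaluation would of course pose the questions as queries against a reasoner rather than as a vocabulary check, but the principle is the same: each unanswerable question exposes a gap between the ontology and its frame of reference.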

The objective of the evaluation process is to determine the degree of correctness of the ontology. Looking at the scope of the definitions and axioms, the extent of possible inferences has to be determined. To guide evaluation, the following criteria were identified: consistency, completeness, conciseness, expandability and sensitiveness.

Consistency describes whether valid input data can result in contradictory conclusions: “A definition is consistent if and only if the individual definition is consistent and no contradictory sentences can be inferred from other definitions or axioms.” (Gómez-Pérez, 2004)
