itec - European Schoolnet
iTEC Project
Title: ITEC-D10_2_V1-1 041102012.Docx
implementation mechanisms according to this analysis. We describe below the results of applying the above-mentioned methodology to the design of the SDE, identifying the design decisions taken for each level:
• Level 1. Definition of the Object of Decision. Of the four decision targets identified in MCDM (choice, sorting, ranking and describing the items), the SDE addresses the relevance calculation and ranking of elements according to their estimated utility. Note that the SDE must both identify candidate resources for performing an activity and provide a list of those resources sorted by relevance.
• Level 2. Criteria modelling (definition of families of criteria). To identify the factors to take into account for relevance calculation, we thoroughly analysed all the properties of each resource type included in the semantic model, selecting those that may have an actual impact on resource selection. Once the collection of relevant properties was selected, we drew up a document that was presented to and discussed with the Control Board, with the aim of obtaining an indication of the importance of each factor (cf. Appendix II). We then defined, for each of the selected properties, a formal criterion enabling the quantitative evaluation of
the resource according to that property, as stated by the methodology. According to the formal definition of a criterion in Section 1.2.1.1, the SDE takes the interval [0, 1] as the general evaluation scale. The criterion will thus take a value within this interval when considering users' ratings or explicit numerical values (e.g. tool, people or event ratings, tool functionality, people's expertise). For the remaining criteria, we adopt the following general rule: the criterion takes the value 0 if the option analysed has the worst possible rating; 0.5 if the criterion cannot be assessed because there is not enough information available; and 1 if the option is totally relevant according to the criterion. For example, the criterion of the cost factor applied to a tool will be assigned the value 1 when the tool is completely free of charge in the corresponding Technical Setting; 0.5 when cost information is not available; and 0 when tool usage is not free. Besides, when the object being assessed has some relevance according to a given criterion, the criterion will always take a value in the range (0.5, 1], to stress the difference between a clearly not recommended object (value 0) and other objects. There are two basic strategies to compute these values:
o Based on the number of properties (non-weighted properties): given a non-weighted property 13 on which we will establish a criterion, we measure how many relevant values are included in the resource:

value = 0.5 + 0.5 × (number of relevant values included in the resource / total number of relevant values requested)   (4)
This strategy is used, among others, for the criterion associated with the factor language applied to tools. According to it, if a LARG Context is developed in English and Spanish, the assigned value will be 0.5 + 0.5 × (1/2) = 0.75 for a tool available in only one of those two languages.
13 A non-weighted property is a property defining a specific characteristic of the resource without taking specific numerical values into account. It supports relations such as "resource X is available in language L" or "resource X is targeted to audience A".
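As an illustration of the criterion convention above and of the counting strategy of equation (4), the following minimal sketch assumes a mapping of the matched-value fraction into (0.5, 1]; the function names and data representation are hypothetical and not the SDE's actual implementation:

```python
# Sketch of the three-valued criterion convention and the
# non-weighted-property strategy. Names and data shapes are
# illustrative assumptions, not the SDE's real API.

def cost_criterion(cost):
    """Cost factor applied to a tool: 1 if free, 0.5 if unknown, 0 if paid."""
    if cost is None:          # no cost information available
        return 0.5
    return 1.0 if cost == 0 else 0.0

def non_weighted_criterion(resource_values, requested_values):
    """Counting strategy: fraction of requested values covered by the
    resource, mapped into (0.5, 1] so that any partially relevant
    resource scores above the 'unknown' value 0.5."""
    matched = len(set(resource_values) & set(requested_values))
    if matched == 0:
        return 0.0            # no relevant value: clearly not recommended
    return 0.5 + 0.5 * matched / len(requested_values)

# A LARG Context developed in English and Spanish, applied to a tool
# available only in English:
print(non_weighted_criterion(["en"], ["en", "es"]))   # 0.75
print(cost_criterion(0))                              # 1.0 (free tool)
print(cost_criterion(None))                           # 0.5 (cost unknown)
```

Mapping partial matches above 0.5 keeps a resource that covers only some of the requested values distinguishable from one whose relevance is simply unknown.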