
Given:

• Instance space X: Each instance describes a pair of objects represented by the predicates Type, Color, Volume, Owner, Material, Density, and On.

• Hypothesis space H: Each hypothesis is a set of Horn clause rules. The head of each Horn clause is a literal containing the target predicate SafeToStack. The body of each Horn clause is a conjunction of literals based on the same predicates used to describe the instances, as well as the predicates LessThan, Equal, GreaterThan, and the functions plus, minus, and times. For example, the following Horn clause is in the hypothesis space:

    SafeToStack(x, y) ← Volume(x, vx) ∧ Volume(y, vy) ∧ LessThan(vx, vy)

• Target concept: SafeToStack(x, y)

• Training Examples: A typical positive example, SafeToStack(Obj1, Obj2), is shown below:

    On(Obj1, Obj2)            Owner(Obj1, Fred)
    Type(Obj1, Box)           Owner(Obj2, Louise)
    Type(Obj2, Endtable)      Density(Obj1, 0.3)
    Color(Obj1, Red)          Material(Obj1, Cardboard)
    Color(Obj2, Blue)         Material(Obj2, Wood)
    Volume(Obj1, 2)

• Domain Theory B:

    SafeToStack(x, y) ← ¬Fragile(y)
    SafeToStack(x, y) ← Lighter(x, y)
    Lighter(x, y) ← Weight(x, wx) ∧ Weight(y, wy) ∧ LessThan(wx, wy)
    Weight(x, w) ← Volume(x, v) ∧ Density(x, d) ∧ Equal(w, times(v, d))
    Weight(x, 5) ← Type(x, Endtable)
    Fragile(x) ← Material(x, Glass)

Determine:

• A hypothesis from H consistent with the training examples and domain theory.

TABLE 11.1
An analytical learning problem: SafeToStack(x,y).
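To make this representation concrete, here is a minimal Python sketch (not from the text): the training-example literals become ground facts, and the example Horn clause hypothesis becomes a rule evaluated over those facts. The fact set and the lookup helper are illustrative choices, not part of the original formulation.

    # Ground literals describing the positive example SafeToStack(Obj1, Obj2),
    # taken directly from Table 11.1.
    facts = {
        ("On", "Obj1", "Obj2"),
        ("Type", "Obj1", "Box"),
        ("Type", "Obj2", "Endtable"),
        ("Color", "Obj1", "Red"),
        ("Color", "Obj2", "Blue"),
        ("Material", "Obj1", "Cardboard"),
        ("Material", "Obj2", "Wood"),
        ("Owner", "Obj1", "Fred"),
        ("Owner", "Obj2", "Louise"),
        ("Density", "Obj1", 0.3),
        ("Volume", "Obj1", 2),
    }

    def lookup(pred, obj):
        """Return v such that pred(obj, v) is a known fact, or None if absent."""
        for p, a, v in facts:
            if p == pred and a == obj:
                return v
        return None

    def example_hypothesis(x, y):
        """SafeToStack(x, y) <- Volume(x, vx) ^ Volume(y, vy) ^ LessThan(vx, vy)."""
        vx, vy = lookup("Volume", x), lookup("Volume", y)
        return vx is not None and vy is not None and vx < vy

For instance, example_hypothesis("Obj1", "Obj2") returns False here simply because the instance description lists no Volume literal for Obj2.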

As shown in Table 11.1, we have chosen a hypothesis space H in which each hypothesis is a set of first-order if-then rules, or Horn clauses (throughout this chapter we follow the notation and terminology for first-order Horn clauses summarized in Table 10.3). For instance, the example Horn clause hypothesis shown in the table asserts that it is SafeToStack any object x on any object y, if the Volume of x is LessThan the Volume of y (in this Horn clause the variables vx and vy represent the volumes of x and y, respectively). Note the Horn clause hypothesis can refer to any of the predicates used to describe the instances, as well as several additional predicates and functions. A typical positive training example, SafeToStack(Obj1, Obj2), is also shown in the table.
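The domain theory B of Table 11.1 can be encoded in the same style. The sketch below (again an illustration rather than the book's code, reusing the facts and lookup helper from the sketch above, and reading ¬Fragile as negation-as-failure for simplicity) chains B's clauses on this example: Weight(Obj1) = 2 × 0.3 = 0.6, Weight(Obj2) = 5 by the Endtable clause, LessThan(0.6, 5) holds, so Lighter(Obj1, Obj2) and hence SafeToStack(Obj1, Obj2).

    def weight(obj):
        """Weight(x, w) <- Volume(x, v) ^ Density(x, d) ^ Equal(w, times(v, d))
           Weight(x, 5) <- Type(x, Endtable)"""
        v, d = lookup("Volume", obj), lookup("Density", obj)
        if v is not None and d is not None:
            return v * d                       # Weight(Obj1) = 2 * 0.3 = 0.6
        if lookup("Type", obj) == "Endtable":
            return 5                           # Weight(Obj2) = 5
        return None

    def lighter(x, y):
        """Lighter(x, y) <- Weight(x, wx) ^ Weight(y, wy) ^ LessThan(wx, wy)"""
        wx, wy = weight(x), weight(y)
        return wx is not None and wy is not None and wx < wy

    def fragile(obj):
        """Fragile(x) <- Material(x, Glass)"""
        return lookup("Material", obj) == "Glass"

    def safe_to_stack(x, y):
        """SafeToStack(x, y) <- not Fragile(y); SafeToStack(x, y) <- Lighter(x, y)
           (negation is treated here as negation-as-failure, a simplification)."""
        return (not fragile(y)) or lighter(x, y)

    print(safe_to_stack("Obj1", "Obj2"))       # True: 0.6 < 5, so Lighter holds

This chain of clause applications is the kind of explanation the domain theory is expected to supply for a positive example, as discussed next.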

To formulate this task as an analytical learning problem we must also provide a domain theory sufficient to explain why observed positive examples satisfy the target concept. In our earlier chess example, the domain theory corresponded to knowledge of the legal moves in chess, from which we constructed explanations
