
[Figure 10.2 diagrams: the resolution rule (left) and its inverse (right), drawn over the clauses PassExam ∨ ¬KnowMaterial and KnowMaterial ∨ ¬Study.]

FIGURE 10.2
On the left, an application of the (deductive) resolution rule inferring clause C from the given clauses C1 and C2. On the right, an application of its (inductive) inverse, inferring C2 from C and C1.

that occurs in C1 but not in C must be the literal removed by the resolution rule, and therefore its negation must occur in C2. In our example, this indicates that C2 must contain the literal ¬D. Hence, C2 = A ∨ ¬D. The reader can easily verify that applying the resolution rule to C1 and C2 does, in fact, produce the desired resolvent C.

Notice there is a second possible solution for C2 in the above example. In particular, C2 can also be the more specific clause A ∨ ¬D ∨ B. The difference between this and our first solution is that we have now included in C2 a literal that occurred in C1. The general point here is that inverse resolution is not deterministic: in general there may be multiple clauses C2 such that C1 and C2 produce the resolvent C. One heuristic for choosing among the alternatives is to prefer shorter clauses over longer clauses, or equivalently, to assume C2 shares no literals in common with C1. If we incorporate this bias toward short clauses, the general statement of this inverse resolution procedure is as shown in Table 10.6.
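To make the two alternatives concrete, here is a minimal Python sketch, assuming the running example's clauses are C = A ∨ B and C1 = B ∨ D (consistent with the solutions C2 = A ∨ ¬D and A ∨ ¬D ∨ B discussed above); the set-based clause representation and the helper names negate and resolve are illustrative, not from the text. It checks that both candidate clauses C2 resolve with C1 to produce C.

def negate(literal):
    symbol, is_negated = literal
    return (symbol, not is_negated)

def resolve(c1, c2, literal):
    """Deductive resolution: drop `literal` from c1 and its negation from c2,
    then take the union of the remaining literals."""
    assert literal in c1 and negate(literal) in c2
    return (c1 - {literal}) | (c2 - {negate(literal)})

A, B, D = ("A", False), ("B", False), ("D", False)
not_D = negate(D)

C  = frozenset({A, B})                  # resolvent C = A ∨ B (assumed running example)
C1 = frozenset({B, D})                  # given clause C1 = B ∨ D (assumed)

c2_short    = frozenset({A, not_D})     # C2 = A ∨ ¬D: shares no literals with C1
c2_specific = frozenset({A, not_D, B})  # C2 = A ∨ ¬D ∨ B: also includes B from C1

for c2 in (c2_short, c2_specific):
    assert resolve(C1, c2, D) == C      # both choices produce the desired resolvent C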

We can develop rule-learning algorithms based on inverse entailment operators such as inverse resolution. In particular, the learning algorithm can use inverse entailment to construct hypotheses that, together with the background information, entail the training data. One strategy is to use a sequential covering algorithm to iteratively learn a set of Horn clauses in this way. On each iteration, the algorithm selects a training example (xi, f(xi)) that is not yet covered by previously learned clauses. The inverse resolution rule is then applied to


1. Given initial clauses C1 and C, find a literal L that occurs in clause C1, but not in clause C.

2. Form the second clause C2 by including the following literals:

   C2 = (C - (C1 - {L})) ∪ {¬L}

TABLE 10.6
Inverse resolution operator (propositional form). Given two clauses C and C1, this computes a clause C2 such that C1 ∧ C2 ⊢ C.
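As a rough illustration of this operator, the following Python sketch implements the two steps of Table 10.6 under the short-clause bias; the clause representation (frozensets of (symbol, negated) literal pairs) and the function name inverse_resolve are illustrative assumptions, not from the text.

def negate(literal):
    symbol, is_negated = literal
    return (symbol, not is_negated)

def inverse_resolve(c, c1):
    """Given resolvent C and clause C1, return a clause C2 such that C1 ∧ C2 ⊢ C,
    assuming exactly one literal L of C1 is absent from C and that C2 shares no
    literals with C1 (the bias toward short clauses)."""
    missing = [lit for lit in c1 if lit not in c]    # step 1: find the literal L
    if len(missing) != 1:
        raise ValueError("expected exactly one literal of C1 to be absent from C")
    L = missing[0]
    # Step 2: C2 = (C - (C1 - {L})) ∪ {¬L}
    return frozenset(c - (c1 - {L})) | {negate(L)}

# Running example: C = A ∨ B and C1 = B ∨ D give C2 = A ∨ ¬D.
A, B, D = ("A", False), ("B", False), ("D", False)
assert inverse_resolve(frozenset({A, B}), frozenset({B, D})) == frozenset({A, negate(D)})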
