
FOCL uses a domain theory represented by first-order Horn clauses to learn a set of Horn clauses that approximate the target function. FOCL employs a sequential covering algorithm, learning each Horn clause by a general-to-specific search. The domain theory is used to augment the set of next more specific candidate hypotheses considered at each step of this search. Candidate hypotheses are then evaluated based on their performance over the training data. In this way, FOCL combines the greedy, general-to-specific inductive search strategy of FOIL with the rule-chaining reasoning of analytical methods.

The question of how to best blend prior knowledge with new observations remains one of the key open questions in machine learning.

There are many more examples of algorithms that attempt to combine inductive and analytical learning. For example, methods for learning Bayesian belief networks, discussed in Chapter 6, provide one alternative to the approaches discussed here. The references at the end of this chapter provide additional examples and sources for further reading.

EXERCISES

12.1. Consider learning the target concept GoodCreditRisk defined over instances described by the four attributes HasStudentLoan, HasSavingsAccount, IsStudent, and OwnsCar. Give the initial network created by KBANN for the following domain theory, including all network connections and weights.

GoodCreditRisk ← Employed, LowDebt
Employed ← ¬IsStudent
LowDebt ← ¬HasStudentLoan, HasSavingsAccount
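
As a reminder of the construction the exercise relies on, here is a sketch under the chapter's KBANN conventions, assuming the usual weight constant W = 4.0; the clause encoding below is hypothetical, and the exercise additionally asks for the full network, including the low-weight connections KBANN adds from the remaining attributes (such as OwnsCar). KBANN creates one sigmoid unit per clause, weighting each non-negated antecedent by W, each negated antecedent by -W, and setting the bias to -(n - 0.5)W, where n is the number of non-negated antecedents.

```python
# Sketch of KBANN's clause-to-unit translation (W = 4.0 is the chapter's
# typical choice; the (attribute, negated?) encoding is hypothetical).
W = 4.0

def clause_to_unit(antecedents):
    """antecedents: list of (attribute_name, negated) pairs.
    Returns (weights, bias) for one sigmoid unit implementing the clause."""
    weights = {attr: (-W if negated else W) for attr, negated in antecedents}
    n_positive = sum(1 for _, negated in antecedents if not negated)
    bias = -(n_positive - 0.5) * W   # unit fires only if the clause holds
    return weights, bias

# The domain theory of Exercise 12.1, one entry per clause:
theory = {
    "GoodCreditRisk": [("Employed", False), ("LowDebt", False)],
    "Employed":       [("IsStudent", True)],
    "LowDebt":        [("HasStudentLoan", True), ("HasSavingsAccount", False)],
}
for head, body in theory.items():
    print(head, clause_to_unit(body))
```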

12.2. KBANN converts a set of propositional Horn clauses into an initial neural network. Consider the class of n-of-m clauses, which are Horn clauses containing m literals in the preconditions (antecedents) and an associated parameter n, where n ≤ m. The preconditions of an n-of-m Horn clause are considered to be satisfied if at least n of its m preconditions are satisfied. For example, the clause

Student ← LivesInDorm, Young, Studies; n = 2

asserts that one is a Student if at least two of these three preconditions are satisfied. Give an algorithm, similar to that used by KBANN, that accepts a set of propositional n-of-m clauses and constructs a neural network consistent with the domain theory.
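
For intuition before writing the algorithm: an n-of-m clause maps naturally onto a single sigmoid unit whose bias lets any n of its m inputs clear the threshold. The sketch below assumes all m literals are non-negated and reuses W = 4.0 as the weight constant; it handles a single clause, not the full set of clauses the exercise asks for.

```python
# Sketch: one sigmoid unit implementing an n-of-m clause whose m literals
# are all non-negated (W = 4.0 follows the chapter's KBANN convention).
import math

W = 4.0

def n_of_m_unit(m_literals, n):
    """Return (weights, bias) activating when at least n inputs are 1."""
    weights = {lit: W for lit in m_literals}
    bias = -(n - 0.5) * W            # n true inputs clear the threshold
    return weights, bias

def activate(weights, bias, inputs):
    net = bias + sum(w * inputs[lit] for lit, w in weights.items())
    return 1.0 / (1.0 + math.exp(-net))   # sigmoid output

w, b = n_of_m_unit(["LivesInDorm", "Young", "Studies"], n=2)
print(activate(w, b, {"LivesInDorm": 1, "Young": 1, "Studies": 0}))  # > 0.5
print(activate(w, b, {"LivesInDorm": 1, "Young": 0, "Studies": 0}))  # < 0.5
```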

12.3. Consider extending KBANN to accept a domain theory consisting of first-order rather than propositional Horn clauses (i.e., Horn clauses containing variables, as in Chapter 10). Either give an algorithm for constructing a neural network equivalent to a set of Horn clauses, or discuss the difficulties that prevent this.

12.4. This exercise asks you to derive a gradient descent rule analogous to that used by TANGENTPROP. Consider the instance space X consisting of the real numbers, and consider the hypothesis space H consisting of quadratic functions of x. That is, each hypothesis has the form h(x) = w0 + w1 x + w2 x^2.
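
As a starting point, here is a sketch of a TANGENTPROP-style objective for this hypothesis space, not the derivation the exercise asks for; the single tangent direction d/dx, the assumed target derivative of zero, and the constants mu and eta are all illustrative assumptions. The training error adds to the usual squared residual a penalty on the mismatch between the hypothesis's derivative and the known derivative of the target, and the update rule follows by differentiating both terms with respect to each weight.

```python
# Sketch: gradient descent on a TangentProp-style objective for
# h(x) = w0 + w1*x + w2*x^2.  The tangent term assumes the prior knowledge
# "the target is locally flat" (target derivative 0); mu and the learning
# rate eta are illustrative constants, not taken from the text.

def step(w, data, mu=0.1, eta=0.01):
    """One gradient step. data: list of (x, f(x)) training pairs."""
    w0, w1, w2 = w
    g0 = g1 = g2 = 0.0
    for x, fx in data:
        h = w0 + w1 * x + w2 * x * x     # hypothesis value
        dh = w1 + 2.0 * w2 * x           # dh/dx, the tangent of h
        r = h - fx                       # value residual
        # d/dw of (h - f)^2  plus  mu * d/dw of (dh/dx - 0)^2:
        g0 += 2.0 * r
        g1 += 2.0 * r * x + mu * 2.0 * dh
        g2 += 2.0 * r * x * x + mu * 2.0 * dh * 2.0 * x
    return (w0 - eta * g0, w1 - eta * g1, w2 - eta * g2)

w = (0.0, 0.0, 0.0)
for _ in range(200):
    w = step(w, [(0.0, 1.0), (1.0, 1.2), (2.0, 0.9)])
print(w)
```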
