
know f? These questions can be answered with the help of computational learning theory.


The principle of computational learning theory states that any hypothesis which is sufficiently incorrect will, with high probability, be detected after experimenting with only a small number of training instances. Consequently, a hypothesis that is consistent with a large number of problem instances is unlikely to be seriously wrong, and hence should be Probably Approximately Correct (PAC).

PAC learning was defined in the last paragraph w.r.t. training instances. But what about the validity of PAC learning on the test set (rather than the training set) of data? The assumption made here is that the training and the test data are selected randomly from the population space with the same probability distribution. In PAC learning theory this is referred to as the stationary assumption [13].
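The principle stated above can be checked numerically: a hypothesis whose true error rate is ε agrees with all of m i.i.d. training instances with probability (1 − ε)^m, which shrinks rapidly as m grows. The following is a minimal sketch of this idea; the function name and the particular values ε = 0.1, m = 50 are illustrative choices, not taken from the text.

```python
import random

def survival_probability(error_rate, m, trials=10000, seed=0):
    """Estimate the probability that a hypothesis whose true error is
    `error_rate` is consistent with (i.e. survives) m training instances
    drawn i.i.d. -- the event PAC learning says should be unlikely."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        # The hypothesis survives only if it agrees with the target
        # concept on every one of the m random examples.
        if all(rng.random() >= error_rate for _ in range(m)):
            survived += 1
    return survived / trials

# A hypothesis that is wrong on 10% of the instance space is almost
# never consistent with 50 random training examples; analytically the
# chance is (1 - 0.1)**50, i.e. about half a percent.
print(survival_probability(0.1, 50))
```

This is exactly the sense in which a "sufficiently incorrect" hypothesis is detected with high probability after only a modest number of training instances.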

Grandfather (X, Z) ∨ ¬Mother (Y, Z) ∨ ¬Father (X, Y)     Mother (sita, lob)

Grandfather (X, lob) ∨ ¬Father (X, sita)     Father (janak, sita)

Grandfather (janak, lob)

In order to formalize the PAC learning theory, we need a few notations.

Let X = exhaustive set of examples,

Fig. 13.14: A two-step inverse resolution, with substitutions {Y / sita, Z / lob} and {X / janak}, to derive Mother (Y, Z) ∧ Father (X, Y) → Grandfather (X, Z).
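The forward resolution steps that Fig. 13.14 inverts can be replayed mechanically to confirm that the derived rule, together with the two facts, entails Grandfather (janak, lob). The sketch below uses a deliberately simple clause representation (tuples for literals, '+'/'-' signs for polarity); the helper names and data layout are my own, not from the text.

```python
def substitute(literal, theta):
    """Apply substitution theta to a literal, e.g. ('Grandfather','X','Z')
    under {'Z': 'lob'} becomes ('Grandfather', 'X', 'lob')."""
    pred, *args = literal
    return (pred, *[theta.get(a, a) for a in args])

def resolve(clause, fact, theta):
    """Resolve a clause (a set of (sign, literal) pairs) with a positive
    ground fact under substitution theta: ground the clause, then remove
    the negative literal that matches the fact."""
    grounded = {(sign, substitute(lit, theta)) for sign, lit in clause}
    assert ('-', fact) in grounded, "fact must match a negated literal"
    return grounded - {('-', fact)}

# The rule: Grandfather(X,Z) v ~Mother(Y,Z) v ~Father(X,Y)
clause = {('+', ('Grandfather', 'X', 'Z')),
          ('-', ('Mother', 'Y', 'Z')),
          ('-', ('Father', 'X', 'Y'))}

# Step 1: resolve with Mother(sita, lob) under {Y/sita, Z/lob},
# leaving Grandfather(X, lob) v ~Father(X, sita).
step1 = resolve(clause, ('Mother', 'sita', 'lob'),
                {'Y': 'sita', 'Z': 'lob'})

# Step 2: resolve with Father(janak, sita) under {X/janak},
# leaving the unit clause Grandfather(janak, lob).
step2 = resolve(step1, ('Father', 'janak', 'sita'), {'X': 'janak'})
print(step2)  # {('+', ('Grandfather', 'janak', 'lob'))}
```

Inverse resolution, as used in inductive logic programming, runs these two steps in the opposite direction: starting from the ground facts and the conclusion, it reconstructs the general clause at the top of the figure.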
