
To illustrate how the degree of precision of data and certainty of knowledge can be modeled with the notion of probability, let us consider example 9.1.

Example 9.1: Consider the following production rule PRi.

PRi: { IF (the-observed-evidence-is-rash) (x),
      AND (the-observed-evidence-is-fever) (y),
      AND (the-observed-evidence-is-high-pain) (z),
      THEN (the-patient-bears-German-Measles) } (CF)

where x, y, z denote the degrees of precision / belief / conditional probability that the patient bears a symptom, given that he has German Measles. On the other hand, CF represents the degree of certainty of the rule, i.e., the certainty factor / probability that the patient bears German Measles, given the prior occurrence of the antecedent clauses. It is now clear that the same problem can be modeled by many of the existing techniques; of course, reasoning by a particular technique requires a particular kind of parameter.
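As an aside, a rule of this form translates directly into a small program. The following Python sketch is illustrative only: the function name fire_pri and all figures are hypothetical, and it assumes a MYCIN-style combination in which the antecedent beliefs are fused by min(x, y, z) and then scaled by the rule's CF; the text itself has not yet committed to any particular combination scheme.

def fire_pri(x: float, y: float, z: float, cf: float) -> float:
    """Belief that the patient bears German Measles (rule PRi).

    x, y, z : beliefs that the patient shows rash, fever and high pain.
    cf      : certainty factor attached to the rule itself.
    """
    antecedent_belief = min(x, y, z)  # weakest antecedent (assumed scheme)
    return antecedent_belief * cf

# Hypothetical figures for illustration only:
print(fire_pri(x=0.9, y=0.8, z=0.6, cf=0.7))  # prints 0.42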

9.2.1 Bayesian Reasoning

Under this context, we are supposed to compute P(Hi / Ej) or P(Hi / E1, E2, ..., Em), where Hi represents a hypothesis and Ej represents an observed evidence, 1 ≤ j ≤ m. With respect to a medical diagnosis problem, let Hi and Ej denote the i-th disease and the j-th symptom respectively. It is to be noted that under Bayesian reasoning we have to compute the inverse probability P(Hi / Ej), rather than the original probability P(Ej / Hi). Before describing Bayesian reasoning, let us revise our knowledge of conditional probabilities.

Definition 9.1: Conditional probability [1] P(H / E) is given by

P(H / E) = P(H ∩ E) / P(E) = P(H & E) / P(E),

where H and E are two events and P(H ∩ E) denotes the probability of the joint occurrence of the events H and E. Analogously,

P(E / H) = P(E & H) / P(H).

Now, since P(E & H) = P(H & E), we find

P(E & H) = P(E / H) . P(H) = P(H / E) . P(E),

whence P(H / E) = P(E / H) . P(H) / P(E), which is the well-known Bayes' theorem.
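To make the inversion concrete, the Python sketch below applies Bayes' theorem to a hypothetical diagnosis. All numbers and hypothesis names are illustrative, and the multi-symptom extension assumes, beyond what the text states here, that the evidences are conditionally independent given the hypothesis (a naive-Bayes assumption).

def bayes(p_e_given_h: float, p_h: float, p_e: float) -> float:
    """Bayes' theorem: P(H / E) = P(E / H) * P(H) / P(E)."""
    return p_e_given_h * p_h / p_e

# Hypothetical single-symptom case:
# P(fever / measles) = 0.8, P(measles) = 0.01, P(fever) = 0.1
print(bayes(0.8, 0.01, 0.1))  # P(measles / fever) = 0.08

# For several symptoms E1, ..., Em, assuming conditional independence
# given H: P(H / E1, ..., Em) is proportional to P(H) * product of
# P(Ej / H), renormalized over all competing hypotheses.
def posterior(priors: dict, likelihoods: dict, evidences: list) -> dict:
    """priors[h] = P(h); likelihoods[h][e] = P(e / h)."""
    scores = {h: priors[h] for h in priors}
    for h in priors:
        for e in evidences:
            scores[h] *= likelihoods[h][e]
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

likelihoods = {
    "measles": {"rash": 0.9, "fever": 0.8},
    "flu":     {"rash": 0.1, "fever": 0.9},
}
print(posterior({"measles": 0.01, "flu": 0.05},
                likelihoods, ["rash", "fever"]))
# measles is roughly 0.62, flu roughly 0.38 under these made-up figures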
