
D2.1 Requirements and Specification - CORBYS

Figure 27: Case-Based Reasoning process (diagram of the cycle Problem, 1 Retrieval, 2 Selection, 3 Presentation, 4 Retention around a Case-Based Repository, with rules applied to modify selected cases)

1. The problem is entered into the system and analysed.

2. The case which most closely matches the details of the problem is selected.

3. This case is modified to better fit the problem using predefined rules. This becomes the solution which is presented.

4. The solution is analysed for its effect and is stored for future use along with an indication of its success. This allows the system to learn (the sketch below illustrates this cycle).
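The four steps above can be read as a retrieve-select-adapt-retain loop. The following Python sketch is purely illustrative: the case structure, the naive similarity measure and the adaptation-rule interface are assumptions made for the example, not part of the CORBYS specification.

```python
# Minimal sketch of the retrieve-select-adapt-retain cycle described above.
# Case structure, similarity measure and adaptation rules are hypothetical.
from dataclasses import dataclass


@dataclass
class Case:
    problem: dict            # attribute -> value description of the situation
    solution: dict           # solution that was applied
    successful: bool = True  # indication of how successful the solution was


def naive_similarity(a: dict, b: dict) -> float:
    """Fraction of attributes on which two problem descriptions agree."""
    keys = set(a) | set(b)
    return sum(1 for k in keys if a.get(k) == b.get(k)) / len(keys) if keys else 0.0


class CaseBasedReasoner:
    def __init__(self, repository: list, adaptation_rules: list):
        self.repository = repository              # the case-based repository
        self.adaptation_rules = adaptation_rules  # predefined modification rules

    def solve(self, problem: dict) -> dict:
        # Steps 1-2: analyse the problem and select the closest matching case.
        best = max(self.repository, key=lambda c: naive_similarity(c.problem, problem))
        # Step 3: apply predefined rules to modify the selected case into a solution.
        solution = dict(best.solution)
        for rule in self.adaptation_rules:
            solution = rule(problem, solution)
        return solution

    def retain(self, problem: dict, solution: dict, successful: bool) -> None:
        # Step 4: store the outcome so both successes and failures inform future runs.
        self.repository.append(Case(problem, solution, successful))
```

Retaining failed cases (successful=False) alongside successful ones is what enables the failure-driven learning discussed in the next paragraph.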

The system should be able to learn not only from its successful solutions, but also from its failed solutions (success-driven and failure-driven learning). Successful solutions can be stored so that they can be reused without being regenerated. Solutions which failed to solve the problem can be used to generate better solutions at a later stage. CBR can be used to identify situations presented as cases from a repository of known situations; however, the mapping of a situation to a case and the subsequent measurement of similarity between two cases pose a problem. Furthermore, the existence of an associated solution becomes irrelevant as soon as the situation is matched to a template or a case from the repository.
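To make the similarity problem concrete, the sketch below shows one common way of scoring similarity between two cases once a situation has been mapped to a mixed numeric/symbolic attribute vector. The attributes, weights and normalisation ranges are invented for illustration and would have to be defined per domain.

```python
# Hypothetical weighted similarity over mixed numeric/symbolic case attributes.
def attribute_similarity(value_a, value_b, value_range=None) -> float:
    """Local similarity: exact match for symbols, scaled distance for numbers."""
    if isinstance(value_a, (int, float)) and isinstance(value_b, (int, float)):
        span = value_range if value_range else 1.0
        return max(0.0, 1.0 - abs(value_a - value_b) / span)
    return 1.0 if value_a == value_b else 0.0


def case_similarity(case_a: dict, case_b: dict, weights: dict, ranges: dict) -> float:
    """Global similarity: weighted average of local attribute similarities."""
    total_weight = sum(weights.values())
    score = sum(
        w * attribute_similarity(case_a.get(attr), case_b.get(attr), ranges.get(attr))
        for attr, w in weights.items()
    )
    return score / total_weight if total_weight else 0.0


# Example with made-up situation attributes (illustrative only):
weights = {"speed": 2.0, "surface": 1.0, "load": 1.0}
ranges = {"speed": 2.0, "load": 10.0}
a = {"speed": 0.8, "surface": "flat", "load": 3}
b = {"speed": 1.1, "surface": "flat", "load": 5}
print(case_similarity(a, b, weights, ranges))  # 0.875
```

Choosing the attributes, weights and local measures is exactly the mapping problem noted above; a poor choice makes superficially similar situations look alike even when their appropriate solutions differ.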

11.3.4 Bayesian Networks<br />


Bayesian networks are probabilistic graphical models that represent a set of random variables and their conditional dependencies via directed acyclic graphs. Bayesian networks are commonly used for probabilistic reasoning in the context of situation assessment (Das et al. 2002; Bladon et al. 2002; Higgins 2005). Bayesian networks are based on Bayes' theorem, which computes the posterior or inverse probability of a proposition, i.e., given the prior or unconditional probabilities of A and B, and knowing the conditional probability of B given A, what is the conditional probability of A given B? Nodes in the network represent propositions, random variables, unknown parameters or hypotheses. Nodes are connected by edges that represent conditional dependencies, and unconnected nodes therefore represent variables that are conditionally independent. Each node has an associated probability function that returns the probability of the variable represented by the node, given a set of values for its parent nodes as input. Prior probabilities and conditional probabilities have to be specified for each node in the network. Inverse probabilities for each node can then be computed using Bayes' rule as new input is received by the network.
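As a worked illustration of the inverse-probability computation described above, the following Python snippet evaluates Bayes' rule for a minimal two-node network A -> B; the prior and conditional probabilities are invented purely for the example.

```python
# Minimal two-node Bayesian network A -> B with invented probabilities.
p_a = 0.3              # prior (unconditional) probability P(A)
p_b_given_a = 0.9      # conditional probability P(B | A)
p_b_given_not_a = 0.2  # conditional probability P(B | not A)

# Unconditional probability of B, obtained by marginalising over A.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)   # = 0.41

# Bayes' rule: posterior (inverse) probability of A given that B was observed.
p_a_given_b = p_b_given_a * p_a / p_b                   # = 0.27 / 0.41 ~ 0.659

print(f"P(B) = {p_b:.2f}, P(A|B) = {p_a_given_b:.3f}")
```

In a larger network the same computation is propagated through the graph by an inference algorithm rather than evaluated by hand.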

In the context of situation assessment, Das et al. (2002) lists two important points that must be observed when<br />

