Lecture Notes in Computer Science 3472

19 Model Checking 595

queries. At this point we can implement the observation pack algorithm in two different ways:

(1) Check for closedness (and rebuild the automaton) from scratch every time. This means n(k+1)n² queries.

(2) Use the fact that access strings are never removed from the pack. This means that the set of queries asked in one closedness check is a subset of the queries to be asked in the next one. So, the total number of different queries over all checks is at most (k+1)n². We can avoid repeating queries by recording all answers to membership queries, at the expense of using more memory.
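Recording all answers amounts to placing a cache in front of the membership oracle, so that repeated closedness checks only pay for distinct queries. The following is an illustrative sketch, not part of the chapter's algorithms; the class and the toy target language are invented for the example:

```python
# Illustrative sketch: record every membership answer so that repeated
# closedness checks never re-ask the Teacher the same query.
class CachingOracle:
    def __init__(self, teacher):
        self.teacher = teacher      # the real (expensive) membership oracle
        self.cache = {}             # word -> recorded answer
        self.calls = 0              # queries actually forwarded to the teacher

    def member(self, word):
        # Successive closedness checks re-ask many queries; answering from
        # the cache bounds the number of *distinct* Teacher queries.
        if word not in self.cache:
            self.calls += 1
            self.cache[word] = self.teacher(word)
        return self.cache[word]

# Toy usage: target language = strings with an even number of 'a's.
oracle = CachingOracle(lambda w: w.count("a") % 2 == 0)
for _ in range(3):                          # three "closedness checks"
    answers = [oracle.member(w) for w in ("", "a", "aa", "ab")]
print(oracle.calls)                         # prints 4: 4 distinct queries, not 12
```

The trade-off is exactly the one stated above: the cache grows with the number of distinct queries, so memory is exchanged for queries.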

Now consider the queries used to process the counterexample. If we do not insist on obtaining the shortest distinguishing experiment, we can use Rivest and Schapire's binary search. This means using O(log m) queries for each counterexample, hence O(n log m) for the at most n counterexamples.
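The binary search can be sketched as follows, assuming we can run the hypothesis on a word (`run`), read off the access string of a hypothesis state (`access`), and ask membership queries (`member`); all of these names are hypothetical helpers, not notation from the chapter:

```python
# Hedged sketch of Rivest-Schapire-style binary search over a counterexample.
# alpha(i) asks whether the word obtained by replacing the prefix cex[:i]
# with the access string of the hypothesis state it reaches is in the
# target language; alpha(0) and alpha(len(cex)) differ because cex is a
# counterexample, so some adjacent pair alpha(i) != alpha(i+1) must exist.
def find_breakpoint(cex, run, access, member):
    """Return i with alpha(i) != alpha(i+1), using O(log m) queries."""
    def alpha(i):
        return member(access(run(cex[:i])) + cex[i:])
    lo, hi = 0, len(cex)
    while hi - lo > 1:              # each iteration: one membership query
        mid = (lo + hi) // 2
        if alpha(mid) == alpha(0):
            lo = mid
        else:
            hi = mid
    return lo

# Toy usage: target = words ending in 'a'; hypothesis = one rejecting state,
# so run() always yields state 0 and its access string is the empty word.
bp = find_breakpoint("ba",
                     run=lambda w: 0,
                     access=lambda s: "",
                     member=lambda w: w.endswith("a"))
```

The flip at the returned index yields a distinguishing experiment (the suffix after the breakpoint), which is in general not the shortest one, matching the remark above.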

In total, the algorithm that records all answers uses at most O(kn² + n log m) membership queries.

This is also the cost of the reduced observation table algorithm, provided the same data structure recording all answers to membership queries is employed.

The discrimination tree algorithm, as described in [KV94], rebuilds the automaton from scratch every time and processes the counterexample sequentially, so it uses O(kn³ + nm) membership queries. It is not difficult, however, to make it record previous queries and use binary search to process the counterexample. This modified version has a cost of O(kn² + n log m).

In the observation table algorithm, the number of columns in the table is at most n, but the number of rows can be as large as O(knm) because all prefixes of counterexamples are added as rows. Consequently, the number of queries can be up to O(kn²m).

It can be shown that any algorithm making only O(n) equivalence queries has to make at least Ω(kn log n) membership queries. Further results on lower bounds can be found in [BDGW97].

Note, however, that these results are worst-case estimations. In practice, one might trade membership queries for equivalence queries. Experiences with learning algorithms are given in Section 19.4.8.

19.4.7 Domain-Specific Optimizations

The number of queries can be expected to be a limiting factor in practice. Let us study optimizations for learning that are possible when certain further information about the system to be learned is provided. The rationale of the presented approach is that in practice, one is often concerned with learning a certain reactive system that can be understood as a special deterministic finite state automaton [HNS03].

The general concept of the optimizations presented here is that instead of the Teacher, an Assistant is queried, which might either answer a query by consulting the Teacher or, when possible, deduce the answer to the query using
