
256 Verena Wolf

• P ⊑must_JY Q iff for all T ∈ T_pp:

  min_{P′∈fully_{0,1}(P‖T)} W_{P′} ≤ min_{Q′∈fully_{0,1}(Q‖T)} W_{Q′}.

We write P ⊑JY Q iff P ⊑may_JY Q and P ⊑must_JY Q. ⊓⊔

Two processes P and Q are related in ⊑may_JY if the "best" fully probabilistic resolution of P‖T has a total success probability that is at least the total success probability of the best fully probabilistic resolution of Q‖T. Note that ⊑may_JY boils down to ordinary simulation [Jon91] (when probabilities are omitted), which is another implementation relation for non-probabilistic processes.

A non-probabilistic process Q simulates a non-probabilistic process P if Q can "simulate" every step of P; the converse need not hold.

Jonsson and Yi also showed that ⊑may_JY coincides with their probabilistic simulation [JY02].
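The simulation condition above can be sketched as a greatest-fixpoint computation on a labelled transition system. The following is a minimal illustration; the state names, transition relation, and helper name are our own assumptions, not taken from the chapter.

```python
# Minimal sketch of (non-probabilistic) simulation checking on a labelled
# transition system, computed as a greatest fixpoint: start from the full
# relation and prune pairs that violate the simulation condition.

def simulates(trans, p0, q0):
    """Return True if q0 simulates p0: every a-step of p0 can be matched
    by an a-step of q0 into states that are again related."""
    states = set(trans)
    rel = {(p, q) for p in states for q in states}
    changed = True
    while changed:
        changed = False
        for (p, q) in list(rel):
            for (a, p2) in trans[p]:
                # q must offer an equally labelled step into a related state
                if not any(b == a and (p2, q2) in rel for (b, q2) in trans[q]):
                    rel.discard((p, q))
                    changed = True
                    break
    return (p0, q0) in rel

# Illustrative system: q can match every step of p, but not vice versa.
trans = {"p": [("a", "p1")], "p1": [],
         "q": [("a", "q1"), ("b", "q2")], "q1": [], "q2": []}
print(simulates(trans, "p", "q"))   # True: q simulates p
print(simulates(trans, "q", "p"))   # False: p cannot match the b-step
```

Note the asymmetry: q has an extra b-step that p need not match, which is exactly why the converse direction fails.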

Note that if P ⊑JY Q, the process Q is bounded in some sense by P (so again we assume that Q has the role of the implementation and P the role of the specification). The total success probabilities of Q have to lie in the interval

[min_{P′∈fully_{0,1}(P‖T)} W_{P′}, max_{P′∈fully_{0,1}(P‖T)} W_{P′}].

Thus P is a more abstract resolution of Q, and the requirements in P are fulfilled in this case because the "worst" resolution of Q‖T is at least as good as the worst resolution of P‖T.

Example. Consider P and Q in Figure 9.9, page 254. It can be shown that Q ⊑JY P. The composition with T, for example, yields

{0.5} = {W_{Q′} | Q′ ∈ fully_{0,1}(Q‖T)} ⊂ {W_{P′} | P′ ∈ fully_{0,1}(P‖T)} = {0, 0.5, 1}.
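The bounding property can be checked numerically for this single test, using the success-probability sets from the example; the helper name below is ours, not from the chapter.

```python
# Sketch: do all total success probabilities of the implementation's
# resolutions lie in [min, max] of the specification's resolutions?
# The sets are taken from the example: {0, 0.5, 1} for P||T, {0.5} for Q||T.

def bounded_by(w_impl, w_spec):
    """Check the interval-bounding condition for one test T."""
    return all(min(w_spec) <= w <= max(w_spec) for w in w_impl)

w_p = {0.0, 0.5, 1.0}   # {W_P' | P' in fully_{0,1}(P||T)}
w_q = {0.5}             # {W_Q' | Q' in fully_{0,1}(Q||T)}
print(bounded_by(w_q, w_p))   # True: Q's values lie in [0, 1]
```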

9.9 Compositional Testing of Markovian Processes

In this section, we briefly sketch a testing theory for action-labeled Markov chains as proposed by Bernardo and Cleaveland [BC00]. The idea of defining a notion of testing for aCTMCs is very similar to the approaches for fully probabilistic processes in Section 9.7, but here two quantities are taken into account when testing M1, M2 ∈ ACTMC with a Markovian test process T ∈ T^pa_τ:

(1) For i ∈ {1, 2}, consider all paths α in the fully probabilistic embeddings M′_i = φ_em(M_i‖T) ∈ FPP with success state lstate(α), and calculate their probabilities, where Pr^path_{M_i‖T} := Pr^path_{M′_i}.⁵

⁵ Here the path distribution of an aCTMC M is equal to the path distribution of the embedded fully probabilistic process φ_em(M).

⊓⊔
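The embedding φ_em mentioned in the footnote can be illustrated with a small sketch: the one-step probabilities of the embedded fully probabilistic process are the CTMC's transition rates normalized by each state's exit rate. The rate values and the function name below are illustrative assumptions, not notation from [BC00].

```python
# Illustrative sketch of the embedding of an action-labeled CTMC into a
# fully probabilistic process: divide each outgoing rate by the exit rate.

def embed(rates):
    """rates[s] maps successor states to positive transition rates R(s, t).
    Returns the one-step probabilities P(s, t) = R(s, t) / E(s) of the
    embedded fully probabilistic process, where E(s) is the exit rate."""
    probs = {}
    for s, succ in rates.items():
        exit_rate = sum(succ.values())            # E(s): total outgoing rate
        probs[s] = {t: r / exit_rate for t, r in succ.items()}
    return probs

rates = {"s0": {"s1": 2.0, "s2": 6.0},            # hypothetical rates
         "s1": {"s2": 1.0}}
print(embed(rates)["s0"])                          # {'s1': 0.25, 's2': 0.75}
```

As the footnote states, path probabilities computed in the embedded process agree with those of the original chain, which is what justifies reusing the fully probabilistic machinery of Section 9.7.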
