
Dynamic State Restoration Using Versioning ... - Purdue University

method preserves sequential consistency, but is arguably inefficient. As part of our future work, we plan to examine extensions of our analysis to handle multi-threaded programs. The semantics for versioning exceptions presented here are meaningful only for those parts of program state that can be 'undone'. For state such as I/O, restoration is not possible in the absence of an agreed-upon 'restore' definition. Recent work by Harris et al. [17] formalizes the separation between irrevocable actions such as I/O and revocable transactional behavior by generalizing the Haskell type system. Such an approach could be used to check the validity of raising a versioning exception in the presence of I/O.

Proof of safety

We show that saving and restoring the elements in Γ is safe. Our notion of safety captures the intuition that no modification to the store made in an exception expression is visible in the continuation of a try-block if a versioning exception is raised in the dynamic context of that block. We use two auxiliary functions, Live and locs, in the safety theorem. Live : Label → P(Var) associates an expression label with the set of variables live at the beginning of that expression. For each expression in its input set, locs returns the set of all possible locations that could be returned by the evaluation of that expression. We state this safety property formally:

THEOREM 1. Let [try(x, [e1]^l1)]^l ∈ P, K(l) = {lk1, …, lkn}, and let eval(P) = ⟨ρ, σ, halt⟨v⟩⟩. Then,

1. locs(Γ(l)) ⊆ Dom(σ)
2. locs(Γ(l)) ⊇ locs(Live(l1)) ∩ locs(U(lki)) ∩ locs(M(l1)), for 1 ≤ i ≤ n.

Proof. The first part of the theorem is trivial, because the locations obtained in path are always in the store σ. We prove the second part by induction on the structure of expressions. The only interesting case to consider is assignment. Suppose the assignment expression is x1 := x2. There are two cases:

1. x1 ∈ U(lki): by the constraints added for the assignment statement (Rule 5 in Fig. 10) and the definition of Γ, it is ensured that the location of x1 is restored.

2. x1′ ∈ U(lki), where x1′ is an alias of x1: there are three subcases, depending on the way in which the alias is created:

a) The alias is created by expressions related to Rule 8(a) in Fig. 10: the Unify procedure ensures that in path(x1, x1′) = (p, i), p is not null and i is 0.

final.tex; 26/05/2005; 13:59; p.24

b) The alias is created by the presence of an even number of let expressions corresponding to Rules 6 and 8(b) in Fig. 10 (e.g., let x1 = !x2 … let x1′ = !x2 …): the constraints added ensure that in path(x1, x1′) = (p, i), p is not null and i is 0 (−1 + 1 − 1 + 1 ⋯, an even number of times).

c) The alias is created by interaction among expressions corresponding to Rules 5–8 in Fig. 10: this type of alias arises from expressions 5 and 6, 6 and 7, 5 and 8(b), and 6 and 8(c). As in the previous case, we can see that in path(x1, x1′) = (p, i), p is not null and i is 0.

In all these cases (individually or in combination), locs(Γ(l)) ⊇ locs(Live(l1)) ∩ locs(U(lki)) ∩ locs(M(l1)). Hence the theorem holds and the analysis is safe. ✷

6. Complexity

We now consider the asymptotic running time of our analysis. The cost of computing Γ is the sum of the costs of computing last, InitF, F, U, M, K, and A. The following table shows the time and space complexity of our analysis for building each of these maps, in terms of the number of program labels n.

        last   InitF  F      U      M      K      A      Γ
Time    O(n²)  O(n²)  O(n³)  O(n²)  O(n²)  O(n)   O(n²)  O(n³)
Space   O(n²)  O(n)   O(n²)  O(n²)  O(n²)  O(n²)  O(n²)  O(n²)

The constraints for last, InitF, U, M, and A yield a complexity measure of O(n²), assuming the flow map is already computed. The rule for restore in the definition of the last function is recursive. Because the maximum size of the flow set is bounded by the program size n, which gives a bound on the number of recursive calls, last takes O(n²) time. A similar argument gives InitF, U, and M a time complexity of O(n²). Careful observation of the algorithm for building the points-to map (Fig. 10) shows that the LetApply rule (8) is recursive; this is similar to the restore rule in the algorithm for building last. Again, the number of such calls is limited by the size of the flow set, O(n). Hence we get
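The O(n²) bound for the recursive maps above follows one pattern: each of the n labels triggers at most one traversal of a flow set whose size is itself bounded by n. A minimal sketch of that pattern, using a stand-in flow map rather than the paper's actual constraint rules (the function name `compute_last` and the flow representation here are illustrative assumptions):

```python
# Illustrative only: each of the n labels performs at most one pass over a
# flow set bounded by n, giving O(n^2) overall -- the same argument the
# paper applies to `last`, InitF, U, and M.

def compute_last(flow, labels):
    """For each label, collect flow-reachable labels with no successor."""
    last = {}
    for l in labels:                          # n labels
        seen, frontier = set(), {l}
        while frontier:                       # each label visited once: O(n)
            m = frontier.pop()
            if m in seen:
                continue
            seen.add(m)
            succs = flow.get(m, set())
            if not succs:
                last.setdefault(l, set()).add(m)  # no successor: m is "last"
            frontier |= succs
        last.setdefault(l, set())
    return last

flow = {1: {2}, 2: {3}, 3: set()}
print(compute_last(flow, [1, 2, 3]))  # {1: {3}, 2: {3}, 3: {3}}
```

The memo set `seen` is what caps each traversal at n steps; without it the recursion in the restore/LetApply rules could revisit labels and exceed the quadratic bound.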
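Returning to the safety property of Theorem 1, the intended behavior can be sketched operationally: a try-block snapshots only the locations named by Γ(l) and writes them back when a versioning exception propagates out of the block, so modifications made inside the block are invisible in its continuation. This is a minimal sketch under assumed names (`VersioningException`, `try_versioned`, a dict standing in for the store σ), not the paper's formal semantics:

```python
# Hypothetical sketch of versioning-exception semantics: snapshot the
# locations in Gamma(l) before the try-block, restore them if a
# versioning exception is raised in the block's dynamic context.

class VersioningException(Exception):
    pass

def try_versioned(store, gamma_l, body):
    """Run body(store); on a versioning exception, restore the saved
    locations so the block's writes are not visible afterwards."""
    snapshot = {loc: store[loc] for loc in gamma_l if loc in store}
    try:
        return body(store)
    except VersioningException:
        store.update(snapshot)     # undo writes to the saved locations
        return None

store = {"x": 0, "y": 1}

def body(s):
    s["x"] = 42                    # modification inside the try-block
    raise VersioningException()

try_versioned(store, {"x"}, body)
assert store["x"] == 0             # write undone in the continuation
```

Part 2 of the theorem corresponds to `gamma_l` covering every live, used, and modified location, so the snapshot never misses a write that the continuation could observe; part 1 corresponds to the snapshot only reading locations already in the store.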
