# A Proof Calculus with Case Analysis as its Only Rule


*Ján Kľuka and Paul J. Voda*

Dept. of Applied Informatics, Faculty of Mathematics, Physics, and Informatics,
Comenius University, Mlynská dolina, 842 48 Bratislava, Slovakia
`{kluka,voda}@fmph.uniba.sk`

## 1 Introduction

In [KV09] we have presented a proof calculus for first-order logic based on a single rule of case analysis (cut). The calculus was to be used in our intelligent proof assistant (IPA) system called CL. The proofs were binary trees labeled with sentences. The idea was that D :≡ If(A, D₁, D₂) proves a sequent Γ ⇒ ∆ iff D₁ proves Γ, A ⇒ ∆ and D₂ proves Γ ⇒ ∆, A. Although the system looked like a Hilbert-style system (one rule, many axioms), it had the quite pleasing property of being able to directly simulate derivations in sequent calculi.

In the present self-contained paper we have made two substantial changes to this treatment. First, we have realized that proofs D can be read as the if-then-else formulas used in some declarative programming languages, with the meaning (A ∧ D₁) ∨ (¬A ∧ D₂). Instead of turning formulas into proofs (as is sometimes done in logic [Sch77,Ane90]), we have reversed the situation and constructed formulas as if they were case analysis proofs. We have a system with a single propositional proof rule:

> a path in an if-then-else formula leading to its ⊥ leaf is inconsistent with the formula.

This is extended to full quantification logic in a style somewhat reminiscent of Hilbert's ε-calculus. Instead of Skolem functions (a.k.a. ε-terms) we use Henkin witnessing constants.

The second change to [KV09] concerns definition hiding. Program verification involves a series of extensions of a language L by definitions of symbols F⃗, followed by the proof of a property A of F⃗. However, one would often prefer to prove A as a specification of F⃗, and hide the details of the definitions within the proof. Our proof system is explicitly designed to make this possible. At the end of the paper, we will put forth an argument in favor of definition hiding in pure mathematical logic.

If-then-else constructs are also used in binary decision diagrams (BDDs) to represent boolean functions. BDDs are intensively studied mainly in connection with the verification of VLSI circuits [Bry86]. As the experience with BDDs confirms, if-then-else formulas seem to capture the very essence of propositional logic, namely, that a formula is a tautology if no consistent path through it leads to ⊥.


- Software development is quite error-prone, and formal vehicles for program verification are sorely needed.
- Programs are constructed to accept inputs of arbitrary size, and thus the main formal verification vehicle must of necessity be induction. ATPs cannot generally find induction formulas automatically. Specialized SAT solvers used in hardware design typically deal with fixed sizes (i.e., the number of logical gates).
- Quantifiers are necessary in the statement of program properties. Determining the witnessing terms is best left to humans. Unification used in some ATPs degenerates to enumeration in the worst case.
- The construction of programs, which amounts to the definition of primitive recursive functions (or functionals) of rather low complexity (Grzegorczyk's ℰ⁴ is already unfeasible), is a difficult, mentally straining activity. The additional effort of verifying the programs usually more than doubles the level of difficulty of programming.
- For many so-called mission-critical programs, correctness is crucial, as human lives and/or expensive equipment depend on their correct functioning.
- The correctness proofs are formally verified properties of concrete functions; it is essential that the proofs are developed to details which can be machine checked.
- The proofs are constructed by humans in that they formulate lemmas, choose induction formulas, and guess witnessing terms.
- Although most IPAs work in sequent or tableau calculi with subformula properties, the use of cuts (i.e., modus ponens) is crucial. This is because the properties must be proved in a natural Hilbert style, just as is done in mathematics. This is to be contrasted with ATPs, where the formulas to be proved are preprocessed by skolemization and by reduction to clausal (resolution-based ATPs) or if-then-else (BDD-based ATPs) normal form.
- IPAs should automate the decidable (and boring) details of proofs, such as the use of propositional rules, handling of equality/orderings, expansion of definitions, limited automatic use of simple lemmas (universal Horn clauses), and limited guessing of witnesses.

IPAs work in various logics. CL and ACL2 [KMM02] work in first-order logic, essentially in primitive recursive arithmetic (PRA). Most IPAs work with functionals, and many use constructive logic such as type theories. We are concerned here with classical calculi, whether one-sorted or many-sorted (possibly with sorts for the functionals of finite types).

A typical piece of work in an IPA is shown in Fig. 1. A theory is extended by a definition of the symbol f, and a lemma (n) about it is proved. The lemma is then used in a proof of the theorem (m). The proof of (m) is itself done in the style of extensions. The theory is locally extended (definition hiding) with the symbol g, and a local lemma (i) about it is proved. Both lemmas are used to finish the proof of (m).

> **Definition:** ∀x ∀y (f(x) = y ↔ ···).
> **Lemma n:** ∀x (··· f ···) — *proof* ··· *q.e.d.*
> **Theorem m:** ∀x ∀y (··· f ···) — *proof* ···
>   **Definition:** ∀x ∀y (g(x) = y ↔ ···)
>   **Lemma i:** ∀x (··· g ··· f ···) — *proof* ··· *q.e.d.*
>   case analysis on ··· ∨ ···:
>   1. ··· use of Lemma n ···
>   2. use of Lemma i ··· *q.e.d.*
>
> **Fig. 1.** A schema of development of proofs by a proof assistant in extensions of theories.

## 3 Formulas

**3.1 Formulas and sentences.** We use the standard notion of countable languages for first-order logic (denoted by L) with predicate symbols (R, …) and function symbols (f, g, …). We denote symbols of any kind by F, G, and sets of symbols by 𝓕. We denote by L[𝓕] the language L extended with the function and predicate symbols 𝓕. We will abbreviate e₁, …, eₙ to e⃗ if n is known from the context or irrelevant.

Terms (t, s, …) are built up from variables (x, y, z, …) and applications of function symbols (including constants). Atomic formulas are applications R(s⃗) or equalities s = t. Formulas form the least class containing the atomic formulas, ⊤, and ⊥, and closed under existential quantification ∃x A and the if-then-else construct If(A, B, C) with A, B, C formulas. Prime formulas (designated by P) are atomic or existentially quantified formulas.

We will usually display If(A, B, C), whose intended interpretation is (A ∧ B) ∨ (¬A ∧ C), two-dimensionally by stacking A above the pair of branches B and C. We call A the *guard*.

We use the standard notions of free and bound variables. We use A[x/t] to designate the substitution of a closed term t for all free occurrences of the variable x in A. A sentence is a closed formula, i.e., a formula with no free variables.
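The grammar of Par. 3.1 and the substitution A[x/t] can be prototyped as a small datatype. The tuple encoding and helper names below are hypothetical choices of ours, covering only the constructors named above.

```python
# Our toy encoding of Par. 3.1: formulas are True/False for ⊤/⊥,
# ('atom', R, terms), ('eq', s, t), ('exists', x, A), and
# ('if', A, B, C) for If(A, B, C).  Terms are strings (variables or
# constants) or ('app', f, args).

def subst_term(t, x, s):
    """t[x/s] on terms."""
    if isinstance(t, str):
        return s if t == x else t
    _, f, args = t
    return ('app', f, [subst_term(a, x, s) for a in args])

def subst(a, x, s):
    """A[x/s]: substitute the closed term s for free occurrences of x."""
    if isinstance(a, bool):
        return a
    tag = a[0]
    if tag == 'atom':
        return ('atom', a[1], [subst_term(t, x, s) for t in a[2]])
    if tag == 'eq':
        return ('eq', subst_term(a[1], x, s), subst_term(a[2], x, s))
    if tag == 'exists':
        # x is bound here, so substitution stops under the binder
        return a if a[1] == x else ('exists', a[1], subst(a[2], x, s))
    _, g, b, c = a
    return ('if', subst(g, x, s), subst(b, x, s), subst(c, x, s))
```

Because only closed terms are substituted, no variable capture can occur, which keeps the sketch this short.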

**3.4 Finite sequences of sentences.** We will use a computer-science list notation for finite sequences of sentences. The empty sequence is 〈〉, and the prefixing of a sequence Π with a sentence A is designated by the pairing function 〈A | Π〉. Pairing is such that 〈〉 ≠ 〈A | Π〉. We use the abbreviations 〈A, B | Π〉 := 〈A | 〈B | Π〉〉 and 〈A₁, …, Aₙ〉 := 〈A₁, …, Aₙ | 〈〉〉. Concatenation ⊕ of sequences is defined by 〈〉 ⊕ Φ := Φ and 〈A | Π〉 ⊕ Φ := 〈A | Π ⊕ Φ〉. All declarative programming languages use lists in such a form. The presented notation is PROLOG's, but we have replaced its square brackets with angle ones.

The sequence Π′ *extends* Π if Π′ = Π ⊕ Φ for some Φ. We define

Set(Π) = {A | Π = Π₁ ⊕ 〈A | Π₂〉 for some Π₁, Π₂}.

We adopt the convention that an occurrence of a sequence Π in a position where a set is expected stands for Set(Π). Thus, for instance, A ∈ Π stands for A ∈ Set(Π), and T, Π ⊨ Π′ for T, Set(Π) ⊨ Set(Π′).

Note that Set(Π ⊕ Π′) = Set(Π) ∪ Set(Π′), and if Π′ extends Π then Π ⊆ Π′.

**3.5 Paths through sentences.** We define a partial function (A)_Π, selecting a subsentence of A at Π, to be defined only if it satisfies:

(A)_〈〉 :≡ A,  (P)_〈P〉 :≡ ⊤,  (P)_〈¬P〉 :≡ ⊥,
(If(A, B, C))_〈A | Π〉 :≡ (B)_Π,  (If(A, B, C))_〈¬A | Π〉 :≡ (C)_Π.

Note that we have (A)_(Π₁ ⊕ Π₂) ≡ ((A)_Π₁)_Π₂ if defined.

The set of all paths in A is defined as Ps(A) := {Π | (A)_Π is defined}. We say that Π is a *path through* A if Π selects ⊤ or ⊥ in A. We say that A has a *hole* at Π if Π selects ⊥ in A. A sentence A is *syntactically valid* if it has no holes, i.e., if all paths through A select ⊤.

We define a ternary partial replacement function A[Π := D] to yield sentences like A but with the subsentence selected by Π (if any) replaced by D. The function is defined only when it satisfies:

A[〈〉 := D] :≡ D,  P[Π := D] :≡ (If(P, ⊤, ⊥))[Π := D] for Π ≠ 〈〉,
(If(A, B, C))[〈A | Π〉 := D] :≡ If(A, B[Π := D], C),
(If(A, B, C))[〈¬A | Π〉 := D] :≡ If(A, B, C[Π := D]).

**3.6 Lemma.**
(i) For every sentence A ∈ L and a structure M for L, there is a unique path Π through A such that M ⊨ Π.
(ii) Π ⊨ A ↔ (A)_Π for every Π ∈ Ps(A).

(iii) If A is syntactically valid, then ⊨ A.
(iv) If T, Π ⊨ ⊥ for every path Π through D, then T, D ⊨ ⊥.

*Proof.* (i) By induction on A. If A is ⊤ or ⊥, the only path 〈〉 is trivially satisfied by M. If A is prime, then if M ⊨ A, the path 〈A〉 is the only one satisfied by M; if M ⊭ A, the path 〈¬A〉 is the only one satisfied by M. The case when A is an if-then-else sentence follows from the IH.

(ii) Observe first that if Π = 〈〉, then there is nothing to prove because (A)_〈〉 ≡ A. For Π ≠ 〈〉, A cannot be ⊤ or ⊥, and we continue by induction on A. If A is prime, then the conclusion follows if Π = 〈A〉, since (A)_Π ≡ ⊤. The case Π = 〈¬A〉 is similar because then (A)_Π ≡ ⊥. If A ≡ If(B, C, D), then we must have Π = 〈B | Π′〉 with Π′ ∈ Ps(C), or Π = 〈¬B | Π′〉 with Π′ ∈ Ps(D). In the former case, for any M ⊨ Π we have M ⊨ C ↔ (C)_Π′ from the IH. Since M ⊨ B, we have M ⊨ A ↔ C, and the conclusion follows because (A)_Π ≡ (C)_Π′. The other case is similar.

(iii) Take any M. By (i) there is a path Π through A such that M ⊨ Π. Since (A)_Π ≡ ⊤, we get M ⊨ A by (ii).

(iv) Suppose M ⊨ T, D for some M. By (i), M ⊨ Π for some path Π through D. This falsifies the assumption. ⊓⊔

## 4 Proofs

If a sentence A is valid, then all paths to a hole of A must be inconsistent. A proof of A will be a sentence D "plugging all holes in A". D will be obtained from A by replacing its holes with sentences which decidably cannot be satisfied.

**4.1 Regular sets of extension axioms.** Sentences of the form ∃x A → A[x/c], defining the constant c which does not occur in A, are *Henkin axioms*. Sentences of the form ∀x⃗ (R(x⃗) ↔ A), defining the predicate symbol R which does not occur in A, are *defining axioms* for R. Sentences of the form ∀x⃗ ∃!y A → ∀x⃗ ∀y (f(x⃗) = y ↔ A), defining the function symbol f which does not occur in A, are *defining axioms* for f. All three kinds of axioms are called *extension axioms*.

Let L be a language, and 𝓕 a countable set of symbols not in L. A set S of extension axioms in L[𝓕] is *regular* for 𝓕 if S can be enumerated without repetitions as S₁, …, S_|S| where each extension axiom Sᵢ defines a symbol Fᵢ ∈ 𝓕 not occurring in {S₁, …, Sᵢ₋₁}.

**4.2 Syntactic inconsistency.** Let T be a theory in L, and S a set of extension axioms in L[𝓕] regular for 𝓕. We designate by X(T, S) the theory in L[𝓕] consisting of the following sentences (written here in the linear If-notation):

(1) If(A[x/t], If(∃x A, ⊤, ⊥), ⊤),
(2) If(t = t, ⊤, ⊥),
(3) If(s = t, If(P[x/s], If(P[x/t], ⊤, ⊥), ⊤), ⊤) for atomic P,

of Henkin axioms is regular for C. The construction of the witnessing expansion is given, e.g., in [Bar77].

**5.2 Hintikka sets.** A theory H in L is *simply consistent* if for each sentence A ∈ H we have ¬A ∉ H. A simply consistent theory H is *maximal* if it has no proper simply consistent extension in L, i.e., if for all A ∈ L either A or ¬A is in H.

Let L[C] be the witnessing expansion of L. A theory H in L[C] is called a *Hintikka set* if the following conditions are satisfied:

(i) ⊤ ∈ H, and (t = t) ∈ H for all terms t in L[C].
(ii) If (s = t) ∈ H and P[x/s] ∈ H for atomic P, then P[x/t] ∈ H.
(iii) If If(A, B, C) ∈ H, then either A ∈ H and B ∈ H, or ¬A ∈ H and C ∈ H.
(iv) If ∃x A ∈ H, then A[x/c_{∃xA}] ∈ H.
(v) If ∀x A ∈ H, then A[x/t] ∈ H for all terms t in L[C].

**5.3 Lemma.** Every maximal simply consistent Hintikka set has a model.

*Proof.* For a similar construction see, e.g., [Bar77,Sho67]. ⊓⊔

**5.4 Brute-force proof construction.** Let A be a sentence and T a theory, both in L, and let He be as in Par. 5.1. To any pair of sentences D, B in L[C] we assign a sentence BF(D, B) which extends D with the sentence If(B, ⊥, ⊥) at its syntactically consistent holes:

BF(D, B) :≡ D[Π₁ := E₁] ··· [Πₙ := Eₙ]

where Π₁, …, Πₙ (n ≥ 0) are the paths to all holes in D, and for j = 1, …, n we set Eⱼ :≡ ⊤ if Πⱼ is syntactically inconsistent with T and He, and Eⱼ :≡ If(B, ⊥, ⊥) otherwise.

For an enumeration B₀, B₁, … of all sentences in L[C] we define a sequence of sentences D₀, D₁, … by D₀ :≡ A and Dᵢ₊₁ :≡ BF(Dᵢ, Bᵢ).

**5.5 Theorem (Completeness).** If T ⊨_L A, then T ⊢_L A.

*Proof.* Assume T ⊨_L A, construct the sequence D₀, D₁, … as in Par. 5.4, and let Π₁, …, Πₙ be the paths leading to all holes in A. A straightforward induction on i proves:

> For each i > 0 and each path Π through Dᵢ extending some Πⱼ, we have (Dᵢ)_Π ≡ ⊤ iff Π is syntactically inconsistent with T and He.

We consider two cases. If Dᵢ ≡ Dᵢ₊₁ for some i, then Dᵢ has no holes, and so the sentences (Dᵢ)_Π₁, …, (Dᵢ)_Πₙ are syntactically valid. Since, by the above property, each (Dᵢ)_Πⱼ plugs the hole in A at Πⱼ w.r.t. T and He, Dᵢ is a proof witnessing T ⊢_L A.
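The BF step of Par. 5.4 can be sketched for the propositional fragment. In this toy version (our own encoding: `True`/`False` for ⊤/⊥, `('if', A, B, C)` for If(A, B, C)), "syntactically inconsistent" is approximated by "the path contains some sentence together with its negation"; the paper's actual test is relative to T and the Henkin axioms He.

```python
# A much-simplified BF(D, B): walk D, and at every ⊥ leaf (hole) plug in
# ⊤ when the path to it is inconsistent, and If(B, ⊥, ⊥) otherwise.

def neg(a):
    """¬A as a path element, with double negation cancelled."""
    return a[1] if isinstance(a, tuple) and a[0] == 'not' else ('not', a)

def inconsistent(path):
    """Toy syntactic inconsistency: the path contains both A and ¬A."""
    return any(neg(s) in path for s in path)

def bf(d, b, path=()):
    if d is False:                     # a hole of D
        return True if inconsistent(path) else ('if', b, False, False)
    if d is True or not (isinstance(d, tuple) and d[0] == 'if'):
        return d                       # ⊤ leaves and primes are unchanged
    _, g, x, y = d
    return ('if', g, bf(x, b, path + (g,)),
                     bf(y, b, path + (neg(g),)))
```

Iterating `bf` over an enumeration of cut sentences mirrors the sequence D₀, D₁, … of the construction: each step either closes a hole outright or splits it by a cut on the next sentence.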

If all Dᵢ are different, we define the sequence 𝒯₀, 𝒯₁, … by

𝒯₀ := {Π | A has a hole at Π′ and Π′ extends Π},
𝒯ᵢ₊₁ := {Π | Dᵢ has a hole at Π} ∪ 𝒯ᵢ,

and let 𝒯 := ⋃ᵢ 𝒯ᵢ. The set 𝒯 is infinite and such that if Π′ ∈ 𝒯 and Π′ extends Π, then Π ∈ 𝒯. This makes 𝒯 a (set-theoretical) tree. It is finitely branching, because every Π ∈ 𝒯 has at most two immediate successors (extensions) in 𝒯. By König's lemma, 𝒯 has an infinite branch Φ₀ ⊆ Φ₁ ⊆ Φ₂ ⊆ ··· ⊆ 𝒯. From the construction of 𝒯, all Φᵢ are syntactically consistent (with T and He). We set H := ⋃ᵢ Φᵢ with the intention of showing that H is a maximal simply consistent Hintikka set such that T ⊆ H. By Lemma 5.3, there will then be a structure M ⊨ H. Since Φⱼ = Πₖ for some j and k, we will get M ⊭ A from Lemma 3.6(ii). The retraction M′ of M to L will contradict the assumption T ⊨_L A.

We first show that H is maximal in L[C] by taking any B ∈ L[C]. It is some Bᵢ in the enumeration of L[C], and thus Φⱼ₊₁ = Φⱼ ⊕ 〈B〉 or Φⱼ₊₁ = Φⱼ ⊕ 〈¬B〉 for some j ≥ i. Hence one of B, ¬B is in H.

We will need two auxiliary properties of H:

(†) No Π ⊆ H is a path to a hole in a sentence (1)–(5) of Par. 4.2 or in an A′ ∈ H.
(∗) For every sentence B ∈ L[C] there is a path Π through B such that Π ⊆ H.

(†) Assume Π ⊆ H. If Π is a path to a hole in a sentence (1)–(5), then Π ⊆ Φⱼ for some j, and Φⱼ is syntactically inconsistent. If Π is a path to a hole in A′, then A′, Π ⊆ Φⱼ for some j, and Φⱼ is syntactically inconsistent.

(∗) We proceed by induction on B. If B is ⊤ or ⊥, then set Π := 〈〉 ⊆ H. If B is prime, then by maximality B or ¬B is in H, and so set Π := 〈B〉 ⊆ H in the first case, and Π := 〈¬B〉 ⊆ H in the second. If B ≡ If(C, D, E), then C or ¬C is in H by maximality. In the first case, set Π := 〈C | Π′〉, where Π′ ⊆ H is obtained from the IH as a path through D. The second case is similar.

We will now show that H is simply consistent. Assume that both B and ¬B are in H. By (∗), there is a path Π through B such that Π ⊆ H. Since the same Π is also a path through ¬B, Π leads to a hole of exactly one of B and ¬B. This contradicts (†), where A′ is B or ¬B.

We will now show that T is a subset of H. Take any B ∈ T. 〈¬B〉 ⊆ H would violate (†)(5). Hence B ∈ H by the maximality of H.

It remains to show that H is a Hintikka set:

(i) Since 〈〉 ⊆ H and it is a path to the hole of ⊥, ⊥ ∈ H would violate (†) with A′ ≡ ⊥. Hence ⊤ ∈ H by the maximality of H. We cannot have 〈t ≠ t〉 ⊆ H by (†)(2). Hence (t = t) ∈ H by the maximality of H.

(ii) If (s = t), P[x/s] ∈ H then, since 〈s = t, P[x/s], ¬P[x/t]〉 ⊆ H would violate (†)(3), we must have P[x/t] ∈ H by the maximality of H.

(iii) If A′ :≡ If(B, C, D) ∈ H, then there is a path Π ⊆ H through A′ by (∗). By (†), Π cannot lead to a hole of A′. Exactly one of B, ¬B is in H. If the former, then Π = 〈B | Π′〉 with Π′ a path through C. Π′ cannot be a path to a hole of C, because then A′ would have a hole at Π. Hence Π′ is a path to a hole

of ¬C. By (†), we must have ¬C ∉ H. Hence C ∈ H by the maximality of H. The case when ¬B ∈ H, which implies D ∈ H, is similar.

(iv) If ∃x B ∈ H then, since 〈∃x B, ¬B[x/c_{∃xB}]〉 ⊆ H would violate (†)(4), we must have B[x/c_{∃xB}] ∈ H by the maximality of H.

(v) For any t, if ∀x B ≡ ¬∃x ¬B ∈ H then, since 〈¬B[x/t], ¬∃x ¬B〉 ⊆ H would violate (†)(1), we must have B[x/t] ∈ H by the maximality of H. ⊓⊔

## 6 Methods of Proof Construction

We have introduced a new proof calculus and demonstrated that its concept of provability coincides, as it should, with logical consequence. It remains to convince the reader that the calculus offers advantages over the standard proof systems. In this section we will show that

- we can construct proofs incrementally by applying local proof rules,
- the rules are flexible enough to emulate Hilbert as well as Gentzen styles of proof,
- we can, because of definition hiding, transform proofs with a flexibility not afforded by the last two types of calculi.

**6.1 Derivation rules.** For the duration of this section we fix a (possibly empty) theory T in some L. A *derivation rule* (R) derives from a finite (possibly empty) set Γ of sentences in L, called the *premises*, a sentence D in L[F⃗], called the *conclusion*. The rule is *sound* if the following is constructively proved:

> for all paths Π such that Γ ⊆ Π, and every A in L[F⃗] with a hole at Π and such that T ⊢_{L[F⃗]} A[Π := D], we have T ⊢_L A.

For the soundness of the rule (R) it suffices that D is such that every path Π′ selecting ⊤ in D (if any) leads to inconsistency: T, S, Γ, Π′ ⊢_{L[F⃗]} ⊥ for some S. This is an easy consequence of the definition of proofs. The simplest proof of the last condition is ⊤ when Π ⊕ Π′ is syntactically inconsistent with T and S.

**6.2 The case analysis (cut) method.** The case analysis (cut) method is characterized by two rules:

(Close): from the premises Γ conclude ⊤, provided Γ is syntactically inconsistent with T and some S;
(Cut): with no premises, conclude If(A, ⊥, ⊥) for any sentence A.

Soundness of the two rules follows from the sufficient condition in Par. 6.1.
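The two rules can be acted out in the same toy tuple encoding (`True`/`False` for ⊤/⊥, `('if', A, B, C)` for If(A, B, C)); this is our own sketch, in which syntactic inconsistency of a path is approximated by "the path contains some sentence together with its negation", whereas the paper's test is relative to T and a regular set S of extension axioms.

```python
def neg(a):
    """¬A as a path element, with double negation cancelled."""
    return a[1] if isinstance(a, tuple) and a[0] == 'not' else ('not', a)

def cut(b):
    """(Cut): plug a hole with If(B, ⊥, ⊥), splitting it into two holes."""
    return ('if', b, False, False)

def close(path):
    """(Close): plug a hole with ⊤ — legal only on an inconsistent path."""
    assert any(neg(s) in path for s in path), 'path must be inconsistent'
    return True
```

For example, the sentence If(P, If(P, ⊤, ⊥), ⊤) (an if-then-else rendering of P → P) has its single hole at the path 〈P, ¬P〉; that path is inconsistent, so (Close) plugs it directly, and If(P, If(P, ⊤, ⊤), ⊤) is a proof of it.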

**6.3 The unboxing method.** The unboxing method takes sentences out of the guards of if-then-else connectives, possibly also dropping quantifiers, and plugs holes with them. Its rules are (Close) (see Par. 6.2), the rule (Unbox) "deeply" unboxing a guard, and five "shallowly" unboxing rules: (Th) for axioms in T, the rules (L∃), (L∀) for quantifiers, and the two rules (Id), (Subst) for equality:

(Unbox): from the premise A conclude ¬A;
(Th): with no premises, conclude If(A, ⊥, ⊤), if A ∈ T;
(L∀): from the premise ∀x A conclude If(A[x/t], ⊥, ⊤);
(L∃): from the premise ∃x A conclude If(A[x/c], ⊥, ⊤), if c ∉ L;
(Id): with no premises, conclude If(t = t, ⊥, ⊤);
(Subst): from the premises s = t, P[x/s] conclude If(P[x/t], ⊥, ⊤), if P is atomic.

The rules are sound because they are reducible to those of the cut method: we can, namely, construct all sentences in the conclusions by one or more cuts. What must be justified is the correctness of the applications of the (Close) rule in the conclusions of the rules.

So, let Π be the path to the hole we are trying to plug with one of the listed rules. For (Unbox), let Π′ be a path selecting any ⊤ in its conclusion ¬A. The path Π ⊕ Π′ is syntactically inconsistent on account of A ∈ Π and Π′ being a path to a hole of A. We show only the soundness of (L∀); the remaining rules are dealt with similarly. The path to the ⊤ in the conclusion of (L∀) is Π ⊕ 〈¬A[x/t]〉. It is inconsistent by containing a path to the hole of (1) of Par. 4.2, because ∀x A ≡ ¬∃x ¬A ∈ Π.

Henkin axioms in the set S of extension axioms can be constructed "on the fly" with every application of (L∃). The same is achieved in the usual calculi by eigenvariable conditions.

**6.4 The sequent method.** We will now show that our calculus can emulate proofs in Gentzen's sequent calculi. For that we need two rules:

(Ax): from the premises P, ¬P conclude ⊤, for atomic P;
(If): from the premise If(A, B, C) conclude If(A, If(B, ⊥, ⊤), If(C, ⊥, ⊤)).

The closure in (Ax) is justified by 〈¬P〉 being the path to the hole of P. With the same premise as in (If), we would get by (Unbox) the conclusion If(A, ¬B, ¬C), where the sentences ¬B and ¬C are unboxed. The rule (If) gives equivalent, but boxed, sentences. The two occurrences of ⊤ in its conclusion are not directly reducible

to (Close). However, one can easily see that the sentence If(A, B^⊤, ⊤), where B^⊤ is just like B with all holes replaced by ⊤, witnesses If(A, B, C), A, ¬B ⊢_L ⊥. There is a similar witness of If(A, B, C), ¬A, ¬C ⊢_L ⊥. By Par. 6.1 this is sufficient for the soundness of the rule.

The sequent rules are (Ax), (If), (Th), (L∀), (L∃), (Id), (Subst).

We now present a translation of a proof D of T ⊢ A, with all inferences in D by sequent rules, to a one-sided sequent calculus with sequents Γ ⇒. Such sequents are dual to the well-known Schütte–Tait one-sided sequents ⇒ ∆ of the calculus called GS in [TS00]. Assume that a sequent rule with premises Γ and conclusion C has been applied at a path Π ∈ Ps(D) leading through a hole of A. We translate it to the sequent calculus inference deriving Γ, Π ⇒ from the premises Π₁, Γ, Π ⇒, …, Πₙ, Γ, Π ⇒, where Π₁, …, Πₙ are all the paths to holes in C. For instance, the sequent calculus inferences corresponding to (Ax), (If), and (L∀) are, respectively: (Ax) concludes P, ¬P, Π ⇒ from no premises; (If) concludes If(A, B, C), Π ⇒ from the premises A, B, Π ⇒ and ¬A, C, Π ⇒; and (L∀) concludes ∀x A, Π ⇒ from the premise A[x/t], Π ⇒.

**6.5 Considerations of completeness.** In this section we have concentrated on the soundness of derivation rules. We know that the cut method is also complete, because we have proved the completeness theorem 5.5 with its two rules. Incidentally, we have proved more than completeness: we have also semantically demonstrated the conservativity of extensions by definitions, because we never had to use extension axioms other than Henkin's. It can be shown that the unboxing and sequent methods are also complete in this sense.

When it comes to proving sentences A in L[F⃗] by definition hiding, where the extension axioms defining the symbols F⃗ are local to the proofs, we cannot hope for completeness. This is because A can express a property of F⃗ not satisfiable by any extensions by definitions.

**6.6 Inversion.** The rule (L∀) in the one-sided sequent calculus is not invertible. This means that, given a proof D of the conclusion, we cannot in general eliminate all its uses with terms t₁, …, tₙ in D and obtain a proof of A[x/t₁], …, A[x/tₙ], Γ ⇒. This is because the terms tᵢ may contain eigenvariables introduced by (L∃) in D.

In our calculus we have weakened the eigenvariable conditions to the requirement of regularity of extension axioms. This allows arbitrary permutations, where we can, for instance, replace If(A, D₁, If(B, D₂, D₃)) anywhere in D by If(B, If(A, D₁, D₂), If(A, D₁, D₃)). Thus, given a proof of ∃x A, we can permute all applications of (L∀) to ∀x ¬A in the proof to obtain a proof of ⋁ᵢ A[x/tᵢ]. Note that this is an example of
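The permutation used above can at least be sanity-checked semantically by brute force over the truth-table meaning If(A, B, C) ≡ (A ∧ B) ∨ (¬A ∧ C); this check is our own illustration and is, of course, no substitute for the syntactic argument about proofs.

```python
# Verify that If(A, D1, If(B, D2, D3)) and If(B, If(A, D1, D2),
# If(A, D1, D3)) agree under every boolean assignment to the guards
# A, B and the subsentences D1, D2, D3.

from itertools import product

def ite(a, b, c):
    """The meaning of If(A, B, C): (A and B) or (not A and C)."""
    return (a and b) or (not a and c)

def permutation_is_sound():
    for a, b, d1, d2, d3 in product([False, True], repeat=5):
        lhs = ite(a, d1, ite(b, d2, d3))
        rhs = ite(b, ite(a, d1, d2), ite(a, d1, d3))
        if lhs != rhs:
            return False
    return True
```

The check passes because both sides reduce to d1 when A holds, and to the case split of B between d2 and d3 when it does not.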

definition hiding, where we assert a property of the Henkin constants occurring in the terms tᵢ without exposing their defining axioms.

**6.7 Future work.** We are preparing a paper in which we will demonstrate the usefulness to pure logic of our calculus, with its emphasis on paths through formulas and definition hiding. The paper will be on the still difficult, and not yet completely satisfactorily solved, Π⁰₂-analysis of some theories T. The analysis tries to characterize the witnessing functions f⃗ from a given proof of a sentence ∀x⃗ ∃y⃗ R(x⃗, y⃗) with R quantifier-free. The functions satisfy ∀x⃗ R(x⃗, f⃗(x⃗)). This is equivalent to determining the ordinal bounds on such proofs.

The prepared paper will deal with the predicate calculus (T = ∅) and with Peano Arithmetic (PA). In both cases, the Henkin witnessing constants introduced from the descendants of cut formulas will be replaced by explicit quantifier-free definitions of witnessing functions. In the case of PA, also the induction axioms with quantified formulas will be replaced by quantifier-free definitions of µ-functions yielding the least witnesses.

## References

[Ane90] Anellis, I.H.: From Semantic Tableaux to Smullyan Trees: A History of the Development of the Falsifiability Tree Method. Modern Logic 1(1):36–69, 1990.
[Bar77] Barwise, J.: First-Order Logic. In Barwise, J., ed.: Handbook of Mathematical Logic. North-Holland, 1977.
[BC04] Bertot, Y., Castéran, P.: Interactive Theorem Proving and Program Development. Coq'Art: The Calculus of Inductive Constructions. Texts in Theoretical Computer Science. An EATCS Series. Springer, 2004.
[Bry86] Bryant, R.E.: Graph-Based Algorithms for Boolean Function Manipulation. IEEE Transactions on Computers C-35(8):677–691, 1986.
[CL97] Clausal Language, http://ii.fmph.uniba.sk/cl/
[Gou94] Goubault, J.: Proving with BDDs and Control of Information. In Proc. 12th Intl. Conf. on Automated Deduction (CADE'94), pp. 499–513, Springer, 1994.
[GT03] Groote, J.F., Tveretina, O.: Binary Decision Diagrams for First-Order Predicate Logic. J. Logic and Algebraic Programming 57(1–2):1–22, 2003.
[KMM02] Kaufmann, M., Manolios, P., Moore, J.S.: Computer-Aided Reasoning: An Approach. Kluwer Academic Publishers, 2000.
[KV09] Kľuka, J., Voda, P.J.: A Simple and Practical Valuation Tree Calculus. Computability in Europe (CiE) 2009. http://dai.fmph.uniba.sk/~voda/ext-proof.pdf
[NPW02] Nipkow, T., Paulson, L.C., Wenzel, M.: Isabelle/HOL — A Proof Assistant for Higher-Order Logic. LNCS 2283, Springer, 2002.
[PS95] Posegga, J., Schmitt, P.H.: Automated Deduction with Shannon Graphs. J. Logic and Computation 5(6):697–729, 1995.
[Sch77] Schütte, K.: Proof Theory. Springer, 1977.
[Sho67] Shoenfield, J.R.: Mathematical Logic. Addison-Wesley, 1967.
[TS00] Troelstra, A.S., Schwichtenberg, H.: Basic Proof Theory. Second edition. Cambridge University Press, 2000.
