
Proceedings of the 16th Colloquium for Information Systems Security Education, Orlando, FL, June 11–13, 2012

Formal Verification for Mission Assurance in Cyberspace: Education, Tools, and Results

Shiu-Kai Chin♦♥, Erich Devendorf♥, Sarah Muccio♣, Susan Older♦♥, and James Royer♦

Abstract—Mission assurance is the assurance of the correctness, integrity, security, and availability of critical capabilities necessary to complete a mission successfully. National security depends on the integrity of command and control for military systems, the power grid, and financial systems. Thus, the alarming lack of personnel capable of doing mathematically rigorous specification, design, verification, testing, and procurement of trustworthy systems is a national weakness with profound implications for national security. This paper reports the results of a pilot program at the undergraduate level whose objectives include equipping undergraduate computer engineers and computer scientists with the theory, methods, and tools necessary for formal specification and verification of mission-essential functions in cyberspace.

Index Terms—Cyber security, formal verification, access control, theorem proving, undergraduate education

I. INTRODUCTION

"No operator should ever have to ask . . . 'Will my weapon work?' . . . Cyberspace warfare creates just this possibility." – Gen John Shaud, USAF

The United States' electrical power grid, financial services, and other critical infrastructure are inextricably intertwined with cyber operations that depend on computer-enabled command and control systems.
The threat to national security posed by compromise of these systems' integrity is well publicized. Furthermore, the desirability and strategic importance of rigorous, certifiable, reusable, and replicable assurance using formal verification is widely acknowledged. Despite these facts, new systems are often designed by the newest engineers, for understandable economic reasons. Without a solid educational foundation that supports skills in rigorous specification, design, and verification, there is little hope for improved assurance of system integrity and security. This capability is generally believed to be beyond the grasp of practitioners and of computer engineering and science undergraduates.

Our results disprove this belief. Our conclusions are based on a ten-year partnership and experimentation to devise the theory, practice, techniques, tools, and means for technology transfer (i.e., education and courses at the undergraduate level) necessary for assuring the integrity and security of critical systems. Our efforts are focused on the next generation of leaders in cyberspace: undergraduate computer engineers, computer scientists, and Reserve Officer Training Corps (ROTC) cadets. Our conclusions are based on the achievements of more than 265 students from more than 50 US universities who have gone through our programs. This paper gives an overview of the theory, application, tools, and courses we use to equip future leaders to think rigorously about mission assurance in cyberspace.

The rest of this paper is organized as follows. Section II presents the context of our work in terms of our educational, research, and internship programs.

♦ Electrical Engineering & Computer Science, Syracuse University
♣ Air Force Research Laboratory Information Directorate, Rome, NY
♥ Serco Inc., Rome, NY

Distribution Statement A – Approved for Public Release – Distribution Unlimited. Document #88ABW-2012-1812, dated 29 MAR 2012.
Section III describes our methods in terms of conceptual roots, objectives, and tools. We present our results in Section IV and our lessons learned in Section V. Our conclusions are in Section VI.

II. CONTEXT: ACE-CS, INTERNSHIPS, AND THE CYBER ENGINEERING SEMESTER

Our results are based on a ten-year educational partnership among academia, industry, and government. Government, industrial, and academic researchers work together to educate and mentor students to solve real-world problems. Team members do not have fixed roles: government and industrial staff at times teach theoretical courses, while academic faculty often teach tools applied to specific problems. In all cases, a deliberate attempt is made to integrate theory with practice to solve real problems.

A. ACE-CS: Cyber Security Boot Camp

This partnership began with the 2003–2010 Air Force Research Laboratory (AFRL) Advanced Course in Engineering (ACE-CS) Cyber Security Boot Camp [1], [2], a ten-week summer program based on the tenets of the General Electric Advanced Course in Engineering (GE-ACE) [3].

The GE-ACE and ACE-CS programs emphasize mathematically rigorous solutions to real engineering problems. Both build a culture that values teamwork, rigorous analysis, and on-time performance. Programs like GE-ACE and ACE-CS naturally lend themselves to building a cadre of professionals with shared values. People who successfully complete ACE-CS have immediate credibility with one another and with the senior officers who actively seek out ACE-CS graduates.

ISBN 1-933510-95-1/$15.00 © 2012 CISSE

Over the last ten years, the AFRL ACE-CS program has


graduated 226 ROTC cadets and civilians from over 40 US universities. Many ACE-CS graduates are currently on active duty and distinguishing themselves by their capabilities and achievements.

B. Information Assurance Internships

In 2011, ACE-CS successfully transitioned to the Air Force Institute of Technology (AFIT-ACE). This transfer enabled the AFRL ACE-CS team to launch the AFRL Information Assurance (IA) Internship Program in the summer of 2011. When compared to ACE-CS, the IA Internship Program has a greater emphasis on research and problem solving throughout the summer. Whereas ACE-CS followed the GE-ACE model of a new problem each week, the IA Internship Program elected to focus on an overall research problem and develop the rigorous theoretical and practical knowledge needed to solve it. The Summer 2011 research problem focused on the science of mission assurance in a cloud-computing environment, with an emphasis on Air Force mission-essential functions in a contested environment.

The 2011 IA Internship had 13 students. The 2012 IA Internship will have a total of 25 students from universities across the US, including Massachusetts Institute of Technology, University of Illinois at Urbana-Champaign, Rochester Institute of Technology, Michigan Technological University, University of Texas at El Paso, Southern University, Missouri University of Science and Technology, Bethel College, Rutgers University, University of Maryland, University of Virginia, Kennesaw State University, Iowa State University, Syracuse University, Rose-Hulman Institute of Technology, University of Delaware, Clarkson University, Texas A&M University, Ithaca College, University of Dayton, Rensselaer Polytechnic Institute, University of Montana, University of Michigan, and University of Pittsburgh.

C. The Cyber Engineering Semester

The same core team that planned and executed ACE-CS and the IA Internship Program also planned and executed the Cyber Engineering Semester (CES). By cyber engineering, we mean the computer engineering and computer science necessary for engineering trustworthy systems operating in contested environments. The pilot program was executed successfully in the Fall 2011 semester with six undergraduates (five juniors and one senior) from Michigan Technological University, Syracuse University, and Texas A&M University. Three students were Air Force ROTC cadets, and the rest were civilian students.

Figure 1 shows the Fall 2011 Cyber Engineering schedule. The 18-credit pilot program consisted of five courses and a paid internship at the Air Force Research Lab (AFRL) Information Directorate. Each course is briefly described in Figure 2. As in the ACE-CS and IA Internship programs, the CES by design keeps the students together as a cohort. The faculty

Fig. 1: Cyber Engineering schedule for Fall 2011 semester (weekly grid: Secure Hardware and Secure OS labs in the mornings; Secure Architecture, Engineering Assurance Lab, and Cyber Engineering Seminar sessions in the afternoons; paid internship at the Air Force Research Lab on Fridays)

1) Cyber Engineering Seminar (3 credit hours): Exploration of the operational art of cyber operations at the strategic, operational, and tactical levels through critical thinking exercises, problem solving, writing, and presentation
2) Engineering Assurance Lab (4 credit hours): Introduction to the theory, practice, and tools for building and verifying systems
3) Design of Secure Operating Systems (4 credit hours): Design and implementation of modern secure operating systems.
Authentication, authorization, and access control
4) Secure Computer Architecture (4 credit hours): Introduction to fundamentals of computer architecture with emphasis on security measures in hardware implementation of processor units
5) Secure Hardware Engineering Lab (3 credit hours): Design and implementation of hardware to preserve both the confidentiality and integrity of data, authenticate keys, people, and privileges, and protect physical memory

Fig. 2: Cyber Engineering Semester course descriptions

and staff meet regularly to "connect the dots" for the students (i.e., enable the students to see how theory, concepts, and tools could be integrated to solve real-world problems seen in class projects and the AFRL internship). The same ACE-CS cultural values are inculcated: teamwork, rigorous analysis, and on-time performance. Alumni from ACE-CS, the IA Internship Program, and the CES are part of a growing cohort of leaders in cyberspace.

All involved describe the CES as "intense." Because of space limitations, in this paper we focus on the formal reasoning and verification elements of the CES, which are perhaps the most novel aspects of the program. Teaching security and formal verification to undergraduates using theorem provers is rare, if not considered virtually impossible by most academics.

III. METHODS: OBJECTIVES, CENTRAL CONCEPT, AND POWER TOOLS

The three programs—ACE-CS, the IA Internship Program, and the CES—have their programmatic and philosophical roots firmly planted in the GE-ACE. Founded in 1923, the GE-ACE was created to address a particular need as described by Francis Pratt, then GE vice president of engineering [3]: "[A] noticeable number of our most accomplished theoretical engineers and laboratorians have either pursued post-graduate studies at European universities or else have had all of their scholastic training abroad, and the time is approaching when their successors will have to be found".
Simply put, GE in 1923 was facing a critical shortage of the engineering leaders necessary for its future survival.

Robert E. Doherty, who later went on to be president of Carnegie Institute of Technology, founded the (still extant) GE-ACE to solve Pratt's problem. His solution was the three-year


in-house GE-ACE program, designed to develop the following attributes [3]:
1) the ability to identify and solve real engineering problems,
2) concern for the readers of engineering reports,
3) generalists with competence in a wide variety of engineering fields,
4) an understanding of the use and misuse of mathematical analysis and other ways of solving engineering problems, and
5) the realization that the engineer's primary purpose is not mathematical virtuosity, but the improvement of methods and products.

Eighty-nine years later, these attributes remain central to the IA Internship and CES programs. Although the IA Internship and CES programs are less than three years in duration, what truly distinguishes them from the GE-ACE is the fact that they are operated jointly by government, industry, and academia. This partnership affords a degree of mutual support and agility that programs operated solely by government, industry, or academia cannot match.

A. Objectives

Figure 3 lists the educational outcomes for the Cyber Engineering Seminar and the Engineering Assurance Laboratory.

Fig. 3: Educational outcomes listed in increasing level of sophistication (Comprehension, Application, Analysis, Synthesis, Evaluation)

Cyber Engineering Seminar:
- Realize the operational context of cyber engineering
- Describe the steps to access local and remote systems
- Explain security as part of software development
- Describe the information life cycle
- Circumvent OS authentication mechanisms to gain access
- Harden BIOS, boot loaders, OS, and applications
- Eliminate code-level I/O-derived vulnerabilities
- When given a mission, determine the requirements for mission assurance
- Apply knowledge of past revolutions in military affairs to current situations
- Derive actionable intelligence on a given target from open source resources
- Alter OS and application software to maintain access and obscure activities
- Create operational deceive, disrupt, degrade, deny, and destroy effects
- Assess software applications for input-derived vulnerabilities

Engineering Assurance Lab:
- Explain the meaning and discuss the interpretation of given mathematical or logical expressions
- Restate in the Haskell functional language or the Higher Order Logic (HOL) theorem prover a given mathematical or logical definition or property
- Use Haskell to create an executable specification
- Use HOL to restate a specification in higher-order logic
- Restate the syntax and operational semantics of a language in Haskell and HOL
- Use HOL to define concepts of operations (CONOPS), access-control descriptions, or policies
- Use the rules of structural operational semantics (SOS) or the access-control logic (ACL) to prove properties in SOS or ACL
- Use QuickCheck or HOL to check or verify properties
- Construct in Haskell or HOL appropriate data types, predicates, functions, semantic rules, and proofs for specifications, definitions, or properties
- Devise specialized inference in HOL for reasoning about access control, specific data structures, and concepts of operation
- Assess the correctness of proofs using the rules of SOS, the access-control logic, or HOL

The Cyber Engineering Seminar course was taught by AFRL staff and industry staff. The overall objectives of the seminar course are similar to the GE-ACE: rigorous solution of real engineering problems in a professional environment. Students are required to do a formal mathematical analysis, write a full engineering report, and make professional presentations defending their solutions.

The Engineering Assurance Laboratory (EAL) was taught by university faculty and focused on formal-verification tools. This course is unusual at the undergraduate level due to its use of functional programming (Haskell) and theorem provers (HOL).
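HOL descends from Milner's LCF line of theorem provers: a theorem is an abstract data type, and the only way to construct one is through functions implementing the primitive inference rules, so every theorem is correct by construction. A minimal Haskell sketch of that idea follows; the types and names are illustrative only, not HOL's actual kernel (which also tracks hypotheses and a richer term language).

```haskell
-- A tiny formula language: propositional variables and implication.
data Form = Var String
          | Imp Form Form          -- a ==> b
          deriving (Eq, Show)

-- In a real LCF-style kernel this module would export the Theorem type
-- but NOT its constructor, making the rule functions the only way to
-- produce theorems.
newtype Theorem = Theorem Form deriving (Eq, Show)

-- Read off the conclusion of a theorem.
concl :: Theorem -> Form
concl (Theorem f) = f

-- ASSUME: in real HOL this would also record f as a hypothesis.
assume :: Form -> Theorem
assume = Theorem

-- Modus ponens: from |- a ==> b and |- a, derive |- b.
modusPonens :: Theorem -> Theorem -> Maybe Theorem
modusPonens (Theorem (Imp a b)) (Theorem a')
  | a == a' = Just (Theorem b)
modusPonens _ _ = Nothing
```

Because the rules are ordinary functions, students with functional-programming experience can read a prover's inference rules the same way they read any other library code.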
Functional programming (which emphasizes the mathematical relationships between functions' inputs and outputs) and theorem provers (which check the correctness of proofs) both serve the same end: rigorous mathematical assurance of correctness. The importance of computer-assisted reasoning tools in the context of mission assurance and security cannot be overstated: the computer assistance is necessary both for verification and for the credible dissemination of results. The situation is analogous to hardware design, which today depends on computer-aided design (CAD) tools such as design-rule checkers, logic-to-layout verifiers, and model checkers.

The EAL exists to support the problem solving required in the Cyber Engineering Seminar and the AFRL internship. The first half of the EAL introduces the basics of Haskell programming and HOL theorem proving, along with structural operational semantics. The combination of functional programming and HOL theorem proving is of great practical value, because HOL (like many other theorem provers) is a collection of functional programs whose inference rules are functions that return objects of type theorem. In the second half of the EAL, homework problems are directly related to the types of problems seen in the Seminar and the Internship.

B. A Command-and-Control Calculus

We view the integrity of command and control to be of paramount importance for mission assurance. In the same way that propositional logic is the basis of digital design for computer hardware engineers, a calculus for command and control is central to designing and verifying the logical consistency of mission command and control, protocols, and policies for security and integrity.

The command-and-control calculus we use is the access-control logic described in the textbook Access Control, Security, and Trust: A Logical Approach [4]. This calculus is


Proceedings of the 16 th Colloquium for Information Systems Security EducationOrlando, FL June 11-13, 2012ControlsϕSaysP says ϕP controls ϕdef = (P says ϕ) ⊃ ϕP reps Q on ϕ def = P | Q says ϕ ⊃ Q says ϕP controls ϕRepsϕP says ϕDerived Speaks ForP ⇒ QQ controls ϕ P reps Q on ϕ P | Q says ϕϕP says ϕQ says ϕRelationshipFormulaKey associated with AliceK a ⇒ AliceBob has jurisdiction (controls or is believed)over statement ϕBob controls ϕAlice and Bob together say ϕ(Alice & Bob) says ϕAlice quotes Bob as saying ϕ(Alice | Bob) says ϕBob is Alice’s delegate on statement ϕ Bob reps Alice on ϕCarol is authorized in Role on statement ϕ Carol reps Role on ϕCarol acting in Role makes statement ϕ (Carol | Role) says ϕFig. 6: Relationships and their description in the calculusP & Q says ϕ& Says (1)P says ϕ ∧ Q says ϕP | Q says ϕQuoting (1)P says Q says ϕ& Says (2)Quoting (2)P says ϕ ∧ Q says ϕP & Q says ϕP says Q says ϕP | Q says ϕ1. CA controls (K a ⇒ Alice) Trust in CA’s jurisdiction over keys2. K ca ⇒ CA Trust assumption associating K ca with CA3. K ca says (K a ⇒ Alice) Public key certificate4. CA says (K a ⇒ Alice) 2, 3 Derived Speaks For5. K a ⇒ Alice 1, 4 ControlsFig. 7: Proof for associating a public key with a principalIdempotency of ⇒ P ⇒ PMonotonicity of |P ′ ⇒ PQ ′ ⇒ QP ′ | Q ′ ⇒ P | QFig. 4: Sample inference rules of the access-control logictrust assumptionscredentialsjurisdictionauthoritycommandsPoliciesguardyes or no?protectedresource orcapabilityFig. 5: Basis for determinning whether to grant access requestbased on an access-control and authentication logic originallydefined by Lampson, Abadi, Burrows, Wobber, and Plotkin[5], [6]. The inference rules of the command-and-controlcalculus (some of which appear in Figure 4) are guaranteedto be logically sound with respect to their (Kripke) semantics.The central question addressed by the calculus is illustrated inFigure 5. 
Given a request (or command) to access a protected resource or capability, how do we logically derive whether or not to grant that request based upon specific policies, trust assumptions, credentials presented, and assumptions about authorities and their jurisdiction? This question is analogous to what computer hardware engineers ask when they use logic to derive the output values of computer hardware.

The command-and-control calculus can be applied at all levels of abstraction, from the control of physical memory and network protocols, up to and including the security and integrity policies of organizations. Using the same command-and-control calculus across multiple levels of abstraction enables consistent interpretations and implementations of policies.

A calculus is useful to the extent that it describes useful concepts and relationships. Figure 6 shows how some important relationships are represented in the logic. To illustrate how the calculus works, we show how to derive conclusions using the relationships in Figure 6 and the inference rules in Figure 4. As a simple example, suppose that Alice's public key is K_a and that this association is certified by certificate authority CA, whose key is K_ca. This certification is a public-key certificate signed by K_ca. Further suppose that Bob trusts or recognizes CA's authority over public-key certificates and that Bob believes K_ca is CA's public key. The proof in Figure 7 provides a formal justification for Bob to conclude that K_a is Alice's key (lines 1 through 3 indicate Bob's starting assumptions, and everything afterward follows as the result of inference rules from Figure 4).

As another example, suppose that Alice is Blue Force Commander (BFC) and in that role has the authority to issue the go command to start a mission. Suppose further that Bob is in Alice's chain of command and that he receives Alice's go command encrypted with key K_a.
The proof in Figure 8 provides a formal justification of Bob's conclusion that the encrypted order he receives is legitimate and actionable.

The proofs in Figures 7 and 8 give rise to the two new derived inference rules shown in Figure 9, which are guaranteed to be sound by construction. The utility of the derived inference

Fig. 8: Proof of legitimacy of a mission order

  1.  BFC controls ⟨go⟩             Trust in BFC's authority
  2.  Alice reps BFC on ⟨go⟩        Trust assumption that Alice is BFC
  3.  CA controls (K_a ⇒ Alice)     Trust in CA's jurisdiction over keys
  4.  K_ca ⇒ CA                     Trust assumption associating K_ca with CA
  5.  K_ca says (K_a ⇒ Alice)       Public-key certificate
  6.  (K_a | BFC) says ⟨go⟩         Alice's order as BFC, encrypted using K_a
  7.  K_a ⇒ Alice                   3, 4, 5 DR1
  8.  BFC ⇒ BFC                     Idempotency of ⇒
  9.  K_a | BFC ⇒ Alice | BFC       7, 8 Monotonicity of |
  10. Alice | BFC says ⟨go⟩         9, 6 Derived Speaks For
  11. ⟨go⟩                          1, 2, 10 Reps

Fig. 9: Two derived inference rules

  DR1: from  CA controls (K_a ⇒ Alice),  K_ca ⇒ CA,  and  K_ca says (K_a ⇒ Alice)
       infer  K_a ⇒ Alice

  DR2: from  BFC controls ⟨go⟩,  Alice reps BFC on ⟨go⟩,  CA controls (K_a ⇒ Alice),
       K_ca ⇒ CA,  K_ca says (K_a ⇒ Alice),  and  (K_a | BFC) says ⟨go⟩
       infer  ⟨go⟩
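Derived rules of this kind can themselves be rendered as functional programs, which is how students meet them in the EAL. The sketch below is a small deep embedding of part of the calculus in Haskell, with DR1 obtained by composing the Derived Speaks For and Controls rules exactly as in the Figure 7 proof. All data types and function names here are our own illustrations, not the course's actual HOL theory.

```haskell
-- Principals: atomic names, conjunction (P & Q), and quoting (P | Q).
data Principal = PName String
               | Principal :& Principal
               | Principal :| Principal
  deriving (Eq, Show)

-- Formulas of the access-control logic (a fragment).
data Form = Says Principal Form
          | SpeaksFor Principal Principal   -- P => Q
          | Controls Principal Form
          | Prop String
  deriving (Eq, Show)

-- Inference rules as partial functions from premises to a conclusion,
-- mirroring the view of rules as functions returning theorems.
derivedSpeaksFor :: Form -> Form -> Maybe Form
derivedSpeaksFor (SpeaksFor p q) (Says p' phi)
  | p == p' = Just (Says q phi)
derivedSpeaksFor _ _ = Nothing

controlsRule :: Form -> Form -> Maybe Form
controlsRule (Controls p phi) (Says p' phi')
  | p == p' && phi == phi' = Just phi
controlsRule _ _ = Nothing

-- DR1 from Figure 9, composed from the two primitive rules.
dr1 :: Form -> Form -> Form -> Maybe Form
dr1 trustCA keyCert cert = do
  caSays <- derivedSpeaksFor keyCert cert   -- CA says (K_a => Alice)
  controlsRule trustCA caSays               -- K_a => Alice
```

A rule application that does not match its premises simply returns Nothing, so an invalid "proof" cannot manufacture a conclusion.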


Fig. 10: Sample proof in HOL of the derived inference rule DR1

- val a1 = ACL_ASSUM ``(Name CA) controls ((Name Ka) speaks_for (Name Alice))``;
> val a1 = [.] |- (M,Oi,Os) sat Name CA controls Name Ka speaks_for Name Alice : thm
- val a2 = ACL_ASSUM ``(Name Kca) speaks_for (Name CA)``;
> val a2 = [.] |- (M,Oi,Os) sat Name Kca speaks_for Name CA : thm
- val a3 = ACL_ASSUM ``(Name Kca) says ((Name Ka) speaks_for (Name Alice))``;
> val a3 = [.] |- (M,Oi,Os) sat Name Kca says Name Ka speaks_for Name Alice : thm
- val th1 = SPEAKS_FOR a2 a3;
> val th1 = [..] |- (M,Oi,Os) sat Name CA says Name Ka speaks_for Name Alice : thm
- val th2 = CONTROLS a1 th1;
> val th2 = [...] |- (M,Oi,Os) sat Name Ka speaks_for Name Alice : thm

Fig. 12: Certificate-authority hierarchy and flow of command and control
  (a) Certificate-authority hierarchy: the Joint Forces Certificate Authority (JFCA) sits above the Blue Forces Certificate Authority (BFCA) and the Gold Forces Certificate Authority (GFCA).
  (b) Flow of command and control: the Blue and Gold Forces Commanders issue go/nogo orders to the Blue and Gold Forces Operators, who issue launch/abort orders to the weapon.

Fig. 13: High-level view of launch and abort CONOPS
  (a) Launch CONOPS: given "BFC says go" and "GFC says go", the Blue and Gold Forces Operators apply their jurisdiction statements, policy statements, and trust assumptions to conclude "BFO says launch" and "GFO says launch"; the weapon, given both launch orders together with its policy statements and trust assumptions, concludes launch.
  (b) Abort CONOPS: given "BFC says nogo" or "GFC says nogo", the corresponding operator concludes "BFO says abort" or "GFO says abort"; the weapon, given either abort order, concludes abort.

3) Weapons launch and abort policy (see Figure 16)
4) Role assignments and authorizations (see Figure 17)

Students defined and verified weapons launch and abort CONOPS using the access-control logic in HOL. They proved the validity of their authentication CONOPS based on role authorization and key assignments in HOL. Specifically, they had to devise the jurisdiction statements, policy statements, and trust assumptions; express them using the HOL implementation; and prove the validity of inference rules corresponding to weapons launch or abort in HOL.
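The command-and-control requirements of Figure 16 also make a natural executable specification of the kind students wrote in Haskell: launch requires concurring orders from both operators, while abort requires only one. The following sketch uses our own illustrative types and names, not the students' actual CONOPS; as one possible reading of the policy, an abort order here takes precedence over launch orders.

```haskell
-- Blue and Gold Forces Operators, and the orders they may issue.
data Operator = BFO | GFO deriving (Eq, Show)
data Order    = Launch | Abort deriving (Eq, Show)

-- Figure 16 policy: abort on either operator's order (abort wins),
-- launch only when both operators have ordered launch.
weaponAction :: [(Operator, Order)] -> Maybe Order
weaponAction cmds
  | any ((== Abort) . snd) cmds                            = Just Abort
  | (BFO, Launch) `elem` cmds && (GFO, Launch) `elem` cmds = Just Launch
  | otherwise                                              = Nothing
```

A QuickCheck property such as "weaponAction never yields Launch unless both operators ordered launch" can then exercise the specification mechanically, mirroring the EAL's pairing of executable specifications with QuickCheck and HOL.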
For this problem, students refined, described, and verified their CONOPS at three levels of abstraction: (1) the highest level, with only roles; (2) a middle level, with staff assigned to roles; and (3) the lowest level, with cryptographic keys assigned to mission staff.

The final problem assigned to EAL students required them to do one more refinement: define the semantics of specific message and certificate formats using the access-control logic in HOL. This enabled them to describe and verify the behavior of their designs using concrete message and certificate formats.

Fig. 14: Certificate authorities, keys, and authentication means

  Certificate Authority   Associated Key   How Authenticated
  JFCA                    K_JFCA           Pre-distributed to all mission principals
  Blue Forces CA          K_BFCA           Authenticated by JFCA
  Gold Forces CA          K_GFCA           Authenticated by JFCA

Fig. 15: Mission roles and jurisdiction

  Role                     Controls       How Authenticated
  Blue Forces Commander    go/nogo        Pre-distributed to all Blue Forces mission principals
  Gold Forces Commander    go/nogo        Pre-distributed to all Gold Forces mission principals
  Blue Forces Operator     launch/abort   Blue Forces Commander
  Gold Forces Operator     launch/abort   Gold Forces Commander

B. Proofs

The students' HOL proofs used the specialized inference rules we created for them as part of our HOL implementation of the access-control logic. We supplied 36 HOL inference rules that corresponded to the inference rules in the access-control logic textbook [4]. All of the proofs done for the more complex


problems resembled the example shown in Figure 10. The proof support we supplied was intended to mimic what they would do in their pencil-and-paper proofs.

Fig. 16: Weapons launch and abort policy

  Action           Command and Control Requirements
  Weapons Launch   Launch ordered by both Blue and Gold Operators
  Weapons Abort    Abort ordered by either Blue or Gold Operator

Fig. 17: Role assignments and authorizations

  Person         How Authenticated                  Formal Description of Delegation of Authority
  Alice as BFC   Pre-distributed prior to mission   Alice reps BFC on ϕ;  Alice reps BFC on (Carol reps BFO on ϕ)
  Bob as GFC     Pre-distributed prior to mission   Bob reps GFC on ϕ;  Bob reps GFC on (Dan reps GFO on ϕ)
  Carol as BFO   By Alice as BFC                    Carol reps BFO on ϕ
  Dan as GFO     By Bob as GFC                      Dan reps GFO on ϕ

One group did a complete analysis of the secure cloud protocols for their internship using the HOL implementation of the access-control logic. They did this on their own initiative, without any additional help from the faculty teaching the Engineering Assurance Laboratory.

C. Reports

Reports were an important deliverable for all projects: no one believes an analysis unless it is presented in a compelling fashion.

We required students to use LaTeX and Beamer for all reports and presentations in both the Cyber Engineering Seminar and the Engineering Assurance Laboratory. One primary motivation for this requirement was to make use of HOL's automatic generation of LaTeX macros that typeset all formulas of a theory: datatypes, definitions, and theorems. These macros mean that HOL users do not have to manually enter the formulas in their documentation, thereby eliminating a major source of error.
The benefit to readers and users of engineering reports is the high degree of assurance and confidence that what is documented is accurate and free from error. Given the large number of formulas to be documented and typeset, we found that the benefits of using LaTeX and Beamer far outweighed the costs associated with teaching the students these tools. We provided students with style files and sample report/presentation templates. The reports and presentations were professional in appearance. The accuracy of the HOL documentation was assured by the use of HOL's LaTeX macros for pretty-printing HOL theories.

V. PLANS: LESSONS LEARNED AND NEXT STEPS

As an educational experiment, the goal of the 2011 Cyber Engineering Semester pilot program was to see if we could get undergraduates to rigorously assure the security and integrity of computer systems they devised at the level of hardware, architecture, operating systems, networks, protocols, and applications. Simply put, would they be able to rigorously comprehend, analyze, and synthesize the concepts put forth in Saltzer and Schroeder's classic paper, "The Protection of Information in Computer Systems" [8]?

As instructors, we had three major questions: (1) Were we asking too much? (2) Would the design of our five courses in fact reinforce security and integrity concepts so that students would be able to "connect the dots"? (3) Would students actually be able to do formal proofs of correctness using tools (e.g., theorem provers) generally thought to be beyond the grasp of undergraduates? The answers, in short, are: (1) almost, (2) mostly, and (3) yes.

A. Lessons Learned

We learned multiple lessons that will inform our next steps.
1) The CES courses and internship were very intense, due in large part to the time demands of class meetings, internship responsibilities, and homework. Because our students had excellent skills (all had GPAs of 3.3 and above), they coped.
However, the small size of the pilot program also allowed us to make some minor mid-semester adjustments to help alleviate the time pressure. Our challenge and goal is to convey the same capabilities next time with less intensity.
2) All five courses utilized the access-control logic or its concepts. The benefit was a common way of thinking about security and integrity across the five courses. Mutually reinforcing assignments, when we were able to synchronize them, worked.
3) Introducing HOL theorem proving worked because students had enough functional-programming expertise to comprehend, analyze, and synthesize HOL inference rules as functional programs. Restricting students' attention primarily to the 36 access-control logic inference rules enabled the students to do meaningful HOL proofs. The value to the students was the assurance they achieved when they proved their CONOPS were valid.

B. Next Steps

Based on our experience, we have a much better idea of what undergraduates are able to accomplish and how we might simplify our program for 2012. Specifically, we can accurately plan the timing of our assignments to smooth out the workload of students. We plan on having the hardware, OS, and architecture projects more closely synchronized to avoid duplication and promote greater reinforcement of common ideas. In terms of formal verification and functional programming, we now know how much detail is needed in our explanations and exercises. This will enable us to move more quickly with greater coverage of concepts.


All the partners were pleased with the outcome of the pilot. We are planning on testing our latest ideas in the Summer 2012 Information Assurance Internship in preparation for the Fall 2012 CES. Finally, we are planning a four-year cyber engineering program with an eye toward making the assurance of security and integrity in cyber systems a routine part of computer engineering and computer science at the undergraduate level.

VI. CONCLUSIONS

". . . officers can never act with confidence until they are masters in their profession . . ." – Col. Harry Knox, 27 September 1776, on establishing the need for artillery schools in the US

Colonel Knox's observations in 1776 apply today: rigorous education is essential, because there is no substitute for knowing what you are doing. Just as artillery officers need to know the mathematics of ballistic trajectories, doctrine, and solutions to operational and tactical problems, cyberspace leaders need to know the underlying mathematics of cyberspace, be able to solve operational and tactical problems, and develop effective doctrine.

Our goal—from the start of the 2003 Advanced Course in Engineering Cyber Security Boot Camp, through the Cyber Engineering Semester, and onwards to a four-year Cyber Engineering Program—is to meld systems engineering thinking with leadership thinking. Cyberspace leaders who can "do the math" as systems engineers and military leaders are capable of reshaping cyberspace to their advantage.

Our thinking on this subject has evolved and expanded beyond what we initially thought possible. We have needed the ten years to get as far as we have, and our efforts have benefitted greatly from the ability to try out our ideas on students from a wide variety of institutions. When we began in 2003, we thought that the notion of having rising juniors use a command-and-control calculus with a Kripke semantics was highly questionable.
In 2005, we discovered that eight hours of instruction could equip students to use the calculus correctly; however, such limited instruction time was insufficient for students to carry on by themselves without structured supervision. By the 2011 IA Internships, the instruction had grown to ten weeks and included functional programming and just a hint of theorem proving. The Cyber Engineering Semester provided sufficient evidence that mathematically rigorous assurance is feasible in a way that is relevant beyond one course.

Two professors who can do a proof is not a critical capability. In contrast, two thousand engineers and officers capable of mathematical analysis and leadership in support of designing, verifying, procuring, and operating systems is a critical capability. This vision is shared by an academic, industry, and governmental partnership with a ten-year history of joint operations. By working together to realize this vision, we have gained a level of trust and conceptual unity that enables us to rapidly design and deploy research and educational programs, integrated with internships and focused on real-world problems.

We are working hard to meet the goal of two thousand engineers and officers with the capabilities described here. We hope to expand this partnership to other universities, corporations, and government agencies so that together we can meet the critical national need for capable leadership in cyberspace.

REFERENCES

[1] Dan Carnevale, "Basic training for anti-hackers: An intensive summer program drills students on cyber-security skills," The Chronicle of Higher Education, September 23, 2005.
[2] Kamal Jabbour and Susan Older, "The advanced course in engineering on cyber security: A learning community for developing cyber-security leaders," in Proceedings of the Sixth Workshop on Education in Computer Security, July 2004.
[3] American Society for Engineering Education, The General Electric Advanced Course in Engineering, Colorado State University, Ft. Collins, CO 80521, June 16–19, 1975.
[4] Shiu-Kai Chin and Susan Older, Access Control, Security, and Trust: A Logical Approach, CRC Press, 2011.
[5] Butler Lampson, Martin Abadi, Michael Burrows, and Edward Wobber, "Authentication in distributed systems: Theory and practice," ACM Transactions on Computer Systems, vol. 10, no. 4, pp. 265–310, November 1992.
[6] Martin Abadi, Michael Burrows, Butler Lampson, and Gordon Plotkin, "A calculus for access control in distributed systems," ACM Transactions on Programming Languages and Systems, vol. 15, no. 4, pp. 706–734, September 1993.
[7] M.J.C. Gordon and T.F. Melham, Introduction to HOL: A Theorem Proving Environment for Higher Order Logic, Cambridge University Press, New York, 1993.
[8] Jerome Saltzer and Michael Schroeder, "The protection of information in computer systems," Proceedings of the IEEE, 1975.
