NO SOFT TOUCH

How exactly can you best manage software verification, and what are the first steps towards safe and resilient systems? Dr João Ferreira, a computer scientist working in the School of Computing at Teesside University, offers his insights.

It is undeniable that modern society is very dependent on software. Governments and employers rely on databases that hold sensitive and personal information. Hospitals rely on machines and software that provide life support and complex semi-automated treatments, such as radiotherapy. Banks rely on software that manages money and on databases that store financial data. On a more individual level, most people cannot imagine life without their favourite mobile apps, email client or social media presence on sites such as Facebook, Twitter, Instagram, Pinterest or LinkedIn. Some of the statistics are impressive: in the UK alone, there are at least 38 million active social media users, approximately 58% of the entire population.

Software technology is popular and widespread because it improves our lives. But it has been recognised for many decades that software failures can cause serious problems. A well-known example is Therac-25, a radiation therapy machine that was involved in at least six accidents between 1985 and 1987, in which patients were given overdoses of radiation. Due to software errors, the machine gave patients radiation doses hundreds of times greater than intended, resulting in death or serious injury. Given the massive technological developments of the last few decades, one might expect that software failures like those that affected Therac-25 would no longer occur.
The reality is that, even though software engineers have learned substantially from past mistakes and software engineering practice is now more effective and reliable, serious problems caused by software failures still occur on a regular basis. Two illustrative examples reported by the media in the last year are the software problems that caused HSBC substantial financial losses and the software glitch that resulted in 3,200 US prisoners being released earlier than intended. These examples demonstrate clearly that the current state of software reliability can improve - and must do so, in line with recent developments.

TACKLING CYBER-ATTACKS

Today, software security - and, more generally, cybersecurity - is on everyone's lips. At the start of November, the UK government announced plans to spend £1.9 billion to retaliate against cyber-attacks (see separate feature in this issue), and to address ways to tackle cyber-scammers and defend businesses. From my perspective, many organisations miss a crucial point about cybersecurity.

The first step to a safe and resilient system is software analysis and verification. My view is that, whilst it is impossible to create a system that is absolutely secure, discussions around information security and risk management are often approached incorrectly, as they do not effectively address the science.

For the past ten years, I have researched mathematical approaches that can improve the current scientific standards in algorithm and software design. At Teesside University, we develop tools and methods that enable and support the construction of reliable, safe and secure software systems. We apply formal methods to the specification, development and verification of software. Because we use mathematically-based techniques, we are able to obtain strong, provable guarantees about systems.

PRECISION REASONING

One of the key ideas in our research is that, by treating computer programs as mathematical entities, we can reason about them very precisely. In some of our recent work, for example, we used a mathematical theory called separation logic to verify properties of FreeRTOS, one of the most popular operating systems for embedded devices. Verification tools based on separation logic enable reasoning about small, independent parts of the program memory, rather than having to consider its entire state. This makes the verification process manageable and scalable, so it can be applied to large software systems.

THE ABSENCE FACTOR

Despite the increasing popularity of verification tools that prove the absence of errors, most software companies still rely solely on software testing to determine whether their products are reliable. Testing is an important step in the development of software but, as the famous Dutch computer scientist Edsger Dijkstra once put it: "program testing can be used to show the presence of bugs, but never to show their absence".
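The local-reasoning idea behind separation logic, described under PRECISION REASONING above, can be illustrated with a deliberately tiny model. The sketch below is only a toy - real tools such as those used on FreeRTOS are vastly more sophisticated - but it captures the essential point: a command is verified against the few memory cells it actually touches (its "footprint"), and the frame rule then guarantees it behaves identically inside any larger, disjoint heap.

```python
# Toy model of separation logic's local reasoning (illustrative only;
# industrial verification tools are far more elaborate).
# A "heaplet" is a finite map from addresses to values.

def star(h1, h2):
    """Separating conjunction: combine two heaplets with disjoint footprints."""
    assert h1.keys().isdisjoint(h2.keys()), "heaplets must not overlap"
    return {**h1, **h2}

def swap(heap, a, b):
    """A command whose footprint is only the cells at addresses a and b."""
    heap[a], heap[b] = heap[b], heap[a]

# Local verification: swap is checked on just the two cells it touches...
local = {0: "x", 1: "y"}
swap(local, 0, 1)
assert local == {0: "y", 1: "x"}

# ...and the frame rule lets us conclude it behaves identically inside any
# larger heap whose extra cells (the "frame") are disjoint and untouched.
frame = {100: "unrelated", 101: "data"}
big = star({0: "x", 1: "y"}, frame)
swap(big, 0, 1)
assert big == star({0: "y", 1: "x"}, frame)
```

Because the reasoning never mentions the frame, the same small proof is reusable wherever the command appears - which is exactly what makes the approach scale to large systems.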
Indeed, software testing produces weaker guarantees than verification tools that mathematically prove the absence of errors. However, testing can be a cheaper process, is required by many industry standards and, when performed well, can greatly increase the quality of software.

An example is the test-generation software that we are developing at Teesside University as part of a two-year Knowledge Transfer Partnership (KTP), funded by Innovate UK, with Applied Integration, one of the UK's leading independent systems integrators specialising in complex safety-critical control systems. Applied Integration designs these systems for a wide range of customers across many industrial sectors, including automated systems for nuclear attack submarines, petrochemicals, and oil and gas.

Typically, for each project, engineers interpret briefs and capture requirements in individual ways, often producing very large documents manually. If a client's requirements have been misunderstood, all those documents then need to be retrospectively amended. As part of this KTP project, my colleagues and I are developing a new standard methodology for capturing client requirements for new control systems, which will allow more consistency across projects. Moreover, testing the systems developed is a very expensive process, often accounting for 60% of a project's total cost. The project will develop and implement mathematically-based techniques that allow automated generation of test sets directly from the captured user requirements, and automated execution of those tests.

ADVANCING THE CAUSE

This will give Applied Integration a unique advantage over global competitors. The company estimates that the positive commercial impact of the Knowledge Transfer Partnership will be worth more than £5 million over the next three years. This encompasses increases in turnover, new
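To make the idea of generating tests directly from captured requirements concrete, here is a minimal sketch. It is a simplified stand-in for the KTP tooling described above, not its actual design: the requirement, the threshold value and the valve function are all hypothetical, and the generation strategy shown is plain boundary-value analysis.

```python
# Illustrative sketch: deriving a test set automatically from a captured
# requirement (hypothetical example, not the actual KTP tool).
# Requirement: "the valve must open when pressure exceeds 80 bar".

def valve_should_open(pressure):
    # System behaviour under test (hypothetical control-system rule).
    return pressure > 80

def boundary_tests(threshold):
    """Generate boundary-value test cases directly from the requirement's threshold."""
    return [
        (threshold - 1, False),  # just below: valve stays closed
        (threshold,     False),  # at the boundary: still closed ("exceeds" is strict)
        (threshold + 1, True),   # just above: valve must open
    ]

def run_tests(system, cases):
    """Execute the generated test set; return the cases that fail."""
    return [(p, expected, system(p)) for p, expected in cases
            if system(p) != expected]

failures = run_tests(valve_should_open, boundary_tests(80))
assert failures == []  # every generated boundary test passes
```

The point is that once a requirement is captured in a structured, machine-readable form, both the test inputs and their expected outcomes can be produced mechanically - removing the manual effort that currently dominates testing costs.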
efficiencies in capturing user requirements and the savings associated with automated testing - an exciting innovation.

As a university, we are committed to improving the current state of software reliability by developing innovative research, transferring knowledge into industry and properly training the next generation of software engineers. We have extensive experience of working with companies, helping them with expert advice and with attracting external funding to solve their software challenges. We also plan to launch a cybersecurity clinic that will provide free cybersecurity services to the community. Services will include advice and security assessments on software safety and other security topics.

An effective software or cybersecurity strategy for businesses may seem simple on paper, but getting the right software verification systems in place in the first instance can save a lot of future headaches.

More information about Dr João Ferreira's work can be found at http://joaoff.com

Photo caption - Dr João Ferreira: the current status of software reliability must improve.

Computing Security, July/August 2017 | @CSMagAndAwards | www.computingsecurity.co.uk