Application Layer Covert Channel Analysis and ... - Bill Buchanan
Z. Kwecka, BSc (Hons) Network Computing, 2006

4.4 Experiment Design

Experiments are the crucial part of this project. As mentioned previously, the HTTP protocol specification that we are going to investigate allows for large variation between actual implementations. Thus, we will try to identify the differences between those implementations. Hopefully the findings will allow us to build a base of Web browser signatures, i.e. a kind of fingerprint that could be used to precisely identify the User-Agent originating a request (a sketch of such a signature scheme is given at the end of this section). Later we shall experiment with reducing the amount of information sent with each request, so as to provide data for analysing which fields appear to be disused at present. After the analysis of the first set of experimental results, an implementation of a filtering proxy and covert channel scanner will be proposed. Eventually a prototype of covert channel detection and filtering software should be implemented. Thus, in order to test this software, some basic covert channel scenarios will be designed and later implemented.

4.4.1 Experiment 1 – Implementation Specific Data Gathering

In this experiment our objective is to collect HTTP requests generated by various Web browsers. We have identified four Web browsers as the most popular at the present time:

- Internet Explorer
- Firefox
- Opera
- Netscape

These are the browsers that web design companies consider when developing websites⁵, since they represent "99.9% of the Web browsers" currently in use. Thus we will need to install all these browsers on a single machine connected to the Internet and use them to access the same set of websites, while creating HTTP traffic dumps on this machine. Ideally the test procedure should be automated and should exclude the human factor, to ensure that the test conditions for each browser are the same. To achieve this, a piece of software allowing for timed process execution and termination should be developed. Another piece of software should be available to collect traffic dumps. After the data-gathering phase the packet dumps, which are in binary form, will need to be analysed, so a third piece of software will be required for this experiment. This software will iterate through the packet dumps and extract the necessary information.

⁵ Information sourced from the directors of the Edinburgh-based company Efero (www.efero.com).

The software required for this experiment (illustrative sketches of each tool are given at the end of this section):

- Browser Caller. An application triggering Web browsers to request websites from a predefined list.
- HTTP Dumper. A piece of software employing WinPcap to collect binary dumps of packets from HTTP conversations. Ideally only the packets containing the HTTP protocol envelope should be saved.
- OffLine HTTP Analyser. The purpose of this application would be data mining of the binary packet dumps in order to collect the experiment results.

4.4.2 Experiment 2 – Request Information Filtering

At this stage we assume that Experiment 1 will have resulted in defining a typical set of HTTP headers used by the most common implementations of HTTP client software, i.e. popular Web browsers. The objective of this experiment is to analyse the World […]
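The signature idea above can be made concrete. Browsers differ not only in the User-Agent value but also in which header fields they send and in the order they send them, so the ordered list of header names is itself a usable fingerprint. The following minimal sketch illustrates the matching step; the signature table is purely illustrative and is not data from the experiments.

```python
# Derive a browser "signature" from the ordered header-field names of an
# HTTP request.  A sketch of the fingerprinting idea only; the example
# signatures below are illustrative, not measured results.

def header_signature(raw_request: str) -> tuple:
    """Return the ordered tuple of header-field names in a request."""
    names = []
    for line in raw_request.split("\r\n")[1:]:   # skip the request line
        if not line:                             # blank line ends the headers
            break
        names.append(line.partition(":")[0].strip().lower())
    return tuple(names)

# Hypothetical signature base of the kind Experiment 1 would produce.
KNOWN_SIGNATURES = {
    ("host", "user-agent", "accept", "accept-language",
     "accept-encoding", "connection"): "Firefox (illustrative)",
    ("accept", "accept-language", "accept-encoding",
     "user-agent", "host", "connection"): "Internet Explorer (illustrative)",
}

request = ("GET / HTTP/1.1\r\n"
           "Host: example.com\r\n"
           "User-Agent: Mozilla/5.0\r\n"
           "Accept: */*\r\n"
           "Accept-Language: en\r\n"
           "Accept-Encoding: gzip\r\n"
           "Connection: keep-alive\r\n\r\n")

print(KNOWN_SIGNATURES.get(header_signature(request), "unknown implementation"))
```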
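Browser Caller. Its core requirement, timed process execution and termination, reduces to launching a browser with a target URL, waiting a fixed interval, and killing the process, which keeps the dwell time identical for every browser. A minimal sketch; the installation paths and URLs are placeholders, not part of the thesis.

```python
# Browser Caller sketch: drive each browser through the same URL list
# with an identical dwell time, then terminate it.  Paths and URLs are
# placeholders for whatever the test machine actually has installed.
import subprocess
import time

BROWSERS = [                      # hypothetical installation paths
    r"C:\Program Files\Internet Explorer\iexplore.exe",
    r"C:\Program Files\Mozilla Firefox\firefox.exe",
]
URLS = ["http://www.example.com/", "http://www.example.org/"]
DWELL_SECONDS = 15                # fixed time allowed per page load

for browser in BROWSERS:
    for url in URLS:
        proc = subprocess.Popen([browser, url])
        time.sleep(DWELL_SECONDS)  # identical dwell removes the human factor
        proc.terminate()           # timed termination, as the design requires
        proc.wait()
```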
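HTTP Dumper. The design names WinPcap, a C capture library; as a compact stand-in, the sketch below uses scapy, which drives WinPcap/libpcap underneath. Note that the BPF filter keeps all TCP port 80 traffic, which only approximates "packets containing the HTTP protocol envelope"; the offline analyser below discards the rest.

```python
# HTTP Dumper sketch using scapy as a stand-in for a WinPcap-based tool.
# Capturing packets requires administrator/root privileges.
from scapy.all import sniff, wrpcap

# Capture port-80 traffic for the duration of one test run.
packets = sniff(filter="tcp port 80", timeout=60)
wrpcap("http_dump.pcap", packets)   # binary dump for offline analysis
```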
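OffLine HTTP Analyser. A dump in classic libpcap format can be walked with the standard library alone: skip the 24-byte global header, read 16-byte record headers and the captured frames, peel off the Ethernet, IP and TCP headers, and keep payloads that begin with an HTTP method. A sketch, assuming Ethernet framing and the hypothetical "http_dump.pcap" produced above:

```python
# OffLine HTTP Analyser sketch: print the request line and header block
# of every HTTP request in a classic libpcap dump.  Assumes Ethernet
# framing; frames too short to carry Ethernet+IP+TCP are skipped.
import struct

METHODS = (b"GET ", b"POST ", b"HEAD ", b"OPTIONS ")

def http_requests(path):
    with open(path, "rb") as f:
        magic = struct.unpack("<I", f.read(24)[:4])[0]
        endian = "<" if magic == 0xA1B2C3D4 else ">"   # file byte order
        while True:
            record = f.read(16)
            if len(record) < 16:
                break
            _, _, incl_len, _ = struct.unpack(endian + "IIII", record)
            frame = f.read(incl_len)
            if len(frame) < 54:                   # Ethernet+IP+TCP minimum
                continue
            ip = frame[14:]                       # strip Ethernet header
            tcp = ip[(ip[0] & 0x0F) * 4:]         # strip IP header (IHL)
            payload = tcp[(tcp[12] >> 4) * 4:]    # strip TCP header
            if payload.startswith(METHODS):
                # the header block ends at the first blank line
                yield payload.split(b"\r\n\r\n", 1)[0].decode("latin-1")

for request in http_requests("http_dump.pcap"):
    print(request)
    print("-" * 40)
```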
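Although the page breaks off at the start of Experiment 2, the filtering operation promised in the introduction (stripping header fields that turn out to be disused) amounts to a whitelist pass over the header block, which is what a filtering proxy would apply to each outgoing request. A sketch; the whitelist below is an assumption for illustration, not a finding of Experiment 1.

```python
# Request-filtering sketch: keep only whitelisted header fields.  The
# whitelist is assumed for illustration; in the proposed design it would
# come from the header sets actually observed in Experiment 1.
ESSENTIAL = {"host", "user-agent", "accept", "connection"}

def filter_request(raw_request: str) -> str:
    request_line, _, rest = raw_request.partition("\r\n")
    headers, _, body = rest.partition("\r\n\r\n")
    kept = [h for h in headers.split("\r\n")
            if h.partition(":")[0].strip().lower() in ESSENTIAL]
    return request_line + "\r\n" + "\r\n".join(kept) + "\r\n\r\n" + body
```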
