KEYNOTE TALK AVerify: AN OPEN-SOURCE ANTI-VIRUS - Eicar
<strong>AVerify</strong><br />
Towards verifiable anti-virus testing<br />
2010-05-10 EICAR 2010
What’s this presentation about?<br />
• Mostly AV testing, of course<br />
• Just my personal point of view (not my employer’s!)<br />
• The reasons behind <strong>AVerify</strong>, and the project goals<br />
A philosophical note<br />
• The bad guys are helping each other<br />
– sharing/selling techniques, code, …<br />
• Yet the AV ecosystem is fragmented and highly competitive<br />
– Non-disclosure of information/samples/… gives a (temporary) edge<br />
– Cooperation exists only on a small scale<br />
A philosophical note<br />
• Information sharing is a good thing, in general<br />
– Security through obscurity never works<br />
• See: Mifare (RFID), A5/1 (GSM), …<br />
– I started with ProView, thanks to McAfee :)<br />
• Case in point: WEP, 2004<br />
– Wireless security only started improving after attack tools became readily available<br />
– Studying attack techniques should be encouraged<br />
(not my fault)<br />
• AV comparisons can be found everywhere<br />
• A few problems:<br />
– Does it still make sense to test with 16-bit DOS infector samples?<br />
• Nope. But look at that shiny 99.8% detection rate!<br />
– Results cannot be reproduced, no clear methodology given. “Independent”?<br />
– Is scanning 10,000+ files a realistic usage scenario?<br />
– Ad-hoc, black-box scoring<br />
• “My AV must be better than yours, it’s got a 4-star GOLD award!”<br />
Let’s search “antivirus reviews” on Google. The winner is…<br />
A sound methodology!<br />
• Apparently based on checking boxes:<br />
The “Race To Zero” Challenge (2008)<br />
• Defcon 16, Las Vegas, August 2008<br />
– Aim: modify known samples to bypass signature-based detection<br />
– Best time: nine samples obfuscated in 3 hours<br />
• Lots of media coverage<br />
• Precise technical details not shared with the public<br />
The “Race To Zero” Challenge (2008)<br />
• Signature-based detection will fail eventually<br />
– Cf. Fred Cohen<br />
• What good is this challenge?<br />
– Precise results and samples not available<br />
– Not a set of single tests, more of an ad-hoc method (let’s nop/pack/obfuscate that sample until it evades the AV)<br />
– Cannot be reproduced exactly, obfuscation techniques not shared<br />
• Not worthless, but not earth-shattering either<br />
The firewall “leak tests”<br />
• Guillaume Kaddouch, 2007<br />
• Single tests, mostly network evasion & keyloggers<br />
– Disclosed exploit techniques & executables<br />
– Published the results for each test<br />
• Initially, most vendors didn’t pass<br />
– But they improved their products<br />
• Sadly, the test suite is no longer maintained<br />
– And source code is not accessible<br />
“A study of anti-virus’ response to unknown threats” (EICAR, 2009)<br />
• N. Richaud and myself<br />
• Twelve anti-virus products tested<br />
• 21 single tests, oriented towards “proactive” (HIPS-like) detection<br />
• Tests run by hand in Dec. 2008<br />
– Some tests did require admin rights, but not all<br />
• Results published at EICAR 2009<br />
• Low-level access (EICAR 2009, 2008 product versions)<br />
– Tests: MBR (modified bootroot), \Device\PhysicalMemory<br />
– Detections marked only for Avira and Kaspersky; the other ten products (avast!, AVG, BitDefender, ESET, F-Secure, McAfee, Norton, Panda, Sophos, TrendMicro) show none<br />
• Keyloggers (EICAR 2009, 2008 product versions)<br />
– Tests: WH_KEYBOARD_LL, GetRawInputData<br />
– Detections marked only for Kaspersky and TrendMicro; the other ten products show none<br />
• Code injection (EICAR 2009, 2008 product versions)<br />
– Tests: CreateRemoteThread, SetWindowsHookEx, QueueUserAPC<br />
– Detections marked only for Avira, Kaspersky and TrendMicro (twice); the other nine products show none<br />
• Lessons learned (EICAR 2009):<br />
– 12 AV × 21 tests = a full week of work (non-stop, except the coffee breaks ;)<br />
– Coding is fun, testing is very repetitive & boring<br />
– A partial view, limited to HIPS-based detection<br />
– Tests were run on a specific configuration; Windows Vista was ignored<br />
• No real winner(s); even basic techniques were barely detected<br />
The iAWACS 2009 Challenge<br />
• Goal: to disable several anti-virus programs without the user noticing<br />
• Windows XP SP3 with an admin account<br />
• Results:<br />
– Almost all AV could be disabled thanks to ring0 access<br />
– Took a few minutes to a few hours (tops)<br />
The iAWACS 2009 Challenge<br />
• Lessons learned:<br />
– Disabling is easy with admin rights ;)<br />
• Wait, we knew that already.<br />
• But should it be?<br />
• HIPS-like detection leads to the problem of false positives…<br />
– Many classic techniques still not blocked<br />
• Including access to the PhysicalMemory device<br />
• First published in Phrack, 2002<br />
– Once again, there was no clear winner or loser<br />
The iAWACS 2010 Challenge<br />
• Different rules:<br />
– Windows 7, user account (no admin rights at all)<br />
– Every possible attack was considered fair game<br />
– Time limit: four hours<br />
• Seven attacks proposed:<br />
– Many denial-of-service ones<br />
• One was .bat-powered ;)<br />
– Ransomware<br />
The iAWACS 2010 Challenge<br />
• Judging the attacks, a challenge in itself<br />
– Some were quite sophisticated<br />
– Some very basic but still effective<br />
• Reiterates that signature-based detection is not a silver bullet<br />
• Although there were a few good surprises:<br />
– Example: KAV warned when a non-signed program was launched<br />
– Creation of a key in CurrentVersion\Run detected by two other anti-viruses<br />
The Guillermito case<br />
• 2001, Tegam vs. Guillermito (a French hacker)<br />
– Tegam claimed to protect against all known and unknown viruses<br />
– Guillermito then disclosed several flaws in Tegam’s product, ViGuard<br />
• Sentenced (2006) to pay 15,000 euros<br />
– Reverse-engineering cited as a reason<br />
• Tegam then went bankrupt…<br />
• Preventing reverse-engineering only helps the bad guys!<br />
Some difficulties of testing<br />
• Reverse-engineering: to be avoided?<br />
– Or just claim to have been very lucky ;)<br />
– Reading the fine print: tedious but required<br />
• Evaluation versions can be hard to find<br />
– Some AV companies make it easy, others not<br />
– Could be considered a single test in itself<br />
• Finding new threats is a challenge in itself<br />
– Even AV companies have trouble obtaining samples<br />
More difficulties of testing<br />
• Can the test environment be considered realistic?<br />
– Many malware detect VMware and other VMs<br />
– Single tests may run for a few minutes; real users use their computer for hours<br />
• A better setup would require:<br />
– One physical machine per AV, running in parallel on identical hardware<br />
– “Dummy” robots mimicking users<br />
– Knowing which malware are the most common<br />
– Gaining access to new samples near release time<br />
Testing, but what?<br />
• Anti-viruses are complex products with heaps of features<br />
1. What exactly are they supposed to be protecting against?<br />
– Malware, rootkits, spyware, …<br />
2. What security mechanisms are implemented? What should be tested exactly?<br />
• More features is not always a good thing<br />
– Software less resilient, greater attack surface<br />
Does CC Certification make sense?<br />
• (CC = Common Criteria, an evaluation process)<br />
• Pros:<br />
– Full review of documentation and source code<br />
– In theory, best known attacks are applied<br />
– Higher EAL levels offer (semi-)formal proof<br />
• Cons:<br />
– TOE/ST is often reduced to save time<br />
– Applies to a single version, but AV evolve fast<br />
– No common reference for the attacks<br />
→ depends on the evaluator’s competence<br />
– Just another marketing gimmick?<br />
A note on CSPN Certification<br />
• (CSPN = “Certification Sécurité Premier Niveau”, a French first-level security certification)<br />
• Similar to the Common Criteria, however:<br />
– “Single shot” evaluation, much shorter (1 month)<br />
– Mostly focused on the attacks themselves<br />
• Cons:<br />
– Like CC, nothing is made public (except the eventual certificate)<br />
– Perimeter is restricted as well<br />
– Certification applies to a single version<br />
– Testing depends on the evaluator<br />
<strong>AVerify</strong><br />
• Inspired by the EICAR anti-virus test file<br />
• But this project is independent from EICAR itself!<br />
• Follows the EICAR code of conduct<br />
• Provides a set of simple tests to other researchers<br />
– with the full description & source code<br />
• This ensures:<br />
– Independent reproducibility, based on the original experimental description<br />
– Reliable repetition of these experiments<br />
• Allows fact-based reviewing of AV programs<br />
– Instead of the hand-waving one often sees<br />
The basics<br />
• Define a common platform<br />
– Windows XP 32-bit still widely used<br />
– Windows 7 64-bit gaining momentum<br />
– Which software to install? Firefox/IE8/Chrome…?<br />
• Define a base privilege level<br />
– Admin or not admin? Or better, both.<br />
• Define common usage scenarios and attack vectors<br />
– Opening malicious links from email / IM<br />
– Running malware from an external drive: USB key / CD-ROM / network drive<br />
– Machine already infected, try to disinfect<br />
• Evaluate the success of disinfection with a LiveCD<br />
Planned tests #1: HIPS features<br />
• Very similar to our 2008 tests<br />
• Every test should only exercise one aspect<br />
– Old-school persistence through CurrentVersion\Run<br />
– WH_KEYBOARD Windows hook<br />
– StartService driver loading<br />
– This includes firewall-bypassing techniques<br />
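As a concrete illustration of the Run-key persistence technique above, here is a minimal registry fragment of the kind a single test could write and a HIPS should flag. The key path is the real Windows autostart location; the value name and target path are hypothetical placeholders, not AVerify code:

```reg
Windows Registry Editor Version 5.00

; Old-school persistence: programs listed under this key start at every logon.
; "AVerifyTest" and the target path are placeholders for the test payload.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Run]
"AVerifyTest"="C:\\averify\\hips_run_test.exe"
```

A product with HIPS-style monitoring would be expected to warn the moment this value is created.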
• Run each test identically on all AV<br />
– Define a passed/not passed check<br />
• Combine tests to simulate real threats and evaluate each AV’s alert threshold<br />
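The passed/not passed idea could be captured in a tiny harness like the sketch below. All names and the two stand-in tests are invented for illustration; real tests would run the actual techniques:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SingleTest:
    """One technique, one binary check: did the AV stop it?"""
    name: str
    run: Callable[[], bool]   # returns True if the technique evaded the AV

@dataclass
class Report:
    av_name: str
    results: List[tuple] = field(default_factory=list)

    def execute(self, tests: List[SingleTest]) -> None:
        for t in tests:
            evaded = t.run()
            self.results.append((t.name, "not passed" if evaded else "passed"))

    def score(self) -> float:
        """Fraction of single tests the AV blocked."""
        passed = sum(1 for _, r in self.results if r == "passed")
        return passed / len(self.results) if self.results else 0.0

# Dummy stand-ins for real techniques (Run-key write, WH_KEYBOARD hook, ...)
tests = [
    SingleTest("CurrentVersion\\Run persistence", lambda: True),   # AV missed it
    SingleTest("WH_KEYBOARD hook", lambda: False),                 # AV blocked it
]
report = Report("SomeAV")
report.execute(tests)
print(report.score())
```

Running identical test lists against each product then yields directly comparable, reproducible scores instead of ad-hoc star ratings.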
Planned tests #2: resilience<br />
• Try to cause faults in the AV itself<br />
– Fuzzing: file formats, IOCTLs<br />
• Instrumenting AV scanners may be tricky<br />
– Example: avast! looks simple, just call ashQuick.exe on each file<br />
– But exception catching requires removing SSDT hooks<br />
– Checking and modifying ACLs<br />
• Files, pipes, handles, …<br />
– Attempts to disable/kill the anti-virus<br />
• hosts file, connection blocking<br />
• On-the-fly code patching<br />
• On-disk signature database corruption<br />
• Network update corruption<br />
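A file-format fuzzer of the kind mentioned above can be as simple as flipping random bytes in a valid seed file before handing it to the scanner. This is a generic sketch, not AVerify's fuzzer; the fake PE header is only a placeholder seed:

```python
import random

def mutate(seed: bytes, nflips: int, rng: random.Random) -> bytes:
    """Return a copy of `seed` with up to `nflips` randomly chosen bytes replaced."""
    data = bytearray(seed)
    for _ in range(nflips):
        pos = rng.randrange(len(data))
        data[pos] = rng.randrange(256)
    return bytes(data)

# Derive corrupted variants of a (fake) executable header to feed the scanner.
rng = random.Random(1234)          # fixed seed => reproducible test corpus
seed = b"MZ" + bytes(62) + b"PE\x00\x00"
variants = [mutate(seed, nflips=4, rng=rng) for _ in range(3)]
for v in variants:
    assert len(v) == len(seed)     # mutation never changes the file size
```

A fixed RNG seed matters here: it keeps the corpus reproducible, so another tester can replay exactly the same malformed inputs against a different product.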
Planned tests #3: real-world malware<br />
• This is slightly harder:<br />
– Find a new threat right after it is released<br />
– Repeatedly scan with each anti-virus to determine the reaction time of AV companies<br />
• Or use any older but common threat<br />
• Install the threat, create a snapshot<br />
– Determine how it installs and stays persistent<br />
– Attempt to disinfect with each AV<br />
– Use a LiveCD to check for successful disinfection on disk<br />
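The LiveCD check can boil down to comparing per-file hashes between a known-clean snapshot and the post-disinfection disk: anything added or modified is a potential remnant. A minimal sketch (function names are invented for illustration):

```python
import hashlib
from pathlib import Path

def tree_digest(root: str) -> dict:
    """Map each file path under `root` to its SHA-256, for snapshot comparison."""
    digests = {}
    rootp = Path(root)
    for p in sorted(rootp.rglob("*")):
        if p.is_file():
            digests[str(p.relative_to(rootp))] = hashlib.sha256(p.read_bytes()).hexdigest()
    return digests

def leftovers(clean: dict, after: dict) -> set:
    """Files added or modified relative to the clean snapshot = possible remnants."""
    return {path for path, digest in after.items() if clean.get(path) != digest}
```

In practice `tree_digest` would run once on the clean snapshot and once from the LiveCD on the mounted disk; an empty `leftovers` set is strong evidence the disinfection succeeded on disk.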
Planned tests #4: other stuff<br />
• Usage monitoring:<br />
– Size on disk after installation<br />
– Size on disk after x months of updating<br />
– Amount of real memory used when idle<br />
– CPU consumed when scanning a threat<br />
– Bandwidth consumed<br />
• UI features<br />
– Usefulness of error messages<br />
– Logging and access to log files<br />
• Support<br />
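The size-on-disk measurements above are trivial to automate: walk the AV's install directory before and after an update period and diff the totals. A sketch (the install path is a placeholder):

```python
from pathlib import Path

def size_on_disk(root: str) -> int:
    """Total size in bytes of all files under `root` (e.g. the AV install dir)."""
    return sum(p.stat().st_size for p in Path(root).rglob("*") if p.is_file())

# Measure once right after installation, again after x months of updates:
# baseline = size_on_disk(r"C:\Program Files\SomeAV")
# growth   = size_on_disk(r"C:\Program Files\SomeAV") - baseline
```

Comparable numbers across products require measuring on the same VM snapshot and the same date, so the signature databases have seen the same update window.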
An interesting attack<br />
• “KHOBE – 8.0 earthquake for Windows desktop security software”<br />
– They might be over-hyping it a little ;)<br />
– Researchers provided full details<br />
– Could be re-implemented in <strong>AVerify</strong><br />
Automation ftw<br />
• Mostly based on VMware scripting (VIX)<br />
– C/VB API bundled with VMware Workstation<br />
– Also nicely documented<br />
– Avoids errors due to manual testing<br />
– Lots of useful APIs:<br />
• VixVM_RevertToSnapshot<br />
• VixVM_RunProgramInGuest<br />
• Cannot send raw keyboard/mouse input<br />
– However, VMware offers a VNC server<br />
– The VNC protocol is open and easily scriptable<br />
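Since VIX cannot inject raw input, keystrokes can go through VMware's VNC server instead. After the RFB handshake, a key press is just an 8-byte KeyEvent message (message type 4, a down flag, two padding bytes, and a 4-byte X11 keysym). A sketch of the message encoding only, with no networking:

```python
import struct

def rfb_key_event(keysym: int, down: bool) -> bytes:
    """Encode an RFB KeyEvent: U8 type=4, U8 down-flag, U16 padding, U32 keysym."""
    return struct.pack(">BBHI", 4, 1 if down else 0, 0, keysym)

def rfb_type_key(keysym: int) -> bytes:
    """Press and release one key, as a scripted VNC client would send it."""
    return rfb_key_event(keysym, True) + rfb_key_event(keysym, False)

# Typing 'a' (X11 keysym 0x61) = two 8-byte messages on the RFB socket.
msg = rfb_type_key(0x61)
assert len(msg) == 16
```

A test driver would open a socket to the VM's VNC port, complete the handshake, and then replay sequences of such messages to mimic a user clicking through AV dialogs.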
Scripting a virtual machine<br />
• Checking for success or failure:<br />
– Depends on the test (e.g. capturing keys, …)<br />
– VixVM_CopyFileFromGuestToHost<br />
– VixVM_CaptureScreenImage<br />
• VirtualBox<br />
– VBoxManage is kinda like VIX<br />
• Start, stop, revert to snapshot…<br />
– No remote code execution & no VNC server, though<br />
• Installing one might interfere with the tests<br />
– One possible alternative: enable Remote Desktop<br />
• Scripting by modifying an OSS RDP client<br />
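The VBoxManage side can be driven from any scripting language via the command line; `snapshot ... restore` and `startvm` are real VBoxManage subcommands, while the VM and snapshot names below are placeholders. The sketch only builds the command lines so it can be dry-run anywhere:

```python
import subprocess

def vbox_cmd(*args: str) -> list:
    """Build a VBoxManage command line without running it."""
    return ["VBoxManage", *args]

def revert_and_start(vm: str, snapshot: str, dry_run: bool = True):
    """Revert a VM to a clean snapshot, then boot it headless for one test run."""
    cmds = [
        vbox_cmd("snapshot", vm, "restore", snapshot),
        vbox_cmd("startvm", vm, "--type", "headless"),
    ]
    if not dry_run:                  # only execute on a host with VirtualBox
        for c in cmds:
            subprocess.run(c, check=True)
    return cmds

cmds = revert_and_start("av-test-xp", "clean", dry_run=True)
```

Reverting to the same clean snapshot before every single test is what makes runs independent and reproducible across products.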
On providing source code<br />
• Is providing attack code illegal in France?<br />
– A somewhat muddy subject<br />
– See French ministry of interior vs. VUPEN<br />
• 2009: author of an exploit sentenced to pay 1,000 €<br />
• I didn’t get jailed for publishing aircrack<br />
– But that was in 2004/2005<br />
– It did include win32 binaries<br />
• <strong>AVerify</strong> will provide source code<br />
– But on a per-request basis<br />
– No ready-to-use binaries!<br />
On providing samples<br />
• Distributing samples == distributing malware?<br />
– Yes, but…<br />
• Very useful for other security researchers<br />
– vx.netlux.org<br />
– offensivecomputing<br />
• Real-world malware samples from <strong>AVerify</strong> to be made available on offensivecomputing<br />
– Or any other site willing to host them<br />
A simple example<br />
• Automation of the EICAR anti-virus test file through VMware VIX<br />
• Current problem: detecting that it worked<br />
– Screenshots not reliable<br />
– AV logs?<br />
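The EICAR test file itself is easy to generate on the guest: it is a fixed 68-byte ASCII string, assembled from fragments below so that this document does not itself trigger scanners:

```python
def eicar_string() -> str:
    """Assemble the standard 68-character EICAR test string from fragments."""
    parts = [
        r"X5O!P%@AP[4\PZX54(P^)7CC)7}$",
        "EICAR-STANDARD-ANTIVIRUS-TEST-FILE",
        "!$H+H*",
    ]
    return "".join(parts)

s = eicar_string()
assert len(s) == 68 and s.startswith("X5O!")
# In an automated run, this string would be written to a file inside the guest
# (e.g. via VixVM_RunProgramInGuest) and the AV's reaction then observed.
```

Every mainstream AV is expected to flag this file on creation, which makes it a convenient smoke test for the automation pipeline before any real samples are involved.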
The future of <strong>AVerify</strong>?<br />
• Lots of ideas, but not much code yet!<br />
– No demo quite yet, sorry<br />
– As always, time is not on our side<br />
– Right now, it’s a one-person project<br />
• Initial target: November 2010<br />
– Source code and initial results made public<br />
– Hopefully together with a submission for the next EICAR conference :)<br />
• Anyone can contribute! You can too.<br />
• Please check averify.org in a few months<br />
A final note<br />
During the installation of a major AV (2010):<br />
Scaring your users is not the solution. Educating them is.<br />
Q&A<br />
Thank you for your attention!<br />