
Malware Triage



Using Open Data to Help Develop Robust Indicators


Hello, my name is:
Sergei Frankoff (sergei@openanalysis.net)
Sean Wilson (sean@openanalysis.net)


OPENANALYSIS.NET


What is an IOC?
Indicators of Compromise (IOCs) are forensic artifacts of an intrusion that can be identified on a host or network.
openioc.org (http://openioc.org/resources/An_Introduction_to_OpenIOC.pdf)


IOC Formats
We aren't talking about formats!


Traditional View of IOCs


In Practice…
"Is APTx attacking us? I saw this frightening article in CISO monthly magazine…"
"WTF?? Ransomware just infected half of accounting??"
"Wait, didn't I just remove this same trojan from Dave's workstation last week?"
"Hey look, I just hooked that IOC feed to our IDS…"
"OMG! Our IDS just blocked traffic to all of our developers!"


A Possible Solution: Triage
Inputs: suspicious URL, suspicious e-mail, security event, intel feed
Triage questions: Is it malicious? What is it exploiting? Do we have exposure?
Output: Incident!


A Better Solution: Triage + IOCs = Automation!
Inputs: suspicious URL, suspicious e-mail, security event, intel feed
Filter the knowns with IOCs, then triage: Is it malicious? What is it exploiting? Do we have exposure?
Output: Incident! Root cause analysis produces new IOCs that feed back into the filter.


IOCs Remove the Knowns and Reveal the Unknown
(Diagram: filtering out known good and known bad leaves the unknown.)


The Problem With AV
Artemis!1A5E05B1B9E1
Artemis!262BC0AE2FB0
Artemis!04296F13925B
Artemis!110C43F8A337
Artemis!9BFC61456261
Artemis!9BE792AC4667


Malware-Specific IOCs
IOCs: Indicators of Compromise.
Malware-specific IOCs: forensic artifacts resulting from the presence or execution of malware.
AV signatures


Robust IOCs
(Chart: effectiveness of an IOC, from brittle to robust, plotted against the diversity of the malware and the lifetime of the malware family.)


Robust is not…
One is the loneliest number : (


More Robust Is…
Multiple samples + comparative analysis!
(Reverse engineering with a sandbox!)


Most Robust Is…
Multiple samples + code review + comparative analysis!!
(Reverse engineering with IDA and a debugger!!)


The Key Is Comparative Analysis
(Diagram: the primary sample is linked to Sample #1 and Sample #2 through a shared pivot attribute.)
This is one of the most important slides in the presentation.


Building Robust Indicators
Analysis (Triage) → Identify Pivots → Discovery (Mining Open Data) → Comparative Analysis → Develop IOC → Test (Validate)


Analysis (Triage)
Is it malicious?
Can we identify the malware family?
Collect static attributes
Collect dynamic attributes


Is it malicious? What is it?
vs. Binarly


Static Attributes
"Hmm… something isn't right, there are no file properties for this executable?"
"I'm totally legit!"


Static Attributes: Metadata
Compiler artifacts: easily modified, so they can make poor indicators.
EXIF data: useful for sample discovery and can work as primary indicators.


Static Attributes: Imports
Library and API imports are fixed at compilation.
Use imphash!
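A minimal sketch of pivoting on imphash, assuming the third-party pefile library and a local folder of samples (the folder name is a placeholder):

```python
import os
import pefile  # pip install pefile

# Group a folder of samples by imphash. Samples that share an imphash
# share the same import table layout, a useful (if imperfect) pivot.
samples_dir = "samples"  # placeholder path
by_imphash = {}

for name in os.listdir(samples_dir):
    path = os.path.join(samples_dir, name)
    try:
        pe = pefile.PE(path)
    except pefile.PEFormatError:
        continue  # not a PE file, skip it
    by_imphash.setdefault(pe.get_imphash(), []).append(name)

for imphash, names in sorted(by_imphash.items()):
    print(imphash, names)
```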


Static Attributes: Strings
Analysis with context vs. analysis without context.
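As a rough illustration of "strings with context", a throwaway filter like the one below (file name and keyword list are placeholders) narrows a raw strings dump down to the artifacts that matter during triage:

```python
import re

# Extract printable ASCII strings (6+ chars) and keep only the ones that
# hint at behaviour: URLs, registry paths, mutex-like names, DLL names.
with open("sample.exe", "rb") as f:  # placeholder file name
    data = f.read()

keywords = (b"HTTP", b"SOFTWARE\\", b"MUTEX", b".DLL")
for s in re.findall(rb"[ -~]{6,}", data):
    if any(k in s.upper() for k in keywords):
        print(s.decode("ascii", "replace"))
```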


Try our free PE analysis tool, PFTriage!


This file is packed! We aren’t<br />

going to get any useful static<br />

attributes.


Static Attributes Identified
• Initial sample was packed with UPX
• Contains no file or version metadata
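Both findings are easy to confirm statically. A hedged pefile sketch (the file name is a placeholder):

```python
import pefile  # pip install pefile

pe = pefile.PE("sample.exe")  # placeholder file name

# UPX leaves its section names behind, and packed sections sit at high entropy.
for section in pe.sections:
    name = section.Name.rstrip(b"\x00").decode("ascii", "replace")
    print(name, round(section.get_entropy(), 2))  # UPX1 is typically close to 8.0

# A sample with no file/version metadata simply has no version-info resource.
print("has version info:", hasattr(pe, "VS_VERSIONINFO"))
```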


Packer / Crypter vs. Static Attributes
(Diagram: the payload is wrapped by a packer stub into an obfuscated payload, leaving only low-quality static attributes.)


Packer / Crypter Weakness: Runtime!
(Diagram: at runtime the packed PE and the original payload end up making the same Windows API calls.)


Sandbox Magic
(Diagram: a sandbox process monitor hooks the Windows API and records the PE's runtime activity: network, filesystem, registry, process, services, synchronization.)


When Your Sandbox Doesn't Work…
Ghetto runtime analysis


Dynamic Attributes
In-memory strings
Process handles / mutexes
Accessed / created files
Registry keys
Network traffic


Level Up! Your Analysis With Some Light Debugging
Quickly trace the sample in a debugger to deobfuscate strings and gain CONTEXT.
Try WinDbg!


Dynamic Attributes Identified
• Creates mutex: QKitMan2016_1
• Creates registry key: HKEY_CURRENT_USER\SOFTWARE\QKitMan2016
• Requests its external IP from IPReq using an HTTP GET request
• Posts the IP as a payload to the LiveJournal account qkitman1010
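A hedged, Windows-only sketch of sweeping a live host for the two host-based artifacts above, using only the standard library (the SYNCHRONIZE constant is the usual Win32 access mask value):

```python
import ctypes
import winreg

# Check for the registry key observed at runtime.
try:
    winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"SOFTWARE\QKitMan2016")
    print("hit: HKCU\\SOFTWARE\\QKitMan2016 exists")
except FileNotFoundError:
    print("registry key not present")

# Check whether the runtime mutex is currently held by a running process.
SYNCHRONIZE = 0x00100000
kernel32 = ctypes.windll.kernel32
kernel32.OpenMutexW.restype = ctypes.c_void_p
handle = kernel32.OpenMutexW(SYNCHRONIZE, False, "QKitMan2016_1")
if handle:
    print("hit: mutex QKitMan2016_1 is present")
    kernel32.CloseHandle(handle)
else:
    print("mutex not present")
```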


Identify Pivots
Collect your notes
Choose the best pivots
Prepare to hunt


Rough Notes Are OK


Discovery (Hunting)
Searching for related samples
Mining open data (the easy way)
Acquiring samples


Mining Open Data
virusshare.com


Tricks for Searching Online Sandboxes
• Shared virtualization infrastructure
• Shared templates
• Hunting using computed values


Mining Open Data With OAPivot


Sample Acquisition
Sandbox shared samples
Sharing services
DFIR lists and trust groups


Sandbox Shared Samples
Share your samples!


Sharing Services
VirusShare


Comparative Analysis
Identify common characteristics:
Common properties
Common behaviour
This is a key section even though there aren't a lot of slides.


Comparison Checklist

Attribute         Initial Sample   Pivot Sample A   Pivot Sample B
Strings           X                X
Exif Data
Imphash
Memory Strings    X                X                X
Mutex             X                X
File Names        X
Registry Keys     X                X                X
Network Traffic   X                X
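A checklist like this can be filled in mechanically. A small sketch of comparative analysis over static strings (file names are placeholders; memory strings, mutexes, and other attributes would be diffed the same way):

```python
import re

def ascii_strings(path, min_len=6):
    """Return the set of printable ASCII strings in a file."""
    with open(path, "rb") as f:
        return set(re.findall(rb"[ -~]{%d,}" % min_len, f.read()))

samples = ["primary.exe", "pivot_a.exe", "pivot_b.exe"]  # placeholder names
string_sets = [ascii_strings(p) for p in samples]

# Strings present in every sample are candidates for robust indicators.
common = set.intersection(*string_sets)
for s in sorted(common):
    print(s.decode("ascii", "replace"))
```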


Some network strings remain constant between all samples while others differ!


The mutex changes between samples, but only slightly… maybe we can work with that.
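Assuming the only variation is the trailing counter (an assumption based on the single mutex shown earlier), a tolerant pattern covers the whole series:

```python
import re

# QKitMan2016_1 was seen in the first sample; allow any trailing number.
mutex_pattern = re.compile(rb"QKitMan2016_\d+")

with open("memory_strings.txt", "rb") as f:  # placeholder dump of in-memory strings
    for line in f:
        if mutex_pattern.search(line):
            print(line.strip().decode("ascii", "replace"))
```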


Look at that! The same registry key for all samples… I remember that key from the STRINGS too!


Level Up! Your Comparative Analysis With Some Light Disassembly
Comparative analysis works at the byte-code level as well!


The opcodes of the string-building algorithm are identical. Don't forget to use wildcards for variable bytes (0D)!!
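The deck doesn't reproduce the actual opcodes, so the bytes below are placeholders; the point of this yara-python sketch is the `??` wildcards standing in for operands that vary between samples, like the 0x0D immediate noted above:

```python
import yara  # pip install yara-python

# Placeholder opcode sequence for the string-building loop; the key detail
# is the '??' wildcards covering the bytes that change between samples.
rule_source = r"""
rule qkitman_string_builder
{
    strings:
        $build_loop = { 8A 04 0E 34 ?? 88 04 0F 41 3B CA 72 ?? }
    condition:
        $build_loop
}
"""

rules = yara.compile(source=rule_source)
matches = rules.match("sample_unpacked.bin")  # placeholder unpacked sample
print(matches)
```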


Develop IOC
Choose IOC format(s)
Develop the IOC


IOC Formats
We really, really aren't talking about formats!
There are tons of links to great free IOC training on our site : )


Testing
Test IOCs against known bad
Test IOCs against known good
Automate discovery


Known Bad
Run indicators against a repository of known bad; update when required; validate!
Indicator types dictate how they are tested.


Known Good
Test indicators against a repository of known good samples; resolve issues; validate!
Try testing IOCs against your corporate "golden image(s)".
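A hedged sketch of that testing loop with yara-python, sweeping both a known-bad repository and a known-good set such as files pulled from the golden image (the rule file and directory names are placeholders):

```python
import os
import yara  # pip install yara-python

rules = yara.compile(filepath="qkitman.yar")  # placeholder rule file

def sweep(directory):
    """Return the files under `directory` that match the rules."""
    hits = []
    for root, _, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            if rules.match(path):
                hits.append(path)
    return hits

bad_hits = sweep("corpus/known_bad")    # every sample here should match
good_hits = sweep("corpus/known_good")  # nothing here should match

print("known-bad matched:", len(bad_hits))
print("false positives on known good:", good_hits)
```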


Monitor
When indicators stop matching new samples, something has changed!


Test Automation
Waiting for samples to hit your organization before testing indicators vs. automating discovery before your organization is affected.


Key Takeaways
Triage + IOCs = Automation!
Robust IOCs can be built without the need for a debugger or disassembler.
Comparative analysis is key!
Open data can be leveraged to collect related samples. Try OAPivot…
Remember to continuously test your IOCs.


Try It Yourself
http://bit.ly/2frHKg3
763c7763a55b898b9618a23f85edfab6


Thank you : )
And don't forget… openanalysis.net


Image Attribution
• Noun Project - Molecules by Zoë Austin
• Noun Project - Funnel by Vaibhav Radhakrishnan
• Noun Project - Shield by AFY Studio
• Noun Project - Checklist by Arthur Shlain
• Noun Project - Rocket Man by Luis Prado
• Noun Project - Head-Desk by Karthik Srinivas
• Noun Project - Checked Database by Arthur Shlain
• Noun Project - Kevin Augustine LO
• Noun Project - Compilation by Richard Slater
• Noun Project - Database Warning by ProSymbols
• Noun Project - Flow Chart by Richard Schumann
• Noun Project - Debug by Lemon Liu
• Noun Project - Network by Creative Stall
• Noun Project - File Settings by ProSymbols
