Tuesday, October 16, 2012
Posted by Corey Harrell
It was a little over two years ago when I started Journey Into Incident Response (aka jIIr). In these two years I've learned a lot about blogging on technical topics, so I want to share some tips and advice for those considering taking the plunge into the blogosphere. The following are some things to consider, from planning your blog to setting it up to maintaining it:
- Why
- Motivation
- Post Frequency
- Content
- Publishing Method
- Blog Template
- Blog Configuration
- Blog Post Tips
- Advertising
- Gauging Interest
- Motivation
Why
My first question to any would-be blogger is why. Why do you want to put yourself through the agony of sacrificing your personal time and resources so you can share your experiences and research? There are other avenues to take: write an article and submit it to a site offering free content (such as DFI News or Forensic Focus), or write a post for a blog that publishes guest posts (such as the SANS Forensics blog or Girl, Unallocated). I'm not trying to talk anyone out of blogging since it has been a rewarding experience. On the contrary, I'm just trying to get my point across that blogging is a lot of work and in the end you are doing it for free. When I decided to start blogging I told my wife "it will be easy". Sure, I had a newborn to go along with my two other sons, was pursuing my Master's degree, and still had a full-time security job, but I seriously thought putting together a few posts a month would be a cakewalk. My perspective changed pretty quickly after I wrote my first few posts.
If you have any hesitation about putting in the work then consider the other options available. If you've made up your mind and blogging is the route you want to take then the remaining tips may be helpful.
Motivation
People who start blogging have a reason for doing it. For some it's to give back to the community, for others it's name recognition, for some it's a platform they control to share their research, and for the rest it's any number of reasons. Before you even start setting up a blog make sure you know what your true motivation is. There will be times when you have writer's block or you just don't feel like writing anything, so whatever your reason is for blogging needs to be enough to motivate you to put fingers to keyboard. Your motivation needs to come from within since most of the time others can't do this for you.
Post Frequency
This was something that helped me so I wanted to pass this nugget along. Before you even start worrying about content, site design, or any other details, take some time to consider how often you want to update your blog. If you look at other DFIR blogs you will notice a range in how often they are updated, from multiple updates a week to monthly updates to yearly postings. As an author, I found it helpful to set myself a monthly goal for how many posts I wanted to put together. It not only helped me plan how to reach my monthly goal but also got me into the habit of writing regularly. As a blog reader, I tend to check the sites with a more regular update schedule more often, so I assume other readers do the same. If a blog gets updated at random times then it tends to fall off my radar until I see it in my RSS feeds. Whatever goal you set for yourself, it's not something that needs to be publicized. It's a personal goal for yourself and only for you to know. I'm only mentioning my goal since I'm giving this advice. I always wanted to write three substantial posts per month so at the end of the year I would have 36 decent write-ups.
Content
I also found it helpful to at least pick a theme for the blog. There are times when I'm not sure what to write about, so having a theme gives me something to fall back on. As you might have noticed, my theme is in the title: Journey Into Incident Response. I wanted to blog about the different things I learn and experience as I work on improving my skills and knowledge to investigate security incidents. If you do decide to pick a theme it doesn't mean you are locked into it. Write about whatever you want, whether it's a book, tool, article, or anything else you are working on in the moment. The theme is just a crutch for when you start to run out of ideas, which brings me to my next point: make sure you have a blog idea hopper. Keep a list of different ideas for future posts and always add to it when you think of something new. Some ideas may never go beyond the hopper while others may turn into great articles. One of the reasons I don't struggle with ideas for posts is that I constantly have between 5 and 10 ideas in my hopper. If I need something to write about then I just look over my hopper and pick the one that interests me the most. Case in point, the idea for this post has been in my hopper for months.
Publishing Method
At this point you know why you want to blog, what your motivation is, how often you will update it, and you have a general idea about what content you want. The biggest decision you will now make is how you want to host your blog. The two publishing applications I most frequently see DFIR bloggers use are WordPress and Blogger. If you aren't sure about what publishing application to use then reach out to the blog authors you read. I bet the authors are more than willing to let you know why they chose what they did and how they feel about their decision. As for me, I went with Blogger for two simple reasons. First, most of the blogs I followed at the time used the service; second, I didn't want to have to worry about maintaining software updates. All I want to do is blog and Blogger enabled me to do that.
Blog Template
The second biggest decision will be what template to use for your blog. The template is how your blog looks and what people will stare at when reading your posts. Your publishing application may have a few templates available that meet your needs. If the built-in templates aren't what you are looking for then there are free templates available online. What sites should you use to get a free template? Great question, and it's still one I can't really answer today. The last thing I wanted was to get a free template with an embedded malicious link that would attack any visitor to my blog. So I took a closer look at a few DFIR blogs I followed to see where they got their templates. I went to each blog and viewed the site's source code to find the template author's website. The site behind my blog's current template is also the one I saw credited in a ton of other DFIR blogs.
Blog Configuration
Remember growing up when people said first impressions matter? I think the statement is true even for blogging. When I was in the process of configuring my blog, one setting I loved was the ability to restrict access to the blog. I pretty much prevented anyone from seeing it while I tried out different templates, options, and layouts. In Blogger I even used the setting that prevents the site from being indexed by Google. I only removed the permissions, thus allowing anyone to see the blog, when I had everything set up the way I wanted. Configuring your blog is a matter of preference, so my only advice is don't unveil it until it's set up the way you want it.
Blog Post Tips
ITAuditSecurity has some great posts about blogging tips. As it relates to putting a post together, one tip I wanted to echo was in his post Blogging: Choose Great Titles and Intro Sentences. His reasoning is that good titles and intro sentences not only grab the attention of readers but also produce better search engine results. I completely agree with his points and I wanted to build on them with another one. Picking good titles and intro sentences lets the reader know exactly what the post will be about. If the point of the post can't be conveyed in a title or one sentence then make sure it is conveyed in the first paragraph. If the content of the post isn't clear upfront then some readers will stop reading before they reach the part of the post where you make your point. In all of my posts I try very hard to make sure the reader knows exactly what I'll be talking about by the time they finish reading the first paragraph.
Advertising
I remember thinking very clearly when I was getting ready to launch the blog, "how do you advertise it to others?" I thought there was some secret so I reached out to Harlan for advice. At the time I was just a name from the Win4n6 group who Harlan had helped once before, but I figured who better to ask than someone who has been blogging for years. Harlan's response to my question about the secret to advertising was:
“Add your blog to your email signature, and tell others about it. Pass it around. I, and others, constantly link to blogs and posts that are interesting and informative. There's no real secret”
Here I was thinking there was some secret, some involved advertising process known only to bloggers, but in the end advertising has actually been the easy part. Two years later I can honestly say Harlan's advice was spot on.
Gauging Interest
Don't get me wrong about my next comment. I truly appreciate all the feedback I have gotten over the last two years: the conversations in person, comments offline, and comments posted to jIIr. When you start blogging, treat any feedback you get like gold. Feedback is the best way to get an idea about your blog's content, so cherish it when someone gives it to you. The reason is that the majority of blog readers don't provide feedback. They don't leave comments, send emails, or contact you by other means. Thinking about it, I fall in the same boat. I follow over 300 blogs and my comment-to-read ratio is pretty low. For the first year blogging it felt as if I was talking to myself. I would keep posting different content but I didn't get a lot of feedback. I wasn't sure what content went over well and which, as Harlan says, "went over like a fart". In these situations Google Analytics will be your friend. Google Analytics keeps stats about your site such as pageviews for each post and referrals to your blog. For the times when I don't get feedback I can get a rough idea about the content people like by looking at the pageviews. However, some of my posts where I got great feedback were the same ones with low pageviews. Leverage Google Analytics as a tool to gauge interest in your site but remember it is not fool-proof.
Motivation
As I mentioned before, blogging has been one of the most rewarding things I have done. It has required a lot of sacrifice but it has made me into a better DFIR practitioner. There were times when I felt as if I wasn't adding value and times when I was flying high because my research and posts had helped others. Regardless of what happens when you blog, the most important advice I can give is to stay true to what motivated you to blog in the first place. If you are working towards accomplishing what you set out to do then the rest doesn't matter. Enjoy the ride and remember to say thanks to those who give shout-outs about your blog or provide feedback.
Monday, October 8, 2012
Posted by Corey Harrell
There is a tendency to focus on what is different when we are faced with newer operating systems. What are the security changes and how do they impact security testing against it? What are the new artifacts and what new information do they provide? The focus is mostly on the changes as they relate to us examining the operating system. However, there are other changes that may impact us, and the way we do our jobs, even more. These changes occur when we use these newer operating systems on the workstations we use for analysis, and they impact how our tools operate. The User Account Control feature in the newer operating systems is one such change impacting how we use our tools.
User Account Control (UAC) was first introduced with Windows Vista and the feature carried over to Windows 7. By default, UAC is turned on in both operating systems. "The primary goal of User Account Control is to reduce the exposure and attack surface of the Windows 7 operating system by requiring that all users run in standard user mode, and by limiting administrator-level access to authorized processes." This means even if a user account is in the administrators group, every application the account runs will only have standard user privileges instead of the all-powerful administrative privileges. In essence, we are not administrators when UAC is turned on.
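As a quick aside, if you want to check whether UAC is enabled without clicking through the Control Panel, the EnableLUA registry value should show it (a value of 0x1 means UAC is on):
C:\> reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v EnableLUA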
It’s fairly easy to see the impact UAC has on a user account with administrative privileges. With UAC turned on, open a command prompt and type the command “whoami.exe /priv” to see the privileges of the logged on user account (if your system has UnxUtils configured in your path then make sure to run Windows\System32\whoami.exe).
C:\> whoami.exe /priv
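The screenshot of the output isn't reproduced here; as representative output, the filtered token on a default Windows 7 install looks something like this:
PRIVILEGES INFORMATION
----------------------
Privilege Name                Description                          State
============================= ==================================== ========
SeShutdownPrivilege           Shut down the system                 Disabled
SeChangeNotifyPrivilege       Bypass traverse checking             Enabled
SeUndockPrivilege             Remove computer from docking station Disabled
SeIncreaseWorkingSetPrivilege Increase a process working set       Disabled
SeTimeZonePrivilege           Change the time zone                 Disabled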
As shown above, the user account only has five privileges and none of them are the elevated privileges typically associated with administrator accounts. The two ways to get around UAC are to either turn it off or to use the "Run As" feature when starting an application. To continue seeing the impact of UAC, re-run the command "whoami.exe /priv" with the same user account, either with UAC completely turned off or from a command prompt opened with the "Run As" feature. Notice the difference in the privileges the user account has as shown below.
C:\> whoami.exe /priv
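Again as representative output rather than the original screenshot, the elevated token carries the full set of administrative privileges, along these lines:
SeIncreaseQuotaPrivilege      Adjust memory quotas for a process       Disabled
SeSecurityPrivilege          Manage auditing and security log         Disabled
SeTakeOwnershipPrivilege     Take ownership of files or other objects Disabled
SeLoadDriverPrivilege        Load and unload device drivers           Disabled
SeBackupPrivilege            Back up files and directories            Disabled
SeRestorePrivilege           Restore files and directories            Disabled
SeDebugPrivilege             Debug programs                           Disabled
SeChangeNotifyPrivilege      Bypass traverse checking                 Enabled
(additional privileges omitted)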
UAC Impact on Digital Forensic and Incident Response Tools
UAC will impact any application that requires administrative privileges to function properly. I first encountered the impact UAC has on applications when I was working on a script to examine volume shadow copies. My script needed elevated privileges to work and without them the script would just fail. Why the sudden interest in UAC now? Last week a new DFIR program was released and the tool requires elevated privileges to run properly. A few people encountered an error when running the program on both Windows Vista and Windows 7. The program in question is Harlan's Forensic Scanner and the error some people saw when clicking the Init button is below.
"No filename specified at PERL2EXE_STORAGE/WinSetup.pm line 136"
The error is a direct result of UAC being enabled on the workstations running the Forensic Scanner. To get around UAC, and thus the error, all one has to do is use the "Run As" feature when launching the Forensic Scanner (or, in my case, disable UAC completely). Again, the UAC error is not unique to the Forensic Scanner; it affects any program that requires administrative privileges. With that said, let's take a closer look at what is really occurring with the scanner and UAC.
I monitored the Forensic Scanner as it executed with Process Monitor, using an administrative user account with UAC turned on. The first permission issue encountered due to the restricted privileges imposed by UAC is described below.
The event shows access is being denied when the scanner tries to open the M:\Windows\system32\config\software hive in my mounted forensic image. Now let’s take a look at the same event with UAC turned off (or with the scanner being executed with the “Run As” feature).
The event shows the result is now a success instead of the denied access previously encountered. The end result is the software registry hive was opened. Now the error some people are encountering makes a little more sense: “No filename specified at PERL2EXE_STORAGE/WinSetup.pm line 136”. The Forensic Scanner is unable to open the registry hives because the user account being used has standard privileges since UAC removed the administrative privileges.
When we upgrade our workstations to newer operating systems it may impact the way our tools work. The User Account Control feature introduced with Windows Vista and carried over to Windows 7 is just one example. When the User Account Control feature is turned on any tools needing administrative privileges will no longer function properly.
Thursday, October 4, 2012
Posted by Corey Harrell
It looks like Santa put his developers to work so they could deliver an early Christmas for those wanting DFIR goodies. Day after day this week there was either a new tool being released or an updated version of an existing tool. In this Linkz edition there isn't much commentary about the tools because I'm still working my way through testing them all to better understand what each tool is, how it functions, and whether it can benefit my DFIR process. Without further ado, here are the Linkz for the DFIR goodies dropped in the past week.
Big shout out to Glen (Twitter handle @hiddenillusion) for his steady stream of tweets from the 2012 Open Source Digital Forensics Conference announcing the new tool releases.
RegRipper Plugins
The RegRipper project released a new archive containing a bunch of plugins. The plugins extract a wealth of information including: program execution artifacts (appcompatcache, direct, prefetch, and tracing), user account file access artifacts (shellbags), and a slew of plugins to create timeline data (appcompatcache_tln, applets_tln, networklist_tln, and userassist_tln). For the full detail about what was updated check out the Wiki History page, and to get the new archive go to the download section on the RegRipperPlugins Google Code site.
ForensicScanner
While on the topic of tools authored by Harlan, I might as well talk about his latest creation. Harlan released a new tool named Forensic Scanner, followed by a detailed post explaining what the tool is. To get a better understanding of how to use the scanner there's documentation on the ScannerUsage Wiki page (there's also a user guide included in the zip file). What I find really cool about this tool is how it will speed up examinations. All one has to do is point the Forensic Scanner at a mounted image and it extracts all the information fairly quickly. It reduces the time needed for extracting information so analysis can start sooner, thus reducing the overall examination time. The tool is hosted on the download section of the ForensicScanner Google Code site.
Volatility
Up next is another tool that is plugin based but this time around I’m pretty speechless. All I can say is the project has released a ton of information to accompany its latest version. Leading up to the release the project released a new plugin every day for a month and each plugin was accompanied with a blog post. Jamie Levy did an outstanding job summarizing all the blog posts: Week 1 of the Month of Volatility Plugins posted, Week 2 of the Month of Volatility Plugins posted, and Week 3 of the Month of Volatility Plugins posted. To grab the latest Volatility version go to the Google code site download section and to see what is new check out the Volatility 2.2 release notes.
Log2timeline
Another great tool has been updated, but this time it's a tool for performing timeline analysis. Log2timeline 0.65 was released a few weeks ago; I know this post is discussing tools released in the last week but I can't do a toolz post and completely ignore L2T. One cool update is the addition of a new input module to parse the utmp file, an artifact on Linux that keeps track of user logins and logouts on the system. To grab Log2timeline 0.65 go to the Google Code site download section and to see all the updates check out the Changelog.
L2T_Review
There are different ways to review the Log2timeline output data depending on the output's format. Typically, people use the csv output and in this case a few different options were available. The csv file could be grepped, viewed in a text editor, or examined with a spreadsheet program such as Microsoft Excel (refer to the jIIr post Reviewing Timelines with Excel) or OpenOffice Calc (refer to the jIIr post Reviewing Timelines with Calc). Now there's another option and it's a pretty good option at that. David Nides has been working on his L2T_review tool for reviewing log2timeline csv timelines. He posted about it a few times including here, here, and here. Typically, I don't mention tools still in beta but I wanted to make an exception for this one. I finally got around to testing L2T_review this week and I definitely liked what I saw.
Sleuth Kit and Autopsy
The 2012 Open Source Digital Forensics Conference did occur this week so it shouldn't be a surprise to see a new version of the Sleuth Kit released. I haven't had the time to test out Sleuth Kit 4.0 nor have I been able to look into what the new updates are. Sleuth Kit 4.0 can be downloaded from the Sleuth Kit website and the History page can be referenced to see what the updates are. The Autopsy Forensic Browser is a graphical interface to the Sleuth Kit and a new Windows beta version was released last month. I quickly tested out the functionality and I'm truly impressed. I've been looking for a decent free forensic browser besides FTK Imager to run on Windows and now I can say my search is over. Autopsy is that forensic browser and it can be downloaded from the Autopsy download page.
HexDive
I’ve mentioned the HexDive program on my blog a few times and the latest is when I was analyzing a keylogger. HexDive has been updated so it provides more context and testing out this new functionality is on my weekend to-do list.
ProcDOT
Speaking of malware analysis, I picked up on this next tool from a Lenny Zeltser tweet. ProcDOT is a tool that "processes Sysinternals Process Monitor (Procmon) logfiles and PCAP-logs (Windump, Tcpdump) to generate a graph via the GraphViz suite". This tool seems really cool by being able to correlate the ProcMon logfile with a packet capture to show what the activity looks like. Yup, when I'm running HexDive against a malware sample the follow-up test will be to launch the malware and then see how the dynamic information looks in ProcDOT.
GRR Rapid Response
I first learned about GRR when I attended the SANS Digital Forensics and Incident Response Summit last June. GRR Rapid Response is an incident response framework. At the top of my to-do list, when I have a decent amount of free time, is setting up GRR in a lab environment to get a better understanding of how the framework can benefit the IR process. The GRR homepage provides some overview information, the Wiki page provides a wealth of information, and the GRR Rapid Response - OSFC 2012.pdf slide deck contains information as well. GRR itself can be found on the download page.
Lightgrep is open source!
LightGrep is a tool to help perform fast searches. I have yet to try this software out but an interesting development is the core Lightgrep engine is now open source. This will be one to keep an eye on to see how it develops.
bulk_extractor
Rounding out this edition of Linkz for Toolz is a new version for the program bulk_extractor. Bulk_extractor scans a disk image, a file, or a directory of files and extracts useful information without parsing the file system or file system structures. Again, this is another tool on my to-do list to learn more about since my free time has been spent on improving my own processes using the tools already in my toolkit.
Tuesday, September 18, 2012
Posted by Corey Harrell
Malware forensics can answer numerous questions: Is there malware on the system? Where is it? How long has it been there and how did it get there in the first place? Despite all the questions malware forensics can solve, there are some it can't: What is the malware's purpose? What is its functionality? Is it capable of stealing data? Answering these questions requires malware analysis. Practical Malware Analysis defines malware analysis as "the art of dissecting malware to understand how it works, how to identify it, and how to defeat or eliminate it". I've been working on improving my malware analysis skills while at the same time thinking about the different ways organizations can benefit from the information gained by analyzing malware. One such benefit is empowering the help desk to combat malware. Every day help desks are trying to find malware on computers using antivirus products lacking signatures to detect it. Analyzing malware can provide enough information to build a custom antivirus signature, giving the help desk a way to find the malware until the commercial antivirus signatures catch up. In this post I'm going through the whole process: from analyzing malware to creating a custom antivirus signature to using the signature in the PortableApps ClamAV version.
The work presented in this post was originally put together for my hands on lab for a Tr3Secure meet-up. Also, the sample used was obtained from Contagio’s post APT Activity Monitor / Keylogger.
Disclaimer:
I'm not a malware analyst. I'm an information security and digital forensic practitioner who is working on improving my malware analysis skills. As such, for anyone looking to become more knowledgeable on the subject, I highly recommend the books Practical Malware Analysis and the Malware Analyst's Cookbook.
Static Analysis
Static analysis is examining the malware without actually running it. There are different static analysis steps to extract information; the two I'm discussing are reviewing strings and reviewing the import table.
Reviewing Strings
Strings in malware can provide clues about the program and I find them helpful since they make me more aware of the malware's potential functionality. However, conclusions cannot be drawn solely by looking at the strings. I usually first run HexDive on a sample to filter for the strings typically associated with malware, followed by running Strings to make sure I see everything.
Below is the HexDive command running against AdobeInfo.exe and a snippet from its output.
C:\> Hdive.exe C:\Samples\AdobeInfo.exe
CreateFileA
SetFileAttributesA
CreateDirectoryA
GetCurrentDirectoryA
GetWindowTextA
GetForegroundWindow
GetAsyncKeyState
GetStartupInfoA
[Up]
[Num Lock]
[Down]
[Right]
[UP]
[Left]
[PageDown]
[End]
[Del]
[PageUp]
[Home]
[Insert]
[Scroll Lock]
[Print Screen]
[WIN]
[CTRL]
[TAB]
[F12]
[F11]
There were numerous Windows API function names in the strings, and looking the functions up in Practical Malware Analysis's Appendix A (commonly encountered Windows functions) provided some clues. The following are three function names and why they may be relevant:
- CreateFileA: creates new or opens existing file
- GetForegroundWindow: returns a handle to a window currently in the foreground of the desktop. Function is commonly used by keyloggers to determine what window the user is entering keystrokes in
- GetAsyncKeyState: used to determine whether a particular key is being pressed. Function is sometimes used to implement a keylogger
The other interesting strings were the characters associated with a keyboard, such as [Down] and [Del]. The combination of the API names and keyboard characters indicates the malware could have some keylogging functionality.
Below is the Strings command running against AdobeInfo.exe and a snippet from its output.
C:\>Strings.exe C:\Samples\AdobeInfo.exe
---- %04d%02d%02d %02d:%02d:%02d ----------------
\UpdaterInfo.dat
\mssvr
The Active Windows Title: %s
In addition to the strings extracted with HexDive, Strings revealed some other text that didn't become clear until later in the examination.
Reviewing the Import Table
"Imports are functions used by one program that are actually stored in a different program, such as code libraries that contain functionality common to many programs". Looking at the functions imported by a program provides better information about a malware's functionality than solely relying on its strings. I used CFF Explorer to review the import table.
The import table showed three DLLs: kernel32.dll, user32.dll, and msvcrt.dll. The imported functions matched the Windows API function names I found earlier in the strings. The functions provide the sample with the ability to create and open files and directories, copy text from a window's title bar, get a handle to the active window, get a handle to a loaded module, and monitor for when a key is pressed. All of which strengthens the indication that the sample is in fact a keylogger.
Dynamic Analysis
The opposite of static analysis is dynamic analysis, which is examining the malware as it runs on a system. There are different dynamic analysis steps and tools to use but I'm only going to discuss one: monitoring program execution with Capture-Bat. Below is the output from the Capture-Bat log after AdobeInfo.exe executed on the system (the entries related to my monitoring tools were removed).
"2/8/2012 15:03:27.653","process","created","C:\WINDOWS\explorer.exe","C:\Samples\AdobeInfo.exe"
"2/8/2012 15:03:40.403","file","Write","C:\Samples\AdobeInfo.exe","C:\Samples\mssvr\UpdaterInfo.dat"
"2/8/2012 15:03:40.481","file","Write","C:\Samples\AdobeInfo.exe","C:\Samples\mssvr\UpdaterInfo.dat"
"2/8/2012 15:03:40.497","file","Write","C:\Samples\AdobeInfo.exe","C:\Samples\mssvr\UpdaterInfo.dat"
"2/8/2012 15:04:33.419","file","Write","C:\Samples\AdobeInfo.exe","C:\Samples\mssvr\UpdaterInfo.dat"
"2/8/2012 15:04:33.419","file","Write","C:\Samples\AdobeInfo.exe","C:\Samples\mssvr\UpdaterInfo.dat"
Capture-Bat revealed that when AdobeInfo.exe runs it creates a folder named mssvr in the same folder where the executable is located, as well as a file named UpdaterInfo.dat inside that folder. Remember the suspicious strings from before? Now the strings \UpdaterInfo.dat and \mssvr make a lot more sense. Looking at the UpdaterInfo.dat file more closely showed it was a text file containing the captured data, as can be seen in the partial output below.
---- 20120802 15:07:38 ----------------
15:07:40 The Active Windows Title: Process Explorer - Sysinternals: www.sysinternals.com [XP-SP2\Administrator]
15:07:33 The Active Windows Title: C:\WINDOWS\system32\cmd.exe
15:07:39 The Active Windows Title: Process Explorer - Sysinternals: www.sysinternals.com [XP-SP2\Administrator]
c[CTRL]y
15:07:41 The Active Windows Title: Process Monitor - Sysinternals: www.sysinternals.com
15:07:44 The Active Windows Title: Process Monitor
15:07:38 The Active Windows Title: Process Monitor
15:07:38 The Active Windows Title: API Monitor v2 (Alpha-r12) 32-bit (Administrator)
15:07:44 The Active Windows Title: Process Monitor
15:07:45 The Active Windows Title: API Monitor v2 (Alpha-r12) 32-bit (Administrator)
15:07:35 The Active Windows Title: ApateDNS
Everything in the analysis so far has identified the AdobeInfo.exe program as a keylogger that stores its captured data in a log file named UpdaterInfo.dat. All that was left was to confirm the functionality. I used Windows to create a password-protected zip file and then unzipped it. A quick look at the UpdaterInfo.dat log file afterwards confirmed the functionality (note: Windows made me enter the password twice).
16:07:30 The Active Windows Title: Program Manager
16:07:38 The Active Windows Title: Add to Archive
[CTRL]
16:07:58 The Active Windows Title: Program Manager
supersecretsupersecret
16:07:58 The Active Windows Title: Compressing
16:07:59 The Active Windows Title: Program Manager
16:07:21 The Active Windows Title: Extraction Wizard
16:07:30 The Active Windows Title: Password needed
16:07:37 The Active Windows Title: Extraction Wizard
supersecret
16:07:40 The Active Windows Title: Program Manager
16:07:42 The Active Windows Title: Windows Explorer
Creating Custom AntiVirus Signature
I'm not going into detail about how to create custom signatures for ClamAV but I will point to the great references I found on the subject. The Malware Analyst's Cookbook talks about how to leverage ClamAV for malware analysis in Recipe 3-1 (examining existing ClamAV signatures), Recipe 3-2 (creating a custom ClamAV database), and Recipe 3-3 (converting ClamAV signatures to YARA). Another resource is Alexander Hanel's post An Intro to Creating Anti-Virus Signatures. The ClamAV website has information on the subject as well, including the PDF slide deck for the webcast Writing ClamAV Signatures.
I spent some time creating different signatures: from a custom hash database to extended signature format to a logical signature. To keep the post shorter I’m only going to cover how I created a logical signature. A ClamAV logical signature is based on hex strings found inside a file and logical operators can be used to combine the hex strings in different ways. The format for the signature is below:
SignatureName;TargetDescriptionBlock;LogicalExpression;Sig0;Sig1;Sig2;
The SignatureName is self-explanatory, the TargetDescriptionBlock is the type of file the signature applies to (0 means any file), the LogicalExpression is how the subsignatures are combined using logical operators, and the Sig# entries are the actual hex strings. The completed signature is placed into a file with an .ldb extension.
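As a toy illustration of the format (not a signature I actually used), the following would match any file containing both the string "hello" (hex 68656c6c6f) and the string "world" (hex 776f726c64):
TestSig;Target:0;0&1;68656c6c6f;776f726c64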
Reviewing the strings in AdobeInfo.exe provided some good candidates to create a signature from; specifically \UpdaterInfo.dat, \mssvr, and [CTRL]. I used sigtool from the PortableApps ClamAV to determine the hex of those strings. I ran the following command for each string:
C:\>echo \UpdaterInfo.dat | App\clamwin\bin\sigtool --hex-dump
The end result provided me with the hex for each string. (Note: echo in cmd.exe passes along a trailing space and newline, so any trailing bytes such as 200d0a should be trimmed from sigtool's output.)
\UpdaterInfo.dat
5c55706461746572496e666f2e646174
\mssvr
5c6d73737672
[CTRL]
5b4354524c5d
I then combined the strings into a logical signature as shown next.
AdobeInfo.exe;Target:0;0&1&2;5c55706461746572496e666f2e646174;5c6d73737672;5b4354524c5d
Finally, I ran the custom signature against the AdobeInfo.exe file and was successfully able to identify it. The command to run a custom scan from the command-line in portable ClamAV is:
App\clamwin\bin\clamscan -d adobe.ldb C:\Samples\
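If everything works, the scan output should flag the sample along these lines (ClamAV typically appends an .UNOFFICIAL suffix to names of signatures loaded from unofficial databases):
C:\Samples\AdobeInfo.exe: AdobeInfo.exe.UNOFFICIAL FOUND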
Empowering the Help Desk
I'm done, right? I was able to analyze the malware to determine its functionality, create a custom antivirus signature to detect it, and find a way to run the custom signature using the PortableApps ClamAV version. I wouldn't be too quick to say my job is done though. Let's be honest, going to any help desk and telling the staff that from now on they have to use the command line may not have the greatest chance of success. You need to provide options; one option most help desk staff will want is an antivirus program with a GUI that's similar to their commercial antivirus programs. ClamAV has a pretty nice graphical user interface that can be configured to use custom signatures, so we could leverage it.
The article How to create custom signatures for Immunet 3.0, powered by ClamAV explains how to write custom signatures and configure the Windows ClamAV version (Immunet) to use them. This is nice but Immunet has to be installed on a computer. A cool option would be to be able to run scans from a removable drive so when the help desk responds to a system all they need to do is plug in their thumb drive. This is where the PortableApps ClamAV version comes into play since it can run scans from a thumb drive. I couldn't find any documentation about how to use custom signatures in the portable version but after a little testing I figured it out. All that has to be done is to copy the custom signature to the same folder where ClamAV stores its signature files main.cvd and daily.cvd; this location is the ClamWinPortable\Data\db folder. I copied my adobe.ldb custom signature to the Data\db folder and was able to locate the malware sample I had disguised as Notepad++.
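For reference, assuming the default ClamWin Portable folder layout described above, the copy step from the root of the thumb drive is a one-liner:
copy adobe.ldb ClamWinPortable\Data\db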
Wednesday, August 22, 2012
Posted by Corey Harrell
Knowing what programs ran on a system can answer numerous questions about what occurred. What was being used to communicate, what browsers are available to surf the web, what programs can create documents, were any data spoliation programs run, or is the system infected? These are only a few of the questions that can be answered by looking at program execution. There are different artifacts showing program execution, one of which is the application compatibility cache. Mandiant's whitepaper Leveraging the Application Compatibility Cache in Forensic Investigations (blog post is here and paper is here) explains in detail what the cache is and why it's important to digital forensics. One important aspect of the cache is that it stores information about files such as names, sizes, and last modified times, all of which may be useful during a digital forensic examination. The application compatibility cache has provided additional information I wouldn't have known about without it. As such I'm taking some time to write about this important new artifact.
I wanted to highlight the significance of the cache but I didn't want to just regurgitate what Mandiant has already said. Instead I'm doing the DFIR equivalent of man versus machine. I'm no John Henry, but like him we are witnessing the impact modernization has on the way people do their jobs. One such instance is the way people try to determine if a system is infected with malware. A typical approach is to scan a system with antivirus software to determine if it is infected. There is a dependency on the technology (antivirus software) to do the work and in essence the person is taken out of the process. It seems very similar to what John Henry witnessed with the steam-powered hammer replacing the human steel drivers. John Henry decided to demonstrate man's might by taking the steam-powered hammer head on in a race. I opted to do the same: to take on one of my most reliable antivirus scanners (Avast) in a head-on match to see who could first locate and confirm the presence of malware on a system. I didn't swing a hammer either. My tools of choice were RegRipper with the new appcompatcache plugin to parse the application compatibility cache, along with the Sleuth Kit and Log2timeline to generate a timeline containing filesystem metadata. Maybe, just maybe, in some distant future in IT and security shops across the land people will be singing songs about the race of the century: when Man took on the Antivirus Scanner.
The Challenge
The challenge was to find malware that an organization somewhere in the land is currently facing. Before worrying about what malware to use I first configured the test system. The system was a fresh Windows XP install with Service Pack 3. I only installed Adobe Reader version 9.3 and Java version 6 update 27. These applications were chosen to make it easier to infect the system through a drive-by. I wanted to use unknown malware as a way to level the playing field; I didn't need nor want any advantages over the antivirus scanner. To find the malware I looked at the recently listed URLs on the Malware Domain List to find any capable of doing a drive-by. I found two potential URLs.
The first URL pointed to a Blackhole exploit pack. I entered the URL into Internet Explorer and after waiting for a little bit the landing page appeared.
I gave Blackhole some more time to infect the computer before I entered the second URL. That was when I saw the first indication that the system was successfully infected with unknown malware.
The race was now officially on. Whoever finds the malware and any other information about the malware first wins.
On Your Mark, Get Set
I mounted the system to my workstation using FTK Imager so tools could run against it. I downloaded and installed the latest Avast version, followed by updating to the latest virus signature definitions. I configured Avast to scan the mounted image and all that was left was to click "Scan". With my challenger all set, I made sure I had the latest RegRipper appcompatcache plugin. Next I fired up the command prompt and entered the following command:
rip.pl -p appcompatcache -r F:\Windows\System32\config\system > C:\appcompt.txt
The command is using RegRipper’s command-line version and says to run the appcompatcache plugin against the system registry hive in the mounted image’s config folder. To make it easier to review the output I redirected it to a text file.
My challenger is all set waiting at the starting line. I’m all set just waiting for one little word.
Go!
The Avast antivirus scan was started as I pressed enter to run the RegRipper’s appcompatcache plugin against the system registry hive.
0 minutes 45 seconds
I opened the text file containing the parsed application compatibility cache. One cool thing about the plugin is that Harlan highlights any executables in a temporary folder. In the past I have quickly found malware by looking at any executables present in temp folders, so I went immediately to the end of the output. I found the following suspicious files, which I inspected more closely.
Temp paths found:
C:\Documents and Settings\Administrator\Local Settings\Temp\gtbcheck.exe
C:\Documents and Settings\Administrator\Local Settings\Temp\install_flash_player_ax.exe
C:\Documents and Settings\Administrator\Local Settings\Temp\install_flashplayer11x32ax_gtbd_chrd_dn_aih[1].exe
C:\Documents and Settings\Administrator\Local Settings\Temp\gccheck.exe
C:\Documents and Settings\Administrator\Local Settings\Temporary Internet Files\Content.IE5\4967GLU3\install_flashplayer11x32ax_gtbd_chrd_dn_aih[1].exe
C:\Documents and Settings\Administrator\Local Settings\Temp\install_flashplayer11x32ax_gtbd_chrd_dn_aih[1].bat
3 minutes 4 seconds
My hopes of a quick win came crashing down when I found out the executables in the temporary folders were no longer present on the system. I went back to the beginning of the application compatibility cache’s output and started working my way through each entry one at a time. Avast was scanning the system at a fast pace because the image was so small.
5 minutes 10 seconds
Avast was still scanning the system but it still didn’t find the malware. That was good news for me because I found another suspicious entry in the application compatibility cache.
C:\Documents and Settings\Administrator\Local Settings\Application Data\armfukk.exe
ModTime: Tue Aug 21 20:34:04 2012 Z
UpdTime: Tue Aug 21 20:38:03 2012 Z
Size : 495616 bytes
The file path drew my attention to the program and a check on the system showed it was still there. I quickly uploaded armfukk.exe to VirusTotal and stared at the Avast scan, waiting to see if it would flag the file before the VirusTotal scan completed.
VirusTotal delivered the verdict: 9 out of 42 antivirus scanners detected the armfukk.exe file as malware. Going head to head against Avast I located a piece of malware in about 5 minutes while Avast was still scanning. As you probably expected Avast still didn’t flag any files as being malicious.
Avast was still running the race as it kept scanning the system. I continued my examination by turning to my next tool of choice: a timeline. A timeline would provide a wealth of information by showing the activity around the time the armfukk.exe file was created on the system. I ran the following Sleuth Kit command to create a bodyfile containing the filesystem metadata:
fls.exe -m C: -r \\.\F: > C:\bodyfile
9 minutes 30 seconds
Avast was still chugging along scanning but it still didn’t flag any files. The bodyfile was finally created but I needed to convert it into a more readable format. I wanted the timeline in log2timeline’s csv format so I next ran the command:
log2timeline.pl -z local -f mactime -w timeline.csv C:\bodyfile
11 minutes 22 seconds
I imported the timeline into Excel and sorted the output. Just as I was getting ready to search on the “armfukk.exe” keyword Avast finally completed its scan with zero detections.
Shortly There After
The race was over but I wasn't basking in the glory of winning. I wanted to know how the malware actually infected the computer since I was so close to getting the answer. I searched on the armfukk.exe filename and found the entry showing when the file was created on the system.
There was activity showing Java was running, and five seconds before the armfukk.exe file was created I came across an interesting file in the Java cache. VirusTotal gave me all the confirmation I needed.
Moral of the Story
As I said before, maybe, just maybe in some distant future in IT and security shops across the land people will be singing songs about the race of the century. Remembering the day when man demonstrated they were needed in the process to locate malware on a system. Putting antivirus technology into perspective as a tool; a great tool to have available in the fight against malware. Remembering the day when man stood up and said "antivirus technology is not a replacement for having a process to respond to malware incidents nor is it a replacement for the people who implement that process".
Wednesday, August 15, 2012
Posted by Corey Harrell
In this Linkz edition I’m mentioning write-ups discussing tools. A range of items are covered from the registry to malware to jump lists to timelines to processes.
RegRipper Updates
Harlan has been pretty busy updating RegRipper. First RegRipper version 2.5 was released, then there were some changes to where RegRipper is hosted, along with some nice new plugins. Check out Harlan's posts for all the information. I wanted to touch on a few of the updates though. The updates to RegRipper included the ability to run directly against volume shadow copies and to parse big data. The significance of parsing big data is apparent in his new plugin that parses the shim cache, which is an awesome artifact (link up next). Another excellent addition to RegRipper is the shellbags plugin since it parses Windows 7 shell bags. Harlan's latest post Shellbags Analysis highlights the forensic significance of shell bags and why one may want to look at the information they contain. I think these are awesome updates; now one tool can be used to parse registry data when it used to take three separate tools. Not to be left out, the community has been submitting some plugins as well. To mention only a few, Hal Pomeranz provided some plugins to extract Putty and WinSCP information and Elizabeth Schweinsberg added plugins to parse different Run keys. The latest RR plugin download has the plugins submitted by the community. Seriously, if you use RegRipper and haven't checked out any of these updates then what are you waiting for?
Shim Cache
Mandiant's post Leveraging the Application Compatibility Cache in Forensic Investigations explained the forensic significance of the Windows Application Compatibility Database. Furthermore, Mandiant released the Shim Cache Parser script to parse the appcompatcache registry key in the System hive. The post, script, and information Mandiant released speak for themselves. Plain and simple, it rocks. So far the shim cache has been valuable for me on fraud and malware cases. Case in point: at times when working malware cases, programs execute on a system but the usual program execution artifacts (such as prefetch files) don't show it. I see this pretty frequently with downloaders, which are programs whose sole purpose is to download and execute additional malware. The usual program execution artifacts may not show the program running, but the shim cache was a gold mine. Not only did it reflect the downloaders executing but the information provided more context to the activity I saw in my timelines. What's even cooler than the shim cache? Well, there are now two different programs that can extract the information from the registry.
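For example, RegRipper's appcompatcache plugin can be pointed at a System hive from a mounted image the same way as any other plugin:
rip.pl -p appcompatcache -r F:\Windows\System32\config\system > C:\appcompatcache.txt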
Searching Virus Total
Continuing on with the malware topic, Didier Stevens released a virustotal-search program. The program will search for VirusTotal reports using a file's hash (MD5, SHA1, SHA256) and produces a csv file showing the results. One cool thing about the program is it only performs hash searches against VirusTotal, so a file never gets uploaded. I see numerous uses for this program since it accepts a file containing a list of hashes as input. One way I'm going to start using virustotal-search is for malware detection. One area I tend to look at for malware and exploits is the temporary folders in user profiles. It wouldn't take much to search those folders looking for any files with an executable, Java archive, or PDF file signature, then for each file found perform a search on the file's hash to determine if VirusTotal detects it as malicious. Best of all, this entire process could be automated and run in the background as you perform your examination.
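To sketch what that automation might look like, here is a minimal Python illustration (this is not Didier's tool itself; the temp path and signature list are examples, and it assumes virustotal-search accepts a text file with one hash per line):

import hashlib
import os

# Folder(s) to sweep; adjust for the user profile(s) of interest (illustrative path)
temp_dirs = [r"C:\Users\target\AppData\Local\Temp"]

# File signatures called out above: executables (MZ), Java archives (PK), PDFs (%PDF)
magics = (b"MZ", b"PK", b"%PDF")

with open("hashes.txt", "w") as out:
    for top in temp_dirs:
        for root, dirs, files in os.walk(top):
            for name in files:
                path = os.path.join(root, name)
                try:
                    with open(path, "rb") as f:
                        if f.read(4).startswith(magics):
                            f.seek(0)
                            # One MD5 per line, ready to feed to virustotal-search
                            out.write(hashlib.md5(f.read()).hexdigest() + "\n")
                except OSError:
                    continue  # locked or unreadable file; skip it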
Malware Strings
Rounding out my linkz about malware related tools comes from the Hexacorn blog. Adam released HexDive version 0.3. In Adam's own words, the concept behind HexDive is to "extract a subset of all strings from a given file/sample in order to reduce time needed for finding 'juicy' stuff – meaning: any string that can be associated with a) malware b) any other category". Using HexDive makes reviewing strings so much easier. You can think of it as applying a filter across the strings to initially see only the relevant ones typically associated with malware. Afterwards, all of the strings can be viewed using something like BinText or Strings. It's a nice data reduction technique and HexDive is now my first go-to tool when looking at strings in a suspected malicious file.
Log2timeline Updates
Log2timeline has been updated a few times since I last spoke about it on the blog. The latest release is version 0.64. There have been quite a few updates, ranging from small bug fixes to new input modules to changes in the output of some modules. To see all the updates check out the changelog.
Most of the time when I see people reference log2timeline they are creating timelines using either the default module lists (such as winxp) or log2timeline-sift. Everyone does things differently and there is nothing wrong with these approaches. Personally, neither approach exactly meets my needs. The majority of the systems I encounter have numerous user profiles stored on them, which means these profiles contain files with timestamps log2timeline extracts. Running a default module list (such as winxp) or log2timeline-sift against all the user profiles is an issue for me. Why should I include timeline data for all user accounts instead of the one or two user profiles of interest? Why include the internet history for 10 accounts when I only care about one user? Not only does it take additional time for timeline creation but it results in a lot more data than I need, thus slowing down my analysis. I take a different approach; an approach that better meets my needs for all types of cases.
I narrow my focus down to specific user accounts. I either confirm who the person of interest is, which tells me what user profiles to examine, or I check the user profile timestamps to determine which ones to focus on. What exactly does this have to do with log2timeline? The answer lies in the -e switch since it can exclude files or folders. The -e switch can be used to exclude all the user profiles I don't care about. There are 10 user profiles and I only care about 2 profiles, but I only want to run one log2timeline command. No problem if you use the -e switch. To illustrate, let's say I'm looking at the Internet Explorer history on a Windows 7 system with five user profiles: corey, sam, mike, sally b, and alice. I only need to see the browser history for the corey user account but I don't want to run multiple log2timeline commands. This is where the -e switch comes into play as shown below:
log2timeline.pl -z local -f iehistory -r -e Users\\sam,Users\\mike,"Users\\sally b",Users\\alice,"Users\\All Users" -w timeline.csv C:\
The exclusion switch eliminates anything containing the text used in the switch. I could have used sam instead of Users\\sam, but then I might miss some important files, such as anything containing the text "sam". Using a file path limits the amount of data that is skipped but will still eliminate any file or folder that falls within those user profiles (actually, anything falling under the C root directory containing the text Users\username). Notice the use of the double backslashes (\\) and the quotes; these are needed for the command to work properly. What's the command's end result? The Internet history from every profile stored in the Users folder except for the sam, mike, sally b, alice, and All Users profiles is parsed. I know most people don't run multiple log2timeline commands when generating timelines since they only pick one of the default module lists. Taking the same scenario where I'm only interested in the corey user account on a Windows 7 box, check out the command below. This will parse every Windows 7 artifact except for the excluded user profiles (note the command will impact the filesystem metadata for those accounts if the MFT is parsed as well).
log2timeline.pl -z local -f win7 -r -e Users\\sam,Users\\mike,"Users\\sally b",Users\\alice,"Users\\All Users" -w timeline.csv C:\
The end result is a timeline focused only on the user accounts of interest. Personally, I don't use the default module lists in log2timeline but I wanted to show different ways to use the -e switch.
Time and Date Website
Daylight Saving Time does not start on the same day each year. One day I was looking around the Internet for a website showing the exact dates when previous Daylight Saving Time changes occurred. I came across the timeanddate.com website. The site has some cool things. There's a converter to change the date and time from one timezone to another. There's a timezone map showing where the various timezones are located. A portion of the site even explains what Daylight Saving Time is. The icing on the cake is the world clock where you can select any timezone to get additional information, including the historical dates of when Daylight Saving Time occurred. Here is the historical information for the Eastern Timezone for the time period from the year 2000 to 2009. This will be a useful site when you need to make sure your timestamps properly take Daylight Saving Time into consideration.
Jump Lists
The day has finally arrived; over the past few months I've been seeing more Windows 7 systems than Windows XP. This means the artifacts available in the Windows 7 operating system are playing a greater role in my cases. One of those artifacts is jump lists, and Woanware released a new version of JumpLister, which parses them. This new version has the ability to parse out the DestList data and perform a lookup on the AppID.
Process, Process, Process
Despite all the awesome tools people release, they won't be much use if there isn't a process in place to use them. I could buy the best saws and hammers but they would be worthless for building a house since I don't know the process one uses to build a house. I see digital forensics tools in the same light, and in hindsight maybe I should have put these links first. Lance is back blogging over at ForensicKB and he posted a draft of the Forensic Process Lifecycle. The lifecycle covers the entire digital forensic process from the preparation steps to triage to imaging to analysis to report writing. I think this one is a gem and it's great to see others outlining a digital forensic process to follow. If you live under a rock then this next link may be a surprise, but a few months back SANS released their Digital Forensics and Incident Response poster. The poster has two sides; one outlines various Windows artifacts while the other outlines the SANS process to find malware. The artifact side is great and makes a good reference hanging on the wall. However, I really liked seeing and reading about the SANS malware detection process since I've never had the opportunity to attend their courses or read their training materials. I highly recommend for anyone to get a copy of the poster (paper and/or electronic versions). I've been slacking on updating my methodology page but over the weekend I updated a few things. The most obvious is adding links to my relevant blog posts. The other, maybe less obvious, change is that I moved around some examination steps so they are more efficient for malware cases. The steps reflect the fastest process I've found yet to not only find malware on a system but to determine how the malware got there. Just an FYI, the methodology is not limited to malware cases since I use the same process for fraud and acceptable use policy violations.
Sunday, August 12, 2012
Posted by Corey Harrell
This past week I was vacationing with my family when my blog surpassed another milestone. It has been around for two years and counting. Around my blog’s anniversary I like to reflect back on the previous year and look ahead at the upcoming one. Last year I set out to write about various topics including: investigating security incidents, attack vector artifacts, and my methodology. It shouldn’t be much of a surprise then when you look at the topics in my most read posts from the past year:
1. Dual Purpose Volatile Data Collection Script
2. Finding the Initial Infection Vector
3. Ripping Volume Shadow Copies – Introduction
4. Malware Root Cause Analysis
5. More About Volume Shadow Copies
6. Ripping VSCs – Practitioner Method
Looking at the upcoming year there’s a professional change impacting a topic I’ve been discussing lately. I’m not talking about a job change but an additional responsibility in my current position. My casework will now include a steady dose of malware cases. I’ve been hunting malware for the past few years so now I get to do it on a regular basis as part of my day job. I won’t directly discuss any cases (malware, fraud, or anything else) that I do for my employer. However, I plan to share the techniques, tools, or processes I use. Malware is going to continue to be a topic I frequently discuss from multiple angles in the upcoming year.
Besides malware and any other InfoSec or DFIR topics that have my interest, there are a few research projects on my to-do list. First and foremost is to complete my finding fraudulent documents whitepaper and scripts. The second project is to expand on my current research about the impact virtual desktop infrastructure will have on digital forensics. There are a couple of other projects I’m working on and in time I’ll mention what those are. Just a heads up, at times I’m going to be focusing on these projects so expect some time periods when there isn’t much activity with the blog. As usual, my research will be shared either through my blog or another freely available resource to the DFIR community.
Again, thanks to everyone who links back to my blog and/or publicly discusses any of my write-ups. Each time I come across someone who says that something I wrote helped them in some way, it makes all the time and work I put into the blog worth the effort. Without people forwarding along my posts, others may not be aware of information that could help them. For this I'm truly grateful. I couldn't end a reflection post without thanking all the readers who stop by jIIr. Thank you, and you won't be disappointed with what I'm gearing up to release over the next year.