Links 4 Everyone

Wednesday, August 10, 2011 Posted by Corey Harrell
In this edition of Links I think there is a little bit of something for everyone, regardless of whether your interest is forensics, malware, InfoSec, security auditing, or even a good rant ….

Digital Forensic Search Updates

The Digital Forensic Search index has been slowly growing since it was put together four months ago. Last Sunday’s update brought the sites in the index to: 103 DFIR blogs, 38 DFIR websites, 13 DFIR web pages, and 2 DFIR groups. The initial focus of DFS was locating information about specific artifacts as opposed to locating tools to parse those artifacts. My reasoning was that I didn’t want to weed through a lot of irrelevant search hits. Most tools’ websites only provide a high-level overview of the artifacts a tool parses instead of in-depth information. It made sense to leave out tool-specific sites to reduce the amount of noise, but things change.

A question I ask myself at times is what tool can parse artifact XYZ. I’m not alone in asking the question because I see others asking the same thing. To make it easier to locate tools, I’m now adding tool-specific sites to the Digital Forensic Search. So far 15 websites and 7 web pages are indexed. I ran a few tests and the search results seem to be a good mixture of hits for information and tools. My testing was limited, so if anyone sees too much noise, just shoot me an email telling me who the culprit is.

Let me know of any links missing from DFS.

Windows Shortcut File Parser Update

My post Triaging My Way mentions a need I had for a command line tool to parse Windows shortcut files. In my quest for a tool I modified an existing Perl script to produce the output I wanted. One of the modifications I made was to examine all of the files in a folder and to only parse files with the lnk file extension. I was running the modified script against some shortcut files when it abruptly stopped. The parsed information from the last file only contained the file system timestamps. Examination of the file showed that it was empty, and this was what caused the script to die. I made a slight modification so the script checks each file’s header to confirm it is indeed a Windows shortcut file. I uploaded the new scripts to the Yahoo Win4n6 group and added a version number (v1.1) in the comments.
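For anyone wanting to add the same safeguard to their own script, here is a minimal sketch of the header check, written in Python rather than Perl. The function name is mine; the magic bytes are taken from the documented shortcut file header (a HeaderSize of 0x4C followed by the LinkCLSID 00021401-0000-0000-C000-000000000046).

```python
# Confirm a file really is a Windows shortcut before parsing it, so an
# empty or mislabeled file doesn't kill the script partway through a folder.
LNK_MAGIC = (b'\x4c\x00\x00\x00'                  # HeaderSize = 76 (0x4C)
             b'\x01\x14\x02\x00\x00\x00\x00\x00'  # LinkCLSID
             b'\xc0\x00\x00\x00\x00\x00\x00\x46') # 00021401-...-000000000046

def is_lnk_file(path):
    """Return True only if the file starts with a valid LNK header."""
    with open(path, 'rb') as f:
        return f.read(len(LNK_MAGIC)) == LNK_MAGIC
```

An empty file returns an empty read, fails the comparison, and is simply skipped instead of dying mid-run.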

There are always different ways to accomplish something. When faced with parsing all of the Windows shortcut files in a folder, I opted to modify an existing script to meet my needs. The Linux Sleuthing blog took a different approach in the post Windows Link Files / Using While Loops. The author uses a while loop around an existing script to parse all of the shortcut files in a folder. Their approach is definitely simpler and quicker than what I tried to do. Still, I learned a lot from the approach I took since I had to understand what modifications to make to an existing script in order to get the output I wanted.
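The loop-around-the-parser idea can be sketched the same way, shown here in Python; lnk-parse.pl is only a stand-in name for whatever single-file parser you already have.

```python
# Leave the single-file parser untouched and just feed it every .lnk in a
# folder -- the same idea as the Linux Sleuthing while loop, in Python.
import glob
import os
import subprocess

def build_commands(folder, parser='lnk-parse.pl'):
    """Return one parser command line per shortcut file in the folder."""
    return [['perl', parser, path]
            for path in sorted(glob.glob(os.path.join(folder, '*.lnk')))]

# for cmd in build_commands('/cases/shortcuts'):
#     subprocess.run(cmd)    # runs: perl lnk-parse.pl <file>
```

The advantage is the same one the blog post demonstrates: no modifications to the parser itself, just a wrapper around it.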

How to Mount a Split Image

Speaking of the Linux Sleuthing blog, they provided another useful tip in the post Mounting Split Raw Images. As the name implies, the post is about how to mount a split image in a Linux environment. I can’t remember the last time I dealt with a split image since I no longer break up images. However, when I used to create split images, I remember asking myself how to mount them in Linux. To others the question may be simple, but I didn’t have a clue besides concatenating the pieces to make a single image. The Mounting Split Raw Images post shows that sharing information – no matter how simple it may appear – will benefit someone at some point in time.
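When mounting the pieces directly isn’t an option, the concatenation fallback I mentioned can be sketched like this. The segment naming (image.001, image.002, …) is an assumption; adjust the pattern to whatever scheme your imager used.

```python
# Rebuild a single raw image from split segments so it can be mounted
# normally. Copies in chunks so a multi-GB image never sits in memory.
import glob
import shutil

def join_segments(pattern, out_path):
    """Concatenate split raw segments, sorted by name, into one image."""
    with open(out_path, 'wb') as out:
        for seg in sorted(glob.glob(pattern)):
            with open(seg, 'rb') as part:
                shutil.copyfileobj(part, out)

# join_segments('/evidence/image.0*', '/evidence/full.dd')
```

Sorting by name assumes the imager numbered the segments in order, which is the common convention for split raw output.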

$UsnJrnl Goodness

Bugbear over at Security Braindump put together a great post, Dear Diary: AntiMalwareLab.exe File_Created. I recommend anyone who will be encountering a Windows Vista or 7 system read the post, even if malware is not typically encountered during examinations. The $UsnJrnl is an NTFS change journal which is turned on by default in Vista and 7. Bugbear discusses what the $UsnJrnl is and how to manually examine it before discussing tools to automate the examination.

What I really like about the post is the way he presented the information. He explains an artifact, how to parse it, a tool to automate the parsing, and then shares an experience of how the artifact factored into one of his cases. I think the last part is important since sharing his experience provides context for why the artifact matters. His experience involved files created and deleted on a system as a result of a malware infection. Providing context makes it easier to see the impact of the $UsnJrnl on other types of investigations. For example, a recurring question I need to answer on cases is what files were deleted from a system around a certain time. Data in the $UsnJrnl may not only show when the files of interest were deleted but could highlight what other files were deleted around the same time.
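As a companion to Bugbear’s manual walkthrough, here is a minimal sketch that decodes a single change journal record using the layout Microsoft documents for USN_RECORD_V2. The helper name and the returned fields are mine, not from his post.

```python
# Decode one version-2 USN record from raw $UsnJrnl:$J data. Fixed header
# is 60 bytes; the filename is UTF-16LE at FileNameOffset. Reason flags of
# interest include 0x100 (FILE_CREATE) and 0x200 (FILE_DELETE).
import struct

def parse_usn_v2(buf, offset=0):
    """Decode a single USN_RECORD_V2 starting at offset; return a dict."""
    (length, major, minor, frn, parent_frn, usn, timestamp,
     reason, source, sec_id, attrs, name_len, name_off) = struct.unpack_from(
        '<IHHQQqqIIIIHH', buf, offset)
    name = buf[offset + name_off:
               offset + name_off + name_len].decode('utf-16-le')
    return {'usn': usn, 'timestamp': timestamp, 'reason': reason,
            'filename': name, 'record_length': length}
```

Walking the journal is then a matter of advancing by each record’s (8-byte-aligned) record_length, which is essentially what the automated tools he covers do for you.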

Memory Forensic Image for Training

While I’m on the topic of malware, I wanted to pass along a gem I found in my RSS feeds and have seen others mention. The MNIN Security Blog published Stuxnet's Footprint in Memory with Volatility 2.0 back in June, but I didn’t read it until recently. The post demonstrates Volatility 2.0’s usage by examining a memory image of a system infected with Stuxnet. A cool thing about the write-up is that the author makes available the memory image they used. This means the write-up and the memory image can be used as a guide to better understand how to use Volatility. Just download Volatility, download the memory image, read the post, and follow along by running the same commands against the memory image. Not bad for a free way to improve your Volatility skills.

Easier Way to Generate Reports from Vulnerability Scans

Different methods are used to identify known vulnerabilities on systems. Running various vulnerability scanners, web application scanners, and port scanners are all options. One of the more tedious but important steps in the process is correlating all of the tools’ outputs to identify what vulnerabilities are present, their severity, and their exposure on the network. Obtaining this kind of information from the scans has been a manual process since there wasn’t a way to automate it. James Edge over at Information Systems Auditing is trying to address this issue with something he calls the RF Project (Reporting Framework Project). The RF Project can take scans from Nessus, eEye Retina, Nmap, HP WebInspect, AppScan, AppDetective, Kismet, and GFI LANguard so custom reports can be created. Want to know the potential vulnerabilities detected by Nessus, Retina, and Nmap against server XYZ? Upload the scans to the reporting framework and create a custom report showing the answer instead of manually going through each report to identify the vulnerabilities. I tested an earlier version of the framework a few years ago when it only supported Nessus and Retina. It’s great to see he continued with the project and added support for more scanners.
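To illustrate the correlation step the RF Project automates, here is a toy sketch that groups normalized findings by host. The finding format and values here are purely illustrative; they are not the RF Project’s actual input.

```python
# Normalize each tool's output to (host, vuln, severity, tool) records,
# then group by host so one report shows every vuln and which tools saw it.
from collections import defaultdict

def correlate(findings):
    """Group findings by host, tracking which tools reported each vuln."""
    by_host = defaultdict(dict)
    for f in findings:
        entry = by_host[f['host']].setdefault(
            f['vuln'], {'severity': f['severity'], 'tools': set()})
        entry['tools'].add(f['tool'])
    return by_host

# Illustrative findings, as if parsed from three different scanners:
findings = [
    {'host': 'server-xyz', 'vuln': 'MS08-067', 'severity': 'High',
     'tool': 'Nessus'},
    {'host': 'server-xyz', 'vuln': 'MS08-067', 'severity': 'High',
     'tool': 'Retina'},
    {'host': 'server-xyz', 'vuln': 'open 445/tcp', 'severity': 'Info',
     'tool': 'Nmap'},
]
report = correlate(findings)
```

A vulnerability flagged by multiple scanners ends up as one entry listing every tool that saw it, which is exactly the cross-tool view that is so tedious to build by hand.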

James’s site has some useful stuff besides the RF Project. He has a few hacking tutorials and some technical assessment plans for external enumeration, Windows operating system enumeration, and Windows passwords.

Good InfoSec Rant

I like a good rant every once in a while. Assuming the Breach’s I do it for the Lulz explains the reason the author works in security. It’s not about the money, job security, or prestige; he works in security because it’s a calling. The post was directed at the InfoSec field, but I think the same thing applies to digital forensics. Take the following quote:

“Technology, and especially information security has always been more than a job to me. More than even a career. It's a calling. Don't tell my boss, but I'd do this even if they didn't pay me. It's what I do. I can't help it.”

I can’t speak for others, but digital forensics is the fastest-changing field I’ve ever worked in. Technology (hardware and software) is constantly changing how it stores data, and the tools I use to extract information are also evolving. Digital forensics can’t be treated as a normal 8-to-4 job with any chance of being successful. Five days a week and eight hours each day is not enough time for me to keep my knowledge and skills current on the latest technology, tool update, threat, or analysis technique. It’s not a job; it’s my passion. My passion enables me to immerse myself in DFIR so I can learn constantly and apply my skills in different ways outside of work for my employer.

I wouldn’t last if digital forensics was only a day job. Seriously, how could I put myself through some of the things we do if there were no passion? We read whitepapers dissecting artifacts and spend countless hours researching and testing to improve our skills. Doing either of these things would be brutal for someone who lacks passion for the topic. For example, I couldn’t hack it as a dentist because I lack the passion for dentistry. I wouldn’t have the willpower to read a whitepaper explaining some gum disease or spend hours studying different diagnoses. Dentistry would just be an 8-to-4 job that pays the bills until I could find something else. DFIR, on the other hand, is another story, as I spend my evenings blogging about it after spending the day working on a case.
  1. Thanks for the link back. A lot of my posts are as much a learning experience for me as for others, so I try to document everything so others can find it useful too. Keep up the good work on the blog.
