
October 21, 2014 by Corey Harrell

Timeline Analysis by Categories

Organizing is what you do before you do something, so that when you do it, it is not all mixed up.

~ A. A. Milne

"Corey, at times our auditors find fraud and when they do sometimes they need help collecting and analyzing the data on the computers and network. Could you look into this digital forensic thing just in case if something comes up?" This simple request - about seven years ago - is what lead me into the digital forensic and incident response field. One of the biggest challenges I faced was getting a handle on the process one should use. At the time there was a wealth of information about tools and artifacts but there was not as much outlining a process one could use. In hindsight, my approach to the problem was simplistic but very effective. I first documented the examination steps and then organized every artifact and piece of information I learned about beneath the appropriate step. Organizing my process in this manner enabled me to carry it out without getting lost in the data. In this post I'm highlighting how this type of organization is applied to timeline analysis leveraging Plaso.

Examinations Leveraging Categories
Organizing the digital forensic process by documenting the examination steps and organizing artifacts beneath them ensured I didn't get "all mixed up" when working cases. To complete examination step "X", examine artifacts A, B, C, and D. Approaching cases this way is more effective: it prevents jumping around and it helps minimize overlooking or forgetting about artifacts of interest. Contrast this to a popular cheat sheet at the time I came into the field (links to cryptome). The cheat sheet was a great reference, but the listed items are not in an order one can use to complete an examination. What ends up happening without any organization is that you jump around the document - or around various references - based on the examination step being completed. Not an effective method.

Contrast this to the approach I took: organizing every artifact and piece of information I learned about beneath the appropriate step. I have provided glimpses of this approach in my posts Obtaining Information about the Operating System and It Is All About Program Execution. I wrote the "information about the operating system" post within the first year of starting this blog; in it I outlined a few different examination steps and then listed some of the items beneath each. I did a similar post for program execution, listing various program execution artifacts beneath that step. This organization - every registry artifact placed beneath the appropriate examination step - is what made it possible to write the auto_rip program.

Examination Steps + Artifacts = Categories
What exactly are categories? I see them as the following: Examination Steps + Artifacts = Categories. I outlined my examination steps on the jIIr methodology webpage and below are some of the steps listed for system examinations.

        * Profile the System
              - General Operating System Information
              - User Account Information
              - Software Information
              - Networking Information
              - Storage Locations Information
        * Examine the Programs Ran on the System
        * Examine the Auto-start Locations
        * Examine Host Based Logs for Activity of Interest
        * Examine Web Browsing
        * Examine User Profiles of Interest
              - User Account Configuration Information
              - User Account General Activity
              - User Account Network Activity
              - User Account File/Folder Access Activity
              - User Account Virtualization Access Activity
        * Examine Communications

In a previous post, I said "taking a closer look at the above examination steps it’s easier to see how artifacts can be organized beneath them. Take for example the step Examine the programs ran on the system. Beneath this step you can organize different artifacts such as: application compatibility cache, userassist, and muicache. The same concept applies to every step and artifact." In essence, each examination step becomes a category containing artifacts. In the same post I continued by saying "when you start looking at all the artifacts within a category you get a more accurate picture and avoid overlooking artifacts when processing a case."

This level of organization is how categories can be leveraged during examinations.

Timeline Analysis Leveraging Categories
Organizing the digital forensic process by documenting the examination steps and organizing artifacts beneath them is not limited to completing the examination steps or registry analysis. The same approach works for timeline analysis. If I'm looking to build a timeline of a system I don't want everything (aka the kitchen sink approach). I only want certain types of artifacts, which I layer together into a timeline.

To illustrate, let's use a system infected with commodity malware. The last thing I want to do is create a supertimeline using the kitchen sink approach. First, it takes way too long to generate (I'd rather start analyzing than wait). Second, the end result has a ton of data that is really not needed. The better route is to select the categories of artifacts I want, such as: file system metadata, program execution, browser history, and Windows event logs. Approaching timelines in this manner makes them faster to create and easier to analyze.

The way I created timelines in this manner was not as efficient as I wanted it to be. Basically, I looked at the timeline tools I used and what artifacts they supported. Then I organized the supported artifacts beneath the examination steps to make them into categories. When I created a timeline I would use different tools to parse the categorized artifacts I wanted and then combine the output into a single timeline in the same format. It sounds like a lot, but it didn't take that long. Hell, it was even a lot faster than doing the kitchen sink approach.

This is where Plaso comes into the picture: it makes it a lot easier to leverage categories in timeline analysis.

Plasm
Plaso is a collection of timeline analysis tools, one of which is plasm. Plasm is capable of "tagging events inside the plaso storage file." Translation: plasm organizes the artifacts into categories by tagging the events in the plaso storage file. At the time of this post the tagging occurs after the timeline data has already been parsed (as opposed to specifying the categories and only parsing those during timeline generation). The plasm user guide webpage provides a wealth of information about the tool and how to use it. I won't rehash the basics since I couldn't do justice to what is already written. Instead I'll jump right into how plasm makes leveraging categories in timeline analysis easier.

The plasm switch "--tagfile" enables a tag file to be used to define the events to tag. Plaso provides an example tag file named tag_windows.txt. This is the feature in Plaso that makes it easier to leverage categories in timeline analysis. The artifacts supported by Plaso are organized beneath the examination steps in the tag file. The image below is a portion of the tag_jiir.txt tag file I created showing the organization:


tag_jiir.txt is still a work in progress. As can be seen in the above image, the Plaso-supported artifacts are organized beneath the "Examine the Programs Ran on the System" (program execution) and "Examine the Auto-start Locations" (autostarts info) examination steps. The rest of the tag file is not shown, but the same organization was completed for the remaining examination steps. After the tag_jiir.txt tag file is applied to the Plaso storage file, timelines can be created containing only the artifacts within select categories.
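
To make the tag file format concrete: a tag definition is simply a tag name followed by indented event filters that select which parsed events receive that tag. The snippet below is only a sketch modeled on the layout of Plaso's example tag_windows.txt; the data_type values are illustrative assumptions and not the actual contents of tag_jiir.txt:

Program Execution
  data_type is 'windows:prefetch:execution'

Filesystem Info
  data_type is 'fs:stat'

Web Browsing Info
  data_type is 'msiecf:url'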

Plasm in Action
The easiest way to fully explore how plasm helps with using categories in timeline analysis is to see it in action. The test system for this demonstration is one found lying around on my portable drive; it might be new material or something I previously blogged about. For demonstration purposes I'm running log2timeline against a forensic image instead of a mounted drive. Log2timeline works against a mounted drive, but the timeline will lack creation dates (and at the time of this post there is not a way to bring $MFT data into log2timeline).

The first step is taking a quick look at the system for any indications that it may be compromised. When triaging a system for potential malware, the program execution artifacts excel at showing suspicious activity; in this case the prefetch files revealed a suspicious file, as shown below.


The file stands out for a few reasons. It's a program executing from a temp folder inside a user profile. The file handle to the zone.identifier file indicates the file came from the Internet. Lastly, the program last ran on 1/16/14 12:32:14 UTC.

Now we move on to creating the timeline with the command below (the -o switch specifies the partition I want to parse). FYI, the command below creates a kitchen sink timeline with everything being parsed. To avoid the kitchen sink, use plaso filters; creating my filter is on my to-do list after I complete the tag_jiir.txt file.

log2timeline.exe -o 2048 C:\Atad\test-data\test.dump C:\Atad\test-data\test.E01
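
For reference, the plaso filters mentioned above are targeted collection filters: a plain text file listing the paths to collect, handed to log2timeline so that only those artifacts are parsed. The lines below are only a sketch of the idea, loosely modeled on Plaso's example filter_windows.txt; the -f switch usage and the path expressions are assumptions rather than a tested configuration:

filter.txt:
/Windows/System32/config/.+
/Windows/Prefetch/.+
/Users/.+/NTUSER.DAT

log2timeline.exe -o 2048 -f C:\Atad\test-data\filter.txt C:\Atad\test-data\test.dump C:\Atad\test-data\test.E01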

The image below shows the "information that have been collected and stored inside the storage container." The Plaso tool pinfo was used to show this information.
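
Running pinfo needs little more than the path to the storage file; a minimal example, assuming pinfo is invoked the same way as the other Plaso tools in this post (just the storage file as the argument):

pinfo.exe C:\Atad\test-data\test.dump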


Now it is time to see plasm in action by tagging the events in the storage container. The command below shows how to do this using my tag_jiir.txt file.

plasm.exe tag --tagfile=C:\Atad\test-data\tag_jiir.txt C:\Atad\test-data\test.dump

The storage container information now shows the tagged events, as shown below. Notice how the tags are actually the categories for my examination steps (keep in mind some categories are not present because the image does not contain those artifacts).


Now a timeline can be exported from the storage container based on the categories I want. The program execution indicator revealed there may be malware on the system and that it came from the Internet. The categories I would want for a quick look are the following:

        - Filesystem: to see what files were created and/or modified
        - Web browsing: to correlate web browsing activity with malware
        - Program execution: to see what programs executed (this includes files such as prefetch as well as specific event logs)

The command below creates my timeline based on the categories I want (the -o switch sets the l2tcsv output format, the -w switch writes the output to a file, and the filter uses "tag contains"). FYI, a timeslice was not used since I wanted to focus on the tagging capability.

psort.exe -o L2tcsv -w C:\Atad\test-data\test-timeline.csv C:\Atad\test-data\test.dump "tag contains 'Program Execution' or tag contains 'Filesystem Info' or tag contains 'Web Browsing Info'"
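
If a timeslice were wanted, a date range could be combined with the tags in the same filter expression. The command below is only a sketch assuming psort's date filter syntax, not a command taken from this examination:

psort.exe -o L2tcsv -w C:\Atad\test-data\test-timeline.csv C:\Atad\test-data\test.dump "date > '2014-01-16 12:00:00' and date < '2014-01-16 13:00:00' and (tag contains 'Program Execution' or tag contains 'Filesystem Info' or tag contains 'Web Browsing Info')"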

The image below shows the portion of the timeline around the time of interest, which was 1/16/14 12:32:14 UTC. The timeline shows the lab user account accessing their Yahoo web email and then a file named invoice.83842.zip being downloaded from the Internet. The user then proceeded to execute the archive's contents by clicking the invoice.83842.exe executable. This infection vector is clear as day since the timeline does not contain a lot of unneeded data.


Conclusion
Plaso's tagging capability makes it easier to leverage categories in timeline analysis by producing a timeline containing only the categories one wants, so the timeline data for select artifacts can be viewed. This type of organization helps minimize getting "all mixed up" - getting lost in the data - when conducting timeline analysis.

Plaso is great and the tagging feature rocks, but as I mentioned before I used a combination of tools to create timelines. Not every tool has the capabilities I need, so combining them provides better results. In the past I had excellent results leveraging the Perl log2timeline and RegRipper to create timelines. At this point the tools are not compatible: Plaso doesn't have a TLN parser (it can't read the output of RegRipper's TLN plug-ins) and RegRipper's timeline plug-ins only output TLN. Based on Harlan's upcoming presentation my statement may not be valid for long.

In the end, leveraging categories in timeline analysis is very powerful. This train of thought is not unique to me; others (who happen to be tool developers) are looking into this as well. Kristinn is, as you can see in Plaso's support for tagging, and Harlan wrote about this in his latest Windows forensics book.


Side note: the purpose of this post was to highlight Plaso's tagging capability. However, for the best results the filtering capability should be used to reduce what items get parsed in the first place. The kitchen sink approach just takes way too long; why wait when you can analyze?

Read the original at: Journey Into Incident Response
Filed Under: Uncategorized | Tagged With: timeline, tools

February 11, 2014 by Corey Harrell

Linkz 4 Mostly Malware Related Tools

It's been a while, but here is another Linkz edition. In this edition I'm sharing information about the various tools I came across over the past few months.

Process Explorer with VirusTotal Integration

By far the most useful tool released this year is the updated Process Explorer, since it now checks running processes against VirusTotal. This added feature makes it very easy to spot malicious programs and should be a welcome toolset addition for those who are constantly battling malware. To turn on the functionality, all you need to do is select the "Check VirusTotal" option from the Options menu.

After it is selected, the new VirusTotal column appears showing the results, as shown below:


The hyperlinks in the VirusTotal results can be clicked, which brings you to the VirusTotal report. The new functionality is really cool and I can see it making life easier for those who don't have DFIR skills to find malware, such as IT folks. Hence my comment about this being the most useful tool released. The one thing I should warn others about is to think very hard before enabling the "Submit Unknown Executables" option, since this will result in files being uploaded to VirusTotal (and thus available for others to download).

Making Static Analysis Easier

I recently became aware of this tool from people tweeting about it. PEStudio "is a tool that can be used to perform the static investigation of any Windows executable binary." It quickly parses an executable file, presenting you with indicators, VirusTotal results, imports, exports, strings, and a whole lot more as shown below.




Automating Researching URLs, Domains, and IPs

The next tool up to bat automates researching domains, IPs, hashes, and URLs. It's a pretty slick tool and I can see it being an asset when you need to get information quickly. TekDefense describes Automater as "a URL/Domain, IP Address, and Md5 Hash OSINT tool aimed at making the analysis process easier for intrusion Analysts." If you are tasked with doing this type of analysis then you will definitely want to check out this tool. The screenshot below is part of the report generated for the MD5 hash ae2fc0c593fd98562f0425a06c164920; the hash was easily obtained from PEStudio.
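
For reference, Automater is driven from the command line with the target as the argument; a sketch of looking up that hash (assuming the basic Automater.py invocation, which may vary by version) would be:

python Automater.py ae2fc0c593fd98562f0425a06c164920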


Noriben - Portable Dynamic Analysis Tool

The next tool makes the dynamic analysis process a little easier. "Noriben is a Python-based script that works in conjunction with Sysinternals Procmon to automatically collect, analyze, and report on runtime indicators of malware. In a nutshell, it allows you to run your malware, hit a keypress, and get a simple text report of the sample's activities." To see this tool in action you can check out Brian Baskin's post Malware with No Strings Attached - Dynamic Analysis; it's an excellent read. In order to get a screenshot, I ran the previous sample inside a virtual machine with Noriben running.
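
For reference, the workflow is about as simple as the description makes it sound; a rough sketch of a run (assuming the plain Noriben.py invocation) is:

python Noriben.py
(detonate the sample in the analysis VM, then stop the capture with a keypress once the malware has run to get the text report)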


I'm Cuckoo for Cocoa Puffs

You can probably guess what my kids ate for breakfast, but this next tool is not child's play. Version 1 of the Cuckoo Sandbox has been released, and the download is available on their download page. Those who don't want to set up their own in-house sandbox can use Malwr (the online version) instead.
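
For those running their own instance, samples are handed to the sandbox from the command line; a minimal sketch (assuming Cuckoo 1.0's bundled submission utility and a made-up sample path) would be:

python utils/submit.py /path/to/sample.exe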

Pinpointing

The next tool comes courtesy of Kahu Security. The best way to explain the tool is to use the author's own words from his post Pinpoint Tool Released.

"There are many times where I come across a drive-by download, especially malvertisements, and it takes me awhile to figure out which file on the compromised website is infected. I wrote Pinpoint to help me find the malicious objects faster so I can provide that info to webmasters for clean-up purposes."

The post provides some examples of the tool's use, as does their most recent post Pinpointing Malicious Redirects (nice read, by the way). You can grab the tool from the tools page.

What You See May Not Be What You Should Get

I thought I'd share a post highlighting shortcomings in our tools while I'm on the topic of malware. Harlan posted his write-up Using Unicode to hide malware within the file system and it is a well-written post. The post discusses an encounter with a system impacted by malware and the anti-forensic techniques the malware used to better hide on the system. One method was setting the file attributes to hidden and system in order to hide a folder from the default view settings. The second method, and the more interesting of the two, was the use of Unicode in the file name path. What the post really highlighted was how multiple tools - tools that are typically in the DFIR toolset - do not show the Unicode in the file path. This would make it possible for anyone looking at a system to overlook the directory and possibly miss an important piece of information. This is definitely something to be aware of with the tools we use to process our cases.

Do You Know Where You Are? You're In The NTFS Jungle Baby

If you haven't visited Joakim Schicht's MFT2CSV website lately then you may have missed the tools he updated on his downloads page. The tools include: LogFileParser, which parses the NTFS $LogFile (the only open source $LogFile parser available); mft2csv, which parses the $MFT file; and UsnJrnl2Csv, which parses the Windows change journal. The next time you find yourself in the NTFS jungle you may want to visit the mft2csv site to help you find your way out.

Still Lost in the NTFS Jungle

Rounding out this linkz post is a collection of tools from Willi Ballenthin. Willi had previously released tools such as his INDX parser and python-registry. Over the past few months he has released some more NTFS tools: list-mft to timeline NTFS metadata, get-file-info to inspect $MFT records, and fuse-mft to mount an $MFT. I haven't had the time to test out these tools yet, but they're at the top of my list.

Read the original at: Journey Into Incident Response
Filed Under: Uncategorized | Tagged With: Malware, tools

