Plaso Filtering
The Missing Manual

September 17-19, 2018 • San Antonio, TX USA

Mark Hallman
• Sr. Engineer with SANS Research Operations Center (SROC)
• 11 years in DFIR
• Worked with Plaso, FKA log2timeline
• Certifications: CHFI, CCE, EnCE, GCFE, GCFA

Email:

mark.hallman@gmail.com
mhallman@sans.org
Skype: mhallman
Twitter: @mhallman

What are we going to cover today?
• Refresher on the Plaso components
• Methods to filter in Plaso
  • Front end
    • image_export
    • log2timeline
  • Back end
    • psort
• Other complementary tools
  • Timeline Explorer – Eric Zimmerman
  • KAPE – triage collection tool – Eric Zimmerman


Filtering – Why is it so important?
Data reduction by targeted collections allows:
• Focus on specific areas of interest
• Speed of processing
• Speed of analysis
• Manageable output size for other tools


Evidence categories:
• User Communication
• File Download
• Program Execution
• File Opening/Creation
• File Knowledge
• Physical Location
• USB Key Usage
• Account Usage
• Browser Usage

Plaso Components & Process Flow
• Can be an iterative process.
• The lines are blurring between collection & culling.
• image_export – extracts files from images (+VSS)
  • Or other tools like KAPE, which also processes VSS
• log2timeline – creates the Plaso storage file (sqlite)
• pinfo – provides information on log2timeline processing
• psort – processes/updates Plaso storage files (sort, filter, analysis)
• psteal – wrapper that runs log2timeline and then psort

Process flow (from the slide diagram):
• Collection: full disk image or triage
• Culling: image_export, CyLR / KAPE, FTK Imager
• log2timeline: file filters, parser filter, YARA
• psort: event filters, dynamic filters, tagging
• pinfo: processing status
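Since psteal gets only a one-line mention above, here is a hedged example of the wrapper (the image and output file names are placeholders); it runs log2timeline and then psort in a single step:

psteal.py --source demo.E01 -w demo_timeline.csv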

image_export
• Extracts files by using a filter file
• Allows targeted extraction of specific files
• The filter file is the same format as used by log2timeline
• Significantly faster than processing the entire image
• Command line, so it is scriptable/repeatable
• VSS support
  • image_export can grab matching files from the VSS
• Other tools (e.g., KAPE) can do similar collections but…


image_export: common options

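The options listing on this slide is a screenshot in the original deck. As a recap drawn from the examples that follow (not an exhaustive usage listing):

-f FILTER_FILE        collection filter file (same format as log2timeline)
-x EXTENSIONS         export only files with the listed extensions
--date-filter         "TYPE,START,END", where TYPE is atime, crtime, etc.
--vss_stores / --no_vss / --vss_only    Volume Shadow Copy handling
-w DIRECTORY          directory where exported files are written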

image_export: VSS Capability
image_export -f windows_filter.txt --vss_stores all -w nromanoff_image_export demo.E01

• --no_vss
• --vss_only
• --vss_stores

image_export: Export by File Extension
-x "doc,docx,xls,xlsx,ppt,pptx"

image_export -x "doc,docx,xls,xlsx,ppt,pptx" --vss_stores all -w nromanoff_image_export_office_docs nromanoff-c-drive.e01

image_export: Date Filter
"atime, 2013-10-22 00:00:00, 2013-10-22 23:59:59"
"crtime, 2013-10-22 00:00:00, 2013-10-22 23:59:59"

image_export.py --vss_stores all -x "doc,docx,xls,xlsx,ppt,pptx"
  --date-filter "atime, 2013-10-22 00:00:00, 2013-10-22 23:59:59"
  --date-filter "crtime, 2013-10-22 00:00:00, 2013-10-22 23:59:59"
  -w blake_image_export_office_docs ../blake-c-drive/blake-c-drive.e01

log2timeline
• Processes source files into the Plaso database
• Supports many image formats – raw, VHD, E01 images, mount points & other Plaso DB files
• Filtering options available
  • File filters
  • Filter by parser
  • Filter by YARA rules (artifacts in release 20180630)

Most basic command format:
log2timeline.py OUTPUT INPUT
log2timeline.py demo.plaso demo.E01

log2timeline usage: common options
usage: log2timeline.exe [-h] [-V]
                        [--parsers PARSER_LIST]
                        [-f FILE_FILTER]
                        [--no_vss] [--vss_only] [--vss_stores VSS_STORES]
                        [--no_dependencies_check]
                        [STORAGE_FILE] [SOURCE]

log2timeline.exe -z "UTC" --file_filter filter_windows.txt --no_vss --parsers prefetch,amcache,userassist,srum demo.plaso demo.E01

log2timeline: Collection Filter Files
• Filter files are a list of files to collect
• Triage approach: collect/process only what you want
  • Saves time during collection and analysis
• Relevant to image_export and log2timeline
  • Both use the same file / file format
• Some items in a filter file may only be relevant for image_export
  • Files with no parser to process them should still be collected
  • Example: pagefile.sys, hiberfil.sys, etc.


log2timeline: Filter File Format
• One entry per line
• Each line defines a single location to collect/process
• Format is: FIELD 1 | SEPARATOR | FIELD 2 | SEPARATOR | FIELD 3 | ...
• Separator = slash "/"
• A field can be one of the following three options:
  • A string representing the exact directory name, case insensitive
  • A regular expression denoting the name of the directory or file
  • The name of an attribute collected during the preprocessing stage, denoted by curly brackets: {attribute_name}
• Attribute name example: {sysregistry}/.+evt
Source: https://github.com/log2timeline/plaso/wiki/Collection-Filters


Filter File Example

Complete file available on my GitHub page: https://github.com/mark-hallman/plaso_filters
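The filter file itself appears as a screenshot in the original slides. As a hedged sketch of the format described above (illustrative entries only; see the GitHub repository for the complete file):

/(Users|Documents And Settings)/.+/NTUSER.DAT
/Windows/System32/config/(SAM|SOFTWARE|SYSTEM|SECURITY)
{sysregistry}/.+evt
/Users/.+/AppData/Roaming/Microsoft/Windows/Recent/.+
/pagefile.sys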

Filter Files – Performance Test - Events

Configuration                Number of events
Filter & parsers             161,469
Just parsers                 384,316
Just file filter             301,594
No file filter or parser     1,471,272

Filter Files – Performance Test - Time

Configuration                Run time (seconds)
Filter & parsers             147
Just parsers                 9,694
Just file filter             190
No file filter or parser     9,993

Parsers
• Available on the front & back end
• Limit processing to specific types of artifacts – missing a "not" operator (see the example below)
• Parser categories
  • Parsers: process individual artifacts (amcache, lnk, mft, plist, prefetch …)
  • Parser plugins: process artifact categories (apple_id, bag_mru, cron, google_drive …)
  • Parser presets: sets of parsers, parser plugins & other presets (winreg, win7, macosx …)
    • Easy to create your own – covered later in the presentation
• Help and list of all parsers
  • log2timeline.exe --info
  • log2timeline.exe --parsers list
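As a hedged illustration of limiting processing (file names are placeholders; the parser and preset names are taken from this deck), individual parsers and presets are both passed through --parsers:

log2timeline.py --parsers "prefetch,winevtx,amcache" timeline.plaso image.E01
log2timeline.py --parsers "webhist" timeline.plaso image.E01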

Registry (winreg) Parsers
appcompatcache, shellbags, ccleaner, default, interface, lfu, mountpoints, mrulistex, mrulist, msie zones, officemru, outlook, run, sam_users, services, shutdown, task scheduler, terminal server, typedurls, usb, usbstor, userassist, winrar, winver

© 2017 Rob Lee | All Rights Reserved

Windows Parsers (win_gen, winxp, win7)
Chrome, Esedb, EVT/EVTX, Filestat, Firefox, Google Drive, IE 6-9, IE 10-11, IIS, Job Files, Jumplists, LNK, McAfee Logs, Olecf, Openxml, Peer to Peer, Prefetch, Recycle Bin, Registry, Skype, Skydrive Logs, Symantec Log, Winfirewall

© 2017 Rob Lee | All Rights Reserved

Web History (webhist) Parsers
Chrome cache, Chrome cookies, Chrome extension activity, Chrome history, Firefox cache, Firefox cookies, Firefox downloads, Firefox history, Java idx, MS Index.dat, MS webcache.dat, Opera global, Opera typed history, Safari history

© 2017 Rob Lee | All Rights Reserved

Linux/Android/Mac (android, linux, macOS)
Android app usage, Android calls, Android sms, appusage, Cups ipp, filestat, Google Drive, Mackeeper cache, keychain, Plist appleaccount, Plist spotlight volume, Asl log, bencode, Bsm log, Ipod device, Ls quarantine, Firewall log, Doc versions, securityd, macwifi, olecf, openxml, Plist airport, Plist bluetooth, Plist default, Plist install history, Plist macuser, Plist softwareupdate, Plist spotlight, Plist timemachine, Pls recall, Popularity contest, selinux, skype, syslog, utmp, utmpx, webhist, xchatlog, xchatscrollback, zeitgeist

© 2017 Rob Lee | All Rights Reserved

Parsers – Do I Really Want the Defaults?
• Maybe, if you really know what that means.
• No parser parameters == win7 (see the equivalence example below)
• win7 ==
  • recycle_bin
  • amcache
  • custom_destinations
  • winevtx
  • esedb/file_history
  • olecf/olecf_automatic_destinations
  • win_gen
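In other words, under the default behavior described above these two invocations should be equivalent (a hedged illustration; the file names are placeholders):

log2timeline.py timeline.plaso image.E01
log2timeline.py --parsers "win7" timeline.plaso image.E01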

Parsers – Create Your Own Presets
• Presets are groupings of parsers, plugins and other presets, invoked by a single name.
• In the Linux version of Plaso you can edit the presets.py file to add your own presets:
  /usr/lib/python2.7/dist-packages/plaso/parsers/presets.py


Remove Event Log from win7 Parser
• Copy the win7 preset & rename it (win7_custom)
• Edit: /usr/lib/python2.7/dist-packages/plaso/parsers/presets.py

Remove Chrome Artifacts from win7
• Edit: /usr/lib/python2.7/dist-packages/plaso/parsers/presets.py
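The presets.py edits themselves are shown only as screenshots in the original deck. Once a custom preset such as win7_custom is defined there, invoking it is just a --parsers reference (a hedged sketch; the file names are placeholders):

log2timeline.py --parsers "win7_custom" timeline_no_evtx.plaso demo.E01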

pinfo
• Provides info on the Plaso database
• Command line used for log2timeline
• Event counts, parser counts
• Tagging & other analysis plugin runs
• Can be helpful to tune your log2timeline command options

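A minimal invocation against the storage file used in this deck's examples (pinfo prints the processing information described above; check pinfo.py -h for additional options in your version):

pinfo.py nromanoff.plaso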

psort
• Backend workhorse
• Deduplication
• Filtering – this is key to using the tool effectively
• Analysis – can be used to update the database and then filter on the updates

Most basic command format:
psort.py -w OUTPUT INPUT
psort.py -w nromanoff.csv nromanoff.plaso

psort – Options of Interest
usage: psort.exe [-h] [-V]
• [--analysis PLUGIN_LIST] - a list of analysis plugin names to be loaded, or "--analysis list" to see the available plugins
• [--slice DATE] - create a time slice around a certain date
• [--slice_size SLICE_SIZE] - defines the slice size
• [--slicer] - create a time slice around every filter match
• [-z TIMEZONE] - explicitly define the timezone
• [-o FORMAT] - the output format; use "-o list" to see a list
• [-w OUTPUT_FILE] - output filename
• [--fields FIELDS] - which fields should be included in the output
• [--additional_fields ADDITIONAL_FIELDS] - extra output fields, in addition to the defaults
• [STORAGE_FILE] - Plaso database created by log2timeline
• [FILTER] - a filter applied to the database before it is written to the output file(s)


Common use of psort and filters
Output file format: several other formats besides "l2tcsv" are available.

psort.py -z "UTC" -o l2tcsv -w nromanoff_l2tcsv nromanoff.plaso
  "date > '2012-04-03 00:00:00' AND date < '2012-04-07 00:00:00'"

The quoted expression at the end is the filter statement.


psort: Output File Formats

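The format list itself is a screenshot in the original deck; it can be reproduced with the "-o list" option noted above. l2tcsv, dynamic, and json are the formats exercised elsewhere in this deck:

psort.py -o list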

L2TCSV Format – Which Fields to Focus on Initially
Date:       Date of the event, in the format MM/DD/YYYY.
Time:       Time of day, expressed in 24h format, HH:MM:SS.
Timezone:   Time zone that was used to call the tool with.
MACB:       MACB meaning of the fields, mostly for compatibility with the mactime format.
source:     Short name for the source. All web browser history is, for instance, WEBHIST, registry entries are REG, simple log files are LOG, and so on.
sourcetype: More comprehensive description of the source, "Internet Explorer" instead of WEBHIST, etc.
type:       Type of the timestamp itself, such as "Last Accessed," "Last Written," or "Last Modified," and so on.
user:       Username.
host:       Hostname.
short:      Short description of the entry, usually contains less text than the full description field.
desc:       Description field; this is where most of the information is stored, the actual parsed description of the entry.
version:    Version number of the timestamp object.
filename:   Filename with the full path of the file that contained the entry.
inode:      Inode number of the file being parsed.
notes:      Some input modules insert additional information in the form of a note.
format:     Name of the input module that was used to parse the file.
extra:      Additional information parsed is joined together and put here.


psort: L2TCSV Output Format - Sample

Timeline Explorer – Eric Zimmerman
If you have not tried this tool, you really should.
https://ericzimmerman.github.io/#!index.md

Dynamic Output Fields
These are the default fields for psort (L2TCSV mapping in parentheses).

• datetime – Timestamp in ISO 8601 format (no single L2TCSV field)
• timestamp_desc – Type of the timestamp itself, such as "Last Accessed," "Last Written," or "Last Modified," and so on (L2TCSV: type)
• source – Short name for the source. All web browser history is, for instance, WEBHIST, registry entries are REG, simple log files are LOG, and so on (L2TCSV: source)
• source_long – More comprehensive description of the source, "Internet Explorer" instead of WEBHIST, etc. (L2TCSV: sourcetype)
• message – Description field; this is where most of the information is stored, the actual parsed description of the entry (L2TCSV: desc)
• parser – Name of the input module that was used to parse the file (L2TCSV: format)
• display_name – Filename with the full path of the file that contained the entry (L2TCSV: filename)
• tag – Tag name populated by the psort analysis module(s) (L2TCSV: N/A)

psort: Dynamic Output Format - Sample
Default Format

Additional Output Fields
• --additional_fields option
• Adds additional fields to the default output list
• Option works with the "dynamic" output type
• Can be context sensitive
• Unfortunately, not compatible with the "l2tcsv" output format
  • But additional fields can still be used in filters
• Look at the JSON output to discover additional fields (example below)

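One way to discover the extra fields an event carries is to write it out in a JSON-based format and inspect the keys; a hedged sketch using the LNK events from the later examples:

psort.py -o json -w lnk_events.json nromanoff.plaso "data_type is 'windows:lnk:link'"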

Default Output Fields
1. datetime
2. timestamp_desc
3. source
4. source_long
5. message
6. parser
7. display_name
8. tag

data_type: Additional Filterable Fields
• Can provide more granularity than any other single field
• In some cases, sourcetype, parser and data_type can provide the same results


Data Types: 130+ Identified to Date (sample shown)

Windows Registry: windows:registry:amcache, windows:registry:amcache:programs, windows:registry:appcompatcache, windows:registry:installation, windows:registry:key_value, windows:registry:list, windows:registry:network, windows:registry:office_mru, windows:registry:sam_users, windows:registry:service, windows:registry:shutdown, windows:registry:userassist

FS Activity & other Windows artifacts: fs:mactime:line, fs:stat, fs:stat:ntfs, windows:lnk:link, windows:shell_item:file_entry, windows:volume:creation

Mac: mac:appfirewall:line, mac:asl:event, mac:document_versions:file, mac:keychain:application, mac:keychain:internet, mac:securityd:line, mac:utmpx:event, mac:wifilog:line, mackeeper:cache, macos:fseventsd:record, macosx:application_usage, macosx:lsquarantine, imessage:event:chat

Browser: firefox:cache:record, firefox:cookie:entry, firefox:downloads:download, firefox:places:bookmark, firefox:places:bookmark_annotation, firefox:places:bookmark_folder, firefox:places:page_visited, chrome:cache:entry, chrome:cookie:entry, chrome:extension_activity:activity_log, chrome:history:file_downloaded, chrome:history:page_visited, chrome:preferences:clear_history

Complete list available on my GitHub page: https://github.com/mark-hallman/plaso_filters

data_type Field as Filter – dynamic output
More granularity:
• Red (in the slide screenshot): example of more detail
• Green: example of other fields with the same information

data_type Field as Filter – l2tcsv output
More granularity. These two filters return similar results:
"sourcetype is 'Firefox History'"
"data_type is 'firefox:places:page_visited'"
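Put into a full command, either filter above can be applied directly (a hedged sketch; the storage and output file names follow the earlier examples):

psort.py -z "UTC" -o l2tcsv -w firefox_history.csv nromanoff.plaso "data_type is 'firefox:places:page_visited'"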

Context Sensitive Fields: LNK Files
Example: LNK file events can be filtered on all of these fields.

• birth_droid_file_identifier – distributed link tracking birth droid file identifier
• birth_droid_volume_identifier – distributed link tracking birth droid volume identifier
• command_line_arguments – command line arguments
• description – description of the linked item
• drive_serial_number – drive serial number where the linked item resides
• drive_type – drive type where the linked item resided
• droid_file_identifier – distributed link tracking droid file identifier
• droid_volume_identifier – distributed link tracking droid volume identifier
• env_var_location – environment variables location
• file_attribute_flags – file attribute flags of the linked item
• file_size – size of the linked item
• icon_location – icon location
• link_target – shell item list of the link target
• local_path – local path of the linked item
• network_path – network path of the linked item
• relative_path – relative path
• volume_label – volume label where the linked item resided
• working_directory – working directory

Data type = windows:lnk:link, output type = dynamic:

psort.exe -z "UTC" -o dynamic --additional_fields "data_type,drive_serial_number,drive_type,droid_file_identifier" -w add_fields_drive_type.csv file_filter.plaso "data_type is 'windows:lnk:link' and drive_type == 2"

Data type = windows:lnk:link, output type = l2tcsv:

psort.exe -z "UTC" -o l2tcsv -w filter_on_add_fields.csv file_filter.plaso "data_type is 'windows:lnk:link' and drive_type == 2"

Context Sensitive Fields: SAM Registry
Example: SAM Users events can be filtered on all of these fields.
data_type = 'windows:registry:sam_users'

• account_rid (int) – account relative identifier (RID)
• comments (str) – comments
• fullname (str) – full name
• key_path (str) – Windows Registry key path
• login_count (int) – login count
• username (str) – username
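A hedged example of filtering on these fields, using the same filter syntax as the rest of this deck (the account name and output file are placeholders):

psort.py -z "UTC" -o dynamic --additional_fields "account_rid,login_count" -w sam_users.csv nromanoff.plaso "data_type is 'windows:registry:sam_users' and username contains 'admin'"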

Context Sensitive Fields: Prefetch Files
Example: Prefetch events can be filtered on all of these fields.
data_type = 'windows:prefetch:execution'

• executable (str) – executable filename
• format_version (int) – format version
• mapped_files (list[str]) – mapped filenames
• number_of_volumes (int) – number of volumes
• path (str) – path to the executable
• prefetch_hash (int) – prefetch hash
• volume_device_paths (list[str]) – volume device paths
• volume_serial_numbers (list[int]) – volume serial numbers
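A hedged example combining these fields with the filter syntax used elsewhere in this deck (the executable name and output file are placeholders):

psort.py -z "UTC" -o dynamic --additional_fields "executable,path" -w prefetch_cmd.csv nromanoff.plaso "data_type is 'windows:prefetch:execution' and executable contains 'CMD.EXE'"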

Filter Example: Evidence of Execution
psort -z "UTC" -o l2tcsv -w execution_test.csv file_filter.plaso
"message contains 'Prefetch {' or
message contains 'AppCompatCache' or
message contains 'typed the following cmd' or
message contains 'CMD typed' or
message contains 'Last run' or
message contains 'RunMRU' or
message contains 'MUICache' or
message contains 'UserAssist key' or
message contains 'Time of Launch' or
message contains 'Prefetch' or
message contains 'SHIMCACHE' or
message contains 'Scheduled' or
message contains '.pf' or
message contains 'was run' or
message contains 'UEME_' or message contains '[PROCESS]'"

• Sample of the Evidence of Execution logic used by Timeline Explorer, developed by Eric Zimmerman.
• This logic can be implemented as a psort filter.
• This logic is not compatible with psort tagging.

Filter Results: Evidence of Execution

psort -z "UTC" -o l2tcsv -w execution_test.csv file_filter.plaso "message contains 'Prefetch {' or
message contains 'AppCompatCache' or message contains 'typed the following cmd' or message
contains 'CMD typed' or message contains 'Last run' or message contains 'RunMRU' or message
contains 'MUICache' or message contains 'UserAssist key' or message contains 'Time of Launch' or
message contains 'Prefetch' or message contains 'SHIMCACHE' or message contains 'Scheduled' or
message contains '.pf' or message contains 'was run' or message contains 'UEME_' or message
contains '[PROCESS]'"

log2timeline parsers versus psort filters

Time Filtering – Date Range

psort.py -z "UTC" -o l2tcsv -w nromanoff_l2tcsv nromanoff.plaso
"date > '2012-04-03 00:00:00' AND date < '2012-04-07 00:00:00'"

Time Filtering – Slice
• Provides context around a date/time
• Creates a time slice around a certain date
• Displays all events that happened X minutes before and after the defined date
  • --slice_size defines the size of the slice
  • Defaults to 5 minutes

psort.py -z "UTC" -o l2tcsv --slice '2012-04-05 22:12:00' -w nromanoff_l2tcsv nromanoff.plaso "data_type is 'windows:lnk:link' and drive_type == 2"

Time Filtering – Slicer
• Creates a time slice around every filter match
• Will save all X events before and after a filter match
• X is set with the --slice_size option
• Defaults to 5 events

psort.py -z "UTC" -o l2tcsv --slicer --slice_size 10 -w nromanoff_l2tcsv nromanoff.plaso "data_type is 'windows:lnk:link' and drive_type == 2"

Tagging
• Analysis plugin to update the tag field in the DB file
• There are a few free-form fields like "message" and "strings" that are interesting for filtering but … not available for tagging
  • Reason: these fields are not stored in the DB
• The sample tagging file, tag_windows.txt, on the Plaso GitHub has some errors
  • data_type typos
  • Use of the strings field

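A hedged sketch of running the tagging analysis plugin against the storage file from the earlier examples; --analysis appears in the psort options above, while --tagging-file and the null output module come from the Plaso documentation and should be verified against your version (note the deck's warning about errors in the sample tag_windows.txt):

psort.py -o null --analysis tagging --tagging-file tag_windows.txt nromanoff.plaso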

Testing – How did you find these other fields?
• Looking at other formats (json, etc).
• Looking at code
• Looking at tagging files


Plaso Filtering Cheat Sheet

https://digital-forensics.sans.org/media/Plaso-Cheat-Sheet.pdf

Plaso Filter Presentation GitHub Link
• Repository is a work in progress – it will be updated as new info is discovered
• Link: https://github.com/mark-hallman/plaso_filters


A Peek at KAPE

Questions
Thanks for attending – Safe Travels home

https://github.com/mark-hallman/plaso_filters


