Plaso Filtering: The Missing Manual – Techno 2018


Plaso Filtering
The Missing Manual
September 17-19, 2018 | San Antonio, TX USA
Mark Hallman
Sr. Engineer with
SANS Research Operations Center (SROC)
11 Years in DFIR
Worked with Plaso (formerly known as log2timeline)
Certifications: CHFI, CCE, EnCE, GCFE, GCFA
Email: mark.hallman@gmail.com, mhallman@sans.org
Skype: mhallman
Twitter: @mhallman
What are we going to cover today?
• Refresher on the Plaso components
• Methods to filter in Plaso
  – Front end: image_export, log2timeline
  – Back end: psort
• Other complementary tools
  – Timeline Explorer – Eric Zimmerman
  – KAPE – triage collection tool – Eric Zimmerman
Filtering – Why is it so important?
Data reduction by targeted collections allows:
• Focus on specific areas of interest
• Speed of processing
• Speed of analysis
• Manageable output size for other tools
Evidence categories:
• User Communication
• File Download
• Program Execution
• File Opening/Creation
• File Knowledge
• Physical Location
• USB Key Usage
• Account Usage
• Browser Usage
Plaso Components & Process Flow
Can be an iterative process; the lines are blurring between collection and culling.
• image_export – extracts files from images (+VSS); other tools like KAPE also process VSS
• log2timeline – creates the Plaso storage file (sqlite)
• pinfo – provides information on log2timeline processing
• psort – processes/updates Plaso storage files (sort, filter, analysis)
• psteal – wrapper that runs log2timeline and then psort
[Process flow diagram: Collection (full disk image or triage) → Culling (image_export, CyLR/KAPE, FTK Imager) → log2timeline (file filters, parser filter, YARA) → pinfo (processing status) → psort (event filters, dynamic filters, tagging)]
image_export
• Extracts files by using a filter file
• Allows targeted extraction of specific files
• The filter file is the same format as used by log2timeline
• Significantly faster than processing the entire image
• Command line, so it is scriptable/repeatable
• VSS support: image_export can grab matching files from the VSS
• Other tools, such as KAPE, can do similar collections but…
image_export: Common Options

image_export: VSS Capability
image_export -f windows_filter.txt --vss_stores all -w nromanoff_image_export demo.E01
VSS options: --no_vss, --vss_only, --vss_stores

image_export: Export by File Extension
image_export -x "doc,docx,xls,xlsx,ppt,pptx" --vss_stores all -w nromanoff_image_export_office_docs nromanoff-c-drive.e01
Key option: -x "doc,docx,xls,xlsx,ppt,pptx"

image_export: Export by Date
image_export.py --vss_stores all -x "doc,docx,xls,xlsx,ppt,pptx" --date-filter "atime, 2013-10-22 00:00:00, 2013-10-22 23:59:59" --date-filter "crtime, 2013-10-22 00:00:00, 2013-10-22 23:59:59" -w blake_image_export_office_docs ../blake-c-drive/blake-c-drive.e01
Key option: --date-filter "TYPE, START, END" (here atime and crtime)
log2timeline
• Processes source files into the Plaso database
• Supports many image formats – raw, VHD, E01 images, mount points & other Plaso DB files
• Filtering options available:
  – File filters
  – Filter by parser
  – Filter by YARA rules (artifacts in release 20180630)
Most basic command format:
log2timeline.py OUTPUT INPUT
log2timeline.py demo.plaso demo.E01
log2timeline usage: common options
usage: log2timeline.exe [-h] [-V]
  [--parsers PARSER_LIST]
  [-f FILE_FILTER]
  [--no_vss] [--vss_only] [--vss_stores VSS_STORES]
  [--no_dependencies_check]
  [STORAGE_FILE] [SOURCE]
Example:
log2timeline.exe -z "UTC" --file_filter filter_windows.txt --no_vss --parsers prefetch,amcache,userassist,srum demo.plaso demo.E01
log2timeline: Collection Filter Files
• Filter files are a list of files to collect
• Triage approach: collect/process only what you want
• Saves time during collection and analysis
• Relevant to both image_export and log2timeline – they use the same file / file format
• Some items in a filter file may only be relevant for image_export
  – There are no plugins to process some files that should still be collected
  – Example: pagefile.sys, hiberfil.sys, etc.
log2timeline: Filter File Format
• One entry per line
• Each line defines a single location to collect/process
• Format is: FIELD 1 | SEPARATOR | FIELD 2 | SEPARATOR | FIELD 3 | ...
• Separator = slash "/"
• A field can be one of the following three options:
  – A string representing the exact directory name, case insensitive
  – A regular expression denoting the name of the directory or file
  – The name of an attribute collected during the preprocessing stage, denoted by curly brackets: {attribute_name}
• Attribute name example: {sysregistry}/.+evt
Source: https://github.com/log2timeline/plaso/wiki/Collection-Filters
Filter File Example
Complete file available on my GitHub page: https://github.com/mark-hallman/plaso_filters
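A minimal sketch of what such a filter file can look like (the specific paths below are illustrative assumptions, not the contents of the published file – see the GitHub repository for the complete, tested version):

{sysregistry}/.+evt
{systemroot}/Prefetch/.+
/Users/.+/NTUSER.DAT
/Users/.+/AppData/Roaming/Microsoft/Windows/Recent/.+

The first two lines use preprocessing attributes ({sysregistry}, {systemroot}); the last two combine literal directory names with regular expressions for the per-user paths.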
Filter Files Performance Test – Events
No file filter or parser: 1,471,272 events
Just file filter: 301,594 events
Just parsers: 384,316 events
Filter & parsers: 161,469 events
Filter Files Performance Test – Processing Time
No file filter or parser: 9,993 seconds
Just file filter: 190 seconds
Just parsers: 9,694 seconds
Filter & parsers: 147 seconds
Parsers
• Available on the front & back end
• Limit processing to specific types of artifacts (missing a "not" operator)
• Parser categories:
  – Parsers: process individual artifacts (amcache, lnk, mft, plist, prefetch …)
  – Parser plugins: process artifact categories (apple_id, bag_mru, cron, google_drive …)
  – Parser presets: sets of parser presets, parser plugins & parsers (winreg, win7, macosx …)
• Easy to create your own – covered later in the presentation
Help and a list of all parsers:
log2timeline.exe --info
log2timeline.exe --parsers list
Registry (winreg) Parsers
appcompatcache shellbags ccleaner default interface lfu mountpoints mrulistex mrulist msie zones officemru outlook run sam_users services shutdown task scheduler terminal server typedurls usb usbstor userassist winrar winver
© 2017 Rob Lee | All Rights Reserved
Windows Parsers (win_gen, winxp, win7)
Chrome, Esedb, EVT/EVTX, Filestat, Firefox, Google Drive, IE 6-9, IE 10-11, IIS, Job Files, Jumplists, LNK, McAfee Logs, Olecf, Openxml, Peer to Peer, Prefetch, Recycle Bin, Registry, Skype, Skydrive Logs, Symantec Log, Winfirewall
© 2017 Rob Lee | All Rights Reserved
Web History (webhist) Parsers
Chrome cache, Chrome cookies, Chrome extension activity, Chrome history, Firefox cache, Firefox cookies, Firefox downloads, Firefox history, Java idx, MS Index.dat, MS webcache.dat, Opera global, Opera typed history, Safari history
© 2017 Rob Lee | All Rights Reserved
Linux/Android/Mac Parsers (android, linux, macOS)
Android app usage, Android calls, Android sms, appusage, Asl log, bencode, Bsm log, Cups ipp, filestat, Google Drive, Ipod device, Ls quarantine, Firewall log, Doc versions, Mackeeper cache, keychain, securityd, macwifi, olecf, openxml, Plist airport, Plist appleaccount, Plist bluetooth, Plist default, Plist install history, Plist macuser, Plist softwareupdate, Plist spotlight, Plist spotlight volume, Plist timemachine, Pls recall, Popularity contest, selinux, skype, syslog, utmp, utmpx, webhist, xchatlog, xchatscrollback, zeitgeist
© 2017 Rob Lee | All Rights Reserved
Parsers – Do I Really Want the Defaults?
Maybe, if you really know what that means.
No parser parameters == win7
win7 ==
  recycle_bin
  amcache
  custom_destinations
  winevtx
  esedb/file_history
  olecf/olecf_automatic_destinations
  win_gen
Parsers – Create Your Own Presets
• Presets are groupings of parsers, plugins and other presets, invoked by a single name.
• In the Linux version of Plaso you can edit the presets.py file to add your own presets.
/usr/lib/python2.7/dist-packages/plaso/parsers/presets.py
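As a rough sketch (the exact variable name and layout differ between Plaso versions, so treat this as an assumption rather than the real file contents), a preset is just a named list of parsers and plugins, and adding your own is a one-entry edit. The win7_custom entry below mirrors win7 from the earlier slide, minus the event log parser (winevtx), as shown on the next slide:

CATEGORIES = {
    # ... existing presets ...
    'win7': ['recycle_bin', 'custom_destinations', 'esedb/file_history',
             'olecf/olecf_automatic_destinations', 'winevtx', 'amcache', 'win_gen'],
    # Custom preset: win7 without the Windows Event Log parser.
    'win7_custom': ['recycle_bin', 'custom_destinations', 'esedb/file_history',
                    'olecf/olecf_automatic_destinations', 'amcache', 'win_gen'],
}

After editing, the new preset can be used like any other: log2timeline.py --parsers win7_custom demo.plaso demo.E01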
Remove the Event Log Parser from the win7 Preset
Copy & rename the preset (win7_custom), then remove the event log parser (winevtx) from the copy.
/usr/lib/python2.7/dist-packages/plaso/parsers/presets.py
Remove Chrome Artifacts from win7
/usr/lib/python2.7/dist-packages/plaso/parsers/presets.py
pinfo
• Provides info on the Plaso database:
  – Command line used for log2timeline
  – Event counts, parser counts
  – Tagging & other analysis plugin runs
• Can be helpful to tune your log2timeline command options.
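A minimal usage sketch, run against a storage file from the earlier examples (the exact output varies by Plaso version):

pinfo.py nromanoff.plaso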
psort
• Backend workhorse
• Dedupping
• Filtering – this is key to using the tool effectively
• Analysis – can be used to update the database and then filter on the updates
Most basic command format:
psort.py -w OUTPUT INPUT
psort.py -w nromanoff.csv nromanoff.plaso
psort – Options of Interest
usage: psort.exe [-h] [-V]
  [--analysis PLUGIN_LIST] – a list of analysis plugin names to be loaded, or "--analysis list" to see a list of available plugins
  [--slice DATE] – create a time slice around a certain date
  [--slice_size SLICE_SIZE] – defines the slice size
  [--slicer] – create a time slice around every filter match
  [-z TIMEZONE] – explicitly define the timezone
  [-o FORMAT] – the output format; use "-o list" to see a list
  [-w OUTPUT_FILE] – output filename
  [--fields FIELDS] – which fields should be included in the output
  [--additional_fields ADDITIONAL_FIELDS] – extra output fields, in addition to the default fields
  [STORAGE_FILE] – Plaso database created by log2timeline
  [FILTER] – a filter applied to the database before it is written to the output file(s)
Common use of psort and filters
psort.py -z "UTC" -o l2tcsv -w nromanoff_l2tcsv nromanoff.plaso "date > '2012-04-03 00:00:00' AND date < '2012-04-07 00:00:00'"
Output file format: several other formats besides "l2tcsv" are available.
The quoted expression at the end is the filter statement.
psort: Output File Formats
L2TCSV Format – Which Fields to Focus on Initially
date: Date of the event, in the format MM/DD/YYYY
time: Time of day, expressed in 24h format, HH:MM:SS
timezone: Time zone that was used to call the tool with
MACB: MACB meaning of the fields, mostly for compatibility with the mactime format
source: Short name for the source. All web browser history is, for instance, WEBHIST, registry entries are REG, simple log files are LOG, and so on
sourcetype: More comprehensive description of the source, e.g. "Internet Explorer" instead of WEBHIST
type: Type of the timestamp itself, such as "Last Accessed", "Last Written", or "Last Modified", and so on
user: Username
host: Hostname
short: Short description of the entry; usually contains less text than the full description field
desc: Description field; this is where most of the information is stored, the actual parsed description of the entry
version: Version number of the timestamp object
filename: Filename, with the full path, of the file that contained the entry
inode: Inode number of the file being parsed
notes: Some input modules insert additional information in the form of a note
format: Name of the input module that was used to parse the file
extra: Additional information parsed is joined together and put here
psort: L2TCSV Output Format – Sample
Timeline Explorer – Eric Zimmerman
https://ericzimmerman.github.io/#!index.md
If you have not tried this tool, you really should.
Dynamic Output Fields (Field Name – Description – Maps to L2TCSV)
datetime – Timestamp in ISO 8601 format – no single field
timestamp_desc – Type of the timestamp itself, such as "Last Accessed", "Last Written", or "Last Modified" – type
source – Short name for the source: all web browser history is WEBHIST, registry entries are REG, simple log files are LOG, and so on – source
source_long – More comprehensive description of the source, e.g. "Internet Explorer" instead of WEBHIST – sourcetype
message – Description field; where most of the information is stored, the actual parsed description of the entry – desc
parser – Name of the input module that was used to parse the file – format
display_name – Filename with the full path of the file that contained the entry – filename
tag – Tag name populated by the psort analysis module(s) – N/A
These are the default fields for psort.
psort: Dynamic Output Format – Sample (default format)
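As a sketch, the --fields option (see the psort usage above) trims the dynamic output to just the columns you care about; the field names come from the table above, and the output filename is an arbitrary example:

psort.py -z "UTC" -o dynamic --fields "datetime,timestamp_desc,source,message,parser" -w nromanoff_trimmed.csv nromanoff.plaso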
Additional Output Fields
--additional_fields option:
• Adds additional fields to the default output list
• Works with the "dynamic" output type
• Can be context sensitive
• Unfortunately, not compatible with the "l2tcsv" output format
  – But additional fields can still be used in filters
• Look at the JSON output to discover additional fields
Default Output Fields:
1. datetime
2. timestamp_desc
3. source
4. source_long
5. message
6. parser
7. display_name
8. tag
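One way to see which extra attributes a given event type carries is to dump a filtered set of events with the json output module and read the attribute names directly. A sketch, with an assumed output filename:

psort.py -z "UTC" -o json -w lnk_events.json nromanoff.plaso "data_type is 'windows:lnk:link'"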
data_type: Additional Filterable Fields
• Can provide more granularity than any other single field
• In some cases, sourcetype, parser and data_type can provide the same results
Data Types: 130+ Identified to Date
Windows Registry:
windows:registry:amcache, windows:registry:amcache:programs, windows:registry:appcompatcache, windows:registry:installation, windows:registry:key_value, windows:registry:list, windows:registry:network, windows:registry:office_mru, windows:registry:sam_users, windows:registry:service, windows:registry:shutdown, windows:registry:userassist
FS Activity:
fs:mactime:line, fs:stat, fs:stat:ntfs, windows:lnk:link, windows:shell_item:file_entry, windows:volume:creation
Mac:
mac:appfirewall:line, mac:asl:event, mac:document_versions:file, mac:keychain:application, mac:keychain:internet, mac:securityd:line, mac:utmpx:event, mac:wifilog:line, imessage:event:chat, mackeeper:cache, macos:fseventsd:record, macosx:application_usage, macosx:lsquarantine
Complete list available on my GitHub page.
data_type Field as Filter – dynamic output
More granularity. [Screenshot: red highlights an example of more detail; green highlights other fields carrying the same information.]
data_type Field as Filter – l2tcsv output
More granularity: these two filters select the same events:
"data_type is 'firefox:places:page_visited'"
"sourcetype is 'Firefox History'"
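Put together as a full command (a sketch; the output filename is an example), the data_type version of the filter looks like this:

psort.py -z "UTC" -o l2tcsv -w firefox_page_visits.csv nromanoff.plaso "data_type is 'firefox:places:page_visited'"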
Context Sensitive Fields: LNK Files
Example: LNK file events can be filtered on all these fields
birth_droid_file_identifier – distributed link tracking birth droid file identifier
birth_droid_volume_identifier – distributed link tracking birth droid volume identifier
command_line_arguments – command line arguments
description – description of the linked item
drive_serial_number – drive serial number where the linked item resides
drive_type – drive type where the linked item resided
droid_file_identifier – distributed link tracking droid file identifier
droid_volume_identifier – distributed link tracking droid volume identifier
env_var_location – environment variables location
file_attribute_flags – file attribute flags of the linked item
file_size – size of the linked item
icon_location – icon location
link_target – shell item list of the link target
local_path – local path of the linked item
network_path – network path of the linked item
relative_path – relative path
volume_label – volume label where the linked item resided
working_directory – working directory
Data Type = windows:lnk:link (output type == dynamic)
psort.exe -z "UTC" -o dynamic --additional_fields "data_type,drive_serial_number,drive_type,droid_file_identifier" -w add_fields_drive_type.csv file_filter.plaso "data_type is 'windows:lnk:link' and drive_type == 2"
Data Type = windows:lnk:link (output type == l2tcsv)
psort.exe -z "UTC" -o l2tcsv -w filter_on_add_fields.csv file_filter.plaso "data_type is 'windows:lnk:link' and drive_type == 2"
Context Sensitive Fields: SAM Registry
Example: SAM Users events can be filtered on all these fields (data_type = 'windows:registry:sam_users')
account_rid (int) – account relative identifier (RID)
comments (str) – comments
fullname (str) – full name
key_path (str) – Windows Registry key path
login_count (int) – login count
username (str) – username
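A sketch of filtering on this data type and pulling the context-sensitive fields into the output (dynamic output is needed for --additional_fields; the output filename is an example):

psort.py -z "UTC" -o dynamic --additional_fields "account_rid,username,login_count" -w sam_users.csv nromanoff.plaso "data_type is 'windows:registry:sam_users'"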
Context Sensitive Fields: Prefetch Files
Example: Prefetch events can be filtered on all these fields (data_type = 'windows:prefetch:execution')
executable (str) – executable filename
format_version (int) – format version
mapped_files (list[str]) – mapped filenames
number_of_volumes (int) – number of volumes
path (str) – path to the executable
prefetch_hash (int) – prefetch hash
volume_device_paths (list[str]) – volume device paths
volume_serial_numbers (list[int]) – volume serial numbers
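A sketch along the same lines for prefetch events, again with an example output filename:

psort.py -z "UTC" -o dynamic --additional_fields "executable,path,prefetch_hash" -w prefetch_events.csv nromanoff.plaso "data_type is 'windows:prefetch:execution'"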
Filter Example: Evidence of Execution
psort -z "UTC" -o l2tcsv -w execution_test.csv file_filter.plaso
"message contains 'Prefetch {' or
message contains 'AppCompatCache' or
message contains 'typed the following cmd' or
message contains 'CMD typed' or
message contains 'Last run' or
message contains 'RunMRU' or
message contains 'MUICache' or
message contains 'UserAssist key' or
message contains 'Time of Launch' or
message contains 'Prefetch' or
message contains 'SHIMCACHE' or
message contains 'Scheduled' or
message contains '.pf' or
message contains 'was run' or
message contains 'UEME_' or message contains '[PROCESS]'"
Sample of the Evidence of Execution logic used by Timeline Explorer (developed by Eric Zimmerman).
This logic can be implemented as a psort filter, but it is not compatible with psort tagging.
Filter Results: Evidence of Execution
log2timeline parsers versus psort filters
Time Filtering – Date Range
psort.py -z "UTC" -o l2tcsv -w nromanoff_l2tcsv nromanoff.plaso "date > '2012-04-03 00:00:00' AND date < '2012-04-07 00:00:00'"
Time Filtering – Slice
• Provides context around a date/time
• Creates a time slice around a certain date
• Displays all events that happened X minutes before and after the defined date
• --slice_size defines the size of the slice; defaults to 5 minutes
psort.py -z "UTC" -o l2tcsv --slice '2012-04-05 22:12:00' -w nromanoff_l2tcsv nromanoff.plaso "data_type is 'windows:lnk:link' and drive_type == 2"
Time Filtering – Slicer
• Creates a time slice around every filter match
• Will save all X events before and after a filter match
• X is set with the --slice_size option; defaults to 5 events
psort.py -z "UTC" -o l2tcsv --slicer --slice_size 10 -w nromanoff_l2tcsv nromanoff.plaso "data_type is 'windows:lnk:link' and drive_type == 2"
Tagging
• Analysis plugin that updates the tag field in the Plaso DB file
• A few free-form fields, like "message" and "strings", are interesting for filtering but not available for tagging
  – Reason: these fields are not stored in the DB
• The sample tagging file, tag_windows.txt, on the Plaso GitHub has some errors:
  – data_type typos
  – Use of the strings field
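A sketch of invoking the tagging analysis plugin from psort, assuming the --tagging-file option and the null output module are available in your Plaso version (the tagging file path is an example):

psort.py --analysis tagging --tagging-file tag_windows.txt -o null nromanoff.plaso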
Testing – How did you find these other fields?
• Looking at other formats (json, etc.)
• Looking at code
• Looking at tagging files
Plaso Filtering Cheat Sheet
https://digital-forensics.sans.org/media/Plaso-Cheat-Sheet.pdf
Plaso Filter Presentation GitHub Link
The repository is a work in progress and will be updated as new info is discovered.
https://github.com/mark-hallman/plaso_filters
A Peek at KAPE
Questions
Thanks for attending – safe travels home!
https://github.com/mark-hallman/plaso_filters
