feat(dump) - extract, decode and write MFT streams #210
Overview
Our current MFT support doesn't look at the data streams stored in MFT entries. These data streams are a goldmine of information for incident responders.

This PR introduces three key features:

- Extraction of data streams and their inclusion in Chainsaw's default output
- Optional decoding of data streams via `--decode-data-streams`
- Writing extracted and decoded data streams to disk via `--data-streams-directory`

This PR should address issues #190 and #191.
Examples
Data Streams in YAML Output
By default, Chainsaw will now extract data streams and include them in its default output. Stream names and values are shown, with values encoded as hex strings:
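To illustrate what a hex-string stream value represents (the stream name and contents below are hypothetical, not taken from Chainsaw's actual output), the raw stream bytes can be recovered from the hex string like so:

```python
# Hypothetical example: a data stream value rendered as a hex string.
# This mimics the idea of the output format; it is not real Chainsaw output.
stream_name = "Zone.Identifier"
hex_value = "5b5a6f6e655472616e736665725d0d0a5a6f6e6549643d330d0a"

# Decode the hex string back into the raw stream bytes, then into text.
raw = bytes.fromhex(hex_value)
print(raw.decode("utf-8"))  # prints the INI-style Zone.Identifier content
```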
Data Streams in YAML Output with Decoding
The chainsaw dump module now has a `--decode-data-streams` flag, which asks Chainsaw to attempt to decode data streams before outputting them. This is very useful for quickly identifying Zone.Identifier streams and the contents of files (if resident).

Writing Data Streams to Disk
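Once decoded, a Zone.Identifier stream is plain INI-formatted text, so its fields are trivial to pull apart. A minimal sketch (the stream content below is illustrative, not real output):

```python
# Sketch: inspecting a decoded Zone.Identifier stream. The content here is
# illustrative. Zone.Identifier uses INI syntax, so the standard-library
# configparser handles it directly.
import configparser

decoded_stream = """[ZoneTransfer]
ZoneId=3
HostUrl=https://example.com/payload.exe
"""

parser = configparser.ConfigParser()
parser.read_string(decoded_stream)

# ZoneId 3 marks a file as downloaded from the internet (Mark of the Web).
zone_id = parser["ZoneTransfer"]["ZoneId"]
host_url = parser["ZoneTransfer"]["HostUrl"]
print(zone_id, host_url)
```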
Chainsaw can now write extracted and decoded data streams to disk via the `--data-streams-directory` option.

Hunting on the Content of Data Streams
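Conceptually, writing streams out means creating one file per stream under the target directory, with filesystem-unsafe characters in stream names sanitised. The sketch below illustrates that idea only; it is not Chainsaw's implementation:

```python
# Minimal sketch (NOT Chainsaw's implementation) of writing extracted
# streams into a target directory, one file per stream.
import re
from pathlib import Path

def write_streams(streams: dict[str, bytes], directory: str) -> list[Path]:
    out_dir = Path(directory)
    out_dir.mkdir(parents=True, exist_ok=True)
    written = []
    for name, data in streams.items():
        # Stream names like "evil.exe:Zone.Identifier" contain characters
        # that are invalid in filenames; replace them with underscores.
        safe_name = re.sub(r'[\\/:*?"<>|]', "_", name)
        path = out_dir / safe_name
        path.write_bytes(data)
        written.append(path)
    return written
```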
Now that Chainsaw extracts MFT data streams, we can write powerful detection rules on them:
Leading to:
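The idea behind hunting on stream contents can be sketched in a few lines. This is an illustration of the concept only; Chainsaw rules are written in its own YAML rule format, not in Python, and the pattern below is a made-up example:

```python
# Conceptual sketch of hunting over decoded data streams: flag any entry
# whose Zone.Identifier content has a HostUrl matching a watched pattern.
# The pattern and stream contents are hypothetical examples.
import re

SUSPICIOUS_HOST = re.compile(r"HostUrl=\S*(pastebin\.com|discordapp\.com)", re.IGNORECASE)

def hunt(streams: dict[str, str]) -> list[str]:
    """Return the names of entries whose stream content matches."""
    hits = []
    for entry, content in streams.items():
        if SUSPICIOUS_HOST.search(content):
            hits.append(entry)
    return hits
```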