
Analyzing Data in Three Easy Steps


The three steps are:

  1. Collect information from host
  2. Process / parse collected data
  3. Start reviewing data from the CSV reports or from the Kibana or TimeSketch web UIs

Collect information from host

The CyLR tool can assist with collecting data from Windows hosts, but any tool can be used to collect a forensic image or the individual artifacts. SFTP can be used to transfer the files securely to the Skadi server.
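
For example, if the artifacts were collected into a zip archive with another tool, the standard sftp client can push the archive to the Skadi server. A minimal sketch, assuming the default skadi account and a placeholder archive name:

sftp skadi@<Skadi IP address>
sftp> put example_hostname.zip
sftp> exit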

CyLR Collection Example

To execute CyLR on the host and SFTP the data directly to the Skadi server, use the following command:

CyLR.exe -u skadi -p skadi -s <Skadi IP address> 

Sample output

Collecting File: C:\Windows\System32\config\BBI
Collecting File: C:\Windows\System32\config\BCD-Template
Collecting File: C:\Windows\System32\config\COMPONENTS
Collecting File: C:\Windows\System32\config\DEFAULT
Collecting File: C:\Windows\System32\config\DRIVERS
Collecting File: C:\Windows\System32\config\ELAM
Collecting File: C:\Windows\System32\config\SAM
Collecting File: C:\Windows\System32\config\SECURITY
Collecting File: C:\Windows\System32\config\SOFTWARE
Collecting File: C:\Windows\System32\config\SYSTEM
Collecting File: C:\Windows\System32\config\userdiff
Collecting File: C:\Windows\System32\config\VSMIDK
…
Collecting File: C:\Windows\System32\winevt\logs\OAlerts.evtx
Collecting File: C:\Windows\System32\winevt\logs\PreEmptive.evtx
Collecting File: C:\Windows\System32\winevt\logs\Security.evtx
Collecting File: C:\Windows\System32\winevt\logs\Setup.evtx
Collecting File: C:\Windows\System32\winevt\logs\System.evtx
Collecting File: C:\Windows\System32\winevt\logs\Windows Azure.evtx
Collecting File: C:\Windows\System32\winevt\logs\Windows PowerShell.evtx
Collecting File: C:\Windows\System32\drivers\etc\hosts
Collecting File: C:\$MFT
Extraction complete. 0:00:11.8929015 elapsed

NOTE:

  • The file is uploaded to the skadi user's home directory on the Skadi server (/home/skadi)
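
To confirm the upload arrived, a quick look at that directory on the Skadi server works (the path is the default noted above):

ls -lh /home/skadi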

Process / parse collected data

This write-up assumes that the source is either a .zip file of artifacts, a directory of artifacts, or a directory of forensic system image files (for example, collect.E01). Skadi uses the Cold Disk Quick Response (CDQR) tool to process individual artifacts or entire system images.

The best practice is to put all of the artifacts into one folder (or zip file).
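
As a minimal sketch, a directory of collected artifacts can be packaged into a single archive with the standard zip utility (the directory and archive names below are placeholders):

zip -r example_hostname.zip example_dirname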

Examples of using CDQR to process the data and output the results into Elasticsearch

If the data is from a Windows host and is in a .zip file, “example_hostname.zip”, use the following command:

cdqr in:example_hostname.zip out:Results -p win --max_cpu -z --es example_index

If the data is from a Windows host and is in a directory, “example_dirname”, use the following command:

cdqr in:example_dirname out:Results -p win --max_cpu --es example_index

If the data is from a Mac host and is one or more forensic image files, use the following command:

cdqr in:example_dirname/example_hostname.E01 out:Results -p mac --max_cpu --es example_index
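
After any of these commands finishes, the Results folder should contain the Plaso database, the processing log, and the SuperTimeline CSV listed in the sample output below; a quick way to confirm is:

ls -lh Results/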

Successful example output from CDQR

Assigning CDQR to the host network
The Docker network can be changed by modifying the "DOCKER_NETWORK" environment variable
Example (default Skadi mode): $env:DOCKER_NETWORK = "host"
Example (use other Docker network): $env:DOCKER_NETWORK = "skadi-backend"
docker run --network host  -v 'E:\GitHub\CDQR\Docker\cdqr.ps1:/data/GitHub/CDQR/Docker/cdqr.ps1' aorlikoski/cdqr:5.0.0 -y '/data/GitHub/CDQR/Docker/cdqr.ps1'
CDQR Version: 5.0
Plaso Version: 20190331
Number of cpu cores to use: 8
Source data: Sample_data
Destination Folder: Results
Database File: Results/Sample_data.db
SuperTimeline CSV File: Results/Sample_data.SuperTimeline.csv


Results/Sample_data.log
Processing started at: 2001-01-01 17:40:58.322694
Parsing image
"log2timeline.py" "-p" "--partition" "all" "--vss_stores" "all" "--parsers" "appcompatcache,bagmru,binary_cookies,ccleaner,chrome_cache,chrome_cookies,chrome_extension_activity,chrome_history,chrome_preferences,explorer_mountpoints2,explorer_programscache,filestat,firefox_cache,firefox_cache2,firefox_cookies,firefox_downloads,firefox_history,google_drive,java_idx,mcafee_protection,mft,mrulist_shell_item_list,mrulist_string,mrulistex_shell_item_list,mrulistex_string,mrulistex_string_and_shell_item,mrulistex_string_and_shell_item_list,msie_zone,msiecf,mstsc_rdp,mstsc_rdp_mru,network_drives,opera_global,opera_typed_history,prefetch,recycle_bin,recycle_bin_info2,rplog,safari_history,symantec_scanlog,userassist,usnjrnl,windows_boot_execute,windows_boot_verify,windows_run,windows_sam_users,windows_services,windows_shutdown,windows_task_cache,windows_timezone,windows_typed_urls,windows_usb_devices,windows_usbstor_devices,windows_version,winevt,winevtx,winfirewall,winjob,winlogon,winrar_mru,winreg,winreg_default" "--hashers" "md5" "--workers" "8" "Results/Sample_data.db" "Sample_data"
Parsing ended at: 2001-01-01 17:44:24.899715
Parsing duration was: 0:03:26.577021

Process to export to ElasticSearch started
Exporting results to the ElasticSearch server
"psort.py" "-o" "elastic" "--raw_fields" "--index_name" "case_cdqr-Sample_data" "Results/Sample_data.db"
All entries have been inserted into database with case: case_cdqr-Sample_data

Process to export to ElasticSearch completed
ElasticSearch export process duration was: 0:03:24.242369

Total  duration was: 0:06:50.819390
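
The psort.py line above shows the data landing in an Elasticsearch index whose name is prefixed with case_cdqr- (case_cdqr-Sample_data in this run). Assuming Elasticsearch is reachable on the Skadi host at the default port 9200, the index can be confirmed with:

curl 'http://localhost:9200/_cat/indices/case_cdqr-*?v'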