The ShareSearch tool goes through hosts with SMB and NFS shares, checking credentials, looking for interesting files and grepping them for sensitive data. WARNING! This is an alpha version with a lot of bugs and spaghetti code.
pip3 install -r requirements.txt
sudo apt-get install cifs-utils
python3 sharesearch.py [options] DOMAIN/login:password HOSTS_CIDR
python3 sharesearch.py [options] WORKGROUP/login:LM:NT HOSTS_CIDR
python3 sharesearch.py -p all -w -v -H hosts.lst -C creds.lst
python3 sharesearch.py -s --share-num 2 --grep -i prev_share_results.csv 192.168.1.62
You can configure sharesearch in default.cfg.
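The exact layout of default.cfg is version-dependent; the sketch below is purely hypothetical (section and option names are invented for illustration) and only shows the kind of settings the file holds: the filename patterns used when spidering and the regular expressions used when grepping, both referenced in the option sections below.

; hypothetical sketch: real section/option names in default.cfg may differ
[spider]
; filename masks considered "interesting" during spidering
files = *.config, *.ini, *passw*, id_rsa

[grep]
; regular expressions searched inside spidered files
regexps = password\s*=, pwd\s*=, api[_-]?key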
hostlist.lst => CIDR ranges or hosts
creds.lst => lists of credentials
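For example, the input files could look as follows (the entries are illustrative; credential lines follow the DOMAIN/login:password or WORKGROUP/login:LM:NT syntax from the usage examples above):

hostlist.lst:
192.168.1.0/24
10.0.0.5

creds.lst:
CORP/j.doe:Passw0rd!
WORKGROUP/admin:AAD3B435B51404EEAAD3B435B51404EE:31D6CFE0D16AE931B73C59D7E0C089C0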
--version => show program's version number and exit
-h, --help => show this help message and exit
-i FILE, --import=FILE => Import previous csv results and print them.
-H FILE, --hosts=FILE => Get target hosts from an input file.
-m, --masscan => Use masscan instead of nmap for the initial scan of ports 445 and 139.
-e, --exist => Treat all input (-H) hosts as having SMB shares and skip the initial port scan. Ranges will be removed.
-C FILE, --creds=FILE => Get credentials from an input file.
-p [r/rw/w/no/all], --perms=[r/rw/w/no/all] => Which share permissions to look for (default r).
-w, --check-write => Check write permissions by trying to upload a file.
-v, --verbose => Be verbose. Print all findings to STDOUT.
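For instance, if the targets are already known to expose SMB, the initial port scan can be skipped and only readable/writable shares reported (the filenames are illustrative):

python3 sharesearch.py -e -p rw -v -H hosts.lst -C creds.lst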
Options for spidering files in shares. You can manage which files to spider in default.cfg.
-s, --spider => Spider interesting files in all shares.
-n SHARE_NUM, --share-num=SHARE_NUM => Numbers of shares in the imported result list to spider ([,] as delimiter, [0/a/all] for "all").
-d MAX_DEPTH, --depth=MAX_DEPTH => Maximum depth level of recursive spidering (default 5).
-f, --force => Spider everything in every share, even if it has already been spidered. By default, the whole "ADMIN$" and "C$/Windows" shares are also skipped to speed things up (SAM, SYSTEM and hosts files are still looked for).
-S FILE, --spider-print=FILE => Print imported Spider results with highlighting.
-t THREADS, --threads=THREADS => TODO: Number of threads while spidering (default 1).
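A typical spidering run over previously imported results might look like this (the share numbers and csv filename are illustrative, following the usage examples above):

python3 sharesearch.py -s -n 1,3 -d 3 -i prev_share_results.csv 192.168.1.62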
Options for grepping strings in the files found by the spider. By default, files are not grepped after spidering shares. You can manage the list of regular expressions in default.cfg.
-g, --grep => Grep previously spidered interesting files.
-k MAX_FILE_SIZE, --kb-grep-size=MAX_FILE_SIZE => Maximum file size in KB for grep (default 200).
-G FILE, --grep-print=FILE => Print imported Grep results with highlighting.
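For example, to grep files already collected by the spider with a raised size limit, and then print the results with highlighting (the csv filenames are illustrative; grep is assumed to run over results imported with -i, as in the usage examples above):

python3 sharesearch.py -g -k 500 -i prev_share_results.csv 192.168.1.62
python3 sharesearch.py -G grep_results.csv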
TODO:
- Download a specified file
- Validate the imported csv file
- Add custom regexps
- Add a flag for grep controlling how many lines to show before and after a match
- Multithreading