Wappacvez is a command-line tool that analyzes a web application by using a dockerized Wappalyzer. It then extracts the software for which a version is detected, and finally employs the uCVE tool to search for associated CVEs. The output can be exported in HTML or CSV format, depending on the user's preference.
- Linux or Mac
- Go (version 1.16+)
- Docker
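Before installing, you can confirm the prerequisites are available; a quick sanity check with standard commands (not part of Wappacvez itself):
go version        # should report go1.16 or newer
docker --version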
To install Wappacvez, run the following command:
go install -v github.com/shockz-offsec/wappacvez@latest
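Note that go install places the binary in $GOBIN (by default $HOME/go/bin); if that directory is not already on your PATH, you may need something like:
export PATH="$PATH:$HOME/go/bin"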
or by building from the repository:
git clone https://github.com/shockz-offsec/Wappacvez.git
cd Wappacvez
go build -o wappacvez wappacvez.go
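To make the freshly built binary available system-wide, one common option (assuming a standard Linux filesystem layout) is:
sudo mv wappacvez /usr/local/bin/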
wappacvez -u <url> [-cvss value] [-lg value] [-oHTML value.html] [-oCSV value.csv]
-u : URL to scan (mandatory)
-cvss : Filter vulnerabilities by CVSS severity [critical,high,medium,low,none] (default: all)
-lg : Set the language of the report [en,es] (default: en)
-oHTML : Save the CVE list to an HTML file [filename] (default: report.html)
-oCSV : Save the CVE list to a CSV file [filename]
| The only mandatory argument is the URL.
wappacvez -u "https://www.nasa.gov" -oHTML "nasa.html" -cvss critical,high
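A CSV export works the same way; for example (the target URL and filename here are placeholders):
wappacvez -u "https://example.com" -cvss critical,high -oCSV "example.csv" -lg es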
Output
Wappacvez will proceed to install Docker, build its custom Wappalyzer image, and install uCVE on the system.
| Due to the limitations of the Wappalyzer core compared to the browser extension, some websites may not have every software version detected.
| The official Wappalyzer API was considered, but its free tier imposes tighter limits on queries and results.
Dockerized version of Wappalyzer developed for this tool.
https://hub.docker.com/r/shockzoffsec/wappalyzer
The following command installs and runs the latest available version:
docker run --rm shockzoffsec/wappalyzer:latest <url> [arguments]
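For example, a one-off scan with pretty-printed JSON output (example.com is a placeholder target):
docker run --rm shockzoffsec/wappalyzer:latest https://example.com --pretty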
All Wappalyzer options are allowed.
Usage:
wappalyzer <url> [options]
Examples:
wappalyzer https://www.example.com
node cli.js https://www.example.com -r -D 3 -m 50 -H "Cookie: username=admin"
docker wappalyzer/cli https://www.example.com --pretty
Options:
-b, --batch-size=... Process links in batches
-d, --debug Output debug messages
-t, --delay=ms Wait for ms milliseconds between requests
-h, --help This text
-H, --header Extra header to send with requests
--html-max-cols=... Limit the number of HTML characters per line processed
--html-max-rows=... Limit the number of HTML lines processed
-D, --max-depth=... Don't analyse pages more than num levels deep
-m, --max-urls=... Exit when num URLs have been analysed
-w, --max-wait=... Wait no more than ms milliseconds for page resources to load
-p, --probe=[basic|full] Perform a deeper scan by performing additional requests and inspecting DNS records
-P, --pretty Pretty-print JSON output
--proxy=... Proxy URL, e.g. 'http://user:pass@proxy:8080'
-r, --recursive Follow links on pages (crawler)
-a, --user-agent=... Set the user agent string
-n, --no-scripts Disable JavaScript on web pages
-N, --no-redirect Disable cross-domain redirects
-e, --extended Output additional information
--local-storage=... JSON object to use as local storage
--session-storage=... JSON object to use as session storage
--defer=ms Defer scan for ms milliseconds after page load
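Options can be combined as with plain Wappalyzer; for instance, a recursive crawl limited to depth 2 and 20 URLs with pretty-printed output (the target and limits are illustrative):
docker run --rm shockzoffsec/wappalyzer:latest https://example.com -r -D 2 -m 20 --pretty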
This tool is licensed under the GPL-3.0 License.