OSINT Discovery is a set of Python scripts for finding users or URLs across different social media platforms and caching services. It currently supports searching for users on the Nostr and Mastodon networks, searching for cached tweets across various archiving services, and searching for cached versions of any URL. It also includes an intelligence report generator for any domain.
Created by inforensics.ai
The nostr-user-search.py script searches for a Nostr user across multiple relays.
- Search by public key or NIP-05 identifier
- Use default relays or specify custom ones
- Read relay list from a file
- Verbose mode for detailed output
python nostr-user-search.py [-h] [-r RELAYS [RELAYS ...]] [-f FILE] [-v] identifier
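For context, the sketch below shows the kind of relay query such a lookup involves: requesting a user's kind-0 (profile metadata) event over a relay WebSocket, as defined in NIP-01. This is not the script's actual implementation; it assumes the `websocket-client` package, and the relay URL and hex public key are placeholders.

```python
# Minimal sketch of a Nostr profile lookup against a single relay (NIP-01).
# Assumes `pip install websocket-client`; relay and pubkey are placeholders.
import json
from websocket import create_connection

RELAY = "wss://relay.damus.io"          # example relay
PUBKEY = "hex-encoded-public-key-here"  # 64-char hex, not an npub

ws = create_connection(RELAY, timeout=10)
# Ask the relay for the user's kind-0 (profile metadata) event.
ws.send(json.dumps(["REQ", "profile-lookup",
                    {"kinds": [0], "authors": [PUBKEY], "limit": 1}]))

while True:
    msg = json.loads(ws.recv())
    if msg[0] == "EVENT":
        # The event content is a JSON string with fields like name/about/picture.
        print(json.dumps(json.loads(msg[2]["content"]), indent=2))
    elif msg[0] == "EOSE":  # end of stored events for this subscription
        break

ws.send(json.dumps(["CLOSE", "profile-lookup"]))
ws.close()
```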
The mastodon-user-search.py script searches for a Mastodon user across multiple instances.
- Fetch instances from the instances.social API
- Specify custom instances or read from a file
- Control minimum instance size and status
- Verbose mode for detailed output
python mastodon-user-search.py [-h] [-c COUNT] [-m MIN_USERS] [--include-down] [--include-closed] [-v] [-i INSTANCES [INSTANCES ...]] [-f FILE] username
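As an illustration of how a per-instance check can work, the sketch below queries the standard WebFinger endpoint that Mastodon instances expose. It is not necessarily the exact request the script makes; the instance names and username are placeholders, and it assumes the `requests` package.

```python
# Minimal sketch: check whether a username exists on a Mastodon instance
# via the WebFinger endpoint. Instances and username are placeholders.
import requests

def user_exists(username: str, instance: str) -> bool:
    resp = requests.get(
        f"https://{instance}/.well-known/webfinger",
        params={"resource": f"acct:{username}@{instance}"},
        timeout=10,
    )
    return resp.status_code == 200  # 404 means the account was not found

for instance in ["mastodon.social", "fosstodon.org"]:
    found = user_exists("someuser", instance)
    print(f"{'found' if found else 'not found'}: {instance}")
```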
The tweet-cache-search.py script searches for cached tweets of a specified Twitter username across multiple archiving and caching services.
- Search across multiple caching services (Wayback Machine, Google Cache, etc.)
- Option to open results in the default web browser
- Command-line interface with optional arguments
python tweet-cache-search.py [-h] [-u USERNAME] [-o]
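The sketch below shows one of the lookups a tool like this can perform: asking the Wayback Machine's availability API whether it holds a snapshot of a Twitter profile page. The username is a placeholder and the script's own set of services may differ.

```python
# Minimal sketch: ask the Wayback Machine whether it has a snapshot of a
# Twitter/X profile page. The username is a placeholder.
import requests

username = "someuser"
profile_url = f"https://twitter.com/{username}"

resp = requests.get(
    "https://archive.org/wayback/available",
    params={"url": profile_url},
    timeout=10,
)
snapshot = resp.json().get("archived_snapshots", {}).get("closest")
if snapshot:
    print(f"Cached copy: {snapshot['url']} (captured {snapshot['timestamp']})")
else:
    print("No cached copy found in the Wayback Machine.")
```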
The cache-me-outside.py script searches for cached versions of any URL across various caching and archiving services.
- Search across multiple services (Wayback Machine, Google Cache, Bing, Yandex, etc.)
- Option to open results in the default web browser
- JSON output option for easy parsing
- Automatic installation of required libraries
python cache-me-outside.py [-h] [-u URL] [-o] [-j]
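A rough sketch of the general approach is shown below: build lookup URLs for a few public caching services and optionally open them in the default browser (similar in spirit to the `-o` option). The exact services and URL patterns used by the script may differ.

```python
# Minimal sketch: build lookup URLs for a few caching/archiving services and
# optionally open them in the default browser.
import webbrowser
from urllib.parse import quote

def cache_urls(url: str) -> dict:
    encoded = quote(url, safe="")
    return {
        "Wayback Machine": f"https://web.archive.org/web/*/{url}",
        "Google Cache": f"https://webcache.googleusercontent.com/search?q=cache:{encoded}",
        "Bing": f"https://www.bing.com/search?q=url:{encoded}",
        "Yandex": f"https://yandex.com/search/?text={encoded}",
    }

results = cache_urls("https://example.com/page")
for service, lookup in results.items():
    print(f"{service}: {lookup}")
    # webbrowser.open(lookup)  # uncomment to mimic the browser-opening option
```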
- Clone the repository:
git clone https://github.com/inforensics-ai/osint-user-discovery.git
- Navigate to the project directory:
cd osint-user-discovery
- Each script will attempt to install its required dependencies when run. However, you can also install all dependencies manually:
pip install -r requirements.txt
The domain-intelligence-tool.py script performs comprehensive intelligence gathering on a specified domain.
- DNS record retrieval (A, AAAA, CNAME, MX, NS, TXT, SOA, SRV)
- SSL/TLS certificate analysis
- WHOIS information retrieval
- Web technology detection
- Subdomain enumeration
- SSL/TLS vulnerability checks
- HTTP header analysis
- Email security configuration (DMARC, SPF)
- CAA and TLSA record checks
- Reverse DNS lookups
- Domain age calculation
- SSL certificate chain analysis
- Security header checks
- Web server version detection
- DNSSEC implementation check
- IP geolocation
- SSL/TLS protocol support check
- Domain reputation check
- Robots.txt and sitemap.xml retrieval
- DNS propagation check
- HSTS preload status check
- Generation of common domain variations
- DNS zone transfer attempt
python domain-intelligence-tool.py [-h] [--json] [--markdown] [--config CONFIG] domain
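To give a feel for two of the checks listed above (DNS record retrieval and domain age), here is a hedged sketch using the `dnspython` and `python-whois` packages. It is not the tool's actual implementation, and the domain is a placeholder.

```python
# Minimal sketch of two checks: DNS record retrieval and domain age.
# Assumes `pip install dnspython python-whois`; not the tool itself.
import datetime
import dns.resolver
import whois

domain = "example.com"

# DNS record retrieval for a few common record types.
for rtype in ("A", "AAAA", "MX", "NS", "TXT"):
    try:
        for rdata in dns.resolver.resolve(domain, rtype):
            print(f"{rtype}: {rdata.to_text()}")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        print(f"{rtype}: no records")

# Domain age calculated from the WHOIS creation date.
record = whois.whois(domain)
created = record.creation_date
if isinstance(created, list):  # some registrars return multiple dates
    created = created[0]
if created:
    age_days = (datetime.datetime.now() - created).days
    print(f"Domain age: {age_days} days")
```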
- Clone the repository:
git clone https://github.com/inforensics-ai/osint-user-discovery.git
- Navigate to the project directory:
cd osint-user-discovery/domain-intelligence-tool
- Each script will attempt to install its required dependencies when run. However, you can also install all dependencies manually:
pip install -r requirements.txt
- For the Domain Intelligence Tool:
  - Create a config.json file in the same directory as the script with the following structure:

        {
          "api_keys": {
            "geoip2": "your_geoip2_api_key_here"
          },
          "markdown_output_path": "/path/to/output/directory",
          "geolite2_db_path": "/path/to/GeoLite2-City.mmdb"
        }

  - Download the GeoLite2-City.mmdb database and place it in the location specified in your config.json file.
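As a rough illustration, a script could read this config and use the local GeoLite2 database for IP geolocation as sketched below. This assumes the `geoip2` package and the key names from the example config above; the IP address is a placeholder.

```python
# Minimal sketch: load config.json and look up an IP in the GeoLite2 database.
# Assumes `pip install geoip2`; the IP address is a placeholder.
import json
import geoip2.database

with open("config.json") as fh:
    config = json.load(fh)

reader = geoip2.database.Reader(config["geolite2_db_path"])
response = reader.city("93.184.216.34")  # example IP
print(response.country.name, response.city.name, response.location.latitude)
reader.close()
```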
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
These tools are for educational and research purposes only. Always respect privacy and adhere to the terms of service of the platforms you're querying. Ensure you have permission before performing any scans or intelligence gathering on domains you do not own.
For bug reports and feature requests, please open an issue on this repository.