Get your privacy back
Generate fake web browsing and mitigate tracking
PartyLoud is a highly configurable and straightforward free tool that helps you prevent tracking directly from your Linux terminal, no special skills required. Once started, you can forget it is running. It provides several flags; each flag lets you customize your experience and change PartyLoud's behaviour according to your needs.
Please submit bugs and feature requests and help me to continuously improve this project.
For questions / feedback please contact me Here
- Simple. 3 files only, no installation required, just clone this repo and you're ready to go.
- Powerful. Thread-based navigation.
- Stealthy. Optimized to emulate user navigation.
- Portable. You can use this script on any Unix-based OS.
This project was inspired by noisy.py
1. URLs and keywords are loaded (either from partyloud.conf and badwords or from user-defined files)
2. If the proxy flag has been used, the proxy config is tested
3. For each URL in the URL list a thread is started; each thread has an associated user agent
4. Each thread starts by sending an HTTP request to the given URL
5. The response is filtered using the keywords in order to prevent 404s and malformed URLs
6. A new URL is chosen from the list generated after filtering
7. The current thread sleeps for a random time
8. Steps 4 to 7 are repeated using the new URL until the user sends a kill signal (CTRL-C or the Enter key); see the sketch below
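To make the flow concrete, here is a minimal sketch of what a single engine thread does, assuming GNU tools (shuf) are available. This is an illustration of the technique, not the actual partyloud.sh code, which also handles proxies, DNS rotation, and error recovery:

```bash
URL="https://example.org"                      # starting URL from the URL list
UA="Mozilla/5.0 (X11; Linux x86_64)"           # spoofed user agent for this thread

while true; do
    # Fetch the page and collect absolute links, dropping any URL that
    # matches a keyword in the badwords blocklist
    LINKS=$(curl -s -L -A "$UA" "$URL" \
        | grep -Eo 'href="http[^"]+"' \
        | cut -d'"' -f2 \
        | grep -v -f badwords)

    # Pick the next URL at random; fall back to the current one if the
    # filtered list came back empty
    NEXT=$(echo "$LINKS" | shuf -n 1)
    URL=${NEXT:-$URL}

    sleep $((RANDOM % 10 + 1))                 # random pause between requests
done
```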
- Configurable URL list and blocklist
- Random DNS Mode : each request is done on a different DNS Server
- Multi-threaded request engine (the number of threads equals the number of URLs in partyloud.conf)
- Error recovery mechanism to protect Engines from failures
- Spoofed user agents hinder fingerprinting (each engine has a different user agent)
- Dynamic UI
Clone the repository:
git clone https://github.com/realtho/PartyLoud.git
Navigate to the directory and make the script executable:
cd PartyLoud
chmod +x partyloud.sh
Run 'partyloud':
./partyloud.sh
Usage: ./partyloud.sh [options...]
-d --dns <file> DNS Servers are sourced from specified FILE,
each request will use a different DNS Server
in the list
!!WARNING THIS FEATURE IS EXPERIMENTAL!!
!!PLEASE LET ME KNOW ISSUES ON GITHUB !!
-l --url-list <file> read URL list from specified FILE
-b --blocklist <file> read blocklist from specified FILE
-p --http-proxy <http://ip:port> set an HTTP proxy
-s --https-proxy <https://ip:port> set a HTTPS proxy
-n --no-wait disable wait between one request and another
-h --help display this help
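As an example, the flags can be combined freely; the file names and proxy address below are placeholders:

```bash
./partyloud.sh -l myurls.txt -b myblocklist.txt -p http://127.0.0.1:8080
```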
In the current release there is no input validation on files.
If you find bugs or have suggestions on how to improve these features, please help me by opening issues on GitHub.
Default files are located in:
Please note that the file name and extension are not important; only the content of the files matters.
badwords - Keyword-based blocklist
badwords is a keyword-based blocklist used to filter out non-HTML content such as images, documents, and so on.
The default config has been created after several weeks of testing. If you really think you need a custom blocklist, my suggestion is to start by copying and modifying the default config according to your needs.
Here are some hints on how to create a great blocklist file:
DO ✅ | DON'T 🚫 |
---|---|
Use only ASCII chars | Define one-site-only rules |
Try to keep the rules as general as possible | Define case-sensitive rules |
Prefer relative paths | Place more than one rule per line |
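For illustration only (these entries are not taken from the shipped default), a blocklist following the hints above would contain one short, general, ASCII-only rule per line:

```
.css
.js
.png
.jpg
login
signup
```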
partyloud.conf - URL List
partyloud.conf is a URL list used as the starting point for the fake-navigation generators.
The goal here is to create a good list of sites containing a lot of URLs.
Aside from suggesting that you avoid Google, YouTube, and social-network links, I've really no hints for you.
Note #1 - To work properly, the URLs must be well-formed
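For example, a minimal conf file could look like this (illustrative entries, not the shipped default; note that every URL includes its scheme):

```
https://www.bbc.com
https://www.wikipedia.org
https://www.nytimes.com
```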
DNSList - DNS List
DNSList is a list of DNS servers used as the argument for the random DNS feature. Random DNS is not enabled by default, so the "default file" is really just a guideline and a test used while developing the feature to see if everything was working as expected.
The only suggestion here is to add as many addresses as possible to increase randomness.
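An illustrative DNSList using well-known public resolvers might look like this:

```
1.1.1.1
8.8.8.8
8.8.4.4
9.9.9.9
208.67.222.222
```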
Isn't this literally just a CLI-based frontend to curl?
The core of the script is a curl request, but this tool does more than that. When you run the script, several threads are started. Each thread makes a different HTTP request and parses the output to choose the next URL, simulating web navigation. Unless the user stops the script (either by pressing Enter or via CTRL-C), it will stay alive.
How does the error recovery mechanism work?
Error recovery mechanism is an elegant way of saying that if the HTTP request returns a status code starting with 4 or 5 (an error), the script will use a backup URL in order to continue normal execution.
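In shell terms the idea is roughly the following sketch (not the actual code; BACKUP_URL is a stand-in name for the fallback):

```bash
# Ask curl for the status code only, discarding the body
STATUS=$(curl -s -o /dev/null -w "%{http_code}" "$URL")

case "$STATUS" in
    4*|5*) URL="$BACKUP_URL" ;;  # 4xx/5xx: fall back so the engine keeps running
esac
```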
May I fork your project?
Look Here
How easy is this fake traffic to detect?
Unfortunately it's pretty easy, but keep in mind that this is a beta and I'll fix this "issue" in upcoming releases.
What does badwords do?
badwords is just a list of keywords used to filter URLs in order to prevent 404s and traversing non-HTML content (like images, CSS, JS). You can create your own, but, unless you have special needs, I recommend you use the default one or at least use it as a template.
What does partyloud.conf do?
partyloud.conf is just a list of root URLs used to start the fake navigation. You can create your own conf file, but bear in mind that the more URLs you add, the more threads you start. This is an "open issue"; upcoming releases will come with a maximum thread count in order to avoid fork bombs.
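A possible shape for such a cap, purely as a sketch (MAX_THREADS and the clamping logic are hypothetical, not current behaviour):

```bash
MAX_THREADS=32                              # hypothetical upper bound
URL_COUNT=$(wc -l < partyloud.conf)         # today: one thread per URL
THREADS=$(( URL_COUNT < MAX_THREADS ? URL_COUNT : MAX_THREADS ))
```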