(c) 2024 konterfAI
konterfAI is a proof-of-concept model-poisoner for LLMs (Large Language Models) that generates nonsense ("bullshit") content designed to degrade these models.
Although it is still a work in progress and not yet ready for production, it already demonstrates the concept of fighting fire with fire: the backend queries a tiny LLM running in ollama with a high AI-temperature setting to generate hallucinatory content. If you wonder what this looks like, check out the example-hallucination.md file.
NOTE: The developers created konterfAI not as an offensive (hacking) tool, but as a countermeasure against AI crawlers that ignore robots.txt and other rules. The tool was inspired by reports of web admins suffering terabytes of traffic caused by AI crawlers - costs that can be avoided.
konterfAI is licensed under the AGPL (GNU AFFERO GENERAL PUBLIC LICENSE). See LICENSE for the full license text.
Join the Matrix-Chat to get in touch.
See CONTRIBUTING.
See FAQ.
konterfAI is supposed to run behind a reverse proxy such as nginx or traefik. The reverse proxy needs to inspect the user-agent of each incoming request and match it against a given list. On a match, the crawler is not presented with the original content, but with the poisoned content. The poisoned content is also cluttered with randomized self-references to trap the crawlers in a kind of tar-pit.
Note: These are examples and are not intended for copy & paste usage. Make sure to read them carefully and adjust them to your needs.
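Assuming your reverse proxy is configured along those lines, you can verify the filtering with curl; the domain and the crawler user-agent "GPTBot" are only placeholders for whatever is on your filter list:

# a normal browser user-agent should receive the original content
$> curl -A "Mozilla/5.0" https://your-domain.example/
# a user-agent from the filter list should receive the poisoned content
$> curl -A "GPTBot" https://your-domain.example/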
None; konterfAI does not ship any models. A default model is downloaded when ollama starts. If you want to use a different model, pick one from the ollama-models page and adapt your configuration accordingly.
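For example, assuming a local ollama installation (the model name here is only an illustration, not a recommendation):

# pull an alternative model into ollama, then point your configuration at it
$> ollama pull mistral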
$> make build
For a full list of build targets see Makefile.
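After a successful build, you can list the supported flags of the resulting binary (the bin/ path matches the usage shown further below):

$> ./bin/konterfai --help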
If you are really brave and want to try konterfAI in a production environment, there are two examples for nginx and traefik in the deployment folder.
Note: These examples are not intended for copy & paste usage. Make sure to read them carefully and adjust them to your needs.
WARNING: IMPROPER CONFIGURATION WILL HAVE NEGATIVE EFFECTS ON YOUR SEO
Note: `-gpu` is optional; if you do not have an ollama-capable GPU, you can omit it.
$> make start-ollama[-gpu]
$> make run
See Tracing.
konterfAI starts two webservers: the service itself, listening on port 8080, and the statistics server, listening on port 8081. If you are running locally from source, you can access both via http://localhost:8080 and http://localhost:8081. These ports can be changed via the `--port` and `--statistics-port` flags.
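For example, to run from source on different ports (the port values are arbitrary):

# serve konterfAI on 9090 and the statistics server on 9091
$> ./bin/konterfai --port 9090 --statistics-port 9091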
konterfAI exposes Prometheus metrics on the `/metrics` endpoint of the statistics server. You can access them via http://localhost:8081/metrics.
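A quick check, assuming the default statistics port:

$> curl http://localhost:8081/metrics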
Start:
$> make start-ollama[-gpu]
$> make docker-build
$> make docker-run
Stop:
$> make docker-stop
Start:
$> make docker-compose-up
Stop:
$> make docker-compose-down
For more complex setups, edit docker-compose.yml to suit your needs.
You can also use the pre-built Docker image from Docker Hub or Quay.io; the available tags are listed below.
Tag | Description |
---|---|
latest | The latest stable release |
v*.*.* | A specific version (e.g. 0.1.0, 0.1) |
latest-main | The latest main branch build |
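A minimal sketch of pulling and running a pre-built image; the image path konterfai/konterfai is an assumption, check Docker Hub or Quay.io for the actual name:

# map the service port (8080) and the statistics port (8081) described above
$> docker pull konterfai/konterfai:latest
$> docker run --rm -p 8080:8080 -p 8081:8081 konterfai/konterfai:latest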
konterfAI is configured via CLI flags. For a full list of supported flags, run (after building):
$> ./bin/konterfai --help
The Docker image is configured via environment variables. For a full list of supported variables, see docker-compose.yml and entrypoint.sh.
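A sketch of overriding a single setting; the variable name KONTERFAI_PORT is hypothetical, the actual names are defined in entrypoint.sh:

# KONTERFAI_PORT is a hypothetical example - consult entrypoint.sh for the real variables
$> docker run --rm -e KONTERFAI_PORT=8080 -p 8080:8080 konterfai/konterfai:latest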