
Mirror repository for https://codeberg.org/konterfai/konterfai/. All development, issue tracking, releases, CI, etc. happen at Codeberg. This is just a mirror!


konterfAI

(c) 2024 konterfAI

konterfAI is a proof-of-concept model poisoner for LLMs (Large Language Models): it generates nonsense ("bullshit") content designed to degrade these models.

Although it is still a work in progress and not yet ready for production, it already demonstrates the concept of fighting fire with fire: the backend queries a tiny LLM running in ollama with a high temperature setting to generate hallucinatory content. If you wonder what this looks like, check out the example-hallucination.md file.

NOTE: The developers created konterfAI not as an offensive (hacking) tool, but as a countermeasure against AI crawlers that ignore robots.txt and other rules. The tool was inspired by reports of web admins suffering terabytes of traffic caused by AI crawlers, a cost that can be avoided.

License

konterfAI is licensed under the AGPL (GNU AFFERO GENERAL PUBLIC LICENSE). See LICENSE for the full license text.

Get in touch

Join the Matrix-Chat to get in touch.

Contributing

see CONTRIBUTING.

FAQ (Frequently Asked Questions)

see FAQ.

How does it work?

konterfAI is supposed to run behind a reverse proxy, such as nginx or traefik. The reverse proxy needs the ability to detect the User-Agent of the incoming request and match it against a given list. If there is a match, the crawler is presented not with the original content but with the poisoned content. The poisoned content is also cluttered with randomized self-references to catch the crawlers in a kind of tar pit.

A diagram showing the basic concept of konterfAI
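To make the user-agent routing concrete, here is a minimal nginx sketch of the idea. It is an illustration only, not one of the repository's maintained deployment examples: the user-agent patterns, the backend address, and konterfAI's address (assumed to be 127.0.0.1:8080, its default port) are assumptions you must adapt.

```nginx
# Sketch only, not a working deployment.
# Flag requests whose User-Agent matches an (illustrative) crawler list.
map $http_user_agent $ai_crawler {
    default   0;
    ~*GPTBot  1;   # hypothetical entries; maintain your own list
    ~*CCBot   1;
}

server {
    listen 80;

    location / {
        # Matched crawlers are fed poisoned content by konterfAI.
        if ($ai_crawler) {
            proxy_pass http://127.0.0.1:8080;
        }
        # Everyone else gets the real site (hypothetical backend address).
        proxy_pass http://127.0.0.1:3000;
    }
}
```

For maintained, fuller examples see the nginx and traefik configurations in the deployment-folder.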


What ollama models does konterfAI ship?

None; konterfAI does not ship any models. A default model is downloaded when ollama starts. If you want to use a different model, you can pick one from the ollama-models page and adapt your configuration accordingly.

Building

$> make build

For a full list of build targets see Makefile.

How to run it?

Production deployment

If you are really brave and want to try konterfAI in a production environment, there are two examples, for nginx and traefik, in the deployment-folder.

Note: These examples are not intended for copy & paste usage. Make sure to read them carefully and adjust them to your needs.

WARNING: IMPROPER CONFIGURATION WILL HAVE NEGATIVE EFFECTS ON YOUR SEO

Development

Note: -gpu is optional; if you do not have an ollama-capable GPU, you can omit it.

$> make start-ollama[-gpu]
$> make run

Tracing

see Tracing.

Default Ports

konterfAI starts two webservers: the service itself, listening on port 8080, and the statistics server, listening on port 8081. If you are running this locally from source, you can access them via http://localhost:8080 and http://localhost:8081. These ports can be changed via the --port and --statistics-port flags.
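For example, to move both servers to other ports (the port numbers here are arbitrary, and the flags are the --port and --statistics-port flags mentioned above):

```shell
$> ./bin/konterfai --port 9090 --statistics-port 9091
```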

Prometheus Metrics

konterfAI exposes Prometheus metrics on the /metrics endpoint of the statistics server. You can access them via http://localhost:8081/metrics.
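If you run your own Prometheus, a scrape job for konterfAI could look like the following sketch. The job name is a made-up example, and the target assumes the default statistics port 8081; adjust it if you changed --statistics-port.

```yaml
# Hypothetical Prometheus scrape configuration for konterfAI's
# statistics server on its default port.
scrape_configs:
  - job_name: "konterfai"
    static_configs:
      - targets: ["localhost:8081"]
```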

Docker

Start:

$> make start-ollama[-gpu]
$> make docker-build
$> make docker-run

Stop:

$> make docker-stop

Docker-Compose

Start:

$> make docker-compose-up

Stop:

$> make docker-compose-down

For more complex setups, edit docker-compose.yml to suit your needs.

Pre-built Docker-Image

You can also use the pre-built Docker image from Docker Hub or Quay.io.

Pre-built Docker image tags

Tag          Description
latest       The latest stable release
v*.*.*       A specific version (e.g. 0.1.0, 0.1)
latest-main  The latest build from the main branch

Configuration

konterfAI is configured via CLI flags. For a full list of supported flags, run (after building):

$> ./bin/konterfai --help

The Docker image is configured via environment variables. For a full list of supported variables, see docker-compose.yml and entrypoint.sh.
