
TabbyAPI with Control Vectors Support

Supports Python 3.10, 3.11, and 3.12 · License: AGPL v3

Developer-facing API documentation

This fork of TabbyAPI adds support for control vectors with the Exllamav2 backend, allowing fine-grained control over model behavior through vector-based steering.

[IMPORTANT]

Control vectors are currently not compatible with tensor parallelism (tensor_parallel: true). If tensor parallelism is enabled, control vectors will not be applied to the model.

Control Vectors Setup

  1. Create a directory alongside your model with -vectors appended to the name:

models/Gemma-2-9B-It-SPPO-Iter3-exl2/ # Model directory

models/Gemma-2-9B-It-SPPO-Iter3-exl2-vectors/ # Control vectors directory

  2. Add the control vector configuration to your config.yml:
control_vectors_enabled: true

control_vectors: "vector_name:direction:scale,vector_name:direction:scale"

Example:

control_vectors: "honesty_vs_machiavellianism:machiavellianism:0.5,humility_vs_narcissism:narcissism:0.3"

Getting Started

[IMPORTANT]

This is an unofficial fork. This README only covers control-vector setup; for general installation and usage, please read the original TabbyAPI Wiki.

Read the Wiki for more information. It contains user-facing documentation for installation, configuration, sampling, API usage, and so much more.

Features

  • OpenAI compatible API (see the example request after this list)

  • Loading/unloading models

  • HuggingFace model downloading

  • Embedding model support

  • JSON schema + Regex + EBNF support

  • AI Horde support

  • Speculative decoding via draft models

  • Multi-LoRA with independent scaling (e.g. a weight of 0.9)

  • Inbuilt proxy to override client request parameters/samplers

  • Flexible Jinja2 template engine for chat completions that conforms to HuggingFace's chat template format

  • Concurrent inference with asyncio

  • Utilizes modern Python paradigms

  • Continuous batching engine using paged attention

  • Fast classifier-free guidance

  • OAI style tool/function calling

And much more. If something is missing here, PR it in!
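
To illustrate the OpenAI-compatible API above, here is a minimal sketch using the official openai Python client. The base URL, port, API key, and model name are assumptions about your setup (TabbyAPI typically defaults to port 5000 and generates API keys on first launch); adjust them to match your server.

  # Minimal sketch: chat completion against a running TabbyAPI server.
  # Assumes the `openai` Python package is installed and the server is
  # reachable at the base_url below.
  from openai import OpenAI

  client = OpenAI(
      base_url="http://127.0.0.1:5000/v1",  # TabbyAPI's OpenAI-compatible route prefix
      api_key="YOUR_TABBY_API_KEY",         # key from TabbyAPI's generated token file
  )

  response = client.chat.completions.create(
      model="Gemma-2-9B-It-SPPO-Iter3-exl2",  # folder name of the loaded model
      messages=[{"role": "user", "content": "Summarize control vectors in one sentence."}],
      max_tokens=128,
  )
  print(response.choices[0].message.content)

Because the API is OpenAI compatible, any client or frontend that can talk to the OpenAI API should work the same way against TabbyAPI's /v1 routes.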

Supported Model Types

TabbyAPI uses Exllamav2 as a powerful and fast backend for model inference, loading, etc. Therefore, the following types of models are supported:

  • Exl2 (Highly recommended)

  • GPTQ

  • FP16 (using Exllamav2's loader)

In addition, TabbyAPI supports parallel batching using paged attention for Nvidia Ampere GPUs and higher.
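
As a rough sketch of runtime model management (the loading/unloading feature listed above), a supported quant such as an Exl2 folder can be loaded through the server's admin API. The endpoint path, JSON keys, and x-admin-key header below are assumptions based on upstream TabbyAPI and may differ; verify them against the developer-facing API documentation.

  # Hedged sketch: ask a running TabbyAPI server to load a model by folder name.
  # The endpoint path, payload keys, and admin-key header are assumptions taken
  # from upstream TabbyAPI; check the API docs for the exact schema.
  import requests

  resp = requests.post(
      "http://127.0.0.1:5000/v1/model/load",
      headers={"x-admin-key": "YOUR_TABBY_ADMIN_KEY"},
      json={"name": "Gemma-2-9B-It-SPPO-Iter3-exl2"},
  )
  print(resp.status_code, resp.text)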

Contributing

Use the provided templates when creating issues or pull requests; otherwise, the developers may not look at your post.

If you have issues with the project:

  • Describe the issue in detail.

  • If you have a feature request, please indicate it as such.

If you have a pull request:

  • Describe the pull request in detail: what you are changing and why.

Acknowledgements

TabbyAPI would not exist without the work of other contributors and FOSS projects.

Developers and Permissions

This fork adds control vector support to the original TabbyAPI; credit for the base project goes to the upstream TabbyAPI developers.

Control vectors implementation by Gapeleon
