
Arcadia: A diverse, simple, and secure all-in-one LLMOps platform

What is Arcadia?

Arcadia is an all-in-one, enterprise-grade LLMOps platform that provides a unified interface for developers and operators to build, debug, deploy, and manage AI agents with an orchestration engine (RAG (Retrieval-Augmented Generation) and LLM fine-tuning are already supported).

Features

  • Build, debug, and deploy AI agents on ops-console (the GUI for LLMOps)
  • Chat with AI agents on agent-portal (the GUI for GPT-style chat)
  • Enterprise-grade infrastructure with KubeBB: multi-tenant isolation (data, model services), built-in OIDC, RBAC, and auditing, allowing different companies and departments to develop on a unified platform
  • Support for most popular LLMs (large language models), embedding models, reranking models, etc.
  • Inference acceleration with vLLM, distributed inference with Ray, quantization, and more
  • Fine-tuning support with llama-factory
  • Built on langchaingo (Golang) for better performance and maintainability

Architecture

The design and development of Arcadia follows the operator pattern, which extends the Kubernetes APIs.

(Architecture diagram)

For details, check Architecture Overview
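To make the operator pattern concrete, the sketch below shows what a Kubernetes custom resource type for a model service could look like in Go. The `LLM` kind, its fields, and the package name here are hypothetical illustrations of the pattern, not the actual Arcadia API; see the Architecture Overview and the CRD definitions in this repository for the real types.

```go
// Package v1alpha1 is a hypothetical API group, used only for illustration.
package v1alpha1

import (
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// LLMSpec describes the desired state of a model service (hypothetical fields).
type LLMSpec struct {
	// Provider names the backing model provider, e.g. a local or third-party service.
	Provider string `json:"provider"`
	// Endpoint is the URL of the model service.
	Endpoint string `json:"endpoint"`
}

// LLMStatus is the observed state, written back by the operator's controller.
type LLMStatus struct {
	Ready   bool   `json:"ready"`
	Message string `json:"message,omitempty"`
}

// LLM is a custom resource that extends the Kubernetes API; the operator
// reconciles the cluster toward the desired state declared in Spec.
type LLM struct {
	metav1.TypeMeta   `json:",inline"`
	metav1.ObjectMeta `json:"metadata,omitempty"`

	Spec   LLMSpec   `json:"spec,omitempty"`
	Status LLMStatus `json:"status,omitempty"`
}
```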

Quick Start

Documentation

Visit our online documentation

Read the user guide

Supported Models

List of models that can be deployed by kubeagi:

LLMs

Embeddings

Reranking

List of online (third-party) LLM services that can be integrated by kubeagi:

Supported VectorStores

Fully compatible with langchain vectorstores
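As a rough illustration of what this compatibility means in Go, any store that implements langchaingo's `vectorstores.VectorStore` interface can be used the same way. The sketch below indexes a document and runs a similarity search; it assumes the concrete store (for example chroma or pgvector) has already been constructed elsewhere.

```go
package rag

import (
	"context"

	"github.com/tmc/langchaingo/schema"
	"github.com/tmc/langchaingo/vectorstores"
)

// indexAndQuery adds one document to the store and returns the top matches
// for a question. The store can be any langchaingo-compatible vector store.
func indexAndQuery(ctx context.Context, store vectorstores.VectorStore) ([]schema.Document, error) {
	// Embed and persist the document (the store calls its embedder internally).
	if _, err := store.AddDocuments(ctx, []schema.Document{
		{PageContent: "Arcadia is an all-in-one LLMOps platform."},
	}); err != nil {
		return nil, err
	}

	// Retrieve the 3 most similar documents for the query.
	return store.SimilaritySearch(ctx, "What is Arcadia?", 3)
}
```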

Pure Go Toolchains

Thanks to langchaingo, we can have comprehensive AI capabilities in Golang! But to meet our own unique needs, we have further developed a number of other toolchains:

We have provided some examples of how to use them. See more details here.
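For example, a minimal langchaingo call against an OpenAI-compatible model service looks roughly like the sketch below. It is not Arcadia-specific and assumes the OPENAI_API_KEY environment variable is set.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	ctx := context.Background()

	// openai.New reads OPENAI_API_KEY from the environment by default.
	llm, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}

	// GenerateFromSinglePrompt is a convenience helper for one-shot prompts.
	answer, err := llms.GenerateFromSinglePrompt(ctx, llm, "What is LLMOps?")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(answer)
}
```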

Contribute to Arcadia

If you want to contribute to Arcadia, refer to the contribution guide.

Support

If you need support, start with the troubleshooting guide, or create a GitHub issue.