
# 🧮 localAGI 🧮


## Update - archived

Due to several reasons, I left AGiXT development and will also leave GitHub.

Although I am no longer active in FOSS or offering integrations, I will leave the pages and repos up as a reference.

Have fun, all!

## Update - work is paused. GitHub is a b****.

No, they don't answer support tickets. My account is probably done. You can email me, but I intend to remove everything in the near future.

And I am a Pro user.


Full-time nerd. Passionate developer. DevOps at heart.

That's me. :bowtie: Building AGI on local hardware.

Building containers for effectively running a local artificial general intelligence. 🦾

Want to run your own inference with ease? Good, you are awake.

Contact: Find me on the AGiXT Discord server or open an issue here.

๐Ÿ‹ Docker Hub ๐Ÿ‹

## 🧗‍♀️ Motivation 🧗

After entering the AI space at the beginning of May 2023, I wanted to try out all the cool software available. Local development setups have always been tricky, and I struggled to install environments for different projects with different combinations of library versions, etc.

After discovering josh-XT/AGiXT - and getting a bit™ euphoric - I started boxing AGiXT into a Docker container using a GitHub workflow.

localAGI/AGiXT-docker quickly spawned localAGI/AI-pipeline - and I started reusing the pipeline for different projects.

## 🎯 Goal 🎯

Having reproducible software environments to spin up services on demand for testing and sky-netting. Set up and streamline Docker containers for quick and user-friendly usage.

🚀 CUDA enabled. 🖥️ BLAS enabled. 🐘 Conda-less. 🧅 Matrix builds. 🐢 Multiarch builds. 🧒 🧑 🧓 For everyone.

## 🌺 Sharing is caring 🌺

With strong expertise in Docker and GitHub workflows, I want to test and follow AI-related projects in a comfortable manner.

Working on AI Pipeline to share best practices across several repositories and projects.

🌟 If you like any of my work, leave a star! Thank you! 🌟

## State of work

PAUSED, SEE TOP. WILL BE REMOVED SOON. I'LL GIVE GITHUB ONE MORE WEEK TO ANSWER. 🤖

The following projects are built using the AI pipeline.

My maintenance is focused on build stability and availability of service containers. >200h of work. 50,000h of experience.

## 🧠 Services for running inference

🔥 your CUDA card from 🐳 Docker containers

| Service | Release | Models | API | Original Repo |
| --- | --- | --- | --- | --- |
| FastChat | | T5, HF | OpenAI | lm-sys/FastChat |
| oobabooga | | HF, GGML, GPTQ | oobabooga | oobabooga/text-generation-webui |
| llama-cpp-python | | GGML | OpenAI | abetlen/llama-cpp-python |
| llama.cpp | | GGML | ? | ggerganov/llama.cpp |
| gpt4all | | see backend | ? | nomic-ai/gpt4all |
| gpt4all-ui | | GPTJ, MPT, GGML...? | ? | nomic-ai/gpt4all-ui |
| stablediffusion2 | WIP | | | |
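Several of the services above expose an OpenAI-compatible API (see the "API" column). As a minimal sketch of talking to one of them - the host, port, and generation parameters here are placeholder assumptions, not values from this README - a completion request against a local endpoint could look like:

```python
import json
from urllib import request

# Placeholder endpoint: llama-cpp-python's bundled server listens on
# port 8000 by default; adjust host/port to match your container mapping.
COMPLETIONS_URL = "http://localhost:8000/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 64) -> dict:
    """Build an OpenAI-style /v1/completions JSON body."""
    return {"prompt": prompt, "max_tokens": max_tokens, "temperature": 0.7}

def complete(prompt: str) -> dict:
    """POST the request to the local inference service (needs a running container)."""
    body = json.dumps(build_completion_request(prompt)).encode()
    req = request.Request(COMPLETIONS_URL, data=body,
                         headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Requires a running container; uncomment to actually query it:
    # print(complete("Hello, local AGI!"))
    print(json.dumps(build_completion_request("Hello, local AGI!")))
```

Because the API shape is OpenAI-compatible, the same client code works against FastChat or llama-cpp-python by only changing the URL.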

## 🎩 Services for using inference

| Service | Release | Original Repo |
| --- | --- | --- |
| AGiXT | 🕺 integrated upstream | josh-XT/AGiXT |
| AGiXT-Frontend | ✔️ | JamesonRGrieve/Agent-LLM-Frontend |
| gpt-code-ui | | ricklamers/gpt-code-ui |
| gpt4free | | xtekky/gpt4free |

## 🦿 CLI tools and packages

for quantization, conversion, CLI inference, etc.

| Tool | Release | Model-types | Model-quantisations | Original Repo |
| --- | --- | --- | --- | --- |
| llama.cpp | | Llama | HF, GGML | ggerganov/llama.cpp |
| ggml | | Llama | GGML | ggerganov/ggml |
| GPTQ-for-Llama | | Llama | GPTQ [cuda old, cuda new] | oobabooga/GPTQ-for-Llama, qwopqwop200/GPTQ-for-Llama |
| AutoGPTQ | | Llama | GPTQ [triton] | PanQiWei/AutoGPTQ |
| starcoder.cpp | | RNN | | bigcode-project/starcoder.cpp |
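To illustrate what the quantisation tools above are doing, here is a toy sketch of block-wise symmetric 4-bit quantization - the core idea behind formats like GGML's q4_0. This is a didactic simplification, not the actual llama.cpp or GPTQ implementation:

```python
import numpy as np

def quantize_block_q4(x: np.ndarray) -> tuple[float, np.ndarray]:
    """Toy symmetric 4-bit quantization of one block of weights.

    Each block stores one float scale plus a 4-bit integer per weight,
    which is roughly why 4-bit formats are ~4x smaller than fp16.
    """
    scale = float(np.abs(x).max()) / 7.0
    if scale == 0.0:
        scale = 1.0  # all-zero block: any scale reproduces it exactly
    # Map each weight to the nearest integer in the signed 4-bit range.
    q = np.clip(np.round(x / scale), -8, 7).astype(np.int8)
    return scale, q

def dequantize_block(scale: float, q: np.ndarray) -> np.ndarray:
    """Reconstruct approximate fp32 weights from scale + 4-bit integers."""
    return scale * q.astype(np.float32)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    block = rng.standard_normal(32).astype(np.float32)  # GGML q4_0 uses 32-weight blocks
    s, q = quantize_block_q4(block)
    err = float(np.abs(dequantize_block(s, q) - block).max())
    print(f"max reconstruction error: {err:.4f} (scale {s:.4f})")
```

The real tools add refinements on top of this (GPTQ, for instance, chooses the integers to minimise layer output error rather than per-weight rounding error), but the storage layout is the same idea.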

## Requests

Any? Contact me (currently on the AGiXT Discord).

## Things to consider

- conda is commercial. Anaconda's license prohibits certain commercial use. We try to omit it in our builds, but compliance is your responsibility.
- NVIDIA images come with a license. Make sure you read it.
- The Streamlit app collects heavy analytics even when running locally. This includes events for every page load and form submission, including metadata on queries (like length) and browser and client information, including host IPs. These are all transmitted to a third-party analytics provider, Segment.com.
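On the Streamlit point: Streamlit's documented `gatherUsageStats` setting turns off its usage telemetry. Whether it covers every event described above is not verified here, but it is the documented opt-out:

```toml
# ~/.streamlit/config.toml
[browser]
gatherUsageStats = false
```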
