For several reasons, I have left AGiXT development and will also be leaving GitHub.
Although I am no longer active in FOSS or offering integrations, I will leave the pages and repos up as a reference.
Have fun, all!
No, they don't answer support tickets. My account is probably done. You can e-mail me, but I intend to remove everything in the near future.
And I am a PRO user.
Full-time nerd. Passionate developer. DevOps at heart.
That's me. Building AGI on local hardware.
Building containers for effectively running a local artificial general intelligence. 🦾
Want to run your own inference with ease? Good, you are awake.
Contact: Find me on AGiXT Discord Server or open an issue here.
After entering the AI space at the beginning of May 2023, I wanted to try out all the cool software available. Local development setups have always been tricky, and I struggled to install environments for different projects with different combinations of library versions.
After discovering josh-XT/AGiXT - and getting a bit™ euphoric - I started boxing AGiXT into a Docker container using a GitHub workflow.
localAGI/AGiXT-docker quickly spawned localAGI/AI-pipeline, and I started reusing the pipeline for other projects.
Reproducible software environments to spin up services on demand for testing and sky-netting. Setting up and streamlining Docker containers for quick, user-friendly usage.
🚀 CUDA enabled. 🖥️ BLAS enabled. 🐍 Conda-less. 🔧 Matrix builds. 🚢 Multiarch builds. 🚧 🚧 🚧 For everyone.
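A matrix/multiarch build along these lines can be sketched with `docker buildx`; the image name, tag, and `CUDA_VERSION` build argument below are hypothetical placeholders, not the pipeline's actual configuration:

```shell
# Create a builder that can target multiple architectures
# (cross-building typically requires QEMU/binfmt on the host).
docker buildx create --name ai-pipeline-builder --use

# Build one image for amd64 and arm64 in a single invocation.
# CUDA_VERSION stands in for one dimension of a build matrix.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  --build-arg CUDA_VERSION=12.1 \
  -t localagi/example-service:latest \
  --push .
```

In a GitHub workflow, the matrix dimensions (CUDA version, BLAS backend, architecture) would be expanded by the CI and passed in as build arguments like this.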
With strong expertise in Docker and GitHub workflows, I want to test and follow AI-related projects in a comfortable manner.
Working on the AI Pipeline to share best practices across several repositories and projects.
PAUSED, SEE TOP. WILL BE REMOVED SOON. I'LL GIVE GITHUB ONE MORE WEEK TO ANSWER.
The following projects are built using the AI pipeline.
My maintenance focuses on build stability and availability of the service containers. >200h of work. 50,000h of experience.
🔥 your CUDA card from 🐳 Docker containers
Service | Release | Models | API | Original Repo
---|---|---|---|---
FastChat | | T5, HF | OpenAI | lm-sys/FastChat
oobabooga | | HF, GGML, GPTQ | oobabooga | oobabooga/text-generation-webui
llama-cpp-python | | GGML | OpenAI | abetlen/llama-cpp-python
llama.cpp | | GGML | ? | ggerganov/llama.cpp
gpt4all | | see backend | ? | nomic-ai/gpt4all
gpt4all-ui | | GPTJ, MPT, GGML...? | ? | nomic-ai/gpt4all-ui
stablediffusion2 | WIP | | |
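Running one of these service containers with GPU access typically looks like the sketch below; the image name, port, and model path are illustrative, and the host needs the NVIDIA Container Toolkit installed:

```shell
# Expose the host GPU(s) to the container and mount a local model directory.
# --gpus all requires the NVIDIA Container Toolkit on the host.
docker run --rm --gpus all \
  -p 8000:8000 \
  -v "$PWD/models:/models" \
  localagi/llama-cpp-python:latest
```

The `--gpus all` flag is what actually hands the CUDA card to the container; everything else is ordinary port and volume plumbing.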
Service | Release | Original Repo
---|---|---
AGiXT | integrated upstream | josh-XT/AGiXT
AGiXT-Frontend | ✔️ | JamesonRGrieve/Agent-LLM-Frontend
gpt-code-ui | | ricklamers/gpt-code-ui
gpt4free | | xtekky/gpt4free
For quantization, conversion, CLI inference, etc.
Tool | Release | Model-types | Model-quantisations | Original Repo
---|---|---|---|---
llama.cpp | | Llama | HF, GGML | ggerganov/llama.cpp
ggml | | Llama | GGML | ggerganov/ggml
GPTQ-for-Llama | | Llama | GPTQ [cuda old, cuda new] | oobabooga/GPTQ-for-Llama, qwopqwop200/GPTQ-for-Llama
AutoGPTQ | | Llama | GPTQ [triton] | PanQiWei/AutoGPTQ
starcoder.cpp | | RNN | | bigcode-project/starcoder.cpp
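As a sketch of how such tools were used in the GGML era (mid-2023), converting and quantizing a model with llama.cpp looked roughly like this; the paths and model size are illustrative:

```shell
# Convert a Hugging Face Llama checkpoint to GGML f16 format
# (run from a llama.cpp checkout; paths are illustrative).
python convert.py models/7B/

# Quantize the f16 model down to 4-bit for CPU inference.
./quantize models/7B/ggml-model-f16.bin models/7B/ggml-model-q4_0.bin q4_0

# Quick CLI inference against the quantized model.
./main -m models/7B/ggml-model-q4_0.bin -p "Hello" -n 32
```

The service containers above package exactly this kind of toolchain so users don't have to build and wire it up themselves.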
Any missing? Contact me (currently on the AGiXT Discord).
- conda is commercial software: its license prohibits commercial use without a paid license. We try to omit it in our builds, but compliance is your responsibility.
- NVIDIA images come with their own license. Make sure you read it.
- streamlit apps collect heavy analytics even when running locally. This includes events for every page load and form submission, including metadata on queries (such as length) and browser and client information including host IPs. All of this is transmitted to a third-party analytics provider, Segment.com.
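If you want to opt out of Streamlit's telemetry, its documented `browser.gatherUsageStats` setting can be turned off; a minimal sketch:

```shell
# Opt out of Streamlit usage statistics via the user config file.
mkdir -p ~/.streamlit
cat >> ~/.streamlit/config.toml <<'EOF'
[browser]
gatherUsageStats = false
EOF

# Alternatively, set it per run via environment variable:
# STREAMLIT_BROWSER_GATHER_USAGE_STATS=false streamlit run app.py
```

This only disables Streamlit's own stats reporting; audit the app itself if you need stronger guarantees.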