The llm_interface crate is a workspace member of the llm_client project.
This crate contains the build.rs, data types, and behaviors for interacting with LLMs, including:
- Integration with Llama.cpp (via llama-server), as sketched below
- Cloning and building the Llama.cpp repository
- Managing the Llama.cpp server process
- Support for various LLM APIs, including any generic OpenAI-format LLM
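As a rough illustration of how these pieces fit together, here is a minimal sketch of starting a local Llama.cpp-backed model. The names used (`LlmInterface`, `llama_cpp()`, `init()`) are assumptions made for illustration, not the crate's confirmed API; see the integration tests for the real builders.

```rust
// Hypothetical sketch -- the import path and builder names below are
// assumptions, not the crate's confirmed API.
use llm_interface::llms::LlmInterface;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Conceptually: ensure Llama.cpp is cloned and built, download a
    // model if needed, then start and manage a llama-server process.
    let backend = LlmInterface::llama_cpp().init().await?;

    // `backend` can now be used to send requests to the local server.
    let _ = backend;
    Ok(())
}
```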
This crate makes it possible to run local LLMs and send requests to LLMs, and it is designed for easy integration into other projects.
See the various Builders implemented in the integration tests for examples of using this crate; a rough sketch of a request follows.
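For instance, a completion request against an OpenAI-format API might look roughly like the following. Every type and method name here (`openai()`, `api_key`, `CompletionRequest`, `request()`) is a hypothetical stand-in modeled on common Rust builder patterns, not confirmed from the crate.

```rust
// Hypothetical sketch of a request -- all names below are assumed
// stand-ins, not the crate's confirmed API.
let backend = LlmInterface::openai() // assumed constructor for an
    .api_key("...")                  // OpenAI-format backend
    .init()
    .await?;

let mut request = CompletionRequest::new(backend); // assumed type
request.prompt.add_user_message()?.set_content("Hello, world!");
let response = request.request().await?;
println!("{}", response.content);
```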
For a look at a higher-level API built on top of this crate, see the llm_client crate.