Releases · edgenai/llama_cpp-rs
llama_cpp_sys v0.2.2
Bug Fixes
- do not rerun the build when header files change. This restores the previous behavior, which was lost in the latest upgrade to `bindgen`: the new version enables header-change reruns by default (see the build script sketch below).
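For context, here is a minimal `build.rs` sketch of the behavior this fix targets; it assumes the bindings are generated through `bindgen`'s Cargo callbacks, and the header path and file names are illustrative rather than the crate's actual layout:

```rust
// build.rs (illustrative sketch, not llama_cpp_sys's real build script).
use std::{env, path::PathBuf};

fn main() {
    let bindings = bindgen::Builder::default()
        // Hypothetical path to the vendored llama.cpp header.
        .header("thirdparty/llama.cpp/llama.h")
        // bindgen 0.69's CargoCallbacks emit `cargo:rerun-if-changed` for each
        // included header by default; disabling that is what "do not rerun
        // build on changed header files" refers to.
        .parse_callbacks(Box::new(
            bindgen::CargoCallbacks::new().rerun_on_header_files(false),
        ))
        .generate()
        .expect("failed to generate llama.cpp bindings");

    let out_dir = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out_dir.join("bindings.rs"))
        .expect("failed to write bindings.rs");
}
```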
Commit Statistics
- 2 commits contributed to the release.
- 1 commit was understood as conventional.
- 0 issues like '(#ID)' were seen in commit messages.
llama_cpp_sys v0.2.1
Chore
- Update to `bindgen` 0.69.1
Bug Fixes
- `start_completing` should not be invoked on a per-iteration basis. There's still some UB that can be triggered due to llama.cpp's threading model, which needs patching up. (See the call-pattern sketch below.)
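To make "per-iteration basis" concrete, here is a toy sketch of the intended call pattern; `FakeSession` is a stand-in for the crate's `LlamaSession`, and none of this is the crate's actual API:

```rust
// Toy stand-in for LlamaSession, used only to show the call pattern.
struct FakeSession {
    tokens: Vec<String>,
}

impl FakeSession {
    // Stand-in for `start_completing`: begins one completion and returns an
    // iterator over its tokens.
    fn start_completing(&self) -> impl Iterator<Item = &str> {
        self.tokens.iter().map(String::as_str)
    }
}

fn main() {
    let session = FakeSession {
        tokens: vec!["Hello".into(), ", ".into(), "world".into()],
    };

    // Start the completion once per request...
    let completion = session.start_completing();

    // ...then consume the tokens it yields; the bug was the equivalent of
    // calling `start_completing` again inside this loop.
    for token in completion {
        print!("{token}");
    }
    println!();
}
```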
Commit Statistics
- 2 commits contributed to the release.
- 13 days passed between releases.
- 2 commits were understood as conventional.
- 0 issues like '(#ID)' were seen in commit messages.
llama_cpp v0.1.3
New Features
- more `async` function variants
- add `LlamaSession.model`
Other
- typo
Commit Statistics
- 5 commits contributed to the release.
- 3 commits were understood as conventional.
- 0 issues like '(#ID)' were seen in commit messages.
llama_cpp v0.1.2
New Features
- more `async` function variants
- add `LlamaSession.model`
Commit Statistics
- 2 commits contributed to the release.
- 2 commits were understood as conventional.
- 0 issues like '(#ID)' were seen in commit messages.
llama_cpp v0.1.1
Chore
- Remove debug binary from Cargo.toml
New Features
- add `LlamaModel::load_from_file_async` (see the sketch below)
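As a rough illustration of what an `async` variant over a blocking loader typically looks like (a general pattern, not necessarily how `LlamaModel::load_from_file_async` is implemented; the types and paths below are assumptions, and it presumes tokio with the `rt-multi-thread` and `macros` features):

```rust
use std::path::PathBuf;

// Blocking stand-in for a model loader; the real crate does FFI work here.
struct Model;

fn load_from_file(_path: PathBuf) -> Model {
    // ...expensive, blocking load would happen here...
    Model
}

// Async variant: run the blocking load on tokio's blocking thread pool so it
// doesn't stall the async executor.
async fn load_from_file_async(path: PathBuf) -> Model {
    tokio::task::spawn_blocking(move || load_from_file(path))
        .await
        .expect("model loading task panicked")
}

#[tokio::main]
async fn main() {
    let _model = load_from_file_async(PathBuf::from("model.gguf")).await;
    println!("model loaded");
}
```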
Bug Fixes
- require `llama_context` to be accessed from behind a mutex. This solves a race condition when several `get_completions` threads are spawned at the same time (see the locking sketch below).
- `start_completing` should not be invoked on a per-iteration basis. There's still some UB that can be triggered due to llama.cpp's threading model, which needs patching up.
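A minimal sketch of the locking pattern this fix describes, with a plain struct standing in for the raw `llama_context` handle; it illustrates the approach, not the crate's actual internals:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Stand-in for the raw llama.cpp context handle that the crate wraps.
struct LlamaContext {
    tokens_generated: usize,
}

fn main() {
    // Behind Arc<Mutex<..>>, concurrent `get_completions`-style workers take
    // turns instead of racing on the same underlying llama.cpp state.
    let ctx = Arc::new(Mutex::new(LlamaContext { tokens_generated: 0 }));

    let workers: Vec<_> = (0..4)
        .map(|_| {
            let ctx = Arc::clone(&ctx);
            thread::spawn(move || {
                // Each completion locks the context while it mutates shared state.
                let mut guard = ctx.lock().unwrap();
                guard.tokens_generated += 1;
            })
        })
        .collect();

    for worker in workers {
        worker.join().unwrap();
    }

    println!("completions run: {}", ctx.lock().unwrap().tokens_generated);
}
```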
Commit Statistics
- 5 commits contributed to the release.
- 13 days passed between releases.
- 4 commits were understood as conventional.
- 0 issues like '(#ID)' were seen in commit messages.
llama_cpp_sys v0.2.0
Chore
- Release
- latest fixes from upstream
Bug Fixes
- set clang to use c++ stl
- use SPDX license identifiers
Other
- use `link-cplusplus`, enable build+test on all branches (see the sketch below)
  - ci: disable static linking of llama.o
  - ci: build+test on all branches/prs
  - ci: use `link-cplusplus`
- ci: disable static linking of llama.o
- configure for `cargo-release`
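For reference, using `link-cplusplus` boils down to declaring the dependency in the -sys crate's Cargo.toml (e.g. `link-cplusplus = "1"`, an assumed version spec) and referencing it once, instead of emitting linker flags by hand from build.rs:

```rust
// src/lib.rs of a -sys crate (sketch, not llama_cpp_sys's actual source).
// Pulling in the `link-cplusplus` crate causes the platform's C++ standard
// library (-lstdc++ or -lc++) to be linked, replacing hand-written
// `cargo:rustc-link-lib=...` lines in build.rs.
extern crate link_cplusplus;
```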
Commit Statistics
- 10 commits contributed to the release over the course of 5 calendar days.
- 6 commits were understood as conventional.
- 3 unique issues were worked on: #1, #2, #3.
llama_cpp v0.1.0
Chore
- remove `include` from llama_cpp
- Release
- latest fixes from upstream
- add CHANGELOG.md
Bug Fixes
- use SPDX license identifiers
Other
- configure for `cargo-release`
Commit Statistics
- 8 commits contributed to the release over the course of 5 calendar days.
- 6 commits were understood as conventional.
- 1 unique issue was worked on: #3.