
Python: v0.3.0 release (#73)
* Python: v0.3.0 release

* Update upload-artifact usage

* Fix tests

* Bump minimum python version to 3.8
benbrandt authored Dec 27, 2023 · 1 parent e716aa9 · commit e82517a
Showing 6 changed files with 68 additions and 57 deletions.
20 changes: 11 additions & 9 deletions .github/workflows/python.yml
@@ -71,9 +71,9 @@ jobs:
           manylinux: auto
         working-directory: bindings/python
       - name: Upload wheels
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
-          name: wheels
+          name: wheels-linux-${{ matrix.target }}
           path: bindings/python/dist
           if-no-files-found: error
       - name: pytest
@@ -108,9 +108,9 @@ jobs:
           sccache: "true"
         working-directory: bindings/python
       - name: Upload wheels
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
-          name: wheels
+          name: wheels-windows-${{ matrix.target }}
           path: bindings/python/dist
           if-no-files-found: error
       - name: pytest
@@ -144,9 +144,9 @@ jobs:
           sccache: "true"
         working-directory: bindings/python
       - name: Upload wheels
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
        with:
-          name: wheels
+          name: wheels-macos-${{ matrix.target }}
           path: bindings/python/dist
           if-no-files-found: error
       - name: pytest
@@ -173,9 +173,9 @@ jobs:
           args: --out dist
         working-directory: bindings/python
       - name: Upload sdist
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
-          name: wheels
+          name: wheels-sdist
           path: bindings/python/dist
           if-no-files-found: error

@@ -187,7 +187,9 @@ jobs:
     steps:
       - uses: actions/download-artifact@v4
         with:
-          name: wheels
+          path: wheels
+          pattern: wheels-*
+          merge-multiple: true
       - name: Publish to PyPI
         uses: PyO3/maturin-action@v1
         env:
11 changes: 11 additions & 0 deletions bindings/python/CHANGELOG.md
@@ -1,5 +1,16 @@
 # Changelog
 
+## v0.3.0
+
+### What's New
+
+- Update to `v0.5.0` of `text-splitter`, which brings significant performance improvements when generating chunks with the `tokenizers` or `tiktoken-rs` crates by applying binary search to find the next matching chunk size.
+
+### Breaking Changes
+
+- Minimum Python version is now 3.8.
+- Due to the binary search, there are some slight differences at the edges of chunks where the algorithm was a little greedier before. If two candidates tokenize to the same number of tokens that fit within the capacity, the shorter text is now chosen. Due to the nature of tokenizers, this happens most often with whitespace at the end of a chunk, and rarely affects users who have set `trim_chunks=true`. It is a tradeoff, but keeping the exact previous behavior would have made the binary search code much more complicated.
+
 ## v0.2.4
 
 ### What's New
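The performance bullet above names the key technique of this release. Below is a minimal sketch of the idea only, in Python rather than the crate's Rust, with invented names (`largest_fitting_chunk` and the toy whitespace token counter are not part of the library): because token counts grow with candidate length, a binary search over candidate split points can replace a linear scan. Where the probe lands when several candidates fit with the same token count is also what produces the edge-of-chunk differences noted under Breaking Changes.

from typing import Callable, List, Optional


def largest_fitting_chunk(
    candidates: List[str],  # candidate chunks, ordered shortest to longest
    token_count: Callable[[str], int],  # e.g. wraps a tokenizers/tiktoken encode
    capacity: int,
) -> Optional[str]:
    """Return a longest candidate whose token count fits within capacity.

    Illustrative only; the real crate searches over semantic split points
    in Rust and applies its own tie-breaking between equal-count candidates.
    """
    lo, hi = 0, len(candidates) - 1
    best = None
    while lo <= hi:
        mid = (lo + hi) // 2
        if token_count(candidates[mid]) <= capacity:
            best = candidates[mid]  # fits; probe longer candidates
            lo = mid + 1
        else:
            hi = mid - 1  # too large; probe shorter candidates
    return best


# Toy usage with whitespace "tokens" standing in for model tokens:
print(largest_fitting_chunk(["123", "123\n123"], lambda s: len(s.split()), 1))
# -> "123"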
82 changes: 40 additions & 42 deletions bindings/python/Cargo.lock

Some generated files are not rendered by default.

6 changes: 3 additions & 3 deletions bindings/python/Cargo.toml
@@ -1,6 +1,6 @@
 [package]
 name = "semantic-text-splitter"
-version = "0.2.4"
+version = "0.3.0"
 authors = ["Ben Brandt <benjamin.j.brandt@gmail.com>"]
 edition = "2021"
 description = "Split text into semantic chunks, up to a desired chunk size. Supports calculating length by characters and tokens (when used with large language models)."
@@ -14,8 +14,8 @@ name = "semantic_text_splitter"
 crate-type = ["cdylib"]
 
 [dependencies]
-pyo3 = { version = "0.20.0", features = ["abi3-py37"] }
-text-splitter = { version = "0.4.5", features = ["tiktoken-rs", "tokenizers"] }
+pyo3 = { version = "0.20.0", features = ["abi3-py38"] }
+text-splitter = { version = "0.5.0", features = ["tiktoken-rs", "tokenizers"] }
 tiktoken-rs = "0.5.8"
 tokenizers = { version = "0.15.0", default_features = false, features = [
     "onig",
2 changes: 1 addition & 1 deletion bindings/python/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "maturin"
 
 [project]
 name = "semantic-text-splitter"
-requires-python = ">=3.7"
+requires-python = ">=3.8"
 classifiers = [
     "Programming Language :: Rust",
     "Programming Language :: Python :: Implementation :: CPython",
4 changes: 2 additions & 2 deletions bindings/python/tests/test_integration.py
@@ -32,14 +32,14 @@ def test_hugging_face():
     tokenizer = Tokenizer.from_pretrained("bert-base-uncased")
     splitter = HuggingFaceTextSplitter(tokenizer, trim_chunks=False)
     text = "123\n123"
-    assert splitter.chunks(text, 1) == ["123\n", "123"]
+    assert splitter.chunks(text, 1) == ["123", "\n123"]
 
 
 def test_hugging_face_range():
     tokenizer = Tokenizer.from_pretrained("bert-base-uncased")
     splitter = HuggingFaceTextSplitter(tokenizer, trim_chunks=False)
     text = "123\n123"
-    assert splitter.chunks(text=text, chunk_capacity=(1, 2)) == ["123\n", "123"]
+    assert splitter.chunks(text=text, chunk_capacity=(1, 2)) == ["123", "\n123"]
 
 
 def test_hugging_face_trim():
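The updated assertions capture the breaking change in runnable form. A hedged sketch of how it surfaces in practice follows: the untrimmed outputs mirror the tests above, while the trimmed outputs are inferred from the changelog's claim about `trim_chunks=true` and are not re-verified against both releases.

from semantic_text_splitter import HuggingFaceTextSplitter
from tokenizers import Tokenizer

tokenizer = Tokenizer.from_pretrained("bert-base-uncased")
text = "123\n123"

# Without trimming, v0.3.0 attaches the newline to the following chunk
# instead of the preceding one:
untrimmed = HuggingFaceTextSplitter(tokenizer, trim_chunks=False)
print(untrimmed.chunks(text, 1))  # v0.2.4: ["123\n", "123"]; v0.3.0: ["123", "\n123"]

# With trimming enabled, the whitespace is stripped either way, so both
# versions should yield the same chunks (inferred, not re-verified):
trimmed = HuggingFaceTextSplitter(tokenizer, trim_chunks=True)
print(trimmed.chunks(text, 1))  # expected in both versions: ["123", "123"]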
