
release: 1.3.8 #928

Merged
merged 12 commits on Dec 9, 2023
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "1.3.7"
".": "1.3.8"
}
24 changes: 24 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,29 @@
# Changelog

## 1.3.8 (2023-12-08)

Full Changelog: [v1.3.7...v1.3.8](https://github.com/openai/openai-python/compare/v1.3.7...v1.3.8)

### Bug Fixes

* avoid leaking memory when Client.with_options is used ([#956](https://github.com/openai/openai-python/issues/956)) ([e37ecca](https://github.com/openai/openai-python/commit/e37ecca04040ce946822a7e40f5604532a59ee85))
* **errors:** properly assign APIError.body ([#949](https://github.com/openai/openai-python/issues/949)) ([c70e194](https://github.com/openai/openai-python/commit/c70e194f0a253409ec851607ae5219e3b5a8c442))
* **pagination:** use correct type hint for .object ([#943](https://github.com/openai/openai-python/issues/943)) ([23fe7ee](https://github.com/openai/openai-python/commit/23fe7ee48a71539b0d1e95ceff349264aae4090e))


### Chores

* **internal:** enable more lint rules ([#945](https://github.com/openai/openai-python/issues/945)) ([2c8add6](https://github.com/openai/openai-python/commit/2c8add64a261dea731bd162bb0cca222518d5440))
* **internal:** reformat imports ([#939](https://github.com/openai/openai-python/issues/939)) ([ec65124](https://github.com/openai/openai-python/commit/ec651249de2f4e4cf959f816e1b52f03d3b1017a))
* **internal:** reformat imports ([#944](https://github.com/openai/openai-python/issues/944)) ([5290639](https://github.com/openai/openai-python/commit/52906391c9b6633656ec7934e6bbac553ec667cd))
* **internal:** update formatting ([#941](https://github.com/openai/openai-python/issues/941)) ([8e5a156](https://github.com/openai/openai-python/commit/8e5a156d555fe68731ba0604a7455cc03cb451ce))
* **package:** lift anyio v4 restriction ([#927](https://github.com/openai/openai-python/issues/927)) ([be0438a](https://github.com/openai/openai-python/commit/be0438a2e399bb0e0a94907229d02fc61ab479c0))


### Documentation

* fix typo in example ([#950](https://github.com/openai/openai-python/issues/950)) ([54f0ce0](https://github.com/openai/openai-python/commit/54f0ce0000abe32e97ae400f2975c028b8a84273))

## 1.3.7 (2023-12-01)

Full Changelog: [v1.3.6...v1.3.7](https://github.com/openai/openai-python/compare/v1.3.6...v1.3.7)
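Of the fixes in this release, the `with_options` memory leak (#956) is the one most likely to matter for long-running applications, since `with_options` is often called once per request. A minimal usage sketch, assuming `OPENAI_API_KEY` is set in the environment:

```python
from openai import OpenAI

client = OpenAI()

# with_options() returns a copy of the client with per-call overrides applied;
# prior to 1.3.8, repeated copies retained cached state and were never released.
completion = client.with_options(max_retries=5, timeout=30.0).chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Say this is a test"}],
)
print(completion.choices[0].message.content)
```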
2 changes: 1 addition & 1 deletion README.md
@@ -109,7 +109,7 @@ from openai import AsyncOpenAI
client = AsyncOpenAI()

stream = await client.chat.completions.create(
prompt="Say this is a test",
model="gpt-4",
messages=[{"role": "user", "content": "Say this is a test"}],
stream=True,
)
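The README hunk above replaces the stray `prompt` keyword with the Chat Completions `model`/`messages` parameters. For reference, the corrected example in runnable form looks roughly like this (the iteration over the stream is assumed from the surrounding README section, which is not shown in this hunk):

```python
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

async def main() -> None:
    stream = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test"}],
        stream=True,
    )
    # Print each streamed delta as it arrives.
    async for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="")

asyncio.run(main())
```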
33 changes: 20 additions & 13 deletions pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "openai"
version = "1.3.7"
version = "1.3.8"
description = "The official Python library for the openai API"
readme = "README.md"
license = "Apache-2.0"
@@ -11,7 +11,7 @@ dependencies = [
"httpx>=0.23.0, <1",
"pydantic>=1.9.0, <3",
"typing-extensions>=4.5, <5",
"anyio>=3.5.0, <4",
"anyio>=3.5.0, <5",
"distro>=1.7.0, <2",
"sniffio",
"tqdm > 4"
@@ -47,17 +47,18 @@ openai = "openai.cli:main"

[tool.rye]
managed = true
# version pins are in requirements-dev.lock
dev-dependencies = [
"pyright==1.1.332",
"mypy==1.7.1",
"black==23.3.0",
"respx==0.19.2",
"pytest==7.1.1",
"pytest-asyncio==0.21.1",
"ruff==0.0.282",
"isort==5.10.1",
"time-machine==2.9.0",
"nox==2023.4.22",
"pyright",
"mypy",
"black",
"respx",
"pytest",
"pytest-asyncio",
"ruff",
"isort",
"time-machine",
"nox",
"dirty-equals>=0.6.0",
"azure-identity >=1.14.1",
"types-tqdm > 4"
@@ -135,9 +136,11 @@ extra_standard_library = ["typing_extensions"]

[tool.ruff]
line-length = 120
format = "grouped"
output-format = "grouped"
target-version = "py37"
select = [
# bugbear rules
"B",
# remove unused imports
"F401",
# bare except statements
@@ -148,6 +151,10 @@ select = [
"T201",
"T203",
]
ignore = [
# mutable defaults
"B006",
]
unfixable = [
# disable auto fix for print statements
"T201",
15 changes: 7 additions & 8 deletions requirements-dev.lock
@@ -8,7 +8,7 @@

-e file:.
annotated-types==0.6.0
anyio==3.7.1
anyio==4.1.0
argcomplete==3.1.2
attrs==23.1.0
azure-core==1.29.5
@@ -25,13 +25,13 @@ distlib==0.3.7
distro==1.8.0
exceptiongroup==1.1.3
filelock==3.12.4
h11==0.12.0
httpcore==0.15.0
httpx==0.23.0
h11==0.14.0
httpcore==1.0.2
httpx==0.25.2
idna==3.4
iniconfig==2.0.0
isort==5.10.1
msal==1.25.0
msal==1.26.0
msal-extensions==1.0.0
mypy==1.7.1
mypy-extensions==1.0.0
@@ -56,9 +56,8 @@ pytest-asyncio==0.21.1
python-dateutil==2.8.2
pytz==2023.3.post1
requests==2.31.0
respx==0.19.2
rfc3986==1.5.0
ruff==0.0.282
respx==0.20.2
ruff==0.1.7
six==1.16.0
sniffio==1.3.0
time-machine==2.9.0
13 changes: 6 additions & 7 deletions requirements.lock
@@ -8,22 +8,21 @@

-e file:.
annotated-types==0.6.0
anyio==3.7.1
anyio==4.1.0
certifi==2023.7.22
distro==1.8.0
exceptiongroup==1.1.3
h11==0.12.0
httpcore==0.15.0
httpx==0.23.0
h11==0.14.0
httpcore==1.0.2
httpx==0.25.2
idna==3.4
numpy==1.26.1
pandas==2.1.1
numpy==1.26.2
pandas==2.1.3
pandas-stubs==2.1.1.230928
pydantic==2.4.2
pydantic-core==2.10.1
python-dateutil==2.8.2
pytz==2023.3.post1
rfc3986==1.5.0
six==1.16.0
sniffio==1.3.0
tqdm==4.66.1
2 changes: 1 addition & 1 deletion src/openai/__init__.py
@@ -86,7 +86,7 @@
for __name in __all__:
if not __name.startswith("__"):
try:
setattr(__locals[__name], "__module__", "openai")
__locals[__name].__module__ = "openai"
except (TypeError, AttributeError):
# Some of our exported symbols are builtins which we can't set attributes for.
pass
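This hunk swaps `setattr` with a constant attribute name for a direct assignment, presumably to satisfy the newly enabled flake8-bugbear rules (B010 flags `setattr` calls with a literal attribute name); the behaviour is identical. A generic illustration:

```python
class Exported:
    pass

# Both lines set the same attribute; B010 prefers the direct form when the
# attribute name is a constant string.
setattr(Exported, "__module__", "mypkg")  # flagged by flake8-bugbear B010
Exported.__module__ = "mypkg"             # direct assignment, no lint warning
```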
28 changes: 15 additions & 13 deletions src/openai/_base_client.py
@@ -403,14 +403,12 @@ def _build_headers(self, options: FinalRequestOptions) -> httpx.Headers:
headers_dict = _merge_mappings(self.default_headers, custom_headers)
self._validate_headers(headers_dict, custom_headers)

# headers are case-insensitive while dictionaries are not.
headers = httpx.Headers(headers_dict)

idempotency_header = self._idempotency_header
if idempotency_header and options.method.lower() != "get" and idempotency_header not in headers:
if not options.idempotency_key:
options.idempotency_key = self._idempotency_key()

headers[idempotency_header] = options.idempotency_key
headers[idempotency_header] = options.idempotency_key or self._idempotency_key()

return headers

@@ -594,16 +592,8 @@ def base_url(self) -> URL:
def base_url(self, url: URL | str) -> None:
self._base_url = self._enforce_trailing_slash(url if isinstance(url, URL) else URL(url))

@lru_cache(maxsize=None)
def platform_headers(self) -> Dict[str, str]:
return {
"X-Stainless-Lang": "python",
"X-Stainless-Package-Version": self._version,
"X-Stainless-OS": str(get_platform()),
"X-Stainless-Arch": str(get_architecture()),
"X-Stainless-Runtime": platform.python_implementation(),
"X-Stainless-Runtime-Version": platform.python_version(),
}
return platform_headers(self._version)

def _calculate_retry_timeout(
self,
@@ -1691,6 +1681,18 @@ def get_platform() -> Platform:
return "Unknown"


@lru_cache(maxsize=None)
def platform_headers(version: str) -> Dict[str, str]:
return {
"X-Stainless-Lang": "python",
"X-Stainless-Package-Version": version,
"X-Stainless-OS": str(get_platform()),
"X-Stainless-Arch": str(get_architecture()),
"X-Stainless-Runtime": platform.python_implementation(),
"X-Stainless-Runtime-Version": platform.python_version(),
}


class OtherArch:
def __init__(self, name: str) -> None:
self.name = name
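Moving `platform_headers` from an `lru_cache`-decorated method to an `lru_cache`-decorated module-level function is what fixes the `with_options` leak: caching a bound method keys the cache on `self`, so every client instance that ever computed its headers stayed reachable from the cache. A stripped-down illustration of the difference (generic classes for illustration, not the library's own):

```python
from functools import lru_cache

class LeakyClient:
    @lru_cache(maxsize=None)            # each cache entry holds a reference to `self`
    def platform_headers(self) -> dict:
        return {"X-Lang": "python"}

@lru_cache(maxsize=None)
def platform_headers(version: str) -> dict:
    # Keyed only on the hashable `version` argument, so no client instances
    # are kept alive by the cache; this is the pattern adopted above.
    return {"X-Lang": "python", "X-Package-Version": version}
```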
4 changes: 2 additions & 2 deletions src/openai/_client.py
@@ -192,7 +192,7 @@ def copy(
return self.__class__(
api_key=api_key or self.api_key,
organization=organization or self.organization,
base_url=base_url or str(self.base_url),
base_url=base_url or self.base_url,
timeout=self.timeout if isinstance(timeout, NotGiven) else timeout,
http_client=http_client,
max_retries=max_retries if is_given(max_retries) else self.max_retries,
@@ -402,7 +402,7 @@ def copy(
return self.__class__(
api_key=api_key or self.api_key,
organization=organization or self.organization,
base_url=base_url or str(self.base_url),
base_url=base_url or self.base_url,
timeout=self.timeout if isinstance(timeout, NotGiven) else timeout,
http_client=http_client,
max_retries=max_retries if is_given(max_retries) else self.max_retries,
1 change: 1 addition & 0 deletions src/openai/_exceptions.py
@@ -48,6 +48,7 @@ def __init__(self, message: str, request: httpx.Request, *, body: object | None)
super().__init__(message)
self.request = request
self.message = message
self.body = body

if is_dict(body):
self.code = cast(Any, body.get("code"))
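With `body` now stored on the instance (#949), callers can inspect the raw error payload instead of only the parsed fields. A hedged usage sketch, assuming `OPENAI_API_KEY` is set and the request actually fails:

```python
from openai import OpenAI, APIError

client = OpenAI()

try:
    client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say this is a test"}],
    )
except APIError as err:
    # Before this fix, `body` was accepted by __init__ but never assigned
    # to the instance.
    print(err.message)
    print(err.body)
```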
4 changes: 2 additions & 2 deletions src/openai/_extras/numpy_proxy.py
@@ -20,8 +20,8 @@ class NumpyProxy(LazyProxy[Any]):
def __load__(self) -> Any:
try:
import numpy
except ImportError:
raise MissingDependencyError(NUMPY_INSTRUCTIONS)
except ImportError as err:
raise MissingDependencyError(NUMPY_INSTRUCTIONS) from err

return numpy
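
This and several of the following hunks (pandas_proxy, _utils, the CLI modules) add explicit exception chaining. `raise X from err` keeps the original exception attached as `__cause__`, while `raise X from None` suppresses the implicit "During handling of the above exception..." context when the original error adds nothing for the caller. A generic sketch of both forms (the helper names here are made up for illustration):

```python
import importlib

def require_module(name: str):
    try:
        return importlib.import_module(name)
    except ImportError as err:
        # Chained: the ImportError remains visible as __cause__ in tracebacks.
        raise RuntimeError(f"{name} is required for this feature") from err

def parse_port(value: str) -> int:
    try:
        return int(value)
    except ValueError:
        # Suppressed: the ValueError is an implementation detail, so hide it.
        raise SystemExit(f"invalid port: {value!r}") from None
```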

4 changes: 2 additions & 2 deletions src/openai/_extras/pandas_proxy.py
@@ -20,8 +20,8 @@ class PandasProxy(LazyProxy[Any]):
def __load__(self) -> Any:
try:
import pandas
except ImportError:
raise MissingDependencyError(PANDAS_INSTRUCTIONS)
except ImportError as err:
raise MissingDependencyError(PANDAS_INSTRUCTIONS) from err

return pandas

4 changes: 2 additions & 2 deletions src/openai/_streaming.py
@@ -65,7 +65,7 @@ def __stream__(self) -> Iterator[ResponseT]:
yield process_data(data=data, cast_to=cast_to, response=response)

# Ensure the entire stream is consumed
for sse in iterator:
for _sse in iterator:
...


@@ -120,7 +120,7 @@ async def __stream__(self) -> AsyncIterator[ResponseT]:
yield process_data(data=data, cast_to=cast_to, response=response)

# Ensure the entire stream is consumed
async for sse in iterator:
async for _sse in iterator:
...


1 change: 1 addition & 0 deletions src/openai/_types.py
@@ -44,6 +44,7 @@


class BinaryResponseContent(ABC):
@abstractmethod
def __init__(
self,
response: Any,
8 changes: 5 additions & 3 deletions src/openai/_utils/_utils.py
@@ -194,8 +194,8 @@ def extract_type_arg(typ: type, index: int) -> type:
args = get_args(typ)
try:
return cast(type, args[index])
except IndexError:
raise RuntimeError(f"Expected type {typ} to have a type argument at index {index} but it did not")
except IndexError as err:
raise RuntimeError(f"Expected type {typ} to have a type argument at index {index} but it did not") from err


def deepcopy_minimal(item: _T) -> _T:
@@ -275,7 +275,9 @@ def wrapper(*args: object, **kwargs: object) -> object:
try:
given_params.add(positional[i])
except IndexError:
raise TypeError(f"{func.__name__}() takes {len(positional)} argument(s) but {len(args)} were given")
raise TypeError(
f"{func.__name__}() takes {len(positional)} argument(s) but {len(args)} were given"
) from None

for key in kwargs.keys():
given_params.add(key)
2 changes: 1 addition & 1 deletion src/openai/_version.py
@@ -1,4 +1,4 @@
# File generated from our OpenAPI spec by Stainless.

__title__ = "openai"
__version__ = "1.3.7" # x-release-please-version
__version__ = "1.3.8" # x-release-please-version
2 changes: 1 addition & 1 deletion src/openai/cli/_progress.py
@@ -35,7 +35,7 @@ def read(self, n: int | None = -1) -> bytes:
try:
self._callback(self._progress)
except Exception as e: # catches exception from the callback
raise CancelledError("The upload was cancelled: {}".format(e))
raise CancelledError("The upload was cancelled: {}".format(e)) from e

return chunk

4 changes: 2 additions & 2 deletions src/openai/cli/_tools/migrate.py
@@ -41,7 +41,7 @@ def grit(args: GritArgs) -> None:
except subprocess.CalledProcessError:
# stdout and stderr are forwarded by subprocess so an error will already
# have been displayed
raise SilentCLIError()
raise SilentCLIError() from None


class MigrateArgs(BaseModel):
@@ -57,7 +57,7 @@ def migrate(args: MigrateArgs) -> None:
except subprocess.CalledProcessError:
# stdout and stderr are forwarded by subprocess so an error will already
# have been displayed
raise SilentCLIError()
raise SilentCLIError() from None


# handles downloading the Grit CLI until they provide their own PyPi package