
Bump bentoml from 1.0.0a4 to 1.0.15 #38

Open

wants to merge 1 commit into base: master
Conversation

@dependabot dependabot bot commented on behalf of github Feb 15, 2023

Bumps bentoml from 1.0.0a4 to 1.0.15.

Release notes

Sourced from bentoml's releases.

BentoML - v1.0.14

🍱 Fix the backward incompatibility introduced in starlette version 0.24.0. Upgrade BentoML to v1.0.14 if you encounter an error related to content_type like the one below.

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/bentoml/_internal/server/service_app.py", line 305, in api_func
    input_data = await api.input.from_http_request(request)
  File "/usr/local/lib/python3.8/dist-packages/bentoml/_internal/io_descriptors/multipart.py", line 208, in from_http_request
    reqs = await populate_multipart_requests(request)
  File "/usr/local/lib/python3.8/dist-packages/bentoml/_internal/utils/formparser.py", line 188, in populate_multipart_requests
    form = await multipart_parser.parse()
  File "/usr/local/lib/python3.8/dist-packages/bentoml/_internal/utils/formparser.py", line 158, in parse
    multipart_file = UploadFile(
TypeError: __init__() got an unexpected keyword argument 'content_type'
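The traceback above shows a `TypeError` raised when a keyword argument is passed to a constructor that no longer accepts it, which is how an upstream signature change in a dependency typically surfaces. A minimal sketch of a version-tolerant construction pattern, using hypothetical stand-in classes (`LegacyUploadFile` and `NewUploadFile` below are illustrative only, not starlette's actual implementation):

```python
import inspect

# Hypothetical stand-ins for the old and new constructor shapes.
class LegacyUploadFile:
    def __init__(self, filename, content_type=""):
        self.filename = filename
        self.content_type = content_type

class NewUploadFile:
    def __init__(self, filename, headers=None):
        self.filename = filename
        self.headers = headers or {}

    @property
    def content_type(self):
        # The newer shape derives content_type from headers instead of
        # taking it as a constructor argument.
        return self.headers.get("content-type", "")

def make_upload_file(cls, filename, content_type):
    """Pass content_type directly only if the constructor accepts it."""
    params = inspect.signature(cls.__init__).parameters
    if "content_type" in params:
        return cls(filename, content_type=content_type)
    return cls(filename, headers={"content-type": content_type})

old = make_upload_file(LegacyUploadFile, "a.txt", "text/plain")
new = make_upload_file(NewUploadFile, "a.txt", "text/plain")
```

Inspecting the constructor signature at call time is one way a library can stay compatible across an upstream API break; pinning the dependency version, as this release does, is the simpler fix.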

BentoML - v1.0.13

🍱 BentoML v1.0.13 is released, featuring a preview of batch inference with Spark.

  • Run the batch inference job using the bentoml.batch.run_in_spark() method. This method takes the API name, the Spark DataFrame containing the input data, and the Spark session itself as parameters, and it returns a DataFrame containing the results of the batch inference job.

    import bentoml

    # Import the bento from a repository or get the bento from the bento store
    bento = bentoml.import_bento("s3://bentoml/quickstart")

    # Run the run_in_spark function with the bento, API name, and Spark session
    results_df = bentoml.batch.run_in_spark(bento, "classify", df, spark)

  • Internally, what happens when you run run_in_spark is as follows:

    • First, the bento is distributed to the cluster. Note that if the bento has already been distributed, i.e. you have already run a computation with that bento, this step is skipped.
    • Next, a process function is created, which starts a BentoML server on each of the Spark workers, then uses a client to process all the data. This is done so that the workers take advantage of the batch processing features of the BentoML server. PySpark pickles this process function and dispatches it, along with the relevant data, to the workers.
    • Finally, the function is evaluated on the given dataframe. Once all methods that the user defined in the script have been executed, the data is returned to the master node.
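The worker-side batching described above can be sketched without Spark: records in a partition are grouped into batches before each inference call, so one model invocation serves many rows. All names below (`infer_batch`, `process_partition`) are illustrative stand-ins, not BentoML's internals:

```python
# Hypothetical stand-in for a call to a local BentoML server's API;
# the real flow sends a batch of rows to the server via a client.
def infer_batch(rows):
    return [f"classified:{row}" for row in rows]

def process_partition(rows, batch_size=2):
    """Group a partition's rows into batches and run inference per batch."""
    results = []
    for i in range(0, len(rows), batch_size):
        results.extend(infer_batch(rows[i:i + batch_size]))
    return results
```

In the actual flow, PySpark pickles a process function like this and dispatches it, together with each data partition, to the workers; the sketch only shows the batching idea.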

⚠️ The bentoml.batch API may undergo incompatible changes until general availability is announced in a later minor version release.

Shout out to jeffthebear, KimSoungRyoul, Robert Fernandez, Marco Vela, Quan Nguyen, and y1450 from the community for their contributions in this release.

What's Changed

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [bentoml](https://github.com/bentoml/BentoML) from 1.0.0a4 to 1.0.15.
- [Release notes](https://github.com/bentoml/BentoML/releases)
- [Commits](bentoml/BentoML@v1.0.0-a4...v1.0.15)

---
updated-dependencies:
- dependency-name: bentoml
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies label (Pull requests that update a dependency file) Feb 15, 2023
dagshub bot commented Feb 15, 2023
