Add support for binary data extension protocol and FP16 datatype #3685
Conversation
Force-pushed from ebbc203 to 034a2d3
[APPROVALNOTIFIER] This PR is NOT APPROVED. This pull-request has been approved by: sivanantha321. The full list of commands accepted by this bot can be found here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing /approve in a comment.
Force-pushed from 1035075 to 615e6b3
/rerun-all
Force-pushed from 252e322 to d14808d
/rerun-all
Force-pushed from 03bef97 to 92a7b02
/rerun-all
1 similar comment
/rerun-all
if isinstance(body, InferRequest):
    return body, attributes
elif isinstance(body, InferenceRequest) or (
I remember InferenceRequest was always converted to InferRequest before hitting here.
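For context on the dispatch above, here is a minimal sketch, assuming kserve's module layout for these two types; `to_infer_request` is a hypothetical converter standing in for whatever conversion the handler actually performs, not the PR's code:

```python
# Minimal sketch, assuming kserve's InferRequest / InferenceRequest types.
from kserve.protocol.infer_type import InferRequest
from kserve.protocol.rest.v2_datamodels import InferenceRequest


def to_infer_request(body: InferenceRequest) -> InferRequest:
    """Hypothetical converter from the JSON dataplane model to InferRequest."""
    ...


def decode_request(body, attributes: dict):
    if isinstance(body, InferRequest):
        # The binary-data path already produced the internal representation.
        return body, attributes
    if isinstance(body, InferenceRequest):
        # Plain JSON path: the Pydantic model still needs conversion.
        return to_infer_request(body), attributes
    raise ValueError(f"unsupported request body type: {type(body)}")
```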
@@ -174,7 +173,7 @@ def create_application(self) -> FastAPI:
     r"/v2/models/{model_name}/infer",
     v2_endpoints.infer,
     methods=["POST"],
-    response_model=InferenceResponse,
+    response_model=None,
The downside of this is that we lose the validation when returning the InferenceResponse.
The response will be validated here
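For readers outside the thread, a minimal FastAPI sketch (not KServe's actual handler) of why `response_model=None` is needed here: a binary-extension response is raw bytes after a JSON header, which a fixed `InferenceResponse` Pydantic model cannot represent, so FastAPI's automatic response validation is turned off and validation happens where the response is built. `handle` below is a hypothetical stand-in:

```python
from fastapi import FastAPI, Response

app = FastAPI()


def handle(model_name: str) -> tuple[dict, bytes]:
    """Hypothetical handler returning (headers, serialized body)."""
    return {}, b"{}"


@app.post("/v2/models/{model_name}/infer", response_model=None)
async def infer(model_name: str) -> Response:
    headers, body = handle(model_name)
    if "inference-header-content-length" in headers:
        # Binary-data extension: a JSON header followed by raw tensor
        # bytes cannot be validated against a Pydantic response model.
        return Response(content=body, headers=headers,
                        media_type="application/octet-stream")
    return Response(content=body, headers=headers,
                    media_type="application/json")
```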
@@ -221,6 +223,32 @@ def test_sklearn_v2():
     res = predict(service_name, "./data/iris_input_v2.json", protocol_version="v2")
     assert res["outputs"][0]["data"] == [1, 1]

+    raw_res = predict(
+        service_name,
+        "./data/iris_input_v2_binary.json",
How do we test sending inputs with binary data for REST?
Added a custom transformer test and also included it in the Python tests.
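For reference, a hedged sketch of what such a REST test can look like under the Triton binary-data extension; the URL, input name, and values below are illustrative, not the PR's actual test code:

```python
import json
import struct

import requests  # assumed to be available in the test environment


def infer_binary(url: str) -> requests.Response:
    # Four float32 iris features sent as raw bytes instead of JSON "data".
    raw = struct.pack("<4f", 6.8, 2.8, 4.8, 1.4)
    header = json.dumps({
        "inputs": [{
            "name": "input-0",
            "shape": [1, 4],
            "datatype": "FP32",
            "parameters": {"binary_data_size": len(raw)},
        }]
    }).encode()
    return requests.post(
        url,
        data=header + raw,
        headers={
            "Content-Type": "application/octet-stream",
            # Tells the server where the JSON part ends and raw bytes begin.
            "Inference-Header-Content-Length": str(len(header)),
        },
    )
```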
@sivanantha321 Needs a rebase after merging the inference client PR.
Force-pushed from 494d70f to d00e06c
/rerun-all
3 similar comments
/rerun-all
/rerun-all
/rerun-all
Force-pushed from 865ab50 to f2efd29
/rerun-all
Force-pushed from 3245c45 to 44f7254
/rerun-all
2 similar comments
/rerun-all
/rerun-all
Force-pushed from b353bb6 to 382ba9e
Signed-off-by: Sivanantham Chinnaiyan <sivanantham.chinnaiyan@ideas2it.com>
Force-pushed from 382ba9e to d188d1d
Force-pushed from d188d1d to 3590411
/rerun-all
# Fixme: Gets only the 1st element of the input
# inputs = get_predict_input(request)
Suggested change: remove these commented-out lines.
# Fixme: Gets only the 1st element of the input
# inputs = get_predict_input(request)
Suggested change: remove these commented-out lines.
Signed-off-by: Dan Sun <dsun20@bloomberg.net>
/lgtm
/approve
What this PR does / why we need it:
Which issue(s) this PR fixes: Fixes #3643
https://github.com/triton-inference-server/server/blob/main/docs/protocol/extension_binary_data.md
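The binary-data extension is what makes the FP16 part of this PR workable over REST: JSON has no FP16 number type, so FP16 tensors travel as raw little-endian bytes after the JSON header. A sketch of decoding such a response, following the linked spec (the numpy usage is an assumption, not KServe's own helper):

```python
import json

import numpy as np  # assumed; KServe's own decoding helpers may differ


def parse_binary_output(headers: dict, body: bytes) -> np.ndarray:
    # Length of the JSON portion, per the extension's response headers.
    header_len = int(headers["Inference-Header-Content-Length"])
    resp = json.loads(body[:header_len])
    out = resp["outputs"][0]
    # binary_data_size marks how many raw bytes belong to this output.
    size = out["parameters"]["binary_data_size"]
    raw = body[header_len:header_len + size]
    return np.frombuffer(raw, dtype=np.float16).reshape(out["shape"])
```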
Re-running failed tests
/rerun-all - rerun all failed workflows.
/rerun-workflow <workflow name> - rerun a specific failed workflow. Only one workflow name can be specified. Multiple /rerun-workflow commands are allowed per comment.
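For example, one comment can retrigger several workflows at once (the workflow names below are hypothetical):

```
/rerun-workflow E2E Tests
/rerun-workflow Python Tests
```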