
Add separate e2e test for send_intermediate_prediction_response #2896

Merged 2 commits from test/e2e_streaming_response into master on Jan 12, 2024

Conversation

@mreso (Collaborator) commented Jan 12, 2024

Description

Add separate e2e test for send_intermediate_prediction_response
Additionally, it enables streaming in the gpt-fast example test.
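For context, TorchServe handlers stream partial results to the client via `send_intermediate_prediction_response` (from `ts.protocol.otf_message_handler`), which takes the payload, request-ID map, status message, status code, and context. The sketch below mirrors that calling pattern but injects a stub `send` callable so it runs without a TorchServe worker; the function name, token values, and stub are illustrative, not part of this PR.

```python
# Sketch of the streaming pattern this PR's e2e test exercises.
# The real TorchServe call is:
#   ts.protocol.otf_message_handler.send_intermediate_prediction_response(
#       ret, req_id_map, message, code, context)
# Here `send` is injected so the flow can be exercised standalone.

def stream_tokens(tokens, context, send):
    """Send all but the last token as intermediate responses;
    return the final token as the regular prediction."""
    for tok in tokens[:-1]:
        # 200 = HTTP OK; the message text is illustrative
        send([tok], context["request_ids"], "Intermediate response", 200, context)
    return [tokens[-1]]

if __name__ == "__main__":
    sent = []

    def fake_send(ret, req_ids, msg, code, ctx):
        # Stub standing in for send_intermediate_prediction_response
        sent.append(ret[0])

    ctx = {"request_ids": {0: "req-0"}}
    final = stream_tokens(["Hello", " world", "!"], ctx, fake_send)
    print(sent, final)  # ['Hello', ' world'] ['!']
```

An e2e test of this flow, like the one added here, would instead start a worker, register a model whose handler makes the real call, and assert on the chunked HTTP response body.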

Type of change

Please delete options that are not relevant.

  • New test (non-breaking change which adds functionality)

Feature/Issue validation/testing

Please describe the Unit or Integration tests that you ran to verify your changes and relevant result summary. Provide instructions so it can be reproduced.
Please also list any relevant details for your test configuration.

  • pytest test/pytest/test_send_intermediate_prediction_response.py
======================================================================================= test session starts =======================================================================================
platform linux -- Python 3.10.13, pytest-7.3.1, pluggy-1.3.0
rootdir: /home/ubuntu/serve
plugins: mock-3.12.0, cov-4.1.0
collected 2 items

test/pytest/test_send_intermediate_prediction_response.py .2024-01-12T20:17:13,380 [INFO ] W-9000-tp_model_1.0 TS_METRICS - WorkerThreadTime.Milliseconds:3.0|#Level:Host|#hostname:ip-172-31-15-101,timestamp:1705090633
.                                                                                                                                [100%]

======================================================================================== 2 passed in 5.78s ========================================================================================

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?

@mreso requested review from agunapal and lxning, January 12, 2024 20:20
@mreso force-pushed the test/e2e_streaming_response branch from 20276b8 to 0dbee95 on January 12, 2024 20:20
@agunapal (Collaborator) left a comment:

LGTM

@mreso enabled auto-merge January 12, 2024 21:55
@mreso added this pull request to the merge queue Jan 12, 2024
Merged via the queue into master with commit 43e6740 Jan 12, 2024
13 checks passed