API spec update, client demo with Stainless SDK #58
Conversation
@@ -468,12 +468,14 @@ def _build_operation(self, op: EndpointOperation) -> Operation:
    builder = ContentBuilder(self.schema_builder)
    first = next(iter(op.request_params))
    request_name, request_type = first

    from dataclasses import make_dataclass

    if len(op.request_params) == 1 and "Request" in first[1].__name__:
Can you just remove the `if` completely? Then you will get a class named `FooRequestRequest`, but that's OK. It is an implementation detail anyway.
I tried removing the `if`, but it needs some work to generate the `*Wrapper` class for the request param; e.g. 838ab91. So I figured this may be the fastest way.
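
For context, a minimal sketch (hypothetical helper and names; the generator's real logic differs) of how `make_dataclass` produces these synthetic wrapper types, and why dropping the `if` would yield names like `FooRequestRequest`:

```python
from dataclasses import make_dataclass

# Hypothetical helper: name the synthetic request-body type after the
# wrapped type, appending a "Request" suffix unconditionally.
def wrap_request_params(base_name: str, fields: list) -> type:
    return make_dataclass(f"{base_name}Request", fields)

# A parameter type whose name already ends in "Request" ...
FooRequest = make_dataclass("FooRequest", [("query", str)])

# ... gets double-suffixed when wrapped without the special-case `if`.
Wrapper = wrap_request_params(FooRequest.__name__, [("request", FooRequest)])
print(Wrapper.__name__)  # FooRequestRequest -- ugly, but an implementation detail
```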
@@ -0,0 +1,40 @@
import fire
@hardikjshah do you have any opinions on how we should situate these files?
Another option is to put them in the SDK repos: https://github.com/stainless-sdks/llama-stack-python/tree/yanxi0830/dev
I think it's better to move these out of the llama_toolchain directory, as they are not part of the toolchain package.
If I am developing in llama_toolchain and making an API change, I want to quickly test with client.py without having to worry about updating the llama_stack package (or locally installing it).
Let's have a top-level examples directory in llama-stack-apps where we can have a directory for sdk_examples and put these scripts there:
/sdk_examples
-- /inference
---- client.py
-- /agentic_system
---- client.py
So one can easily run `python sdk_examples/inference/client.py` [since the developer might have just pip-installed the stack SDK package and the Docker container without git-cloning this repo].
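
For concreteness, a hypothetical sketch of what `sdk_examples/inference/client.py` could look like. The SDK import, client class, and method names below are assumptions for illustration; only the `fire` entry point mirrors the scripts in this PR:

```python
import fire

def main(host: str = "localhost", port: int = 5000, prompt: str = "hello"):
    # Hypothetical SDK surface -- substitute the real llama-stack-python
    # client and inference call once the SDK repo is published.
    from llama_stack import LlamaStack  # assumed package/class name
    client = LlamaStack(base_url=f"http://{host}:{port}")
    response = client.inference.chat_completion(
        model="Meta-Llama3.1-8B-Instruct",
        messages=[{"role": "user", "content": prompt}],
    )
    print(response)

if __name__ == "__main__":
    fire.Fire(main)
```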
Yeah, the SDK repo's examples folder is probably best.
Done (removed these scripts from the PR)! In the meantime (while the SDK repo is getting ready), adding `sdk_examples` to llama-stack-apps in meta-llama/llama-stack-apps#66.
lg
*RequestWrapper & OpenAPI spec demo [video attachments: Screen.Recording.2024-09-09.at.11.47.25.AM.mov, Screen.Recording.2024-09-08.at.12.44.08.PM.mov]