Which URL from the `status` of an `InferenceService` in `RawDeployment` mode should be used for in-cluster inference?
Creating an example InferenceService in `RawDeployment` mode, I get the following:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  annotations:
    serving.kserve.io/deploymentMode: RawDeployment
    sidecar.istio.io/inject: "false"
  name: sklearn-iris
  namespace: test
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://kfserving-examples/models/sklearn/1.0/model
status:
  address:
    url: http://sklearn-iris.test.svc.cluster.local
  components:
    predictor:
      [...]
      url: http://sklearn-iris-predictor-default-test.example.com
  [...]
  url: http://sklearn-iris-test.example.com
```
Running from inside a Notebook, I can't hit either one of the provided URLs.

Reading through the docs, it is not clear which one we should be using to perform inference from inside the cluster.
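For reference, this is roughly how I am calling it from the Notebook. It is a minimal sketch assuming the cluster-local address from the `status` above and the standard KServe v1 inference protocol (`POST /v1/models/<name>:predict`); the sample instances are just the usual sklearn-iris example inputs.

```python
import requests

# Cluster-local address taken from status.address.url above; whether this
# or one of the *.example.com hosts is the right target is exactly what
# I'm unsure about.
base_url = "http://sklearn-iris.test.svc.cluster.local"

# KServe v1 inference protocol payload for the sklearn-iris example model.
payload = {"instances": [[6.8, 2.8, 4.8, 1.4], [6.0, 3.4, 4.5, 1.6]]}

resp = requests.post(f"{base_url}/v1/models/sklearn-iris:predict", json=payload)
print(resp.status_code, resp.text)
```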