[Epic] AI Insights + Assistant - Add "Other" option to the existing OpenAI Connector dropdown list (#8936) #194831
Conversation
…penAI Connector dropdown list (elastic#8936)
x-pack/plugins/stack_connectors/public/connector_types/openai/translations.ts
…translations.ts Co-authored-by: Patryk Kopyciński <contact@patrykkopycinski.com>
Telemetry changes LGTM
@elasticmachine merge upstream
💛 Build succeeded, but was flaky
To update your PR or re-run it, just comment with:
cc @e40pud
ResponseOps changes LGTM
Standard warning about config/secrets schema changes in serverless: if we ship this code in serverless, someone creates a connector using the new schema, and we then have to roll back the serverless release, that connector will fail to validate and won't be able to execute. It should be deletable and editable in the UX, so it could be edited to match the older schema and would then start working again. Low risk of that happening. A safer approach is to release just the schema changes first, wait a week, then ship the rest of the PR. That way, on a rollback, the schema will be valid ...
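The rollback concern can be sketched in a few lines: a connector config saved with the new "Other" provider value validates against the new schema but not the old one, so after a rollback it can no longer execute. This is a minimal TypeScript sketch with illustrative names, not the actual Kibana schema code.

```typescript
// Hypothetical sketch of the rollback scenario described above.
// Validator names and provider strings are illustrative assumptions.
type Validator = (config: Record<string, unknown>) => boolean;

// Old schema: only recognizes the provider values it knows about.
const oldSchema: Validator = (c) =>
  typeof c.apiUrl === 'string' &&
  ['OpenAI', 'Azure OpenAI'].includes(String(c.apiProvider));

// New schema: additionally accepts the "Other" provider.
const newSchema: Validator = (c) =>
  typeof c.apiUrl === 'string' &&
  ['OpenAI', 'Azure OpenAI', 'Other'].includes(String(c.apiProvider));

const otherConnector = {
  apiProvider: 'Other',
  apiUrl: 'http://localhost:8000/v1',
};

console.log(newSchema(otherConnector)); // → true (valid before rollback)
console.log(oldSchema(otherConnector)); // → false (fails after rollback)
```

This is why shipping the schema change one release ahead of the UI makes a rollback safe: the older code would already accept the new value.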
LGTM!
Works really well! Thank you.
@elasticmachine merge upstream
@elasticmachine merge upstream
Starting backport for target branches: 8.x https://github.com/elastic/kibana/actions/runs/11263805561
💚 Build Succeeded
cc @e40pud
…penAI Connector dropdown list (elastic#8936) (elastic#194831) (cherry picked from commit 83a701e)
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI.
Questions? Please refer to the Backport tool documentation
…o the existing OpenAI Connector dropdown list (#8936) (#194831) (#195688)

# Backport

This will backport the following commits from `main` to `8.x`:

- [[Epic] AI Insights + Assistant - Add "Other" option to the existing OpenAI Connector dropdown list (#8936) (#194831)](#194831)

### Questions?

Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Ievgen Sorokopud <ievgen.sorokopud@elastic.co>
@e40pud were there plans to update documentation for this change?
@leemthompo I think we should update this section https://www.elastic.co/guide/en/security/current/connect-to-byo-llm.html#_configure_the_connector_in_your_elastic_deployment and mention that users should select cc @jamesspi
@leemthompo here is the docs ticket elastic/security-docs#5965; please let me know if any information is needed.
@Charelzard to track docs and blog update. |
Summary
Epic https://github.com/elastic/security-team/issues/8936
At the moment, our users can host local LLMs behind an inference server that follows OpenAI's API spec and use them through our OpenAI connector by changing the URL to point at their own server. While this works just fine, it's not evident that this is something we support.
With these changes, we add an "Other" option to the existing OpenAI Connector dropdown list so users know this is possible. The "Other" option has no pre-populated text in the URL, Model, and Auth input sections.
All the underlying requests and data are identical to those of the existing OpenAI option in the dropdown.
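Because the payload follows OpenAI's API spec, the only thing that changes with the "Other" option is the base URL the request is sent to. A minimal TypeScript sketch of building such an OpenAI-compatible chat completions request (function and field names here are illustrative, not the actual connector code):

```typescript
// Hypothetical sketch: an OpenAI-style chat completions request aimed
// at a user-hosted inference server instead of api.openai.com.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildChatRequest(baseUrl: string, model: string, messages: ChatMessage[]) {
  return {
    // Normalize trailing slashes so the path joins cleanly.
    url: `${baseUrl.replace(/\/+$/, '')}/chat/completions`,
    body: JSON.stringify({ model, messages }),
  };
}

const req = buildChatRequest('http://localhost:8000/v1', 'llama-3.1-70b', [
  { role: 'user', content: 'Hello' },
]);
console.log(req.url); // → http://localhost:8000/v1/chat/completions
```

Any server that honors this request/response shape (vLLM, Ollama's OpenAI-compatible endpoint, LM Studio, etc.) should work behind the "Other" option.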
Recording
Screen.Recording.2024-10-03.at.14.35.26.mov
Testing
To test the workflow, follow these steps:
1. Select "Other (OpenAI Compatible Service)" as the provider.
2. Enter the credentials.
Use these credentials for testing LLama 3.1 70B and Mistral Large 2407:
https://p.elstc.co/paste/ofqxUGiW#RR1Pedserj9hWKm3vOw5oVEHWuwbX7i9Jl2rS7q7MRP
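For scripted testing, the connector can also be created through Kibana's connectors HTTP API. This is a hedged sketch of building the request payload; the field names and the `.gen-ai` connector type id are assumptions based on the OpenAI connector and should be verified against your Kibana version.

```typescript
// Hypothetical sketch: payload for POST /api/actions/connector.
// Field names (apiProvider, apiUrl, defaultModel, apiKey) are assumptions.
function buildConnectorPayload(apiUrl: string, model: string, apiKey: string) {
  return {
    name: 'Local LLM (Other)',
    connector_type_id: '.gen-ai',
    config: {
      apiProvider: 'Other',
      apiUrl,
      defaultModel: model,
    },
    secrets: { apiKey },
  };
}

const payload = buildConnectorPayload(
  'http://localhost:8000/v1/chat/completions',
  'llama-3.1-70b',
  'dummy-key'
);
console.log(payload.connector_type_id); // → .gen-ai
```

Note that, per the "Other" option's design, none of these values are pre-populated in the UI; the tester supplies the URL, model, and API key for their own server.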
Checklist
Delete any items that are not applicable to this PR.