feat: Generate query title by using LLM #1255
Conversation
jczhong84
commented
May 24, 2023
- add a plugin setup for LLM AI assistant
- add title generation for query cells
querybook/server/lib/ai_assistant/assistants/openai_assistant.py
from abc import ABC, abstractmethod

from langchain.prompts.chat import (
These imports are too chat-specific, i.e., they fit ChatGPT-style models better than others.
The chat model is generic in LangChain; it supports OpenAI as well as other chat models, e.g. from Google PaLM and Anthropic. Since it's only used to create the prompt template, I can move it to the OpenAI file.
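To illustrate the point being made here: a chat prompt template is provider-agnostic because it only produces a list of role/content messages. A minimal pure-Python sketch (hypothetical helper names, not the actual LangChain API or Querybook code):

```python
from string import Template

def build_title_prompt(query: str) -> list[dict]:
    """Build chat messages for title generation; any chat model can consume them."""
    system = "You are an assistant that writes concise titles for SQL queries."
    human = Template("Generate a short title for this query:\n$query")
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": human.substitute(query=query)},
    ]

messages = build_title_prompt("SELECT id FROM users")
```

The same message list could then be handed to an OpenAI, PaLM, or Anthropic chat client, which is why only the model-specific wiring needs to live in the OpenAI file.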
querybook/webapp/lib/datasource.ts
async function streamDatasource(
    url: string,
    data?: Record<string, unknown>,
    onData?: (data: string) => void,
Having both `data` and `onData` is a bit confusing. Also, it would be nice to document that the value passed to `onData` is the total accumulated data so far, not the partial delta.
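The accumulated-vs-delta distinction above can be sketched in a few lines. This is a hypothetical stand-in for the streaming callback semantics, not the actual `streamDatasource` implementation:

```python
from typing import Callable, Iterable

def stream_with_accumulation(chunks: Iterable[str], on_data: Callable[[str], None]) -> str:
    """Invoke on_data with the full text received so far, not each raw chunk."""
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        on_data(buffer)  # callback sees the accumulated total
    return buffer

seen: list[str] = []
result = stream_with_accumulation(["SEL", "ECT 1"], seen.append)
```

Here `seen` ends up as `["SEL", "SELECT 1"]`: each callback invocation receives everything streamed so far, which is the behavior the comment asks to document.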
querybook/config/ai_assistant.yaml
@@ -0,0 +1,19 @@
provider: ~
Actually, I noticed that you can also put JSON objects under querybook_config, so why not put it there instead?
Because this config will be read by the frontend as well, and we don't want querybook_config exposed to the frontend.
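The split described above (frontend-visible config vs. backend-only config) can be sketched as a simple key filter. The key names and the `public_config` helper here are hypothetical illustrations, not Querybook's actual config code:

```python
# Backend holds the full config; only an allowlisted subset is sent to the frontend.
full_config = {
    "provider": "openai",
    "api_key": "sk-secret",   # backend-only: must never reach the browser
    "enabled": True,
}

PUBLIC_KEYS = {"provider", "enabled"}

def public_config(config: dict) -> dict:
    """Return only the keys that are safe to expose to the frontend."""
    return {k: v for k, v in config.items() if k in PUBLIC_KEYS}

frontend_view = public_config(full_config)
```

Keeping the AI assistant config in its own file (rather than under querybook_config) achieves the same isolation without needing a filter like this.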
- feat: Generate query title by using LLM
- fix linter
- fix linter
- comments
- fix linter
- disable it by default
- fix linter
- comments
- add public config
- remove openai assistant by default
- remove empty plugin list