
feat: Generate query title by using LLM #1255

Merged: 11 commits into pinterest:master on Jun 5, 2023
Conversation

jczhong84 (Collaborator):

  1. Add a plugin setup for an LLM-based AI assistant.
  2. Add title generation for query cells.

Resolved review threads (now outdated) on:

  * querybook/config/querybook_default_config.yaml
  * querybook/server/datasources/ai_assistant.py
  * querybook/server/lib/ai_assistant/ai_assistant.py
  * querybook/server/lib/ai_assistant/all_ai_assistants.py
  * querybook/webapp/components/AIAssistant/QueryCellTitle.tsx
  * requirements/base.txt
from abc import ABC, abstractmethod


from langchain.prompts.chat import (
Collaborator:
these are too chat-specific, i.e., they fit better with ChatGPT-style models than with others

Collaborator (author):

the chat model is generic in langchain; it supports OpenAI as well as other chat models like Google PaLM and Anthropic. Since it's only used to create the prompt template, I can move it to the openai file
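For context on the prompt-template discussion above, a chat-style prompt for title generation could look roughly like the sketch below. This is a plain-Python stand-in for what langchain's `ChatPromptTemplate` provides; the function name and prompt wording are illustrative assumptions, not taken from the PR.

```python
# Hypothetical sketch of a chat-style prompt for query title generation.
# A plain-Python analogue of a langchain ChatPromptTemplate; the role/content
# message shape matches what chat model APIs generally expect.

def build_title_prompt(query: str) -> list[dict[str, str]]:
    """Build chat messages asking an LLM for a short title for a SQL query."""
    system = (
        "You are a helpful assistant that writes short, descriptive "
        "titles for SQL queries."
    )
    human = f"Generate a concise title for this query:\n{query}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": human},
    ]

messages = build_title_prompt("SELECT id, name FROM users")
# messages[0] is the system instruction; messages[1] carries the query text
```

The same message list can be handed to any chat-model backend, which is the portability point made in the comment above.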

async function streamDatasource(
url: string,
data?: Record<string, unknown>,
onData?: (data: string) => void,
Collaborator:

having both `data` and `onData` is a bit confusing.
Also, it would be nice to document that the data passed to `onData` is the cumulative data so far, not the partial delta
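The cumulative-vs-delta semantics raised above can be sketched as follows. This is a hypothetical, language-agnostic illustration in Python (the actual `streamDatasource` is TypeScript); the names `stream_response` and `on_data` are illustrative, not from the PR.

```python
# Illustrative sketch: the streaming callback receives the full accumulated
# text so far, not just the newly arrived chunk (the "delta").
from typing import Callable, Iterable


def stream_response(chunks: Iterable[str], on_data: Callable[[str], None]) -> str:
    """Feed incremental chunks to on_data as a growing cumulative string."""
    total = ""
    for chunk in chunks:
        total += chunk   # accumulate the new delta
        on_data(total)   # the callback sees the running total, not the delta
    return total


received = []
final = stream_response(["SELECT", " *", " FROM users"], received.append)
# received == ["SELECT", "SELECT *", "SELECT * FROM users"]
# final == "SELECT * FROM users"
```

Documenting this in the callback's signature spares consumers from re-accumulating chunks themselves.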

@@ -0,0 +1,19 @@
provider: ~
Collaborator:

actually, I noticed that you can also put JSON objects under querybook_config, so why not put it there instead?

Collaborator (author):

because it will be read by the frontend as well, and we don't want querybook_config exposed to the frontend.
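As a hedged illustration of the split being discussed: keeping the AI assistant settings in their own file means only that file is served to the frontend, while querybook_config stays server-side. Only `provider: ~` appears in the diff above; the comment and any other keys would be project-specific.

```yaml
# Hypothetical standalone AI assistant config, kept separate from
# querybook_config so it can be safely exposed to the frontend.
provider: ~   # e.g. an LLM provider name; ~ (null) leaves the feature disabled
```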

Resolved review thread (now outdated) on requirements/ai/openai.txt
@czgu czgu merged commit 467e581 into pinterest:master Jun 5, 2023
@jczhong84 jczhong84 deleted the feat/ai branch June 28, 2023 06:34
aidenprice pushed a commit to arrowtail-precision/querybook that referenced this pull request Jan 3, 2024
* feat: Generate query title by using LLM

* fix linter

* fix linter

* comments

* fix linter

* disable it by default

* fix linter

* comments

* add public config

* remove openai assistant by default

* remove empty plugin list