
Expand LLM support #3306

Closed
3 of 4 tasks
masci opened this issue Oct 3, 2022 · 1 comment · Fixed by #3453
Comments

masci (Contributor) commented Oct 3, 2022

Haystack can already work with LLMs (see GPT-3 usage for answer generation), but it falls short in showing how the feature can actually help users. We can collect LLM-related features under two large umbrellas:

Ideally, Haystack should not have a preferred way to work with LLMs, but in Q4 we'll focus on making progress on the first group.

Desired outcome

  • Use GPT-3 embeddings support to provide document retrieval (see the sketch after this list)
  • Support co:here in one use case within Haystack
  • Write a tutorial showing how to use Haystack to compare different technologies (e.g. GPT-3 vs. classic transformers)
  • Draft a design to generically support an "inference as a service" concept within Haystack
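For the first item, here is a minimal sketch of what GPT-3 embedding retrieval could look like in a Haystack v1 pipeline. The parameter names `embedding_model` and `api_key` on `EmbeddingRetriever`, and the model identifier `text-embedding-ada-002`, are assumptions about how this support might be exposed, not a confirmed API:

```python
# Hypothetical sketch: OpenAI (GPT-3) embeddings for document retrieval in Haystack.
# The embedding_model / api_key parameters below are assumptions, not a confirmed API.
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import EmbeddingRetriever

document_store = InMemoryDocumentStore(embedding_dim=1536)  # ada-002 vectors are 1536-dim

document_store.write_documents([
    {"content": "Haystack is a framework for building search pipelines."},
    {"content": "GPT-3 can produce text embeddings via the OpenAI API."},
])

retriever = EmbeddingRetriever(
    document_store=document_store,
    embedding_model="text-embedding-ada-002",  # assumed OpenAI model identifier
    api_key="OPENAI_API_KEY",                  # placeholder key
)

# Compute and store embeddings for all documents, then query by semantic similarity.
document_store.update_embeddings(retriever)
results = retriever.retrieve(query="How do I get embeddings from GPT-3?", top_k=2)
for doc in results:
    print(doc.content, doc.score)
```

Swapping the model identifier for a co:here or sentence-transformers model in the same retriever would be one way to cover the second and third items (comparing providers) without changing the rest of the pipeline.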
masci (Contributor, Author) commented Dec 27, 2022

Closing as complete

masci closed this as completed on Dec 27, 2022
masci removed the epic:in-progress label on Dec 27, 2022