
Feature/upgrade LlamaIndex to 0.10 #1663

Merged: 15 commits merged into main on Mar 6, 2024
Conversation

imartinez (Collaborator) commented Feb 29, 2024

This PR includes breaking changes to how PrivateGPT is installed.

It follows the breaking changes introduced by LlamaIndex 0.10.x.

Changes:

  • Documentation update
  • More and better setup examples in the installation documentation, together with a reviewed set of default settings files
  • Stop using ServiceContext (deprecated in LlamaIndex)
  • Refactor of all imports following the changes in LlamaIndex: we went through the full migration, so no "deprecated" or "legacy" APIs are used.
  • Cleanup of dependencies, making most of them optional. We follow LlamaIndex's nomenclature when naming the optional dependencies for full clarity.
  • Stopped using the "local" concept that used to abstract the usage of LlamaCPP LLM + HuggingFace Embeddings. The two are now separated for clarity and extensibility.

The new way of installing PrivateGPT's dependencies is by choosing the modules to use and installing them through "extras":

poetry install --extras "<extra1> <extra2>..."

Where <extra> can be any of the following:

  • ui: adds support for UI using Gradio
  • llms-ollama: adds support for Ollama LLM, the easiest way to get a local LLM running
  • llms-llama-cpp: adds support for local LLM using LlamaCPP - expect a messy installation process on some platforms
  • llms-sagemaker: adds support for Amazon Sagemaker LLM, requires Sagemaker inference endpoints
  • llms-openai: adds support for OpenAI LLM, requires OpenAI API key
  • llms-openai-like: adds support for 3rd party LLM providers that are compatible with OpenAI's API
  • embeddings-huggingface: adds support for local Embeddings using HuggingFace
  • embeddings-sagemaker: adds support for Amazon Sagemaker Embeddings, requires Sagemaker inference endpoints
  • embeddings-openai: adds support for OpenAI Embeddings, requires OpenAI API key
  • vector-stores-qdrant: adds support for Qdrant vector store
  • vector-stores-chroma: adds support for Chroma DB vector store
  • vector-stores-postgres: adds support for Postgres vector store
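As an illustration of how the extras above combine, here is a small shell sketch (not part of the repo, just an illustrative helper) that composes a `poetry install` command from a chosen LLM backend, pairing local backends with HuggingFace embeddings and the OpenAI backends with OpenAI embeddings:

```shell
# Illustrative helper, not shipped with PrivateGPT: build the extras
# string for a chosen LLM backend, using the extra names listed above.
backend="ollama"   # one of: ollama, llama-cpp, sagemaker, openai, openai-like
extras="ui llms-${backend} vector-stores-qdrant"
case "$backend" in
  ollama|llama-cpp|sagemaker) extras="$extras embeddings-huggingface" ;;
  openai|openai-like)         extras="$extras embeddings-openai" ;;
esac
# Prints: poetry install --extras "ui llms-ollama vector-stores-qdrant embeddings-huggingface"
echo "poetry install --extras \"$extras\""
```

The backend-to-embeddings pairing is an assumption for the example; any combination of one LLM extra, one embeddings extra, and one vector-store extra is valid.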

We are making the Ollama setup the recommended local setup:

poetry install --extras "ui llms-ollama embeddings-huggingface vector-stores-qdrant"
poetry run python scripts/setup  # To install local embeddings model, given Ollama doesn't contain Embeddings yet
PGPT_PROFILES=ollama make run
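For reference, the `PGPT_PROFILES=ollama` variable above selects a settings profile file. A minimal sketch of what such a profile might contain is shown below; the key names are illustrative assumptions for this example, not the exact PrivateGPT configuration schema:

```yaml
# Hypothetical settings-ollama.yaml sketch (key names are assumptions):
llm:
  mode: ollama          # use the llms-ollama extra
embedding:
  mode: huggingface     # local embeddings via embeddings-huggingface
vectorstore:
  database: qdrant      # vector-stores-qdrant extra
```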

imartinez requested a review from pabloogc February 29, 2024 15:54
pabloogc previously approved these changes Mar 1, 2024

@pabloogc left a comment:
something will probably blow up in some configuration, but looks good 👍

imartinez requested a review from pabloogc March 1, 2024 15:38
felciano commented Mar 2, 2024

@imartinez what do you think about including a few test documents in the distribution, and then adding some sort of "Test your installation" section to the Install docs that help users confirm that everything is running correctly (e.g. by uploading the provided test documents, posing a question to the Chat, and verifying the response coming back)?

imartinez (Collaborator, Author) replied:

> @imartinez what do you think about including a few test documents in the distribution, and then adding some sort of "Test your installation" section to the Install docs that help users confirm that everything is running correctly (e.g. by uploading the provided test documents, posing a question to the Chat, and verifying the response coming back)?

Thanks for the idea. We had exactly that in the primordial version. But beyond some odd discussion about how appropriate the document itself was, it added little value.
That said, simplifying the installation itself is a priority to help people get up and running.

imartinez merged commit 45f0571 into main Mar 6, 2024
8 checks passed
imartinez deleted the feature/upgrade-llamaindex branch March 6, 2024 16:51
3 participants