This repository has been archived by the owner on Sep 12, 2024. It is now read-only.

Commit

update readme (#16)
* update readme

* update readme
SeeknnDestroy authored Oct 19, 2023
1 parent c115ea2 commit 2758c2e
Showing 1 changed file with 5 additions and 21 deletions.
README.md: 26 changes (5 additions, 21 deletions)
@@ -241,31 +241,15 @@
 ______________________________________________________________________

 Our roadmap outlines upcoming features and integrations to make AutoLLM the most extensible and powerful base package for large language model applications.

-- [ ] **VectorDB Integrations**:
+- [ ] **Budget based email notification feature**

-  - [x] Decouple DB index operations from vector store classes
-  - [ ] Add utility functions for creating and updating indexes based on local files and llama-index vector store instances
-  - [x] Update AutoVectorStore to support all VectorDB integrations without manual maintenance of vector store classes
-  - [x] Update AutoQueryEngine, AutoLLM, and AutoServiceContext to support new AutoVectorStore API
+- [ ] **Add evaluation metrics for LLMs**:

-- [ ] **Pipelines**:
+- [ ] **Set default vector store as LanceDB**

-  - [ ] In memory PDF QA pipeline
-  - [ ] DB-based documentation QA pipeline
+- [ ] **Add unit tests for online vectorDB integrations**:

-- [ ] **FastAPI Integration**:
-
-  - [ ] FastAPI integration for Pipelines
-
-- [ ] **Tests**:
-
-  - [ ] Add unit tests for online vectorDB integrations
-
-- [ ] **Additional Document Providers**:
-
-  - [ ] Amazon S3-based document provider
-  - [ ] FTP-based document provider
-  - [ ] Google Drive-based document provider
+- [ ] **Add example code snippet to Readme on how to integrate llama-hub readers**:

 ______________________________________________________________________
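The last added roadmap item calls for a README example showing how to plug llama-hub readers into AutoLLM. Below is a minimal sketch of what such an integration might look like: it uses llama-index's `download_loader` to fetch a llama-hub reader, while the `AutoQueryEngine.from_defaults(documents=...)` call is an assumption about autollm's API at the time of this commit, not a documented signature.

```python
# Hypothetical sketch of the "integrate llama-hub readers" roadmap item.
# download_loader is llama-index's helper for fetching llama-hub readers;
# the AutoQueryEngine.from_defaults(documents=...) call is an assumed
# autollm entry point, not a confirmed API.
from llama_index import download_loader
from autollm import AutoQueryEngine

# Pull the Wikipedia reader from llama-hub and load a page as llama-index Documents.
WikipediaReader = download_loader("WikipediaReader")
documents = WikipediaReader().load_data(pages=["Large language model"])

# Assumed: the query engine can be built directly from llama-index Document objects.
query_engine = AutoQueryEngine.from_defaults(documents=documents)
print(query_engine.query("What is a large language model?"))
```

Any other llama-hub reader that returns llama-index `Document` objects could be swapped in the same way.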
