Introducing Docq.AI:
- One cloud bill (Azure, AWS, or GCP) and no external suppliers
- Easy integration: drop documents into blob storage (Azure Blob Storage, AWS S3, or GCP Cloud Storage)
- Private organisational data stays within your cloud account's network boundary
- Familiar UX for search and chat
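The "drop documents in, have them ingested" flow can be sketched with a local folder standing in for a blob container. This is a hypothetical illustration, not Docq's actual code; the `find_new_documents` helper is an assumption:

```python
from pathlib import Path


def find_new_documents(inbox: Path, seen: set[str]) -> list[Path]:
    """Return documents dropped into the inbox that have not been ingested yet.

    In production the inbox would be an Azure Blob Storage container,
    an S3 bucket, or a GCS bucket rather than a local folder.
    """
    new = [p for p in sorted(inbox.iterdir()) if p.is_file() and p.name not in seen]
    seen.update(p.name for p in new)
    return new
```

Each ingestion pass only picks up files it has not seen before, so users can keep dropping documents into the same location.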
On Azure (App Service or Container Instances), with:
- Azure Blob Storage
- Azure OpenAI Service (Cognitive Services)
Coming soon on AWS, with:
- AWS S3
- AWS Bedrock
Coming soon on GCP, with:
- GCP Cloud Storage
- GCP PaLM 2
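Supporting three clouds typically means putting each storage service behind a common interface. The sketch below shows one way that could look; the names are hypothetical and not Docq's internals:

```python
from typing import Protocol


class DocumentStore(Protocol):
    """Minimal interface each cloud's storage adapter would implement."""

    def list_documents(self) -> list[str]: ...
    def read_document(self, name: str) -> bytes: ...


class InMemoryStore:
    """Stand-in used here for illustration; real adapters would wrap
    azure-storage-blob, boto3 (S3), or google-cloud-storage."""

    def __init__(self, docs: dict[str, bytes]):
        self._docs = docs

    def list_documents(self) -> list[str]:
        return sorted(self._docs)

    def read_document(self, name: str) -> bytes:
        return self._docs[name]


def total_ingested_bytes(store: DocumentStore) -> int:
    """Pull every document through the same code path, whichever cloud backs it."""
    return sum(len(store.read_document(n)) for n in store.list_documents())
```

The application code depends only on `DocumentStore`, so picking Azure, AWS, or GCP is a deployment decision rather than a code change.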
By picking AWS/GCP/Azure, you're making implicit decisions on:
- How to ingest documents
- Which LLM(s) to use
- Other vendor-specific services to support operations
Ideally, the choice should be based primarily on your organisation's existing cloud infrastructure. Your organisation may also already have recommended practices in place for operating cloud infrastructure.
A few crucial topics relevant to Docq:
- Security: how end users access the application and what guardrails are in place to enforce your policies
- Compliance: following your organisation's existing procedures and policies
- Cost: running self-hosted LLMs can be expensive, so right-size your deployment carefully
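For the cost point, a back-of-envelope estimate helps with right-sizing. The function below is a generic sketch; the rate and volumes in the example are made-up placeholders, not actual provider pricing:

```python
def estimate_monthly_llm_cost(tokens_per_query: int,
                              queries_per_day: int,
                              usd_per_1k_tokens: float,
                              days: int = 30) -> float:
    """Rough monthly LLM spend; plug in your provider's real rates."""
    monthly_tokens = tokens_per_query * queries_per_day * days
    return monthly_tokens / 1000 * usd_per_1k_tokens
```

For example, 1,000 tokens per query at 100 queries per day and a hypothetical $0.002 per 1K tokens comes to about $6 per month; rerun the numbers with your own usage and rates before committing to a deployment size.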
Docq uses LangChain and Streamlit, and is built with Poetry. To get the project ready to go:

poetry install

Then create the Streamlit secrets file and fill in your API credentials:

mkdir .streamlit && cp docs/secrets.toml.example .streamlit/secrets.toml

To run it locally:

./run.sh

Alternatively, set up Streamlit Community Cloud to deploy a non-cloud-native version.