

Docq Quick Start (docq-qs) Edition

Plug-n-Play knowledge portal with secure AI

Introducing Docq.AI:

  • One cloud bill (Azure/AWS/GCP) and no external suppliers
  • Easy integration: drop documents into blob storage (Azure Blob Storage, AWS S3 or GCP Cloud Storage); see the ingestion sketch after this list
  • Private organisational data stays within your cloud account's network boundary
  • Familiar UX for search and chat
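
To make the ingestion path concrete, here is a minimal sketch in Python using the Azure SDK; the container name, file name and connection string are placeholders, and the AWS/GCP equivalents would use boto3 or google-cloud-storage instead.

    # Minimal ingestion sketch (assumes the azure-storage-blob package).
    # Container name, file name and connection string are illustrative only.
    from azure.storage.blob import BlobServiceClient

    # The connection string would normally come from your Azure account or Key Vault.
    service = BlobServiceClient.from_connection_string("<your-connection-string>")
    container = service.get_container_client("docq-documents")

    # "Dropping" a document into blob storage is just an upload; Docq reads
    # documents from the configured container for indexing.
    with open("quarterly-report.pdf", "rb") as f:
        container.upload_blob(name="quarterly-report.pdf", data=f, overwrite=True)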

Deploy

Azure

  • Deploy to Azure - App Service
  • Deploy to Azure - Container Instances

With:

  • Azure Blob Storage
  • Azure Cognitive Services OpenAI
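
As a rough idea of how the Azure OpenAI piece might be wired up with LangChain (the framework this project builds on), the sketch below uses placeholder deployment, endpoint and key values; exact parameter names vary between LangChain versions and this is not the project's actual code.

    # Hedged sketch: calling an Azure OpenAI deployment through LangChain.
    # Deployment name, endpoint, API version and key are placeholders.
    from langchain.chat_models import AzureChatOpenAI

    llm = AzureChatOpenAI(
        deployment_name="gpt-35-turbo",                           # your Azure OpenAI deployment
        openai_api_base="https://<resource>.openai.azure.com/",   # your resource endpoint
        openai_api_version="2023-05-15",
        openai_api_key="<api-key>",                               # usually read from secrets
    )

    print(llm.predict("Summarise our leave policy in two sentences."))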

AWS

Coming soon

With:

  • AWS S3
  • AWS Bedrock

GCP

Coming soon

With:

  • GCP Cloud Storage
  • GCP PaLM2

Plan

Decide where to run the application

By picking AWS/GCP/Azure, you're making implicit decisions on:

  • How to ingest documents
  • Which LLM(s) to use
  • Which other vendor-specific services support the operations

Ideally, base the choice on your organisation's existing cloud infrastructure setup.

Plan the provisioned infrastructure

Again, your organisation may already have recommended practices in place for operating cloud infrastructure.

A few crucial topics relevant to Docq:

  • Security: how end users access the application and what guardrails are in place to enforce your policies
  • Compliance: follow your organisation's existing procedures and policies
  • Cost: running self-hosted LLMs can be costly, so right-size the infrastructure

Develop

The app is built with LangChain and Streamlit, with dependencies managed by Poetry.

  1. poetry install to get the project ready to go
  2. mkdir .streamlit && cp docs/secrets.toml.example .streamlit/secrets.toml, then fill in the API credentials (see the secrets sketch below)
  3. ./run.sh to run it locally
  4. Alternatively, set up Streamlit Community Cloud to deploy a non-cloud-native version
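
For reference, Streamlit exposes whatever you put in .streamlit/secrets.toml through st.secrets at runtime. The key names in the sketch below are hypothetical; use whichever keys docs/secrets.toml.example actually defines.

    # Hedged sketch of reading API credentials inside the Streamlit app.
    # The section/key names ("openai", "api_key") are hypothetical.
    import streamlit as st

    api_key = st.secrets["openai"]["api_key"]
    st.write("Credentials loaded:", bool(api_key))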