
LangStream


Check out our website.

Have a question? Join our community on Slack or Linen!

For the complete documentation, go here.

Get the LangStream VS Code extension here.

Contents

  • CLI
  • Try the sample application
  • Create your own application
  • Run LangStream on Kubernetes
  • Development

CLI

Warning: the CLI requires Java 11+ to already be installed on your machine.

Installation

There are multiple ways to install the CLI.

  • macOS:

    • Homebrew
    brew install LangStream/langstream/langstream
    
    • Binary with curl
    curl -Ls "https://raw.githubusercontent.com/LangStream/langstream/main/bin/get-cli.sh" | bash
    
  • Unix:

    • Binary with curl
    curl -Ls "https://raw.githubusercontent.com/LangStream/langstream/main/bin/get-cli.sh" | bash
    

Verify the binary is available:

langstream -V

Refer to the CLI documentation to learn more.

Try the sample application

Run the sample Chat Completions application on-the-fly:

export OPEN_AI_ACCESS_KEY=your-key-here
langstream docker run test \
   -app https://github.com/LangStream/langstream/blob/main/examples/applications/openai-completions \
   -s https://github.com/LangStream/langstream/blob/main/examples/secrets/secrets.yaml

In a different terminal window:

langstream gateway chat test -cg consume-output -pg produce-input -p sessionId=$(uuidgen)

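The -cg, -pg, and -p flags name gateways and parameters that the application itself defines: a produce gateway for questions and a consume gateway for answers, both keyed by a sessionId parameter. The sketch below is only illustrative (the topic names are made up); see the openai-completions example for the actual gateway definitions.

gateways:
  - id: produce-input
    type: produce
    topic: questions-topic     # illustrative topic name
    parameters:
      - sessionId
  - id: consume-output
    type: consume
    topic: answers-topic       # illustrative topic name
    parameters:
      - sessionId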

See more sample applications in the examples folder.

Create your own application

To create your own application, refer to the developer documentation.
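
A LangStream application is essentially a directory of YAML manifests describing topics, a pipeline of agents, and optionally gateways and secrets. The sketch below is illustrative only: the agent type, topic names, model, and templating shown here are assumptions, so follow the developer documentation and the examples folder for real configurations.

# pipeline.yaml (illustrative sketch)
topics:
  - name: "input-topic"
    creation-mode: create-if-not-exists
  - name: "output-topic"
    creation-mode: create-if-not-exists
pipeline:
  - name: "answer-with-the-llm"
    type: "ai-chat-completions"        # assumed agent type
    input: "input-topic"
    output: "output-topic"
    configuration:
      model: "gpt-3.5-turbo"           # assumed model name
      messages:
        - role: user
          content: "{{ value }}"       # assumed templating syntax

Credentials such as the OpenAI key are typically not hard-coded in the pipeline but referenced from the secrets file passed with -s, as in the sample application above.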

Run LangStream on Kubernetes

LangStream is production-ready, and deploying it on a Kubernetes cluster is highly recommended. The following Kubernetes distributions are supported:

  • Amazon EKS
  • Azure AKS
  • Google GKE
  • Minikube

To run a LangStream cluster, you need the following external components:

  • Apache Kafka or Apache Pulsar cluster
  • S3 API-compatible storage or Azure Blob Storage (Amazon S3, Google Cloud Storage, Azure Blob Storage, MinIO)

Production-ready deployment

To install LangStream, you can use the langstream Helm chart:

helm repo add langstream https://langstream.ai/charts
helm repo update

Then create the values file. At this point, the storage service must already be up and running.

If you're using S3, you can use the following values:

codeStorage:
  type: s3
  configuration:
    access-key: <aws-access-key>
    secret-key: <aws-secret-key>

For Azure:

codeStorage:
  type: azure
  configuration:
    endpoint: https://<storage-account>.blob.core.windows.net
    container: langstream
    storage-account-name: <storage-account>
    storage-account-key: <storage-account-key>
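
MinIO, or any other S3-compatible store, is configured the same way as S3. The sketch below assumes the s3 configuration also accepts endpoint and bucket-name keys, and the endpoint URL is just a placeholder; check the chart's values for the exact key names:

codeStorage:
  type: s3
  configuration:
    endpoint: http://minio.minio-dev.svc.cluster.local:9000   # assumed key, placeholder URL
    access-key: <minio-access-key>
    secret-key: <minio-secret-key>
    bucket-name: langstream-code-storage                      # assumed key name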

Now install LangStream with it:

helm install -n langstream --create-namespace langstream langstream/langstream --values values.yaml
kubectl wait -n langstream deployment/langstream-control-plane --for condition=available --timeout=300s

Local deployment

To create a local LangStream cluster, it's recommended to use minikube. mini-langstream helps with installing and managing your local cluster.

To install mini-langstream:

  • macOS:
brew install LangStream/langstream/mini-langstream
  • Unix:
curl -Ls "https://raw.githubusercontent.com/LangStream/langstream/main/mini-langstream/get-mini-langstream.sh" | bash

Then start up the cluster:

mini-langstream start

Deploy an application:

export OPEN_AI_ACCESS_KEY=<your-openai-api-key>
mini-langstream cli apps deploy my-app \
   -app https://github.com/LangStream/langstream/tree/main/examples/applications/openai-completions \
   -s https://github.com/LangStream/langstream/blob/main/examples/secrets/secrets.yaml

To stop the cluster:

mini-langstream delete

Refer to the mini-langstream documentation to learn more.

Development

Requirements for building the project:

  • Docker
  • Java 17
  • Git
  • Python 3.11+ and pip

If you want to test local code changes, you can use mini-langstream.

mini-langstream dev start

This command will build the images in the minikube context and install all the LangStream services with the snapshot image.

Once the cluster is running, if you want to build and load a new version of a specific service, you can run:

mini-langstream dev build <service>

or, for all the services:

mini-langstream dev build