Simplify readme (LangStream#498)

Commit 15de913 by nicoloboschi, Sep 28, 2023 (1 parent: 88c48ae).
Showing 1 changed file with 96 additions and 201 deletions: README.md.

Get the LangStream VS Code extension [here](https://marketplace.visualstudio.com).
## Contents

* [LangStream](#langstream)
* [Run LangStream server](#run-langstream-server)
    * [Quick start (All-in-one deployment)](#quick-start-all-in-one-deployment)
    * [LangStream deployment](#langstream-deployment)
* [CLI](#cli)
    * [Installation](#installation)
    * [Enable auto-completion](#enable-auto-completion)
    * [Usage](#usage)
* [Try the sample application](#try-the-sample-application)
* [Deploy your first application](#deploy-your-first-application)
* [Create your own application](#create-your-own-application)
* [Install Kubernetes Environment on MacOS](#install-kubernetes-environment-on-macos)
    * [Minikube](#minikube)
    * [kubectl](#kubectl)
    * [Helm](#helm)
* [Run LangStream on Kubernetes](#run-langstream-on-kubernetes)
    * [Production-ready deployment](#production-ready-deployment)
    * [Local deployment](#local-deployment)
* [Development](#development)
    * [Start minikube](#start-minikube)
    * [Build docker images and deploy all the components](#build-docker-images-and-deploy-all-the-components)
    * [Deploying to GKE or similar K8s test cluster](#deploying-to-gke-or-similar-k8s-test-cluster)
    * [Deploying to a Persistent Cluster](#deploying-to-a-persistent-cluster)

## Run LangStream server
To run LangStream, you need the following components:
- A Kubernetes cluster (Minikube, AWS EKS, Azure AKS, Google GKE, etc.)
- An Apache Kafka or Apache Pulsar cluster
- An S3 bucket or API-compatible storage (e.g. MinIO)

For local application development, Minikube is recommended. For information on setting up a Minikube
environment, see [Install Kubernetes Environment on MacOS](#install-kubernetes-environment-on-macos).
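
Before moving on, a quick sanity check of the prerequisites can save time. This is a minimal sketch, assuming `kubectl` and `helm` are already installed and pointed at your cluster:

```bash
# The cluster should be reachable and report at least one node.
kubectl cluster-info
kubectl get nodes
# Helm is needed for the deployment steps below.
helm version
```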

### Quick start (All-in-one deployment)

You can install all the required components in one shot.
First, point your local Kubernetes context at Minikube:

```
kubectl config use-context minikube
```

Then, run the following command:

```
./dev/start-simple.sh
```

The above command will automatically start the port-forwards for the LangStream control plane and the API Gateway.
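
To confirm that everything came up, list the pods across namespaces; the exact namespaces depend on what the script installs, so treat this as a rough check:

```bash
# LangStream pods (and the Kafka / storage components the script installs) should reach Running.
kubectl get pods -A
```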

### LangStream deployment
To install LangStream only, you can use the `langstream` Helm chart:

```
helm repo add langstream https://langstream.github.io/charts
helm repo update
helm install -n langstream --create-namespace langstream langstream/langstream --values helm/examples/simple.yaml
kubectl wait -n langstream deployment/langstream-control-plane --for condition=available --timeout=300s
```

You can then port-forward the control plane and the API Gateway:

```
kubectl port-forward svc/langstream-control-plane 8090:8090 &
kubectl port-forward svc/langstream-api-gateway 8091:8091 &
```
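
To verify the rollout before connecting the CLI, you can check the pods and wait for the gateway as well (a sketch; `deployment/langstream-api-gateway` is assumed to be the API Gateway deployment name in this chart):

```bash
kubectl get pods -n langstream
kubectl wait -n langstream deployment/langstream-api-gateway --for condition=available --timeout=300s
```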



## CLI

### Installation

There are multiple ways to install the CLI. One option is the installation script:

```
curl -Ls "https://raw.githubusercontent.com/LangStream/langstream/main/bin/get-cli.sh" | bash
```

Verify the binary is available:
```
langstream -V
```

### Enable auto-completion

Installing the binary directly will enable auto-completion for the CLI.

If you installed the CLI with Homebrew, you can enable auto-completion with the following commands:
- ZSH
```
[[ $(grep 'langstream generate-completion' "$HOME/.zshrc") ]] || echo -e "source <(langstream generate-completion)" >> "$HOME/.zshrc"
source $HOME/.zshrc # or open another terminal
```
- Bash
```
[[ $(grep 'langstream generate-completion' "$HOME/.bashrc") ]] || echo -e "source <(langstream generate-completion)" >> "$HOME/.bashrc"
source $HOME/.bashrc # or open another terminal
```
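
To enable completion only for the current shell session, you can source the same `generate-completion` output directly:

```bash
source <(langstream generate-completion)
```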
Refer to the [CLI documentation](https://docs.langstream.ai/installation/langstream-cli) to learn more.

### Usage
To get started, run `langstream --help` to see the available commands.
By default, the CLI will connect to the control plane running on `localhost:8090`.

To target a different LangStream environment, you can create a new profile:
```
langstream profiles create dev --web-service-url https://langstream-control-plane --api-gateway-url wss://langstream-api-gateway --tenant my-tenant --set-current
```

The available configuration options are:

| Name | Description | Default |
|---------------|-----------------------------------------|-----------------------|
| webServiceUrl | The URL of the LangStream Control Plane | http://localhost:8090 |
| apiGatewayUrl | The URL of the LangStream API Gateway | http://localhost:8091 |
| tenant | The tenant to use | default |
| token | The token to use | |
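
For example, the defaults above correspond to a profile created with the same flags shown earlier (a sketch for a local cluster):

```bash
langstream profiles create local \
  --web-service-url http://localhost:8090 \
  --api-gateway-url http://localhost:8091 \
  --tenant default \
  --set-current
```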

To list your applications, run:
```
langstream apps list
```

## Try the sample application

Run the sample Chat Completions application on-the-fly:
```bash
export OPENAI_API_KEY=your-key-here
langstream docker run test \
  -app https://github.com/LangStream/langstream/blob/main/examples/applications/openai-completions \
  -s https://github.com/LangStream/langstream/blob/main/examples/secrets/secrets.yaml
```

In a different terminal window, start chatting with the application:
```bash
langstream gateway chat test -cg consume-output -pg produce-input -p sessionId=$(uuidgen)
```

![chat](https://langstream.ai/images/chatbot-us-presidents.gif)

See more sample applications in the [examples](https://github.com/LangStream/langstream/blob/main/examples/applications) folder.

## Deploy your first application

Inside the [examples](./examples) folder, you can find some example applications.

In this example, you will deploy an application that performs AI completion and returns information about a known person.

1. Export the environment variables for your OpenAI configuration:

```
export OPEN_AI_URL=xx
export OPEN_AI_ACCESS_KEY=xx
export OPEN_AI_EMBEDDINGS_MODEL=xxx
export OPEN_AI_CHAT_COMPLETIONS_MODEL=xxx
export OPEN_AI_PROVIDER=openai
```

If you are using Azure OpenAI, set OPEN_AI_PROVIDER to azure:

```
export OPEN_AI_PROVIDER=azure
```

The values for OPEN_AI_EMBEDDINGS_MODEL and OPEN_AI_CHAT_COMPLETIONS_MODEL depend on your OpenAI environment.
On Azure they must match the names of the deployments you created in the Azure portal.

The [secrets.yaml](./examples/secrets/secrets.yaml) file contains many placeholders that refer to those environment variables.
You can either export them or replace them with the actual values.

2. Deploy the `openai-completions` application:

```
./bin/langstream apps deploy openai-completions -app examples/applications/openai-completions -i examples/instances/kafka-kubernetes.yaml -s examples/secrets/secrets.yaml
./bin/langstream apps get openai-completions
```

Check your k8s cluster with `k9s -A` or run `./bin/langstream apps get openai-completions` until the app is deployed.

3. Test the AI completion using the API gateway. The LangStream CLI provides a convenient chat command to test the application:

```
./bin/langstream gateway chat openai-completions -cg consume-output -pg produce-input -p sessionId=$(uuidgen)
```

## Create your own application

To create your own application, refer to the [developer documentation](https://docs.langstream.ai/building-applications/development-environment).
## Install Kubernetes Environment on MacOS

### Minikube

To install Minikube on MacOS:

1. Install Homebrew.

If you haven't installed Homebrew yet, use the following command:

```
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
```

2. Update Homebrew.

It's a good practice to ensure you get the latest packages.

```
brew update
```

3. Install Minikube.

Use the following command to install Minikube:

```
brew install minikube
```

4. Install a Hypervisor.

Minikube requires a hypervisor to create a virtual machine where the Kubernetes cluster will run.
Here's how to install HyperKit, which is recommended for macOS:

```
brew install hyperkit
```

After installation, set Minikube to use HyperKit and 4 CPUs:

```
minikube config set driver hyperkit
minikube config set cpus 4
```

5. Start Minikube.

To start your local Kubernetes cluster:

```
minikube start
```

For additional information on installing Minikube or installing in other environments,
see this [page](https://minikube.sigs.k8s.io/docs/start/).

If you no longer need Minikube, you can stop (`minikube stop`) and delete the cluster (`minikube delete`).

### kubectl

To install kubectl, use the following command:

```
brew install kubectl
```

For additional information on installing kubectl or installing in other environments,
see this [page](https://kubernetes.io/docs/tasks/tools/#kubectl).

### Helm

To install Helm, use the following command:

```
brew install helm
```

For additional information on installing Helm or installing in other environments,
see this [page](https://helm.sh/docs/intro/install/).

## Run LangStream on Kubernetes

LangStream is production-ready, and it is highly recommended to deploy it on a Kubernetes cluster.
The following Kubernetes distributions are supported:
* Amazon EKS
* Azure AKS
* Google GKE
* Minikube

To run a LangStream cluster, you need the following *external* components:
- Apache Kafka or Apache Pulsar cluster
- S3 API-compatible storage or Azure Blob Storage (Amazon S3, Google Cloud Storage, Azure Blob Storage, MinIO)

### Production-ready deployment

To install LangStream, you can use the `langstream` Helm chart:

```
helm repo add langstream https://langstream.ai/charts
helm repo update
```

Then create the values file. At this point you already need the storage service to be up and running.

In case you're using S3, you can use the following values:
```yaml
codeStorage:
  type: s3
  configuration:
    access-key: <aws-access-key>
    secret-key: <aws-secret-key>
```

For Azure:
```yaml
codeStorage:
  type: azure-blob-storage
  configuration:
    endpoint: https://<storage-account>.blob.core.windows.net
    container: langstream
    storage-account-name: <storage-account>
    storage-account-key: <storage-account-key>
```

Now install LangStream with it:
```
helm install -n langstream --create-namespace langstream langstream/langstream --values values.yaml
kubectl wait -n langstream deployment/langstream-control-plane --for condition=available --timeout=300s
```

### Local deployment

To create a local LangStream cluster, it's recommended to use [minikube](https://minikube.sigs.k8s.io/docs/start/).
`mini-langstream` helps with installing and managing your local cluster.

To install `mini-langstream`:
- MacOS:
```bash
brew install LangStream/langstream/mini-langstream
```
- Unix:
```bash
curl -Ls "https://raw.githubusercontent.com/LangStream/langstream/main/mini-langstream/get-mini-langstream.sh" | bash
```

Then start up the cluster:
```bash
mini-langstream start
```

Deploy an application:
```bash
export OPENAI_API_KEY=<your-openai-api-key>
mini-langstream cli apps deploy my-app -app https://github.com/LangStream/langstream/tree/main/examples/applications/openai-completions -s https://github.com/LangStream/langstream/blob/main/examples/secrets/secrets.yaml
```

To stop the cluster:
```bash
mini-langstream delete
```

Refer to the [mini-langstream documentation](https://docs.langstream.ai/installation/get-started-minikube) to learn more.
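
You can also chat with the application deployed above by proxying the LangStream CLI through `mini-langstream`, as in the sample earlier in this README (a sketch that assumes the `produce-input` and `consume-output` gateways defined by the `openai-completions` example):

```bash
mini-langstream cli gateway chat my-app -cg consume-output -pg produce-input -p sessionId=$(uuidgen)
```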
## Development

Requirements for building the project:
* Docker
* Java 17
* Git
* Python 3.11+ and PIP

Requirements for running a local development cluster:
* Minikube
* kubectl
* Helm

### Start minikube

```
minikube start
```

### Build docker images and deploy all the components

```
./dev/start-local.sh
```

The above command will automatically start the port-forwards for the LangStream control plane and the API Gateway.

Alternatively, if you want to test local code changes, you can use `mini-langstream`:

```bash
mini-langstream dev start
```

This command will build the images in the `minikube` context and install all the LangStream services with the snapshot image.

Once the cluster is running, if you want to build and load a new version of a specific service you can run:

```bash
mini-langstream dev build <service>
```

or, for all the services:

```bash
mini-langstream dev build
```

### Deploying to GKE or similar K8s test cluster

Instead of `minio-dev.yaml`, use the `helm/examples/minio-gke.yaml` file:

```
kubectl apply -f helm/examples/minio-gke.yaml
```

### Deploying to a Persistent Cluster

TODO: instructions on configuring with S3.
