This repository has been archived by the owner on Aug 25, 2024. It is now read-only.

[agents] Add Agent to call LangServe (langserve-invoke) (LangStream#673)
eolivelli authored Oct 31, 2023
1 parent 2f9720b commit 9dadc7e
Showing 28 changed files with 1,410 additions and 20 deletions.
68 changes: 68 additions & 0 deletions examples/applications/langserve-invoke/README.md
@@ -0,0 +1,68 @@
# Invoking a LangServe service

This sample application shows how to invoke a LangServe service and leverage its streaming capabilities.

## Set up your LangServe environment

Start your LangServe application. The example below uses the LangServe sample [application](https://github.com/langchain-ai/langserve):

```python
#!/usr/bin/env python
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langserve import add_routes


app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple api server using Langchain's Runnable interfaces",
)

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
add_routes(
    app,
    prompt | model,
    path="/chain",
)

if __name__ == "__main__":
import uvicorn

uvicorn.run(app, host="localhost", port=8000)
```

## Configure your OpenAI API key and run the application

```bash
export OPENAI_API_KEY=...
pip install fastapi langserve langchain openai sse_starlette uvicorn
python example.py
```

The sample application exposes a chain at http://localhost:8000/chain/stream and http://localhost:8000/chain/invoke.

The LangStream application, running in Docker, connects to http://host.docker.internal:8000/chain/stream
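Before wiring LangStream to the service, you can check the endpoint by hand. LangServe's `/invoke` endpoint accepts a JSON body with the chain input wrapped under an `input` key. The sketch below uses only the Python standard library; `build_payload` and `invoke_chain` are illustrative helper names, not part of LangServe, and running `invoke_chain` requires the server from the previous section to be up.

```python
import json
from urllib import request


def build_payload(topic: str) -> dict:
    # LangServe's /invoke and /stream endpoints expect the chain
    # input wrapped under the "input" key.
    return {"input": {"topic": topic}}


def invoke_chain(base_url: str, topic: str) -> dict:
    # POST the payload to the /chain/invoke endpoint of a running
    # LangServe server and return the parsed JSON response.
    req = request.Request(
        f"{base_url}/chain/invoke",
        data=json.dumps(build_payload(topic)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())


# With the server running:
# invoke_chain("http://localhost:8000", "cats")  # returns the model's answer
```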

LangStream sends an input like this:

```json
{
"input": {
"topic": "cats"
}
}
```

Here "topic" is the topic of the joke you want to generate; it is taken from the user input.
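The `/chain/stream` endpoint replies with server-sent events, which is what allows the agent to forward answer chunks as they are produced. A minimal sketch of how such a stream can be parsed is shown below; the exact payload shape of each event depends on your chain, so the `content` field here is only an illustrative assumption.

```python
import json


def parse_sse_lines(lines):
    """Extract the JSON payload of each `data:` line from an SSE stream.

    LangServe emits one `data:` event per generated chunk; other lines
    (event names, blank separators) are skipped.
    """
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload:
                yield json.loads(payload)


# Example with a captured stream fragment (shape is illustrative):
sample = [
    'event: data',
    'data: {"content": "Why did"}',
    '',
    'event: data',
    'data: {"content": " the cat..."}',
]
chunks = list(parse_sse_lines(sample))
```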

## Deploy the LangStream application
```
./bin/langstream docker run test -app examples/applications/langserve-invoke
```

## Interact with the application

You can now interact with the application using the UI by opening your browser at http://localhost:8092/
41 changes: 41 additions & 0 deletions examples/applications/langserve-invoke/example.py
@@ -0,0 +1,41 @@
#!/usr/bin/env python
#
# Copyright DataStax, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langserve import add_routes


app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple api server using Langchain's Runnable interfaces",
)

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
add_routes(
    app,
    prompt | model,
    path="/chain",
)

if __name__ == "__main__":
import uvicorn

uvicorn.run(app, host="localhost", port=8000)
25 changes: 25 additions & 0 deletions examples/applications/langserve-invoke/gateways.yaml
@@ -0,0 +1,25 @@
#
#
# Copyright DataStax, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

gateways:
  - id: chat
    type: chat
    chat-options:
      answers-topic: streaming-answers-topic
      questions-topic: input-topic
      headers:
        - value-from-parameters: session-id
41 changes: 41 additions & 0 deletions examples/applications/langserve-invoke/pipeline.yaml
@@ -0,0 +1,41 @@
#
# Copyright DataStax, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

topics:
  - name: "input-topic"
    creation-mode: create-if-not-exists
  - name: "output-topic"
    creation-mode: create-if-not-exists
  - name: "streaming-answers-topic"
    creation-mode: create-if-not-exists
pipeline:
  - type: "langserve-invoke"
    input: input-topic
    output: output-topic
    id: step1
    configuration:
      output-field: value.answer
      stream-to-topic: streaming-answers-topic
      stream-response-field: value
      min-chunks-per-message: 10
      debug: false
      method: POST
      allow-redirects: true
      handle-cookies: false
      url: "http://host.docker.internal:8000/chain/stream"
      fields:
        - name: topic
          expression: "value"
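For illustration, the `fields` section above maps the incoming record into the request body the agent sends: each field's `expression` is evaluated against the record and the results are wrapped under `input`, which produces exactly the JSON input shown in the README. The sketch below is a simplified assumption of that mapping, not the agent's actual implementation; it only handles the trivial `value` expression used in this example.

```python
def build_request_body(record_value, fields):
    # Evaluate each configured field against the record value. Only
    # the trivial "value" expression used in this example is handled;
    # the real agent supports a full expression language.
    inputs = {}
    for field in fields:
        if field["expression"] == "value":
            inputs[field["name"]] = record_value
        else:
            raise ValueError(f"unsupported expression: {field['expression']}")
    # LangServe expects the chain input under the "input" key.
    return {"input": inputs}


body = build_request_body("cats", [{"name": "topic", "expression": "value"}])
```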
@@ -10,7 +10,7 @@ Export to the ENV the access key to OpenAI
export OPEN_AI_ACCESS_KEY=...
```

-The default [secrets file](../../secrets/secrets.yaml) reads from the ENV. Check out the file to learn more about
+The default [secrets file](../../../secrets/secrets.yaml) reads from the ENV. Check out the file to learn more about
the default settings; you can change them by exporting other ENV variables.


@@ -26,7 +26,7 @@ export LANGSMITH_APIKEY=xxxxx
## Deploy the LangStream application

```
-./bin/langstream docker run test -app examples/applications/langserve -s examples/secrets/secrets.yaml --start-broker=false
+./bin/langstream docker run test -app examples/applications/python/langserve-service -s examples/secrets/secrets.yaml --start-broker=false
```

## Interact with the application
@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package ai.langstream.agents.azureblobstorage.source;
+package ai.langstream.agents.azureblobstorage;

import ai.langstream.api.runner.code.AbstractAgentCode;
import ai.langstream.api.runner.code.AgentSource;
@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package ai.langstream.agents.azureblobstorage.source;
+package ai.langstream.agents.azureblobstorage;

import ai.langstream.api.runner.code.AgentCode;
import ai.langstream.api.runner.code.AgentCodeProvider;
@@ -1 +1 @@
-ai.langstream.agents.azureblobstorage.source.AzureBlobStorageSourceCodeProvider
+ai.langstream.agents.azureblobstorage.AzureBlobStorageSourceCodeProvider
@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package ai.langstream.agents.azureblobstorage.source;
+package ai.langstream.agents.azureblobstorage;

import static org.junit.jupiter.api.Assertions.*;

5 changes: 5 additions & 0 deletions langstream-agents/langstream-agent-http-request/pom.xml
@@ -65,6 +65,11 @@
<artifactId>junit-jupiter</artifactId>
<scope>test</scope>
</dependency>
+        <dependency>
+            <groupId>com.github.tomakehurst</groupId>
+            <artifactId>wiremock</artifactId>
+            <scope>test</scope>
+        </dependency>
</dependencies>
<build>
<plugins>
@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package ai.langstream.agents.azureblobstorage.source;
+package ai.langstream.agents.http;

import ai.langstream.ai.agents.commons.JsonRecord;
import ai.langstream.ai.agents.commons.MutableRecord;
@@ -191,6 +191,12 @@ public void processRecord(Record record, RecordSink recordSink) {
recordSink.emit(
new SourceRecordAndResult(record, List.of(), e));
}
})
+            .exceptionally(
+                    error -> {
+                        log.error("Error processing record: {}", record, error);
+                        recordSink.emit(new SourceRecordAndResult(record, null, error));
+                        return null;
+                    });
} catch (Throwable error) {
log.error("Error processing record: {}", record, error);
@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package ai.langstream.agents.azureblobstorage.source;
+package ai.langstream.agents.http;

import ai.langstream.api.runner.code.AgentCodeProvider;
import ai.langstream.api.runner.code.AgentProcessor;
@@ -25,7 +25,11 @@
public class HttpRequestAgentProvider implements AgentCodeProvider {

private static final Map<String, Supplier<AgentProcessor>> FACTORIES =
-            Map.of("http-request", HttpRequestAgent::new);
+            Map.of(
+                    "http-request",
+                    HttpRequestAgent::new,
+                    "langserve-invoke",
+                    LangServeInvokeAgent::new);

@Override
public boolean supports(String agentType) {