
Commit

Backport PR jupyterlab#1193: Update documentation to add usage of `Openrouter`
srdas authored and meeseeksmachine committed Jan 9, 2025
1 parent 221c78f commit b6246fe
Showing 5 changed files with 43 additions and 0 deletions.
Binary file added docs/source/_static/ai-settings.png
Binary file added docs/source/_static/openrouter-chat.png
Binary file added docs/source/_static/openrouter-model-setup.png
7 changes: 7 additions & 0 deletions docs/source/users/index.md
@@ -339,6 +339,13 @@ Jupyter AI enables use of language models hosted on [Amazon Bedrock](https://aws
For details on enabling model access in your AWS account, using cross-region inference, or invoking custom/provisioned models, please see our dedicated documentation page on [using Amazon Bedrock in Jupyter AI](bedrock.md).


### OpenRouter Usage

Jupyter AI enables use of language models accessible through [OpenRouter](https://openrouter.ai)'s unified interface. Models available via OpenRouter include [DeepSeek](https://openrouter.ai/deepseek/deepseek-chat), [Qwen](https://openrouter.ai/qwen/), and [Mistral](https://openrouter.ai/mistralai/). OpenRouter supports any model that conforms to the OpenAI API.

For details on enabling model access in the AI Settings panel and using models through OpenRouter, please see the dedicated documentation page on [using OpenRouter in Jupyter AI](openrouter.md).


### SageMaker endpoints usage

Jupyter AI supports language models hosted on SageMaker endpoints that use JSON
36 changes: 36 additions & 0 deletions docs/source/users/openrouter.md
@@ -0,0 +1,36 @@
# Using OpenRouter in Jupyter AI

[(Return to the Chat Interface page)](index.md#openrouter-usage)

Jupyter AI supports OpenAI-compatible models through the OpenRouter provider. Because the provider exposes the `api_key`, `base_url`, and `model` parameters, any large model service that implements the OpenAI API's calling conventions can be used. For more details on OpenRouter as a unified interface for LLMs, see https://openrouter.ai/.
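These three parameters map directly onto an OpenAI-style chat completions request. The sketch below, using only the standard library, shows how they fit together; the base URL, model ID, and key shown are illustrative placeholders, not values from Jupyter AI's code:

```python
import json

# Illustrative values only -- substitute the base URL, model ID, and key
# for the OpenAI-compatible service you actually use.
base_url = "https://openrouter.ai/api/v1"  # or e.g. https://api.deepseek.com
model = "deepseek/deepseek-chat"
api_key = "YOUR_OPENROUTER_API_KEY"

# Any OpenAI-compatible service accepts a POST to <base_url>/chat/completions
# with a JSON body like this and a Bearer-token Authorization header.
endpoint = f"{base_url}/chat/completions"
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
body = json.dumps({
    "model": model,
    "messages": [{"role": "user", "content": "Hello!"}],
})

print(endpoint)
```

In Jupyter AI, you never build this request yourself; the `base_url` and `model` fields in the settings panel supply the same information to the provider.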

As an example, we walk through the steps needed to use models from [Deepseek](https://www.deepseek.com) via the OpenRouter provider. If you do not have `langchain-openai` installed, please install it and restart JupyterLab. This is necessary because `langchain-openai` provides the SDK used to access any OpenAI-compatible API.
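A typical installation command is shown below; your environment may instead use `conda` or a different `pip` invocation:

```shell
# Install the OpenAI SDK bindings that the OpenRouter provider relies on,
# then restart JupyterLab so the provider is registered.
pip install langchain-openai
```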

First, open the `AI Settings` pane: in `v2` of Jupyter AI, use the AI settings button; in `v3`, use the dropdown, as shown below:

<img src="../_static/ai-settings.png"
width="75%"
alt='Screenshot of the dropdown where AI Settings is chosen and it opens tab in Jupyter AI where models are selected.'
class="screenshot" />

Second, select the `OpenRouter :: *` model provider in the Jupyter AI settings. If you don't see this, please verify that you have installed `langchain-openai` and that you are using `jupyter_ai>=2.24.0`. Be sure to restart JupyterLab after upgrading or installing either package.

Jupyter AI's settings page with the OpenRouter provider selected is shown below:

<img src="../_static/openrouter-model-setup.png"
width="75%"
alt='Screenshot of the tab in Jupyter AI where OpenRouter model access is selected.'
class="screenshot" />

Type in the model name and the API base URL corresponding to the model you wish to use. For Deepseek, you should use `https://api.deepseek.com` as the API base URL, and use `deepseek-chat` as the local model ID.

If you are using OpenRouter for the first time, you will also be prompted to enter the `OPENROUTER_API_KEY`. If you have used OpenRouter before with a different model provider, you will need to update the API key. After doing this, click "Save Changes" at the bottom to save your settings.
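To avoid pasting the key by hand each session, one common pattern (an assumption about your workflow, not something Jupyter AI requires) is to export `OPENROUTER_API_KEY` in your shell before launching JupyterLab and sanity-check it from Python. The helper below is hypothetical, and the key prefix check is only a rough heuristic:

```python
import os

# Hypothetical helper: verify the key is exported before launching
# JupyterLab, so it never has to be pasted into a notebook.
def openrouter_key_present() -> bool:
    key = os.environ.get("OPENROUTER_API_KEY", "")
    # OpenRouter keys conventionally start with "sk-"; adjust if yours differs.
    return key.startswith("sk-") and len(key) > 8

os.environ["OPENROUTER_API_KEY"] = "sk-or-example-key"  # illustrative value
print(openrouter_key_present())
```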

You should now be able to use Deepseek! An example of usage is shown next:

<img src="../_static/openrouter-chat.png"
width="75%"
alt='Screenshot of chat using Deepseek via the OpenRouter provider.'
class="screenshot" />

[(Return to the Chat Interface page)](index.md#openrouter-usage)
