✨ feat(llm): support Ollama AI Provider (local llm) #1265

Merged
merged 1 commit on Feb 13, 2024
8 changes: 8 additions & 0 deletions .env.example
@@ -59,6 +59,14 @@ OPENAI_API_KEY=sk-xxxxxxxxx
#AWS_ACCESS_KEY_ID=xxxxxxxxxxxxxxxxxxx
#AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

########################################
########### Ollama AI Service ##########
########################################

# You can use Ollama to pull and run LLMs locally; learn more at https://github.com/ollama/ollama
# The local or remote Ollama service URL
# OLLAMA_PROXY_URL=http://127.0.0.1:11434/v1

########################################
############ Market Service ############
########################################
3 changes: 3 additions & 0 deletions Dockerfile
@@ -78,4 +78,7 @@ ENV ZHIPU_API_KEY ""
# Moonshot
ENV MOONSHOT_API_KEY ""

# Ollama
ENV OLLAMA_PROXY_URL ""

CMD ["node", "server.js"]
10 changes: 10 additions & 0 deletions docs/Deployment/Environment-Variable.md
@@ -18,6 +18,7 @@ LobeChat provides additional configuration options during deployment, which can
- [Moonshot AI](#moonshot-ai)
- [Google AI](#google-ai)
- [AWS Bedrock](#aws-bedrock)
- [Ollama](#ollama)
- [Plugin Service](#plugin-service)
- [`PLUGINS_INDEX_URL`](#plugins_index_url)
- [`PLUGIN_SETTINGS`](#plugin_settings)
@@ -208,6 +209,15 @@ If you need to use Azure OpenAI to provide model services, you can refer to the
- Default Value: `us-east-1`
- Example: `us-east-1`

### Ollama

#### `OLLAMA_PROXY_URL`

- Type: Optional
- Description: Enables the Ollama service provider. When set, Ollama appears as a selectable model provider card on the language model settings page, and you can also specify custom language models.
- Default: -
- Example: `http://127.0.0.1:11434/v1`
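As an illustration of how this variable might be supplied in a container deployment (the image tag, port mapping, and `host.docker.internal` address below are assumptions for the sketch, not taken from this PR), the configuration could look like:

```shell
# Run LobeChat with OLLAMA_PROXY_URL pointing at an Ollama instance on the host.
# host.docker.internal resolves to the host machine on Docker Desktop.
docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 \
  lobehub/lobe-chat
```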

## Plugin Service

### `PLUGINS_INDEX_URL`
10 changes: 10 additions & 0 deletions docs/Deployment/Environment-Variable.zh-CN.md
@@ -18,6 +18,7 @@ LobeChat 在部署时提供了一些额外的配置项,使用环境变量进
- [Moonshot AI](#moonshot-ai)
- [Google AI](#google-ai)
- [AWS Bedrock](#aws-bedrock)
- [Ollama](#ollama)
- [插件服务](#插件服务)
- [`PLUGINS_INDEX_URL`](#plugins_index_url)
- [`PLUGIN_SETTINGS`](#plugin_settings)
@@ -206,6 +207,15 @@ LobeChat 在部署时提供了一些额外的配置项,使用环境变量进
- 默认值:`us-east-1`
- 示例:`us-east-1`

### Ollama

#### `OLLAMA_PROXY_URL`

- 类型:可选
- 描述:用于启用 Ollama 服务,设置后可在语言模型列表内展示可选开源语言模型,也可以指定自定义语言模型
- 默认值:-
- 示例:`http://127.0.0.1:11434/v1`

## 插件服务

### `PLUGINS_INDEX_URL`
14 changes: 14 additions & 0 deletions src/app/api/chat/[provider]/agentRuntime.ts
@@ -6,6 +6,7 @@ import {
LobeBedrockAI,
LobeGoogleAI,
LobeMoonshotAI,
LobeOllamaAI,
LobeOpenAI,
LobeRuntimeAI,
LobeZhipuAI,
@@ -66,6 +67,12 @@ class AgentRuntime {

case ModelProvider.Bedrock: {
runtimeModel = this.initBedrock(payload);
break;
}

case ModelProvider.Ollama: {
runtimeModel = this.initOllama(payload);
break;
}
}

@@ -138,6 +145,13 @@

return new LobeBedrockAI({ accessKeyId, accessKeySecret, region });
}

private static initOllama(payload: JWTPayload) {
const { OLLAMA_PROXY_URL } = getServerConfig();
const baseUrl = payload?.endpoint || OLLAMA_PROXY_URL;

return new LobeOllamaAI(baseUrl);
}
}

export default AgentRuntime;
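The precedence in `initOllama` above — a client-supplied endpoint from the JWT payload wins over the server-side `OLLAMA_PROXY_URL` — can be sketched in isolation (the helper name is hypothetical, not part of the PR):

```typescript
// Hypothetical helper mirroring the fallback in initOllama:
// prefer the endpoint from the payload, else the server config value.
const resolveOllamaBaseUrl = (
  payloadEndpoint: string | undefined,
  serverProxyUrl: string,
): string => payloadEndpoint || serverProxyUrl;

// Falls back to the server config when no endpoint was sent.
console.log(resolveOllamaBaseUrl(undefined, 'http://127.0.0.1:11434/v1'));
// Prefers the user-provided endpoint when present.
console.log(resolveOllamaBaseUrl('http://my-host:11434/v1', 'http://127.0.0.1:11434/v1'));
```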
2 changes: 2 additions & 0 deletions src/app/api/config/route.ts
@@ -14,6 +14,7 @@ export const GET = async () => {
ENABLED_AWS_BEDROCK,
ENABLED_GOOGLE,
ENABLE_OAUTH_SSO,
ENABLE_OLLAMA,
} = getServerConfig();

const config: GlobalServerConfig = {
@@ -23,6 +24,7 @@
bedrock: { enabled: ENABLED_AWS_BEDROCK },
google: { enabled: ENABLED_GOOGLE },
moonshot: { enabled: ENABLED_MOONSHOT },
ollama: { enabled: ENABLE_OLLAMA },
zhipu: { enabled: ENABLED_ZHIPU },
},
};
3 changes: 3 additions & 0 deletions src/app/api/errorResponse.ts
@@ -37,6 +37,9 @@ const getStatus = (errorType: ILobeAgentRuntimeErrorType | ErrorType) => {
case AgentRuntimeErrorType.MoonshotBizError: {
return 476;
}
case AgentRuntimeErrorType.OllamaBizError: {
return 478;
}
}
return errorType as number;
};
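The switch above maps each provider's business error to a distinct 4xx status code. A standalone sketch of the same idea (the function name is hypothetical; only the two codes shown are taken from this diff):

```typescript
// Hypothetical standalone version of the status mapping in getStatus.
const providerErrorStatus = (errorType: string): number | undefined => {
  switch (errorType) {
    case 'MoonshotBizError': {
      return 476;
    }
    case 'OllamaBizError': {
      return 478;
    }
  }
  // Unrecognized error types fall through to the caller's numeric handling.
  return undefined;
};

console.log(providerErrorStatus('OllamaBizError')); // 478
```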
75 changes: 75 additions & 0 deletions src/app/settings/llm/Ollama/index.tsx
@@ -0,0 +1,75 @@
import { Ollama } from '@lobehub/icons';
import { Form, type ItemGroup } from '@lobehub/ui';
import { Form as AntForm, Input, Switch } from 'antd';
import { useTheme } from 'antd-style';
import { debounce } from 'lodash-es';
import { memo } from 'react';
import { useTranslation } from 'react-i18next';
import { Flexbox } from 'react-layout-kit';

import { FORM_STYLE } from '@/const/layoutTokens';
import { ModelProvider } from '@/libs/agent-runtime';
import { useGlobalStore } from '@/store/global';
import { modelProviderSelectors } from '@/store/global/selectors';

import Checker from '../Checker';
import { LLMProviderBaseUrlKey, LLMProviderConfigKey } from '../const';
import { useSyncSettings } from '../useSyncSettings';

const providerKey = 'ollama';

const OllamaProvider = memo(() => {
const { t } = useTranslation('setting');
const [form] = AntForm.useForm();
const theme = useTheme();
const [toggleProviderEnabled, setSettings] = useGlobalStore((s) => [
s.toggleProviderEnabled,
s.setSettings,
]);
const enabled = useGlobalStore(modelProviderSelectors.enableOllama);

useSyncSettings(form);

const model: ItemGroup = {
children: [
{
children: <Input allowClear placeholder={t('llm.Ollama.endpoint.placeholder')} />,
desc: t('llm.Ollama.endpoint.desc'),
label: t('llm.Ollama.endpoint.title'),
name: [LLMProviderConfigKey, providerKey, LLMProviderBaseUrlKey],
},
{
children: <Input allowClear placeholder={t('llm.Ollama.customModelName.placeholder')} />,
desc: t('llm.Ollama.customModelName.desc'),
label: t('llm.Ollama.customModelName.title'),
name: [LLMProviderConfigKey, providerKey, 'customModelName'],
},
{
children: <Checker model={'llama2'} provider={ModelProvider.Ollama} />,
desc: t('llm.Ollama.checker.desc'),
label: t('llm.checker.title'),
minWidth: undefined,
},
],
defaultActive: enabled,
extra: (
<Switch
onChange={(enabled: boolean) => {
toggleProviderEnabled(providerKey, enabled);
}}
value={enabled}
/>
),
title: (
<Flexbox align={'center'} gap={8} horizontal>
<Ollama.Combine color={theme.isDarkMode ? theme.colorText : theme.colorPrimary} size={24} />
</Flexbox>
),
};

return (
<Form form={form} items={[model]} onValuesChange={debounce(setSettings, 100)} {...FORM_STYLE} />
);
});

export default OllamaProvider;
7 changes: 7 additions & 0 deletions src/app/settings/llm/page.tsx
@@ -7,17 +7,23 @@ import { Trans, useTranslation } from 'react-i18next';
import Footer from '@/app/settings/features/Footer';
import PageTitle from '@/components/PageTitle';
import { MORE_MODEL_PROVIDER_REQUEST_URL } from '@/const/url';
import { useGlobalStore } from '@/store/global';
import { useSwitchSideBarOnInit } from '@/store/global/hooks/useSwitchSettingsOnInit';
import { SettingsTabs } from '@/store/global/initialState';
import { modelProviderSelectors } from '@/store/global/selectors';

import Bedrock from './Bedrock';
import Google from './Google';
import Moonshot from './Moonshot';
import Ollama from './Ollama';
import OpenAI from './OpenAI';
import Zhipu from './Zhipu';

export default memo(() => {
useSwitchSideBarOnInit(SettingsTabs.LLM);
const enableOllamaFromServerConfig = useGlobalStore(
modelProviderSelectors.enableOllamaFromServerConfig,
);
const { t } = useTranslation('setting');
return (
<>
@@ -28,6 +34,7 @@ export default memo(() => {
<Moonshot />
<Google />
<Bedrock />
{enableOllamaFromServerConfig ? <Ollama /> : null}
<Footer>
<Trans i18nKey="llm.waitingForMore" ns={'setting'}>
更多模型正在
6 changes: 5 additions & 1 deletion src/components/ModelProviderIcon/index.tsx
@@ -1,4 +1,4 @@
import { Azure, Bedrock, Google, Moonshot, OpenAI, Zhipu } from '@lobehub/icons';
import { Azure, Bedrock, Google, Moonshot, Ollama, OpenAI, Zhipu } from '@lobehub/icons';
import { memo } from 'react';
import { Center } from 'react-layout-kit';

@@ -42,6 +42,10 @@ const ModelProviderIcon = memo<ModelProviderIconProps>(({ provider }) => {
return <OpenAI size={20} />;
}

case ModelProvider.Ollama: {
return <Ollama size={20} />;
}

default: {
return null;
}
3 changes: 3 additions & 0 deletions src/config/modelProviders/index.ts
@@ -3,6 +3,7 @@ import { ChatModelCard } from '@/types/llm';
import BedrockProvider from './bedrock';
import GoogleProvider from './google';
import MoonshotProvider from './moonshot';
import OllamaProvider from './ollama';
import OpenAIProvider from './openai';
import ZhiPuProvider from './zhipu';

@@ -12,10 +13,12 @@ export const LOBE_DEFAULT_MODEL_LIST: ChatModelCard[] = [
BedrockProvider.chatModels,
GoogleProvider.chatModels,
MoonshotProvider.chatModels,
OllamaProvider.chatModels,
].flat();

export { default as BedrockProvider } from './bedrock';
export { default as GoogleProvider } from './google';
export { default as MoonshotProvider } from './moonshot';
export { default as OllamaProvider } from './ollama';
export { default as OpenAIProvider } from './openai';
export { default as ZhiPuProvider } from './zhipu';
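The `LOBE_DEFAULT_MODEL_LIST` construction above concatenates every provider's `chatModels` into one flat array; a minimal sketch of the pattern (the provider objects here are made up for illustration):

```typescript
// Hypothetical providers, each exposing a chatModels array like the real cards.
const providerA = { chatModels: [{ id: 'llama2' }, { id: 'mistral' }] };
const providerB = { chatModels: [{ id: 'qwen:7b-chat' }] };

// Collect the per-provider lists, then flatten them into one model list.
const modelList = [providerA.chatModels, providerB.chatModels].flat();
console.log(modelList.map((m) => m.id)); // [ 'llama2', 'mistral', 'qwen:7b-chat' ]
```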
27 changes: 27 additions & 0 deletions src/config/modelProviders/ollama.ts
@@ -0,0 +1,27 @@
import { ModelProviderCard } from '@/types/llm';

const Ollama: ModelProviderCard = {
chatModels: [
{
displayName: 'Llama2 7B',
functionCall: false,
id: 'llama2',
vision: false,
},
{
displayName: 'Mistral',
functionCall: false,
id: 'mistral',
vision: false,
},
{
displayName: 'Qwen 7B Chat',
functionCall: false,
id: 'qwen:7b-chat',
vision: false,
},
],
id: 'ollama',
};

export default Ollama;
6 changes: 6 additions & 0 deletions src/config/server/provider.ts
@@ -33,6 +33,9 @@ declare global {
AWS_ACCESS_KEY_ID?: string;
AWS_SECRET_ACCESS_KEY?: string;

// Ollama Provider;
OLLAMA_PROXY_URL?: string;

DEBUG_CHAT_COMPLETION?: string;
}
}
@@ -83,6 +86,9 @@ export const getProviderConfig = () => {
AZURE_ENDPOINT: process.env.AZURE_ENDPOINT,
USE_AZURE_OPENAI: process.env.USE_AZURE_OPENAI === '1',

ENABLE_OLLAMA: !!process.env.OLLAMA_PROXY_URL,
OLLAMA_PROXY_URL: process.env.OLLAMA_PROXY_URL || '',

DEBUG_CHAT_COMPLETION: process.env.DEBUG_CHAT_COMPLETION === '1',
};
};
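`ENABLE_OLLAMA` above is derived from the presence of `OLLAMA_PROXY_URL`: the double negation turns any non-empty string into `true`. A quick sketch of that truthiness check (the function name is hypothetical):

```typescript
// Mirrors ENABLE_OLLAMA: !!process.env.OLLAMA_PROXY_URL —
// unset or empty means disabled, any non-empty URL enables the provider.
const isOllamaEnabled = (proxyUrl?: string): boolean => !!proxyUrl;

console.log(isOllamaEnabled(undefined)); // false
console.log(isOllamaEnabled('')); // false
console.log(isOllamaEnabled('http://127.0.0.1:11434/v1')); // true
```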
4 changes: 4 additions & 0 deletions src/const/settings.ts
@@ -66,6 +66,10 @@ export const DEFAULT_LLM_CONFIG: GlobalLLMConfig = {
apiKey: '',
enabled: false,
},
ollama: {
enabled: false,
endpoint: '',
},
openAI: {
OPENAI_API_KEY: '',
models: [],
3 changes: 3 additions & 0 deletions src/libs/agent-runtime/error.ts
@@ -22,6 +22,9 @@ export const AgentRuntimeErrorType = {

InvalidMoonshotAPIKey: 'InvalidMoonshotAPIKey',
MoonshotBizError: 'MoonshotBizError',

InvalidOllamaArgs: 'InvalidOllamaArgs',
OllamaBizError: 'OllamaBizError',
} as const;

export type ILobeAgentRuntimeErrorType =
1 change: 1 addition & 0 deletions src/libs/agent-runtime/index.ts
@@ -4,6 +4,7 @@ export { LobeBedrockAI } from './bedrock';
export * from './error';
export { LobeGoogleAI } from './google';
export { LobeMoonshotAI } from './moonshot';
export { LobeOllamaAI } from './ollama';
export { LobeOpenAI } from './openai';
export * from './types';
export { AgentRuntimeError } from './utils/createError';