
[Bug] Error when using the Gemini Pro model in the Windows local client #3735

Closed
1 of 4 tasks
taurusxin opened this issue Jan 2, 2024 · 23 comments

Comments

@taurusxin

Describe the bug
When the Gemini Pro model is used, the following error occurs:

{
  "error": true,
  "message": "Unexpected token '<', \"<!DOCTYPE \"... is not valid JSON"
}

To Reproduce
Steps to reproduce the behavior:
Use the Gemini Pro model.

Expected behavior
A correct response is expected; the same request works fine on the Web version.

Screenshots

[Screenshot: PixPin_2024-01-02_10-27-09]

Deployment

  • Docker
  • Vercel
  • Server
  • Client

Desktop (please complete the following information):

  • OS: Windows 11
  • Version: v2.10.1

Additional Logs
Add any logs about the problem here.
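
For context: this error typically means the client received an HTML page (for example an error or redirect page) and tried to parse it as JSON. A minimal sketch of a guard against that, in TypeScript; the function name and error message are illustrative, not NextChat's actual code:

// Parse the API response, but fail with a readable error instead of
// "Unexpected token '<'" when the server returns HTML (e.g. an error page).
async function parseJsonResponse(res: Response): Promise<unknown> {
  const text = await res.text();
  const contentType = res.headers.get("content-type") ?? "";

  if (!contentType.includes("application/json") || text.trimStart().startsWith("<")) {
    // The body is HTML, not JSON -- usually a wrong endpoint, a proxy error
    // page, or a redirect to a login page.
    throw new Error(
      `Expected JSON but got ${contentType || "unknown content type"} ` +
        `(HTTP ${res.status}): ${text.slice(0, 120)}`,
    );
  }
  return JSON.parse(text);
}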


@H0llyW00dzZ
Contributor

It's because of this:

[image]

In the client, if that (relative) path is used, it ends up resolving to localhost.
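
A minimal sketch of the behaviour being described, assuming the desktop client resolves a relative API path against its own (localhost) origin; the path, function, and parameter names are illustrative, not the actual NextChat code:

// In the web app a relative path is fine: the browser resolves it against
// the deployment's own domain. In the desktop client the "origin" is the
// local app, so the same relative path ends up pointing at localhost.
const RELATIVE_PATH = "/api/google/v1beta/models/gemini-pro:generateContent";

// Hypothetical fix: build an absolute URL from a configurable base host
// instead of relying on the current origin.
function buildGeminiUrl(baseUrl: string, model: string): string {
  const host = baseUrl.replace(/\/+$/, ""); // strip trailing slashes
  return `${host}/v1beta/models/${model}:generateContent`;
}

// e.g. buildGeminiUrl("https://generativelanguage.googleapis.com", "gemini-pro")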

@taurusxin
Author

This bug has been known for a while.

Thank you for your work. So this is your fork of NextChat, and it has not been merged into the main repo yet?

@H0llyW00dzZ
Contributor

This bug has been known for a while.

Thank you for your work. So this is your fork of NextChat, and it has not been merged into the main repo yet?

Later. The temporary fix uses a default CORS host, and it currently can't be customized the way the ChatGPT LLM API by @Yidadaa can, where a custom endpoint can be set for routing (for example through a proxy).

@ShinChven

I'm now putting gemini pro into One API and using it as an openai model; it works fine and it's very smooth.
NextChat's gemini pro compatibility is currently not very stable.


@onlyhuman028

After the acquisition, can nobody fix such a simple problem? There hasn't even been a proper reply.


@kitaev-chen

I'm now putting gemini pro into One API and using it as an openai model; it works fine and it's very smooth. NextChat's gemini pro compatibility is currently not very stable.

Could you share the specific steps? I don't know much about One API.


@Barry04

Barry04 commented Jan 15, 2024

Has this issue still not been resolved?


@ShinChven

A One API compatibility issue?
When I submitted an issue, the maintainer closed it without even replying.


@kitaev-chen

You can refer to this issue: songquanpeng/one-api#887

@ShinChven

You can refer to this issue: songquanpeng/one-api#887

I previously created a custom model called google-gemini-pro and redirected google-gemini-pro requests to gemini-pro, so that gemini-pro can be called through One API's OpenAI-protocol interface.
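
A minimal sketch of how such a setup can be called from the client side, assuming a One API deployment exposing the OpenAI-compatible /v1/chat/completions endpoint; the base URL, key, and model name are placeholders for your own configuration:

// Call gemini-pro through One API's OpenAI-compatible endpoint.
// ONE_API_BASE and ONE_API_KEY are placeholders for your own deployment.
const ONE_API_BASE = "https://one-api.example.com";
const ONE_API_KEY = "sk-xxxx";

async function chatWithGeminiViaOneApi(prompt: string): Promise<string> {
  const res = await fetch(`${ONE_API_BASE}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${ONE_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gemini-pro", // or the custom name mapped to gemini-pro in One API
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}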


@BLACKFTVV

I'm now putting gemini pro into One API and using it as an openai model; it works fine and it's very smooth. NextChat's gemini pro compatibility is currently not very stable.

How do you set this up?


@openmynet

In the unfixed local client versions, the endpoint URL format is:
https://gemini.you.proxy/v1beta/models/gemini-pro:generateContent
https://{domain}/v1beta/models/{model}:generateContent

Model base address: https://{domain}/v1beta/models or https://generativelanguage.googleapis.com/v1beta/models
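
For reference, a minimal sketch of a direct request using the URL format above; the host and API key are placeholders (the host can be the official endpoint or your own proxy such as https://gemini.you.proxy):

// Direct call to the Gemini generateContent endpoint using the URL format above.
const GEMINI_HOST = "https://generativelanguage.googleapis.com";
const GEMINI_API_KEY = "your-api-key";

async function generateContent(prompt: string): Promise<string> {
  const url =
    `${GEMINI_HOST}/v1beta/models/gemini-pro:generateContent?key=${GEMINI_API_KEY}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{ parts: [{ text: prompt }] }],
    }),
  });
  const data = await res.json();
  return data.candidates[0].content.parts[0].text;
}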

@lloydzhou
Member

Please test whether the problem has been completely fixed in the latest version.
I tested it myself on the current latest version, v2.15.5, and it works.


Labels: None yet
Projects: None yet
Development: No branches or pull requests
10 participants