[Bug]: openai adapter does not work #576
Comments
Yes, as you can see from the issue, this is due to the incompatibility introduced by the openai interface upgrade. Until GPTCache is compatible with openai 1.x, you can try to use the
Ok, thank you for the direction! I'll implement it.
@SimFG @CyprienRicque How can we create and then interact with a cache in a Python program without involving any LLM model? I'm asking because I want to benchmark some cache settings in Python code without having to set up an LLM to interact with. I tried calling cache.put("Hi", "Hi back") but got the error AttributeError: 'Cache' object has no attribute 'put'. Is there a way to use the cache with just get and put in Python code (i.e., after creating and initializing the cache with settings such as the distance threshold, rather than starting it as a server) without involving any LLM? Any help on this is appreciated.
@SimFG Also, after I stop a running gptcache server to change some settings in the config yaml file (such as the distance/similarity threshold) and restart the server with the command "gptcache_server -s 127.0.0.1 -p 8000 -f gptcache_server_config.yaml", I get the following error: start to install package: ruamel-yaml. The only way the server starts is if I also change the cache dir in the yaml config file. How can I fix this? I am not sure why merely changing the similarity threshold in the yaml config and restarting the same server would produce the above error. Any insight into resolving it is appreciated.
@SimFG Are you suggesting that, until the openai adapter becomes compatible with openai 1.x, we use the cache by starting the gptcache server and accessing it via the get and put methods? More details on this would be appreciated.
@judahkshitij an example case: https://github.com/zilliztech/GPTCache/blob/main/examples/adapter/api.py
Current Behavior
The example in the README produces the error APIRemovedInV1.
Steps To Reproduce
try at: https://colab.research.google.com/drive/1TjA2plt9ZXLHIQVvZ763Nj6fzshYGSoN?usp=sharing
Environment
Anything else?
likely related to #570