
improve validation of llm_config #1946

Merged
merged 10 commits into main from i1863 on Mar 11, 2024

Conversation

sonichi
Contributor

@sonichi sonichi commented Mar 10, 2024

Why are these changes needed?

The current llm_config validation is too restricted and excludes valid configs like #1863

Related issue number

close #1863
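
The fix relaxes a check that rejected legitimately dict-shaped configs. A minimal sketch of the lenient-validation idea, with hypothetical names (not AutoGen's actual implementation):

```python
# Hypothetical sketch of lenient llm_config validation: accept a dict,
# an explicit False (LLM disabled), or None; reject only clearly wrong
# types. Identifiers are illustrative, not AutoGen's actual API.
def validate_llm_config(llm_config):
    if llm_config is None or llm_config is False:
        return llm_config  # LLM explicitly disabled or unset
    if isinstance(llm_config, dict):
        # Do not over-constrain keys: providers differ, so unknown keys
        # (e.g. "config_list", "model", custom fields) are allowed.
        return llm_config
    raise TypeError(
        f"llm_config must be a dict, False, or None, "
        f"got {type(llm_config).__name__}"
    )
```

The point is to validate only the outer shape and leave key-level checks to the client layer, so non-GPT and local-model configs are not excluded up front.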

Checks

@codecov-commenter

codecov-commenter commented Mar 10, 2024

Codecov Report

Attention: Patch coverage is 91.66667%, with 1 line in your changes missing coverage. Please review.

Project coverage is 60.24%. Comparing base (ce71d85) to head (a3de339).

Files Patch % Lines
autogen/agentchat/groupchat.py 0.00% 0 Missing and 1 partial ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##             main    #1946       +/-   ##
===========================================
+ Coverage   37.33%   60.24%   +22.91%     
===========================================
  Files          64       64               
  Lines        6913     6913               
  Branches     1519     1649      +130     
===========================================
+ Hits         2581     4165     +1584     
+ Misses       4109     2369     -1740     
- Partials      223      379      +156     
Flag Coverage Δ
unittests 59.98% <91.66%> (+22.65%) ⬆️

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.

@sonichi
Contributor Author

sonichi commented Mar 11, 2024

@doganaktar could you test whether this PR solves your issue?

@sonichi sonichi added the llm and models labels Mar 11, 2024
@sonichi sonichi enabled auto-merge March 11, 2024 22:35
@sonichi sonichi added this pull request to the merge queue Mar 11, 2024
Merged via the queue into main with commit 5235818 Mar 11, 2024
63 of 67 checks passed
@sonichi sonichi deleted the i1863 branch March 11, 2024 22:44
whiskyboy pushed a commit to whiskyboy/autogen that referenced this pull request Apr 17, 2024
* improve validation of llm_config

* fixed test_register_for_llm_without_LLM

* docstr about llm_config=None

* Make None a sentinel

* pop tools

---------

Co-authored-by: Davor Runje <davor@airt.ai>
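
The "Make None a sentinel" commit refers to the standard pattern of distinguishing an argument that was omitted from one explicitly set to None. A generic sketch of that pattern, with hypothetical names (not AutoGen's code):

```python
# Generic sketch of the sentinel pattern named in the commit list above;
# identifiers here are hypothetical, not AutoGen's actual API.
_SENTINEL = object()  # unique marker meaning "argument not provided"

def resolve_llm_config(llm_config=_SENTINEL, inherited=None):
    if llm_config is _SENTINEL:
        return inherited   # caller omitted the argument: inherit a default
    return llm_config      # explicit value, including an explicit None
```

With a plain `llm_config=None` default, an explicit `None` (meaning "disable the LLM") would be indistinguishable from omitting the argument; the sentinel removes that ambiguity.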
Labels
models Pertains to using alternate, non-GPT, models (e.g., local models, llama, etc.)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Autogen llm_config error
5 participants