
feat: create inference chain & implement first working prototype #3

Merged: 6 commits merged into main from integration on Jul 27, 2024

Conversation

FireHead90544 (Owner)
Changes

  • Set up LLM from the provider stored in config
  • Create inference chain with prompts & LLM (~550 input tokens/task)
  • Use gemini-1.5-flash for the Gemini provider (larger context window)
  • Create inference utility
  • Create formatter for the LLM's parsed outputs
  • Finalize the `to` command to be able to run inferences
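The pieces listed above (provider config, prompt-based inference chain, and output formatter) might fit together roughly as sketched below. All names here are illustrative assumptions, and a stub stands in for the Gemini call so the sketch runs offline; the actual implementation lives in the merged integration branch.

```python
# Hypothetical sketch of the pipeline: read the provider from config, build a
# prompt for the task, run inference, then format the parsed output.
import json

# Assumed config shape; the PR stores the provider in a config file.
CONFIG = {"provider": "gemini", "model": "gemini-1.5-flash"}

PROMPT_TEMPLATE = (
    "You are a command-line assistant. Respond ONLY with JSON of the form "
    '{{"command": "...", "explanation": "..."}}.\nTask: {task}'
)

def build_prompt(task: str) -> str:
    """Create the inference prompt for a task (~550 input tokens in practice)."""
    return PROMPT_TEMPLATE.format(task=task)

def run_inference(task: str, llm) -> str:
    """Send the prompt to the configured LLM and return its raw text output."""
    return llm(build_prompt(task))

def format_output(raw: str) -> str:
    """Formatter for the LLM's parsed output: JSON -> readable shell snippet."""
    parsed = json.loads(raw)
    return f"$ {parsed['command']}\n# {parsed['explanation']}"

# Stub LLM standing in for the real Gemini provider call.
def fake_llm(prompt: str) -> str:
    return json.dumps({"command": "ls -la", "explanation": "List all files."})

print(format_output(run_inference("list all files", fake_llm)))
```

Keeping the formatter separate from the chain means the LLM's raw JSON can be re-parsed or re-rendered without another inference call.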

@FireHead90544 FireHead90544 merged commit d7b2fe5 into main Jul 27, 2024
@FireHead90544 FireHead90544 deleted the integration branch July 27, 2024 17:21