
chore(main): release python-openinference-instrumentation-llama-index 3.1.1 #1176

Conversation

github-actions[bot] (Contributor) commented:

🤖 I have created a release beep boop

3.1.1 (2024-12-17)

Bug Fixes

  • llama-index: extract token counts for groq when streaming (#1174) (0aafe9c)

This PR was generated with Release Please. See documentation.
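For context, here is a minimal sketch of how this instrumentation is typically wired up so that the token counts reported by Groq surface on the emitted spans once streaming completes. This is not code from the PR: the model name, the console exporter, and the reliance on a GROQ_API_KEY environment variable are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above), requires:
#   openinference-instrumentation-llama-index, llama-index-llms-groq, opentelemetry-sdk
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from llama_index.llms.groq import Groq  # reads GROQ_API_KEY from the environment

# Export spans to the console so the token-count attributes are visible.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

# Instrument LlamaIndex so LLM calls (including streaming ones) emit OpenInference spans.
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

llm = Groq(model="llama3-8b-8192")  # model id is illustrative

# Consume the stream; with the fix in #1174, the prompt/completion token counts
# returned by Groq should be recorded on the LLM span instead of being dropped.
for chunk in llm.stream_complete("Say hello in one word."):
    pass
```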

@github-actions[bot] requested a review from a team as a code owner on December 17, 2024 at 21:13
@dosubot[bot] added the size:S label (This PR changes 10-29 lines, ignoring generated files) on December 17, 2024
@RogerHYang merged commit 8ca8826 into main on December 17, 2024
@RogerHYang deleted the release-please--branches--main--components--python-openinference-instrumentation-llama-index branch on December 17, 2024 at 21:15
Labels
autorelease: tagged · size:S (This PR changes 10-29 lines, ignoring generated files)
Projects
Status: Done