
fix(vllm): do not set videos if we don't have any #3885

Merged
merged 1 commit into master from fix/vllm_multimodal-vids on Oct 20, 2024

Conversation

@mudler (Owner) commented on Oct 20, 2024

Description

This pull request updates backend/python/vllm/backend.py to improve the handling of multi-modal data and the error management in the load_image method.

Improvements to multi-modal data handling:

  • async def _predict(self, request, context, streaming=False): Refactored the handling of multi_modal_data to use a dictionary that is conditionally populated with image and video data, simplifying the code and making it more readable.

Error management improvements:

  • def load_image(self, image_path: str): Modified the method to return None on error instead of falling through to load_video, making its behavior more predictable and consistent.
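A minimal sketch of the shape of the change described above (the helper name and the return convention are illustrative, not the actual code in backend.py):

```python
def build_multi_modal_data(image_data, video_data):
    """Populate the multi-modal dict only with keys that actually
    have data, so vLLM never sees a "video" key when the request
    carries no videos (which would trip its capability check on
    image-only models)."""
    multi_modal_data = {}
    if image_data:
        multi_modal_data["image"] = image_data
    if video_data:
        multi_modal_data["video"] = video_data
    # Returning None for an empty dict is an assumption for this
    # sketch; the real code may simply omit the argument instead.
    return multi_modal_data or None
```

With this, an image-only request yields `{"image": ...}` with no `"video"` key at all, rather than a dict containing an empty or None video entry.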

Notes for Reviewers

This fixes a bug where a model that didn't support both video and image understanding would fail. It seems vLLM is sensitive to this: if there is a "video" key in the args, it checks whether the model's capabilities include video understanding and fails otherwise.

Signed commits

  • Yes, I signed my commits.

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
@mudler mudler added the bug Something isn't working label Oct 20, 2024
netlify bot commented Oct 20, 2024

Deploy Preview for localai ready!

Name: localai
🔨 Latest commit: abadd47
🔍 Latest deploy log: https://app.netlify.com/sites/localai/deploys/6714d02ad26e540008c757ba
😎 Deploy Preview: https://deploy-preview-3885--localai.netlify.app

@mudler mudler merged commit 26c4058 into master Oct 20, 2024
32 checks passed
@mudler mudler deleted the fix/vllm_multimodal-vids branch October 20, 2024 09:44