
When running with the GPU model, the OpenCL library cannot be opened on this device. #5551

Closed
tjdtn0219 opened this issue Jul 30, 2024 · 4 comments
Assignees
Labels
gpu (MediaPipe GPU related issues) · platform:android (Issues with Android as Platform) · task:LLM inference (Issues related to MediaPipe LLM Inference Gen AI setup) · type:support (General questions)

Comments


tjdtn0219 commented Jul 30, 2024

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

No

OS Platform and Distribution

ARM64, Android 14 ("UpsideDownCake")

MediaPipe Tasks SDK version

34

Task name (e.g. Image classification, Gesture recognition etc.)

LLM

Programming Language and version (e.g. C++, Python, Java)

Kotlin

Describe the actual behavior

I cloned the examples/llm_inference/android example and tested it without any changes, following the guidelines. It works correctly with gemma2b-cpu.bin, but an error occurs with gemma2b-gpu.bin, which I downloaded from Kaggle in int4 format. When I run the Android application there are no errors in logcat, but on launch the emulator screen displays the error message:
MediaPipeException: internal: Failed to initialize session: %s Can not open OpenCL library on this device.
I would appreciate it if you could let me know whether this is an issue with the emulator.

Describe the expected behaviour

No exception.

Standalone code/steps you may have used to try to get what you need

MediaPipeException: internal: Failed to initialize session: %s Can not open OpenCL library on this device.
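For reference, a minimal sketch of the code path where this exception is raised, assuming the conventions of the examples/llm_inference/android app (the model path here is the push location suggested in the MediaPipe guides, and the option names follow the GenAI Tasks API; the exact example code may differ):

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Sketch: initializing LlmInference with a GPU model file.
// On an emulator without a usable OpenCL driver, createFromOptions()
// is where "Can not open OpenCL library on this device" is thrown.
fun createLlm(context: Context): LlmInference {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/gemma2b-gpu.bin") // assumed push location
        .setMaxTokens(512)
        .build()
    return LlmInference.createFromOptions(context, options)
}
```

The GPU model variant selects the OpenCL-backed runtime at session creation, which is why the failure appears at initialization rather than during generation.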

Other info / Complete Logs

No response

@kuaashish
Collaborator

Hi @tjdtn0219,

The issue appears to be with the emulator. Our GenAI tasks, including Image Generator and LLM Inference, currently support only real physical devices. Please test the same scenario on a modern, high-quality physical device and let us know if the issue persists.
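Since the GenAI tasks only run on physical devices, one pragmatic workaround for an app that ships both model variants is to detect an emulator up front and load the CPU model there. The helper below is a common Android idiom, not a MediaPipe API; the checked property values are the usual emulator fingerprints and may need extending for other images:

```kotlin
// Hypothetical helper: guess whether we are running on an emulator from
// build properties (pass android.os.Build.FINGERPRINT / MODEL / PRODUCT).
// Emulator images typically report "generic" fingerprints or
// "sdk_gphone" model/product names.
fun isLikelyEmulator(fingerprint: String, model: String, product: String): Boolean =
    fingerprint.startsWith("generic") ||
    fingerprint.contains("emulator") ||
    model.contains("sdk_gphone") ||
    product.contains("sdk")
```

An app could then choose gemma2b-cpu.bin when `isLikelyEmulator(Build.FINGERPRINT, Build.MODEL, Build.PRODUCT)` returns true, and the GPU model otherwise.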

Thank you!!

@kuaashish kuaashish assigned kuaashish and unassigned ayushgdev Jul 31, 2024
@kuaashish kuaashish added task:LLM inference, platform:android, type:support, gpu, and stat:awaiting response (Waiting for user response) labels Jul 31, 2024

github-actions bot commented Aug 8, 2024

This issue has been marked stale because it has had no activity for 7 days. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale label Aug 8, 2024

This issue was closed due to lack of activity after being marked stale for the past 7 days.


@kuaashish kuaashish removed stat:awaiting response Waiting for user response stale labels Aug 16, 2024

3 participants