
MediaPipe LLM Inference API support in non-Chromium browsers #5562

Open
maudnals opened this issue Aug 8, 2024 · 1 comment
Labels
- platform:javascript (MediaPipe Javascript issues)
- stat:awaiting googler (Waiting for Google Engineer's Response)
- task:LLM inference (Issues related to MediaPipe LLM Inference Gen AI setup)
- type:feature (Enhancement in the New Functionality or Request for a New Solution)

Comments


maudnals commented Aug 8, 2024

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)?

No

OS Platform and Distribution

macOS Sonoma 14.5

Mobile device if the issue happens on mobile device

No response

Browser and version if the issue happens on browser

No response

Programming Language and version

JavaScript

MediaPipe version

No response

Bazel version

No response

Solution

MediaPipe LLM Inference API

Android Studio, NDK, SDK versions (if issue is related to building in Android environment)

No response

Xcode & Tulsi version (if issue is related to building for iOS)

No response

Describe the actual behavior

Errors are thrown in Firefox (including Nightly) and Safari (including Technology Preview)

Describe the expected behaviour

My MediaPipe code works in all browsers

Standalone code/steps you may have used to try to get what you need

I built and pushed a [demo](https://github.com/GoogleChromeLabs/web-ai-demos/tree/main/on-device-ai-perf-gemma) based on the [MediaPipe LLM Inference API tutorial](https://ai.google.dev/edge/mediapipe/solutions/genai/llm_inference/web_js).
One of my [lines](https://github.com/GoogleChromeLabs/web-ai-demos/blob/main/on-device-ai-perf-gemma/src/worker.js#L11) using MediaPipe is causing issues in non-Chromium browsers. It seems WebGPU-related:

**Firefox (also in Firefox Nightly):**
* I get this error: `TypeError: navigator.gpu is undefined`
* I thought this would work, per the support table on [MDN's WebGPU API page](https://developer.mozilla.org/en-US/docs/Web/API/WebGPU_API).
* Is this expected? Are there flags I'm supposed to turn on?


**Safari:**
* Regular Safari: `Unhandled Promise Rejection: TypeError: undefined is not an object (evaluating 'navigator.gpu.requestAdapter')  worker line 11 (llmInference = await LlmInference.createFromModelPath(genai, MODEL_URL);)` - Is this expected? I guess so, as per the support table above.
* Safari Technology Preview: `Unhandled Promise Rejection: Error: The WebGPU device is unable to execute LLM tasks, because the required maxStorageBufferBindingSize is at least 524550144 but your device only supports maxStorageBufferBindingSize of ... <something less>` - Is there anything I can do to work around this?
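For others hitting these errors, a minimal feature-detection sketch along these lines could fail gracefully before the MediaPipe call. This is a hypothetical helper, not part of the MediaPipe API; the `checkWebGpuLimits` name and the `REQUIRED_STORAGE_BUFFER_BYTES` constant are assumptions, with the byte value taken from the Safari Technology Preview error message above.

```javascript
// Hypothetical pre-flight check before calling LlmInference.createFromModelPath.
// The required storage-buffer size below is taken from the error message above;
// the real requirement may vary by model.
const REQUIRED_STORAGE_BUFFER_BYTES = 524550144;

async function checkWebGpuLimits(gpu) {
  // `gpu` is expected to be navigator.gpu, which is undefined in browsers
  // without WebGPU support (e.g. current Firefox release builds).
  if (!gpu) {
    return { ok: false, reason: 'WebGPU is not available in this browser' };
  }
  const adapter = await gpu.requestAdapter();
  if (!adapter) {
    return { ok: false, reason: 'No WebGPU adapter found' };
  }
  const max = adapter.limits.maxStorageBufferBindingSize;
  if (max < REQUIRED_STORAGE_BUFFER_BYTES) {
    return {
      ok: false,
      reason: `maxStorageBufferBindingSize too small: ${max} < ${REQUIRED_STORAGE_BUFFER_BYTES}`,
    };
  }
  return { ok: true };
}
```

In a worker one might call `await checkWebGpuLimits(navigator.gpu)` and show a fallback UI when `ok` is false, instead of letting the `TypeError` / unhandled rejection surface.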

Other info / Complete Logs

Code to reproduce is available here: https://github.com/GoogleChromeLabs/web-ai-demos/tree/main/on-device-ai-perf-gemma
@maudnals maudnals added the type:bug Bug in the Source Code of MediaPipe Solution label Aug 8, 2024
@kuaashish kuaashish assigned kuaashish and unassigned ayushgdev Aug 9, 2024
@kuaashish kuaashish added type:feature Enhancement in the New Functionality or Request for a New Solution task:LLM inference Issues related to MediaPipe LLM Inference Gen AI setup platform:javascript MediaPipe Javascript issues and removed type:bug Bug in the Source Code of MediaPipe Solution labels Aug 9, 2024
kuaashish (Collaborator) commented Aug 9, 2024

Hi @maudnals,

Thank you for requesting support for non-Chromium browsers. We have shared this feature request with our team, and its implementation will depend on future demand and discussions. However, we cannot provide a timeline at this time.

@kuaashish kuaashish added the stat:awaiting googler Waiting for Google Engineer's Response label Aug 9, 2024