
README.md: Add OpenVINO support details #1112

Merged

Conversation

RyanMetcalfeInt8
Contributor

Took some of the setup details from PR #1037 and added them to the README. Feel free to suggest clarifications to the wording, or a different location for this information. Thanks again!

ggerganov merged commit 1fa360f into ggerganov:master on Jul 25, 2023
jacobwu-b pushed a commit to jacobwu-b/Transcriptify-by-whisper.cpp that referenced this pull request Oct 24, 2023
The first run on an OpenVINO device is slow, since the OpenVINO framework will compile the IR (Intermediate Representation) model into a device-specific 'blob'. This device-specific blob is cached and reused on subsequent runs.
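For context, a minimal sketch of what this caching looks like at the OpenVINO C++ API level: setting `ov::cache_dir` on the `ov::Core` before `compile_model` makes OpenVINO write the device-specific blob on the first compile and load it from the cache afterwards. The model path, cache directory, and device name below are placeholders for illustration, not values specified in this PR.

```cpp
#include <openvino/openvino.hpp>
#include <iostream>
#include <string>

int main() {
    ov::Core core;

    // Enable OpenVINO's model cache: the compiled device-specific blob is
    // written here on the first run and reused on subsequent runs.
    core.set_property(ov::cache_dir("ov_cache"));

    // Placeholder paths/device: whisper.cpp's conversion script produces an
    // encoder IR (.xml/.bin pair); adjust to wherever your model lives.
    const std::string model_path = "models/ggml-base.en-encoder-openvino.xml";
    const std::string device     = "GPU";  // e.g. "CPU" or "GPU"

    // First call: the IR is compiled to a device-specific blob (slow).
    // Later calls with the same cache_dir load the cached blob instead (fast).
    ov::CompiledModel compiled = core.compile_model(model_path, device);

    std::cout << "Model compiled for " << device << std::endl;
    return 0;
}
```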

For more information about the Core ML implementation please refer to PR [#1037](https://github.com/ggerganov/whisper.cpp/pull/1037).


Core ML or maybe OpenVINO?

landtanin pushed a commit to landtanin/whisper.cpp that referenced this pull request Dec 16, 2023
iThalay pushed a commit to iThalay/whisper.cpp that referenced this pull request Sep 23, 2024