CoreMLaMa: LaMa for Core ML

This repo contains a script for converting a LaMa (aka cute, fuzzy 🦙) model to Apple's Core ML model format. More specifically, it converts the implementation of LaMa from Lama Cleaner.

This repo also includes a simple example of how to use the Core ML model for prediction. See Sample.
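Outside of Xcode, the converted package can also be exercised from Python with coremltools. The snippet below is a minimal sketch, not code from this repo or its Sample: the feature names ("image", "mask"), the NCHW layout, and the 800x800 shapes are assumptions, so check the names and shapes the model actually reports before predicting.

    import numpy as np
    import coremltools as ct

    # Load the converted package produced by convert_lama.py.
    mlmodel = ct.models.MLModel("LaMa.mlpackage")

    # Print the input/output feature names and shapes the model actually expects.
    print(mlmodel.get_spec().description)

    # Assumed names and shapes (NCHW image + single-channel mask) -- adjust to
    # whatever the description above reports.
    image = np.random.rand(1, 3, 800, 800).astype(np.float32)
    mask = np.zeros((1, 1, 800, 800), dtype=np.float32)

    result = mlmodel.predict({"image": image, "mask": mask})
    print({name: out.shape for name, out in result.items()})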

iOS Deployment Notes

The Core ML model this script produces was designed for macOS deployments. It runs well on macOS, on the GPU. I have received several reports of unsuccessful attempts to run this model on iOS, especially with fp16 precision on the Neural Engine. To date, I have not received any reports of successful deployments to iOS.

It may well be possible to run this model on iOS with some tuning of the conversion process; I simply have not attempted it. I would very much welcome, and credit, a PR from anyone who can convert this model and run it well on iOS. One cheap experiment is sketched below.
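A low-effort way to test whether the Neural Engine is the culprit, shown here as an illustrative coremltools sketch rather than anything from this repo, is to keep the existing LaMa.mlpackage but restrict execution to the CPU and GPU. This changes where the model runs, not how it was converted, so it may or may not resolve fp16 issues; on the Swift side the equivalent knob is MLModelConfiguration.computeUnits.

    import coremltools as ct

    # Load the already-converted package, but disallow the Neural Engine.
    mlmodel = ct.models.MLModel(
        "LaMa.mlpackage",
        compute_units=ct.ComputeUnit.CPU_AND_GPU,
    )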

Conversion Instructions

  1. Create a Conda environment for CoreMLaMa:

    conda create -n coremllama python=3.10 # works with mamba, too
    conda activate coremllama
    pip install -r requirements.txt
  2. Run the conversion script:

    python convert_lama.py

This script will download and convert Big LaMa to a Core ML package named LaMa.mlpackage.
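For orientation, the overall flow inside a script like this is to trace the PyTorch model and hand the trace to coremltools. The sketch below is illustrative only and is not convert_lama.py: the TinyInpainter stand-in, the feature names, and the 800x800 shapes are placeholders for the real Big LaMa model pulled in from Lama Cleaner.

    import torch
    import coremltools as ct

    class TinyInpainter(torch.nn.Module):
        """Stand-in for the real LaMa module from Lama Cleaner (illustration only)."""
        def forward(self, image, mask):
            # Trivially "inpaint" by zeroing out the masked region.
            return image * (1.0 - mask)

    model = TinyInpainter().eval()

    # Trace with example inputs of the shape the Core ML model should accept.
    example_image = torch.rand(1, 3, 800, 800)
    example_mask = torch.zeros(1, 1, 800, 800)
    traced = torch.jit.trace(model, (example_image, example_mask))

    mlmodel = ct.convert(
        traced,
        convert_to="mlprogram",  # ML Program packages default to fp16 weights;
                                 # pass compute_precision=ct.precision.FLOAT32 to keep fp32
        inputs=[
            ct.TensorType(name="image", shape=example_image.shape),
            ct.TensorType(name="mask", shape=example_mask.shape),
        ],
    )
    mlmodel.save("Sketch.mlpackage")  # convert_lama.py writes the real LaMa.mlpackage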

Acknowledgements and Thanks

Thanks to the authors of LaMa:

[Project page] [arXiv] [Supplementary] [BibTeX] [Casual GAN Papers Summary]

CoreMLaMa uses the LaMa model and supporting code from Lama Cleaner. Lama Cleaner makes this project much simpler.