Support for mobile #469
Unanswered
gillpeacegood asked this question in Q&A
Replies: 2 comments 7 replies
-
Hi @gillpeacegood , there is a working example for mobile deployment of yolort:

```python
## Export the TorchScript-ed model
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

from yolort.models import YOLO
from yolort.v5 import attempt_download

# Prepare some parameters for the exported TorchScript-ed module
score_thresh = 0.25
nms_thresh = 0.45
device = torch.device("cpu")

# Downloaded from 'https://github.com/ultralytics/yolov5/releases/download/v6.1/yolov5s.pt'
model_path = "yolov5s.pt"
checkpoint_path = attempt_download(model_path)
model = YOLO.load_from_yolov5(checkpoint_path, score_thresh=score_thresh, nms_thresh=nms_thresh)
model = model.eval()
model = model.to(device)

# Script the model, then optimize it for the mobile (lite) interpreter
export_scripted_path = "yolov5s_scripted.pt"
export_optimized_path = "yolov5s_scriptmodule.ptl"
scripted_model = torch.jit.script(model)
scripted_model.save(export_scripted_path)
optimized_model = optimize_for_mobile(scripted_model)
optimized_model._save_for_lite_interpreter(export_optimized_path)
```

See facebookresearch/playtorch#10 (comment) for more details.
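As a quick sanity check (not from the original thread), the same script / `optimize_for_mobile` / `_save_for_lite_interpreter` / reload round trip can be exercised on a tiny stand-in module before exporting the full yolort model; the `TinyModel` class and the `tiny_scriptmodule.ptl` file name below are illustrative assumptions:

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile
from torch.jit.mobile import _load_for_lite_interpreter


class TinyModel(torch.nn.Module):
    # Stand-in for the real detection model: a single ReLU op
    def forward(self, x):
        return torch.relu(x)


# Same pipeline as above: script -> optimize -> save for the lite interpreter
scripted = torch.jit.script(TinyModel().eval())
optimized = optimize_for_mobile(scripted)
optimized._save_for_lite_interpreter("tiny_scriptmodule.ptl")

# Reload with the lite-interpreter loader and run a forward pass
loaded = _load_for_lite_interpreter("tiny_scriptmodule.ptl")
out = loaded(torch.tensor([-1.0, 2.0]))
```

If this round trip fails in your environment, the problem is with the PyTorch mobile toolchain rather than with the yolort model itself.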
-
Hi, I'm not sure whether this is a bug, a feature, or neither, but I have the following problem when trying to save a model for the lite interpreter per the PyTorch mobile pipeline.
And this happens: