Issue with using different model #5
Comments
I can't be sure it will work for you, but I suggest you update the core MXNet. If you use the original mxnet_predict-all.cc, you have to create a new one following this guide: https://github.com/dmlc/mxnet/tree/master/amalgamation . I had to do that to use newer model architectures.
Hi, @SlipknotTN
This is my latest tested version, but it is a year old, so I suggest you run the amalgamation from scratch. Follow the iOS instructions here: https://github.com/dmlc/mxnet/tree/master/amalgamation . By updating mxnet_predict-all.cc I resolved the problem and managed to run a model not supported by the version present in this repo. I tested a SqueezeNet network.
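The regeneration step described above can be sketched roughly as follows. The exact make targets and flags are assumptions on my side; check the amalgamation README linked above for the current procedure.

```shell
# Rough sketch of regenerating mxnet_predict-all.cc via the amalgamation
# (flags are assumptions; consult the amalgamation README for your version).
git clone --recursive https://github.com/dmlc/mxnet
cd mxnet/amalgamation
make clean
make MIN=1   # MIN=1 builds the minimal predict-only amalgamation
# The generated mxnet_predict-all.cc then replaces the old copy in the iOS project.
```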
Thanks for your help! In fact, I followed the instructions you mentioned, created the prediction file, and it works on iOS. If you need the file, I can share it with you.
Thanks, guys. Sorry that I don't have time to maintain the mxnet file.
Hi,
I'm trying to use converted Caffe models and other MXNet models in place of the Inception-BN network model. The app crashes at the execution of MXPredForward(predictor), during p->exec->Forward(false). Has anyone else faced the same issue? Any help would be appreciated. Thanks for the amazing app!
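For context, the crash above happens inside the MXNet C predict API that the amalgamation exposes. A minimal sketch of that call sequence is shown below; the model paths, input name "data", and the 1x3x224x224 shape are placeholder assumptions for illustration, and error checking is what typically surfaces a model/version mismatch like this one.

```c
// Sketch of the MXNet C predict API call sequence (from c_predict_api.h in
// the amalgamation). Paths, input name, and shape are hypothetical.
#include <stdio.h>
#include "c_predict_api.h"

int main(void) {
    // Placeholders: in a real app these are read from the .json / .params files.
    const char* symbol_json = "";   /* contents of model-symbol.json */
    const void* param_bytes = NULL; /* contents of model.params      */
    int param_len = 0;

    PredictorHandle pred = NULL;
    const char* input_keys[1] = { "data" };
    const mx_uint input_shape_indptr[2] = { 0, 4 };
    const mx_uint input_shape_data[4] = { 1, 3, 224, 224 }; // assumed NCHW shape

    // Create the predictor on CPU (dev_type=1, dev_id=0).
    int ret = MXPredCreate(symbol_json, param_bytes, param_len,
                           1, 0,
                           1, input_keys, input_shape_indptr, input_shape_data,
                           &pred);
    if (ret != 0) {
        // A version mismatch between the model and mxnet_predict-all.cc
        // usually shows up here or in MXPredForward.
        fprintf(stderr, "MXPredCreate failed: %s\n", MXGetLastError());
        return 1;
    }

    // Fill the input blob, run the forward pass, then free the predictor.
    mx_float image_data[1 * 3 * 224 * 224] = { 0 }; // placeholder input
    MXPredSetInput(pred, "data", image_data, 1 * 3 * 224 * 224);
    if (MXPredForward(pred) != 0) {
        fprintf(stderr, "MXPredForward failed: %s\n", MXGetLastError());
    }
    MXPredFree(pred);
    return 0;
}
```

Checking the return value of each call (and printing MXGetLastError) is often enough to turn a hard crash into a readable "unknown operator" style message, which points at the amalgamation update discussed above.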