model_test: concat op InferShape error when running a U-Net model #2724
Comments
The model was generated with TensorFlow and then converted to __model__ with x2paddle; it runs fine in TensorFlow. Test environment: Paddle-Lite on Android (Snapdragon 845). The command used is:
Could you provide both the model and the parameter files? It cannot run with only the model file.
The problem has been located: the conv_transpose kernel currently does not support the output_size parameter, so the output shape becomes (1, 1) and the subsequent concat op fails. Support for this parameter is expected next week.
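A minimal sketch of the failure mechanism described above, assuming a typical U-Net decoder where conv_transpose is given an explicit output_size so its result can be concatenated with an encoder skip connection. The function names and shapes below are illustrative, not Paddle-Lite source: if the kernel ignores output_size, the upsampled feature map no longer matches the skip branch, and concat's shape inference rejects the mismatch.

```python
# Illustrative sketch (hypothetical helpers, not Paddle-Lite code).

def conv_transpose_out_dim(in_dim, kernel, stride, padding, output_size=None):
    # Default transposed-convolution output size.
    computed = (in_dim - 1) * stride - 2 * padding + kernel
    # U-Net decoders often pass output_size to match the encoder feature map
    # exactly; a kernel that honors it returns that value instead.
    return output_size if output_size is not None else computed

def concat_infer_shape(shapes, axis=1):
    # concat requires all dims except the concat axis to match.
    base = list(shapes[0])
    for s in shapes[1:]:
        for d, (a, b) in enumerate(zip(base, s)):
            if d != axis and a != b:
                raise ValueError(f"concat InferShape mismatch on dim {d}: {a} vs {b}")
        base[axis] += s[axis]
    return base

skip = (1, 64, 64, 64)  # encoder skip connection: 64x64 feature map

# output_size honored: upsampled map is 64x64 and concat succeeds.
h = conv_transpose_out_dim(32, kernel=3, stride=2, padding=1, output_size=64)  # 64
print(concat_infer_shape([skip, (1, 64, h, h)]))  # [1, 128, 64, 64]

# output_size ignored: the kernel falls back to the computed size (63 here;
# the thread reports the shape even collapsing to (1, 1)), so concat fails.
h_bad = conv_transpose_out_dim(32, kernel=3, stride=2, padding=1)  # 63
# concat_infer_shape([skip, (1, 64, h_bad, h_bad)])  # raises ValueError
```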
OK, thanks. Also, I tested the shufflenet_v2 model provided at https://paddlepaddle.github.io/Paddle-Lite/v2.2.0/benchmark/ ; it also contains a concat op and also fails to run. For reference.
I tested it and shuffle_net runs without problems. output_size for conv_transpose will be supported once the PR is merged.
model path:path
Error message: