Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype #23
Comments
Okay, this should be related to this PR: tensorflow/tfjs#4008 |
I'm not sure I understand what you are trying to do. |
Yes, I know, but I need it to be loaded locally, and the current GraphModel API cannot load artifacts from the file protocol, only from a URI.
Also, I think it's more efficient to load the local files than via HTTP...
By the way, it seems the last PR in TFJS fixes this, so we just have to wait for the next release (the current 2.4.0 is where the DT_INT64 problem is).
> On Fri, Oct 2, 2020 at 19:22, Patrick Levin <notifications@github.com> wrote:
>
> I'm not sure I understand what you are trying to do.
> The model in question is a TFJS model, so there is no need to convert it into a saved model; you can just load it directly into node. The model works OOTB with nodejs and in the browser.
|
Thanks for the explanation! Since I use TF with Python and C++ only, I wasn't aware of this limitation. |
@loretoparisi I have bad news on this one. Unfortunately you'll have to wait for the TFJS team to release an update. The reason is that if I change the weight node types, all operations that use them as inputs will need to have their input- and output types changed as well. The latter is the actual problem since the type change now cascades to all nodes connected to that node... I still might give it a try just for exercise, but don't count on it. |
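The cascade Patrick describes can be pictured with a toy graph in plain JavaScript (hypothetical node names and a deliberately simplified structure; this is not the converter's actual code): changing a weight node's dtype forces the same change on every op that consumes it, and the change keeps propagating through matching consumers.

```javascript
// Toy graph: each node has a dtype and a list of input node names.
// Node names here are made up for illustration.
const graph = {
  weights: { dtype: 'int64',   inputs: [] },
  gather:  { dtype: 'int64',   inputs: ['weights'] },
  cast:    { dtype: 'float32', inputs: ['gather'] },
};

// Change a node's dtype and propagate the change to every int64
// consumer, which in turn propagates to its own consumers, and so on.
function retype(graph, name, dtype) {
  const node = graph[name];
  if (node.dtype === dtype) return;
  node.dtype = dtype;
  for (const [otherName, other] of Object.entries(graph)) {
    if (other.inputs.includes(name) && other.dtype === 'int64') {
      retype(graph, otherName, dtype);
    }
  }
}

retype(graph, 'weights', 'int32');
console.log(graph.gather.dtype); // 'int32'  (the change cascaded)
console.log(graph.cast.dtype);   // 'float32' (already a supported type)
```

In a real GraphDef the consumers can themselves feed further int64 ops, so converting a single weight can end up touching a large part of the graph, which is why the fix was non-trivial.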
Thanks a lot, Patrick, it makes sense. Hopefully the DT_INT64 support will be ready in a month or so! |
@loretoparisi Good news! I managed to solve the problem by converting incompatible inputs in the graph. The new version will be available on PyPI in just a few moments. |
@patlevin wow, that's amazing!!! 💯 🥇 I will test as well and get back to you! |
@patlevin Just to be sure we are working on the same model: I have downloaded the model from TFHub here and then run the conversion. This is my JavaScript test:

```javascript
const tfjsnode = require('@tensorflow/tfjs-node');
const tfconv = require("@tensorflow/tfjs-converter");

var loadGraphModel = function (url) {
  return new Promise(function (resolve, reject) {
    tfconv.loadGraphModel(url, { fromTFHub: true })
      .then(res => {
        console.log("loadGraphModel");
        resolve(res);
      })
      .catch(err => reject(err));
  });
};

var loadSavedModel = function (path) {
  return new Promise(function (resolve, reject) {
    tfjsnode.node.loadSavedModel(path)
      .then(res => {
        console.log("loadSavedModel");
        resolve(res);
      })
      .catch(err => reject(err));
  });
};

loadGraphModel('https://tfhub.dev/tensorflow/tfjs-model/toxicity/1/default/1')
  .catch(err => console.error("loadGraphModel", err));

loadSavedModel('/Users/loretoparisi/webservice/toxicity_model/saved')
  .catch(err => console.error("loadSavedModel", err));
```

This is the current output, so it seems the problem is still there, but maybe it's my fault at this point... |
@loretoparisi You need to use the compatibility flag: `tfjs_graph_converter --output_format tf_saved_model --compat_mode ./ ./saved/`. The compatibility mode is optional; the default behaviour is to keep all types as-is. |
@patlevin Ah, right!!
Perfect, it works! Thank you! 🥇 |
Hello,
I'm trying to convert this TFJS GraphModel:
The conversion works without any issues with the command `tfjs_graph_converter --output_format tf_saved_model ./ ./saved/`, but when I try to load the saved model, I get the error: