tfjs-node support for saved models does not recognize valid dtypes #3930
More details: there is an issue with […]. I've found your earlier fix #2981, which patches […]:
a) The same is also needed in […].
b) During model execution, the model expects to receive […].
So I'm not sure that simply mapping uint8 to int32 is a fix? Referencing previous issue #3374, closed without resolution. A gist with test code is at https://gist.github.com/vladmandic/a7cf75109b7b48f8914a5b18da5c498f
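For context, a minimal sketch of the kind of dtype-mapping function being discussed. The function name and cases here are illustrative assumptions, not tfjs-node's exact source; the real mapping lives in tfjs-node's saved-model code:

```js
// Illustrative sketch only: mapping a TF proto dtype string to a TFJS dtype.
// DT_UINT8 falling through to the error branch is the reported failure.
// Mapping it to 'int32' (as #2981 did elsewhere) widens the type, which is
// why the comment above questions whether that alone is a complete fix.
function mapTFDtypeToJSDtype(dtype) {
  switch (dtype) {
    case 'DT_FLOAT': return 'float32';
    case 'DT_INT32': return 'int32';
    case 'DT_BOOL': return 'bool';
    case 'DT_STRING': return 'string';
    // hypothetical extension discussed in this thread:
    case 'DT_UINT8': return 'int32';
    default: throw new Error(`Unsupported tensor DataType: ${dtype}`);
  }
}
```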
The same issue happens with […] after converting a GraphModel to SavedModel using […].
@loretoparisi Yup, I've reported that under #4004 and it was just fixed in #4008.
Hey @vladmandic, in the meantime […]. The following works:

```js
const tfjsnode = require('@tensorflow/tfjs-node');

var loadSavedModel = function (path) {
    return new Promise(function (resolve, reject) {
        // note: this originally read `this.path`, which is undefined in this
        // context; the `path` argument is what is meant here
        tfjsnode.node.loadSavedModel(path)
            .then(res => {
                console.log("loadSavedModel OK");
                resolve(res);
            })
            .catch(err => reject(err));
    });
}

loadSavedModel('/Users/loretoparisi/webservice/toxicity_model/saved')
    .catch(err => console.error("loadSavedModel", err));
```

This works fine:

```
2020-10-05 12:38:29.166043: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /Users/loretoparisi/webservice/toxicity_model/saved
2020-10-05 12:38:29.193234: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-10-05 12:38:29.252666: I tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
2020-10-05 12:38:29.252729: I tensorflow/cc/saved_model/loader.cc:212] The specified SavedModel has no variables; no checkpoints were restored. File does not exist: /Users/loretoparisi/webservice/toxicity_model/saved/variables/variables.index
2020-10-05 12:38:29.252752: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 86709 microseconds.
```

BUT, if I apply this to the TFJS model wrapper here:

```js
ToxicityClassifier.prototype.loadModel = function () {
    return __awaiter(this, void 0, void 0, function () {
        return __generator(this, function (_a) {
            return [2, tfjsnode.node.loadSavedModel(path)];
        });
    });
};
```

it will fail due to another error:

```
(node:43549) UnhandledPromiseRejectionWarning: Error: SavedModel outputs information is not available yet.
    at TFSavedModel.get [as outputs] (/Users/loretoparisi/Documents/Projects/AI/tensorflow-node-examples/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:265:19)
    at ToxicityClassifier.<anonymous> (/Users/loretoparisi/Documents/Projects/AI/tensorflow-node-examples/toxicity-example/lib/toxicity/dist/index.js:101:35)
    ...
```
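As an aside, `tf.node.loadSavedModel` already returns a Promise, so the explicit Promise wrapper above is not strictly needed; a minimal equivalent sketch (the path is a placeholder):

```js
const tf = require('@tensorflow/tfjs-node');

// loadSavedModel already returns a Promise, so async/await suffices
async function loadSavedModel(path) {
  const model = await tf.node.loadSavedModel(path);
  console.log('loadSavedModel OK');
  return model;
}

loadSavedModel('/path/to/toxicity_model/saved')
  .catch(err => console.error('loadSavedModel', err));
```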
@loretoparisi Do you still have the same problem? It works fine for me with int64 and int32 inputs with TFJS v2.5.0, and with int32 inputs in v2.3.0 and v2.4.0.
@patlevin thank you, let me check, they have just released […].
@patlevin @vladmandic So the model correctly loads with tfjs […]. I have opened a specific issue here: #4035. Thank you!
@loretoparisi Interesting. I used the following code and it worked just fine:

```js
const tf = require('@tensorflow/tfjs-node')

async function run() {
  const model = await tf.node.loadSavedModel('./models/toxicity_saved/')
  // both indexArray and valueArray are obtained from two preprocessed test
  // phrases that I used to verify model outputs;
  // 13 [sentence, position] pairs match the 13 token values below
  const indexArray = [
    [0, 0], [0, 1], [0, 2], [0, 3], [0, 4], [0, 5], [0, 6], [0, 7], [0, 8],
    [1, 0], [1, 1], [1, 2], [1, 3]
  ]
  const valueArray = [215, 13, 53, 4461, 2951, 519, 1129, 7, 78, 16, 123, 20, 6]
  const indices = tf.tensor2d(indexArray, [indexArray.length, 2], 'int32')
  const values = tf.tensor1d(valueArray, 'int32')
  const modelInputs = {
    Placeholder_1: indices,
    Placeholder: values
  }
  const labels = model.predict(modelInputs)
  indices.dispose()
  values.dispose()
  const outputs = []
  for (const name in labels) {
    const prediction = labels[name].dataSync()
    const results = []
    for (let input = 0; input < 2; ++input) {
      const probs = prediction.slice(input * 2, input * 2 + 2)
      let match = null
      if (Math.max(probs[0], probs[1]) > 0.9) {
        match = probs[0] > probs[1]
      }
      const p = probs.toString() // just to print out the numbers
      results.push({p, match})
    }
    outputs.push({label: name.split('/')[0], results})
  }
  for (const x of outputs) {
    console.log(x)
  }
}

run()
```

The model methods […]
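One addition worth noting (not part of the original comment): the output tensors returned by `predict` also hold native memory, so in a long-running process it is safer to release them inside `run()` once their data has been read:

```js
// after dataSync(), release each output tensor (labels is a name -> tensor map)
for (const name in labels) {
  labels[name].dispose();
}
```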
@patlevin thanks, I confirm that this way it works!
While, according to what you say, using the other way the error comes from within the […], hence the offending line in the […] is:

```js
this.labels =
    model.outputs.map(function (d) { return d.name.split('/')[0]; });
if (this.toxicityLabels.length === 0) {
    this.toxicityLabels = this.labels;
}
```

which must be changed in some way using the […].
@loretoparisi I'll create a pull-request that implements […].
Super! In the meantime I have found the outputs doing something as you suggested:

```js
const modelInfo = await tf.node.getMetaGraphsFromSavedModel('./toxicity_model/saved');
console.dir(modelInfo[0].signatureDefs.serving_default.outputs, { depth: null, maxArrayLength: null });
```
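Building on that, a hedged sketch of how the classifier's labels could be derived from the signature outputs instead of `model.outputs`, assuming the output names keep the `<label>/...` pattern used in the offending line above:

```js
// derive the toxicity labels from the SavedModel signature instead of
// relying on the not-yet-available model.outputs property
const outputInfo = modelInfo[0].signatureDefs.serving_default.outputs;
const labels = Object.keys(outputInfo).map((name) => name.split('/')[0]);
console.log(labels);
```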
@loretoparisi btw, one advantage of working with […]: see #3942 for details.
@patlevin thanks for the test script.

```js
const use = require("@tensorflow-models/universal-sentence-encoder");
const model = await tf.node.loadSavedModel('./toxicity_model/saved');
const tokenizer = await use.load();
const sentences = ['you suck', 'hello how are you?'];
var codes = await tokenizer.embed(sentences);
console.log(codes);
```

Now I have an array of Tensors like […], and now I have to turn it into indexes and values. I have tried this:

```js
var encodings = await tokenizer.embed(sentences);
var indicesArr = encodings.map(function (arr, i) { return arr.map(function (d, index) { return [i, index]; }); });
var flattenedIndicesArr = [];
for (i = 0; i < indicesArr.length; i++) {
    flattenedIndicesArr =
        flattenedIndicesArr.concat(indicesArr[i]);
}
var indices = tf.tensor2d(flattenedIndicesArr, [flattenedIndicesArr.length, 2], 'int32');
var values = tf.tensor1d(tf.util.flatten(encodings), 'int32');
```

But it seems that […]. Thank you.
@loretoparisi The model expects an encoded word vector as input, while the Universal Sentence Encoder (USE) model returns embeddings. Basically, you'll want to use the […].

Unfortunately @pyu10055's commit b02310745ceac6b8e4a475719c343da53e3cade2 on the USE-repo broke both the Toxicity […]. The real problem is that the examples are outdated and some changes broke TFJS 2.x compatibility (in the case of USE I fail to see the reasoning behind the change; it might have been a mistake?).

Meanwhile, I'll create a gist for you that contains all you need to get this working as a single-file solution. I'll get back to you in a bit.

EDIT: I got confused here, since a similar issue was raised w.r.t. outdated tfjs-examples. The same applies to tfjs-models, though: basically some models are incompatible with TFJS 2.x due to package changes (not for technical reasons).
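A minimal sketch of the tokenize-then-sparse-encode flow described above, assuming a package version that still exports `loadTokenizer()` (the commit mentioned above changed these exports, so an older release may be needed):

```js
const tf = require('@tensorflow/tfjs-node');
const use = require('@tensorflow-models/universal-sentence-encoder');

async function buildToxicityInputs(sentences) {
  const tokenizer = await use.loadTokenizer();
  // tokenize (NOT embed): each sentence becomes an array of vocabulary ids
  const encodings = sentences.map((s) => tokenizer.encode(s));
  // build the sparse [sentence, position] index pairs and flat value list,
  // matching the shape of the working example earlier in this thread
  const indexArray = [];
  const valueArray = [];
  encodings.forEach((ids, sentence) => {
    ids.forEach((id, position) => {
      indexArray.push([sentence, position]);
      valueArray.push(id);
    });
  });
  const indices = tf.tensor2d(indexArray, [indexArray.length, 2], 'int32');
  const values = tf.tensor1d(valueArray, 'int32');
  return { Placeholder_1: indices, Placeholder: values };
}
```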
@patlevin thank you! In fact I have just realized that after the recent updates, the models' examples and core code diverged!
I have the same issue, running node v15.7.0 and tfjs-node v3.6.1 on macOS Big Sur.
|
Simply calling

```js
tfnode.node.getMetaGraphsFromSavedModel(path);
```

on a model using `uint8` results in an error: […]. However, support for `uint8` was added to `tfjs` via #2981 back in March. Those newly supported data types should be added throughout the `tfjs` codebase.

Environment: Ubuntu 20.04 running NodeJS 14.9.0 with TFJS 2.3.0