Description
From @kimamula on April 3, 2018 17:17
TensorFlow.js Converter version: 0.1.0
Browser version: Chrome 65.0.3325.181 (64-bit)
Describe the problem or feature request
When I execute a model (i.e., call FrozenModel#execute()) that was converted by tensorflowjs_converter and loaded with loadFrozenModel(), it fails with the error "Error in matMul: inputs must be rank 2, got ranks 1 and 2".
I compared the .pb files before and after the conversion and found that a reshape operation is removed during the conversion.
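To illustrate the failure mode, here is a minimal NumPy sketch (not the tfjs-core code; the shapes are invented for illustration): once the Reshape is dropped, the MatMul receives a rank-1 tensor, whereas a strict rank-2 matmul needs the rank-2 input that the Reshape used to produce.

```python
import numpy as np

# Rank-1 tensor, like the Squeeze output once the Reshape is gone.
a = np.array([1.0, 2.0, 3.0])   # shape (3,)  -> rank 1
b = np.ones((3, 2))             # weights     -> rank 2

# A strict rank-2 matmul (like tfjs-core's matMul here) rejects `a` as-is.
# The dropped Reshape effectively restored rank 2 before the MatMul:
a2 = a.reshape(1, -1)           # shape (1, 3) -> rank 2
result = a2 @ b                 # shape (1, 2)
print(result.shape)
```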
- Before the conversion (showing only the relevant part of the .pb file): Squeeze -> Reshape -> PlaceholderWithDefault -> MatMul
...
MobilenetV1/Logits/SpatialSqueeze      (Squeeze)                <- MobilenetV1/Logits/Conv2d_1c_1x1/BiasAdd
MobilenetV1/Predictions/Reshape/shape  (Const)
MobilenetV1/Predictions/Reshape        (Reshape)                <- MobilenetV1/Logits/SpatialSqueeze, MobilenetV1/Predictions/Reshape/shape
input_1/BottleneckInputPlaceholder     (PlaceholderWithDefault) <- MobilenetV1/Predictions/Reshape
...
final_training_ops/Wx_plus_b/MatMul    (MatMul)                 <- input_1/BottleneckInputPlaceholder, final_training_ops/weights/final_weights/read
...
- After the conversion: Squeeze -> PlaceholderWithDefault -> MatMul (the Reshape disappeared)
...
MobilenetV1/Logits/SpatialSqueeze   (Squeeze)                <- MobilenetV1/Logits/Conv2d_1c_1x1/BiasAdd
input_1/BottleneckInputPlaceholder  (PlaceholderWithDefault) <- MobilenetV1/Logits/SpatialSqueeze
final_training_ops/Wx_plus_b/MatMul (MatMul)                 <- input_1/BottleneckInputPlaceholder, final_training_ops/weights/final_weights
...
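The comparison above can be reproduced mechanically. Below is a plain-Python sketch that diffs the two node lists; the (name, op) pairs are transcribed from the dumps above, though in practice they would be read from each GraphDef (e.g. by iterating graph_def.node and collecting node.name and node.op).

```python
# (name, op) pairs as they appear in the two .pb dumps above.
before = [
    ('MobilenetV1/Logits/SpatialSqueeze', 'Squeeze'),
    ('MobilenetV1/Predictions/Reshape/shape', 'Const'),
    ('MobilenetV1/Predictions/Reshape', 'Reshape'),
    ('input_1/BottleneckInputPlaceholder', 'PlaceholderWithDefault'),
    ('final_training_ops/Wx_plus_b/MatMul', 'MatMul'),
]
after = [
    ('MobilenetV1/Logits/SpatialSqueeze', 'Squeeze'),
    ('input_1/BottleneckInputPlaceholder', 'PlaceholderWithDefault'),
    ('final_training_ops/Wx_plus_b/MatMul', 'MatMul'),
]

# Nodes present before conversion but missing afterwards:
# the Reshape and its shape Const.
removed = [node for node in before if node not in after]
print(removed)
```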
I confirmed that when I modify matrices_executor.ts as follows, the loaded model works as expected.
export let executeOp: OpExecutor =
    (node: Node, tensorMap: NamedTensorsMap): tfc.Tensor[] => {
      switch (node.op) {
        case 'matMul':
+         const a = getParamValue('a', node, tensorMap) as tfc.Tensor2D;
+         const b = getParamValue('b', node, tensorMap) as tfc.Tensor2D;
+         if (a.rank === 1 && b.rank === 2) {
+           return [tfc.vectorTimesMatrix(a, b)];
+         }
          return [tfc.matMul(
+             a,
+             b,
-             getParamValue('a', node, tensorMap) as tfc.Tensor2D,
-             getParamValue('b', node, tensorMap) as tfc.Tensor2D,
              getParamValue('transposeA', node, tensorMap) as boolean,
              getParamValue('transposeB', node, tensorMap) as boolean)];
        case 'transpose':
          return [tfc.transpose(
              getParamValue('x', node, tensorMap) as tfc.Tensor,
              getParamValue('perm', node, tensorMap) as number[])];
        default:
          throw TypeError(`Node type ${node.op} is not implemented`);
      }
    };

Code to reproduce the bug / link to feature request
I prepared my original model by retraining MobileNet on my own categories, as described in the TensorFlow For Poets codelab.
Then I converted the resulting model to the SavedModel format with the following script.
import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = 'path/to/saved_model'
graph_pb = 'path/to/original_pb'

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.gfile.GFile(graph_pb, 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Session(graph=tf.Graph()) as sess:
    tf.import_graph_def(graph_def, name='')
    g = tf.get_default_graph()
    inp = g.get_tensor_by_name('input:0')
    out = g.get_tensor_by_name('final_result:0')
    predict_signature = tf.saved_model.signature_def_utils.predict_signature_def(
        {'input': inp}, {'output': out})
    builder.add_meta_graph_and_variables(sess, [tag_constants.SERVING], signature_def_map={
        signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: predict_signature
    })

builder.save()

Finally, I converted the SavedModel with tensorflowjs_converter as follows.
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --saved_model_tags=serve \
    --output_node_names="final_result" \
    path/to/saved_model \
    path/to/output

Copied from original issue: tensorflow/tfjs-core#919