Merged
78 changes: 49 additions & 29 deletions python/paddle/fluid/dygraph/jit.py
@@ -509,33 +509,33 @@ def save(layer, path, input_spec=None, **configs):
Saves the input Layer as a ``paddle.jit.TranslatedLayer``
format model, which can be used for inference or fine-tuning after loading.

It will save the translated program and all related persistable
variables of the input Layer to the given ``path`` .

``path`` is the prefix of the saved objects: the saved translated program file
suffix is ``.pdmodel`` , the saved persistable variables file suffix is ``.pdiparams`` ,
and some additional variable description information is also saved to a file
whose suffix is ``.pdiparams.info``; this additional information is used in fine-tuning.

The saved model can be loaded by the following APIs:
- ``paddle.jit.load``
- ``paddle.static.load_inference_model``
- Other C++ inference APIs

Args:
    layer (Layer): The Layer to be saved.
    path (str): The path prefix to save the model. The format is ``dirname/file_prefix`` or ``file_prefix``.
    input_spec (list[InputSpec|Tensor], optional): Describes the input of the saved model's forward
        method, which can be described by InputSpec or example Tensor. If None, all input variables of
        the original Layer's forward method would be the inputs of the saved model. Default None.
    **configs (dict, optional): Other save configuration options for compatibility. We do not
        recommend using these configurations; they may be removed in the future. If not necessary,
        DO NOT use them. Default None.
        The following options are currently supported:
        (1) output_spec (list[Tensor]): Selects the output targets of the saved model.
        By default, all return variables of the original Layer's forward method are kept as the
        output of the saved model. If the provided ``output_spec`` list is not all output variables,
        the saved model will be pruned according to the given ``output_spec`` list.

Returns:
None
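
A minimal usage sketch (the ``LinearNet`` layer and the ``example.model/linear`` path are hypothetical, used only for illustration; ``paddle.jit.save``, ``paddle.jit.to_static`` and ``paddle.static.InputSpec`` are the documented APIs):

.. code-block:: python

    import paddle
    from paddle.static import InputSpec

    class LinearNet(paddle.nn.Layer):
        # Hypothetical layer, defined only to demonstrate the save call.
        def __init__(self):
            super(LinearNet, self).__init__()
            self._linear = paddle.nn.Linear(784, 10)

        @paddle.jit.to_static
        def forward(self, x):
            return self._linear(x)

    layer = LinearNet()
    # Writes linear.pdmodel, linear.pdiparams and linear.pdiparams.info
    # under the example.model directory.
    paddle.jit.save(
        layer=layer,
        path="example.model/linear",
        input_spec=[InputSpec(shape=[None, 784], dtype='float32')])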
@@ -793,8 +793,8 @@ def load(path, **configs):
"""
:api_attr: imperative

Load a model saved by ``paddle.jit.save`` or ``paddle.static.save_inference_model`` or
the paddle 1.x API ``paddle.fluid.io.save_inference_model`` as ``paddle.jit.TranslatedLayer``,
then perform inference or fine-tune training.

.. note::
@@ -807,14 +807,14 @@ def load(path, **configs):

Args:
    path (str): The path prefix to load the model. The format is ``dirname/file_prefix`` or ``file_prefix`` .
    **configs (dict, optional): Other load configuration options for compatibility. We do not
        recommend using these configurations; they may be removed in the future. If not necessary,
        DO NOT use them. Default None.
        The following options are currently supported:
        (1) model_filename (str): The inference model file name of the paddle 1.x
        ``save_inference_model`` save format. Default file name is :code:`__model__` .
        (2) params_filename (str): The persistable variables file name of the paddle 1.x
        ``save_inference_model`` save format. No default file name; variables are saved separately
        by default.
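
A minimal sketch of loading for inference (the path assumes the hypothetical model saved in the ``paddle.jit.save`` example above):

.. code-block:: python

    import paddle

    # Load the translated program and persistable variables.
    loaded_layer = paddle.jit.load("example.model/linear")

    # The loaded object behaves like an ordinary Layer.
    loaded_layer.eval()
    x = paddle.randn([1, 784], dtype='float32')
    pred = loaded_layer(x)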


@@ -960,7 +960,7 @@ def __len__(self):
loader = paddle.io.DataLoader(dataset,
    feed_list=[image, label],
    places=place,
    batch_size=BATCH_SIZE,
    shuffle=True,
    drop_last=True,
    num_workers=2)
@@ -969,7 +969,7 @@ def __len__(self):
for data in loader():
    exe.run(
        static.default_main_program(),
        feed=data,
        fetch_list=[avg_loss])

model_path = "fc.example.model"
@@ -1052,7 +1052,7 @@ def _trace(layer,
class TracedLayer(object):
"""
:api_attr: imperative

TracedLayer is used to convert a forward dygraph model to a static
graph model. This is mainly used to save the dygraph model for online
inference using C++. Besides, users can also do inference in Python
@@ -1132,7 +1132,7 @@ def __init__(self):
def forward(self, input):
    return self._fc(input)


layer = ExampleLayer()
in_var = paddle.uniform(shape=[2, 3], dtype='float32')
out_dygraph, static_layer = paddle.jit.TracedLayer.trace(layer, inputs=[in_var])
@@ -1244,13 +1244,16 @@ def __call__(self, inputs):
return self._run(self._build_feed(inputs))

@switch_to_static_graph
def save_inference_model(self, path, feed=None, fetch=None):
"""
Save the TracedLayer to a model for inference. The saved
inference model can be loaded by C++ inference APIs.

``path`` is the prefix of saved objects, and the saved translated program file
suffix is ``.pdmodel`` , the saved persistable variables file suffix is ``.pdiparams`` .

Args:
path (str): The path prefix to save the model. The format is ``dirname/file_prefix`` or ``file_prefix``.
feed (list[int], optional): the input variable indices of the saved
inference model. If None, all input variables of the
TracedLayer object would be the inputs of the saved inference
@@ -1294,7 +1297,7 @@ def forward(self, input):
fetch, = exe.run(program, feed={feed_vars[0]: in_np}, fetch_list=fetch_vars)
print(fetch.shape) # (2, 10)
"""
check_type(path, "path", str,
"fluid.dygraph.jit.TracedLayer.save_inference_model")
check_type(feed, "feed", (type(None), list),
"fluid.dygraph.jit.TracedLayer.save_inference_model")
@@ -1309,6 +1312,18 @@ def forward(self, input):
check_type(f, "each element of fetch", int,
"fluid.dygraph.jit.TracedLayer.save_inference_model")

# path check
file_prefix = os.path.basename(path)
if file_prefix == "":
    raise ValueError(
        "The input path MUST be format of dirname/file_prefix "
        "[dirname\\file_prefix in Windows system], but received "
        "file_prefix is empty string.")

dirname = os.path.dirname(path)
if dirname and not os.path.exists(dirname):
    os.makedirs(dirname)

from paddle.fluid.io import save_inference_model

def get_feed_fetch(all_vars, partial_vars):
@@ -1326,9 +1341,14 @@ def get_feed_fetch(all_vars, partial_vars):
assert target_var is not None, "{} cannot be found".format(name)
target_vars.append(target_var)

model_filename = file_prefix + INFER_MODEL_SUFFIX
params_filename = file_prefix + INFER_PARAMS_SUFFIX

save_inference_model(
    dirname=dirname,
    feeded_var_names=feeded_var_names,
    target_vars=target_vars,
    executor=self._exe,
    main_program=self._program.clone(),
    model_filename=model_filename,
    params_filename=params_filename)
@@ -75,10 +75,12 @@ def test_main(self):

self.assertEqual(actual_persistable_vars, expected_persistable_vars)

traced_layer.save_inference_model(
    path='./traced_layer_test_non_persistable_vars')
self.assertTrue('traced_layer_test_non_persistable_vars.pdmodel' in
                os.listdir('./'))
self.assertTrue('traced_layer_test_non_persistable_vars.pdiparams' in
                os.listdir('./'))


if __name__ == '__main__':
40 changes: 33 additions & 7 deletions python/paddle/fluid/tests/unittests/test_traced_layer_err_msg.py
@@ -18,6 +18,7 @@
import six
import unittest
import paddle.nn as nn
import os


class SimpleFCLayer(nn.Layer):
@@ -115,36 +116,41 @@ def test_save_inference_model_err(self):
dygraph_out, traced_layer = fluid.dygraph.TracedLayer.trace(
    self.layer, [in_x])

path = './traced_layer_err_msg'
with self.assertRaises(TypeError) as e:
    traced_layer.save_inference_model([0])
self.assertEqual(
    "The type of 'path' in fluid.dygraph.jit.TracedLayer.save_inference_model must be <{} 'str'>, but received <{} 'list'>. ".
    format(self.type_str, self.type_str), str(e.exception))
with self.assertRaises(TypeError) as e:
    traced_layer.save_inference_model(path, [0], [None])
self.assertEqual(
    "The type of 'each element of fetch' in fluid.dygraph.jit.TracedLayer.save_inference_model must be <{} 'int'>, but received <{} 'NoneType'>. ".
    format(self.type_str, self.type_str), str(e.exception))
with self.assertRaises(TypeError) as e:
    traced_layer.save_inference_model(path, [0], False)
self.assertEqual(
    "The type of 'fetch' in fluid.dygraph.jit.TracedLayer.save_inference_model must be (<{} 'NoneType'>, <{} 'list'>), but received <{} 'bool'>. ".
    format(self.type_str, self.type_str, self.type_str),
    str(e.exception))
with self.assertRaises(TypeError) as e:
    traced_layer.save_inference_model(path, [None], [0])
self.assertEqual(
    "The type of 'each element of feed' in fluid.dygraph.jit.TracedLayer.save_inference_model must be <{} 'int'>, but received <{} 'NoneType'>. ".
    format(self.type_str, self.type_str), str(e.exception))
with self.assertRaises(TypeError) as e:
    traced_layer.save_inference_model(path, True, [0])
self.assertEqual(
    "The type of 'feed' in fluid.dygraph.jit.TracedLayer.save_inference_model must be (<{} 'NoneType'>, <{} 'list'>), but received <{} 'bool'>. ".
    format(self.type_str, self.type_str, self.type_str),
    str(e.exception))
with self.assertRaises(ValueError) as e:
    traced_layer.save_inference_model("")
self.assertEqual(
    "The input path MUST be format of dirname/file_prefix [dirname\\file_prefix in Windows system], "
    "but received file_prefix is empty string.", str(e.exception))

traced_layer.save_inference_model(path)

def _train_simple_net(self):
layer = None
@@ -174,5 +180,25 @@ def test_linear_net_with_none(self):
[in_x])


class TestTracedLayerSaveInferenceModel(unittest.TestCase):
    """test that save_inference_model automatically creates a non-existent dir"""

    def setUp(self):
        self.save_path = "./nonexist_dir/fc"
        import shutil
Member commented on `import shutil`:

    NIT (not important): since this PR needs many approvals, I will approve it and you can change this in the next PR.

    Python officially suggests putting imports at the top of a file, and I think you should follow that here:
    https://www.python.org/dev/peps/pep-0008/#imports

    Let me explain my understanding of the pros and cons of importing at the top of a file versus inside a function.

    Importing at the top of a file:

    1. It is more consistent with other programming languages.
    2. Imports are grouped together and are easy to organize.

    There are cases where importing inside a function makes sense:

    1. Conditional imports, e.g. `if something: import xxx`.
    2. You want the function to keep working after its code is copied to another file (which may not contain that import).

    But I don't think we meet either case here :-) so I would suggest importing at the top.

        if os.path.exists(os.path.dirname(self.save_path)):
            shutil.rmtree(os.path.dirname(self.save_path))

    def test_mkdir_when_input_path_non_exist(self):
        fc_layer = SimpleFCLayer(3, 4, 2)
        input_var = paddle.to_tensor(np.random.random([4, 3]).astype('float32'))
        with fluid.dygraph.guard():
            dygraph_out, traced_layer = fluid.dygraph.TracedLayer.trace(
Member commented on `fluid.dygraph.TracedLayer.trace(`:

    We should write new test cases for the Paddle 2.0 path, so fluid.dygraph.TracedLayer should be changed to paddle.jit.TracedLayer. However, since it is not so important and this PR needs many approvals, I will approve it and you can change it in the next PR.

Contributor (author) replied:

    Thank you for your suggestion; I will change these two in the next PR.

                fc_layer, inputs=[input_var])
            self.assertFalse(os.path.exists(os.path.dirname(self.save_path)))
            traced_layer.save_inference_model(self.save_path)
            self.assertTrue(os.path.exists(os.path.dirname(self.save_path)))


if __name__ == '__main__':
    unittest.main()