
Fix Using "isinstance" in Loop, test=develop #28641

Merged
zhhsplendid merged 5 commits into PaddlePaddle:develop from zhhsplendid:ppgan on Nov 17, 2020

Conversation

@zhhsplendid (Member) commented Nov 16, 2020

PR types

Bug fixes

PR changes

APIs

Describe

Fix a bug hit by a PaddleGAN model that uses isinstance inside a for loop (the code is located at https://github.com/PaddlePaddle/PaddleGAN/blob/32c59d5539de9182b8307d41ba7aad7de76f1c28/ppgan/models/generators/deoldify.py#L35).

The cause of the bug is that the type argument of isinstance is treated as a variable in loop_vars. For example, before this fix, if you print(model.forward.code) for the test_sequential_layer unit test of this PR, you get code like the following. Note that AddAttrLayer is a type, but the generated code treats it as a variable.

def forward(self, x):
    res = x
    AddAttrLayer = paddle.jit.dy2static.data_layer_not_check(name=
        'AddAttrLayer', shape=[-1], dtype='float32')
    __for_loop_var_index_0 = 0
    __for_loop_var_len_0 = paddle.jit.dy2static.convert_len(self.layers)

    def for_loop_condition_0(AddAttrLayer, __for_loop_var_index_0, res, x):
        return __for_loop_var_index_0 + 1 <= __for_loop_var_len_0

    def for_loop_body_0(AddAttrLayer, __for_loop_var_index_0, res, x):

        def true_fn_1(__for_loop_var_index_0, self, x):
            self.layers[__for_loop_var_index_0].attr = x
            return

        def false_fn_1():
            return
        paddle.jit.dy2static.convert_ifelse(isinstance(self.layers[
            __for_loop_var_index_0], AddAttrLayer), true_fn_1, false_fn_1,
            (__for_loop_var_index_0, self, x), (), ())
        res = paddle.jit.dy2static.convert_call(self.layers[
            __for_loop_var_index_0])(res)
        __for_loop_var_index_0 += 1
        return AddAttrLayer, __for_loop_var_index_0, res, x
    [AddAttrLayer, __for_loop_var_index_0, res, x
        ] = paddle.jit.dy2static.convert_while_loop(for_loop_condition_0,
        for_loop_body_0, [AddAttrLayer, __for_loop_var_index_0, res, x])
    return res
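The failure mode above can be reproduced without Paddle at all: rebinding a class name to a tensor-like value (as the generated data_layer_not_check assignment does) makes the later isinstance call receive a non-type second argument. The sketch below is illustrative; AddAttrLayer and FakeTensorPlaceholder are stand-ins, not Paddle classes.

```python
class AddAttrLayer:
    pass

class FakeTensorPlaceholder:
    """Stand-in for the variable created by data_layer_not_check."""

layers = [AddAttrLayer()]

# Before the rebinding, the class name still refers to the class.
assert isinstance(layers[0], AddAttrLayer)

# The buggy translation rebinds the class name to a tensor-like
# placeholder, so isinstance gets a non-class second argument and
# raises TypeError at runtime.
AddAttrLayer = FakeTensorPlaceholder()
try:
    isinstance(layers[0], AddAttrLayer)
    raised = False
except TypeError:
    raised = True
assert raised  # isinstance() arg 2 must be a type or tuple of types
```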

Solution: I record the names used as type arguments of isinstance, then remove those type names from loop_vars. The translated code for the above example is now:

def forward(self, x):
    res = x
    __for_loop_var_index_0 = 0
    __for_loop_var_len_0 = paddle.jit.dy2static.convert_len(self.layers)

    def for_loop_condition_0(res, __for_loop_var_index_0, x):
        return __for_loop_var_index_0 + 1 <= __for_loop_var_len_0

    def for_loop_body_0(res, __for_loop_var_index_0, x):

        def true_fn_1(__for_loop_var_index_0, self, x):
            self.layers[__for_loop_var_index_0].attr = x
            return

        def false_fn_1():
            return
        paddle.jit.dy2static.convert_ifelse(isinstance(self.layers[
            __for_loop_var_index_0], AddAttrLayer), true_fn_1, false_fn_1,
            (__for_loop_var_index_0, self, x), (), ())
        res = paddle.jit.dy2static.convert_call(self.layers[
            __for_loop_var_index_0])(res)
        __for_loop_var_index_0 += 1
        return res, __for_loop_var_index_0, x
    [res, __for_loop_var_index_0, x] = paddle.jit.dy2static.convert_while_loop(
        for_loop_condition_0, for_loop_body_0, [res, __for_loop_var_index_0, x]
        )
    return res
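The idea of the fix can be sketched as an AST pass: collect every name that appears as the second argument of an isinstance call, then exclude those names from the candidate loop variables. The helper names below are illustrative and not Paddle's actual implementation.

```python
import ast
import textwrap

def collect_isinstance_type_names(source):
    """Return names used as the type argument of isinstance() calls."""
    tree = ast.parse(textwrap.dedent(source))
    type_names = set()
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "isinstance"
                and len(node.args) == 2
                and isinstance(node.args[1], ast.Name)):
            type_names.add(node.args[1].id)
    return type_names

src = """
for layer in layers:
    if isinstance(layer, AddAttrLayer):
        layer.attr = x
    res = layer(res)
"""
type_names = collect_isinstance_type_names(src)

# Drop type names from the candidate loop variables, as this PR does.
candidate_loop_vars = {"AddAttrLayer", "layer", "res", "x"}
loop_vars = candidate_loop_vars - type_names
```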

@paddle-bot-old

Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.

@liym27 (Contributor) left a comment

LGTM

@Aurelius84 (Contributor) left a comment

LGTM

@zhhsplendid zhhsplendid merged commit b6f86b8 into PaddlePaddle:develop Nov 17, 2020
