
Fix elementwise_add quantization#34820

Merged
jczaja merged 3 commits into PaddlePaddle:develop from wozna:fix_elementwise_add_quant
Aug 16, 2021

Conversation


@wozna wozna commented Aug 11, 2021

PR types

Bug fixes

PR changes

OPs

Describe

This PR:

  • removes relu from the list of quantizable operators, because relu has no quantization support so far,
  • modifies the quantization process for elementwise_add: if there is no scale for the output tensor, the operator is not quantized. Previously, when the output scale was missing, the force_fp32_output attribute was set; unfortunately, this was a bug, because elementwise_add has no such attribute or functionality.

This PR should fix issue #34574.

@paddle-bot-old

Thanks for your contribution!
Please wait for the result of CI first. See the Paddle CI Manual for details.

jczaja previously approved these changes Aug 11, 2021
Contributor

@jczaja jczaja left a comment


LGTM

Comment on lines +773 to +774
if (!AreScalesPresentForNodes({elementwise_add_x, elementwise_add_y}) ||
!AreScalesPresentForNodes({elementwise_add_out})) {


Consider calling this method only once with all three nodes in arguments.

Comment on lines +134 to +140
// all quantized operators, except relu
MainTest({}, {}, 5);
}

TEST(QuantizerPlacementPass, default_attr_value) {
// all operators quantized
DefaultAttrTest(6);
// all quantized operators, except relu
DefaultAttrTest(5);


I think that reversing the order of the words "quantized" and "operators" might be confusing.
You could consider making it a full sentence, for example:
All operators except relu should be quantized


@sfraczek sfraczek left a comment


LGTM

Contributor

@lidanqing-vv lidanqing-vv left a comment


LGTM

@jczaja jczaja merged commit ae80df9 into PaddlePaddle:develop Aug 16, 2021
@wozna wozna deleted the fix_elementwise_add_quant branch February 24, 2023 16:07


4 participants