
Conversation

@wozna (Contributor) commented Nov 13, 2020

PR types

Bug fixes

PR changes

OPs

Describe

This PR:

  • adds an op version checkpoint to the quantize op, where the new attribute bfloat16 was added;
  • registers pass compatibility for the cpu_bfloat16_pass, which is related to the change in the quantize op;
  • also changes InferencePassTest to allow applying the cpu_bfloat16_pass to the graph.
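To illustrate the relationship between an op version checkpoint and pass compatibility described above, here is a minimal pure-Python stand-in. The real mechanism is Paddle's C++ op-version registry (checkpoint and pass-capability registration macros); all function and table names below are hypothetical, chosen only to sketch the idea that a pass may only run when the ops it rewrites are at a compatible version.

```python
# Hypothetical stand-in for Paddle's op-version / pass-compatibility
# machinery; the real registry is implemented in C++.

OP_VERSIONS = {}        # op name -> current version (0 = original definition)
PASS_REQUIREMENTS = {}  # pass name -> {op name: minimum required version}

def add_checkpoint(op, note):
    """Bump an op's version when its definition changes (e.g. a new attribute)."""
    OP_VERSIONS[op] = OP_VERSIONS.get(op, 0) + 1

def register_pass_capability(pass_name, requirements):
    """Record which op versions a graph pass is compatible with."""
    PASS_REQUIREMENTS[pass_name] = requirements

def pass_is_applicable(pass_name):
    """A pass may run only if every op it depends on meets its minimum version."""
    return all(OP_VERSIONS.get(op, 0) >= min_ver
               for op, min_ver in PASS_REQUIREMENTS[pass_name].items())

# The quantize op gains the new bfloat16 attribute -> version checkpoint.
add_checkpoint("quantize", "add new attribute [bfloat16]")
# cpu_bfloat16_pass requires the bfloat16-aware quantize op (version >= 1).
register_pass_capability("cpu_bfloat16_pass", {"quantize": 1})

print(pass_is_applicable("cpu_bfloat16_pass"))  # True
```

Without the checkpoint, an old model containing the pre-bfloat16 quantize op could be rewritten by a pass that assumes the attribute exists; the version gate prevents that mismatch.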

@paddle-bot-old

Thanks for your contribution!
Please wait for the CI result first. See the Paddle CI Manual for details.

def init_data(self):
    self.bs = 8
    self.d_type = np.float32
    self.shape_x = [12, 1, 1]
Contributor


Some time ago, didn't Baidu require the test dimension to be over 100? Maybe self.shape_x = [12, 1, 10].

Contributor Author


OK, I changed it.

    self.enable_mkldnn = True

def test_check_output(self):
    self.enable_mkldnn = True
Contributor


There are two self.enable_mkldnn = True lines; the one in init_data(self) is enough.

Does the test show that with self.check_output_with_option(use_gpu, flatten=True, bfloat16=False) it will automatically run in float32? I thought the settings should be done in init, before checking the output, e.g. self.enable_mkldnn_bfloat = True. But both are fine.

Contributor Author


Yes, I agree that just adding self.enable_mkldnn_bfloat = True would be a better solution.
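Taken together, the suggestions above (larger shape, a single enable_mkldnn assignment, and a dedicated bfloat16 flag set in init) amount to something like the following standalone sketch. The real test subclasses Paddle's InferencePassTest, which is not reproduced here; this stand-in only mirrors the settings pattern, and the flag name enable_mkldnn_bfloat16 is hypothetical, following the reviewer's enable_mkldnn_bfloat suggestion.

```python
import numpy as np

# Standalone stand-in for the test configuration discussed in the review;
# the real class derives from Paddle's InferencePassTest.
class Bf16TestConfig:
    def __init__(self):
        self.init_data()

    def init_data(self):
        self.bs = 8
        self.d_type = np.float32
        self.shape_x = [12, 1, 10]          # enlarged per the review comment
        self.enable_mkldnn = True           # set once here, not again in the test
        self.enable_mkldnn_bfloat16 = True  # hypothetical flag, per the suggestion

cfg = Bf16TestConfig()
print(cfg.enable_mkldnn and cfg.enable_mkldnn_bfloat16)  # True
```

Keeping all switches in init_data means test_check_output only has to call the output check, with no state mutated along the way.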

@lidanqing-vv (Contributor)

LGTM

@luotao1 luotao1 merged commit 2cb71c0 into PaddlePaddle:develop Nov 16, 2020
@wozna wozna deleted the quantize_op branch February 24, 2023 16:04
