[XPU] Support several ops on precision of fp16. #10025
Merged
zhupengyang merged 1 commit into PaddlePaddle:develop on Feb 28, 2023
Conversation
Thanks for your contribution!
zhupengyang reviewed on Feb 23, 2023
Force-pushed from 3393166 to ef102db
zhupengyang reviewed on Feb 24, 2023
Force-pushed from ef102db to bf20323
Force-pushed from 04890ed to ec4217d
Contributor (Author)
Test cases for the new operators have been added, and the compile error caused by the undefined float16 type has been fixed. @zhupengyang
Force-pushed from ec4217d to e536ad9
Force-pushed from e536ad9 to 20f88c2
zhupengyang approved these changes on Feb 28, 2023
Comment on lines +116 to +119
using xpu_calib_fp32_to_fp16_kfp16 =
    paddle::lite::kernels::xpu::CalibCompute<float, float16, PRECISION(kFP16)>;
using xpu_calib_fp16_to_fp32_kfp16 =
    paddle::lite::kernels::xpu::CalibCompute<float16, float, PRECISION(kFP16)>;
Collaborator
Please rename these:
xpu_calib_fp32_to_fp16
xpu_calib_fp16_to_fp32
Contributor (Author)
Let's change this after the models on the business side have been verified.
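For context, the CalibCompute kernels discussed above cast tensor data between fp32 and fp16. The sketch below is a standalone, bit-level illustration of such a cast (truncating rounding, denormals flushed to zero, no NaN handling); it is NOT Paddle Lite's CalibCompute, only a minimal model of the conversion and the precision loss it introduces:

```cpp
#include <cmath>
#include <cstdint>
#include <cstring>

// Illustrative fp32 -> fp16 conversion: truncating, denormals flushed,
// overflow saturates to infinity. Not Paddle Lite's kernel.
inline uint16_t fp32_to_fp16(float f) {
  uint32_t x;
  std::memcpy(&x, &f, sizeof(x));
  uint32_t sign = (x >> 16) & 0x8000u;
  int32_t exp = static_cast<int32_t>((x >> 23) & 0xFFu) - 127 + 15;
  uint32_t mant = x & 0x7FFFFFu;
  if (exp <= 0) return static_cast<uint16_t>(sign);             // flush to zero
  if (exp >= 31) return static_cast<uint16_t>(sign | 0x7C00u);  // -> infinity
  return static_cast<uint16_t>(sign | (static_cast<uint32_t>(exp) << 10) |
                               (mant >> 13));  // drop low 13 mantissa bits
}

// Illustrative fp16 -> fp32 conversion (inverse of the above for normals).
inline float fp16_to_fp32(uint16_t h) {
  uint32_t sign = (static_cast<uint32_t>(h) & 0x8000u) << 16;
  uint32_t exp = (h >> 10) & 0x1Fu;
  uint32_t mant = h & 0x3FFu;
  uint32_t bits = sign;  // exp == 0 means zero here (denormals were flushed)
  if (exp != 0) bits = sign | ((exp - 15 + 127) << 23) | (mant << 13);
  float f;
  std::memcpy(&f, &bits, sizeof(f));
  return f;
}
```

Values exactly representable in fp16 (such as 1.0 or -0.5) round-trip bit-for-bit; others lose the low 13 mantissa bits, which is why fp16 tests need looser tolerances.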
Comment on lines +695 to +698
#elif defined(LITE_WITH_XPU)
  place = TARGET(kXPU);
  alias = "silu_fp32";
  abs_error = 2e-4;
Collaborator
Move this to the line just before the LITE_WITH_ARM branch.
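The `abs_error = 2e-4` above loosens the per-element comparison threshold for the XPU path. A hypothetical sketch of that style of check for silu (`silu(x) = x * sigmoid(x)`), comparing a float result against a double reference; the helper names are illustrative, not from the PR:

```cpp
#include <cmath>

// silu(x) = x * sigmoid(x), computed in single precision.
inline float silu_f32(float x) { return x / (1.0f + std::exp(-x)); }

// Double-precision reference, as an activation test would use.
inline double silu_ref(double x) { return x / (1.0 + std::exp(-x)); }

// True when the computed value lies within abs_error of the reference,
// mirroring the tolerance-based element check in the test harness.
inline bool close_enough(float got, double ref, double abs_error) {
  return std::fabs(static_cast<double>(got) - ref) <= abs_error;
}
```

An fp16 kernel would typically need an even larger `abs_error` than 2e-4, since fp16 carries only ~3 significant decimal digits.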
Comment on lines +80 to +82
#elif defined(LITE_WITH_XPU)
  Place place(TARGET(kXPU), PRECISION(kFloat));
  test_sin(place);
Comment on lines +84 to +86
#elif defined(LITE_WITH_XPU)
  Place place(TARGET(kXPU), PRECISION(kFloat));
  test_sin(place);
PR devices: XPU
PR types: New features
PR changes: OP
Description
Add silu/sin/cos/slice ops for fp16 precision on XPU backend.
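For these fp16 kernels to be picked at runtime, kFP16 must be listed as a valid place for the XPU target. A deployment sketch based on Paddle Lite's public CxxConfig API (the function name is hypothetical, and this fragment is not part of this PR):

```cpp
#include <string>
#include "paddle_api.h"  // paddle::lite_api (Paddle Lite public header)

// Hypothetical helper: build a config whose valid places allow the XPU
// fp16 kernels, falling back to fp32 on XPU and the host.
paddle::lite_api::CxxConfig MakeXpuFp16Config(const std::string& model_dir) {
  paddle::lite_api::CxxConfig config;
  config.set_model_dir(model_dir);
  config.set_valid_places({
      paddle::lite_api::Place{TARGET(kXPU), PRECISION(kFP16)},
      paddle::lite_api::Place{TARGET(kXPU), PRECISION(kFloat)},
      paddle::lite_api::Place{TARGET(kHost), PRECISION(kFloat)},
  });
  return config;
}
```

Listing kFP16 first expresses a preference for the fp16 kernels where they exist; ops without an fp16 implementation fall back to the later places.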