Conversation

@zhaoyinglia (Contributor) commented Sep 20, 2022

PR types

Others

PR changes

Others

Describe

  • [Auto Parallel] Change the import way of Auto Parallel #46115
  • [Auto Parallel] fix strategy #46256
  • [Auto Parallel] performance improvement for Sharding-DP hybrid parallelism #46180

@paddle-bot (bot) commented Sep 20, 2022

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

…elism (PaddlePaddle#46180)

* remove unneeded grad allreduce communication under sharding-dp

* remove unneeded grad allreduce communication under sharding-dp

* bugfix

* bugfix

* bugfix
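The commit messages above describe pruning gradient allreduce communication under Sharding-DP hybrid parallelism. The idea, sketched below, is that a gradient already reduced within the sharding group does not need a second data-parallel allreduce, so that communication op can be dropped. This is a minimal illustrative sketch, not PaddlePaddle's actual implementation; the function and variable names (`prune_dp_allreduce`, `sharded_params`) are hypothetical.

```python
def prune_dp_allreduce(grad_ops, sharded_params):
    """Keep a data-parallel allreduce only for gradients that the
    sharding group has not already reduced.

    grad_ops: list of (param_name, op_name) pairs representing the
        DP allreduce ops in the program (hypothetical representation).
    sharded_params: set of parameter names whose gradients are
        already reduced inside the sharding group.
    """
    kept = []
    for param, op in grad_ops:
        if param in sharded_params:
            # Sharding already reduced this gradient; the extra
            # DP allreduce is redundant communication, so skip it.
            continue
        kept.append((param, op))
    return kept


# Example: w1 and b1 are handled by sharding, so only w2 keeps
# its data-parallel allreduce.
ops = [("w1", "c_allreduce_sum"), ("w2", "c_allreduce_sum"), ("b1", "c_allreduce_sum")]
pruned = prune_dp_allreduce(ops, sharded_params={"w1", "b1"})
```

In the example, two of the three allreduce ops are removed, which is the kind of communication saving the PR title refers to.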
@XiaoguangHu01 (Contributor) left a comment

LGTM

@fuyinno4 fuyinno4 merged commit c43ebfc into PaddlePaddle:release/2.4 Sep 20, 2022
@zhaoyinglia zhaoyinglia deleted the 2.4/auto_parallel branch August 30, 2023 06:28
5 participants