
Support Sharding Overlap #8473

Merged
wawltor merged 3 commits into PaddlePaddle:develop from iosmers:sharding_overlap on May 24, 2024
Conversation

@iosmers (Contributor) commented May 21, 2024

PR types

Performance optimization

PR changes

Models

Description

1. Support sharding overlap

paddle-bot (Bot) commented May 21, 2024

Thanks for your contribution!

is_casual = self.config.casual_mask

if self.config.use_flash_attention and get_env_device() != "gcu":
    is_casual = is_casual_mask(attention_mask)
Contributor

Don't delete this: if hasattr(self.config, "casual_mask")
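The backward-compatible guard the reviewer asks to keep can be sketched as follows. `Config` and `resolve_is_casual` are illustrative stand-ins, not PaddleNLP APIs; only the `hasattr(config, "casual_mask")` pattern comes from the review comment.

```python
class Config:
    """Minimal stand-in for a model config; older configs may lack `casual_mask`."""
    pass

def resolve_is_casual(config):
    # Fall back to False when the attribute is missing on older configs,
    # instead of raising AttributeError.
    if hasattr(config, "casual_mask"):
        return config.casual_mask
    return False

old_config = Config()            # no casual_mask attribute
new_config = Config()
new_config.casual_mask = True

print(resolve_is_casual(old_config))  # False
print(resolve_is_casual(new_config))  # True
```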

Comment thread paddlenlp/trainer/trainer.py Outdated
self.optimizer = fleet.distributed_optimizer(self.optimizer)

if in_sharding_parallel_mode:
    sharding_parallel_config = set(self.args.sharding_parallel_config.split(" "))
Contributor

Handle this in the training_args file instead; don't do the split here.
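The reviewer's suggestion of normalizing the space-separated option string once, in the args layer, might look like the sketch below. The `TrainingArgs` class and `__post_init__` hook are illustrative, not the real PaddleNLP `TrainingArguments`; only the `split()`-into-a-set idea comes from the quoted trainer code.

```python
from dataclasses import dataclass

@dataclass
class TrainingArgs:
    # Raw CLI value, e.g. "split_param enable_stage1_overlap".
    sharding_parallel_config: str = ""

    def __post_init__(self):
        # Split once here; downstream code (the trainer) only checks membership.
        self.sharding_parallel_config = set(self.sharding_parallel_config.split())

args = TrainingArgs("split_param enable_stage1_overlap")
print("split_param" in args.sharding_parallel_config)  # True
```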

Comment thread llm/run_pretrain.py Outdated
default=None,
metadata={"help": "num_hidden_layers."},
)
casual_mask: Optional[bool] = field(
Contributor

Suggested change:
- casual_mask: Optional[bool] = field(
+ use_casual_mask: Optional[bool] = field(
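A minimal sketch of the renamed field in context, assuming the surrounding dataclass shape shown in the quoted run_pretrain.py snippet; the default and help text here are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ModelArguments:
    num_hidden_layers: Optional[int] = field(
        default=None, metadata={"help": "num_hidden_layers."}
    )
    # Renamed per the review suggestion so the name reads as a boolean flag.
    use_casual_mask: Optional[bool] = field(
        default=False, metadata={"help": "Whether to use a causal mask."}
    )

args = ModelArguments(use_casual_mask=True)
print(args.use_casual_mask)  # True
```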

codecov (Bot) commented May 21, 2024

Codecov Report

Attention: Patch coverage is 44.44444% with 10 lines in your changes missing coverage. Please review.

Project coverage is 54.25%. Comparing base (87e4c4f) to head (594a050).
Report is 1077 commits behind head on develop.

Files with missing lines Patch % Lines
paddlenlp/trainer/trainer.py 0.00% 6 Missing ⚠️
paddlenlp/transformers/llama/modeling.py 66.66% 4 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #8473      +/-   ##
===========================================
- Coverage    54.29%   54.25%   -0.05%     
===========================================
  Files          617      617              
  Lines        96339    96368      +29     
===========================================
- Hits         52312    52288      -24     
- Misses       44027    44080      +53     

☔ View full report in Codecov by Sentry.

ZHUI
ZHUI previously approved these changes May 22, 2024
Comment thread paddlenlp/trainer/trainer.py Outdated

if in_sharding_parallel_mode:
    if "split_param" in self.args.sharding_parallel_config:
        self.optimizer._set_all_gather_overlap_forward(True, model)
Contributor

Does this interface need to account for version compatibility?
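One common way to address the version-compatibility question is to probe for the hook before calling it, so older framework builds that lack it keep working. `_set_all_gather_overlap_forward` is the method named in the PR; the two optimizer classes below are stand-ins for old and new framework versions, not real fleet optimizers.

```python
class LegacyOptimizer:
    """Stand-in for an optimizer from an older framework version (no hook)."""
    pass

class NewOptimizer:
    """Stand-in for an optimizer that provides the overlap hook."""
    def __init__(self):
        self.overlap = False

    def _set_all_gather_overlap_forward(self, flag, model):
        self.overlap = flag

def enable_overlap_if_supported(optimizer, model=None):
    # getattr-probe keeps older versions working unchanged: only call the
    # hook when the installed optimizer actually provides it.
    hook = getattr(optimizer, "_set_all_gather_overlap_forward", None)
    if callable(hook):
        hook(True, model)
        return True
    return False

print(enable_overlap_if_supported(NewOptimizer()))     # True
print(enable_overlap_if_supported(LegacyOptimizer()))  # False
```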

Contributor

@ZHUI ZHUI left a comment

LGTM

CLAassistant commented May 23, 2024

CLA assistant check
All committers have signed the CLA.

sneaxiy
sneaxiy previously approved these changes May 23, 2024
@iosmers iosmers force-pushed the sharding_overlap branch from f63d157 to 06019c4 Compare May 23, 2024 05:04
@iosmers iosmers force-pushed the sharding_overlap branch from 06019c4 to 92b106f Compare May 23, 2024 05:06
ZHUI
ZHUI previously approved these changes May 23, 2024
Contributor

@ZHUI ZHUI left a comment

LGTM

Contributor

@wawltor wawltor left a comment

LGTM

@wawltor wawltor merged commit 7aaa788 into PaddlePaddle:develop May 24, 2024
SylarTiaNII added a commit to SylarTiaNII/PaddleNLP that referenced this pull request May 24, 2024
wawltor pushed a commit that referenced this pull request May 24, 2024
5 participants