Implement step based checkpointing #2384
> Dr. CI (automated): See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/2384. Links to docs will display an error until the docs builds have completed. There is 1 currently active SEV. As of commit 650d91d with merge base 3d73591: 2 new failures, 2 unrelated (broken-trunk) failures. Rebase onto the `viable/strict` branch to avoid the broken-trunk failures.
…d get resume working w/ StatefulDataLoader
Co-authored-by: Felipe Mello <[email protected]>
Co-authored-by: ebsmothers <[email protected]>
Co-authored-by: salman <[email protected]>
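The commit above mentions getting resume working with `StatefulDataLoader`. Here is a minimal sketch of the mid-epoch resume pattern, using a toy loader whose `state_dict()`/`load_state_dict()` hooks mirror torchdata's `StatefulDataLoader` API; the `ResumableLoader` class itself is hypothetical, not torchtune code:

```python
# Hypothetical sketch: a dataloader that can export/restore its position,
# mimicking the state_dict()/load_state_dict() pattern of torchdata's
# StatefulDataLoader. Real loaders also track shuffle/RNG state.

class ResumableLoader:
    def __init__(self, data):
        self.data = list(data)
        self._idx = 0

    def __iter__(self):
        while self._idx < len(self.data):
            item = self.data[self._idx]
            self._idx += 1
            yield item

    def state_dict(self):
        # Persist only the cursor for this toy example.
        return {"idx": self._idx}

    def load_state_dict(self, state):
        self._idx = state["idx"]


loader = ResumableLoader(range(5))
it = iter(loader)
seen = [next(it), next(it)]          # consume two batches, then "checkpoint"
ckpt = loader.state_dict()

resumed = ResumableLoader(range(5))  # fresh process after restart
resumed.load_state_dict(ckpt)
rest = list(resumed)                 # picks up exactly where we left off
print(seen, rest)                    # [0, 1] [2, 3, 4]
```

Saving this loader state alongside the model checkpoint is what allows a mid-epoch resume to avoid replaying already-seen batches.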
```python
self.save_checkpoint(
    epoch=curr_epoch, step=self.global_step, full_tensors=False
)
```
The `step` variable is missing in the distributed recipe.
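For illustration only (not the actual recipe code): a toy sketch of how a recipe might maintain the `global_step` counter that the snippet above passes to `save_checkpoint`. The `Recipe` class and `save_every_n_steps` knob here are assumptions; as the review comment notes, a distributed recipe would need the same counter wired up.

```python
# Illustrative sketch of step-based checkpoint triggering. The Recipe class
# and save_every_n_steps parameter are hypothetical stand-ins for the real
# recipe; save_checkpoint here just records when it was called.

class Recipe:
    def __init__(self, save_every_n_steps=2):  # hypothetical knob
        self.global_step = 0
        self.save_every_n_steps = save_every_n_steps
        self.saved_at = []

    def save_checkpoint(self, *, epoch, step, full_tensors):
        self.saved_at.append(step)             # stand-in for real I/O

    def train(self, num_steps, epoch=0):
        for _ in range(num_steps):
            self.global_step += 1              # one optimizer step
            if self.global_step % self.save_every_n_steps == 0:
                self.save_checkpoint(
                    epoch=epoch, step=self.global_step, full_tensors=False
                )

r = Recipe()
r.train(5)
print(r.saved_at)  # [2, 4]
```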
Rebased to #2869.
Context
What is the purpose of this PR?
Closes #2105. This is a widely requested feature that allows users to have greater control over checkpointing frequency in torchtune.
TODO: Add commentary on design decisions. Acknowledge spaghetti code. Beg forgiveness.
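One way the step-based checkpoint directory naming could look (illustrative sketch; the `step_{}` name and `checkpoint_dir` helper are assumptions for this example, while the `epoch_{}` fallback is the existing scheme):

```python
# Hypothetical sketch of checkpoint folder naming: use a step-based
# directory when a step is provided, and fall back to the existing
# epoch_{} naming for backward compatibility.

from pathlib import Path
from typing import Optional

def checkpoint_dir(output_dir: str, *, epoch: int, step: Optional[int] = None) -> Path:
    if step is not None:
        return Path(output_dir) / f"step_{step}"
    # BC fallback: the pre-existing epoch-based folder name.
    return Path(output_dir) / f"epoch_{epoch}"

print(checkpoint_dir("/tmp/out", epoch=1, step=500).name)  # step_500
print(checkpoint_dir("/tmp/out", epoch=1).name)            # epoch_1
```

Keeping the epoch-based name as the default means existing configs and resume logic keep working unchanged when no step is passed.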
Changelog
- Update `FullModelHFCheckpointer` to accept a `step` parameter when saving a checkpoint. Use that step to designate the checkpoint folder name. Keep `epoch_{}` as a fallback for BC.
- Update the `full_finetune_single_device.py` recipe to utilize step-based checkpointing.

Test plan
Please make sure to do each of the following if applicable to your PR. If you're unsure about any one of these just ask and we will happily help. We also have a contributing page for some guidance on contributing.
- [ ] run pre-commit hooks and linters (`pre-commit install`)
- [ ] run unit tests via `pytest tests`
- [ ] run recipe tests via `pytest tests -m integration_test`

Evidence of correct number of checkpoints being saved
Evidence of correct resuming from ckpt mid-epoch

Evidence of correct resuming from ckpt at epoch boundary

UX
If your function changed a public API, please add a dummy example of what the user experience will look like when calling it.
Here is a docstring example
and a tutorial example