Conversation
Fix learning rate jump when transitioning from warmup to decay phase in the anneal scheduler. Previously, decay started at 1.0x base_lr while warmup ended at peak_lr_factor, causing a sudden LR increase. Change decay scheduler's start_factor from 1.0 to peak_lr_factor, ensuring continuity. Both start_factor and end_factor are now relative to base_lr, eliminating the discontinuity.
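The continuity fix described above can be sketched as a plain schedule function. This is a minimal illustration only — the function and parameter names (`lr_factor`, `warmup_steps`, `decay_steps`) are hypothetical, not the project's actual scheduler API; it assumes a linear warmup followed by a linear decay:

```python
def lr_factor(step, warmup_steps, decay_steps, peak_lr_factor, end_factor=0.0):
    """Multiplicative factor applied to base_lr at a given step.

    Warmup ramps linearly from 0 up to peak_lr_factor. Decay then starts
    at peak_lr_factor (not 1.0) and ramps linearly down to end_factor,
    so the schedule is continuous at the warmup-to-decay boundary.
    Both factors are relative to base_lr.
    """
    if step < warmup_steps:
        return peak_lr_factor * step / warmup_steps
    # Fraction of the decay phase completed, clamped to [0, 1].
    t = min((step - warmup_steps) / decay_steps, 1.0)
    return peak_lr_factor + t * (end_factor - peak_lr_factor)

# With the old behavior (decay starting at a factor of 1.0), a run with
# peak_lr_factor = 2.0 would drop abruptly from 2.0x to 1.0x base_lr at
# the boundary; here the factor at step == warmup_steps is exactly
# peak_lr_factor, eliminating the jump.
```

Multiplying `base_lr` by this factor each step yields a piecewise-linear schedule with no discontinuity at the phase transition.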
Codecov Report: ✅ All modified and coverable lines are covered by tests. ❌ Your project status has failed because the head coverage (57.74%) is below the target coverage (85.00%). You can increase the head coverage or adjust the target coverage.

@@ Coverage Diff @@
## main #676 +/- ##
=======================================
Coverage 57.74% 57.74%
=======================================
Files 27 27
Lines 4977 4977
=======================================
Hits 2874 2874
Misses 2103 2103