
Log current lr in StandardModel #651

Merged
RasmusOrsoe merged 2 commits into graphnet-team:main from RasmusOrsoe:display_current_learning_rate on Feb 6, 2024

Conversation

@RasmusOrsoe (Collaborator)

This PR changes the following in `graphnet.models.StandardModel`:

  • Adds the global optimizer learning rate as a logged quantity, which makes it appear both in wandb and in the ProgressBar.
  • Sets on_step=True for the logging of training loss.

This change makes it easier to monitor training progress from the progress bar alone.
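
A minimal sketch of the logging pattern in a toy PyTorch Lightning module (illustrative only; the class, layer, and exact `self.log` arguments below are assumptions, not the actual `StandardModel` code):

```python
import torch
from torch import nn
import pytorch_lightning as pl


class ToyModel(pl.LightningModule):
    """Illustrative stand-in for graphnet.models.StandardModel."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        # on_step=True logs the loss at every training step, so it updates
        # continuously in wandb and in the progress bar.
        self.log("train_loss", loss, on_step=True, prog_bar=True)
        # Read the current learning rate directly from the optimizer's
        # param groups and log it alongside the loss.
        lr = self.optimizers().param_groups[0]["lr"]
        self.log("lr", lr, prog_bar=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```

Logging the value with `prog_bar=True` is what makes it appear next to the loss in the progress bar.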

@ArturoLlorente (Collaborator) left a comment:

Looks nice!

I also explored the possibility of using `self.trainer.lr_scheduler_configs[0].scheduler.get_last_lr()`, which automatically gives the LR of all optimizers, but I think this solution is better since it will work even if there is no scheduler defined.
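
For context, the two approaches can be contrasted roughly as follows (a sketch; it assumes the PR reads the LR from the optimizer's param groups, which the review comment implies but does not quote verbatim):

```python
# Inside a LightningModule method such as training_step:

# Scheduler-based: get_last_lr() returns the last LR of every param
# group, but trainer.lr_scheduler_configs is empty when no scheduler
# is configured, so indexing [0] would raise an IndexError there.
lrs = self.trainer.lr_scheduler_configs[0].scheduler.get_last_lr()

# Optimizer-based: reads the LR straight from the optimizer, so it
# works whether or not a scheduler is defined.
lr = self.optimizers().param_groups[0]["lr"]
```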

@RasmusOrsoe merged commit 5c93a7a into graphnet-team:main on Feb 6, 2024
carlosm-silva pushed a commit to carlosm-silva/graphnet that referenced this pull request on Jul 7, 2025:

…_learning_rate
Log current lr in `StandardModel`