
Conversation

@daniel347x (Contributor):

Identical to PR #45, but using a dedicated branch from my forked repository so I can continue making other changes in the fork.

This commit addresses an intermittent but fatal crash bug that has been destroying my training runs: a very occasional infinite gradient in the 'backward' function.

In this commit, functionality remains unchanged by default.

However, an optional flag has been added that allows clamping the gradient in the 'backward' function. The flag is either an int giving the max value, or a sequence giving the min and max values.

An optional third value in the passed sequence is interpreted as a Boolean that indicates whether to print a warning to the console whenever an infinite gradient is clamped. The default is False.

Support for PyTorch only.
…tions

Also, add dummy argument in 'backward' to match new backward_clamp_gradient_mag argument
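
For illustration only, here is a minimal sketch of how the clamping described above could work, reusing the backward_clamp_gradient_mag name mentioned in the commit note. The helper function, its name, and the symmetric interpretation of a single max value are assumptions, not taken from the actual patch:

```python
import torch

def _parse_clamp_spec(spec):
    # Hypothetical helper: interpret the flag as described in the PR text.
    # A single number gives the max value; a sequence gives (min, max),
    # with an optional third entry treated as a Boolean that enables a
    # console warning when an infinite gradient is clamped (default False).
    if isinstance(spec, (int, float)):
        # Assumption: a lone max value implies a symmetric [-max, max] range.
        return -float(spec), float(spec), False
    lo, hi = float(spec[0]), float(spec[1])
    warn = bool(spec[2]) if len(spec) > 2 else False
    return lo, hi, warn

def clamp_gradient(grad, backward_clamp_gradient_mag=None):
    # No-op by default, preserving existing behavior.
    if backward_clamp_gradient_mag is None:
        return grad
    lo, hi, warn = _parse_clamp_spec(backward_clamp_gradient_mag)
    if warn and not torch.isfinite(grad).all():
        print("Warning: non-finite gradient encountered in backward; clamping.")
    # Replace inf/nan entries, then clamp the remaining values into range.
    grad = torch.nan_to_num(grad, nan=0.0, posinf=hi, neginf=lo)
    return grad.clamp(lo, hi)
```

In this sketch, each gradient tensor returned from 'backward' would be passed through clamp_gradient(...) just before being returned, so an occasional infinite gradient is bounded instead of crashing the training run.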
@daniel347x changed the title from "Pydiffvg" to "Support optional flag to clamp gradient in 'backward' to prevent crash" on Nov 2, 2022
@BachiLi (Owner) commented on Jan 9, 2023:

Let me ponder a bit on what the best way to do this is. If I can't figure out a better/easier way, I'll merge this. : >
