set stop_gradient sign to all loss layers #15955

Closed
velconia wants to merge 2 commits into PaddlePaddle:develop from velconia:add_stop_grad_to_loss_op

Conversation

@velconia (Collaborator)

test=develop


helper = LayerHelper('bpr_loss', **locals())
out = helper.create_variable_for_type_inference(dtype=input.dtype)
# Added by this PR: exclude the label from gradient computation.
label.stop_gradient = True
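
For context, a minimal call sketch, assuming the fluid.layers API on this branch (the variable names and shapes here are illustrative, not taken from the patch):

import paddle.fluid as fluid

# Illustrative shapes: `input` holds per-class scores for each example,
# `label` the index of the positive item.
input = fluid.layers.data(name='predictions', shape=[10], dtype='float32')
label = fluid.layers.data(name='label', shape=[1], dtype='int64')

# With this patch, bpr_loss sets label.stop_gradient = True internally,
# regardless of what the caller configured on `label`.
loss = fluid.layers.bpr_loss(input=input, label=label)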
Contributor

We shouldn't secretly change the user's configuration.
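
To make the concern concrete, a minimal self-contained sketch (plain Python with a hypothetical stand-in for fluid.Variable, not the real class) of how the unconditional assignment discards the caller's setting:

class Variable:
    # Hypothetical stand-in for fluid.Variable; only the flag at issue.
    def __init__(self, stop_gradient=False):
        self.stop_gradient = stop_gradient

def loss_layer(label):
    # The patch assigns the flag unconditionally inside every loss layer.
    label.stop_gradient = True

label = Variable(stop_gradient=False)  # the user explicitly asked for gradients
loss_layer(label)
print(label.stop_gradient)  # True: the user's configuration was silently overridden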

Collaborator Author

This is a hot fix.

Contributor

This is not a hot fix. It's introducing new bugs.

Collaborator Author

So... what's the best way to solve the problem?
