
Make the normalization operator more general and fix bug in l2_normalize.#11348

Merged
qingqing01 merged 6 commits into PaddlePaddle:develop from qingqing01:norm_op
Jun 12, 2018
Conversation

@qingqing01
Contributor

Fix #11345
Fix #7266

1. Refine the raw norm_op and make it general enough to normalize a Tensor along any axis.
2. Fix a bug in the l2_normalize API, which lacks a sqrt after the sum of squares.
3. Reimplement the l2_normalize API on top of norm_op.
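The fixed forward computation (restoring the missing sqrt) can be sketched in NumPy; this is my own illustration, not the kernel code, and the `epsilon` default of 1e-12 is an assumption:

```python
import numpy as np

def l2_normalize(x, axis, epsilon=1e-12):
    # norm = sqrt(sum(x * x) + epsilon), reduced along `axis`
    norm = np.sqrt(np.sum(x * x, axis=axis, keepdims=True) + epsilon)
    # Without the sqrt this would divide by the squared norm,
    # which is the bug this PR fixes.
    return x / norm

y = l2_normalize(np.array([[3.0, 4.0]]), axis=1)
# each row of y now has unit L2 norm
```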
baiyfbupt previously approved these changes Jun 11, 2018
Contributor

@baiyfbupt left a comment


LGTM

@qingqing01 qingqing01 requested a review from guoshengCS June 11, 2018 11:17

// y = x / sqrt((sum(x * x) + epsilon))
// norm = sqrt(sum(x * x) + epsilon)
norm.device(*place) = norm + x_pow.eval().sum(rdim) + eps;
Contributor

Aren't you adding an extra eps here? Doesn't line 68 already set out_norm = eps?

Contributor Author

Yeah, I'll refine the code here. Thanks! @wanghaoshuang

// dx = ( dy/sqrt(sum(x*x)) ) * [1 - x*sum(x) / sum(x*x)]
sum.device(*place) = (x * dy).sum(rdim);
dx.device(*place) = sum.reshape(rshape).broadcast(bcast) * x;
dx.device(*place) = dx / norm.pow(2).broadcast(bcast);
Contributor

norm.pow(2) = sum(x*x) + eps, but I don't see the eps in your formula?

Contributor Author

I have modified the formula above.
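For reference, the backward pass being discussed — dx = dy/norm − x·sum(x·dy)/norm³ with norm = sqrt(sum(x·x) + eps) — can be sanity-checked against finite differences. This NumPy sketch is my own reconstruction for a 1-D tensor, not the Eigen kernel:

```python
import numpy as np

def l2_norm_grad(x, dy, epsilon=1e-10):
    # norm = sqrt(sum(x * x) + epsilon)
    norm = np.sqrt(np.sum(x * x) + epsilon)
    # dx = dy / norm - x * sum(x * dy) / norm^3
    return dy / norm - x * np.sum(x * dy) / norm ** 3

x = np.array([1.0, 2.0, 2.0])
dy = np.ones_like(x)
g = l2_norm_grad(x, dy)

# Finite-difference check of d(sum(x / ||x||)) / dx.
eps = 1e-6
num = np.zeros_like(x)
for i in range(3):
    xp, xm = x.copy(), x.copy()
    xp[i] += eps
    xm[i] -= eps
    num[i] = (np.sum(xp / np.linalg.norm(xp))
              - np.sum(xm / np.linalg.norm(xm))) / (2 * eps)
```

The analytic gradient and the numeric one agree, which is a quick way to verify the annotated formula matches the computation.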

Eigen::DSizes<int, 3> bcast(1, n, 1);
Eigen::DSizes<int, 3> rshape(pre, 1, post);

// dx = ( dy/sqrt(sum(x*x)) ) * [1 - x*sum(x) / sum(x*x)]
Contributor

It feels like there is a diff between this formula and the actual computation. Could you annotate each step of the computation?

Contributor Author

Thanks for the detailed review. I added comments.
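The rshape/bcast trick above is what makes the operator general: the input is viewed as (pre, n, post), where n is the size of the normalized axis, so the reduction is always over the middle dimension. A NumPy sketch of that viewing (my own illustration of the idea, with an assumed epsilon):

```python
import numpy as np

def normalize_any_axis(x, axis, epsilon=1e-10):
    # View x as (pre, n, post): dims before `axis`, the axis itself,
    # and dims after it -- mirroring the Eigen rshape/bcast trick.
    pre = int(np.prod(x.shape[:axis], dtype=np.int64))
    n = x.shape[axis]
    post = int(np.prod(x.shape[axis + 1:], dtype=np.int64))
    x3 = x.reshape(pre, n, post)
    # norm has shape (pre, 1, post) and broadcasts back over n
    norm = np.sqrt(np.sum(x3 * x3, axis=1, keepdims=True) + epsilon)
    return (x3 / norm).reshape(x.shape)

out = normalize_any_axis(np.arange(24, dtype=np.float64).reshape(2, 3, 4), axis=1)
```

Every slice along the chosen axis comes out with unit L2 norm, regardless of which axis was picked.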

Contributor

@wanghaoshuang left a comment

LGTM

@qingqing01 qingqing01 merged commit 19fd071 into PaddlePaddle:develop Jun 12, 2018
@qingqing01 qingqing01 deleted the norm_op branch November 14, 2019 05:26


3 participants