Merged
Changes from 9 commits
Commits
28 commits
f6d9404
add new log2 operation
Joejiong Oct 30, 2020
f1b4f73
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Oct 30, 2020
cca2f9a
fix sample code
Joejiong Oct 30, 2020
3049700
test fp16
Joejiong Oct 30, 2020
665f827
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Oct 30, 2020
eebde3d
fix fp16_error_ratio
Joejiong Oct 30, 2020
933dd5d
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Oct 30, 2020
02fcf16
fix latex
Joejiong Nov 2, 2020
cd94a3a
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 2, 2020
d1838bf
fix paddle2.0 api style
Joejiong Nov 2, 2020
b72f42c
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 2, 2020
e5a5c26
add dygraph example code
Joejiong Nov 3, 2020
8ae9d2c
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 3, 2020
bfc4d79
fix doc gen
Joejiong Nov 3, 2020
e08c326
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 3, 2020
9eadc71
clean doc fluid
Joejiong Nov 4, 2020
faedba2
change directory
Joejiong Nov 5, 2020
f0151d0
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 5, 2020
bb143ad
optimize log2
Joejiong Nov 5, 2020
6ca56f2
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 5, 2020
c8b5fcf
clean code
Joejiong Nov 5, 2020
23e94c0
fix float16
Joejiong Nov 6, 2020
4b35414
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 6, 2020
d508c4c
remove grad_atol
Joejiong Nov 6, 2020
9a6d1bb
fix example code
Joejiong Nov 9, 2020
c84a99f
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 9, 2020
c7023f9
clean example
Joejiong Nov 10, 2020
816a086
Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into…
Joejiong Nov 10, 2020
10 changes: 10 additions & 0 deletions paddle/fluid/operators/activation_op.cc
100644 → 100755
@@ -301,6 +301,15 @@ Natural logarithm of x.

)DOC";

UNUSED constexpr char Log2Doc[] = R"DOC(
Log2 Activation Operator.

$$out = \log_2x$$

Base-2 logarithm of x, computed element-wise.

)DOC";

UNUSED constexpr char Log1pDoc[] = R"DOC(
Log Activation Operator.

@@ -697,6 +706,7 @@ REGISTER_ACTIVATION_OP_MAKER(Cosh, CoshDoc);
REGISTER_ACTIVATION_OP_MAKER(Round, RoundDoc);
REGISTER_ACTIVATION_OP_MAKER(Reciprocal, ReciprocalDoc);
REGISTER_ACTIVATION_OP_MAKER(Log, LogDoc);
REGISTER_ACTIVATION_OP_MAKER(Log2, Log2Doc);
REGISTER_ACTIVATION_OP_MAKER(Log1p, Log1pDoc);
REGISTER_ACTIVATION_OP_MAKER(Square, SquareDoc);
REGISTER_ACTIVATION_OP_MAKER(Softsign, SoftsignDoc);
22 changes: 22 additions & 0 deletions paddle/fluid/operators/activation_op.h
100644 → 100755
@@ -820,6 +820,27 @@ struct LogGradFunctor : public BaseActivationFunctor<T> {
static constexpr ActBwdOpFwdDeps FwdDeps() { return kDepX; }
};

// log2(x) = logarithm to the base 2 of the elements of x
template <typename T>
struct Log2Functor : public BaseActivationFunctor<T> {
template <typename Device, typename X, typename Out>
void operator()(Device d, X x, Out out) const {
out.device(d) = x.log() / static_cast<T>(log(2));
[Review thread on this line]
Member: Although this is mathematically equivalent, computing the base-2 logarithm directly should be simpler and faster than computing log(x)/log(2). If you have time, try writing your own version to see whether a faster implementation is possible here.
Member: Please look into investigating and implementing this when you have time; if not, the current form is just barely acceptable for now.
Contributor Author: Will implement it later, thanks.
Contributor: Besides performance, this also raises a numerical-accuracy concern; please investigate further to see whether it can be optimized.
Contributor Author: Switched to the tensor-native implementation, thx.
done;

}
};
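The reviewers' point about log(x)/log(2) versus a native base-2 logarithm can be illustrated outside of Paddle with a short NumPy sketch; this is illustrative only, and none of these variable names come from the Paddle codebase:

```python
import numpy as np

# Illustrative comparison (not Paddle code): log2 via the identity
# log(x)/log(2), as in Log2Functor above, versus NumPy's native log2.
# The two agree to within float32 rounding, but the division route
# performs one extra rounded operation per element.
x = np.float32(2.0) ** np.arange(0, 20, dtype=np.float32)

via_division = np.log(x) / np.log(np.float32(2.0))
native = np.log2(x)

max_diff = np.max(np.abs(via_division - native))
print(max_diff)
```

For exact powers of two the native route has an exact answer available, so any nonzero difference comes from the rounding of the division-based formula.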

// the gradient of log2(x) is 1/(x*ln(2))
template <typename T>
struct Log2GradFunctor : public BaseActivationFunctor<T> {
template <typename Device, typename X, typename Out, typename dOut,
typename dX>
void operator()(Device d, X x, Out out, dOut dout, dX dx) const {
dx.device(d) = dout * static_cast<T>(1) / (x * static_cast<T>(log(2)));
}

static constexpr ActBwdOpFwdDeps FwdDeps() { return kDepX; }
};
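The analytic gradient used above, dx = dout / (x * ln(2)), can be sanity-checked against a central finite difference. A plain-Python sketch, not Paddle code:

```python
import math

# Finite-difference check (plain Python, not Paddle code) that the
# analytic gradient in Log2GradFunctor, 1/(x * ln(2)), matches the
# numerical derivative of log2(x) at a few sample points.
def log2_grad(x):
    return 1.0 / (x * math.log(2.0))

eps = 1e-6
errors = []
for x in (0.5, 1.0, 3.0, 10.0):
    numeric = (math.log2(x + eps) - math.log2(x - eps)) / (2.0 * eps)
    errors.append(abs(numeric - log2_grad(x)))
print(max(errors))
```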

// log1p(x) = natural logarithm of x+1
template <typename T>
struct Log1pFunctor : public BaseActivationFunctor<T> {
@@ -1908,6 +1929,7 @@ struct LogGradGradFunctor : public BaseActivationFunctor<T> {
__macro(round, Round, RoundFunctor, ZeroGradFunctor); \
__macro(reciprocal, Reciprocal, ReciprocalFunctor, ReciprocalGradFunctor); \
__macro(log1p, Log1p, Log1pFunctor, Log1pGradFunctor); \
__macro(log2, Log2, Log2Functor, Log2GradFunctor); \
__macro(brelu, BRelu, BReluFunctor, BReluGradFunctor); \
__macro(soft_relu, SoftRelu, SoftReluFunctor, SoftReluGradFunctor); \
__macro(stanh, STanh, STanhFunctor, STanhGradFunctor); \
54 changes: 54 additions & 0 deletions python/paddle/fluid/layers/nn.py
@@ -111,6 +111,7 @@
'relu',
'selu',
'log',
'log2',
'crop',
'crop_tensor',
'elu',
@@ -8731,6 +8732,59 @@ def log(x, name=None):
return out


def log2(x, name=None):
[Review thread on this line]
Contributor: For newly added APIs, the source only needs to go under the new paddle directory; there is no need to add it under fluid. In principle, no new APIs are added under fluid anymore.
Contributor Author: thx; done

"""
:alias_main: paddle.log2
:alias: paddle.log2,paddle.tensor.log2,paddle.tensor.math.log2
:old_api: paddle.fluid.layers.log2
[Review thread on this line]
Contributor: There is no need to write the alias_main / alias / old_api lines anymore; the docs add them automatically. Please remove them.
Contributor Author: done, thx


Calculates the log to the base 2 of the given input tensor, element-wise.

.. math::

Out = \\log_2(x)

Args:
x (Variable): Input LoDTensor or Tensor. Must be one of the following types: float32, float64.
[Review thread on this line]
Contributor: Variable -> Tensor
Contributor: Exposing LoDTensor is not recommended; please remove it.
Contributor Author: done, thx

name (str|None): The default value is None. Normally there is no need for the user to set this property. For more information, please refer to :ref:`api_guide_Name`.


Returns:
Variable: The log to the base 2 of the input LoDTensor or Tensor computed element-wise.
[Review thread on this line]
Contributor: Variable -> Tensor; remove LoDTensor.
Contributor Author: done, thx


Examples:

.. code-block:: python
import numpy as np
import paddle
import paddle.fluid as fluid
[Review thread on this line]
Contributor: Remove "import paddle.fluid as fluid".
Contributor Author: done thx


paddle.enable_static()

# Graph Organizing
x = fluid.layers.data(name="x", shape=[1], dtype="float32")
res = fluid.layers.log2(x)

# Create an executor using CPU as an example
exe = fluid.Executor(fluid.CPUPlace())

# Execute
x_i = np.array([[1], [2]]).astype(np.float32)
[Review thread on this line]
Contributor: Using numpy APIs is not recommended; it is better to use the corresponding paddle APIs instead.
Contributor Author: done, thx

res_val, = exe.run(fluid.default_main_program(), feed={'x':x_i}, fetch_list=[res])
[Review thread on this line]
Contributor: It would be best to write the example code in dygraph (dynamic graph) mode.
Contributor Author: done, thx

print(res_val) # [[0.], [1.]]
"""
if in_dygraph_mode():
return core.ops.log2(x)

check_variable_and_dtype(x, 'x', ['float32', 'float64'], "log2")
inputs = {'X': [x]}
helper = LayerHelper('log2', **locals())
dtype = helper.input_dtype(input_param_name='x')
out = helper.create_variable_for_type_inference(dtype)
helper.append_op(type="log2", inputs={"X": x}, outputs={"Out": out})
return out
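As a quick sanity check on the docstring example's numerics, the same computation can be done in plain NumPy; this stands in for the op's math only and is not a call into the paddle API:

```python
import numpy as np

# Mirror the docstring example's input [[1], [2]] and compute log2
# directly with NumPy: log2(1) = 0 and log2(2) = 1 exactly.
x_i = np.array([[1], [2]], dtype=np.float32)
res = np.log2(x_i)
print(res)
```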


@deprecated(since="2.0.0", update_to="paddle.nn.functional.relu")
def relu(x, name=None):
"""