elu_doc.yaml

elu:
    description: |
        Exponential Linear Unit activation function.

        Applies the exponential linear unit function element-wise.
        The activation function is defined as:

        .. math::

            \text{ELU}(x) = \left\{
            \begin{array}{ll}
                \alpha(e^{x} - 1) & \text{if } x \le 0 \\
                x & \text{if } x > 0
            \end{array}\right.

        where :math:`x` is an element of the input Tensor `input_x` and :math:`\alpha` is the
        parameter `alpha`, which determines the smoothness of ELU.
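
        For illustration, the piecewise rule can be evaluated directly with NumPy
        (a minimal sketch assuming the default :math:`\alpha = 1.0`; the helper
        ``elu_reference`` is hypothetical and not part of the MindSpore API):

        >>> import numpy as np
        >>> def elu_reference(x, alpha=1.0):
        ...     # alpha * (exp(x) - 1) where x <= 0, identity elsewhere
        ...     return np.where(x <= 0, alpha * (np.exp(x) - 1), x)
        >>> elu_reference(np.array([-1.0, 0.0, 2.0]))
        array([-0.63212056,  0.        ,  2.        ])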
        A graph of the ELU function can be viewed at `ELU <https://en.wikipedia.org/wiki/
        Activation_function#/media/File:Activation_elu.svg>`_ .

        ELU function graph:

        .. image:: ../images/ELU.png
            :align: center
        Args:
            input_x (Tensor): The input of ELU, a Tensor of any dimension with data type float16 or float32.
            alpha (float, optional): The alpha value of ELU, a float. Only ``1.0`` is currently supported.
                Default: ``1.0`` .

        Returns:
            Tensor, with the same shape and data type as `input_x`.

        Raises:
            TypeError: If `alpha` is not a float.
            TypeError: If dtype of `input_x` is neither float16 nor float32.
            ValueError: If `alpha` is not equal to 1.0.

        Supported Platforms:
            ``Ascend`` ``GPU`` ``CPU``
        Examples:
            >>> import mindspore
            >>> import numpy as np
            >>> from mindspore import Tensor, ops
            >>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
            >>> output = ops.elu(x)
            >>> print(output)
            [[-0.63212055  4.         -0.99966455]
             [ 2.         -0.99326205  9.        ]]
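
            The printed values can be checked against the piecewise definition with
            plain NumPy (a sketch, assuming the default :math:`\alpha = 1.0`):

            >>> expected = np.where(x.asnumpy() <= 0, np.exp(x.asnumpy()) - 1, x.asnumpy())
            >>> print(expected)
            [[-0.63212055  4.         -0.99966455]
             [ 2.         -0.99326205  9.        ]]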