relu:
    description: |
        Computes ReLU (Rectified Linear Unit activation function) of the input tensor element-wise.

        It returns :math:`\max(input, 0)` element-wise. In particular, negative elements are
        suppressed to zero, while non-negative elements are kept unchanged.

        .. math::

            ReLU(input) = (input)^+ = \max(0, input)

        ReLU Activation Function Graph:

        .. image:: ../images/ReLU.png
            :align: center
        Args:
            input (Tensor): The input Tensor.

        Returns:
            Tensor, with the same dtype and shape as `input`.

        Raises:
            TypeError: If the dtype of `input` is not a number type.
            TypeError: If `input` is not a Tensor.

        Supported Platforms:
            ``Ascend`` ``GPU`` ``CPU``

        Examples:
            >>> import mindspore
            >>> import numpy as np
            >>> from mindspore import Tensor, ops
            >>> input = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
            >>> output = ops.relu(input)
            >>> print(output)
            [[0. 4. 0.]
             [2. 0. 9.]]
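
        The result is just the element-wise maximum with zero. As an informal cross-check of the
        formula above, a minimal sketch in plain NumPy (independent of this API; the array mirrors
        the example above) could be:

        >>> x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
        >>> print(np.maximum(x, 0))
        [[0. 4. 0.]
         [2. 0. 9.]]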