relu_doc.yaml
relu:
    description: |
        Computes ReLU (Rectified Linear Unit activation function) of the input tensor element-wise.

        It returns :math:`\max(input,\ 0)` element-wise. In particular, negative elements are
        suppressed (set to zero), while non-negative elements are passed through unchanged.

        .. math::

            ReLU(input) = (input)^+ = \max(0, input)

        ReLU Activation Function Graph:

        .. image:: ../images/ReLU.png
            :align: center

        Args:
            input (Tensor): The input Tensor.

        Returns:
            Tensor, with the same dtype and shape as the `input`.

        Raises:
            TypeError: If dtype of `input` is not a Number type.
            TypeError: If `input` is not a Tensor.

        Supported Platforms:
            ``Ascend`` ``GPU`` ``CPU``

        Examples:
            >>> import mindspore
            >>> import numpy as np
            >>> from mindspore import Tensor, ops
            >>> input = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
            >>> output = ops.relu(input)
            >>> print(output)
            [[0. 4. 0.]
             [2. 0. 9.]]
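
As a quick sanity check of the :math:`\max(0, input)` formula above, here is a minimal NumPy-only sketch that reproduces the documented example output without requiring MindSpore. It is not part of the YAML file, and `relu_reference` is a hypothetical helper name used for illustration, not a MindSpore API:

import numpy as np

# Reference ReLU matching the formula above: max(0, x), applied element-wise.
# `relu_reference` is a hypothetical name for illustration, not a MindSpore API.
def relu_reference(x: np.ndarray) -> np.ndarray:
    return np.maximum(x, 0.0)

x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
print(relu_reference(x))
# Expected, matching the doctest output above:
# [[0. 4. 0.]
#  [2. 0. 9.]]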