silu:
    description: |
        Computes Sigmoid Linear Unit of input element-wise. The SiLU function is defined as:

        .. math::

            \text{SiLU}(x) = x * \sigma(x),

        where :math:`x` is an element of the input and :math:`\sigma(x)` is the Sigmoid function:

        .. math::

            \sigma(x_i) = \frac{1}{1 + \exp(-x_i)}.

        SiLU Function Graph:

        .. image:: ../images/SiLU.png
            :align: center
        Args:
            input (Tensor): `input` is :math:`x` in the preceding formula. A Tensor with data type
                float16 or float32.
        Returns:
            Tensor, with the same type and shape as `input`.
        Raises:
            TypeError: If dtype of `input` is neither float16 nor float32.

        Supported Platforms:
            ``Ascend`` ``GPU`` ``CPU``
        Examples:
            >>> import mindspore
            >>> from mindspore import Tensor, ops
            >>> import numpy as np
            >>> input = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)
            >>> output = ops.silu(input)
            >>> print(output)
            [-0.269 1.762 -0.1423 1.762 -0.269]
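
        As a cross-check of the formula above, the same values can be reproduced with plain
        NumPy (a minimal sketch, not part of the MindSpore API; float16 matches the example above):

            >>> import numpy as np
            >>> x = np.array([-1, 2, -3, 2, -1], dtype=np.float16)
            >>> sigmoid = 1 / (1 + np.exp(-x))  # sigma(x) from the definition above
            >>> bool(np.allclose(x * sigmoid, [-0.269, 1.762, -0.1423, 1.762, -0.269], atol=1e-3))
            True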