# Function Differences with tf.nn.leaky_relu


## tf.nn.leaky_relu

```python
tf.nn.leaky_relu(features, alpha=0.2, name=None) -> Tensor
```

For more information, see tf.nn.leaky_relu.

## mindspore.nn.LeakyReLU

```python
class mindspore.nn.LeakyReLU(alpha=0.2)(x) -> Tensor
```

For more information, see mindspore.nn.LeakyReLU.

## Differences

TensorFlow: Applies the Leaky ReLU activation function, where the parameter `alpha` controls the slope of the function for negative inputs.

MindSpore: The MindSpore API implements essentially the same function as TensorFlow.
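As a framework-independent sketch of what both APIs compute, the activation can be written as LeakyReLU(x) = x for x > 0, and alpha * x otherwise. A minimal NumPy illustration (an assumption-free reference formula, not either library's actual implementation):

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    """LeakyReLU(x) = x if x > 0, else alpha * x."""
    return np.where(x > 0, x, alpha * x)

x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
print(leaky_relu(x))
# [[-0.2  4.  -1.6]
#  [ 2.  -1.   9. ]]
```

With the default `alpha=0.2`, this reproduces the outputs shown in the code example below.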

| Categories | Subcategories | TensorFlow | MindSpore | Differences |
| ---------- | ------------- | ---------- | --------- | ----------- |
| Parameters | Parameter 1   | features   | x         | Same function, different parameter names |
| Parameters | Parameter 2   | alpha      | alpha     | -           |
| Parameters | Parameter 3   | name       | -         | Not involved |

## Code Example

The two APIs implement the same function and are used in the same way.

```python
# TensorFlow
import tensorflow as tf

features = tf.constant([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=tf.float32)
output = tf.nn.leaky_relu(features).numpy()
print(output)
# [[-0.2  4.  -1.6]
#  [ 2.  -1.   9. ]]
```

```python
# MindSpore
from mindspore import Tensor
import mindspore.nn as nn

x = Tensor([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]).astype('float32')
m = nn.LeakyReLU()
output = m(x)
print(output)
# [[-0.2  4.  -1.6]
#  [ 2.  -1.   9. ]]
```