tf.nn.leaky_relu(features, alpha=0.2, name=None) -> Tensor
For more information, see tf.nn.leaky_relu.
class mindspore.nn.LeakyReLU(alpha=0.2)(x) -> Tensor
For more information, see mindspore.nn.LeakyReLU.
TensorFlow: Applies the Leaky ReLU activation function, where the parameter alpha controls the slope of the function for negative input values.
MindSpore: MindSpore API implements basically the same function as TensorFlow.
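For reference, Leaky ReLU computes f(x) = x for x >= 0 and f(x) = alpha * x for x < 0. A minimal NumPy sketch of this formula (an illustration of the math only, not either framework's implementation):

# NumPy sketch of the Leaky ReLU formula (illustration only)
import numpy as np

def leaky_relu(x, alpha=0.2):
    # Positive values pass through unchanged; negative values are scaled by alpha.
    return np.where(x >= 0, x, alpha * x)

print(leaky_relu(np.array([-1.0, 4.0, -8.0])))
# [-0.2  4.  -1.6]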
| Categories | Subcategories | TensorFlow | MindSpore | Differences |
| --- | --- | --- | --- | --- |
| Parameters | Parameter 1 | features | x | Same function, different parameter names |
| Parameters | Parameter 2 | alpha | alpha | - |
| Parameters | Parameter 3 | name | - | Not involved in MindSpore |
The two APIs achieve the same function and have the same usage.
# TensorFlow
import tensorflow as tf

features = tf.constant([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=tf.float32)
# alpha defaults to 0.2, so negative entries are scaled by 0.2.
output = tf.nn.leaky_relu(features).numpy()
print(output)
# [[-0.2  4.  -1.6]
#  [ 2.  -1.   9. ]]
# MindSpore
import mindspore
from mindspore import Tensor
import mindspore.nn as nn

x = Tensor([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]).astype('float32')
# alpha defaults to 0.2, matching the TensorFlow default.
m = nn.LeakyReLU()
output = m(x)
print(output)
# [[-0.2  4.  -1.6]
#  [ 2.  -1.   9. ]]
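The equivalence also holds for a non-default slope. A short sketch passing alpha=0.1 explicitly to both APIs, using the signatures shown above (the input values here are chosen for illustration):

# Explicit alpha in both frameworks (sketch, alpha=0.1)
import tensorflow as tf
from mindspore import Tensor
import mindspore.nn as nn

print(tf.nn.leaky_relu(tf.constant([-1.0, 2.0]), alpha=0.1).numpy())
# [-0.1  2. ]
m = nn.LeakyReLU(alpha=0.1)
print(m(Tensor([-1.0, 2.0]).astype('float32')))
# [-0.1  2. ]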