```text
tf.nn.relu(features, name=None) -> Tensor
```
For more information, see tf.nn.relu.
```text
class mindspore.nn.ReLU()(x) -> Tensor
```
For more information, see mindspore.nn.ReLU.
TensorFlow: ReLU activation function.
MindSpore: MindSpore API implements the same function as TensorFlow, but the parameter names differ and the MindSpore operator must be instantiated before it is called.
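For reference, both operators compute the same elementwise operation:

$$\operatorname{ReLU}(x) = \max(x, 0)$$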
| Categories | Subcategories | TensorFlow | MindSpore | Differences |
| --- | --- | --- | --- | --- |
| Parameters | Parameter 1 | features | x | Same function, different parameter names |
| Parameters | Parameter 2 | name | - | Not involved in MindSpore |
The two APIs implement the same function, but the TensorFlow operator is functional and accepts the input directly, whereas the MindSpore operator must be instantiated first and then called on the input.
```python
# TensorFlow
import tensorflow as tf

x = tf.constant([[-1.0, 2.2], [3.3, -4.0]], dtype=tf.float16)
# Functional operator: the input is passed directly, no instantiation needed.
out = tf.nn.relu(x).numpy()
print(out)
# [[0. 2.2]
#  [3.3 0. ]]
```
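As a side note, TensorFlow also provides a class-based form, tf.keras.layers.ReLU, which follows the same instantiate-then-call pattern as MindSpore. A minimal sketch, assuming TensorFlow 2.x with Keras available:

```python
# Class-based TensorFlow variant (sketch, assuming TensorFlow 2.x)
import tensorflow as tf

relu_layer = tf.keras.layers.ReLU()  # instantiate first, like MindSpore's nn.ReLU
out = relu_layer(tf.constant([[-1.0, 2.2], [3.3, -4.0]], dtype=tf.float16)).numpy()
print(out)
# [[0. 2.2]
#  [3.3 0. ]]
```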
```python
# MindSpore
import mindspore
import mindspore.nn as nn
from mindspore import Tensor
import numpy as np

x = Tensor(np.array([[-1.0, 2.2], [3.3, -4.0]]), mindspore.float16)
# Class-based operator: instantiate first, then call it on the input.
relu = nn.ReLU()
output = relu(x)
print(output)
# [[0. 2.2]
#  [3.3 0. ]]
```
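In practice, the instantiated nn.ReLU is usually stored as an attribute of an nn.Cell and reused in the forward pass. A minimal sketch of that standard pattern:

```python
# Sketch: holding nn.ReLU inside a network definition (standard MindSpore Cell pattern)
import mindspore
import mindspore.nn as nn
from mindspore import Tensor
import numpy as np

class Net(nn.Cell):
    def __init__(self):
        super().__init__()
        self.relu = nn.ReLU()  # instantiate once in __init__

    def construct(self, x):
        return self.relu(x)   # reuse the instance in the forward pass

net = Net()
print(net(Tensor(np.array([[-1.0, 2.2], [3.3, -4.0]]), mindspore.float16)))
# [[0. 2.2]
#  [3.3 0. ]]
```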