tf.keras.layers.PReLU(
alpha_initializer='zeros',
alpha_regularizer=None,
alpha_constraint=None,
shared_axes=None
)(x) -> Tensor
For more information, see tf.keras.layers.PReLU.
class mindspore.nn.PReLU(channel=1, w=0.25)(x) -> Tensor
For more information, see mindspore.nn.PReLU.
TensorFlow: PReLU activation function.
MindSpore: The MindSpore API implements largely the same function as TensorFlow, but the parameters are set differently.
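Both APIs compute the same elementwise function, f(x) = max(0, x) + w * min(0, x), where w is the learnable slope for negative inputs. A minimal NumPy sketch of that formula (not framework code):

```python
import numpy as np

def prelu(x, alpha):
    # PReLU: f(x) = x if x > 0 else alpha * x
    return np.maximum(0.0, x) + alpha * np.minimum(0.0, x)

x = np.array([[-1.0, 2.2], [3.3, -4.0]])
out = prelu(x, 0.25)
# Negative inputs are scaled by alpha=0.25: [[-0.25, 2.2], [3.3, -1.0]]
```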
Categories | Subcategories | TensorFlow | MindSpore | Differences |
---|---|---|---|---|
Parameters | Parameter 1 | alpha_initializer | w | Initializer for the weights. Same function, but different parameter names and default values |
 | Parameter 2 | alpha_regularizer | - | Regularizer for the weights. MindSpore does not have this parameter |
 | Parameter 3 | alpha_constraint | - | Constraint on the weights. MindSpore does not have this parameter |
 | Parameter 4 | shared_axes | - | Axes along which the learnable parameters of the activation function are shared. MindSpore does not have this parameter |
 | Parameter 5 | - | channel | Number of channels, i.e. the number of learnable slopes. TensorFlow does not have this parameter |
Input | Single input | x | x | - |
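The channel parameter gives MindSpore one learnable slope per channel of an NCHW input. A hypothetical NumPy sketch of that per-channel behavior (an illustration of the semantics, not MindSpore's actual implementation):

```python
import numpy as np

def prelu_per_channel(x, w):
    # x: (N, C, H, W); w: (C,) reshaped to (1, C, 1, 1) so each channel's
    # slope broadcasts over the batch and spatial dimensions
    return np.maximum(0.0, x) + w.reshape(1, -1, 1, 1) * np.minimum(0.0, x)

x = -np.ones((1, 3, 2, 2))        # all-negative input with 3 channels
w = np.array([0.1, 0.2, 0.3])     # one slope per channel
out = prelu_per_channel(x, w)
# Each channel is scaled by its own slope: out[0, :, 0, 0] == [-0.1, -0.2, -0.3]
```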
The TensorFlow alpha_initializer parameter is functionally identical to the MindSpore w parameter, differing only in default value and name. TensorFlow's default initializer is 'zeros', i.e. alpha starts at 0.0, so to reproduce the default TensorFlow behavior in MindSpore, set w to 0.0.
# TensorFlow
import tensorflow as tf
from tensorflow.keras.layers import PReLU
x = tf.constant([[-1.0, 2.2], [3.3, -4.0]], dtype=tf.float32)
m = PReLU()
out = m(x)
print(out.numpy())
# [[0. 2.2]
# [3.3 0. ]]
# MindSpore
import mindspore
from mindspore import Tensor
import mindspore.nn as nn
import numpy as np
x = Tensor(np.array([[-1.0, 2.2], [3.3, -4.0]]), mindspore.float32)
prelu = nn.PReLU(w=0.0)
output = prelu(x)
print(output)
# [[0. 2.2]
# [3.3 0. ]]
The TensorFlow alpha_initializer parameter changes the initial alpha value through an initialization function; in MindSpore, simply setting w to the corresponding value achieves the same result.
# TensorFlow
import tensorflow as tf
from tensorflow.keras.layers import PReLU
x = tf.constant([[-1.0, 2.2], [3.3, -4.0]], dtype=tf.float32)
m = PReLU(alpha_initializer=tf.constant_initializer(0.5))
out = m(x)
print(out.numpy())
# [[-0.5 2.2]
# [ 3.3 -2. ]]
# MindSpore
import mindspore
from mindspore import Tensor
import mindspore.nn as nn
import numpy as np
x = Tensor(np.array([[-1.0, 2.2], [3.3, -4.0]]), mindspore.float32)
prelu = nn.PReLU(w=0.5)
output = prelu(x)
print(output)
# [[-0.5 2.2]
# [ 3.3 -2. ]]
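Although MindSpore has no shared_axes parameter, TensorFlow's shared_axes=[1, 2] on an NHWC input shares the slope across the spatial axes and leaves one slope per channel, which matches what MindSpore's channel parameter provides on an NCHW input. A NumPy sketch of that correspondence, under the stated layout assumptions:

```python
import numpy as np

def prelu_nhwc(x, alpha):
    # x: (N, H, W, C); alpha: (C,) broadcasts over the last (channel) axis,
    # i.e. the slope is shared across axes 1 and 2 (H and W)
    return np.maximum(0.0, x) + alpha * np.minimum(0.0, x)

def prelu_nchw(x, alpha):
    # x: (N, C, H, W); alpha: (C,) reshaped for channel-axis broadcasting
    return np.maximum(0.0, x) + alpha.reshape(1, -1, 1, 1) * np.minimum(0.0, x)

rng = np.random.default_rng(0)
x_nhwc = rng.standard_normal((1, 2, 2, 3))
alpha = np.array([0.1, 0.2, 0.3])

a = prelu_nhwc(x_nhwc, alpha)
b = prelu_nchw(x_nhwc.transpose(0, 3, 1, 2), alpha)
# The two layouts produce the same values, up to transposition
assert np.allclose(a, b.transpose(0, 2, 3, 1))
```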