torch.nn.functional.soft_margin_loss(input, target, size_average=None, reduce=None, reduction='mean') -> Tensor/Scalar
For more information, see torch.nn.functional.soft_margin_loss.
class mindspore.nn.SoftMarginLoss(reduction='mean')(logits, labels) -> Tensor
For more information, see mindspore.nn.SoftMarginLoss.
PyTorch: Loss function for two-class classification; it computes the soft margin loss between the input Tensor x and the target Tensor y, whose elements are 1 or -1.
MindSpore: Implements the same function as PyTorch; the only difference is that MindSpore does not provide the two parameters that PyTorch has deprecated (size_average and reduce).
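With the default reduction='mean', both APIs compute the element-wise soft margin loss and average it over all elements: loss(x, y) = mean(log(1 + exp(-y * x))).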
Categories | Subcategories | PyTorch | MindSpore | Difference |
---|---|---|---|---|
Parameters | Parameter 1 | input | logits | Same function, different parameter names |
 | Parameter 2 | target | labels | Same function, different parameter names |
 | Parameter 3 | size_average | - | Deprecated in PyTorch, replaced by reduction; MindSpore does not have this parameter |
 | Parameter 4 | reduce | - | Deprecated in PyTorch, replaced by reduction; MindSpore does not have this parameter |
 | Parameter 5 | reduction | reduction | - |
The two APIs implement the same function and are used in the same way.
# PyTorch
import torch

logits = torch.FloatTensor([[0.3, 0.7], [0.5, 0.5]])
labels = torch.FloatTensor([[-1, 1], [1, -1]])
# Default reduction='mean' averages the loss over all elements
output = torch.nn.functional.soft_margin_loss(logits, labels)
print(output.numpy())
# 0.6764238
# MindSpore
import mindspore
import numpy as np
from mindspore import Tensor

# Default reduction='mean' averages the loss over all elements
loss = mindspore.nn.SoftMarginLoss()
logits = Tensor(np.array([[0.3, 0.7], [0.5, 0.5]]), mindspore.float32)
labels = Tensor(np.array([[-1, 1], [1, -1]]), mindspore.float32)
output = loss(logits, labels)
print(output)
# 0.6764238
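The printed value can be reproduced directly from the formula above. Below is a minimal NumPy check that recomputes it without either framework (variable names are chosen for illustration):

# Manual check with NumPy
import numpy as np

logits = np.array([[0.3, 0.7], [0.5, 0.5]], dtype=np.float32)
labels = np.array([[-1, 1], [1, -1]], dtype=np.float32)

# Element-wise loss log(1 + exp(-y * x)), then the mean over all four elements
print(np.mean(np.log1p(np.exp(-labels * logits))))
# ~0.6764238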