.. toctree::
   :maxdepth: 1
   :hidden:

   modules/layer
   modules/initializer
   modules/loss
   modules/optimizer
A neural network model is composed of various layers. MindSpore provides ``Cell``, the base unit for constructing a neural network, and all neural network layers are encapsulated on top of ``Cell``. In the following, the classical AlexNet model is constructed by using ``Cell``.
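Before the full AlexNet example, the following minimal sketch illustrates the ``Cell`` pattern; the ``TinyNet`` name and its layer sizes are illustrative assumptions, not part of this tutorial. Sublayers are created in ``__init__`` and the forward computation is defined in ``construct``:

.. code-block:: python

   import numpy as np
   import mindspore
   from mindspore import nn, Tensor

   class TinyNet(nn.Cell):  # hypothetical example, not part of AlexNet
       def __init__(self):
           super().__init__()
           self.dense = nn.Dense(4, 2)  # a single fully-connected layer
           self.relu = nn.ReLU()

       def construct(self, x):  # the forward computation
           return self.relu(self.dense(x))

   net = TinyNet()
   out = net(Tensor(np.ones((1, 4)), mindspore.float32))
   print(out.shape)  # (1, 2)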
As shown in the figure, AlexNet consists of five convolutional layers in series with three fully-connected layers. We construct it by using the neural network layer interfaces provided by ``mindspore.nn``.
.. code-block:: python

   from mindspore import nn
The following code shows how to quickly construct AlexNet by using ``nn.Cell``. ``nn.Cell`` can be used as a nested structure, and ``nn.SequentialCell`` simplifies the definition of models with a sequential structure.

.. code-block:: python

   class AlexNet(nn.Cell):
       def __init__(self, num_classes=1000, dropout=0.5):
           super().__init__()
           # Five convolutional layers with ReLU activations and max pooling
           self.features = nn.SequentialCell(
               nn.Conv2d(3, 96, kernel_size=11, stride=4, pad_mode='pad', padding=2),
               nn.ReLU(),
               nn.MaxPool2d(kernel_size=3, stride=2),
               nn.Conv2d(96, 256, kernel_size=5, pad_mode='pad', padding=2),
               nn.ReLU(),
               nn.MaxPool2d(kernel_size=3, stride=2),
               nn.Conv2d(256, 384, kernel_size=3, pad_mode='pad', padding=1),
               nn.ReLU(),
               nn.Conv2d(384, 384, kernel_size=3, pad_mode='pad', padding=1),
               nn.ReLU(),
               nn.Conv2d(384, 256, kernel_size=3, pad_mode='pad', padding=1),
               nn.ReLU(),
               nn.MaxPool2d(kernel_size=3, stride=2),
           )
           # Three fully-connected layers with dropout
           self.classifier = nn.SequentialCell(
               nn.Dropout(p=dropout),
               nn.Dense(256 * 6 * 6, 4096),
               nn.ReLU(),
               nn.Dropout(p=dropout),
               nn.Dense(4096, 4096),
               nn.ReLU(),
               nn.Dense(4096, num_classes),
           )

       def construct(self, x):
           x = self.features(x)
           # Flatten the feature maps before the fully-connected layers
           x = x.view(x.shape[0], 256 * 6 * 6)
           x = self.classifier(x)
           return x
When defining a model, arbitrary Python syntax can be used in the ``construct`` method to build the model structure, including conditionals, loops, and other control flow statements. However, under just-in-time (JIT) compilation, the syntax must be parsed by the compiler; for the syntax restrictions, refer to Static graph syntax support.
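As an illustration of this point, the following sketch builds a ``Cell`` whose ``construct`` uses both a loop and a conditional; the ``ControlFlowNet`` name and its layers are hypothetical, not part of the AlexNet example:

.. code-block:: python

   class ControlFlowNet(nn.Cell):  # hypothetical example
       def __init__(self, num_layers=3):
           super().__init__()
           self.layers = nn.CellList([nn.Dense(16, 16) for _ in range(num_layers)])
           self.relu = nn.ReLU()

       def construct(self, x):
           # Loop over sublayers
           for layer in self.layers:
               x = self.relu(layer(x))
           # Branch on a tensor value
           if x.sum() > 0:
               x = x * 2
           return x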
After completing the model construction, we construct a single data sample and feed it into the instantiated AlexNet to obtain the forward result.
.. code-block:: python

   import numpy as np
   import mindspore
   from mindspore import Tensor

   x = Tensor(np.random.randn(1, 3, 224, 224), mindspore.float32)
   network = AlexNet()
   logits = network(x)
   print(logits.shape)

Output:

.. code-block:: text

   (1, 1000)
In addition to the basic network construction above, the following sections introduce in detail the neural network layers (Layer), loss functions (Loss), optimizers (Optimizer), the parameters (Parameter) required by the neural network layers, and their initialization methods (Initializer).
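As a brief preview of how those pieces fit together, the sketch below wires a loss function, an optimizer, and a parameter initializer around the network; the specific choices here (``CrossEntropyLoss``, ``SGD``, ``Normal``) are assumptions for illustration, not prescribed by this tutorial:

.. code-block:: python

   import mindspore
   from mindspore import nn
   from mindspore.common.initializer import Normal

   network = AlexNet()

   # A loss function and an optimizer bound to the network's trainable parameters
   loss_fn = nn.CrossEntropyLoss()
   optimizer = nn.SGD(network.trainable_params(), learning_rate=1e-2)

   # A layer whose weight Parameter is created by a Normal initializer
   dense = nn.Dense(4096, 1000, weight_init=Normal(sigma=0.02))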