"""
View more, visit my tutorial page: https://mofanpy.com/tutorials/
My Youtube Channel: https://www.youtube.com/user/MorvanZhou
Dependencies:
torch: 0.1.11
"""
import torch
from torch.autograd import Variable
# Variable in torch is used to build a computational graph,
# but this graph is dynamic, unlike the static graphs in Tensorflow or Theano.
# So torch has no placeholders; you simply pass Variables into the computational graph.
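# To make "dynamic" concrete, here is a small sketch of my own (not part of the
# original tutorial): the graph is rebuilt on every forward pass, so ordinary
# Python control flow decides which operations end up in it.
x = Variable(torch.FloatTensor([2., -1.]), requires_grad=True)
if x.data.sum() > 0:                  # plain Python branching at run time
    y = (x * x).sum()                 # this branch adds the x^2 nodes to the graph
else:
    y = (x * 3).sum()
y.backward()                          # only the branch actually taken is differentiated
print(x.grad)                         # d(sum(x^2))/dx = 2x  ->  [4, -2]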
tensor = torch.FloatTensor([[1,2],[3,4]]) # build a tensor
variable = Variable(tensor, requires_grad=True)  # build a variable, usually for computing gradients
print(tensor) # [torch.FloatTensor of size 2x2]
print(variable) # [torch.FloatTensor of size 2x2]
# So far the tensor and the variable look the same.
# However, the variable is part of the graph and takes part in automatic gradient computation.
t_out = torch.mean(tensor*tensor)        # mean of x^2, computed on a plain tensor (no graph)
v_out = torch.mean(variable*variable)    # mean of x^2, recorded in the computational graph
print(t_out)    # 7.5
print(v_out)    # 7.5 = (1 + 4 + 9 + 16) / 4
v_out.backward() # backpropagation from v_out
# v_out = 1/4 * sum(variable*variable)
# the gradient w.r.t. the variable: d(v_out)/d(variable) = 1/4 * 2 * variable = variable/2
print(variable.grad)
"""
 0.5000  1.0000
 1.5000  2.0000
"""
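# Quick sanity check (my addition): the analytic gradient derived above is
# variable/2, so dividing the data by 2 should reproduce what autograd stored in .grad.
print(variable.data / 2)    # [[0.5, 1.0], [1.5, 2.0]], same values as variable.grad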
print(variable) # this is data in variable format
"""
Variable containing:
1 2
3 4
[torch.FloatTensor of size 2x2]
"""
print(variable.data) # this is data in tensor format
"""
1 2
3 4
[torch.FloatTensor of size 2x2]
"""
print(variable.data.numpy()) # numpy format
"""
[[ 1. 2.]
[ 3. 4.]]
"""