
PyTorch - Autograd: Automatic Differentiation

程序员文章站 2022-07-12 23:00:07

flyfish
Reference

import torch

# requires_grad defaults to False for user-created tensors
a = torch.randn(2, 2)
print(a)
a = ((a * 3) / (a - 1))
print(a)
print(a.requires_grad)  # False
# requires_grad_() changes the flag in place
a.requires_grad_(True)
print(a.requires_grad)  # True
b = (a * a).sum()
print(b.grad_fn)  # <SumBackward0 object at 0x...>
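As a quick check (my addition, not part of the original snippet), the gradient autograd computes for `b = (a * a).sum()` can be compared against the analytic derivative `2a`:

```python
import torch

# Sketch: verify that autograd's gradient of b = (a * a).sum()
# matches the analytic derivative db/da = 2a.
torch.manual_seed(0)  # seed chosen here only for reproducibility
a = torch.randn(2, 2, requires_grad=True)
b = (a * a).sum()
b.backward()
assert torch.allclose(a.grad, 2 * a)
print(a.grad)
```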


#Create a tensor and set requires_grad=True to track computation with it

x = torch.ones(2, 2, requires_grad=True)

print(x)
# =============================================================================
# tensor([[1., 1.],
#         [1., 1.]], requires_grad=True)
# =============================================================================
#Do a tensor operation:
y = x + 2
# (y.creator was the pre-0.4 API name for y.grad_fn)
print(y)
# =============================================================================
# tensor([[3., 3.],
#         [3., 3.]], grad_fn=<AddBackward0>)
# =============================================================================
# y was created as a result of an operation, so it has a grad_fn.
print(y.grad_fn)
#<AddBackward0 object at 0x7f3709bbc780>
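A small sketch (my addition) of how `grad_fn` nodes chain together into the backward graph:

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2
# user-created leaf tensors have no grad_fn
print(x.grad_fn)                 # None
print(type(y.grad_fn).__name__)  # AddBackward0
# next_functions points one step back toward the graph's inputs
print(y.grad_fn.next_functions)
```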
z = y * y * 3

#grad can be implicitly created only for scalar outputs
out = z.mean()
print(z, out)
# =============================================================================
# tensor([[27., 27.],
#         [27., 27.]], grad_fn=<MulBackward0>) tensor(27., grad_fn=<MeanBackward0>)
# =============================================================================

#Let’s backprop now. Because out contains a single scalar,
#out.backward() is equivalent to out.backward(torch.tensor(1.)).
out.backward()

print(x.grad)#Print gradients d(out)/dx
# =============================================================================
# tensor([[4.5000, 4.5000],
#         [4.5000, 4.5000]])
# =============================================================================
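To double-check (my addition), the same gradient can be recomputed with `torch.autograd.grad` and compared against the closed-form `3/2 * (x + 2)`:

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
out = (3 * (x + 2) ** 2).mean()

# torch.autograd.grad returns the gradient instead of accumulating into x.grad
(grad_x,) = torch.autograd.grad(out, x)
assert torch.allclose(grad_x, 1.5 * (x + 2))  # analytic d(out)/dx
print(grad_x)  # every entry is 4.5
```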

The derivation given in the official docs:
Let the output be $o$:

$$o = \frac{1}{4} \sum_{i} z_{i}$$

$$z_{i} = 3\left(x_{i}+2\right)^{2} \quad \text{and} \quad \left.z_{i}\right|_{x_{i}=1} = 27$$

Therefore

$$\frac{\partial o}{\partial x_{i}} = \frac{3}{2}\left(x_{i}+2\right)$$

hence

$$\left.\frac{\partial o}{\partial x_{i}}\right|_{x_{i}=1} = \frac{9}{2} = 4.5$$
My manual calculation goes like this:

$$z_{i} = 3\left(x_{i}+2\right)^{2}$$

Differentiating gives $6\left(x_{i}+2\right)$. Substitute each element $x_{i}$, then divide each result by 4 (from the mean over the four elements), i.e. $6\left(x_{i}+2\right)/4$, which agrees with the official $\frac{3}{2}\left(x_{i}+2\right)$.

For example:

# input
[[1., 2.],
 [3., 4.]]

Substituting:
6*(1+2)/4 = 4.5
6*(2+2)/4 = 6
6*(3+2)/4 = 7.5
6*(4+2)/4 = 9

# output
[[4.5000, 6.0000],
 [7.5000, 9.0000]]
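The hand calculation above can be confirmed with autograd (a sketch I added; the input [[1, 2], [3, 4]] is the example from the text):

```python
import torch

x = torch.tensor([[1., 2.], [3., 4.]], requires_grad=True)
out = (3 * (x + 2) ** 2).mean()
out.backward()
# hand calculation: d(out)/dx = 6 * (x + 2) / 4
expected = torch.tensor([[4.5, 6.0], [7.5, 9.0]])
assert torch.allclose(x.grad, expected)
print(x.grad)  # entries: 4.5, 6.0, 7.5, 9.0
```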