How to differentiate on a non-scalar variable?


I am new to PyTorch and neural networks in general. I am following this tutorial: http://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html#gradients

There are two examples of differentiation using backward(): one for a scalar and one for a non-scalar variable.

The scalar example: [image from the tutorial omitted]

The non-scalar example:

import torch
from torch.autograd import Variable  # pre-0.4 API, as in the tutorial

x = torch.randn(3) # input is taken randomly
x = Variable(x, requires_grad=True)

y = x * 2

c = 0
while y.data.norm() < 1000:
    y = y * 2
    c += 1

gradients = torch.FloatTensor([0.1, 1.0, 0.0001]) # specifying a gradient because the output y is non-scalar
y.backward(gradients)

print(c)
print(x.grad)

Output:

9

102.4000
1024.0000
0.1024

I tried to understand it the same way I did for the scalar example: [image of my working omitted]

But I can't figure out how it works. For a given value of c, I get exactly the same output regardless of the input values:

For c = 8:

51.2000
512.0000
0.0512

For c = 10:

204.8000
2048.0000
0.2048
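The reported numbers differ only by a power-of-two scale tied to c: each printed value is the corresponding entry of gradients multiplied by 2^(c+1) (512 for c = 8, 1024 for c = 9, 2048 for c = 10). A quick check of the arithmetic in plain Python (no autograd; the list below mirrors the gradients tensor from the snippet above):

```python
# Each reported output equals gradients[i] * 2**(c + 1).
grads = [0.1, 1.0, 0.0001]

for c in (8, 9, 10):
    scale = 2 ** (c + 1)
    print(c, [g * scale for g in grads])  # matches the values printed above
```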

Please explain how y.backward(gradients) calculates these values.

Source: http://pytorch.org/docs/master/autograd.html#torch.autograd.Variable.backw

1 Answer

BEST ANSWER

I found the solution to this problem:

The non-scalar example
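Here, after the loop, y = x * 2^(c+1) elementwise, so the Jacobian dy/dx is a diagonal matrix with every entry equal to 2^(c+1). Calling backward(v) on a non-scalar output computes the vector-Jacobian product v^T J, so x.grad = gradients * 2^(c+1): the random input x cancels out entirely, which is why only c affects the result. A minimal sketch, using the modern tensor API (Variable was merged into Tensor in PyTorch 0.4):

```python
import torch

# Same setup as the question, without the deprecated Variable wrapper.
x = torch.randn(3, requires_grad=True)

y = x * 2
c = 0
while y.data.norm() < 1000:
    y = y * 2
    c += 1

# After the loop: y = x * 2**(c+1), elementwise, so the Jacobian
# dy/dx is diag(2**(c+1), 2**(c+1), 2**(c+1)).
gradients = torch.tensor([0.1, 1.0, 0.0001])

# backward(v) computes the vector-Jacobian product v^T @ J,
# so x.grad = gradients * 2**(c+1) -- independent of x itself.
y.backward(gradients)

print(x.grad)                    # e.g. for c = 9: 102.4, 1024.0, 0.1024
print(gradients * 2 ** (c + 1))  # same values
```

This reproduces the outputs in the question: for c = 9 the scale factor is 2^10 = 1024, giving [0.1, 1.0, 0.0001] * 1024 = [102.4, 1024.0, 0.1024].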