What is the use of torch.no_grad in pytorch?
31-10-2019
Question
I am new to PyTorch and started with this GitHub code. I do not understand the comment on lines 60-61 of the code: "because weights have requires_grad=True, but we don't need to track this in autograd". I understand that we set requires_grad=True on the variables whose gradients we need autograd to compute, but what does it mean for an operation to be "tracked by autograd"?
No accepted solution.
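A minimal sketch of the distinction being asked about: when a tensor has `requires_grad=True`, every operation on it is recorded ("tracked") in autograd's graph so gradients can later flow back through it; wrapping code in `torch.no_grad()` suppresses that recording, so the results carry no gradient history. The tensor names below are illustrative, not from the linked code.

```python
import torch

# A weight tensor we want gradients for.
w = torch.ones(3, requires_grad=True)

# Tracked: autograd records this op, so the result is part of the graph
# and itself requires grad.
y = (w * 2).sum()
print(y.requires_grad)   # True
print(y.grad_fn is None) # False: a backward function was recorded

# Not tracked: inside no_grad, ops are not recorded, so the result
# is detached from the graph even though w requires grad.
with torch.no_grad():
    z = (w * 2).sum()
print(z.requires_grad)   # False
print(z.grad_fn is None) # True: no backward function
```

This is why manual weight updates (e.g. `w -= lr * w.grad`) are typically done inside `torch.no_grad()`: the update itself should not become part of the graph that the next `backward()` call differentiates through.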
Not affiliated with datascience.stackexchange