Update the error message for retain_graph (pytorch#47084)
Summary:
Fixes pytorch#46588

Pull Request resolved: pytorch#47084

Reviewed By: albanD

Differential Revision: D24632403

Pulled By: iramazanli

fbshipit-source-id: 8dfd50fcbb6ef585ea4f903e3755b5a807312235
iramazanli authored and facebook-github-bot committed Nov 7, 2020
1 parent 7af9752 commit e09ec8e
Showing 2 changed files with 1 addition and 1 deletion.
Empty file added: .nojekyll
torch/csrc/autograd/saved_variable.cpp (2 changes: 1 addition & 1 deletion)

@@ -106,6 +106,6 @@ Variable SavedVariable::unpack(std::shared_ptr<Node> saved_for) const {
 const char* ERR_BACKWARD_TWICE =
     "Trying to backward through the graph a second time, but the saved intermediate "
     "results have already been freed. Specify retain_graph=True when calling "
-    "backward the first time.";
+    ".backward() or autograd.grad() the first time.";
 
 }} // namespace torch::autograd
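
For context (not part of the commit), a minimal sketch of the failure mode this message describes: backpropagating through the same graph twice frees the saved intermediate results unless retain_graph=True is passed on the first call. The tensor names here are illustrative.

import torch

x = torch.ones(3, requires_grad=True)
y = (x * x).sum()

# First backward pass: retain_graph=True keeps the saved
# intermediate results alive for a second pass.
y.backward(retain_graph=True)

# Second pass succeeds because the graph was retained; without
# retain_graph=True above, this call raises ERR_BACKWARD_TWICE
# ("Trying to backward through the graph a second time, ...").
y.backward()

print(x.grad)  # gradients accumulate across passes: tensor([4., 4., 4.])

The wording change matters because the error can be hit from both Tensor.backward() and torch.autograd.grad(); the new message names both entry points instead of only "backward".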
