
Reset inputs after exported for out_grad #238

Merged
merged 4 commits into from
Sep 27, 2019

Conversation

@disktnk (Member) commented Sep 27, 2019

  • use BatchNorm
  • set train=False, output_grad=True

then got

inputs = (array([[[[0., 0., 0., ..., 0., 0., 0.],
         [0., 0., 0., ..., 0., 0., 0.],
         [0., 0., 0., ..., 0., 0., 0....., ..., 0., 0., 0.],
         [0., 0., 0., ..., 0., 0., 0.],
         [0., 0., 0., ..., 0., 0., 0.]]]], dtype=float32))

    def forward(self, inputs):
        self.retain_inputs((0, 1, 2, 4))
        x, gamma, mean, var, gy = inputs
        expander = self.expander
        xp = backend.get_array_module(x)

        if self.inv_std is None or self.inv_var is None:
            self.inv_var = xp.reciprocal(var + self.eps)
            self.inv_std = xp.sqrt(self.inv_var, dtype=self.inv_var.dtype)

        self.gamma_over_std = gamma * self.inv_std
>       x_hat = _x_hat(x, mean[expander], self.inv_std[expander])
E       TypeError: 'NoneType' object is not subscriptable
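
The failure mode in the traceback can be reproduced in isolation: when a retained input such as `mean` has been unexpectedly cleared to `None`, indexing it with the broadcast expander raises exactly this TypeError. A minimal sketch (the `expander` value here is illustrative, in the style of BatchNorm's broadcast helper):

```python
# Hypothetical minimal reproduction of the error above: a retained input
# that was cleared to None is indexed with a broadcast expander tuple.
expander = (None, slice(None), None, None)  # illustrative broadcast helper
mean = None  # what the unexpectedly-cleared retained input looks like

try:
    mean[expander]
except TypeError as e:
    print(e)  # 'NoneType' object is not subscriptable
```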

This PR fixes an unexpectedly retained variable node by restoring the hooked inputs after export.
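
The idea of the fix can be sketched as follows (class and method names here are illustrative, not the actual onnx-chainer API): an export hook that substitutes function inputs while tracing must put the originals back afterwards, otherwise the function graph keeps references to the substituted variables after export finishes.

```python
# Hedged sketch of the fix's idea; names are hypothetical, not the real API.
class RetainInputHook:
    def __init__(self):
        self._replaced = []  # list of (function_node, original_inputs)

    def swap_inputs(self, func, new_inputs):
        # Remember the originals before substituting them for export.
        self._replaced.append((func, func.inputs))
        func.inputs = new_inputs

    def restore_inputs(self):
        # Undo every substitution so the model is unchanged after export.
        for func, original in self._replaced:
            func.inputs = original
        self._replaced.clear()


class FakeFunctionNode:
    """Stand-in for a chainer FunctionNode, for illustration only."""
    def __init__(self, inputs):
        self.inputs = inputs


node = FakeFunctionNode(inputs=("x", "gamma", "mean"))
hook = RetainInputHook()
hook.swap_inputs(node, (None, None, None))  # what export-time tracing might do
hook.restore_inputs()
print(node.inputs)  # ('x', 'gamma', 'mean')
```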

@disktnk disktnk added this to the 1.5.1 milestone Sep 27, 2019
@disktnk disktnk requested a review from shinh September 27, 2019 07:27
@shinh (Member) left a comment

LGTM, thanks!

@@ -407,6 +413,8 @@ def _export(model, args, filename, export_params, graph_name, save_text,
o = Graph(context, converters, opset_version, network_outputs)
o.to_onnx_graph()

hook.restore_inputs()
Member:
[optional] I think it'd be better to use __exit__ and make the scope of the hook larger since, ideally, this should be done even when an exception is thrown. I don't think we should do this in this PR, though.
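
The reviewer's suggestion can be sketched like this (a hypothetical hook class, not the merged code): giving the hook __enter__/__exit__ guarantees that the restore runs even when export raises, by wrapping the export in an ordinary with block.

```python
# Sketch of the context-manager variant the reviewer suggests; the class
# and its attributes are illustrative, not the actual onnx-chainer code.
class RetainInputHook:
    def __init__(self):
        self._replaced = []  # (function_node, original_inputs) pairs

    def swap_inputs(self, func, new_inputs):
        self._replaced.append((func, func.inputs))
        func.inputs = new_inputs

    def restore_inputs(self):
        for func, original in self._replaced:
            func.inputs = original
        self._replaced.clear()

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Runs on normal exit and when an exception propagates.
        self.restore_inputs()
        return False  # never swallow the exception
```

With this shape, the export path becomes `with RetainInputHook() as hook: ...`, and the inputs are restored even if graph construction throws partway through.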

@disktnk (Member Author):
I agree, fixed

@codecov-io commented Sep 27, 2019

Codecov Report

Merging #238 into master will increase coverage by 0.02%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master     #238      +/-   ##
==========================================
+ Coverage   91.37%   91.39%   +0.02%     
==========================================
  Files          24       24              
  Lines        1611     1615       +4     
==========================================
+ Hits         1472     1476       +4     
  Misses        139      139
Impacted Files Coverage Δ
onnx_chainer/export.py 93.28% <100%> (+0.1%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update b43993f...97c13dc.

@disktnk (Member Author) commented Sep 27, 2019

/test

@pfn-ci-bot (Collaborator)

Successfully created a job for commit 97c13dc:

@disktnk disktnk merged commit b39755a into chainer:master Sep 27, 2019
@disktnk disktnk deleted the fix/output-grad branch September 27, 2019 11:01