
【PaddlePaddle Hackathon】37. add pytest of paddle.nn.AlphaDropout #267

Conversation

OccupyMars2025
Contributor

PR types
Others

PR changes
add framework/api/nn/test_alpha_dropout.py

Describe
Task: PaddlePaddle/Paddle#35968

Add test cases for paddle.nn.AlphaDropout to Paddle.
In TestAlphaDropout I override the two methods _baserun and run; each override adds one line on top of the corresponding parent-class method:
paddle.seed(self.seed) (added in the static-graph part in both cases)
so that the random_tensor generated when AlphaDropout is called is identical in static-graph and dynamic-graph mode.
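
For illustration, a minimal sketch of the idea. The actual PR copies the parent _baserun/run bodies from apibase.py and inserts paddle.seed(self.seed) inside their static-graph branch; in this sketch the APIBase import path, the hook attributes, and the method signatures are assumptions, and the overrides simply re-seed before delegating to the parent:

```python
import numpy as np
import paddle
from apibase import APIBase  # assumed location of the test-suite base class


class TestAlphaDropout(APIBase):
    """Sketch only: the real test copies the parent _baserun/run bodies and
    adds paddle.seed(self.seed) inside their static-graph sections."""

    def hook(self):
        # test configuration; attribute names follow the apibase convention
        self.types = [np.float32, np.float64]
        self.seed = 100
        self.enable_backward = False  # see the gradient-check discussion below

    def _baserun(self, res, data=None, **kwargs):
        # Re-seed so that AlphaDropout draws the same random_tensor (mask)
        # in the static-graph run as in the dynamic-graph run.
        paddle.seed(self.seed)
        return super()._baserun(res, data, **kwargs)

    def run(self, res, data=None, **kwargs):
        paddle.seed(self.seed)
        return super().run(res, data, **kwargs)
```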

@OccupyMars2025
Contributor Author

When self.enable_backward is set to True in def hook(self):, the pytest run fails with the following error:

INFO root:apibase.py:654 [both] start check grad:
INFO root:apibase.py:666 check data grad ...
INFO root:apibase.py:668 check data grad ... ok
INFO root:apibase.py:654 [dygraph] start check grad:
INFO root:apibase.py:666 check data grad ...
ERROR root:apibase.py:684 the result is [[0.07386708 0.07386708 0.07386708 0.07386708] [0. 0. 0.07386708 0. ] [0. 0.07386708 0. 0.07386708]]
ERROR root:apibase.py:685 the expect is [[-1779.5072 -2176.4607 -3133.521 -2049.277 ]

After checking, I found that the failure comes from this step inside _baserun:
```python
if self.enable_backward:
    grad = self.compute_grad(res, data, **kwargs)
    compare_grad(
        static_backward_res,
        dygraph_backward_res,
        mode="both",
        no_grad_var=self.no_grad_var,
    )
    compare_grad(
        dygraph_backward_res,
        grad,
        mode="dygraph",
        delta=self.delta,
        rtol=self.rtol,
        no_grad_var=self.no_grad_var,
    )
```
static_backward_res and dygraph_backward_res are equal, but grad differs from both of them. Why?
I don't fully understand self.compute_grad(res, data, **kwargs). It seems to perturb one element of data by a small amount, take the resulting change in the output, and divide it by the change in that element, i.e. an approximate derivative, which it then places at that element's position in data's gradient; repeating this for every element of data yields the gradient with respect to data. Why does that differ from the gradient dygraph_backward_res obtained with backward()?
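
For context on what such a numerical check measures, here is a self-contained sketch (not the framework code) of a central-difference gradient estimate next to backward() for AlphaDropout. Because every extra forward pass may sample a new dropout mask unless the RNG is re-seeded before each call, the finite-difference estimate is generally not a derivative of the same function that backward() differentiated, which is one plausible reason grad disagrees with dygraph_backward_res. Paddle 2.2+ API is assumed (x.grad as a Tensor); shapes and values are arbitrary:

```python
import numpy as np
import paddle

paddle.seed(100)
x_np = np.random.rand(3, 4).astype("float32")
x = paddle.to_tensor(x_np, stop_gradient=False)
layer = paddle.nn.AlphaDropout(p=0.5)  # training mode by default, so the mask is random

# Analytic gradient via backward(): differentiates the single mask sampled in this forward pass.
y = layer(x)
y.sum().backward()
autograd_grad = x.grad.numpy()

# Central-difference estimate: each forward call below may sample a different mask,
# so the perturbed outputs are not evaluations of the function backward() differentiated.
eps = 1e-3
numeric_grad = np.zeros_like(x_np)
for i in range(x_np.shape[0]):
    for j in range(x_np.shape[1]):
        x_plus, x_minus = x_np.copy(), x_np.copy()
        x_plus[i, j] += eps
        x_minus[i, j] -= eps
        f_plus = float(layer(paddle.to_tensor(x_plus)).sum())
        f_minus = float(layer(paddle.to_tensor(x_minus)).sum())
        numeric_grad[i, j] = (f_plus - f_minus) / (2.0 * eps)

# The two rarely agree for a stochastic layer.
print(np.allclose(autograd_grad, numeric_grad, atol=1e-2))
```

This behaviour is consistent with keeping enable_backward = False in hook() for this API.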

OccupyMars2025 and others added 3 commits October 22, 2021 00:09
adapt some decorators from @pytest.mark.api_nn_AlphaDropout_parameters to @pytest.mark.api_nn_AlphaDropout_exception
@kolinwei
Collaborator

Thanks for your submission. This task has already been completed by another developer; please pick a different task to submit.
