【Hackathon No.95】 #61
Conversation
PR format check passed. Your PR will be reviewed by Paddle experts and the open-source community; please keep an eye on PR updates.
The overall approach looks fine; I've added a few small comments.
```julia
# Neural network
paddlewrap = PaddleModuleWrap(paddle_module)
chain = Chain(paddlewrap)
```
If it is treated as one whole black box, the Chain wrapper here probably isn't needed afterwards?
This is a bit odd: when I combine PyCallChainRules.jl with NeuralPDE, if I don't wrap the jlwrap in an extra Chain, then DiffEqFlux.initial_params(jlwrap) below returns an empty array, whereas wrapping it in a Chain gives the expected parameter array.
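For context, the empty-array behaviour described above typically happens when a generic parameter-collection fallback has no method that knows about the wrapper's fields. A minimal self-contained sketch (all names here are hypothetical placeholders, using plain Julia arrays instead of the real DiffEqFlux/Paddle types):

```julia
# Hypothetical sketch: a wrapper type whose parameters are invisible to a
# generic `initial_params`-style fallback unless a method is defined for it,
# mirroring the empty-array behaviour described in the comment above.
struct DummyWrap
    W::Matrix{Float64}
    b::Vector{Float64}
end

# Generic fallback: knows nothing about DummyWrap's fields, returns empty.
initial_params(m) = Float64[]

# Specialised method: flatten the wrapper's parameters into one vector.
initial_params(m::DummyWrap) = vcat(vec(m.W), m.b)

m = DummyWrap(ones(2, 3), zeros(2))
@assert length(initial_params(m)) == 8
```

Wrapping in a Chain presumably works because the parameter collection already has methods for Chain's children; defining the specialised method directly on the wrapper type would make the extra Chain unnecessary.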
However, calling Optimisers.destructure and passing jlwrap in directly does work. I'll decide later which form to use.
> However, calling Optimisers.destructure and passing jlwrap in directly does work. I'll decide later which form to use.

Yes, this needs to be implemented.
> However, calling Optimisers.destructure and passing jlwrap in directly does work. I'll decide later which form to use. Yes, this needs to be implemented.

Since this step mainly just needs a flattened parameter array, I lean toward calling Optimisers.destructure directly when implementing the NeuralPDE example.
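As a sketch of what `destructure` provides (a flat parameter vector plus a reconstruction closure), here is a hand-rolled toy version for a single dense layer. `Optimisers.destructure` does this generically via Functors.jl; this version hard-codes the two fields and is only illustrative:

```julia
# Toy stand-in for Optimisers.destructure: returns the flattened parameters
# and a closure that rebuilds the layer from such a flat vector.
struct ToyDense
    W::Matrix{Float64}
    b::Vector{Float64}
end

function destructure(m::ToyDense)
    flat = vcat(vec(m.W), m.b)
    re = v -> ToyDense(reshape(v[1:length(m.W)], size(m.W)),
                       v[length(m.W)+1:end])
    return flat, re
end

m = ToyDense(randn(3, 2), zeros(3))
flat, re = destructure(m)
@assert re(flat).W == m.W   # round-trips exactly
@assert re(flat).b == m.b
```

The flat vector is exactly what an optimizer or NeuralPDE needs, and the closure recovers the structured parameters for the forward pass.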
Implement the corresponding constructor so that a simple fully connected neural network can be built directly, e.g.:
```julia
PaddleModuleWrapper(dim_ins,
```
This constructor doesn't feel particularly necessary, since it is specialized for dense layers; a more specific name would be better here.
Thanks for the suggestion; I've revised it accordingly.
```julia
end
```
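Following the suggestion to use a dense-specific name, one possible shape for such a convenience constructor is sketched below. `PaddleDenseWrapper`, its fields, and the fixed `tanh` activation are all assumptions for illustration, using plain Julia arrays instead of actual Paddle layers so the shape logic is checkable:

```julia
# Hypothetical dense-specific wrapper: builds a stack of fully connected
# layers from (input dim, output dim, hidden width, layer count).
struct PaddleDenseWrapper
    Ws::Vector{Matrix{Float64}}
    bs::Vector{Vector{Float64}}
end

function PaddleDenseWrapper(dim_in::Int, dim_out::Int, hidden::Int, num_layers::Int)
    dims = [dim_in; fill(hidden, num_layers - 1); dim_out]
    Ws = [randn(dims[i+1], dims[i]) for i in 1:num_layers]
    bs = [zeros(dims[i+1]) for i in 1:num_layers]
    return PaddleDenseWrapper(Ws, bs)
end

# Forward pass: fold the input through each layer with a tanh activation.
(m::PaddleDenseWrapper)(x) =
    foldl((h, (W, b)) -> tanh.(W * h .+ b), zip(m.Ws, m.bs); init = x)

m = PaddleDenseWrapper(2, 1, 16, 3)
@assert size(m(ones(2, 5))) == (1, 5)  # batch of 5 two-dim inputs
```

In the real wrapper, the arrays would instead be the parameters of the constructed Paddle layers.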
# VI. Testing and Acceptance Considerations
Could this section describe the final acceptance deliverables more concretely? For example, will the result be a mini package like PyCallChainRules, or only general demo source code?
I personally lean toward the former, which would also make it easier for others to reproduce and optimize later 😃
Since part of the implementation overlaps with PyCallChainRules.jl, could we consider adding it as a module inside PyCallChainRules.jl, or first complete this task and treat the integration as follow-up work?
> Since part of the implementation overlaps with PyCallChainRules.jl, could we consider adding it as a module inside PyCallChainRules.jl, or first complete this task and treat the integration as follow-up work?

Sure. If any interface in PyCallChainRules.jl turns out not to be flexible enough, feel free to open a PR there and cc me; we can look at it together.
LGTM
@findmyway I've cleaned up the earlier code: the basic functionality is now implemented, though there is still plenty to optimize and improve.
The finished code repo can be transferred to the https://github.com/X4Science organization.
Wrap Paddle neural networks in Julia so they can be combined with NeuralPDE.jl to solve PDEs.
https://github.com/X4Science/INFINITY/issues/1