
recursiveUtils.lua:44: expecting nested tensors or tables. Got nil and nil instead #1

Open
gpfvic opened this issue Apr 1, 2017 · 3 comments


@gpfvic

gpfvic commented Apr 1, 2017

I ran th train.lua using the data under the results directory of dgk_lost_conv.

It always fails with the following error:

Dataset stats:
Vocabulary size: 24872
Examples: 66332
dgk ending

-- Epoch 1 / 30

/home/admin/tools/distro/install/bin/luajit: ...ools/distro/install/share/lua/5.1/rnn/recursiveUtils.lua:44: expecting nested tensors or tables. Got nil and nil instead
stack traceback:
[C]: in function 'error'
...ools/distro/install/share/lua/5.1/rnn/recursiveUtils.lua:44: in function 'recursiveCopy'
./seq2seq.lua:58: in function 'backwardConnect'
./seq2seq.lua:78: in function 'train'
train.lua:90: in main chunk
[C]: in function 'dofile'
...ols/distro/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
[C]: at 0x004064f0
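
For reference, the error comes from the rnn package's recursiveCopy, which raises exactly this message when the value it is asked to copy from is neither a tensor nor a table. seq2seq.lua:58 is inside backwardConnect, which presumably looks something like the sketch below (my reconstruction, assuming the usual neuralconvo-style Seq2Seq class; the field names are taken from the fix further down): it copies the decoder's gradient state into the encoder LSTM unconditionally, so it crashes when those fields are still nil.

-- Presumed original backwardConnect (a sketch, not the verified source):
-- copies decoder gradient state into the encoder LSTM without checking
-- whether the fields have been populated yet.
function Seq2Seq:backwardConnect()
  self.encoderLSTM.userNextGradCell =
    nn.rnn.recursiveCopy(self.encoderLSTM.userNextGradCell, self.decoderLSTM.userGradPrevCell)
  self.encoderLSTM.gradPrevOutput =
    nn.rnn.recursiveCopy(self.encoderLSTM.gradPrevOutput, self.decoderLSTM.userGradPrevOutput)
end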

@erwin00776

I fixed it like this:

--[[ Backward coupling: Copy decoder gradients to encoder LSTM ]]--
function Seq2Seq:backwardConnect()
  -- Only copy when the encoder-side fields already exist, so recursiveCopy
  -- is never called with nil arguments.
  if self.encoderLSTM.userNextGradCell ~= nil then
    self.encoderLSTM.userNextGradCell =
      nn.rnn.recursiveCopy(self.encoderLSTM.userNextGradCell, self.decoderLSTM.userGradPrevCell)
  end
  if self.encoderLSTM.gradPrevOutput ~= nil then
    self.encoderLSTM.gradPrevOutput =
      nn.rnn.recursiveCopy(self.encoderLSTM.gradPrevOutput, self.decoderLSTM.userGradPrevOutput)
  end
end
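
A note on why this avoids the crash (my reading of the code above, not verified against the rnn package internals): the error says recursiveCopy received nil and nil, i.e. the decoder-side source fields were also nil at that point, and recursiveCopy refuses nil. Guarding on the encoder-side fields simply skips the copy in that situation. Since encoderLSTM.userNextGradCell itself starts out nil and is only assigned inside the guarded branch, it is worth double-checking that the decoder-to-encoder gradient coupling still happens during training rather than being silently skipped.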

@wilddylan

Holy crap @erwin00776 ! You're a magician! Nice work and thx

@andy8023heart

thx @erwin00776 !
