This repository has been archived by the owner on Apr 25, 2023. It is now read-only.
What are the exact PyTorch and torchtext versions for your code? I am trying to downgrade to an earlier version to avoid the Multi30k.split() problem, but failed.
#12
Open · yaoyiran opened this issue on Oct 16, 2018 · 5 comments
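For context, the call at issue is torchtext's legacy Multi30k dataset loader. A minimal sketch of the usual invocation under the old (pre-0.9) torchtext API follows; the field definitions here are illustrative assumptions, not the repo's exact code:

```python
from torchtext.data import Field
from torchtext.datasets import Multi30k

# Illustrative field setup; the repo's actual tokenizers/flags may differ.
DE = Field(tokenize=str.split, init_token='<sos>', eos_token='<eos>', lower=True)
EN = Field(tokenize=str.split, init_token='<sos>', eos_token='<eos>', lower=True)

# Legacy torchtext API: splits() downloads the corpus and returns the
# (train, validation, test) splits built with the given source/target fields.
train, val, test = Multi30k.splits(exts=('.de', '.en'), fields=(DE, EN))
```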
Here is another bug I ran into with the code. Do you have any idea how to fix it?
```
/workspace/seq2seq/model.py:47: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
  energy = F.softmax(self.attn(torch.cat([hidden, encoder_outputs], 2)))
Traceback (most recent call last):
  File "train.py", line 112, in <module>
    main()
  File "train.py", line 94, in main
    en_size, args.grad_clip, DE, EN)
  File "train.py", line 52, in train
    output = model(src, trg)
  File "/home/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/workspace/seq2seq/model.py", line 105, in forward
    output, hidden, encoder_output)
  File "/home/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/workspace/seq2seq/model.py", line 75, in forward
    attn_weights = self.attention(last_hidden[-1], encoder_outputs)
  File "/home/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/workspace/seq2seq/model.py", line 43, in forward
    return F.relu(attn_energies, dim=1).unsqueeze(1)
TypeError: relu() got an unexpected keyword argument 'dim'
```
@yaoyiran you might want to remove the `dim` keyword argument from `relu` and add `dim=2` to the `softmax` call, and see if that resolves the issue. What version of PyTorch are you using?
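Concretely, here is a minimal runnable sketch of the two corrected calls from model.py; the tensor shapes and the `attn` layer below are illustrative assumptions based on the traceback, not the repo's exact code:

```python
import torch
import torch.nn.functional as F

# Stand-ins for the real module state; shapes are illustrative assumptions.
hidden = torch.randn(10, 1, 512)           # decoder hidden state, repeated over time
encoder_outputs = torch.randn(10, 1, 512)  # encoder outputs
attn = torch.nn.Linear(1024, 512)          # stands in for self.attn

# model.py line 47: recent PyTorch requires an explicit dim for softmax;
# dim=2 normalizes over the feature axis of the concatenated tensor.
energy = F.softmax(attn(torch.cat([hidden, encoder_outputs], 2)), dim=2)

# model.py line 43: F.relu takes no dim keyword at all, which is what
# raised the TypeError; just drop it.
attn_energies = torch.randn(1, 10)
attn_weights = F.relu(attn_energies).unsqueeze(1)
```

With both changes applied, the deprecation warning and the TypeError should both go away.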