This repository has been archived by the owner on Jan 18, 2025. It is now read-only.

'lengths' argument should be a 1D CPU int64 tensor, but got 1D cuda:0 Long tensor #3

Open
kayanfong opened this issue Feb 14, 2022 · 0 comments


Hi there! I'm a newbie and I'm encountering the runtime error "'lengths' argument should be a 1D CPU int64 tensor, but got 1D cuda:0 Long tensor". I've searched online, but I'm too new to understand how I should change the code. If you come across this question, please help 👍

# Configure training/optimization
clip = 50.0
teacher_forcing_ratio = 1.0
learning_rate = 0.0001
decoder_learning_ratio = 5.0
n_iteration = 4000
print_every = 1
save_every = 500

# Ensure dropout layers are in train mode
encoder.train()
decoder.train()

# Initialize optimizers
print('Building optimizers ...')
encoder_optimizer = optim.Adam(encoder.parameters(), lr=learning_rate)
decoder_optimizer = optim.Adam(decoder.parameters(), lr=learning_rate * decoder_learning_ratio)
if loadFilename:
    encoder_optimizer.load_state_dict(encoder_optimizer_sd)
    decoder_optimizer.load_state_dict(decoder_optimizer_sd)

# Run training iterations
print("Starting Training!")
trainIters(model_name, voc, pairs, encoder, decoder, encoder_optimizer, decoder_optimizer,
           embedding, encoder_n_layers, decoder_n_layers, save_dir, n_iteration, batch_size,
           print_every, save_every, clip, corpus_name, loadFilename)
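In case it helps: since PyTorch 1.7, `torch.nn.utils.rnn.pack_padded_sequence` requires the `lengths` argument to be a CPU tensor even when the input sequences themselves live on the GPU, which matches this error message exactly. Assuming the training code follows the standard PyTorch chatbot tutorial (where the encoder packs its embedded input), moving `lengths` to the CPU before packing is usually enough. A minimal sketch, with made-up tensor shapes:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

# Toy batch: (max_len, batch, features); the shapes are illustrative only.
seqs = torch.zeros(5, 2, 8)
lengths = torch.tensor([5, 3])  # true length of each sequence in the batch

# On PyTorch >= 1.7 this call raises the "'lengths' argument should be a
# 1D CPU int64 tensor" error if `lengths` is on the GPU. Calling .cpu()
# is a no-op when the tensor is already on the CPU, so it is safe either way.
packed = pack_padded_sequence(seqs, lengths.cpu())
```

If the traceback points at a line like `pack_padded_sequence(embedded, lengths)` inside the encoder's forward pass, changing it to `pack_padded_sequence(embedded, lengths.cpu())` (or simply not moving `lengths` to the GPU when building the batch) should resolve the error.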
