Run `embeddings_to_torch.py -h` for complete usage info.
### Example

1. Get GloVe files:

```shell
mkdir "glove_dir"
wget http://nlp.stanford.edu/data/glove.6B.zip
unzip glove.6B.zip -d "glove_dir"
```

2. Prepare data:

```shell
onmt_preprocess \
-train_src data/train.src.txt \
-train_tgt data/train.tgt.txt \
-valid_src data/valid.src.txt \
-valid_tgt data/valid.tgt.txt \
-save_data data/data
```

3. Prepare embeddings:

```shell
./tools/embeddings_to_torch.py -emb_file_both "glove_dir/glove.6B.100d.txt" \
-dict_file "data/data.vocab.pt" \
-output_file "data/embeddings"
```

4. Train using pre-trained embeddings:

```shell
onmt_train -save_model data/model \
-batch_size 64 \
-layers 2 \
-rnn_size 200 \
-word_vec_size 100 \
-pre_word_vecs_enc "data/embeddings.enc.pt" \
-pre_word_vecs_dec "data/embeddings.dec.pt" \
-data data/data
```
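Note that `-word_vec_size` must match the dimensionality of the GloVe file chosen in step 3 (100 for `glove.6B.100d.txt`). Since each line of a GloVe text file is a token followed by its vector components, a quick sanity-check sketch is to count the columns of one line and subtract one; the sample line below is made up for illustration:

```shell
# Embedding dimension = number of columns minus one (the token itself).
# On the real file, use:
#   head -n 1 "glove_dir/glove.6B.100d.txt" | awk '{print NF - 1}'
echo "the 0.1 0.2 0.3" | awk '{print NF - 1}'
# prints 3 (a glove.6B.100d.txt line would print 100)
```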
## How do I use the Transformer model?