This repository was archived by the owner on Dec 11, 2023. It is now read-only.

Commit d8e6e83

oahziur authored and committed
PiperOrigin-RevId: 157360504
1 parent ee7efc0 commit d8e6e83

38 files changed: +13,855 −0 lines

CONTRIBUTING.md

Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
Want to contribute? Great! First, read this page (including the small print at the end).

### Before you contribute

Before we can use your code, you must sign the
[Google Individual Contributor License Agreement](https://cla.developers.google.com/about/google-individual)
(CLA), which you can do online. The CLA is necessary mainly because you own the
copyright to your changes, even after your contribution becomes part of our
codebase, so we need your permission to use and distribute your code. We also
need to be sure of various other things—for instance that you'll tell us if you
know that your code infringes on other people's patents. You don't have to sign
the CLA until after you've submitted your code for review and a member has
approved it, but you must do it before we can put your code into our codebase.

Before you start working on a larger contribution, you should get in touch with
us first through the issue tracker with your idea so that we can help out and
possibly guide you. Coordinating up front makes it much easier to avoid
frustration later on.

### Code reviews

All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose.

### The small print

Contributions made by corporations are covered by a different agreement than
the one above, the
[Software Grant and Corporate Contributor License Agreement](https://cla.developers.google.com/about/google-corporate).

LICENSE

Lines changed: 202 additions & 0 deletions
@@ -0,0 +1,202 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.

README.md

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
# TensorFlow NMT Tutorial

This is not an official Google product.

nmt/.gitignore

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
bazel-bin
bazel-genfiles
bazel-out
bazel-testlogs

nmt/__init__.py

Whitespace-only changes.

nmt/attention_model.py

Lines changed: 131 additions & 0 deletions
@@ -0,0 +1,131 @@
"""Attention-based sequence-to-sequence model with dynamic RNN support."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import tensorflow as tf

import model
import model_helper

__all__ = ["AttentionModel"]


class AttentionModel(model.Model):
  """Sequence-to-sequence dynamic model with attention.

  This class implements a multi-layer recurrent neural network as encoder,
  and an attention-based decoder. This is the same as the model described in
  the (Luong et al., EMNLP 2015) paper: https://arxiv.org/pdf/1508.04025v5.pdf.
  This class also allows using GRU cells in addition to LSTM cells, with
  support for dropout.
  """

  def __init__(self, hparams, mode, iterator,
               source_vocab_table, target_vocab_table, scope=None):
    super(AttentionModel, self).__init__(
        hparams=hparams,
        mode=mode,
        iterator=iterator,
        source_vocab_table=source_vocab_table,
        target_vocab_table=target_vocab_table,
        scope=scope)
    if self.mode == tf.contrib.learn.ModeKeys.INFER:
      self.infer_summary = (
          _create_attention_images_summary(self.final_context_state,
                                           hparams.alignment_history))

  def _build_decoder_cell(self, hparams, encoder_outputs, encoder_state,
                          source_sequence_length):
    """Build an RNN cell with an attention mechanism for use by the decoder."""
    attention_option = hparams.attention
    attention_architecture = hparams.attention_architecture

    if attention_architecture != "top":
      raise ValueError(
          "Unknown attention architecture %s" % attention_architecture)

    num_units = hparams.num_units
    num_layers = hparams.num_layers
    num_residual_layers = hparams.num_residual_layers
    num_gpus = hparams.num_gpus

    dtype = tf.float32

    attention_mechanism = create_attention_mechanism(
        attention_option, num_units, encoder_outputs, self.time_major,
        source_sequence_length)

    cell = model_helper.create_rnn_cell(
        hparams, num_layers, num_residual_layers, self.mode)

    cell = tf.contrib.seq2seq.AttentionWrapper(
        cell,
        attention_mechanism,
        attention_layer_size=num_units,
        alignment_history=hparams.alignment_history,
        name="attention")

    # do we need num_layers, num_gpus?
    cell = tf.contrib.rnn.DeviceWrapper(
        cell, model_helper.get_device_str(num_layers - 1, num_gpus))

    batch_size = self.get_batch_size(encoder_outputs)
    # Initialize the attention wrapper's state from the encoder's final state.
    decoder_initial_state = cell.zero_state(batch_size, dtype).clone(
        cell_state=encoder_state)

    return cell, decoder_initial_state


def create_attention_mechanism(attention_option, num_units, encoder_outputs,
                               time_major, source_sequence_length):
  """Create attention mechanism based on the attention_option."""
  if time_major:
    # Attention mechanisms expect batch-major memory: [batch, time, depth].
    attention_states = tf.transpose(encoder_outputs, [1, 0, 2])
  else:
    attention_states = encoder_outputs

  # Mechanism
  if attention_option == "luong":
    attention_mechanism = tf.contrib.seq2seq.LuongAttention(
        num_units,
        attention_states,
        memory_sequence_length=source_sequence_length)
  elif attention_option == "scaled_luong":
    attention_mechanism = tf.contrib.seq2seq.LuongAttention(
        num_units,
        attention_states,
        memory_sequence_length=source_sequence_length,
        scale=True)
  elif attention_option == "bahdanau":
    attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
        num_units,
        attention_states,
        memory_sequence_length=source_sequence_length)
  elif attention_option == "normed_bahdanau":
    attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
        num_units,
        attention_states,
        memory_sequence_length=source_sequence_length,
        normalize=True)
  else:
    raise ValueError("Unknown attention option %s" % attention_option)

  return attention_mechanism


def _create_attention_images_summary(final_context_state, alignment_history):
  """Create attention image and attention summary."""
  if alignment_history:
    attention_images = (
        final_context_state.alignment_history.stack())
  else:
    attention_images = tf.zeros([1, 1, 1])

  # Reshape to (batch, src_seq_len, tgt_seq_len, 1).
  attention_images = tf.expand_dims(
      tf.transpose(attention_images, [1, 2, 0]), -1)
  # Scale to range [0, 255].
  attention_images *= 255
  attention_summary = tf.summary.image("attention_images", attention_images)
  return attention_summary

0 commit comments
